Instagram Advertises Apps That Create Fake Nude Photos

According to an analysis by 404 Media, Instagram advertises applications that use artificial intelligence to create nude photos of people without their consent.

Fake photos are one of the main concerns raised by artificial intelligence, and celebrities are among those most affected. Inappropriate fake images of singers, actors, and other well-known figures, created with the help of AI, spread easily across the internet. A report shared by 404 Media reveals that the applications generating these fake images are being advertised on platforms everyone uses.

The report shows that Meta, Instagram's parent company, hosts advertisements for deepfake apps in its ad library. These apps promise to take photos of clothed people and generate fake nude images from them.

Images are used in the ads without people's consent

These ads, from which Meta is said to earn money, are shown to users. The advertisements reportedly use images of celebrities and other people without their consent to demonstrate that inappropriate images can be created. In one example, an ad placed a photo of Kim Kardashian next to the disturbing slogan "Undress any girl for free."

With the surge in artificial intelligence's popularity in recent months, such "nudification" apps have become widespread. Deepfake images can now be created of anyone, not just celebrities.


They are removed when detected

Source: https://www.404media.co/instagram-advertises-nonconsensual-ai-nude-apps/
