According to an analysis by 404 Media, Instagram advertises applications that use artificial intelligence to create nude photos of people without their consent.
Fake photos are one of the main concerns raised by artificial intelligence, and celebrities are among those most affected. Inappropriate fake images of many well-known names, from singers to actors, created with the help of AI can spread easily across the internet. A report shared by 404 Media reveals that the applications generating these fake images are being advertised on the platforms everyone uses.
The report revealed that Meta, Instagram's parent company, carries ads for deepfake-photo applications in its ad library. These apps take photos of people in clothes and promise to create fake nude images from them.
Images are used in advertisements without people’s consent
These ads, from which Meta reportedly earns money, are shown directly to users. The advertisements are said to use images of celebrities and other people without their consent and to demonstrate that inappropriate images can be created. In one ad, for example, a photo of Kim Kardashian was paired with disturbing text such as "Undress any girl for free."
With the surge in artificial intelligence's popularity in recent months, such "nudification" apps have become widespread. We have often seen that deepfake images can be created of anyone, not just celebrities.
They are removed when detected
404 Media stated that most of the ads it reviewed were removed, but added that some can still be seen. A Meta spokesperson said in a statement: "Meta does not allow ads containing adult content. When we detect infringing ads, we work quickly to remove them." This means that ads that go undetected can still appear on Instagram.
This is not limited to Instagram. At the end of last year, ads for similar applications appeared before users on platforms such as X (formerly Twitter). It would not be wrong to call deepfake imagery a snowballing problem, and social media platforms must do more to prevent it.
Source: https://www.404media.co/instagram-advertises-nonconsensual-ai-nude-apps/