It turns out that Lensa AI, the artificial-intelligence-powered app behind the avatars that have taken over social media recently, is fraught with problems. Many users report that the application creates avatars that present women as sexual objects.
Artificial intelligence technologies have made significant progress and recently become hugely popular. Works created with systems such as DALL-E and Midjourney have captivated the internet. There has also been an explosion of apps that generate avatars of people, and these AI-generated images now appear on nearly every platform we visit.
Lensa AI, created by Prisma Labs, is one of them. This paid application offers a feature called ‘Magic Avatars’, which allows users to create avatars that can truly be described as magical. However, it has also exposed a problem with artificial intelligence by causing some troubling results. Let’s take a look at the issue.
First of all, what is this Lensa AI, and how does it work?
Lensa AI is, as mentioned, an app created by Prisma Labs. Lensa, which has recently topped the app stores, has actually been around since 2018. Earlier this month, however, the ‘Magic Avatars’ feature made it popular. With this AI-powered feature, which can be used in Turkey at prices starting from 88 TL, the photos you upload can be turned into unique avatars in different themes (such as futuristic or anime characters).
Lensa uses Stable Diffusion, an open-source deep learning model that draws on a database of works collected from the internet. This database, called LAION-5B, contains 5.85 billion text-image pairs filtered by a neural network called CLIP. A new version of Stable Diffusion, the deep learning model released in 2022 that creates images from text, also came out last month.
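To give a rough sense of the filtering step mentioned above: CLIP embeds each caption and each image as a vector, and a pair is kept only if the two embeddings are similar enough. The sketch below is a toy illustration of that idea with made-up two-dimensional embeddings (real CLIP embeddings have hundreds of dimensions, and the actual LAION-5B threshold and pipeline differ).

```python
import math

def cosine_similarity(a, b):
    # standard cosine similarity between two vectors
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def filter_pairs(pairs, threshold=0.3):
    # keep only (text_embedding, image_embedding) pairs whose
    # embeddings point in roughly the same direction
    return [p for p in pairs if cosine_similarity(p[0], p[1]) >= threshold]

# toy embeddings: the first caption matches its image, the second does not
pairs = [
    ([1.0, 0.0], [0.9, 0.1]),  # well-aligned caption/image pair
    ([1.0, 0.0], [0.0, 1.0]),  # mismatched pair, filtered out
]
kept = filter_pairs(pairs)
print(len(kept))  # 1
```

The point is simply that the dataset is filtered automatically by a learned similarity score rather than by humans, which is part of why biased or explicit material can slip through.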
If you want to learn more about Lensa AI and how to use it, you can browse our content below:
Lensa AI is said to create avatars that objectify women
Although artificial intelligence systems are advanced, they still have problems. The best-known example is that text-to-image AIs struggle to render hands and fingers. According to Polygon, however, the situation with Lensa is somewhat different.
Recent reports suggest that Lensa AI sexualizes women and produces some racist results. Many users on social media reported that the app added large breasts and cleavage to avatars of women even though they made no such request, and that in some cases even nude images were generated.
Olivia Snow of the University of California also wrote in Wired that even when she uploaded photos from her own childhood, the app kept generating images that presented her as a sexual object. When the app’s owners were asked about this, the team stated that Snow had deliberately violated the terms of use: Lensa’s policies prohibit both nude photos and the use of images of children, and users must declare that they are over the age of 18.
Users also report racist results and nude avatars
In addition, the app is said to embed racist stereotypes in its avatars. For example, in images one user created from their own photos, the artificial intelligence can be seen struggling with Asian faces.
Beyond this, the fetishization of Asian women has also been reported. Melissa Heikkilä, a journalist writing in MIT Technology Review, shared her experience with Lensa: the avatars created from her own photos contained a high amount of nudity, while her colleagues were far less likely to encounter anything similar.
It has also recently emerged that this artificial intelligence can be tricked into creating nude photos. A report by TechCrunch showed that users can easily generate nude images with the app, even using photos of celebrities. With the new features it rolled out on December 13, Prisma Labs stated that avatars raising safety concerns will appear less frequently.
“Artificial intelligence is a reflection of our society”
The events around Lensa reveal that artificial intelligence still has a long way to go. As is well known, similar incidents have occurred many times before. Microsoft shut down its AI chatbot Tay some years ago over racist and misogynistic comments, and Meta recently shut down a language model for similar reasons.
A study conducted in June found that systems trained on the CLIP network, the same one behind Lensa, reproduce certain gender and racial stereotypes. It also stressed that such systems are less likely to recognize women and people of color.
We will see how artificial intelligence systems, including Lensa, deal with this problem in the future. Prisma Labs, the app’s owner, emphasizes that Stable Diffusion is trained on unfiltered data from the internet, saying that neither they nor Stability AI (the creator of Stable Diffusion) can do anything about it. The company adds that this data introduced humanity’s existing biases into the model, and that artificial intelligence holds a mirror up to our society.