Artists have declared war on generative artificial intelligence

Generative artificial intelligence is steadily settling into every aspect of our lives. However, it needs vast amounts of data to train its models, and professional artists are uncomfortable with generative AI companies using their work to train these technologies. Artists and photographers will soon have an effective response to this problem that does not require going to court. Here are the details…

Tension between professional artists and artificial intelligence is rising!

Generative AI burst onto the scene almost a year ago. Since the launch of OpenAI's ChatGPT chatbot, artificial intelligence has come to touch nearly every aspect of our lives. ChatGPT is extremely adept at communicating in a natural, human-like way, but to gain this ability it must be trained on masses of data scraped from the web.

Similar generative AI tools can also generate images from text prompts. Like ChatGPT, however, they are trained on images scraped from the web, and this brings a completely different set of problems.

Unfortunately, the works of artists and photographers are being used as training material for artificial intelligence, and companies are using these works without permission or compensation to develop their own generative AI tools.

Professional artists are deeply unsettled by this situation and are looking for ways to protect themselves. To that end, a team of researchers has developed a tool called Nightshade, which adds invisible pixel-level changes to a work of art before it is uploaded to the web. These changes "poison" the training data of AI models, confusing them and causing them to output erroneous images in response to prompts.

Poisoning training data in this way will damage future AI models. Image generators such as DALL-E, Midjourney, and Stable Diffusion would be most affected, and some of their outputs could even become useless: with this tool, dogs turn into cats and cars turn into cows.
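The idea of "invisible pixels" can be sketched in a few lines of code: the snippet below nudges each pixel of an image by an amount too small for the eye to notice, while the numeric values a scraper-trained model ingests still change. This is only a conceptual illustration under assumed parameters; Nightshade's actual perturbations are computed by optimizing against a text-to-image model's feature space, not by adding random noise, and the function name and `strength` parameter here are hypothetical.

```python
import numpy as np

def add_invisible_perturbation(image: np.ndarray, strength: float = 2.0,
                               seed: int = 0) -> np.ndarray:
    """Add a tiny, visually imperceptible perturbation to an 8-bit RGB image.

    Conceptual sketch only: Nightshade computes targeted perturbations
    against a model's feature extractor, not uniform random noise.
    """
    rng = np.random.default_rng(seed)
    # Shift each channel by at most +/- `strength` intensity levels
    # (out of 255), far below what a human viewer would notice.
    noise = rng.uniform(-strength, strength, size=image.shape)
    poisoned = np.clip(image.astype(np.float64) + noise, 0, 255)
    return poisoned.astype(np.uint8)

# A flat gray 4x4 test "photo": the poisoned copy looks identical...
original = np.full((4, 4, 3), 128, dtype=np.uint8)
poisoned = add_invisible_perturbation(original)
# ...but the underlying pixel values differ slightly from the original.
assert np.abs(poisoned.astype(int) - original.astype(int)).max() <= 2
```

Real attacks of this kind aim the perturbation so that, across many poisoned samples, the model learns a wrong association (for example, "dog" images that look like dogs to people but like cats to the model's feature extractor).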

University of Chicago professor Ben Zhao leads the research team behind Nightshade. According to Zhao, the tool will help shift the balance of power back toward artists. He added that it could serve as a warning shot to tech companies that ignore copyright and intellectual property.
