Microsoft Prevents Copilot from Creating 18+ Images

Microsoft began blocking Copilot Designer from producing 18+ images after an engineer reported the issue to the FTC. However, the tool can still create extremely violent images for some prompts.

In a story we shared earlier this week, we reported that an engineer working at Microsoft had filed a complaint against the company with the US Federal Trade Commission (FTC). The complaint alleged that Copilot Designer added monsters to images about abortion rights and created images of children drinking alcohol and using drugs.

In fact, the engineer warned Microsoft about the issue before taking it to the FTC, but the company appears to have waited until the story drew worldwide attention before acting. According to information shared by CNBC, Microsoft has now started blocking prompts that cause Copilot Designer to create 18+ images.

Microsoft warns users who enter prompts that could create 18+ images

According to the information shared by CNBC, Copilot Designer now refuses to create an image and displays a warning such as: “This command has been blocked. Our system automatically flagged this command because it may conflict with our content policy. More policy violations may result in your access being automatically suspended.”

When risky prompts such as “children playing with machine guns” are entered, Copilot Designer warns: “I’m sorry but I can’t create an image like this. This is against my ethics and Microsoft’s policies. Please don’t ask me to do anything that will harm or disturb others.” Speaking on the subject, a Microsoft spokesperson stated that the company is continuing its filtering and adjustment work so that the AI tool can operate more effectively.

That said, we should point out that Copilot Designer can still create extremely violent images when prompts such as “traffic accident” are entered. In other words, we can say that Microsoft’s Copilot has still not been completely purged of violence and sexual content.

Source: https://www.cnbc.com/2024/03/08/microsoft-blocking-terms-that-cause-its-ai-to-create-violent-images.html
