Apple scans iPhone messages for nudity. The company has started to offer a targeted security feature to iPhone users in the UK: artificial intelligence nudity screening intended to protect children from abuse.
Apple is preparing to bring the security feature, which uses artificial intelligence to scan messages sent to and from children, to iPhones. With this feature, Apple will be authorized to analyze text messages on children’s iPhones. But users have major privacy and security concerns.
Apple will scan messages on iPhone for nudity
A controversial security feature has been introduced for iPhone owners in the UK. Apple’s new update includes a security feature that uses artificial intelligence to scan messages sent to and from children.
Apple’s new security feature is being distributed as additional protection for iPhones used by children under 18. According to the company’s statement, photos sent or received by children via the Messages app will be scanned for nudity.
If “Communication Safety in Messages” is activated and nudity is detected in a received photo, the photo will be blurred and Apple will warn that it contains sensitive content. A similar function will be activated when a child sends a nude photo.
In addition to this security measure, adult support is also coming to kids’ iPhones. If nudity is detected in a photo sent via text message, there will be a “Send Adult Message” option.
This isn’t actually Apple’s first attempt to scan iPhones. The company, which earlier said it would scan users’ photo galleries for child abuse material, received a strong backlash from privacy advocates. After these reactions, the rollout of its child protection features was delayed in September.
In addition to this scanning innovation, the tech giant will also offer tools aimed at intervening when searches related to child abuse are made in Spotlight, Siri, or Safari. However, it should be noted that these have not yet been detailed.
What do you think about Apple’s iPhone nudity scanning feature? You can share your thoughts in the comments section or on the SDN Forum.