Google reports a parent for alleged child abuse

Google has used artificial intelligence-assisted child sexual abuse detection technology for several years. According to a report by The New York Times, a parent who photographed their child’s groin infection to get medical advice ended up under investigation.

The child sexual abuse material (CSAM) detection system, launched in 2018, automatically scans photos uploaded to Google’s cloud storage service. When the artificial intelligence flags suspected abuse in a scanned photo, the content is reported directly to the authorities.

At the start of the pandemic, most countries went into lockdown, and citizens who could not reach in-person health services consulted doctors and nurses online. In 2021, a parent worried about swelling in their child’s groin sent a photo of the genital area at a nurse’s request.

The parent said Google closed their accounts about two days after the photos were taken, and that they also lost their phone number because it was tied to the Google Fi service. Cut off from email, contacts, and photos, the parent was cleared of any wrongdoing by a San Francisco Police Department investigation in December 2021.

Commenting on the subject, Google spokesperson Christa Muldoon said: “The material known as CSAM is abhorrent, and our company will fight it to the end. We follow US law in defining what constitutes CSAM, and we use a combination of hash matching technology and artificial intelligence to identify CSAM images and remove them from our platforms. In addition, our team of child safety experts reviews flagged content for accuracy and consults with pediatricians to help us identify cases where users may be seeking medical advice.”
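The hash matching Muldoon refers to can be pictured roughly as in the sketch below. This is only an illustration of the general idea, not Google’s actual system: it compares a plain cryptographic hash of an uploaded file against a hypothetical set of known hashes, whereas real deployments rely on perceptual hashing that tolerates resizing and re-encoding, plus a machine-learning stage and human review for previously unseen images.

```python
import hashlib

# Hypothetical set of hashes of known abusive images, e.g. loaded from a
# vetted industry reference database. Real systems use perceptual hashes
# (robust to resizing and re-encoding) rather than SHA-256; SHA-256 is
# used here only to keep the sketch self-contained.
KNOWN_BAD_HASHES: set[str] = set()


def hash_image(image_bytes: bytes) -> str:
    """Return the hex digest used as the lookup key for an uploaded image."""
    return hashlib.sha256(image_bytes).hexdigest()


def scan_upload(image_bytes: bytes) -> str:
    """Classify an upload as 'match' (known image) or 'clear'.

    A real pipeline would add a second stage: an ML classifier for
    previously unseen images, with positive hits routed to human
    reviewers before any report is filed with the authorities.
    """
    if hash_image(image_bytes) in KNOWN_BAD_HASHES:
        return "match"  # escalate to human review
    return "clear"


if __name__ == "__main__":
    print(scan_upload(b"example image bytes"))  # -> "clear"
```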

Apple announced its Child Safety plan last year for similar reasons. The company said images would be scanned before being uploaded to iCloud, and that content matched against the CSAM database would be reviewed by a moderator. The feature, which was criticized by the Electronic Frontier Foundation (EFF), was ultimately made opt-in.

What do you think about this subject? You can share your ideas in the comments section and on the SDN Forum.
