The EU Commission wants to scrutinize the Internet more closely

EU law proposal against child abuse

On May 11, 2022, the EU Commission presented a legislative proposal in Brussels that sets the framework for the hunt for perpetrators and for images and videos of sexual abuse of children.


Brussels. In the fight against the sexual abuse of children, the Internet could be screened much more intensively in the future. According to a draft law presented on Wednesday, providers such as Google or Facebook could be obliged to use software to search their services for such depictions.

In addition, an EU center is to be set up that, among other things, is to provide the appropriate technology. “We will find you,” EU Interior Commissioner Ylva Johansson said, addressing the perpetrators.

The Internet is currently being flooded with such material, and the problem is growing. According to the EU Commission, 85 million images and videos showing the sexual abuse of children were reported worldwide in 2021; the number of unreported cases is thought to be significantly higher. The Internet Watch Foundation recorded a 64 percent increase in reports of confirmed child sexual abuse in 2021 compared to the previous year.

According to Johansson, the perpetrators are often people the child trusts. “And these crimes very often remain in the dark until the perpetrator publishes them online.” It was often the photos and videos that made criminal prosecution possible.


The fact that images of severe sexual child abuse are increasingly finding their way onto the Internet is also due to the culture of exchange among perpetrators: to obtain child pornography from other offenders, it can be a prerequisite to broadcast the rape of a child via live stream.

Such extreme examples are only the tip of the iceberg. Johansson pointed to a Swedish study in which 80 percent of the girls surveyed between the ages of ten and 13 said they had already unintentionally received nude pictures of unknown adults. “I think I have a large majority of citizens on my side,” said the Swede, referring to her draft law.


In concrete terms, the draft states that companies must assess how great the risk is that depictions of abuse will be disseminated via their services or that so-called grooming will take place, i.e. that adults contact minors with the intention of abusing them. If a significant risk is found, national authorities or courts can order that content be automatically checked by software and that criminal content be detected.

According to the draft law, the technology used for this should not be able to extract any information other than what indicates the dissemination of abusive material. The same applies to grooming. The software should also be designed so that it intrudes as little as possible on users' privacy.

The draft law does not specify which technology is to be used. It therefore also remains unclear how the screening of online content would be implemented technically and whether, for example, the encryption of messages could be circumvented.

However, providers must specifically ensure that children cannot download apps that pose an increased risk of grooming and that depictions of abuse are deleted or blocked. They must also be able to determine whether an account belongs to a minor or an adult.

The EU Parliament and EU states must now discuss the proposal and agree on a final version. So there may still be changes.

Software as a “horror filter”?

The first reactions were mixed. Federal Interior Minister Nancy Faeser (SPD) welcomed the proposal: “With a clear legal basis, binding reporting channels and a new EU center, we can significantly strengthen prevention and criminal prosecution across the EU.” “The fact that companies will in future be obliged to recognize and report the sexual abuse of children is an important and long overdue step in the fight against child abuse,” said Lena Düpont, domestic policy spokeswoman for the CDU/CSU group in the European Parliament.

The FDP MEP Moritz Körner, on the other hand, spoke of a “Stasi 2.0” and fears intrusions into citizens' privacy. Konstantin von Notz of the Greens criticized the fact that private companies could be obliged to systematically scan private text, image and video content: “There are massive doubts that this is compatible with applicable European and German fundamental rights and with ECJ case law.”

The SPD MEP Tiemo Wölken described the software intended to detect such content as a “horror filter”. The regulation, he said, tries to give the impression that privacy and data protection are guaranteed. “The text is also impenetrable and confusing,” Wölken wrote on Twitter.

