Fighting child sexual abuse online: what EU measures exist?
Parliament wants to establish effective rules to prevent and combat online child sexual abuse while protecting people's privacy.
Online material depicting children engaging, or appearing to engage, in sexual acts has proliferated, particularly material depicting younger children. In 2022, there were more than 32 million reports of suspected online child sexual abuse, a historic high.
Developing EU legislation on child sexual abuse
The EU has adopted a strategy to fight child sexual abuse, both offline and online. As part of this commitment, the European Commission aims to build on the existing rules from 2011. In November 2023, Parliament’s civil liberties committee adopted a report on a proposal for a regulation aiming to prevent and combat child sexual abuse.
Provisional rules from 2021 allow digital companies to scan content posted on their platforms for child sexual abuse material. These rules provide a temporary derogation from the EU’s e-privacy rules. The proposal Parliament is working on seeks to establish permanent rules on how companies can detect child sexual abuse material online.
Safeguarding privacy
Parliament wants to strike a balance between safeguarding children in the digital sphere and upholding fundamental rights such as the right to privacy. MEPs’ position on the new rules does not endorse widespread web scanning, blanket monitoring of private communications or the creation of backdoors in apps to weaken encryption.
Providers’ duties: risk assessment and mitigation
According to the proposed legislation, providers of hosting or interpersonal communication services will be obliged to perform a risk assessment of the potential presence of sexual content involving children on their services. Once the providers have identified the level of risk, they must implement mitigation measures to address it.
The regulation provides an extensive list of potential mitigation measures that providers can opt to implement. These include the principle of safety by design (developing products or services in a way that avoids potential harm), mandatory parental controls, the establishment of user reporting mechanisms, and the use of age verification systems when there is a risk of child solicitation.
The regulation also introduces specific mandatory mitigation measures for services directly targeting children, platforms primarily used for the dissemination of pornographic content, and certain chat services within games.
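To make the assess-then-mitigate logic above concrete, here is a minimal Python sketch. The risk tiers and the mapping of measures to tiers are illustrative assumptions; the regulation lists optional measures but leaves the concrete choice to providers.

```python
from enum import Enum

class RiskLevel(Enum):
    LOW = 1
    MEDIUM = 2
    HIGH = 3

# Hypothetical mapping for illustration only: the proposal lists measures
# providers may opt for, but does not tie specific measures to risk tiers.
MITIGATIONS: dict[RiskLevel, list[str]] = {
    RiskLevel.LOW: ["user reporting mechanism"],
    RiskLevel.MEDIUM: ["user reporting mechanism", "parental controls"],
    RiskLevel.HIGH: ["user reporting mechanism", "parental controls",
                     "age verification where there is a risk of child solicitation"],
}

def mitigation_plan(assessed_risk: RiskLevel) -> list[str]:
    """Return the measures a provider might adopt for an assessed risk level."""
    return MITIGATIONS[assessed_risk]
```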
Service providers will have the autonomy to choose the technologies they will use to fulfil their detection obligations. The rules foresee a simplified procedure for smaller businesses.
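One technology providers commonly choose for detecting known material is hash matching against a vetted reference list. The sketch below is a simplified illustration under stated assumptions: the hash set stands in for a database maintained by a recognised authority, and plain SHA-256 is used for brevity, whereas deployed systems typically rely on perceptual hashing (for example Microsoft's PhotoDNA) so that resized or re-encoded copies still match.

```python
import hashlib

# Stand-in for a vetted reference database of hashes of known abuse
# material; in deployed systems this would come from a recognised authority.
KNOWN_HASHES: set[str] = set()

def matches_known_material(file_bytes: bytes) -> bool:
    """Flag an upload if it is byte-identical to known material.

    Plain SHA-256 only catches exact copies; production systems use
    perceptual hashing so the match survives resizing and re-encoding.
    """
    return hashlib.sha256(file_bytes).hexdigest() in KNOWN_HASHES
```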
Detection orders as a measure of last resort
If providers fail to meet their obligations, a judicial authority would be able to issue a detection order only as a last resort. This order would compel the provider to employ certain technologies to detect known and new child sexual abuse material.
Detection orders would only be used if there was reasonable suspicion that individual users or groups are linked to child sexual abuse material. The orders would be time-limited, with end-to-end encrypted communication and text messages excluded from their scope. This approach aims to ensure that the privacy and security of users of digital services are maintained.
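These limits can be read as three cumulative conditions: the communication is not end-to-end encrypted or a text message, the order is still within its validity window, and the user is among those under reasonable suspicion. The Python sketch below illustrates how the conditions combine; the data model is entirely hypothetical, as the regulation defines no such schema.

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class DetectionOrder:
    # Hypothetical model of an order under Parliament's mandate:
    # targeted at specific users or groups, and time-limited.
    target_ids: set[str]
    valid_from: datetime
    valid_until: datetime

def in_scope(order: DetectionOrder, user_id: str, now: datetime,
             end_to_end_encrypted: bool, text_message: bool) -> bool:
    """Check whether a communication falls within an order's scope."""
    if end_to_end_encrypted or text_message:
        return False  # E2EE communication and text messages are excluded
    if not (order.valid_from <= now <= order.valid_until):
        return False  # orders are time-limited
    return user_id in order.target_ids  # only suspected users or groups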
Support for victims and survivors
The proposal includes the establishment of an EU Centre for Child Protection. The centre would receive, filter, assess, and forward reports of child sexual abuse content to competent national authorities and Europol. It would also support national authorities, conduct investigations and issue fines.
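As a rough sketch of the centre's receive-filter-assess-forward workflow (the report fields and routing keys are invented for illustration; the proposal defines no such schema):

```python
from dataclasses import dataclass

@dataclass
class Report:
    # Hypothetical report record received by the EU Centre.
    content_id: str
    actionable: bool       # outcome of the filter/assessment step
    member_state: str      # competent national authority to forward to

def triage(reports: list[Report]) -> dict[str, list[Report]]:
    """Filter out unactionable reports and route the rest to the competent
    national authority (in practice, reports would also go to Europol)."""
    routed: dict[str, list[Report]] = {}
    for report in reports:
        if not report.actionable:  # filter step
            continue
        routed.setdefault(report.member_state, []).append(report)  # forward step
    return routed
```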
The Commission’s proposal includes specific rights for victims to request information on online material depicting them and the right to request the removal of this content. Parliament expands these rights to include the right to receive support and assistance from the EU Centre for Child Protection as well as authorities at the national level.
Next steps
In November 2023, Parliament adopted its negotiating mandate for the new law on fighting and preventing child sexual abuse online. This forms the basis for negotiations with EU countries to determine the final text of the regulation.
The temporary rules exempting digital companies from e-privacy rules when they look for child sexual abuse material were set to expire in August 2024. To avoid a legal vacuum, Parliament and the Council agreed in February 2024 to extend the derogation until April 2026. This provisional agreement was formally adopted by Parliament in April 2024.
At the same time, they aim to reach an agreement on the long-term legal framework and avoid further extensions to the temporary derogation.