EU lawmakers have agreed on draft rules requiring Alphabet (NASDAQ:GOOG) (NASDAQ:GOOGL) unit Google, Meta Platforms (NASDAQ:META) and other online service providers to detect and remove online child pornography, noting that end-to-end encryption would not be affected, Reuters reported.
The draft rule on child sexual abuse material, or CSAM, proposed by the European Commission in 2022, has been a matter of debate between those advocating online safety measures and privacy activists concerned about surveillance, the report added.
The EU executive brought in the CSAM proposal because the current system of voluntary identification and reporting by companies has not proved sufficient to protect children.
EU lawmakers still have to negotiate the final details with member states before the draft can become law, which could happen in 2024.
As per the draft, messaging services, app stores and internet access providers would have to report and remove known and new CSAM, such as images and videos, as well as cases of grooming, the report noted.
On Nov. 14, the draft Parliament position was adopted by the Committee on Civil Liberties, Justice and Home Affairs with 51 votes in favour, 2 against, and 1 abstaining.
As per the draft, the new rules would require internet providers to assess whether there is a significant risk of their services being misused for online child sexual abuse or for the solicitation of children, and to take measures to mitigate those risks.
Members of the European Parliament, or MEPs, want these measures to be targeted and effective, with providers able to decide which ones to use. They also want to ensure that pornographic sites have adequate age verification systems, flagging mechanisms for CSAM and human content moderation to process these reports.
To stop minors from being solicited online, the MEPs also proposed that services targeting children should, by default, require user consent for unsolicited messages, offer blocking and muting options, and boost parental controls.
To avoid mass surveillance or generalised monitoring of the internet, the draft law would allow judicial authorities, as a last resort, to authorise time-limited orders to detect CSAM and take it down or disable access to it, when mitigation measures have proved ineffective, the EU said in the Nov. 14 press release.
MEPs also excluded end-to-end encryption from the scope of the detection orders to ensure that all users’ communications remain secure and confidential.
The companies would also be able to select the technology used to identify such offences, but the technology would be subject to an independent, public audit.
The law would also set up an EU Centre for Child Protection to help implement the rules and support internet providers in detecting CSAM. The centre would collect and distribute CSAM reports to competent national authorities and Europol.