A new report has found that over 1,700 websites in the EU may contain unreported child sexual abuse material (CSAM).
The troubling findings come from a recent study conducted by experts at Surfshark. The investigation was global in scale, and it recorded an increase in CSAM reports filed to authorities: between 2020 and 2022, there were approximately 83 million reports, with 3.1 million of those coming from EU countries.
The report surfaced just days after a group of tech companies, Surfshark included, penned an open letter urging EU ministers to withdraw a proposed anti-CSAM regulation that could enable authorities to scan all citizens' private communications for dangerous content. As a VPN service provider, Surfshark seeks to raise important questions about how to address this growing problem without intruding on people's privacy, by analyzing currently available tech solutions.
Online Security Endangered for Children
Lina Survila, a spokesperson for Surfshark, shared, “There might be thousands of unreported websites containing CSAM at any given moment. According to our study, there could be as many as 1,720 websites in the EU alone. It’s frightening to think about how many websites containing CSAM are currently live in the rest of the world and have not yet been reported.”
In their research, Surfshark experts examined the extent of the child exploitation issue online across the EU and globally.
Within Europe, Poland appears to have the most significant CSAM problem, accounting for 16% of EU cases (269 unreported local harmful websites). France follows with 260 potentially dangerous websites, Germany with 158, Hungary with 152, and Italy with 110.
Internationally, Asia raises the greatest concern over children's online safety: two-thirds of the 83 million CSAM reports filed between 2020 and 2022 were attributed to the continent. India accounts for nearly 16% of these reports (over 13 million), followed by the Philippines with 7.1 million, Pakistan with 5.4 million, and Indonesia and Bangladesh with 4.7 million each.
To compile this data set, researchers used 2020-2022 open-source information from the National Center for Missing and Exploited Children (NCMEC). These resources were then compared with data reported by the Communications Regulatory Authority of Lithuania (RRT). More details of Surfshark's methodology can be found in the full study.
Innovative Tech for Privacy-Friendly Solutions
An essential part of Surfshark's study rests on the RRT findings. In 2022, the national regulator conducted an experiment in collaboration with proxy service provider Oxylabs to illustrate how new technology can combat CSAM in a way that preserves privacy.
The company developed a new AI-powered tool capable of crawling the web to identify illegal content. It assesses image metadata and checks for matches in the police database, then runs the images through a machine-learning model that can detect pornographic material.
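The two-stage pipeline described above, a database match followed by a machine-learning classifier, can be sketched in simplified form. This is a conceptual illustration only: the hash set, the `classify` stub, and the threshold are hypothetical stand-ins, since real systems rely on perceptual hashing (such as PhotoDNA) and trained image models that are not public.

```python
import hashlib

def sha256_hex(data: bytes) -> str:
    """Exact-match fingerprint of the raw image bytes."""
    return hashlib.sha256(data).hexdigest()

# Hypothetical stand-in for a police hash database. Real deployments use
# perceptual hashes (robust to resizing/recompression), not exact hashes.
KNOWN_HASHES = {sha256_hex(b"known-bad-sample")}

def classify(data: bytes) -> float:
    """Stub for an ML classifier returning a 'harmful' probability.
    A real deployment would call a trained image model here; this
    sketch always returns 0.0 because no model is available."""
    return 0.0

def triage(image_bytes: bytes, threshold: float = 0.9) -> str:
    """Two-stage triage: database match first, then ML scoring."""
    if sha256_hex(image_bytes) in KNOWN_HASHES:
        return "known-match"          # exact hit: report to authorities
    if classify(image_bytes) >= threshold:
        return "flagged-for-review"   # model is confident: human review
    return "clear"
```

The design choice matters for privacy: hash matching only reveals whether an image is already in a known database, so the bulk of scanned content is never "understood" by the system, and only flagged items reach human reviewers.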
The pro bono project lasted two months and examined approximately 300,000 Lithuanian websites. The tool identified 19 local websites breaching national or EU laws, leading to eight police reports and two pre-trial investigations.
According to Survila, the Oxylabs experiment exemplifies how tech innovation can support authorities in stopping child sexual abuse online. She expressed, "While there is no universal solution, the proactive measures taken by some governments can serve as a guiding model for others to address these complex challenges."
In October last year, the EU Parliament reached a historic agreement, calling for the removal of the Chat Control clause from the EU Child Sexual Abuse Material (CSAM) Scanning Proposal. Emphasizing privacy as a fundamental right, the decision aims to safeguard online security and encryption. Nevertheless, each EU Member State must now agree on its own position. Ministers expect to reach an agreement by March.
The Chat Control proposal, now under debate among EU ministers, appears to be taking a different direction that experts warn could jeopardize citizens' security. They stress that scanning chats is not only an attack on encryption that violates people's privacy but could also create a backdoor that criminals can exploit.
Survila further commented that before contemplating such laws, which could be intrusive even though they are envisioned to tackle broader online dangers, "an individual's privacy rights should be non-negotiable, and other potential tools should be explored to fight abusive content online."
Her opinion aligns with the notion that governments should initially explore less invasive tools, such as web scraping, to identify and combat publicly accessible dangerous content.
Denas Grybauskas, the Head of Legal at Oxylabs, believes that while the European Commission (EC) recognizes that such an intrusion into people's privacy should only be allowed as a last resort, it is essential to thoroughly discuss tech-driven alternatives.
Grybauskas remarked, “We remain open to forming new collaborations with researchers, academia, and public organizations aiming to solve crucial research questions and missions using public web data.”