{"id":79691,"date":"2024-05-02T17:48:21","date_gmt":"2024-05-02T17:48:21","guid":{"rendered":"https:\/\/neclink.com\/index.php\/2024\/05\/02\/eu-plan-to-force-messaging-apps-to-scan-for-csam-risks-millions-of-false-positives-experts-warn\/"},"modified":"2024-05-02T17:48:21","modified_gmt":"2024-05-02T17:48:21","slug":"eu-plan-to-force-messaging-apps-to-scan-for-csam-risks-millions-of-false-positives-experts-warn","status":"publish","type":"post","link":"https:\/\/neclink.com\/index.php\/2024\/05\/02\/eu-plan-to-force-messaging-apps-to-scan-for-csam-risks-millions-of-false-positives-experts-warn\/","title":{"rendered":"EU plan to force messaging apps to scan for CSAM risks millions of false positives, experts warn"},"content":{"rendered":"<div>\n<p id=\"speakable-summary\">A controversial push by European Union lawmakers to legally require messaging platforms to scan citizens\u2019 private communications for child sexual abuse material (CSAM) could lead to millions of false positives per day, hundreds of security and privacy experts warned in an <a href=\"https:\/\/nce.mpi-sp.org\/index.php\/s\/eqjiKaAw9yYQF87\" target=\"_blank\" rel=\"noopener\">open letter<\/a> Thursday.<\/p>\n<p>Concern over the EU proposal has been building since the Commission proposed the CSAM-scanning plan <a href=\"https:\/\/techcrunch.com\/2022\/05\/11\/eu-csam-detection-plan\/\">two years ago<\/a> \u2014 with independent experts,<a href=\"https:\/\/techcrunch.com\/2023\/10\/26\/eu-lawmakers-agree-on-key-detection-limits-in-controversial-csam-scanning-file\/\">\u00a0lawmakers across the European Parliament<\/a> and even <a href=\"https:\/\/techcrunch.com\/2023\/10\/24\/eu-csam-scanning-edps-seminar\/\">the bloc\u2019s own Data Protection Supervisor<\/a> among those sounding the alarm.<\/p>\n<p>The EU proposal would not only require messaging platforms that receive a CSAM detection order to scan for <em>known<\/em> CSAM; they would also have to use unspecified detection 
scanning technologies to try to pick up unknown CSAM and identify grooming activity as it\u2019s taking place \u2014 leading to accusations of lawmakers indulging in magical-thinking levels of technosolutionism.<\/p>\n<p>Critics argue the proposal asks the technologically impossible and will not achieve the stated aim of protecting children from abuse. Instead, they say, it will wreak havoc on Internet security and web users\u2019 privacy by forcing platforms to deploy blanket surveillance of all their users via risky, unproven technologies such as client-side scanning.<\/p>\n<p>Experts say there is no technology capable of achieving what the law demands without causing far more harm than good. Yet the EU is ploughing on regardless.<\/p>\n<p>The latest open letter addresses amendments to the draft CSAM-scanning regulation recently proposed by the European Council, which the signatories argue fail to address fundamental flaws with the plan.<\/p>\n<p>Signatories to the letter \u2014 numbering 270 at the time of writing \u2014 include hundreds of academics, including well-known security experts such as Professor Bruce Schneier of Harvard Kennedy School and Dr. Matthew D. 
Green of Johns Hopkins University, along with a handful of researchers working for tech companies such as IBM, Intel and Microsoft.<\/p>\n<p>An earlier <a href=\"https:\/\/docs.google.com\/document\/d\/13Aeex72MtFBjKhExRTooVMWN9TC-pbH-5LEaAbMF91Y\/edit\" target=\"_blank\" rel=\"noopener\">open letter<\/a> (last July), signed by 465 academics, warned that the detection technologies the proposal would force platforms to adopt are \u201cdeeply flawed and vulnerable to attacks\u201d, and would lead to a significant weakening of the vital protections provided by end-to-end encrypted (E2EE) communications.<\/p>\n<h2>Little traction for counter-proposals<\/h2>\n<p><a href=\"https:\/\/techcrunch.com\/2023\/10\/26\/eu-lawmakers-agree-on-key-detection-limits-in-controversial-csam-scanning-file\/\">Last fall<\/a>, MEPs in the European Parliament united to push back with a substantially revised approach \u2014 which would limit scanning to individuals and groups who are already suspected of child sexual abuse; limit it to known and unknown CSAM, removing the requirement to scan for grooming; and remove any risks to E2EE by limiting it to platforms that are not end-to-end encrypted. But the European Council, the other co-legislative body involved in EU lawmaking, has yet to take a position on the matter, and where it lands will influence the final shape of the law.<\/p>\n<p>The latest amendment on the table was put out in March by the Belgian Council presidency, which is leading discussions on behalf of EU Member States\u2019 governments. 
But in the open letter the experts warn this proposal still fails to tackle fundamental flaws baked into the Commission approach, arguing that the revisions still create \u201cunprecedented capabilities for surveillance and control of Internet users\u201d and would \u201cundermine\u2026 a secure digital future for our society and can have enormous consequences for democratic processes in Europe and beyond.\u201d<\/p>\n<p>Tweaks up for discussion in the amended Council proposal include a suggestion that detection orders can be more targeted by applying risk categorization and risk mitigation measures; and cybersecurity and encryption can be protected by ensuring platforms are not obliged to create access to decrypted data and by having detection technologies vetted. But the 270 experts suggest this amounts to fiddling around the edges of a security and privacy disaster.<\/p>\n<p>From a \u201ctechnical standpoint, to be effective, this new proposal will also completely undermine communications and systems security\u201d, they warn. 
Relying on \u201cflawed detection technology\u201d to determine cases of interest, so that more targeted detection orders can be sent, won\u2019t reduce the risk of the law ushering in a dystopian era of \u201cmassive surveillance\u201d of web users\u2019 messages, in their analysis.<\/p>\n<p>The letter also tackles a proposal by the Council to limit the risk of false positives by defining a \u201cperson of interest\u201d as a user who has already shared CSAM or attempted to groom a child \u2014 which it\u2019s envisaged would be done via an automated assessment, such as waiting for one hit for known CSAM, or two for unknown CSAM\/grooming, before the user is officially detected as a suspect and reported to the EU Centre, which would handle CSAM reports.<\/p>\n<h2>Billions of users, millions of false positives<\/h2>\n<p>The experts warn this approach is still likely to lead to vast numbers of false alarms.<\/p>\n<p>\u201cThe number of false positives due to detection errors is highly unlikely to be significantly reduced unless the number of repetitions is so large that the detection stops being effective. 
Given the large amount of messages sent in these platforms (in the order of billions), one can expect a very large amount of false alarms (in the order of millions),\u201d they write, pointing out that the platforms likely to end up slapped with a detection order can have millions or even billions of users, such as Meta-owned WhatsApp.<\/p>\n<p>\u201cGiven that there has not been any public information on the performance of the detectors that could be used in practice, let us imagine we would have a detector for CSAM and grooming, as stated in the proposal, with just a 0.1% False Positive rate (i.e., one in a thousand times, it incorrectly classifies non-CSAM as CSAM), which is much lower than any currently known detector.<\/p>\n<p>\u201cGiven that WhatsApp users send 140 billion messages per day, even if only 1 in hundred would be a message tested by such detectors, there would be 1.4 million false positives every single day. To get the false positives down to the hundreds, statistically one would have to identify at least 5 repetitions using different, statistically independent images or detectors. And this is only for WhatsApp \u2014 if we consider other messaging platforms, including email, the number of necessary repetitions would grow significantly to the point of not effectively reducing the CSAM sharing capabilities.\u201d<\/p>\n<p>Another Council proposal to limit detection orders to messaging apps deemed \u201chigh-risk\u201d is a useless revision, in the signatories\u2019 view, as they argue it\u2019ll likely still \u201cindiscriminately affect a massive number of people\u201d. 
Here they point out that only standard features, such as image sharing and text chat, are required for the exchange of CSAM \u2014 features that are widely supported by many service providers, meaning a high risk categorization will \u201cundoubtedly impact many services.\u201d<\/p>\n<p>They also point out that adoption of E2EE is increasing, which they suggest will increase the likelihood of services that roll it out being categorized as high risk. \u201cThis number may further increase with the interoperability requirements introduced by the Digital Markets Act that will result in messages flowing between low-risk and high-risk services. As a result, almost all services could be classified as high risk,\u201d they argue. (NB: <a href=\"https:\/\/techcrunch.com\/2022\/03\/24\/dma-political-agreement\/\">Message interoperability<\/a> is a core plank of the <a href=\"https:\/\/techcrunch.com\/2024\/03\/07\/europes-dma-rules-for-big-tech-explained\/\">EU\u2019s DMA<\/a>.)<\/p>\n<h2>A backdoor for the backdoor<\/h2>\n<p>As for safeguarding encryption, the letter reiterates the message that security and privacy experts have been repeatedly yelling at lawmakers for years now: \u201cDetection in end-to-end encrypted services by definition undermines encryption protection.\u201d<\/p>\n<p>\u201cThe new proposal has as one of its goals to \u2018protect cyber security and encrypted data, while keeping services using end-to-end encryption within the scope of detection orders\u2019. As we have explained before, this is an oxymoron,\u201d they emphasize. \u201cThe protection given by end-to-end encryption implies that no one other than the intended recipient of a communication should be able to learn any information about the content of such communication. 
Enabling detection capabilities, whether for encrypted data or for data before it is encrypted, <em>violates the very definition of confidentiality provided by end-to-end encryption<\/em>.\u201d<\/p>\n<p>In <a href=\"https:\/\/techcrunch.com\/2024\/04\/22\/e2ee-police-chiefs-lawful-access\/\">recent weeks<\/a> police chiefs across Europe have penned their own joint statement \u2014 raising concerns about the expansion of E2EE and calling for platforms to design their security systems in such a way that they can still identify illegal activity and send reports on message content to law enforcement.<\/p>\n<p>The intervention is widely seen as an attempt to put pressure on lawmakers to pass laws like the CSAM-scanning regulation.<\/p>\n<p>Police chiefs deny they\u2019re calling for encryption to be backdoored but they haven\u2019t explained exactly which technical solutions they do want platforms to adopt to enable the sought-after \u201clawful access\u201d. Squaring that circle puts a very wonky-shaped ball back in lawmakers\u2019 court.<\/p>\n<p>If the EU continues down the current road \u2014 assuming the Council fails to change course, as MEPs have urged it to \u2014 the consequences will be \u201ccatastrophic\u201d, the letter\u2019s signatories go on to warn. \u201cIt sets a precedent for filtering the Internet, and prevents people from using some of the few tools available to protect their right to a private life in the digital space; it will have a chilling effect, in particular to teenagers who heavily rely on online services for their interactions. 
It will change how digital services are used around the world and is likely to negatively affect democracies across the globe.\u201d<\/p>\n<p>An EU source close to the Council was unable to provide insight into current discussions between Member States but confirmed that the proposal for a regulation to combat child sexual abuse will be discussed at a working party meeting on May 8.<\/p>\n<\/div>\n<p><a href=\"https:\/\/techcrunch.com\/2024\/05\/02\/eu-csam-scanning-council-proposal-flaws\/\">Source link <\/a><\/p>\n","protected":false},"excerpt":{"rendered":"<p>A controversial push by European Union lawmakers to legally require messaging platforms to scan citizens\u2019 private communications for child sexual abuse material (CSAM) could lead<\/p>\n","protected":false},"author":1,"featured_media":79692,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"_monsterinsights_skip_tracking":false,"_monsterinsights_sitenote_active":false,"_monsterinsights_sitenote_note":"","_monsterinsights_sitenote_category":0,"footnotes":""},"categories":[178],"tags":[],"class_list":["post-79691","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-tech"],"aioseo_notices":[],"_links":{"self":[{"href":"https:\/\/neclink.com\/index.php\/wp-json\/wp\/v2\/posts\/79691","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/neclink.com\/index.php\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/neclink.com\/index.php\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/neclink.com\/index.php\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/neclink.com\/index.php\/wp-json\/wp\/v2\/comments?post=79691"}],"version-history":[{"count":0,"href":"https:\/\/neclink.com\/index.php\/wp-json\/wp\/v2\/posts\/79691\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/neclink.com\/inde
x.php\/wp-json\/wp\/v2\/media\/79692"}],"wp:attachment":[{"href":"https:\/\/neclink.com\/index.php\/wp-json\/wp\/v2\/media?parent=79691"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/neclink.com\/index.php\/wp-json\/wp\/v2\/categories?post=79691"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/neclink.com\/index.php\/wp-json\/wp\/v2\/tags?post=79691"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}