{"id":36745,"date":"2022-06-17T06:29:00","date_gmt":"2022-06-17T06:29:00","guid":{"rendered":"https:\/\/cloudnewshub.com\/archives\/36745"},"modified":"2022-06-17T06:29:00","modified_gmt":"2022-06-17T06:29:00","slug":"what-the-eus-content-filtering-rules-could-mean-for-uk-tech","status":"publish","type":"post","link":"https:\/\/cloudnewshub.com\/?p=36745","title":{"rendered":"What the EU\u2019s content-filtering rules could mean for UK tech"},"content":{"rendered":"<div><img decoding=\"async\" src=\"http:\/\/cloudnewshub.com\/wp-content\/uploads\/2022\/06\/what-the-eus-content-filtering-rules-could-mean-for-uk-tech.jpg\" class=\"ff-og-image-inserted\"><\/div>\n<p>On 11 May 2022, the <a href=\"https:\/\/ec.europa.eu\/info\/index_en\">European Commission<\/a> released a proposal for a regulation <a href=\"https:\/\/eur-lex.europa.eu\/legal-content\/EN\/TXT\/?uri=COM%3A2022%3A209%3AFIN&amp;qid=1652451192472\">laying down rules to prevent and combat child sexual abuse<\/a>. The regulation would establish preventative measures against child sexual abuse material (CSAM) being distributed online.<\/p>\n<p>Although the UK is no longer part of the <a href=\"https:\/\/european-union.europa.eu\/index_en\">European Union<\/a> (EU), any UK companies wishing to operate within the world\u2019s largest trading bloc will need to abide by EU standards. As such, this regulation would have an enormous impact on online communications services and platforms in the UK and around the world.<\/p>\n<p>Some online platforms already detect, report and remove online CSAM. However, such measures vary between providers, and the EU has decided that voluntary action alone is insufficient. 
Some EU member states have proposed or adopted their own legislation to tackle online CSAM, but this could fragment the EU\u2019s vision of a united <a href=\"https:\/\/eufordigital.eu\/discover-eu\/eu-digital-single-market\/\">Digital Single Market<\/a>.<\/p>\n<p>This is not the first time that content scanning has been attempted. In 2021, <a href=\"https:\/\/www.apple.com\/uk\/\">Apple<\/a> proposed <a href=\"https:\/\/www.computerweekly.com\/news\/252504970\/Apple-unveils-plans-to-scan-US-iPhones-for-child-sex-abuse-images\">scanning<\/a> owners\u2019 devices for CSAM using client-side scanning (CSS). This would allow CSAM filtering to be conducted without breaching <a href=\"https:\/\/www.techtarget.com\/searchsecurity\/definition\/end-to-end-encryption-E2EE\">end-to-end encryption<\/a>. However, the <a href=\"https:\/\/www.computerweekly.com\/news\/252508198\/Apple-scheme-to-detect-child-abuse-creates-serious-privacy-and-security-risks-say-scientists\">backlash<\/a> against this proposal led to the idea being postponed indefinitely.<\/p>\n<p>At its core, the EU regulation will require \u201crelevant information society services\u201d to enact the following measures (Article 1):<\/p>\n<ul class=\"default-list\">\n<li>Minimise the risk that their services are misused for online child sexual abuse.<\/li>\n<li>Detect and report online child sexual abuse.<\/li>\n<li>Remove or disable access to child sexual abuse material on their services.<\/li>\n<\/ul>\n<p>Article 2 describes \u201crelevant information society services\u201d as any of the following:<\/p>\n<ul class=\"default-list\">\n<li>Online hosting service \u2013 a hosting service that consists of the storage of information provided by, and at the request of, a recipient of the service.<\/li>\n<li>Interpersonal communications service \u2013 a service that enables direct interpersonal and interactive exchange of information via electronic communications networks between a finite number of persons, whereby the persons 
initiating or participating in the communication determine its recipient(s), including those provided as an ancillary feature that is intrinsically linked to another service.<\/li>\n<li>Software application stores \u2013 online intermediation services, which are focused on software applications as the intermediated product or service.<\/li>\n<li>Internet access services \u2013 a publicly available electronic communications service that provides access to the internet, and thereby connectivity to virtually all end-points of the internet, irrespective of the network technology and terminal equipment used.<\/li>\n<\/ul>\n<p>The regulation would establish the EU Centre to create and maintain databases of indicators of online CSAM. These databases would be used by information society services in order to comply with the regulation. The EU Centre would also act as a liaison to <a href=\"https:\/\/www.europol.europa.eu\/\">Europol<\/a>, by first filtering any reports of CSAM that are unfounded \u2013 \u201cWhere it is immediately evident, without any substantive legal or factual analysis, that the reported activities do not constitute online child sexual abuse\u201d \u2013 and then forwarding the others to Europol for further investigation and analysis.<\/p>\n<section class=\"section main-article-chapter\" data-menu-title=\"Fundamental rights\">\n<h3 class=\"section-title\"><i class=\"icon\" data-icon=\"1\"><\/i>Fundamental rights<\/h3>\n<p>A major concern about this regulation is that the <a href=\"https:\/\/www.techtarget.com\/searchsecurity\/definition\/content-filtering\">content filtering<\/a> of private messages would impinge on users\u2019 rights to privacy and freedom of expression. The regulation does not merely propose scanning the <a href=\"https:\/\/www.techtarget.com\/whatis\/definition\/metadata\">metadata<\/a> of messages, but the content of all messages for any offending material. 
\u201cThe European Court of Justice has made it clear, time and time again, that a mass surveillance of private communications is unlawful and incompatible with fundamental rights,\u201d says Felix Reda, an expert in copyright and freedom of communication for <a href=\"https:\/\/freiheitsrechte.org\/en\/\">Gesellschaft f\u00fcr Freiheitsrechte<\/a>.<\/p>\n<p>These concerns are acknowledged in the proposed regulation, which states: \u201cThe measures contained in the proposal affect, in the first place, the exercise of the fundamental rights of the users of the services at issue. Those rights include, in particular, the fundamental rights to respect for privacy (including confidentiality of communications, as part of the broader right to respect for private and family life), to protection of personal data and to freedom of expression and information.\u201d<\/p>\n<p>However, the proposed regulation also considers that none of these rights should be absolute. It states: \u201cIn all actions relating to children, whether taken by public authorities or private institutions, the child\u2019s best interests must be a primary consideration.\u201d<\/p>\n<p>There is also the issue of the potential erroneous removal of material \u2013 due to material being mistakenly identified as child sexual abuse material \u2013 which can have a significant impact on a user\u2019s fundamental rights of freedom of expression and access to information.<\/p>\n<\/section>\n<section class=\"section main-article-chapter\" data-menu-title=\"Enacting the regulation\">\n<h3 class=\"section-title\"><i class=\"icon\" data-icon=\"1\"><\/i>Enacting the regulation<\/h3>\n<p>Article 10 (1) of the proposed regulation states: \u201cProviders of hosting services and providers of interpersonal communication services that have received a detection order shall execute it by installing and operating technologies to detect the dissemination of known or new child sexual abuse material or the 
solicitation of children, as applicable.\u201d<\/p>\n<p>However, unlike previous regulations, the necessary technical measures for establishing how online platforms can meet the requirements are not outlined in the proposed regulation. Instead, it gives platforms and providers flexibility in how they implement these measures, so the regulatory obligations can be embedded effectively within each service.<\/p>\n<p>\u201cYou notice in the introduction that it doesn\u2019t necessarily well define what a provider is and it doesn\u2019t necessarily define how well one has to scan things,\u201d says Jon Geater, CTO of <a href=\"https:\/\/www.rkvst.com\">RKVST<\/a>.<\/p>\n<p>According to Article 10 (3), once a detection order has been issued, the content filters will be expected to meet these criteria:<\/p>\n<ul class=\"default-list\">\n<li>Detect the dissemination of known or new CSAM or the solicitation of children.<\/li>\n<li>Not extract any information other than what is necessary for the purposes of detection.<\/li>\n<li>Operate in accordance with the state of the art in the industry and be the least intrusive in terms of the impact on the users\u2019 rights to private and family life.<\/li>\n<li>Be sufficiently reliable, such that false positives are minimised.<\/li>\n<\/ul>\n<p>But in order to detect CSAM or solicitation of children, content scanning of every communication would be required. The current proposal does not define what is considered to be a \u201csufficiently reliable\u201d benchmark for minimal false positives. \u201cIt\u2019s not feasible for us or anybody else to be 100% effective, and it\u2019s probably not very sensible for everybody to try their own attempt at doing it,\u201d says Geater.<\/p>\n<p>To help businesses meet these new regulatory obligations, the EU Centre will offer detection technologies free of charge. These will be intended for the sole purpose of executing the detection orders. 
This is explained in Article 50 (1), which states: \u201cThe EU Centre shall make available technologies that providers of hosting services and providers of interpersonal communications services may acquire, install and operate, free of charge, where relevant subject to reasonable licensing conditions, to execute detection orders in accordance with Article 10(1).\u201d<\/p>\n<p>Should a provider or platform choose to develop their own detection systems, Article 10 (2) states: \u201cThe provider shall not be required to use any specific technology, including those made available by the EU Centre, as long as the requirements set out in this Article are met.\u201d<\/p>\n<p>Although these detection technologies will be freely offered, the regulation nonetheless places huge demands on social media providers and communication platforms. Providers will be required to ensure human oversight, through analysing anonymised representative data samples. \u201cWe view this as a very specialist area, so we have a third-party supplier who provides scanning tools,\u201d says Geater.<\/p>\n<p>According to Article 24 (1), any technology company that comes under the purview of \u201crelevant information society services\u201d operating within the EU will require a legal representative within one of the EU\u2019s member states. At the very least, this could be a team of solicitors as the point of contact.<\/p>\n<p>Any platform or service provider that fails to comply with this regulation will face penalties of up to 6% of its annual income or global turnover. Supplying incorrect, incomplete or misleading information, as well as failing to revise said information, will result in penalties of up to 1% of annual income or global turnover. 
Any periodic penalty payments could be up to 5% of average daily global turnover.<\/p>\n<\/section>\n<section class=\"section main-article-chapter\" data-menu-title=\"Concerns remain\">\n<h3 class=\"section-title\"><i class=\"icon\" data-icon=\"1\"><\/i>Concerns remain<\/h3>\n<p>One aspect that is particularly concerning is that there are no exemptions for different types of communication. Legal, financial and medical information that is shared online within the EU will be subject to scanning, which could lead to confidentiality and security issues.<\/p>\n<p>In October 2021, a <a href=\"https:\/\/arxiv.org\/abs\/2110.07450\">report into CSS<\/a> by a group of experts, including Ross Anderson, professor at the <a href=\"https:\/\/www.cam.ac.uk\/\">University of Cambridge<\/a>, was published on the open-access website <a href=\"https:\/\/arxiv.org\/\">arXiv<\/a>. The report concluded: \u201cIt is unclear whether CSS systems can be deployed in a secure manner such that invasions of privacy can be considered proportional. More importantly, it is unlikely that any technical measure can resolve this dilemma while also working at scale.\u201d<\/p>\n<p>Ultimately, the regulation will place significant demands on social media platforms and internet-based communication services. It will especially impact smaller companies that do not have the necessary resources or expertise to accommodate these new regulatory requirements.<\/p>\n<p>Although service providers and platforms could choose not to operate within EU countries, thus negating these requirements, this approach is likely to be self-defeating because it would cut a company off from a massive user base. It would also raise ethical questions if a company were seen to be avoiding the issue of CSAM being distributed on its platform. 
It is also likely that similar legislation will be put in place elsewhere, especially in any country wishing to harmonise its legislation with the EU.<\/p>\n<p>It would therefore be prudent to mitigate the impact of this proposed regulation by preparing for the expected obligations and having the appropriate policies and resources in place, enabling businesses to swiftly adapt to this new regulatory environment and manage the financial impact.<\/p>\n<\/section>\n","protected":false},"excerpt":{"rendered":"<p>On 11 May 2022, the European Commission released a proposal for a regulation laying down rules to prevent and combat child sexual abuse. The regulation would establish preventative measures against child sexual abuse material (CSAM) being distributed online. Although the UK is no longer part of the European Union (EU), any UK companies wishing [&hellip;]<\/p>\n","protected":false},"author":1,"featured_media":36746,"comment_status":"open","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[533],"tags":[],"class_list":["post-36745","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-it"],"_links":{"self":[{"href":"https:\/\/cloudnewshub.com\/index.php?rest_route=\/wp\/v2\/posts\/36745","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/cloudnewshub.com\/index.php?rest_route=\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/cloudnewshub.com\/index.php?rest_route=\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/cloudnewshub.com\/index.php?rest_route=\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/cloudnewshub.com\/index.php?rest_route=%2Fwp%2Fv2%2Fcomments&post=36745"}],"version-history":[{"count":0,"href":"https:\/\/cloudnewshub.com\/index.php?rest_route=\/wp\/v2\/posts\/36745\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/cloudnewshub.com\/index.php?rest_route=\/wp\/v2\/media\/36
746"}],"wp:attachment":[{"href":"https:\/\/cloudnewshub.com\/index.php?rest_route=%2Fwp%2Fv2%2Fmedia&parent=36745"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/cloudnewshub.com\/index.php?rest_route=%2Fwp%2Fv2%2Fcategories&post=36745"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/cloudnewshub.com\/index.php?rest_route=%2Fwp%2Fv2%2Ftags&post=36745"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}