Law A beginner’s guide to EU rules on scanning private communications - Your totalitarian nightmare - come true

( article 1 | archive )
( article 2 | archive )
( article 3 | archive )

A beginner’s guide to EU rules on scanning private communications: Part 2


In 2021, the European Union (EU) institutions responsible for making laws agreed to pass the temporary derogation from certain provisions of the ePrivacy Directive. This new law allows certain companies to scan everyone’s private messages and chats, even though such practices may be incompatible with the EU’s human rights and data protection laws.

In the first part of this blog series, we explored how changes to definitions in the European Electronic Communications Code (EECC) led to a situation of panic, which may have enabled the European Commission to push through the temporary law despite so many concerns having been raised.

The temporary derogation will expire in August 2024, and the EU wants to replace it before then with a ‘long term’ version. The long-term proposal is currently scheduled for 30 March 2022, although its publication has already been pushed back several times, meaning that time is ticking. In this blog, we take a look at what could be coming up in the new proposal, and how the investigation of online CSAM should be done in order to meet the standards required by EU law.

The Commission’s plans for Chat Control:

European Commissioner Ylva Johansson, the Commissioner responsible for EU laws and policy on Migration and Home Affairs, has spoken repeatedly with the press to emphasise the hard line that she is taking. She recently met with almost a dozen US-based tech companies about her plans, warning them:

“I will propose to make it [the automated scanning of private communications] mandatory so you better shape up and start realising that this is going to happen”.
What the Commissioner is saying is that the new proposal will take even further-reaching steps than the voluntary scanning currently permitted by the short-term law (and which the European Parliament have pointed out might already be unlawful).

Given that Members of the European Parliament (MEPs) have warned that the short-term law lacks a legal basis and would probably be invalidated if it were taken to court, it is deeply concerning that the Commissioner wants to put forward rules which will take an even more extreme stance against the privacy of the EU’s 447 million inhabitants. There have even been fears that the proposal might seek to undermine encryption, which is a vital technology that we all rely on every day, for example to make online bank transactions, to communicate with our doctor and even for governments to protect intelligence.

The Commission’s plans have been dubbed “Chat Control” by Patrick Breyer, a human rights lawyer and MEP, because they seek to automatically scan the chats, messages and web-based emails of every person in the EU (including young people). In effect, this would mandate the surveillance and control of all our private communications by Big Tech companies like Facebook, enabled by proprietary scanning tools from Microsoft and other companies, which we are therefore unable to externally audit.

We fear that the Commission are poised to propose a ‘solution’ which lacks a legal basis, will compel corporations to use secretive technology to invade all of our messages and chats, may remove our ability to choose privacy-respecting messaging services, will put our devices at an enhanced risk of hacking, and may constitute mass surveillance.
Furthermore, for a complex and controversial law like this one – with potentially enormous consequences on people’s rights and liberties – it is important that the process of negotiating the proposal is not rushed. Unlike what happened with the temporary derogation, it is vital that MEPs are given ample time to perform their role of democratic scrutiny, and are not silenced from voicing concerns by accusations that they aren’t committed to protecting children.

Is there a rights-respecting way to investigate online CSAM?

Democracy and the rule of law are founded on rights such as good administration and the presumption of innocence, as well as principles such as accountability and due process. These rules underpin the proper functioning of our justice systems. They ensure that vital evidence can be admissible in court and that cases don’t fall apart because a suspect was mistreated. As a result, they make it more likely that justice can be achieved for victims. They also protect human rights defenders, government critics and journalists from reprisals and ensure that we can all speak freely.

When it comes to detecting, investigating and prosecuting online CSAM, it is no different. Those who view or disseminate online child sexual abuse or exploitation are committing an egregious crime, and must be investigated and prosecuted for this. To do this, law enforcement agencies should tackle online CSAM in the same way that they tackle any other case: receiving reports, following leads, singling out suspects, conducting investigations into those suspects and building up evidence in a lawful way.

In certain cases, they might apply for a court order to covertly intercept the phone calls, messages or letters of a suspect – which is acceptable (as long as they can justify that such a move is necessary, proportionate and lawful, of course). The investigation of serious crimes does not, however, mean that governments can take any measure at any cost. The sensitivity of the topic of CSAM cannot be used to silence voices that call for police investigations to be conducted in a necessary, proportionate and lawful way. Nor should the legal responsibility for the spread of such content be outsourced to service providers, making them responsible for content that is a matter for law enforcement agencies.

In a democratic society, law enforcement cannot cast a wide net of surveillance ‘just in case’ they might find a crime. Governments can intrude on people’s privacy only if they have a very good reason to do so, such as that person being individually suspected of a crime that justifies that particular intrusion. This is vital for protecting each and every one of us from state over-reach and arbitrary investigations, for making sure that we are not unfairly targeted or discriminated against, and for ensuring that decisions made by law enforcement can be investigated in the event that wrongdoing is alleged. This protects suspects, witnesses and victims, as well as the police officers themselves, by creating a paper trail for accountability purposes.

A holistic approach should also tackle the issue in its grave context: child sexual abuse and exploitation does not exist because of digital technologies, even though the internet exacerbates its spread and the ease with which such material can be created. In fact, in 2017, the European Parliament reported that countries across the EU have failed to implement a series of measures which were adopted in a 2011 Directive to tackle the issue of child sexual abuse and exploitation. If countries still systemically fail to follow existing measures, then additional laws are at best premature.

Given the lack of implementation of the 2011 Directive, and in the absence of a legal basis for the short-term derogation, will the Commission see sense and propose only lawful, targeted, open-source methods and techniques for investigating online CSAM?
Keep your eyes peeled for the third installment of this blog, where we will outline our 10 principles for scanning private communications in the EU in a way that respects fundamental rights. And mark the 30th of March in your diary because – if the Commission continues to ignore civil society concerns – it could be the end of privacy in the EU as we know it.



EU chat control law will ban open source operating systems


The proposed EU Chat Control law will not only seize totalitarian control of all private communication. It will also ban open source operating systems as an unintended consequence.

The EU is currently in the process of enacting the chat control law. It has been criticized for creating an EU-wide centralized mass surveillance and censorship system and enabling government eavesdropping on all private communication. But one little-discussed consequence of the proposed law is that it would make practically all existing open source operating systems illegal, including all major Linux distributions. It would also effectively ban the F-Droid open source Android app archive.

Article 6 of the law requires all "software application stores" to:

  • Assess whether each service provided by each software application enables human-to-human communication
  • Verify whether each user is over or under the age of 17
  • Prevent users under 17 from installing such communication software
Leaving aside how crazy the stated intentions are or the details of what software would be targeted, let's consider the implications for open source software systems.

A "software application store" is defined by Article 2[*] to mean "a type of online intermediation services, which is focused on software applications as the intermediated product or service".

This clearly covers the online software archives almost universally used by open source operating systems since the 1990s as their main method of application distribution and security updates. These archives are often created and maintained by small companies or volunteer associations. They are hosted by hundreds of organizations such as universities and internet service providers all over the world. One of the main ones, the volunteer-run Debian package archive, currently contains over 170,000 software packages.

These software archive services are not constructed around a concept of an individual human user with an identity or an account. They are serving anonymous machines, such as a laptop, a server or an appliance. These machines then might or might not be used by individual human users to install applications, entirely outside the control of the archive services.
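To make the point concrete, here is a minimal sketch (not part of the quoted article) of everything a package archive actually learns about a "user" when a machine fetches an index file. The URL is Debian's public mirror network; the User-Agent string is an illustrative assumption. The request is an anonymous HTTP GET with a path and a couple of generic headers, and no account, login or age information anywhere in sight.

```python
# Sketch only: what an apt-style client sends to a public package archive.
# The exact URL and User-Agent below are illustrative assumptions.
import urllib.request

URL = "http://deb.debian.org/debian/dists/stable/Release"

req = urllib.request.Request(URL, headers={"User-Agent": "example-apt-like-client"})

# Everything the archive receives: a method, a URL and generic headers.
# There is no user account, no identity and certainly no date of birth.
print(req.get_method(), req.full_url)
print(req.header_items())

# The same index is served identically to laptops, servers and appliances.
with urllib.request.urlopen(req, timeout=30) as resp:
    release_head = resp.read(2048).decode("utf-8", errors="replace")

print("\n".join(release_head.splitlines()[:8]))
```

Nothing here is specific to Debian; any distribution mirror, or the F-Droid repository index, is served in essentially the same way to anyone who asks.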

Even to be conceptually and theoretically able to obey this law would require a total redesign of how software is installed, sourced and updated, major organizational restructuring, and the scrapping, centralizing and rebuilding of the software distribution infrastructure.

This is of course only theoretical as the costs and practical issues would be insurmountable.

If and when this law goes into effect, it would make illegal the open source software services underpinning the majority of services and infrastructure on the internet, untold numbers of appliances and the computers used by software developers, among many other things. To comply with the law, all of it would have to shut down, globally, as the servers providing software and security updates can't tell the difference between a web server, a Japanese software developer, a refrigerator and an EU teenager.

It may seem unbelievable that the authors of the law didn't think about this, but it is not that surprising considering that this is just one of the many gigantic consequences of this sloppily thought-out and written law.

[*] To define a software application store the law makes a reference to the EU Digital Markets Act, Article 2, point 12 which defines “virtual assistant”. What they actually mean is point 14, which does define “software application store”.


Chat Control: The EU’s CSEM scanner proposal


The End of the Privacy of Digital Correspondence​

The EU Commission proposes to oblige providers to search all private chats, messages, and emails automatically for suspicious content – generally and indiscriminately. The stated aim: To prosecute child sexual exploitation material (CSEM). The result: Mass surveillance by means of fully automated real-time surveillance of messaging and chats and the end of privacy of digital correspondence.

Other aspects of the proposal include ineffective network blocking, screening of personal cloud storage including private photos, mandatory age verification resulting in the end of anonymous communication, app store censorship and the exclusion of minors from the digital world.

Chat Control 2.0 on every smartphone

On 11 May 2022 the European Commission presented a proposal which would make chat control searching mandatory for all e-mail and messenger providers and would even apply to so far securely end-to-end encrypted communication services. Prior to the proposal a public consultation had revealed that a majority of respondents, both citizens and stakeholders, opposed imposing an obligation to use chat control. Over 80% of respondents opposed its application to end-to-end encrypted communications.

Currently a regulation is in place allowing providers to scan communications voluntarily (so-called “Chat Control 1.0”). So far only some unencrypted US communications services such as GMail, Facebook/Instagram Messenger, Skype, Snapchat, iCloud email and X-Box apply chat control voluntarily (more details here).

The Chat Control 2.0 Proposal

This is what the current proposal actually entails:

Envisaged are chat control, network blocking, mandatory age verification for communication and storage apps, age verification for app stores and the exclusion of minors from installing many apps.

  • Proposal: The communication services affected include telephony, e-mail, messenger, chats (also as part of games, on dating portals, etc.) and videoconferencing.
    Consequence: Texts, images, videos and speech (e.g. video meetings, voice messages, phone calls) would have to be scanned.
  • Proposal: End-to-end encrypted messenger services are not excluded from the scope.
    Consequence: Providers of end-to-end encrypted communications services will have to scan messages on every smartphone (client-side scanning) and, in case of a hit, report the message to the police.
  • Proposal: Hosting services affected include web hosting, social media, video streaming services, file hosting and cloud services.
    Consequence: Even personal storage that is not being shared, such as Apple’s iCloud, will be subject to chat control.
  • Proposal: Services that are likely to be used for illegal material or for child grooming are obliged to search the content of personal communication and stored data (chat control) without suspicion and indiscriminately.
    Consequence: Since presumably every service is also used for illegal purposes, all services will be obliged to deploy chat control.
  • Proposal: The authority in the provider’s country of establishment is obliged to order the deployment of chat control.
    Consequence: There is no discretion in when and to what extent chat control is ordered.
  • Proposal: Chat control involves automated searches for known CSEM images and videos; suspicious messages/files will be reported to the police.
    Consequence: According to the Swiss Federal Police, 80% of the reports they receive (usually based on the method of hashing) are criminally irrelevant. Similarly, in Ireland only 20% of NCMEC reports received in 2020 were confirmed as actual “child abuse material”.
  • Proposal: Chat control also involves automated searches for unknown CSEM pictures and videos; suspicious messages/files will be reported to the police.
    Consequence: Machine searching for unknown abuse representations is an experimental procedure using machine learning (“artificial intelligence”). The algorithms are not accessible to the public and the scientific community, nor does the draft contain any disclosure requirement. The error rate is unknown and is not limited by the draft regulation. Presumably, these technologies result in massive amounts of false reports. The draft legislation allows providers to pass on automated hit reports to the police without humans checking them.
  • Proposal: Chat control involves machine searches for possible child grooming; suspicious messages will be reported to the police.
    Consequence: Machine searching for potential child grooming is an experimental procedure using machine learning (“artificial intelligence”). The algorithms are not available to the public and the scientific community, nor does the draft contain a disclosure requirement. The error rate is unknown and is not limited by the draft regulation; presumably, these technologies result in massive amounts of false reports.
  • Proposal: Communication services that can be misused for child grooming (thus all of them) must verify the age of their users.
    Consequence: In practice, age verification involves full user identification, meaning that anonymous communication via email, messenger, etc. will effectively be banned. Whistleblowers, human rights defenders and marginalised groups rely on the protection of anonymity.
  • Proposal: App stores must verify the age of their users and block children/young people from installing apps that can be misused for solicitation purposes.
    Consequence: All communication services such as messenger apps, dating apps or games can be misused for child grooming and would be blocked for children/young people to use.
  • Proposal: Internet access providers can be obliged to block access to prohibited and non-removable images and videos hosted outside the EU by means of network blocking (URL blocking).
    Consequence: Network blocking is technically ineffective and easy to circumvent, and it results in the construction of a technical censorship infrastructure.
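The false-report figures quoted above (80% of reports criminally irrelevant in Switzerland, only 20% confirmed in Ireland) are what base-rate arithmetic predicts when an automated scanner is pointed at everyone's messages. The back-of-the-envelope sketch below uses purely illustrative assumptions, not figures from the proposal or from any detection vendor; the point is only that even a seemingly accurate classifier flags mostly innocent content when actual illegal material is rare.

```python
# Sketch of why indiscriminate scanning produces mostly false reports.
# Prevalence, sensitivity and false-positive rate are illustrative assumptions.
messages_scanned = 1_000_000_000   # messages scanned per day (assumed)
prevalence = 1e-6                  # fraction that is actually illegal (assumed)
sensitivity = 0.95                 # chance a true positive is flagged (assumed)
false_positive_rate = 1e-4         # chance an innocent message is flagged (assumed)

true_positives = messages_scanned * prevalence * sensitivity
false_positives = messages_scanned * (1 - prevalence) * false_positive_rate
precision = true_positives / (true_positives + false_positives)

print(f"true positives flagged:  {true_positives:,.0f}")
print(f"false positives flagged: {false_positives:,.0f}")
print(f"share of flags that are correct: {precision:.1%}")
# With these assumptions, roughly 99% of flagged messages are innocent.
```

Even cutting the assumed false-positive rate by an order of magnitude still leaves the large majority of flags pointing at innocent messages, which is the same effect the Swiss and Irish figures describe.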







How did we get here? – A timeline​

2020: The European Commission proposed “temporary” legislation allowing for chat control

The proposed “temporary” legislation allows the searching of all private chats, messages, and emails for illegal depictions of minors and attempted initiation of contacts with minors. This allows the providers of Facebook Messenger, Gmail, et al. to scan every message for suspicious text and images. This takes place in a fully automated process, in part using error-prone “artificial intelligence”. If an algorithm considers a message suspicious, its content and metadata are disclosed (usually automatically and without human verification) to a private US-based organization and from there to national police authorities worldwide. The reported users are not notified.

6 July 2021: The European Parliament adopted the legislation allowing for chat control.

The European Parliament voted in favour of the ePrivacy Derogation, which allows for voluntary chat control by messaging and email providers. As a result of this, some U.S. providers of services such as Gmail and Outlook.com are already performing such automated messaging and chat controls.

9 May 2022: Member of the European Parliament Patrick Breyer filed a lawsuit against U.S. company Meta.

According to the case-law of the European Court of Justice the permanent and comprehensive automated analysis of private communications violates fundamental rights and is prohibited (paragraph 177). Former judge of the European Court of Justice Prof. Dr. Ninon Colneric has extensively analysed the plans and concludes in a legal assessment that the EU legislative plans on chat control are not in line with the case law of the European Court of Justice and violate the fundamental rights of all EU citizens to respect for privacy, to data protection and to freedom of expression. On this basis the lawsuit was filed.

11 May 2022: The Commission presented a proposal to make chat control mandatory for service providers.

On 11 May 2022 the EU Commission made a second legislative proposal, in which it obliges all providers of chat, messaging and e-mail services to deploy this mass surveillance technology even in the absence of any suspicion. However, a representative survey conducted in March 2021 clearly shows that a majority of Europeans oppose the use of chat control (detailed poll results here).

8 May, 22 June, 5 July, 20 July, 6 September, 22 September, 5 October, 19 October, 3 November, 24 November 2022:
The proposal was discussed in the Council’s Law Enforcement Working Party

  • 28 September: Council workshop on detection technologies
  • 10 October: The proposal was presented and discussed in the European Parliament’s lead LIBE Committee (video recording)
  • 16 November: Council workshop on age verification and encryption
  • 30 November: First LIBE Shadows meeting
  • 14 December: LIBE Shadows meeting (Hearings)
  • 10 January 2023: LIBE Shadows meeting (Hearings)

Timetable of the negotiations in the Parliament

  • 19 & 20 January 2023: Law Enforcement Working Party Police meeting
  • 24 January 2023: LIBE Shadows meeting (Hearings)
  • April 2023: Tabling draft report
  • May 2023: Deadline for tabling amendments
  • June 2023: Negotiations of compromises after translation
  • September 2023: Vote in the LIBE Committee
  • October 2023: Vote in Plenary





The Negotiators​

The actors involved in the European Parliament: Rapporteur and shadow rapporteurs




How does this affect you?​

  • All of your chat conversations and emails will be automatically searched for suspicious content. Nothing remains confidential or secret. There is no requirement of a court order or an initial suspicion for searching your messages. It happens always and automatically.
  • If an algorithm classifies the content of a message as suspicious, your private or intimate photos may be viewed by staff and contractors of international corporations and police authorities. Your private nude photos may also be looked at by people not known to you, in whose hands your photos are not safe.
  • Flirts and sexting may be read by staff and contractors of international corporations and police authorities, because text recognition filters looking for “child grooming” frequently falsely flag intimate chats.
  • You can falsely be reported and investigated for allegedly disseminating child sexual exploitation material. Messaging and chat control algorithms are known to flag completely legal vacation photos of children on a beach, for example. According to Swiss federal police authorities, 80% of all machine-generated reports turn out to be without merit. Similarly in Ireland only 20% of NCMEC reports received in 2020 were confirmed as actual “child abuse material”. 40% of all criminal investigation procedures initiated in Germany for “child pornography” target minors.
  • On your next trip overseas, you can expect big problems. Machine-generated reports on your communications may have been passed on to other countries, such as the USA, where there is no data privacy – with incalculable results.
  • Intelligence services and hackers may be able to spy on your private chats and emails. The door will be open for anyone with the technical means to read your messages if secure encryption is removed in order to be able to screen messages.
  • This is only the beginning. Once the technology for messaging and chat control has been established, it becomes very easy to use it for other purposes. And who guarantees that these incrimination machines will not be used in the future on our smartphones and laptops?




What you can do​

Do you fear that this law would cause massive damage to fundamental rights and that it is the wrong approach?

What is important now is to increase pressure on the negotiators:
1) Reach out to your government, which is negotiating in the Council. Contact your government’s permanent representation or your Ministry of the Interior.
2) Get in touch with your Members of the European Parliament! The so-called shadow-rapporteurs have the lead on the negotiations. Here, you can find the contact details:
Javier Zarzalejos – rapporteur for the EPP group
Paul Tang – shadow rapporteur for the S&D group
Patrick Breyer – shadow rapporteur for the Greens/EFA group
Hilde Vautmans – shadow rapporteur for the Renew group
Annalisa Tardino – shadow rapporteur for the ID group
Vincenzo Sofo – shadow rapporteur for the ECR group
Cornelia Ernst – shadow rapporteur for the Left group
Politely tell them your concerns about chat control (arguments here). Experience shows that phone calls are more effective than e-mails or letters. The official name of the planned mandatory chat control law is “Proposal for a Regulation laying down rules to prevent and combat child sexual abuse”.
3) Talk about it! Inform others about the dangers of chat control. Here, you can find tweet templates, share pics and videos. Of course, you can also create your own images and videos.
4) Generate attention on social media! Use the hashtags #chatcontrol and #secrecyofcorrespondence.
5) Generate media attention! So far very few media have covered the messaging and chat control plans of the EU. Get in touch with newspapers and ask them to cover the subject – online and offline.
6) Ask your e-mail, messaging and chat service providers! Avoid Gmail, Facebook Messenger, outlook.com and the chat function of X-Box, where indiscriminate chat control is already taking place. Ask your email, messaging and chat providers if they generally monitor private messages for suspicious content, or if they plan to do so.




OP Editorializing:

The EU, in its infinite wisdom, has proposed a law to:
  • force service providers to scan the content you post online or share via messaging/chat apps for potential banned media/speech
  • force service providers to require government ID
  • force service providers to deny minors under 17 access to chat apps, potentially leaving them only with SMS
  • require Europol to audit your ISP-collected data for "potential CSAM and other prior offences" retroactively, where possible
For those who remember, Apple came under fire for its intent to do this in the US as well, but once the "It's a private company man, it can do whatever it wants" people came out of the woodwork, it was all swept under the rug and attention shifted elsewhere. This might sound familiar, and you won't be wrong.

This is, among the many other malicious intents you can read in the proposal, a classic "Won't someone think of the children" play that the bureaucrats in Brussels will attempt to use in order to go after their actual goal: stripping away freedom of speech and taking totalitarian control over online resources.

Cast a huge net, then kill off the fish inside one by one.


The consequences of this being implemented are huge. Say goodbye to anonymity, privacy, free speech, open software repositories, a lot of free services and much more.


I HATE THE ANTICHRIST
I HATE THE EU
 
I’m late to the party, but this article by that dumb bitch Ella Jakubowska (twitter) is so misleading and retarded that I don’t even know where to start.

To keep my rant short: These people do not care about “digital rights” or free speech or anything like that. They are just yet another run-of-the-mill leftist NGO sustaining itself by fearmongering and e-begging - just like the EFF and countless others. They are perfectly fine with restricting free speech and other rights if it suits their agenda. For example, the only critical thing they wrote about the upcoming DSA (Digital Services Act) is that it didn’t go far enough - a law which, among other regulations, requires big platforms to take down all sorts of speech eurocucks consider “misinformation” or “hate speech”. This is a much bigger threat to free speech on the internet than anything else currently on the table, including the proposal in this thread.


What Ella and EDRi usually advocate for:
3/11 MEPs must ensure a full ban on all types of #PredictivePolicing. Place-based predictive policing leads to automated racial profiling & redirecting police to already oversurveilled areas where mostly working class & racialised groups live.

'Emotion recognition' technologies are built on a chilling history of racism. Along with biometric categorisation, these segregationist systems are the hidden face of biometric mass surveillance.
 