Since the attacks on the World Trade Center in 2001 and in Madrid in 2004, parts of global security policy have been on the wrong track. The idea of capturing every data trace, scanning it and piling up huge mountains of data has repeatedly been written into law, only to fail time and again before the European Court of Justice and national constitutional courts.

We are now seeing the same pattern at the EU Commission, led by German Commission President Ursula von der Leyen. The Commission's draft regulation on preventing and combating child abuse online exceeds even the worst expectations. It is a frontal attack on civil rights in the digital space and effectively undermines the digital secrecy of correspondence. Upload filters are also making a comeback.

Under the Commission's proposal, providers of messengers and other communication services, as well as hosting providers, could be ordered to search communications for specific content and to filter new content. Removal orders and network blocking are also planned. "Client-side scanning" and other forms of "chat control" are meant to avoid formally breaking the end-to-end encryption of messages.

However, just as the internet blocking plans of 2009 and 2010 failed, this proposal will not become reality either. Europe is not China; here the end does not justify every means, and what the Commission is planning would destroy digital postal secrecy and render encrypted communication pointless. The mere possibility that content scanning and filtering can be ordered is enough to abolish the confidentiality of communication de facto. After all, implementing a scanning or filtering order requires the technical ability to search content in plain text. Providers would have to build that capability into their services, and it is incompatible with effective encryption.
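To make the technical objection concrete, here is a deliberately simplified sketch in Python of where client-side scanning sits relative to end-to-end encryption. The names (KNOWN_HASHES, report_to_authority, encrypt_end_to_end) are hypothetical placeholders, and real proposals rely on perceptual hashes or machine-learning classifiers rather than simple hash lookups; the point is only that the message must be inspected in plain text on the device before it is encrypted.

```python
import hashlib

# Hypothetical database of hashes of known illegal material (simplified:
# real systems would use perceptual hashes or ML classifiers, not SHA-256).
KNOWN_HASHES = {"3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b"}

def report_to_authority(plaintext: bytes) -> None:
    # Placeholder: in a real deployment, a match (and potentially the
    # plaintext itself) would be forwarded to the provider or authorities.
    print("match reported")

def encrypt_end_to_end(plaintext: bytes, recipient_key: bytes) -> bytes:
    # Placeholder for the actual end-to-end encryption step.
    return plaintext  # not real encryption; illustration only

def send_message(plaintext: bytes, recipient_key: bytes) -> bytes:
    # 1. Client-side scanning: the device inspects the message in plain
    #    text BEFORE any encryption happens.
    digest = hashlib.sha256(plaintext).hexdigest()
    if digest in KNOWN_HASHES:
        report_to_authority(plaintext)

    # 2. Only afterwards is the message encrypted end-to-end and sent.
    return encrypt_end_to_end(plaintext, recipient_key)
```

The encryption itself remains mathematically intact, but the scanning hook sees every message in plain text and could, by a new order, be pointed at any other kind of content. That is why the confidentiality the encryption was supposed to guarantee is gone.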

In a much-discussed paper from last year, IT security experts described mobile phones with client-side scanning as bugs ("bugs in your pocket") that can be switched on and off at any time. As early as March, on the originally announced date for presenting the draft regulation, well-known civil rights organizations wrote an open letter to the Commission pointing out the problems with "client-side scanning" and "chat control". In its statement on the draft, the Chaos Computer Club speaks of a "fundamentally misguided surveillance method".

Even those who do not want to listen to the "usual suspects" among civil rights advocates should at least take the perspective of practitioners and stakeholders to heart. In expert discussions, security authorities often raise doubts as to whether the proposed instruments would give them any advantage in combating child abuse. Today there is no shortage of data traces; what is lacking is the capacity to evaluate them promptly and systematically. In addition, many perpetrators no longer send large amounts of image material but link to content instead, and even the best technology cannot tell from a bare link what lies behind it. This is just one example of why the proposed methods are bound to remain largely unsuitable.

This is all the more serious when one considers that child sexual abuse is a real and growing danger. Initial studies suggest that cases of sexual abuse have increased during the coronavirus pandemic. Reports from recent weeks say that the European Union is the hub for the dissemination of abuse images on the internet. How to address these dire problems and translate insights into effective action is a major challenge.

The highly successful special unit "BAO Berg" (BAO stands for a special organizational structure) in North Rhine-Westphalia, set up in response to the horrific cases of child abuse in the state, is a model example of well-coordinated, adequately resourced action against an entire field of crime. The federal executive board of the German Child Protection Association has said that scanning encrypted communication without cause is disproportionate and not very effective; expanding such structures and a generally more visible police presence on the internet would be more helpful.

"The Return of Upload Filters and the End of Encryption" is not the title of a dystopian novel by George Orwell. The EU Commission appears to be quite serious about its proposals. If Germany does not vehemently advocate civil rights at the European level, in the Council and in the forthcoming negotiations, a similar debacle to the one over upload filters in the copyright debate could loom. A right to encryption, as promised in the German coalition agreement, could no longer be implemented if the Commission were to push its proposals through.