It is our job to define what that means in terms of content and implementation on Instagram, TikTok or YouTube. When, for example, is content shortened or simplified to such a degree that the result is no longer compatible with journalistic standards? Has this question been settled satisfactorily for all platforms? Certainly not. The bottom line is that ARD adheres to journalistic quality criteria in social media as well.

What we have no influence on, however, is the recommendation logic of the algorithms. Which content is shown to which users, and when, and in which context individual contributions are embedded, is under the control of the platform. This is the greater threat to balanced opinion formation and reporting.

Content from the Tagesschau, for example, can be found on Instagram, YouTube and TikTok, but also in the Tagesschau app and in the ARD media library – that is, on ARD’s own platforms, where presentation and recommendation logic are developed and controlled by ARD. Strengthening these platforms is the core of ARD’s digital strategy. We still want and need to be present on third-party platforms and in social networks – they are part of everyday media life, especially for younger people. But we use these platforms in a more targeted way, to promote our brands and our own offerings.

A concrete example: together with TikTok, ARD developed the ‘Bundestag Hub’ last year. The goal: to spark interest and understanding among very young people and to convey why it is important to vote.

Public broadcasters are not alone in striving to develop technology so that it helps people more than it harms them. One of the better-known thought leaders is Tristan Harris, who, tellingly, used to be a “design ethicist” at Google. He left Google and founded the “Center for Humane Technology,” an organization that aims to “advance more humane technology that promotes our well-being, democracy, and the free flow of information.”

Last year, ARD formulated the following in its ‘Key points for personalization’: “Unlike all commercial providers, we don’t have to sell anything to our users; unlike many commercial providers, we don’t earn anything from their data; and unlike some commercial providers, we don’t carry any content that is dubious or harmful to minors. Nor do we bind users by deliberately playing to existing opinions and emotions, which is what leads to the much-cited ‘filter bubbles’ on other platforms. Rather, creating diversity is part of our basic service mandate.” This also means that, in addition to algorithmic recommendations, there is always editorial control and curation. Other building blocks are voluntariness, transparency and individual control. To a considerable extent, users remain in control.

Nevertheless, the critical point remains that it is up to the platform to decide which content is displayed at all and which content is rated as relevant for users. It is well established that exaggerated representations and even fake news spread much faster. The legislature has responded: platform-regulating provisions have been part of the State Media Treaty since last year. For example, public-service content and public-value offerings must be guaranteed a certain degree of findability. There is also a regular and constructive exchange with the major platforms, which helps to build mutual understanding and to represent our interests.