Anyone who moves through online social networks today is served a great deal of professionally staged and organized content: self-presentations by influencers and social media editors, recycled media reports, advertising tailored to the individual's interests, and state-controlled disinformation campaigns.

The latter have become a normal part of Facebook, Twitter, YouTube and other platforms because they fit perfectly with the operators' technology and business model: the networks do everything they can to capture users' attention and reward their interactions, and they split users into the smallest of target groups for their advertising customers. The organizers of the campaigns work with exactly these attention and targeting mechanisms. That is why their fake accounts are in part barely distinguishable from legitimate users, and why detecting disinformation campaigns is a complex task, one for which even the large US companies need external help.

Transparency can reveal embarrassing weaknesses

In this respect, Twitter's release of tens of millions of tweets from Russian and Iranian troll factories is a correct and important step. Such transparency is not without risk for a publicly traded company: it can reveal what weaknesses the company has in controlling its own platform.

But now researchers and data journalists around the world can work through the material and try to understand how disinformation operates, what its goals are, and perhaps also how much success it has.

There is a lot to learn and to explain to the public: how the actors disguise themselves, for example, in order to remain undetected among ordinary users for as long as possible while at the same time building up as large an audience as they can. Or how they try to reinforce a political-ideological line in their own country while abroad they try to sow discord, deepen social divisions, influence elections and discredit elected politicians. Or which of their messages make it into other media, and which media content they exploit in the social networks for their own purposes. At best, such insights help all network users deal with the platforms more (self-)critically.

Disinformation is a feature, not a bug

The perpetrators, however, have been refining their methods for years, while the operators of the social networks are only now beginning to grasp what disinformation campaigns are: not an unsightly side effect, but a central use case of their platforms. A feature, not a bug.

But it is not enough to dump ten million tweets from the years 2013 to 2018 at the public's feet for review. Transparency after the fact is too little. External researchers need broader and permanent access to the platform operators' data. Facebook, for example, now has a "war room" in which two dozen of its employees are supposed to detect and analyze suspicious network activity in real time. That too is a useful step. It would be even better if independent experts worked there as well.