By Jan Kalbhenn, Department of Information, Telecommunication and Media Law, University of Münster.

A series of new laws in Germany addresses the impact of the platform economy on public discourse

I. The debate on societies’ future is decided online

Societies around the globe are facing major challenges. While the coronavirus pandemic currently tops the list, the debate on climate change will continue for a long time to come. Both issues affect the daily lives and the future of all citizens. In each case, it will be crucial to find solutions that enjoy broad support among citizens. Setting out the basic parameters for public discourse is therefore crucial.

According to the case law of the German Federal Constitutional Court, the press, TV and radio are “medium and factor” of the individual and public opinion-forming process. [1] Since they play a central role in the democratic decision-making process, radio and TV programmes are obliged to fulfil certain diversity standards and journalistic due diligence requirements; the latter apply to the press as well. These basic rules are intended to ensure that citizens are provided with trustworthy information on which commonly agreed political solutions can be built. As the number of people who inform themselves via social networks increases (a trend accelerated by the coronavirus pandemic), [2] public debate is shifting to digital platforms.

Digital communication platforms have become gatekeepers for news and information content as well as for public debate. [3] On Facebook, YouTube and similar services, the communication rules set up by private operators apply in addition to the general laws. These rules, incorporated in community standards and policed, inter alia, by bodies such as the Facebook Oversight Board, determine access to and retention of content as well as its weighting and ranking. However, conducting public debate according to the rules of profit-oriented corporations not only offers opportunities for participation but also involves potential dangers, as academic research has well established:

– algorithms following the logic of the attention economy endanger the diversity of opinion, [4]
– users face a lack of credibility and uncertainty regarding content, [5]
– minorities and other individuals are intimidated by hate speech online (“chilling effects”). [6]

A recent study by U.S. scientists predicts that, under present conditions, anti-vaccination views on Facebook will become dominant within about ten years. [7]

In two landmark judgments, the Federal Constitutional Court dealt with the digitization of the media and of spaces of public debate. In 2018, the court found that the digitization of the media, and especially the platform economy, has a diversity-narrowing effect: it makes it more difficult to separate facts from opinions, strengthens one-sided views, and creates new uncertainties with regard to evaluations and sources. [8] In 2019, the court emphasized the importance of the constitutional guarantee of freedom of expression on monopoly-like platforms: Facebook had to restore the account of a right-wing politician that had been deleted for repeated violations of community standards. Banning the politician’s account would have deprived her of a crucial space for making her political convictions known, which is of particular importance during election campaigns. [9]

II. New Digital Media Order

These findings must be taken into account when analysing the current legislative activities, which are intended to adapt the legal framework for the media to the conditions of digitization. In particular, public debate online is likely to be shaped by the new State Media Treaty [10] and the draft amendment of the Network Enforcement Act (NEA). [11] How is the legislator dealing with the dangers outlined above?

1. The problem of diversity [12]
The State Media Treaty acknowledges the relevance of digital communication platforms (e.g. Facebook, YouTube) for the opinion-forming process and thus for the diversity of opinions. The new Treaty refers to such platforms as media intermediaries: “Telemedia, which also present journalistic and editorial offers of third parties in an aggregated, selected and generally accessible way, without combining them into a complete offer.” [13] In order to safeguard the diversity of opinion, they must keep the “criteria on the access and retention of content as well as central criteria of aggregation, selection and presentation of content and their weighting, including information on the functionality of the algorithms used, easily perceptible, directly accessible and constantly available in comprehensible language”.

2. Trustworthiness of information [14]
The State Media Treaty also aims to improve the trustworthiness of information. The online outputs of the press are already subject to the duty of journalistic due diligence; traditionally, compliance is monitored by an institution of voluntary self-regulation (the German Press Council). Under the Treaty, the obligation to observe journalistic due diligence is now extended to all telemedia services that regularly contain news or political information. [15] This includes podcasts, Instagram and YouTube channels, and the like. In addition, social bots, i.e. computer programs that create automated content whose external appearance resembles human-made content, must be identified as such.

3. Hate speech online [16]
The problem of hate speech is addressed by the amendment to the Network Enforcement Act (NEA). The purpose of the NEA is to secure rational discourse: minorities should be able to participate without fear of hate speech. To this end, a compliance approach was initially pursued, i.e. setting time limits for the existing obligation of platform operators to delete evidently criminal content. Critics fear that this legal framework could give rise to over-blocking and a concomitant chilling effect. Moreover, the NEA only creates incentives for deletion, not for reinstatement in case of a wrongful deletion. So far, the anticipated over-blocking effects have not been proven. The amendment now includes ‘design specifications’ that alter the platforms’ architecture. They include specifications for a so-called counter-proposal procedure that strengthens the rights of users. The amendment also leads to greater involvement of civil society, which can participate in a new settlement procedure. Expanded transparency and reporting obligations seek to regulate the use of artificial intelligence to identify content.

III. Still to do: Disintermediation

All these changes in the law remain the subject of controversial discussion, but one thing is clear: the time when large digital platforms could operate largely under the radar of media regulators is over, at least for the time being. Statutory requirements relating to the overall design of the platforms are unavoidable if the foundations of public discourse online are to be secured. The amendment of the NEA leads the way, and the German legal framework for digital platforms will be further adapted. [17] Whether all these new rules will help to protect the integrity of online debates depends, among other things, on how another closely related problem is tackled: the disintermediation of the news media by the digital platforms. [18] It is doubtful that this will be solved by way of the EU Copyright Directive and the ancillary copyright it provides for, or by way of the liability of social networks for unlicensed content. [19] The problem of disintermediation could become even more complicated in the future, since digital platforms are increasingly producing their own editorial content. Here, Australia’s draft News Media Bargaining Code could serve as a model. It puts forward an innovative idea to rebalance power by requiring digital platforms to give news media businesses at least 28 days’ notice of certain algorithm changes. [20] It is to be hoped that this potentially game-changing requirement will also be reflected in the forthcoming Digital Services Act, which seeks to modernise the current EU legal framework for digital services.

References

[1] BVerfGE 12, 205, 260.

[2] http://www.digitalnewsreport.org/survey/2020/overview-key-findings-2020.

[3] ECLI:DE:BVerfG:2019:qk20190522.1bvq004219.

[4] Bayer/Bard, Hate speech and hate crime in the EU and the evaluation of online content regulation approaches, 114.

[5] https://cmpf.eui.eu/mpm2020-executive-summary/.

[6] Bayer, Disinformation and propaganda, 2019, 11.

[7] Johnson et al., The online competition between pro- and anti-vaccination views, https://www.nature.com/articles/s41586-020-2281-1#code-availability.

[8] ECLI:DE:BVerfG:2018:rs20180718.1bvr167516.

[9] ECLI:DE:BVerfG:2019:qk20190522.1bvq004219.

[10] https://ec.europa.eu/growth/tools-databases/tris/de/search/?trisaction=search.detail&year=2020&num=26

[11] http://www.bmjv.de/SharedDocs/Gesetzgebungsverfahren/Dokumente/RefE_NetzDGAendG.pdf?__blob=publicationFile&v=3

[12] Kalbhenn, https://cmpf.eui.eu/new-diversity-rules-for-social-media-in-germany/

[13] Definition in § 2 Abs. 2 Nr. 16 State Media Treaty.

[14] Holznagel/Kalbhenn, Journalistische Sorgfaltspflichten auf YouTube und Instagram (forthcoming).

[15] § 19 State Media Treaty.

[16] Kalbhenn/Hemmert-Halswick on design specifications, MMR 8/2020.

[17] In German antitrust law, among other things, the concept of “intermediary power” is to be established in order to take account of the mediating and controlling function of platforms. In addition, the “essential facilities doctrine” is to be revised, the Federal Cartel Office is to be given more effective control over certain large digital groups, and an antitrust claim to data access is to be created in certain constellations. The protection of minors is also to be adapted to the business model of the platforms. According to the draft Youth Protection Act, a new catalogue of obligations is to be created which requires commercial service providers who store or provide user-generated content to take “appropriate and effective precautionary measures” to protect minors from unsuitable content.

[18] Stigler Committee on digital platforms, final report, 9; Declaration by the Committee of Ministers on the financial sustainability of quality journalism in the digital age (Decl(13/02/2019)2).

[19] Directive 2019/790, recitals 54, 55.

[20] https://www.accc.gov.au/media-release/australian-news-media-to-negotiate-payment-with-major-digital-platforms