(Nov. 10, 2020) On September 3, 2020, the Austrian federal chancellor’s office published and introduced to the National Council (the lower house of the Austrian Parliament) a draft bill to fight hate on social media and to protect users of communication platforms. The Draft Federal Act on Measures to Protect Users on Communication Platforms (English translation) states that immediate national legislative action is necessary even before a common approach on the matter is reached within the European Union (EU). (Explanatory Notes on the Communication Platform Act at 1.) In addition, further initiatives on several related topics, such as the protection of personal rights, claims for injunctive relief and removal in this context, and questions of jurisdiction and procedure, were published and introduced to the National Council. (Federal Act Establishing Civil Legal and Civil Procedural Measures to Combat Hate on the Internet (English translation); Federal Act Establishing Penal and Media Policy Measures to Combat Hate on the Internet (English translation).)
Entry into Force
The draft provides that the act would enter into force in 2021 but does not specify an exact date. However, communication platform providers would be given four months to comply with the respective rules. Communication platforms established in the future would be given a three-month period to comply after starting their business. (Communication Platform Act § 14.)
The period for soliciting expert opinions on the bill ended on October 15, 2020, which means the federal chancellor’s office may now amend the draft bill and incorporate suggested changes. If the bill is approved by the federal government, it will again be passed on to the National Council for its first reading.
Scope of Application
The Communication Platform Act would apply to “communication platforms,” which are defined as “information society service[s], the main purpose or an essential function of which is to enable the exchange of messages or presentations with intellectual content in written, aural or visual form between users and a larger group of other users by way of mass dissemination.” Facebook or YouTube, for example, would fall under this definition. (Communication Platform Act § 2, no. 4.) The definition serves to differentiate such platforms from providers of individual communication services, such as WhatsApp. (Explanatory Notes at 4.)
The bill would apply only to communication platforms that had an average of 100,000 registered users in Austria in the previous quarter and a turnover of more than 500,000 euros (about US$593,642) in Austria in the previous year. (Communication Platform Act § 1, para. 2.) Certain communication platforms would be exempt, such as certain media companies that are already covered by specific legal requirements, as well as online trading platforms and not-for-profit online encyclopedias, even if they have a comment section. (§ 1, para. 3.)
All regulated communication platforms, including those incorporated in foreign countries, would be required to appoint a “responsible representative” to ensure compliance with domestic law and for service of process and cooperation with law enforcement authorities. (§ 5, para. 1.)
Removal of Illegal Content
Providers of communication platforms would need to ensure that an effective, transparent, and, above all, “user friendly” procedure is available to their users for handling allegedly illegal content. (§ 3, paras. 1 & 2.) When the illegality of the content is obvious on its face, even “to a legal layperson without further investigation,” posts would have to be removed or blocked without delay, at the latest within 24 hours of receiving notice. (§ 3, para. 3, no. 1 (a).) In contrast, if a comprehensive assessment of the illegality is necessary, the removal or blocking would have to be carried out without delay after that assessment, which in turn would have to take place within seven days of receiving a user’s notice. (§ 3, para. 3, no. 1 (b).)
The bill includes an exhaustive list of criminal offenses to determine whether specific content can be regarded as illegal within the Communication Platform Act’s scope of application and must be removed or blocked. (§ 2, no. 6.) This list includes, for example, the crimes of coercion, dangerous threats, stalking, blackmailing, or terrorist offenses. (Strafgesetzbuch §§ 105, 107, 107a, 107c, 113, 115, 120a, 144, 188, 207a, 208a, 278b, 278f, 282a, 283; Verbotsgesetz [National Socialism Prohibition Act 1947] §§ 3d, 3g, 3h.)
Complaint Mechanism and Reports
The draft would create a multi-tiered scrutinizing mechanism to determine whether a removal or blocking was correct. After a post has been reported, affected users would have to be informed of the outcome, the reasoning underlying the communication platform’s decision, and the exact time when the post was removed or blocked. (§ 3, para. 2.) Within two weeks of receiving this information, they would be able to request a review of the communication platform’s decision before commencing a complaints procedure, potentially followed by a supervisory review process. (§ 3, para. 4; § 7, para. 1; § 9, para. 1.)
To avoid “overblocking,” communication platform providers would be required to implement an effective and transparent review mechanism of their decisions to remove or block content. (§ 3, para. 4.)
Under specific conditions, including an inadequate reporting or review procedure or a failure to inform involved users of the possibility of a review or of a complaint procedure, users would be able to file their complaints with a complaints office. (§ 7, para. 1, sentence 1.) Nevertheless, affected users could commence such a complaint procedure only after contacting the communication platform and receiving no response, or if their dispute could not be settled. (§ 7, para. 1, sentence 2.)
The supervisory authority would initiate a supervisory procedure when more than five substantive complaint proceedings regarding the insufficiency of the communication platform’s measures have been initiated within one month’s time. (§ 9, para. 1.)
To achieve greater transparency, the bill provides that communication platforms would have to submit and publish periodic reports, usually annual ones, on how user reports of allegedly illegal content are being handled. (§ 4, para. 1.) However, communication platforms with more than 1 million registered users would have to submit quarterly reports. (§ 4, para. 1.)
Penalties

National authorities would be able to impose a fine on platforms that fail to comply with their obligation to appoint a responsible representative. (§ 6, para. 2 in conjunction with § 10, para. 1, no. 8.) Furthermore, the supervisory authority would be authorized to impose either certain rules of conduct or a fine. (§ 9, para. 2.) Depending on the severity of the violation, the fine could be as much as 10 million euros (about US$11.9 million). The specific amount would depend on various other factors, such as the service provider’s financial strength, previous violations, number of registered users, and cooperation in the respective proceedings. (§ 10, para. 2.)
Additionally, the responsible representatives themselves could be fined—for example, if they do not comply with the responsibilities set out in the bill. (§ 11.)
Reactions

Various bodies, including organizations that work with victims of hate on social media, legal experts, Austrian public authorities, and social media platforms, submitted statements regarding the draft act during the expert consultation. (List of submitted opinions.)
Facebook Ireland Limited generally welcomed the efforts of the Austrian legislature but nevertheless raised detailed questions about certain provisions, in particular the bill’s compliance with EU law, the specific period established for removal of posts, and the issue of “overblocking.”
Katharina Kucharowits, a member of parliament and spokesperson of the opposition Social Democratic Party of Austria (Sozialdemokratische Partei Österreichs – SPÖ), stated that she takes the criticism expressed by the experts very seriously and that the draft must be revised as a next step.
Prepared by Viktoria Fritz, Law Library intern, under the supervision of Jenny Gesley, Foreign Law Specialist.