Austria refers Facebook ‘Hate-Speech’ case to the CJEU

The Austrian Supreme Court (‘Oberste Gerichtshof’, OGH; case number 6Ob116/17b) has asked the Court of Justice of the European Union (CJEU) for clarification on the scope of Article 15(1) of the E-Commerce Directive and the host provider privilege.

Here are the facts of the case:

The plaintiff in this case is Dr Eva Glawischnig-Piesczek, an Austrian Green politician. In April 2016, a Facebook user operating under the fake alias ‘Michaela Jaskova’ posted an image of Glawischnig-Piesczek and made several insulting comments about her in German (“wretched traitor”, “corrupt clumsy oaf”, “member of a fascist party”). Facebook was requested to delete the image and the comments in July 2016, but failed to do so.

Glawischnig-Piesczek then obtained a preliminary injunction against Facebook, which obliged the social network not only to delete the image and the specific comments (making them inaccessible worldwide), but also to delete any future uploads of the image if it was accompanied by comments that were identical or similar in meaning to the original comments. Upon being served the injunction, Facebook blocked access to the original image and comments (limited to Austria) and appealed the decision. The court of second instance upheld the first-instance decision only in part: Facebook was now obliged to delete any future uploads of the image if it was accompanied by comments that were identical to the original wording, or if the comments were similar in meaning and Facebook had actual knowledge of them (e.g. via a subsequent notice from the plaintiff or a third party).

Both parties appealed the court of appeal’s decision, which brought the case to Austria’s highest civil court, the OGH.

The judges begin their decision with a detailed analysis of the comments made by ‘Michaela Jaskova’ and find them to be unlawful due to their defamatory and offensive nature and the complete lack of any factual basis for the statements. The court also states that the unlawfulness was evident, even to a layman without any legal background. Because of this, Facebook was required to delete the image and the comments upon obtaining actual knowledge of the post in July 2016. Facebook’s failure to do so deprived the social network of its liability privilege as a hosting provider under Article 14 of the E-Commerce Directive, which is implemented in § 16 ECG (the Austrian ‘E-Commerce-Law’).

Facebook is considered by the court to be an abettor to the unlawful comments and, as such, is obliged under Austrian law to refrain from any repetition of the infringement. Austrian jurisprudence on defamatory statements allows the plaintiff and the court to include statements in the injunction that are not identical, but similar in wording or meaning. This makes sense and is handled similarly in Germany: if only repetition of the identical statement were forbidden, the offender could easily circumvent the court’s decision by slightly altering his statement, and a new court order would be needed for each new statement.

Identical or similar?
However, according to the court, the claim for a ‘broad’ injunction covering statements that differ from the original one might conflict with Article 15 of the E-Commerce Directive. Article 15, which is implemented in § 18 ECG, provides that Member States shall not impose a general obligation on providers to monitor the information which they transmit or store, nor a general obligation actively to seek facts or circumstances indicating unlawful activity.

Indeed, an obligation for Facebook to pro-actively identify every future infringing post, including those that are different in wording, but similar in meaning, could result in an obligation to monitor all information which Facebook stores.

The judges appear uncertain whether such a judgment would result in a general obligation in the sense of Article 15. They turn to recital 48 of the E-Commerce-Directive, which states:

‘This Directive does not affect the possibility for Member States of requiring service providers, who host information provided by recipients of their service, to apply duties of care, which can reasonably be expected from them and which are specified by national law, in order to detect and prevent certain types of illegal activities.’

According to this, an order requiring the prevention of illegal activities is not completely excluded by Article 15(1) of the E-Commerce Directive, but may be made with regard to specific infringements.

Looking at McFadden, the court states that monitoring all of the information must be excluded from the outset as contrary to Article 15(1). However, the judges believe this statement from the CJEU is not applicable to the case at hand, since McFadden dealt with an access provider and not a hosting provider. This statement is a bit odd in its brevity, given that Article 15 applies to ‘mere conduit’ services as well, but the court provides no further explanation.

The judges appear more inclined to apply L’Oréal/eBay, which found that hosting providers can be ordered to take measures which contribute, not only to bringing to an end infringements of intellectual property rights, but also to preventing further infringements of that kind. This leaves open the question of what constitutes infringements ‘of that kind’. Are only infringements that are identical to the original infringement ‘of that kind’, or are similar infringements also included when it comes to ‘hate speech’?

The Vienna judges decided to ask the CJEU for clarification and have referred the following questions [be advised, this is a Kat-translation and not the official wording]:

Does Article 15(1) of Directive 2000/31/EC of the European Parliament and of the Council of 8 June 2000 on certain legal aspects of information society services, in particular electronic commerce, in the Internal Market (‘Directive on electronic commerce’) preclude a national court from making an order requiring a hosting provider who has failed to expeditiously remove illegal information to remove not only the specific information, but also other information that is identical in wording?

With regard to the first question, the court further asks whether Article 15(1) precludes such an order where it requires the hosting provider to remove such information (or block access to it) worldwide or only in the relevant Member State.

The court further asks whether Article 15(1) precludes such an order that is limited to removing or blocking access to the illegal information only from the specific user who posted the content, and whether such an order would apply worldwide or only in the relevant Member State.

The court next asks, if the previous questions are answered in the negative: does the same answer apply to information that is not identical in wording, but similar in meaning?

Finally, the court asks: does the same answer apply to information that is not identical in wording, but similar in meaning, once the host provider has actual knowledge of the information?

This Kat is very much looking forward to the CJEU’s answers. The referral touches on some very relevant questions. In Canada, the Supreme Court ruled in Google v. Equustek that the search engine had to delist certain results globally. In the case at hand, the plaintiff wants the offensive content to be removed from Facebook globally, while Facebook has so far only blocked access to it from Austria.

Also, the issue of proactive monitoring/content filtering by hosting providers is at the center of the discussion around Article 13 of the proposed DSM Directive.

This post will be updated once the CJEU publishes the questions on its website.

*EDIT* Here are the "official" questions:

Does Article 15(1) of Directive 2000/31/EC of the European Parliament and of the Council of 8 June 2000 on certain legal aspects of information society services, in particular electronic commerce, in the Internal Market (‘Directive on electronic commerce’) generally preclude any of the obligations listed below of a host provider which has not expeditiously removed illegal information, specifically not just this illegal information within the meaning of Article 14(1)(a) of the Directive, but also other identically worded items of information:

a.a. worldwide?
a.b. in the relevant Member State?
a.c. of the relevant user worldwide?
a.d. of the relevant user in the relevant Member State?

In so far as Question 1 is answered in the negative: Does this also apply in each case for information with an equivalent meaning?

Does this also apply for information with an equivalent meaning as soon as the operator has become aware of this circumstance?