
Obligations to remove content

Published on 21 January 2020

Eva Glawischnig-Piesczek v Facebook Ireland Ltd, Case C-18/18

The question

Can an online platform/host provider be required to remove content that is identical or “equivalent” to a post that has previously been declared illegal?

The key takeaway

A Member State can require an online platform to remove content that is identical or “equivalent” to content that has already been found to be illegal.

The background

In April 2016, a Facebook user shared an article about Eva Glawischnig-Piesczek, a member and chair of the Austrian Green Party, on their personal page. The post included a thumbnail of the article, consisting of a short description and a picture of Ms Glawischnig-Piesczek, together with a comment from the user about the article. This comment was found to be defamatory by the Austrian court and, as a result, Ms Glawischnig-Piesczek asked Facebook to delete it. Facebook refused.

Ms Glawischnig-Piesczek therefore issued a claim against Facebook and, in December 2016, the Vienna Commercial Court granted an interim injunction in her favour requiring Facebook to stop sharing photos of Ms Glawischnig-Piesczek with identical or “equivalent” accompanying text, and to remove the original defamatory post.

On appeal, the Higher Regional Court of Vienna upheld the judgment but found that Facebook would only need to prevent the dissemination of “equivalent” content where it had been informed of that content by Ms Glawischnig-Piesczek or another source. Neither party was satisfied with this judgment and both appealed to the Austrian Supreme Court, which referred the following questions regarding the E-commerce Directive (the Directive) to the CJEU:
  • Does Article 15(1) of the Directive prevent a Member State from ordering a host provider to take down content that has previously been declared illegal and other “identically worded items of information”?
  • If not, does this also apply in each case to information with an equivalent meaning?
  • Is the territorial scope of such an order limited?
  • Does this also apply to information with an equivalent meaning as soon as the operator becomes aware of it?

The decision

The CJEU replied as follows:
  • The Directive does not prevent a Member State from requiring the removal of content identical to content that has previously been declared illegal.
The CJEU considered that the speed with which information is shared on the internet means that illegal content can be replicated with ease, so it was reasonable for a Member State to be able to require the removal of access to such identical information. Further, the CJEU held that requiring such removal would not impose on providers the general monitoring obligation prohibited by Article 15(1) of the Directive.
  • “Equivalent” content should be covered by the injunction.
The CJEU recognised that, as statements are held to be defamatory because of their overall meaning rather than the specific words used, an injunction should cover content that conveys the same underlying message, even if the wording is not identical. However, so as not to impose too heavy a burden on host providers searching for such content, “equivalent” content must contain specific features of the infringing comment, such as the name of the individual concerned, and must be identifiable without the need to carry out an independent assessment of the content. Given Facebook’s technological capabilities, the CJEU held that this burden would not be too onerous.
  • There is no limitation on territorial scope.
The CJEU found that, subject to international law, Member States can make orders that have worldwide effect.
  • Given its answers to the first and second questions, the CJEU did not respond to the fourth question, as being informed of “equivalent” content would not impose a general monitoring obligation under Article 15(1).

Why is this important?

From an online platform/host provider’s perspective, the CJEU judgment highlights the current drive towards greater regulation of online platforms and content. The ruling attempts to balance the burden placed on host providers to search for replicated illegal content against the need to protect individuals’ rights. However, requiring host providers to take down “equivalent” content could prove difficult, as it may require human judgement rather than just advanced search tools. As regards worldwide orders, it will be interesting to see how these global injunctions work in practice and whether countries with broad censorship policies will take advantage of them to try to prevent news spreading abroad.

Any practical tips?

The CJEU ruling does not impose a general obligation on host providers to monitor for illegal content, but providers should be aware of their obligations once illegal content has been flagged to them.

Further, providers should put in place processes to identify and remove material that is identical or “equivalent” to content they have already removed.