Obligations to remove content
Eva Glawischnig-Piesczek v Facebook Ireland Ltd, Case C-18/18
The question
Can an online platform/host provider be required to remove identical or “equivalent” content that has previously been declared illegal?
The key takeaway
A Member State can require an online platform to remove content that is identical or “equivalent” to content that has already been found to be illegal.
The background
In April 2016, a Facebook user shared an article about Eva Glawischnig-Piesczek, a member and chair of the Austrian Green Party, on their personal page. The Facebook post included a thumbnail of the article consisting of a short description and a picture of Ms Glawischnig-Piesczek, together with a comment from the user about the article. This comment was found to be defamatory by the Austrian court and, as a result, Ms Glawischnig-Piesczek asked Facebook to delete the comment. Facebook refused.
As a result, Ms Glawischnig-Piesczek issued a claim against Facebook and in December 2016, the Viennese Commercial Court awarded an interim injunction in Ms Glawischnig-Piesczek’s favour stipulating that Facebook must stop sharing photos of Ms Glawischnig-Piesczek with identical or “equivalent” accompanying text (and that Facebook had to remove the original defamatory post).
This was appealed to the Higher Regional Court of Vienna, which upheld the judgment but found that Facebook would only need to prevent the dissemination of “equivalent” content if it had been informed of it by Ms Glawischnig-Piesczek or another source. This judgment satisfied neither party, both of whom appealed to the Austrian Supreme Court, which subsequently referred the following questions to the CJEU regarding the E-Commerce Directive (the Directive):
- Does Article 15(1) of the Directive prevent a Member State from ordering a host provider to take down content that has previously been declared as illegal and other “identically worded items of information”?
- If not, does this also apply in each case for information with an equivalent meaning?
- Is the territorial scope of an order of a Member State limited?
- Does this also apply for information with an equivalent meaning as soon as the operator has become aware of this circumstance?
The CJEU replied as follows:
- The Directive does not prevent a Member State from requiring removal of content that has previously been declared illegal.
- “Equivalent” content should be covered by the injunction.
- There is no limitation on territorial scope.
- Given its answers to questions 1 and 2, the CJEU did not respond to the fourth question, as being informed of the “equivalent” content would not impose a general obligation to monitor (under Article 15(1)).
From an online platform/host provider’s perspective, the CJEU judgment highlights the current drive towards greater regulation of online platforms and content. The ruling attempts to balance the burden placed on host providers to search for replicated illegal content against the need to protect individuals’ rights. However, requiring host providers to take down “equivalent” content could prove difficult, as it may require human judgement rather than just advanced search tools. As regards worldwide search orders, it will be interesting to see how these global injunctions work in practice and whether countries with broad censorship policies will take advantage of them to try to prevent news spreading abroad.
Any practical tips?
The CJEU ruling does not impose an obligation on host providers to monitor for illegal content, but providers should be aware of their obligations once illegal content has been flagged to them.
Further, providers should set up processes to identify and remove material that is identical or “equivalent” to content they have already removed.