New Obligations for Social Platforms

Austria

Social media has become indispensable for the exchange of information on the internet. On platforms where user-generated content takes centre stage, users frequently encounter racist, sexist or violent content. Legal action can be taken not only against the individual poster: in the fight against hate on the internet, website operators themselves are increasingly being called upon to act.

Successful lawsuit before the CJEU

In a decision that has been the subject of controversial commentary and interpretation, the CJEU recently addressed the question of how far the obligations of social media platforms can extend (CJEU 03.10.2019, C-18/18).

In the main proceedings, the Commercial Court of Vienna had issued a preliminary injunction prohibiting a social media platform from publishing certain defamatory comments about Eva Glawischnig-Piesczek, a former Austrian politician. Following a request for a preliminary ruling from the Austrian Supreme Court, the CJEU had to address several questions:

  • whether the respective social media platform can be obliged to delete such comments worldwide,
  • whether such comments need to be deleted even if they are posted by other users, and
  • whether the obligation to delete such comments extends to comments with the same meaning.

In its ruling, the Court of Justice of the European Union first held that a platform operator is under no general obligation to monitor postings or actively search for illegal content. However, since content spreads rapidly via social networks, a platform operator that becomes aware of illegal content can be ordered to remove not only the content posted by the user concerned but also identical content stored by other users, and to do so worldwide. In addition, content with the same meaning must also be deleted. In this context, the CJEU underlined that the platform operator is not required to carry out an independent assessment of such content but can resort to automated techniques and means of search.
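The ruling leaves open which "automated techniques" would satisfy the Court; it prescribes no particular technology. Purely as an illustration, the following Python sketch shows one naive way a platform could flag postings that are identical, or nearly identical, to a posting a court has found defamatory. All names, the reference text and the similarity threshold here are hypothetical, and real platforms would rely on far more sophisticated methods such as content hashing or machine-learning classifiers.

```python
# Hypothetical, illustrative sketch only: flag postings that are identical or
# nearly identical to a posting a court has declared unlawful. Real systems
# use far more robust techniques; plain string similarity merely makes the
# idea of "automated" detection concrete.
from difflib import SequenceMatcher

REFERENCE_POSTING = "example of a comment a court has declared defamatory"
SIMILARITY_THRESHOLD = 0.9  # arbitrary cut-off, chosen for illustration

def should_flag(posting: str, reference: str = REFERENCE_POSTING) -> bool:
    """Return True if `posting` matches the reference exactly or almost exactly."""
    if posting == reference:  # identical content, regardless of who posted it
        return True
    # Approximate "same meaning" with textual similarity -- a crude stand-in.
    ratio = SequenceMatcher(None, posting.lower(), reference.lower()).ratio()
    return ratio >= SIMILARITY_THRESHOLD

new_postings = [
    "example of a comment a court has declared defamatory",    # identical: flagged
    "Example of a comment a court has declared defamatory!!",  # near-identical: flagged
    "a completely unrelated and lawful comment",                # dissimilar: kept
]
for posting in new_postings:
    print(f"flag for removal: {should_flag(posting)} | {posting!r}")
```

Even this toy example hints at the trade-off discussed below: any fixed threshold will inevitably misclassify some lawful postings.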

Social media platforms called upon to take technical and organisational measures

Despite this limitation, such an obligation has far-reaching consequences for social media platforms. Companies will have to invest in software and filtering systems to meet their obligations. As even the toy sketch above suggests, such systems can erroneously remove content that is not illegal at all, a risk that critical voices describe as "censorship on the net".

Some commentators have argued that the practical consequences of the ruling are limited, because the obligation to delete postings only applies once a national court has declared the content unlawful. The wording of the decision may suggest this, but in practice social media operators will usually not be able to wait for a court to determine that content is unlawful. If they do not want to risk a worldwide injunction, they will have to respond immediately to substantiated user requests for the deletion of defamatory postings.

In addition, legislative measures against hate and violence on internet platforms are pending or under discussion. In Austria, for example, the previous government proposed a registration obligation for users of social platforms a few months ago. In Germany, the Network Enforcement Act (Act to Improve Law Enforcement in Social Networks) has, since 1 January 2018, obliged providers to remove or block "obviously illegal content" within 24 hours of receiving a complaint and to operate an effective complaints procedure; otherwise, fines in the millions can be imposed. However, providers only have to react to complaints and are not required to investigate on their own initiative.

As part of the reform of the Audiovisual Media Services Directive, which many countries have yet to transpose into national law, providers of video platforms are obliged to take "appropriate measures" to protect the public from content that incites violence or hatred. The Directive contains no obligation to pre-filter content. Rather, mechanisms must be put in place that allow users to report and flag illegal content. Video platforms include not only services such as YouTube but also social media platforms for which the provision of audiovisual content is an essential function of the service.

More obligations, fewer hate postings?

Whether these developments will lead to fewer hate postings remains to be seen. However, the legal measures and plans, as well as the recent ruling of the CJEU, indicate a clear trend: operators of social platforms will have to prepare for growing obligations in the future. Functioning reporting systems and rapid responses to complaints will be essential, especially in view of the sanctions at stake.