
In a period of ongoing modernization of European legislation concerning the European Digital Single Market, the regulation of online copyright is a continuing concern. The proposed new copyright directive (‘the Copyright Directive’) would bring far-reaching changes to European copyright law and has been heavily debated by the member states over the last two years. It has also been intensely criticized in the media.

On Wednesday, 13 February 2019, however, a breakthrough was achieved when the three EU institutions involved in lawmaking – the European Commission, the European Parliament, and the Council of the EU – reached a political agreement on the Copyright Directive. In the coming months, the European Parliament and the Council of the EU will take a final vote on the Copyright Directive.

Meanwhile, the media continue to report extensively on the directive. Their most common complaint concerns Article 13 and the duty it imposes on ‘online content sharing providers’ (‘Content Sharing Providers’), in practice the household-name tech giants, to filter content. Some argue that such filtering could violate the freedom of expression protected by Article 10 of the European Convention on Human Rights.

But where does this fear of ‘filtering of the internet’ come from? Does it really pose a threat to human rights? And are there no countervailing advantages to be provided by the Copyright Directive?

Protection of the Creative Content Sector

To start with the last question: Yes, there are advantages, but only for a limited group of stakeholders. One of the objectives of the Copyright Directive is to create a ‘fairer and sustainable marketplace for authors, performers, the creative industries and the press’.

While these parties are at the heart of content creation and the creative sector, their remuneration is not considered to reflect the extensive online use of their content by Content Sharing Providers. This use is generally not addressed in agreements between creators and such providers.

Consequently, if content published on the Internet infringes a creator’s copyright, the content can only be removed afterwards, and no fixed arrangements on remuneration and/or compensation for damages are in place to make the creators whole. Uncertainty about the specific use of content creators’ material negatively affects their ability to determine appropriate use and remuneration. The European Union therefore finds it important to ‘foster the development of the licensing market between rightholders and the Content Sharing Providers’. These licensing agreements should be ‘fair and keep a reasonable balance for both parties’. (Recital 37 of the last proposal for the Copyright Directive).

Article 13 Copyright Directive – Conclusion of License Agreements

Article 13 of the Copyright Directive says that Content Sharing Providers shall:

‘obtain an authorization from the rightholders […], for instance by concluding a licensing agreement’.

The complete text of the amended (and agreed upon) Article 13 can be found here.

Subparagraph 2 of Article 13 sets out one main element of the license agreement: ‘acts carried out by users of the services’. This means that the license agreement would need to cover the possible acts of the platform’s users. This would impose a significant burden on online platforms to control and manage user-generated content, thus incentivizing the filtering of user-generated content on these platforms.

Article 13 Copyright Directive – Liability, unless…

Subparagraph 4 of Article 13 makes the duty to filter content explicitly clear, because if the rightholder does not grant the required authorization, the Content Sharing Provider is liable for the publication of the copyrighted work.

There is an exception to this strict liability, but only if the Content Sharing Provider complies with the following obligations.

The Content Sharing Provider must demonstrate that:

  (a) it made best efforts to obtain an authorization; and
  (b) it made best efforts to ensure the unavailability of the specific work; and in any event
  (c) it acted expeditiously (upon receiving a notice by the rightholders to remove the specific work).

The most efficient way to ensure the “unavailability of the specific work” under (b) would likely be to filter all works uploaded to a platform, since a filter would make it practical to detect potentially infringing works.

Possible Infringements of Human Rights

While Content Sharing Providers are likely to filter user-generated content to protect themselves, automated content filters often fail to recognize the context and actual content of the specific material (link in Dutch). Such failures would contravene other provisions of the Copyright Directive, which explicitly allow specific forms of expression that a content filter may not recognize.

The Copyright Directive even obliges the member states to ensure that users in the EU remain free to upload content for purposes such as:

  • quotation, criticism and review; and
  • use for the purpose of caricature, parody or pastiche

(Article 13 subparagraph 5 Copyright Directive)

This language would seem to require that a filter draw a clear distinction between the content and its specific purpose. This may be an impossible task in practical terms.

A further challenge for adequate filtering is that a copyrighted work, such as a video clip, may have multiple authors, each of whom would need to grant a license for use of the work. Consequently, a Content Sharing Provider’s filter system would need to be precise and fully accurate as to attribution, since a mistake regarding a single author’s authorization could lead to a claim.

Lastly, the use of Internet filters poses threats to user privacy. The filtering of content could easily result in the monitoring of users and their personal data. Objective and clear criteria for content filtering are thus required to prevent infringements of the General Data Protection Regulation (GDPR).

All of the above will likely lead online platforms to implement risk-averse policies, given the threat of numerous large claims. Such policies are likely to result in the strict application of filters in order to block all content that poses a potential risk.

Thus, the risk of using strict upload filters is that ‘safe’ content is filtered out, limiting a free flow of information and freedom of expression. The fact that ‘new platforms’ (i.e., platforms on the market for less than three years and with an annual turnover below EUR 10 million) are exempted from the above obligations (b) and (c) may seem positive for start-ups, but it also means that there is really no way out for the large-scale platforms with millions of European users.

To be continued…

The European Parliament and the Council of the EU are expected to take their final vote in March and April 2019. Subsequently, the member states will need to implement the Copyright Directive in national legislation. The future of the Copyright Directive and its potential impact remain uncertain, and it remains to be seen whether the (members of these) institutions are willing to obstruct the current proposal after years of numerous and lengthy negotiations.
