Will EU copyright law ‘carpet bomb’ the digital world?

Article 13 of the EU’s Copyright Directive, up for vote on 20 June, would impose mandatory upload filters on internet platforms. Jillian York explains why the risks are too high


It’s not difficult to see why some artists are torn over copyright measures. Making a living from art can be hard, and is made even harder when digital technology makes it rather easy to download, re-upload, and profit off the work of others. For many, their intellectual property is their sole meal ticket.

But, as so often happens, proposed solutions to this problem are blunt, created by lawmakers whose interests tend to lie not with the starving artist but with the mega-corporations that have a hold over the music industry. With Article 13 of the EU’s proposed Copyright Directive, this is almost certainly the case. As experts have pointed out, the proposal chiefly benefits the major record labels and movie studios that are angry at internet platforms for allegedly being too lax with their content.

The proposal, drafted by the European Commission and now facing a 20 June vote in the European Parliament, would require companies to check content uploaded by users against a database of known copyrighted materials. For example, when a user uploads a video to a platform like YouTube, the filtering mechanism would scan it for matching video and audio content and act accordingly: if the work were marked as copyrighted, the user would be prevented from completing her upload.
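To make the mechanism concrete, here is a minimal sketch, in Python, of how such a filter might operate. Everything in it is an assumption for illustration: the database, the function names, and the use of an exact SHA-256 digest are mine, and real systems such as YouTube’s Content ID use perceptual fingerprints that survive re-encoding and editing, which exact hashing does not.

```python
import hashlib

# Hypothetical register of fingerprints claimed by rightsholders.
# A real filter would store perceptual audio/video fingerprints rather
# than exact hashes, so that re-encoded or edited copies still match.
REGISTERED_WORKS = {
    hashlib.sha256(b"hit single master recording").hexdigest(): "Example Records, 'Hit Single'",
}

def check_upload(data: bytes) -> str:
    """Return 'blocked' if the upload matches a registered work, else 'allowed'."""
    digest = hashlib.sha256(data).hexdigest()  # stand-in for a real fingerprint
    match = REGISTERED_WORKS.get(digest)
    if match is not None:
        # Under a mandatory-filter regime the upload never goes live.
        # Nothing at this layer can tell parody or quotation from piracy.
        return f"blocked (matched: {match})"
    return "allowed"

print(check_upload(b"hit single master recording"))  # blocked
print(check_upload(b"an original home video"))       # allowed
```

The comment at the match branch is the crux of the policy problem: the comparison is purely mechanical, so parody, quotation and other lawful uses are indistinguishable from infringement at the point where blocking happens.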

So what makes mandatory upload filters so bad? First, they are unable to distinguish things like parody from infringing content, and they are, moreover, rather prone to error. As German Pirate Party MEP Julia Reda told me: ‘Upload filters have repeatedly been shown to delete original content by independent artists. For example, a music video by the activist collective Pinkstinks was deleted by YouTube's ContentID after the video had been shown on a popular German TV station.’


A campaign by several European digital rights groups, including EDRi, the European Digital Rights Initiative, calls the filters ‘censorship machines’, argues that memes and other creative works are under threat, and urges individuals to contact their representatives in the European Parliament.

But these aren’t the only reasons civil society organizations have come out in force against the proposal. As Reda has pointed out, the filters are also bad for business, as they place a significant burden on small companies and thus hamper competition from European platforms against dominant US ones. Lenard Koschwitz of Allied for Startups writes that by levying fines on companies that don’t comply, the proposal is ‘carpet bombing the entire digital world’.

And, like existing rules, the filtering systems can easily be abused by rightsholders. To understand how, we need only look at the abuses enabled by existing mechanisms like YouTube’s Content ID system. Meant to ease platforms’ enforcement of the Digital Millennium Copyright Act and to protect rightsholders by scanning for infringing content, this automated tool has instead created a power imbalance that places the onus on users to prove their innocence, sometimes against multiple claimants.

In one recent egregious example, a 10-hour video of white noise uploaded to YouTube received five copyright claims through Content ID. The claimants opted to monetize the video rather than take it down, meaning they were able to profit off content that did not belong to them. Although Google, of which YouTube is a subsidiary, dropped the claims, a typical user accused of infringement would have had to go through a lengthy dispute process.

As Reda says, referring to the music video deleted by Content ID: ‘There are no negative consequences for rightsholders who wrongfully claim to hold the copyright in other people's works, like the German news show did. If the European Parliament won't stop these dangerous plans, automatic blocking of legal content by small artists or under copyright exceptions such as fair use will only become more common.’

Finally, as with all technology, we should consider the potential ramifications of unleashing a tool like this widely into the world. Although the intent behind the filters is to enforce existing copyright law, once in place these systems can potentially be used for other purposes. This may seem far-fetched, but again, we need only look to existing examples: the practice of hashing and matching images to detect child sexual abuse imagery is now being used by major platforms to censor ‘terrorist’ content – with little to no oversight from non-corporate actors. The risks of surveillance – and censorship – from a system that can block uploads are simply too high.
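To see why such repurposing is so easy, consider a hedged sketch of the hash-matching approach described above. The blocklists and helper function below are invented for illustration; the point is that the matching code is identical no matter what the list contains.

```python
import hashlib

def build_filter(blocklist: set[str]):
    """Return a checker that rejects any payload whose SHA-256 digest is listed.

    Nothing in the matching logic knows, or cares, what the digests
    describe; the policy lives entirely in whoever curates the list.
    """
    def is_allowed(data: bytes) -> bool:
        return hashlib.sha256(data).hexdigest() not in blocklist

    return is_allowed

# Hypothetical lists: the same code path serves very different ends.
copyright_filter = build_filter({hashlib.sha256(b"registered song").hexdigest()})
speech_filter = build_filter({hashlib.sha256(b"disfavoured pamphlet").hexdigest()})

print(copyright_filter(b"registered song"))    # False: blocked as infringing
print(speech_filter(b"disfavoured pamphlet"))  # False: blocked as 'terrorist'
print(speech_filter(b"an unlisted upload"))    # True: allowed
```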

Jillian C. York is the Electronic Frontier Foundation’s director for International Freedom of Expression.