WARNING: This is not a script to follow word for word, just an example. Just be yourself ;)
Here are a few things that you may want to mention in your communications with MEPs and their assistants:
On 12 September 2018, all 751 Members of the European Parliament (MEPs) got a chance to shape the European copyright reform with a vote on the copyright reform proposal.
The outcome: 366 MEPs blatantly ignored your calls asking them to #SaveYourInternet, as they adopted the copyright #CensorshipMachine.
On 5 July, your voices were heard by many Members of the European Parliament (MEPs). Many of them did not have the copyright reform on their radar. Now, thanks to your efforts, they do, and they have become aware of the many sensitivities surrounding the copyright debate, especially regarding the Article 13 #CensorshipMachine proposal.
A majority of MEPs voted against the JURI Committee negotiation mandate: 318 MEPs decided to reject the Report by Rapporteur MEP Axel Voss (EPP, Germany) and #SaveYourInternet! The battle, however, continues, and we need your support more than ever. All 751 MEPs will now get a chance to scrutinise this copyright reform and table amendments by 5 September, and then the original JURI report and these amendments will be put to a vote in the Plenary session of the European Parliament on 12 September.
Article 13’s various versions create a system whereby platforms face increased (direct) liability for content uploaded by their users if it infringes copyright. As a result, these platforms are likely to overblock even legal content and use automated techniques to avoid being sued, which means users will no longer be able to share and experience the content they are used to finding online.
Our Ability To Post Content On The Internet Will Be Limited By A Censorship Machine
Some of the content uploaded to the Internet infringes the copyright of rightholders (who are often not the content creators themselves but intermediaries and investors such as record labels or film studios), and rightholders complain that, due to the digital evolution, they make less money than they used to (the so-called ‘value gap’). This does not accurately reflect reality, particularly in the case of the music industry, which announces year after year that its revenues keep increasing. What they actually claim is that some platforms (YouTube, Vimeo…) do not pay them enough when they stream copyrighted content: that is what they call the “value gap” (the gap between what rightholders think would be fair compensation and what platforms pay them).
Article 13 claims to address these problems, but does so in a way that hampers how the Internet has functioned so far, by requiring platforms to put in place costly and opaque solutions to pre-screen our content. This proposal would require intermediaries such as Facebook and YouTube to constantly police their platforms with censorship machines, often with no human involved in the process. It means you will no longer be able to upload or enjoy the same content as before, as automated blocking is likely to stop (legitimate) content from ever making it online. Analyses by EDRi of the European Commission and JURI proposals show the threats underlying Article 13’s logic.
And what’s worse: none of the versions of Article 13 make life better for creators. Article 13 actually makes no mention of creators: only rightholders.
The scope of application of Article 13 is excessively broad and does not include any mechanism to constrain inappropriate or unreasonable claims by rightholders. To address this, some of the proposed versions include carve-outs for specific platforms, defined with varying precision (for example, for online encyclopedias like Wikipedia), but this approach means that only the platforms that are known and valued today get a ‘pass’ from the censorship machine.
New Censorship Machines Should Not Be ‘Encouraged’ And Existing Ones Should Have User Safeguards
The measures required by Article 13 to avoid liability will be expensive to implement and will thus make it harder for European start-ups to grow and compete with big US platforms that already have these filters in place (such as YouTube with ContentID).
Moreover, while most of the ‘complaints’ seem to come from the music and film industry, Article 13 applies to all types of platforms and all types of content, including text, software code, sheet music, architectural blueprints, etc.
As organisations such as GitHub and Wikimedia raised their voices, carve-outs have been drafted to try to prevent them from becoming collateral damage of Article 13. But what about the companies that have not raised their voices or have not been heard (e.g. WordPress, Airbnb)? What about the platforms that do not exist yet but could bring the same benefit to society in the future as Wikipedia does today? The carve-outs show that the collateral damage is real. Its extent, however, is currently unfathomable, as shown by an infographic by the trade association EDIMA (note: some versions of the text of Article 13 include partial carve-outs for code-sharing platforms, online encyclopedias, online retail platforms and (B2C) cloud services, but these are not without loopholes).
The copyright rules in the European Union are extremely complex and nuanced, as evidenced by a solid body of case law from the highest European court, the Court of Justice of the European Union. Much of what we currently do on social media relies on exceptions to copyright (such as parody or quotation), which cannot be identified by algorithms because they require ‘context’ (is this funny? Are you acting in a non-commercial manner? Did you use this for the purpose of criticism?) and are not implemented in the same way in every EU Member State.
Algorithms and Filters Have a Proven Track Record of Being Bad at Nuance
Creativity and free speech will be harmed by Article 13 because algorithms struggle to tell the difference between infringement and the legal use of copyrighted material vital to research, commentary, parodies and more. This is far too high a cost for enforcing copyright.
No filter can possibly review every form of content covered by the proposal, including text, audio, video, images and software. Article 13’s mandate is technically infeasible, and it is absurd to expect courts in the EU Member States to be constantly working out what the “best” filters might be.
Moreover, it is a bad idea to make Internet companies responsible for enforcing copyright law. To ensure compliance and avoid penalties, platforms are sure to err on the side of caution and overblock. To make compliance easier, platforms will adjust their terms of service to be able to delete any content or account for any reason. That will leave victims of wrongful deletion with no right to complain – even if their content was perfectly legal.
Finally, the proposed censorship machines are a disproportionate and ineffective ‘solution’ to the problem: this has been highlighted by the highest European Court, the Court of Justice of the European Union, in a decision called SABAM v Netlog (CJEU C-360/10), which ruled that social networks and other web hosting providers cannot be required to monitor and filter activities that occur on their sites to prevent copyright infringement. This would be a breach of freedom of expression and of privacy.