Entertainment industry calls for internet upload filters to preemptively remove copyrighted content
Yet another mechanism that could be abused for censorship, and that would harm small creators.
The entertainment industry and other powerful copyright holders are looking for ways to tighten the screws on Big Tech over its response to piracy, as they continue their seemingly endless struggle against infringing content.
In order to preserve their safe harbor status under the DMCA, companies like Google, Facebook and Twitter swiftly react to takedown requests, even though this system is also often abused through false claims.
However, all this is not enough for the rights holder “cartel,” which is pushing the giants to move from merely reacting to proactively anticipating infringement – i.e., implementing upload filters that would flag content before it is ever published by third parties on their platforms.
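For illustration only, here is a toy sketch of what “flagging content before publication” can mean in practice, using an exact-hash blocklist. The names (`BLOCKED_HASHES`, `should_block`) are invented for this example; real filtering systems such as YouTube’s Content ID rely on perceptual fingerprinting that survives re-encoding, not exact hashes.

```python
import hashlib

# Hypothetical blocklist of SHA-256 hashes of known copyrighted files.
# (Real filters use perceptual fingerprints, not exact hashes.)
BLOCKED_HASHES = {
    # sha256 of b"test", standing in for a known copyrighted file
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def should_block(upload: bytes) -> bool:
    """Flag an upload before publication if its hash is on the blocklist."""
    digest = hashlib.sha256(upload).hexdigest()
    return digest in BLOCKED_HASHES

print(should_block(b"test"))           # True: exact copy is flagged
print(should_block(b"test (edited)"))  # False: a trivial edit slips through
```

The second call shows why exact matching is too crude: any modification evades it, which is why deployed filters use fuzzier matching – and why, conversely, fuzzy matching produces the false positives that raise the free speech concerns discussed below.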
As with the current takedown system, this one is also vulnerable to abuse that can easily produce a variety of harms, including suppression of free speech. In addition, as the huge controversy over the EU’s Copyright Directive and the bloc’s intention to introduce upload filters showed last year, such filters are also expensive to build and operate.
Whether for that or other reasons, tech giants were unhappy with this proposed solution to curb piracy in the past, but they now say they take their responsibility very seriously and want to “do more” about what is referred to as illegal content and activity online – even beyond what they are currently legally required to do.
Some ideas on how to achieve this surface in proposals from EDiMA, a trade body whose 15 members include Twitter, Facebook, Google, TikTok, and Mozilla – such as its Online Responsibility Framework, published earlier in 2020.
EDiMA Director General Siada El Ramly clarified that the Framework would allow proactive action (through the use of algorithms – upload filters by any other name) against “any and all illegal content, including copyrighted content.”
Building on this, a new paper by EDiMA states that it is the EU’s current laws that present a barrier to its members’ ability to “do more voluntarily and proactively.”
“The association is calling for the introduction of a legal safeguard which would allow companies to take proactive actions to remove illegal content and activity from their services, without the risk of additional liability for those attempts to tackle illegal content,” EDiMA said. The group was referring to the EU’s current rule – itself a safeguard for speech – under which tech companies must remove illegal content once they have “actual knowledge” of it, but are not obliged to go out and find it.