Upload-Filters: Bypassing Classical Concepts of Censorship?

Author:Amélie Pia Heldt
Position:Amélie P. Heldt is a junior researcher and doctoral candidate with the Leibniz Institute for Media Research/ Hans-Bredow-Institute, Hamburg, and currently a visiting fellow with the Information Society Project at Yale Law School.
Pages:56-64
© 2019 Amélie Pia Heldt
Everybody may disseminate this article by electronic means and make it available for download under the terms and conditions of the Digital Peer Publishing Licence (DPPL). A copy of the license text may be obtained at http://nbn-resolving.de/urn:nbn:de:0009-dppl-v3-en8.
Recommended citation: Amélie Pia Heldt, Upload-Filters: Bypassing Classical Concepts of Censorship?, 10 (2019) JIPITEC 56 para 1.
Keywords: Freedom of expression; censorship; democratic legitimation; upload-filters; prior restraint
Abstract: Protecting human rights in the context of automated decision-making might not be limited to the relationship between intermediaries and their users. In fact, in order to adequately address human rights issues vis-à-vis social media platforms, we need to include the state as an actor too. In the German and European human rights frameworks, fundamental rights are in principle only applicable vertically, that is, between the state and the citizen. Where does that leave the right of freedom of expression when user-generated content is deleted by intermediaries on the basis of an agreement with a public authority? We must address this question in light of the use of artificial intelligence to moderate online speech and its (until now lacking) regulatory framework. When states create incentives for private actors to delete user content pro-actively, is it still accurate to solely examine the relationship between platforms and users? Are we facing an expansion of collateral censorship? Is the usage of soft law instruments, such as codes of conduct, enhancing the protection of third parties, or is it rather an opaque instrument that tends to be conflated with policy laundering? This paper aims to analyse the different layers of the usage of artificial intelligence by platforms when it is triggered by a non-regulatory mode of governance. In light of the ongoing struggle in content moderation to balance freedom of speech against other legal interests, it is necessary to analyse whether or not intelligent technologies could meet the requirements of freedom of speech and information to a sufficient degree.
A. Introduction
1
Considering that user-generated content constitutes both speech in constitutional terminology as well as the basis for many social media platforms'1 business models, its regulation poses many challenges. Social media platforms, or to put it more generally, intermediaries, rely on user-generated content to attract other users. To sustain their attention and, by extension, revenue from advertisers, social networks are dependent on the activity of users on the one hand and on a clean, confidence-inspiring environment on the other. Examples such as the decline of MySpace2 or the almost non-existent moderation policy at

* Amélie P. Heldt is a junior researcher and doctoral candidate with the Leibniz Institute for Media Research/Hans-Bredow-Institute, Hamburg, and currently a visiting fellow with the Information Society Project at Yale Law School.
1 In this article, "intermediaries" is used as a generic term for "social media services, platforms and networks". They will be used as synonyms for Internet-based applications that rely on user-generated content to create online communities to share information, ideas, personal messages, etc. Definition retrieved from <https://www.merriam-webster.com/dictionary/social%20media> accessed 23 January 2019.
2 Stuart Dredge, 'MySpace – what went wrong: "The site was a massive spaghetti-ball mess"' (2015) <https://www.theguardian.com/technology/2015/mar/06/myspace-what-went-wrong-sean-percival-spotify> accessed 10 December 2018.
