
CHAPTER 2

LITERATURE REVIEW

THE CONCEPT OF FREE SPEECH:

The internet, once hailed as the epitome of unfettered expression, now
finds itself at the center of a heated debate: where does free speech end
and online content moderation begin? This complex question, with its
myriad legal, ethical, and societal implications, has ignited passionate
discourse within the scholarly community.1 This review delves into the
current landscape of this debate, exploring key arguments, identifying
persisting challenges, and proposing avenues for further research.

The concept of free speech, enshrined in democratic principles, has long
served as a cornerstone of individual liberty and societal progress. It fuels
open discourse, challenges authority, and fosters the exchange of ideas,
ultimately contributing to a well-functioning and vibrant society.2
However, as pointed out by Loso Judijanto et al.,3 the rise of online
platforms and the ever-evolving digital landscape have introduced
unprecedented complexities to this fundamental right, sparking an
intricate and ongoing debate surrounding its application and limitations.

Scholars like Vincent Blasi4 have emphasized the critical role that free
speech plays in fostering a healthy society, while Kaniklidis5 reminds us
that it is not absolute, acknowledging legal limitations such as incitement
to violence. On one hand, moderation can create a safer and more inclusive
online environment by combating the spread of harmful content and
fostering civil discourse. Platforms like Facebook and YouTube argue that
moderation is crucial for maintaining responsible use of their services.

1
Kozyreva, Anastasia, et al. "Free speech vs. harmful misinformation: Moral dilemmas in online content
moderation." PsyArXiv (2022).
2
Hallberg, Pekka, et al. "From the Origins of Freedom of Speech to the Modern Information Society." Freedom of
Speech and Information in Global Perspective (2017): 61-77.
3
Judijanto, Loso, et al. "NAVIGATING THE DIGITAL FRONTIER: A COMPREHENSIVE EXAMINATION OF COPYRIGHT
PROTECTION IN THE DIGITAL ERA, UNRAVELING COMPLEX CHALLENGES, AND PROPOSING LEGAL SOLUTIONS."
INTERNATIONAL JOURNAL OF SOCIETY REVIEWS 2.2 (2024): 353-363.
4
Blasi, Vincent. "Free speech and good character: From Milton to Brandeis to the present." Eternally Vigilant:
Free speech in the modern era 61 (2002): 61-63.
5
Kaniklidis, Constantine. "Free Speech, Hate Speech and Principles of Community: The Case Against Free Speech
Absolutism." (2015).
However, concerns remain about transparency and accountability in
moderation decisions: algorithms used for automated moderation can be
biased, and human moderators may misinterpret context, leading to the
unfair silencing of individuals or groups. Scholars like Elizabeth Stewart6
raise concerns about potential censorship and bias, echoed by Robert
Gorwa7 in his critique of algorithmic decision-making. Mark Wheeler8
further underscores the immense responsibility these platforms bear in
shaping the digital public sphere. At its core, free speech embodies the
right to express oneself freely without fear of censorship or
repercussions.9 Yet this seemingly straightforward principle reveals its
limitations upon closer examination.

Legal frameworks in the offline world recognize limitations that prohibit
hate speech, incitement to violence, and defamation, but translating these
principles to the online world proves challenging.10 Further complicating
matters, the anonymity offered by the internet emboldens some to spread
hate speech and misinformation.11 These expressions, although technically
falling under the umbrella of free speech, can have detrimental
consequences: inciting violence and discrimination, and eroding trust in
institutions.

However, the issue goes beyond individual platforms and individual
users. Decentralized solutions and alternative platforms, as suggested by
Shagun Jhaver et al.,12 can offer users more control over their online
experience and diversify the digital landscape. International frameworks
for online content governance, advocated by Anne Linke,13 are crucial for
establishing common ground and shared responsibility. An internationally
accepted standard for online free speech would make regulation across
borders and platforms considerably easier.

6
Stewart, Elizabeth. "Detecting fake news: Two problems for content moderation." Philosophy & technology 34.4
(2021): 923-940.
7
Gorwa, Robert, Reuben Binns, and Christian Katzenbach. "Algorithmic content moderation: Technical and
political challenges in the automation of platform governance." Big Data & Society 7.1 (2020):
2053951719897945.
8
Wheeler, Mark, and Petros Iosifidis. Public spheres and mediated social networks in the western context and
beyond. Palgrave Macmillan, 2016.
9
Warburton, Nigel. Free speech: A very short introduction. OUP Oxford, 2009.
10
Alkiviadou, Natalie. "The legal regulation of hate speech: The international and European frameworks."
Politička misao 55.04 (2018): 203-229.
11
Hannan Durrani, Dr Moazzam Naseer. "Hate Speech and Culture of Trolling on Social Media."
12
Jhaver, Shagun, Seth Frey, and Amy X. Zhang. "Decentralizing Platform Power: A Design Space of Multi-level
Governance in Online Social Platforms." Social Media+ Society 9.4 (2023): 20563051231207857.
13
Linke, Anne, and Ansgar Zerfass. "Social media governance: Regulatory frameworks for successful online
communications." Journal of Communication Management 17.3 (2013): 270-286.
Ultimately, the free speech debate in the digital age is not about creating a
utopia of unfettered expression. It is about finding a balance, one that
safeguards individual liberties while protecting society from harm.14 This
necessitates ongoing research, critical dialogue, and collaborative efforts
from diverse stakeholders: scholars, platforms, users, and policymakers
alike. It is a complex journey, but one worth undertaking to ensure a
digital space that embodies both freedom and responsibility, fostering a
truly democratic and vibrant online society.

Furthermore, it's crucial to acknowledge the evolution of the free speech
concept itself. Traditional understandings often emphasized individual
expression, neglecting the potential societal harms emanating from
certain forms of speech.15 Scholars like Daniel Wachtell16 propose a
"harm principle": free speech should be limited only when it causes direct
and demonstrable harm to others. Conversely, Louisa Bartolo17 argues for
a more nuanced approach, considering the power dynamics and potential
silencing effects of harmful speech, particularly on marginalized groups.

Examining free speech in the digital age necessitates a multifaceted
approach: recognizing its historical foundation, exploring its limitations
and evolving interpretations, and acknowledging the complexities of the
online environment are crucial first steps.18 Finding a balance between
individual expression and societal well-being demands ongoing research,
critical reflection, and collaborative efforts from diverse stakeholders.
This complex terrain, with its intricate pathways and shifting boundaries,
requires our collective attention and active engagement, so that the
eventual outcome is as beneficial as it is inclusive for all.

THE CONCEPT OF ONLINE PLATFORMS

14
Tutt, Andrew. "The New Speech." Hastings Const. LQ 41 (2013): 235.
15
Sadurski, Wojciech. Freedom of speech and its limits. Vol. 38. Springer Science & Business Media, 1999.
16
Wachtell, Daniel F. "No harm, no foul: reconceptualizing free speech via tort law." NYUL Rev. 83 (2008): 949.
17
Bartolo, Louisa. "'Eyes wide open to the context of content': Reimagining the hate speech policies of social
media platforms through a substantive equality lens." Renewal: A Journal of Social Democracy 29.2 (2021): 39-51.
18
Feng, Jing, Yueyao Yu, and Tong Xu. "Content Regulation Laws for Chinese ISPs: Legal Responsibilities in Free
Speech and Filtering of Harmful Content." Law and Economy 2.11 (2023): 53-59.
The ubiquitous presence of online platforms in our lives today seems
almost self-evident. From the mundane tasks of checking email and
online banking to the immersive worlds of social media and virtual reality
experiences, these digital spaces have become deeply woven into the
fabric of our social, economic, and political realities. 19 Yet, understanding
the concept of online platforms and their complex implications
necessitates delving beyond their immediate functionality and venturing
into their history, evolution, and impact.

Early iterations like message boards and email listservs laid the
groundwork, fostering rudimentary online interaction. 20 Web 2.0,
characterized by user-generated content and interactivity, witnessed the
explosion of social media giants like Facebook and YouTube, propelling
these platforms into the mainstream. 21 Today, the landscape flourishes
with diverse actors – e-commerce giants, gaming platforms, virtual reality
experiences – blurring the lines between physical and digital spheres.

The digital age has ushered in a new reality: online platforms. These
multifaceted spaces transcend mere technological tools, evolving into
complex ecosystems that foster interaction and content sharing and shape
the very fabric of human connection.22 Examining this phenomenon
demands a meticulous approach, delving beyond immediate functionality
and engaging with the rich vault of scholarly discourse surrounding online
platforms within the context of a literature review.

Their impact extends far beyond technological prowess. Scholars like
Ward Peeters23 highlight their role in shaping social interactions, with
Marcia Mundt24 emphasizing their potential to foster new communities
and even social movements. Yet concerns like digital inequality and the
spread of misinformation raise ethical questions, as explored by Lambèr
Royakkers25.

19
Papacharissi, Zizi. A private sphere: Democracy in a digital age. Polity, 2010.
20
Frenken, Koen, and Lea Fuenfschilling. "The rise of online platforms and the triumph of the corporation."
Sociologica 14.3 (2021): 101-113.
21
Gehl, Robert W. A cultural and political economy of Web 2.0. George Mason University, 2010.
22
Carrigan, Marylyn, et al. "Fostering sustainability through technology-mediated interactions: Conviviality and
reciprocity in the sharing economy." Information technology & people 33.3 (2020): 919-943.
23
Peeters, Ward. "The peer interaction process on Facebook: A social network analysis of learners’ online
conversations." Education and information technologies 24.5 (2019): 3177-3204.
24
Mundt, Marcia, Karen Ross, and Charla M. Burnett. "Scaling social movements through social media: The case
of Black Lives Matter." Social Media+ Society 4.4 (2018): 2056305118807911.
25
Royakkers, Lambèr, et al. "Societal and ethical issues of digitization." Ethics and Information Technology 20
(2018): 127-142.
The rise of dominant platforms has generated critical discourse about
power dynamics. Scholars like Giovanni De Gregorio point to potential
monopolies and the need for accountability, while Sofia Benanchi26 raises
concerns about surveillance capitalism and the erosion of privacy. These
shifting dynamics necessitate new approaches to governance, as
evidenced in the cases of US v. Jones27 and the Nickelodeon Consumer
Privacy Litigation.28

As online platforms continue to evolve, fostering a responsible and
inclusive digital future requires multifaceted efforts. Hamutoğlu et al.29
emphasize the importance of digital literacy, while Terry Flew and Fiona
Martin30 underscore the need for diverse perspectives in platform
governance. Exploring alternative models, as suggested by Bharat Mehra
et al.,31 holds promise for diversifying the landscape and empowering
users.

The literary landscape itself is not immune to the influence of online
platforms. Platforms like fanfiction sites and self-publishing avenues
offer new modes of creative expression and audience engagement,
prompting scholars like Sophie Corser32 to re-examine traditional notions
of authorship and readership. Additionally, online platforms host vibrant
communities of readers and writers, fostering literary discussions and
shaping the reception of works.33

Examining online platforms within a literature review offers a glimpse
into a dynamic and complex terrain. Understanding their historical
evolution, recognizing their social and cultural implications, and
engaging with ongoing debates surrounding content moderation, power
dynamics, and responsible governance are crucial steps. By weaving this
26
Benanchi, Sofia. "Surveillance Capitalism: the implications of losing privacy." (2023).
27
US v. Jones, 565 U.S. 400, 132 S. Ct. 945, 181 L. Ed. 2d 911 (2012).
28
In re Nickelodeon Consumer Privacy Litigation, 827 F.3d 262 (3d Cir. 2016).
29
Hamutoğlu, Nazire Burçin, Merve Savaşçi, and Gözde Sezen-Gültekin. "Digital literacy skills and attitudes
towards e-learning." Journal of Education and Future 16 (2019): 93-107.
30
Flew, Terry, and Fiona R. Martin. Digital platform regulation: Global perspectives on internet governance.
Springer Nature, 2022.
31
Mehra, Bharat, Cecelia Merkel, and Ann Peterson Bishop. "The internet for empowerment of minority and
marginalized users." New media & society 6.6 (2004): 781-802.
32
Corser, Sophie. Against Joyce: Ulysses, Authorship, and the Authority of the Reader. Diss. Goldsmiths,
University of London, 2018.
33
Sedo, DeNel Rehberg. "An introduction to reading communities: Processes and formations." Reading
communities from salons to cyberspace. London: Palgrave Macmillan UK, 2011. 1-24.
understanding into the broader literary discourse, we can contribute to
shaping a digital future that empowers individuals, fosters meaningful
connections, and celebrates the potential of online platforms for enriching
the literary landscape.

THE CONCEPT OF CONTENT MODERATION

The digital age has unleashed a powerful force: online platforms. These
virtual spaces, bursting with communication and content, have
transformed how we connect, express ourselves, and access information.
Yet amid this vibrant landscape lies a complex and often contentious
issue: content moderation.34 At its core, content moderation aims to
maintain a balance between online freedom and responsibility, ensuring a
safe and inclusive environment for users while upholding the right to
express diverse viewpoints.35 However, achieving this balance demands
careful consideration of multiple factors and perspectives.

Early online communities operated with minimal moderation, leaving
users to self-regulate discourse.36 With the rise of social media giants and
their vast user bases, the need for structured content moderation became
apparent. Algorithms and human moderators began filtering content
deemed harmful, offensive, or illegal; yet concerns quickly emerged
about potential bias, censorship, and the silencing of marginalized
voices.37

Content moderation is rarely a clear-cut issue. As Carl Fox38 observes,
satire can blur the lines with hate speech, humor can offend certain
groups, and misinformation can masquerade as legitimate news. This
ambiguity necessitates a careful approach, considering factors like intent,
context, and potential impact. As Sonboli et al.39 argue, transparency in
decision-making and clear guidelines for users are crucial in fostering
trust and ensuring fairness.
34
Ganesh, Bharath, and Jonathan Bright. "Countering extremists on social media: challenges for strategic
communication and content moderation." Policy & Internet 12.1 (2020): 6-19.
35
Lee, Edward. "Moderating content moderation: A framework for nonpartisanship in online governance." Am.
UL Rev. 70 (2020): 913.
36
Klonick, Kate. "The new governors: The people, rules, and processes governing online speech." Harv. L. Rev.
131 (2017): 1598.
37
Gorwa, Robert, Reuben Binns, and Christian Katzenbach. "Algorithmic content moderation: Technical and
political challenges in the automation of platform governance." Big Data & Society 7.1 (2020):
2053951719897945.
38
Fox, Carl. "Stability and disruptive speech." Journal of Social Philosophy (2023).
39
Sonboli, Nasim, et al. "Fairness and transparency in recommendation: The users’ perspective." Proceedings of
the 29th ACM Conference on User Modeling, Adaptation and Personalization. 2021.
Content moderation encompasses a spectrum of methodologies ranging
from algorithmic filtering to human moderation, each with its own set of
advantages and limitations. Emma Llansó et al.40 scrutinize the role of
artificial intelligence in content moderation, examining the promises and
perils of automated systems. Their study illuminates how machine
learning algorithms contribute to the scalability of moderation efforts but
also raises concerns about their ability to discern context and avoid
perpetuating biases.

The fundamental challenge, however, lies in striking a delicate balance
between protecting users from harmful content and upholding the
principles of free speech.41 Platforms wrestle with this dilemma daily,
attempting to define "harmful" while respecting diverse opinions and
cultural contexts.42 The dilemma does not end there: algorithmic
moderation, while efficient, can be susceptible to bias and fail to grasp
the nuances of language and context, while human moderators, though
capable of deeper understanding, face scalability issues and potential
inconsistencies in decision-making.

Even so, the role of human judgment remains vital in content
moderation.43 Humans can understand context, empathize with diverse
perspectives, and adapt to evolving situations. However, Sai Wang44
draws attention to the fact that human moderators are susceptible to
fatigue, personal biases, and even manipulation. Continuous training,
diverse teams, and robust appeal mechanisms are essential to mitigate
these risks and ensure responsible decision-making.

In response to the dynamic nature of online communication, content
moderation practices are in a perpetual state of evolution. Joseph
Seering45 investigates emerging trends in content moderation,
emphasizing the shift toward proactive measures and community-driven
40
Llansó, Emma, et al. "Content Moderation, and Freedom of Expression." Algorithms (2020).
41
Brown, Rebecca L. "The Harm Principle and Free Speech." S. Cal. L. Rev. 89 (2015): 953.
42
Ibid.
43
Katsaros, Matthew, Jisu Kim, and Tom Tyler. "Online Content Moderation: Does Justice Need a Human Face?."
International Journal of Human–Computer Interaction 40.1 (2024): 66-77.
44
Wang, Sai. "Moderating uncivil user comments by humans or machines? The effects of moderation agent on
perceptions of bias and credibility in news content." Digital journalism 9.1 (2021): 64-83.
45
Seering, Joseph. "Reconsidering self-moderation: the role of research in supporting community-based models
for online content moderation." Proceedings of the ACM on Human-Computer Interaction 4.CSCW2 (2020): 1-28.
moderation. This approach empowers users to actively participate in
shaping platform norms, acknowledging the collective responsibility in
maintaining a healthy online ecosystem.

Furthermore, Topidi46 explores the potential of transparency and
accountability measures in content moderation. The study illuminates
how platforms are increasingly compelled to be transparent about their
moderation processes, providing users with insights into decision-making
and fostering trust.

The impact of content moderation reverberates through the social fabric,
influencing societal discourse and shaping user experiences. Gillespie47
conducts a comprehensive analysis of the societal implications of content
moderation, examining how moderation decisions may inadvertently
contribute to the formation of online echo chambers or restrict the voices
of marginalized communities. Understanding the far-reaching
consequences of content moderation is crucial in assessing its role in
shaping the present and future of our digital world.

Similarly, Shagun Jhaver48 focuses on user experiences, exploring how
individuals perceive and interact with content moderation mechanisms.
This research delves into user attitudes toward moderation policies,
shedding light on the delicate balance between safeguarding against harm
and preserving individual freedoms.

The responsibility for content moderation extends beyond individual
platforms. Governments grapple with crafting legal frameworks that
balance free speech with public safety, often facing the challenge of
regulating platforms operating across borders. Civil society organizations
play a crucial role in advocating for user rights, raising awareness about
problematic content, and proposing alternative solutions.49

Content moderation in the digital age represents a complex and
continually evolving topic that necessitates constant adaptation to the
challenges posed by the ever-changing online landscape. As platforms
grapple with
46
Topidi, Kyriaki. "Accountability in the globalised digital age: Online content moderation and hate speech in the
European Union." Accountability and the Law. Routledge, 2021. 9-27.
47
Gillespie, Tarleton. Custodians of the Internet: Platforms, content moderation, and the hidden decisions that
shape social media. Yale University Press, 2018.
48
Jhaver, Shagun, et al. "Personalizing content moderation on social media: User perspectives on moderation
choices, interface design, and labor." Proceedings of the ACM on Human-Computer Interaction 7.CSCW2 (2023):
1-33.

49
Raboy, Marc, and Normand Landry. Civil society, communication, and global governance: Issues from the
World Summit on the Information Society. Peter Lang, 2005.
the dilemma of fostering free expression while mitigating potential
harms, the literature converges on the recognition that effective content
moderation requires a holistic approach, integrating technological
advancements, ethical considerations, and user-centric practices. As we
navigate the frontiers of expression and regulation, content moderation
stands as a critical mechanism in shaping the character and inclusivity of
our digital societies. This understanding lays the groundwork for future
research endeavors and policy considerations within the realm of content
moderation.
