Martim Farinha
NOVA School of Law / NOVA Consumer Lab
*We would like to thank Maria Clara Paixão, Joana Campos Carvalho, Paula Ribeiro Alves
and Paulo Lacão for having read a draft of this text and for their suggestions.
Keywords: digital services; digital market; data protection; consumer protection
1. Introduction
1 For example, Directive 2000/31/EC of the European Parliament and of the Council of 8 June 2000 on certain legal aspects of information society services, in particular electronic commerce, in the Internal Market (Directive on electronic commerce) – e-Commerce Directive.
2 Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC (General Data Protection Regulation).
[…] market: the Proposal for a Digital Services Act3 (hereafter DSA). It mainly focuses on regulating intermediary services, thus complementing the consumer law and the data protection rules already in place.
Thus, the aim of this article is to explore the proposal's history and structure, as well as its relation to other European directives and regulations, in order to assess its accomplishments and the situations where it does not go far enough. In particular, we shall focus on content moderation and on the way in which the proposal aims to protect consumers' rights, asking how much protection it affords and whether that is enough. To that end, Chapter 2 will focus on an analysis of
the Digital Services Act, from its origin and the reasons that led the
Commission to present it, to the main provisions that are embod-
ied therein and the relation it establishes with the existing legal
framework, especially the GDPR. In Chapter 3 we will specifically address the topic of content moderation, in order to ascertain whether the progress brought by the Digital Services Act adequately
answers the problems identified in prior legislation (in the Euro-
pean Union and the United States), and what is already done in
practice, namely by digital platforms. Lastly, in Chapter 4 we will analyse how the Proposal reinforces consumer protection when it
comes to five main aspects: traceability, pre-contractual informa-
tion and product safety information, advertisement transparency,
recommender systems and the general principle of non-liability of
hosting service providers.
The Proposal for a Digital Services Act is the culmination of years of technological innovation that needed to be accompanied […]
3 Proposal for a Regulation of the European Parliament and of the Council on a Single Market for Digital Services (Digital Services Act) and amending Directive 2000/31/EC (COM/2020/825 final). Available at EUR-Lex: https://eur-lex.europa.eu/legal-content/en/TXT/?uri=COM:2020:825:FIN (last accessed 14 April 2021).
4 In that sense, see EPRS, Digital Services Act (March 2021), 1-4. Available at Europarl: https://www.europarl.europa.eu/RegData/etudes/BRIE/2021/689357/EPRS_BRI(2021)689357_EN.pdf (last accessed 14 April 2021).
5 European Commission, Proposal for a Regulation of the European Parliament and of the
Council on a Single Market for Digital Services (Digital Services Act) and amending Direc-
tive 2000/31/EC (COM (2020) 825 final), Brussels, 15 December 2020, 1. For the need to review
the e-Commerce Directive as stated by the European Parliament, see the brief summary
in EPRS, Digital Services Act cit., 2-3.
6 European Commission, Digital Services Act cit., 1.
8 European Commission, Proposal for a Regulation of the European Parliament and of the Council on contestable and fair markets in the digital sector (Digital Markets Act) (COM (2020) 842 final), Brussels, 15 December 2020.
9 European Commission, Digital Markets Act cit., 16.
14 European Commission, Digital Markets Act cit., 16.
15 European Commission, Digital Services Act cit., 6; EPRS, Digital Services Act cit., 1-2.
16 CJEU 20 December 2017, case C-434/15 (Judgment Uber Spain); CJEU 10 April 2018, case C-320/16 (Judgment Uber France); CJEU 19 December 2019, case C-390/18 (Judgment Airbnb Ireland).
[…]istered with it with regard to the conditions under which the service is provided17.
Recitals 5 and 6 of the Proposal state that the regulation should
apply to “providers of intermediary services”. It is clarified that this
application is restricted to intermediary services, not affecting the
requirements established in European Union or national legisla-
tion “relating to products or services intermediated through inter-
mediary services, including in situations where the intermediary
service constitutes an integral part of another service which is not
an intermediary service as specified in the case law of the Court
of Justice of the European Union”. This is a clear reference to the
CJEU case law referred to in the previous paragraph. The regime of
the Proposal applies irrespective of whether the information society
service is part of an overall service whose principal element is a
service with another legal qualification, provided that it is an inter-
mediary service. The Regulation will obviously not cover the (other
core) service, such as transport or accommodation, which is not an
intermediary service18.
The Regulation is intended to apply to intermediary services, defined simply by belonging to one of three categories: mere conduit, caching and hosting. Hosting services consist of
the “storage of information provided by, and at the request of, a
recipient of the service”. Explicitly included among hosting services
are online platforms. According to the definition in article 2(h), an online platform is (i) a provider of a hosting service which, (ii) at the request of a recipient of the service, (iii) stores and disseminates information to the public. A provider is not qualified as an online platform if the “activity is a minor and purely ancillary feature of another service and, for objective and technical reasons cannot be used without that other service, and the integration of the feature into the other service is not a means to circumvent the applicability of this Regulation”.
17 Jorge Morais Carvalho, Airbnb Ireland Case: One More Piece in the Complex Puzzle Built by the CJEU Around Digital Platforms and the Concept of Information Society Service, 6/2 ItalLJ (2020), 463-476, 473.
18 Jorge Morais Carvalho, Sentenças Airbnb Ireland e Star Taxi App, Conceito de Serviço da Sociedade da Informação e Regulação de Plataformas Digitais, RDC – Liber Amicorum (2021), 481-510, 508.
19 See Recital 12.
20 For a general overview of the adopted structure for the Digital Services Act, see European Commission, Digital Services Act cit., 13-16.
[…] order to comply with the rules set out by EU law in general (article 6). Lastly, it sets out two final obligations: the prohibition of general
monitoring or active fact-finding (article 7) and the obligation to
respect orders from national judicial or administrative authorities
to act against illegal content and to provide information (articles
8 and 9).
Chapter III sets out due diligence obligations for a transpar-
ent and safe online environment, through five different sections.
Here, the Commission regulates the different intermediary services
according to their activities and sizes, imposing obligations propor-
tional to those two criteria.
The first section consolidates the foundation of the due dili-
gence obligations every intermediary service provider should com-
ply with: the need to establish a single point of contact to facilitate
direct contact with state authorities (article 10), the need to desig-
nate a legal representative in the Union for those providers that are not established in any Member State but provide their services within the territory of the European Union (article 11), the obligation to set out in their terms and conditions any restrictions they may impose on the use of their services, as well as to act responsibly when applying them (article 12), and, lastly, reporting
obligations when it comes to the removal and the disabling of infor-
mation considered to be illegal or contrary to the provider’s terms
and conditions (article 13).
The next sections then regulate specific types of intermediary service providers, in addition to what is already enshrined in Section 1. Section 2 regulates providers of hosting services, oblig-
ing them to put in place mechanisms allowing third parties to notify
the presence of potentially illegal content (article 14), as well as the obligation to state the reasons for the removal of, or the disabling of access to, information provided by a recipient of the service (article 15). Sections 3 and
4 regulate online platforms, as complements to Sections 1 and 2.
Therefore, while Section 3 lays down general rules applicable to them,
namely in what concerns complaint-handling systems and dispute
resolution (articles 17 to 19), protection against illegal usage of the
platform (articles 20 to 22) and information obligations (articles
23 and 24), Section 4 adds further due diligence responsibilities to very large online platforms21.
21 According to article 25, these are “online platforms which provide their services to a number of average monthly active recipients of the service in the Union equal to or higher than 45 million (...)”.
22 Article 47 states that the European Board for Digital Services is “[a]n independent advisory group of Digital Services Coordinators on the supervision of providers of intermediary services (...)”, responsible for: “[c]ontributing to the consistent application of this Regulation and effective cooperation of the Digital Services Coordinators and the Commission with regard to matters covered by this Regulation”, “coordinating and contributing to guidance and analysis of the Commission and Digital Services Coordinators and other competent authorities on emerging issues across the internal market with regard to matters covered by this Regulation” and “assisting the Digital Services Coordinators and the Commission in the supervision of very large online platforms”.
25 Directive 2010/13/EU of the European Parliament and of the Council of 10 March 2010 on the coordination of certain provisions laid down by law, regulation or administrative action in Member States concerning the provision of audiovisual media services (Audiovisual Media Services Directive), 1-24.
26 European Commission, Digital Services Act cit., 4, 5 and 19.
28 European Commission, Digital Services Act cit., 5, 19 and 20; EDPS, Opinion 1/2021 (10.02.2021), 7. Available at EDPS: https://edps.europa.eu/system/files/2021-02/21-02-10-opinion_on_digital_services_act_en.pdf (last accessed 14 April 2021).
In fact, the link between the Digital Services Act and the GDPR has
already been explored by the European Data Protection Supervisor
(EDPS) earlier this year.
If we start with the additions that the Digital Services Act brings
in terms of the right to information, article 12(1) complements and is
without prejudice to articles 12 to 14 of the GDPR, thus increasing
the transparency of content moderation practices29. This way, the
information that must be given to data subjects is reinforced in the
context of digital services, through the joint application of the legal
regimes.
Regarding online advertisement, articles 24 and 30 of the Pro-
posal clearly complement what is enshrined in data protection law,
by bringing additional transparency and accountability to targeted
advertisement, without prejudice to the application of the relevant
GDPR provisions and the need for consent30.
There are other sectors where the Proposal for a Digital Services
Act touches the GDPR. For instance, the EDPS mentions the need
to coordinate article 15 of the Proposal with article 22 of the GDPR,
which imposes strict conditions on decisions based solely on auto-
mated processing31. It is also important to mention that the com-
plaint mechanism enshrined in article 17 of the Proposal is without
prejudice to the rights and remedies available to data subjects and
provided for in the GDPR32.
Having in mind the relation between the GDPR and the Proposal, the EDPS welcomes the latter, but suggests additional measures in order to further strengthen the rights of individuals,
especially when it comes to content moderation and online targeted
29 EDPS, Opinion 1/2021 cit., 9.
30 EDPS, Opinion 1/2021 cit., 15.
31 It is important to mention that, in this context, the EDPS suggests that, in order to promote transparency, article 15(2) of the Proposal should “state unambiguously that information should in any event be provided on the automated means used for detection and identification of illegal content, regardless of whether the subsequent decision involved use of automated means or not” (see EDPS, Opinion 1/2021 cit., 11-12).
32 European Commission, Digital Services Act cit., 30-31; EDPS, Opinion 1/2021 cit., 12.
advertising33. The EDPS thus argues that profiling for the purpose of content moderation should be prohibited unless the online service provider shows that such a measure is necessary to
address the risks identified by the Digital Services Act34. Further-
more, it considers that there should be a ban on online targeted
advertising based on pervasive tracking, as well as a limitation on
data collected for the purpose of targeted advertising35.
In sum, even though these two instruments complement each other in key areas related to the digital market, there is still a way to go, and ideas to be discussed, in order to reinforce the
rights of data subjects in a digital context, namely when it comes to
advertising, content moderation and profiling. These issues will be
further addressed in the following chapters of this article, dedicated
to content moderation and consumer protection.
33 EPRS, Digital Services Act cit., 9; EDPS, Press Release: EDPS Opinions on the Digital
Services Act and the Digital Markets Act, Brussels (10.02.2021), 1. Available at EDPS:
https://edps.europa.eu/system/files/2021-02/edps-2021-01-opinion-on-digital-services-act-
package_en.pdf (last accessed 14 April 2021).
34 EPRS, Digital Services Act cit., 9; EDPS, Press Release: EDPS Opinions on the Digital Services Act and the Digital Markets Act cit.
36 For further insight on the reasons behind this dichotomy between the publisher and hosting model, see pp. 3 and following of Peggy Valcke/ Marieke Lenaerts, Who’s author, editor and publisher in user-generated content? Applying traditional media concepts to UGC providers, 24/1 Int’l Rev L Comp & Tech (2010), 119-131.
37 Available at Tubefilter: https://www.tubefilter.com/2019/05/07/number-hours-video-
In the late 1990s, with the signing of the two WIPO treaties, several States launched initiatives to regulate the role of what were then early digital intermediary services. Several approaches were considered: strict liability, negligence liability, liability under safe harbour conditions, immunity from sanctions, and immunity from both sanctions and injunctions. And, concerning the applicable sanctions,
should the intermediaries be subject to the same civil, administra-
tive, or criminal sanctions applied to their users, or different sanc-
tions, lower or of a different kind (merely administrative in nature
and not criminal, for example)38?
There are several arguments both for and against the imposition of secondary liability on service providers. In favour of more responsibility, we have: […]
38 Giovanni Sartor, Providers Liability: From the eCommerce Directive to the future, Directorate-General for Internal Policies, Policy Department A: Economic and Scientific Policy (2017), 9. Available at Europarl: https://www.europarl.europa.eu/RegData/etudes/IDAN/2017/614179/IPOL_IDA(2017)614179_EN.pdf (last accessed 14 April 2021).
39 Giovanni Sartor, Providers Liability cit., 12.
40 Directive 2001/29/EC of the European Parliament and of the Council of 22 May 2001
on the harmonisation of certain aspects of copyright and related rights in the information
society.
41 Michael I. Meyerson, Authors, Editors, and Uncommon Carriers: Identifying the “Speaker”
Within the New Media, 71/1 Notre Dame L. Rev. (1995), 116, 118.
42 Recital 47 of the Directive and CJEU 3 October 2019, case C-18/18 (Glawischnig-Piesczek), 34.
43 The extent to which the injunction from the administrative authority can go is also limited. It cannot impose the adoption of specific measures, nor can it impose an excessive burden on the provider. It needs to take into consideration the fundamental rights of all parties involved, including by not unnecessarily depriving internet users acting lawfully of access. See CJEU 27 March 2014, case C-314/12 (UPC Telekabel Wien).
44 Recital 45 of the Directive and CJEU 3 October 2019, case C-18/18 (Glawischnig-Piesczek), 24.
45 For example, European Commission, Commission Recommendation of 1.3.2018 on measures to effectively tackle illegal content online (C(2018) 1177 final), 1 March 2018. Available at European Commission: https://www.ec.europa.eu/digital-single-market/en/news/commission-recommendation-measures-effectively-tackle-illegal-content-online (last accessed 14 April 2021); European Commission, The Commission Code of Practice on Disinformation, 16 March 2021. Available at European Commission: https://www.ec.europa.eu/digital-single-market/en/code-practice-disinformation (last accessed 14 April 2021).
46 It has been reported by several media organizations, and in the court filings of the still ongoing case Parler LLC v. Amazon Web Services, Inc. before U.S. District Judge Barbara Rothstein in the Western District of Washington, that prior to 6 January 2021 Parler had failed to take down violent hate speech and had admitted to a backlog of over 26,000 unanswered complaints. See https://www.cnbc.com/2021/01/13/amazon-says-violent-posts-forced-it-to-drop-parler-from-its-web-hosting-service.html (last accessed 14 April 2021).
47 For example, Facebook still complements its automated systems with human moderators. Experience has also shown that this task takes a heavy toll on those individuals' mental health. See: BBC, Facebook to pay $52m to content moderators over PTSD. Available at BBC:
https://www.bbc.com/news/technology-52642633 (last accessed 14 April 2021); and The
Verge, The Trauma Floor – The secret lives of Facebook moderators in America. Available
at The Verge: https://www.theverge.com/2019/2/25/18229714/cognizant-facebook-content-
moderator-interviews-trauma-working-conditions-arizona (last accessed 14 April 2021).
48 For more information, see Robert Gorwa/ Reuben Binns/ Christian Katzenbach, Algo-
rithmic content moderation: Technical and political challenges in the automation of plat-
form governance, BD&S (28 February 2020). Available at Sage: https://www.journals.sagepub.
com/doi/full/10.1177/2053951719897945 (last accessed 14 April 2021).
49 European Commission, The EU Code of conduct on countering illegal hate speech online. Available at European Commission: https://www.ec.europa.eu/info/policies/justice-and-fundamental-rights/combatting-discrimination/racism-and-xenophobia/eu-code-conduct-countering-illegal-hate-speech-online_en (last accessed 14 April 2021).
51 See the sections “Consistency with existing policy provisions in the policy area”, “Fundamental Rights”, “Other Elements” and Recital 28 of the Proposal.
52 See Recital 47.
53 Conducted by certified bodies, whose fees should be reasonable and should be fully reimbursed by the provider to the user if the dispute is resolved in the user's favour.
54 This concept of trusted flaggers was already present in European Commission, Commission Recommendation cit., recitals 29 and 34, paragraphs 4(g) and 25 to 27, with some differences in relation to the DSA. For example, in the Recommendation, trusted flaggers could be individuals, i.e. natural persons.
55 See recitals 36 and 46.
56 Council Directive 93/13/EEC of 5 April 1993 on unfair terms in consumer contracts.
57 Directive 98/6/EC of the European Parliament and of the Council of 16 February 1998 on consumer protection in the indication of the prices of products offered to consumers.
4.1. Traceability
4.5. Liability
61 On this issue, see article 20 of the ELI Model Rules on Online Platforms, which imposes liability on the platform operator with predominant influence. See Joana Campos Carvalho, Online Platforms: Concept, Role in the Conclusion of Contracts and Current Legal Framework in Europe, 12/1 CDT (2020), 863-874, 873-874.
5. Conclusion
62 Jorge Morais Carvalho, Sale of Goods and Supply of Digital Content and Digital Ser-
vices, 5 EuCML (2019), 194-201, 196.
63 For more information regarding the soft power of European regulation worldwide, see
https://www.brusselseffect.com.
Reference List
The Verge, The Trauma Floor – The secret lives of Facebook moderators in America. Available at The Verge: https://www.theverge.com/2019/2/25/18229714/cognizant-facebook-content-moderator-interviews-trauma-working-conditions-arizona (last accessed 14 April 2021).
Valcke, Peggy / Lenaerts, Marieke, Who’s author, editor and publisher in
user-generated content? Applying traditional media concepts to UGC
providers, 24/1 Int’l Rev L Comp & Tech (2010), 119-131.