EDPB Statement 3/2024 on data protection authorities' role in the Artificial Intelligence Act framework
The European Data Protection Board has adopted the following statement:
2. The AI Act lays down harmonised rules on the placing on the market, the putting into service and the use of Artificial Intelligence (hereinafter "AI") in line with the New Legislative Framework, and requires market surveillance within the meaning of Regulation (EU) 2019/1020. As stated in its Article 1(1), the AI Act aims to improve the functioning of the internal market and support innovation, whilst promoting the uptake of human-centric and trustworthy AI and ensuring a high level of protection of health, safety and the fundamental rights enshrined in the Charter of Fundamental Rights of the European Union (hereinafter the "Charter"), including as regards the fundamental rights to privacy and to the protection of personal data (respectively, Articles 7 and 8 of the Charter).
1 Regulation (EU) 2024/1689 of the European Parliament and of the Council of 13 June 2024 laying down
harmonised rules on artificial intelligence and amending Regulations (EC) No 300/2008, (EU) No 167/2013, (EU)
No 168/2013, (EU) 2018/858, (EU) 2018/1139 and (EU) 2019/2144 and Directives 2014/90/EU, (EU) 2016/797
and (EU) 2020/1828 (Artificial Intelligence Act) (Text with EEA relevance), OJ L, 2024/1689, 12.7.2024,
ELI: http://data.europa.eu/eli/reg/2024/1689/oj.
2 Regulation (EU) 2019/1020 of the European Parliament and of the Council of 20 June 2019 on market
surveillance and compliance of products and amending Directive 2004/42/EC and Regulations (EC) No 765/2008
and (EU) No 305/2011 (Text with EEA relevance).
3. From this perspective, the AI Act and the Union data protection legislation (notably, GDPR3, EUDPR4
and LED5, as well as the ePrivacy Directive6) need to be, in principle7, considered (and coherently
interpreted) as complementary and mutually reinforcing instruments, both in terms of their objectives and in terms of the protections provided, including the rights of the affected persons (who may also qualify as "data subjects" under the GDPR). Further, it is important to underline that Union data protection law is fully applicable to the processing of personal data involved in the lifecycle of AI systems, as explicitly recognised in Article 2(7) AI Act (see also Recitals 9 and 10).
4. In fact, the processing of personal data (which is often closely intertwined with non-personal data)
along the lifecycle of AI systems ‒ and particularly along the lifecycle of those AI systems presenting a
high risk to fundamental rights8 ‒ clearly is (and will continue to be) a core element of the various
technologies covered under the umbrella of the AI definition, as enshrined in Article 3(1) AI Act. For
these reasons, national data protection authorities (hereinafter “DPAs”) have been active with regard
to these technological developments9, and the EDPB, which has closely followed the legislative process
regarding the AI Act10, has already initiated the examination of its (multifaceted) interplay with EU data
protection law.
5. With this statement, the EDPB would like to further highlight supervision and coordination issues
that could result from the designation of competent authorities by the Member States11 in areas closely linked to personal data protection matters. It should be considered that, at the national level, the AI Act enforcement framework will have to be built, within one year of the entry into force of the AI Act, around one or more established or designated "national competent authorities"
3 Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of
natural persons with regard to the processing of personal data and on the free movement of such data, and
repealing Directive 95/46/EC (General Data Protection Regulation) (Text with EEA relevance), OJ L 119, 4.5.2016,
p. 1–88.
4 Regulation (EU) 2018/1725 of the European Parliament and of the Council of 23 October 2018 on the protection
of natural persons with regard to the processing of personal data by the Union institutions, bodies, offices and
agencies and on the free movement of such data, and repealing Regulation (EC) No 45/2001 and Decision No
1247/2002/EC (Text with EEA relevance), OJ L 295, 21.11.2018, p. 39–98.
5 Directive (EU) 2016/680 of the European Parliament and of the Council of 27 April 2016 on the protection of
natural persons with regard to the processing of personal data by competent authorities for the purposes of the
prevention, investigation, detection or prosecution of criminal offences or the execution of criminal penalties,
and on the free movement of such data, and repealing Council Framework Decision 2008/977/JHA, OJ L 119,
4.5.2016, p. 89–131.
6 Directive 2002/58/EC of the European Parliament and of the Council of 12 July 2002 concerning the processing
of personal data and the protection of privacy in the electronic communications sector (Directive on privacy and
electronic communications), OJ L 201, 31.7.2002, p. 37–47.
7 More specifically, for remote biometric identification the AI Act is lex specialis vis-à-vis the LED (Art. 5(1)(h) AI
Act).
8 See in this regard most of the AI systems listed in Annex III of the AI Act.
9 This activity has materialised across EU jurisdictions through position papers, public consultations, parliamentary hearings,
guidelines, opinions related to data protection impact assessments, investigations, corrective measures and
(sometimes) sanctions. Furthermore, DPAs are also participating in various regulatory sandboxes.
10 See, in particular, EDPB-EDPS Joint Opinion 5/2021 on the proposal for a Regulation of the European Parliament
and of the Council laying down harmonised rules on artificial intelligence (Artificial Intelligence Act), 18 June 2021
(hereinafter “EDPB-EDPS Joint Opinion 5/2021”); EDPB Statement on the Digital Services Package and Data
Strategy adopted on 18 November 2021.
11 See Recital 153 AI Act.
(hereinafter “NCAs”), in particular market surveillance authority/authorities (hereinafter “MSAs”)12,
interacting both among themselves and with DPAs and other authorities protecting fundamental
rights.13 The EDPB recognises that some Member States have already decided on the appointment of
MSAs and that, therefore, some of the recommendations in this statement may not be fully relevant
in those cases.
6. As already stated in the Joint Opinion of the EDPB and the EDPS14, a prominent role should be recognised for DPAs at national level in this emerging enforcement framework, in particular given the experience and expertise they have gathered in developing guidelines and best practices and in carrying out enforcement actions on AI-related issues concerning the processing of personal data at both national and international level15. DPAs have proven, and continue to prove, to be indispensable actors in the chain leading to the safe, rights-oriented and secure deployment of AI systems across several sectors.
7. Moreover, it should be pointed out that the designation of DPAs as MSAs would benefit all stakeholders in the AI value chain by providing a single contact point and by facilitating interactions between the different regulatory bodies concerned with both the AI Act and EU data protection law.
8. Furthermore, in the light of the AI Act, the EDPB considers the following points of particular
importance:
§ DPAs, in addition to their existing expertise in AI technologies, are skilled in many of the areas
referred to in Article 70(3) AI Act, such as data computing and data security, and in assessing
risks to fundamental rights posed by new technologies;
§ DPAs, due to their full independence16, can provide effective independent supervision of AI
systems, as required by Article 70(1) AI Act17;
§ Pursuant to Article 74(8) AI Act, DPAs ‒ or other authorities subject to the same independence requirements, under the conditions laid down in Articles 41 to 44 of Directive (EU) 2016/680 ‒ must be designated as MSAs for the high-risk AI systems listed in point 1 of Annex III AI Act, insofar as they are used for law enforcement purposes, border management and justice and democracy, and for the high-risk AI systems listed in points 6, 7 and 8 of Annex III AI Act, which are key elements of the democratic order;
§ From a systematic perspective, it is also particularly valuable that where Union institutions,
bodies, offices or agencies fall within the scope of the AI Act, the EDPS shall act as
the competent authority for their supervision, as provided for by Article 70(9) AI Act18;
12 See Article 3(48) AI Act.
13 See Article 77 and Recital 157 AI Act.
14 See, in particular, points 47 ff. of the EDPB-EDPS Joint Opinion 5/2021.
15 DPAs have been active for a long time and also cooperate on the different topics related to AI systems in
different international fora (G7 Data Protection and Privacy Authorities Roundtable, Global Privacy Assembly -
GPA, International Working Group on Data Protection in Technology, Council of Europe, international standards
organisations, etc.) and, with their representatives, at the OECD. This engagement is particularly important in the
current context in which AI regulation is widely discussed at the global level, notably regarding the development
of standards.
16 See Article 8(3) of the Charter and Article 16(2) of the Treaty on the Functioning of the European Union
(hereinafter “TFEU”).
17 See also Recitals 79 and 80 AI Act.
18 See also Articles 3(48), 74(9) and 43(1) AI Act.
§ Moreover, in the light of the provisions contained in Articles 26(9), 27(4) and Annex VIII,
Section C, point 5 AI Act, a close relationship is expected between the data protection impact
assessment and the fundamental rights impact assessment19.
2 RECOMMENDATIONS
9. The EDPB recommends that DPAs should be designated by Member States as MSAs for the high-risk
AI systems mentioned in Article 74(8) AI Act. Further, the EDPB recommends that, taking account of
the views of the national DPA, Member States consider appointing DPAs as MSAs for the remaining
high-risk AI systems listed in Annex III, particularly where those high-risk AI systems are in sectors likely to impact natural persons' rights and freedoms with regard to the processing of personal data,
unless those sectors are covered by a mandatory appointment required by the AI Act (e.g. the financial
sector).
10. Considering the above, since the single point of contact under the AI Act should be an MSA, as provided for by Article 70(2) AI Act, DPAs (acting as MSAs) should be designated as the single points of contact for the public and counterparts at Member State and Union levels. This would be without prejudice to the representatives of the Member States to the European Artificial Intelligence Board, which may be different authorities, as indicated by Article 65(4)(b) AI Act. This would also enable a unified, consistent,
and effective approach across the different sectors.
11. From a broader perspective, there is a need for sound cooperation between MSAs and the other
entities which are tasked with the supervision of AI systems20, including DPAs, and clear procedures
have to be provided for in this regard on the basis of Article 74(10) AI Act21. Such procedures should be built and developed in accordance with the principle of sincere cooperation laid down in Article 4(3) of the Treaty on European Union, as highlighted by the Court of Justice in the Bundeskartellamt case22.
In this way, inconsistencies between decisions taken by different oversight authorities and bodies
can be prevented in the digital ecosystem, and synergies can be exploited in coherent, effective
and complementary enforcement actions for the benefit of individuals and legal certainty.
12. Whilst the attribution of new tasks and powers with regard to the supervision of AI systems could be
facilitated by the legal and technical skills already present in the DPAs, the very fact that new tasks
and powers are envisaged for DPAs, including in their capacity as MSAs, entails the need for adequate
additional human and financial resources to be provided by Member States.
13. The considerations made so far concerning the relationship between national DPAs and MSAs under
the AI Act similarly apply to the supervisory activity carried out by the AI Office23 on general-purpose
AI models, which may also be trained on personal data24 and whose output can affect individuals'
privacy and data protection rights. In this regard, no clear coordination is currently established in the
AI Act between the AI Office and the DPAs/EDPB.
19 See Articles 26(9), 27(4) and Annex VIII, point 5 AI Act.
20 In particular in the fields of product safety (e.g. in the case of smart toys, see also Annex I), competition, digital
and media services, financial services, consumer protection, and fundamental rights protection.
21 See also, with regard to the activity carried out within the regulatory sandboxes, Article 57(10) AI Act.
22 Judgment of the Court of Justice of 4 July 2023, Meta Platforms and others (Conditions générales d’utilisation
d’un réseau social, C-252/21, ECLI:EU:C:2023:537), in particular paras 53-63.
23 The AI Office, referred to in Article 64 AI Act, has been established by Commission Decision of 24 January 2024
establishing the European Artificial Intelligence Office, C/2024/390, OJ C, C/2024/1459, 14.2.2024.
24 See also Annex XI, Section 1(1)(e) and (2)(c) of the AI Act.
14. The EDPB underlines that whenever a general-purpose AI model or system entails the processing of
personal data, it may fall – like any other AI system – under the supervisory remit, as applicable, of
the relevant national DPAs (also cooperating according to Chapter VII of the GDPR) and of the EDPS
(when it falls under the EUDPR). Therefore, national DPAs and the EDPS must be properly involved where questions arise as to matters falling within the scope of Union data protection law in
the supervision of those systems. This is necessary to comply with Article 8 of the Charter and Article
16 of the TFEU, including as regards the process of assessment of the codes of practice mentioned in
Article 56 AI Act.
15. Accordingly, the EDPB draws the attention of the European Commission, and of the EU AI Office which is part of it, to the need to cooperate with the national DPAs and the EDPB, and to the need to establish, in agreement with them, the appropriate mutual cooperation25 in the most effective way and in full compliance with the principle of sincere cooperation as recalled by the Court of Justice26.
The Chair
(Anu Talus)
25 Also in light of Article 2(3)(c) and (d) and Articles 3 and 6 of the Commission Decision of 24 January 2024
establishing the European Artificial Intelligence Office, C/2024/1459.
26 See footnote 22 of this Statement.