Advancing Towards Digital Agency 2022
Contents
Preface
Executive summary
Levers of action
Conclusion
Glossary
Contributors
Acknowledgements
Endnotes
Disclaimer
This document is published by the
World Economic Forum as a contribution to a project,
insight area or interaction. The findings, interpretations and
conclusions expressed herein are a result of a collaborative
process facilitated and endorsed by the World Economic
Forum but whose results do not necessarily represent the
views of the World Economic Forum, nor the entirety of its
Members, Partners or other stakeholders.
Preface

Anne Josephine Flanagan, Data Policy and Governance Lead, World Economic Forum
Sheila Warren, Deputy Head, Centre for the Fourth Industrial Revolution Network, World Economic Forum

The power of the data ecosystem has never been greater but the system itself is becoming more difficult to navigate due to increasing complexity. We share and receive data every day to interact with the technologies that serve us, whether in a personal or commercial context. Data value chains then funnel, use and reuse that data, usually for commercial or public interest purposes. These value chains, often involving personal data, are at best complicated to follow; at worst they can lead to mistrust in data sharing and can potentially give cover to bad actors.

Contrasted with this complexity is our reliance on data sharing both as our way of life and as the backbone of the global data economy and the key to technological innovation.

If mistrust in the data ecosystem acts as a point of failure leading to suboptimal outcomes for us all, what can be done? What if there was a better way whereby data could be more easily traced, more easily permissioned, more easily controlled by data rights holders (including people) across the data ecosystem?

In this report, the World Economic Forum’s Task Force on Data Intermediaries, composed of business, academic and civil society actors worldwide, explores these questions and more. Building on the Forum’s Redesigning Data Privacy: Reimagining Notice & Consent for Human-technology Interaction1 report, the task force examines data value chain scenarios as they already exist today – and may exist in the future – with a view to improving both human–technology interaction and data sharing more broadly.

Taking lessons from global business, the research community and cutting-edge technology design, we explore best practices in the use of data intermediaries. We identify various models, including the organizational data intermediary, such as the data trust, that assumes a fiduciary duty. We explore the automated gateway that predetermines standard rules. And we look to the future, to the artificially intelligent agent that allows for autonomous third-party decision-making on our behalf, with its associated promises and perils.

In an era that has policy-makers moving beyond just privacy laws to grapple with developing policy levers designed to support data-sharing for common purposes, the task force shares what it has learned to support responsible policies. The value-added use of data intermediaries as a key to unlocking complexity and building trust holds the promise of protecting the interests of data sharers and data subjects alike – and ultimately that of society.

Finally, although any views expressed do not represent the views of any individual taskforce member or their organizations, we invite you to join us on this journey of exploration as we unearth and build a picture of where consensus may or may not lie in unleashing the power of data intermediaries leading to trusted digital agency – and where and when these types of policies could potentially be deployed.
The challenge
Everyone is familiar with the paradigm of going online and clicking on terms and conditions they don’t understand (or take time to read). No one knows (nor follows) what happens to their data. This status quo creates a reliance on companies to be responsible but can lead to mistrust in the data ecosystem as a whole. Further, mistrust between people and technology becomes amplified the more complex the data ecosystem becomes over time. Where once people had screens to navigate, new ambient data collection methods with their many benefits create nervousness and resignation when people don’t have the full picture. In some cases, individuals may opt out of interacting with technologies that would be of huge benefit to their lives. But what if it were possible to outsource these decision points to a trusted agent acting on an individual’s or even a group’s behalf?
The opportunity
Now that screenless technology is a part of everyday life, there is an opportunity to rethink the human–technology interaction paradigm and reposition the debate to focus on roles and responsibilities beyond the person. How can the use of data intermediaries help people navigate technologies and data ecosystem models without losing sight of what it means to be human, in terms of agency and expectations? How can people think beyond that given that, as they move towards the complexity of screenless metaverse issues, their understanding of “humanness” is transforming? Data intermediaries – especially digital agents – represent a new policy lever through and around which individuals can potentially navigate the challenges of the growing data ecosystem. This report seeks to shed light on an alternative method of mediated human–technology interaction whereby data appears to travel seamlessly from people to technology in a human-centric and, crucially, trusted manner. By communicating shared incentives, establishing reputation or receiving third-party verification, as well as having assurance structures to mitigate risk to both the intermediary and the rights holders, data intermediaries can increase trust between people and the technology they interact with.
The solution
This report explores the opportunities and risks of data intermediaries and, specifically, third-party digital agents. From data trusts to trusted digital agency, the report paints a picture of a world that is more empathetic to people and to companies, providing greater certainty for data sharing as a foundation for innovation through the introduction of a trusted third party. Crucially, it suggests levers of action for both the public and private sector to ensure a future-proof digital policy environment that allows for the seamless and trusted movement of data between people and the technology that serves them.
“Our days are filled with myriad discrete data collection moments. Even when we have genuine intent to affirmatively consent to each moment of data collection, it is practically impossible to do so: No individual has the time to provide affirmative consent on a near constant basis. This reality arguably undermines our individual agency.”2

The world is experiencing something of a mistrust pandemic when it comes to people’s engagement with the data ecosystem. This global “trust gap” or “trust deficit” is a barrier to economic growth, digital innovation and social cohesion. The technology ecosystem is ultimately powered by the collection, sharing and processing of data, often personal in nature. Data sharing is a driver of innovation in technology and of the digitization of mature economic models.

But trust between parties who seek to share or exchange data is not a default state; it is something that needs to be earned or built, often as a result of great effort over time. This includes building trust between people and technology. It is all the more important when considering that people share data every time they interact with the technologies in their lives.

As Bill McDermott, former Chief Executive Officer of SAP, has noted: “When trust is there, we can take giant strides, turning our greatest challenges into our biggest opportunities. When it’s not, the needle gets stuck. Small hurdles become insurmountable. Division overwhelms unity.”3

As defined by Russell Hardin,4 trust is a belief that an actor will perform a specific action within a specific context, whereas trustworthiness is a property of an actor. The goal of data intermediaries and the infrastructure that supports them is to enable data rights holders to trust trustworthy data intermediaries.

That is not to say that without trust and trustworthiness there is no sharing of data; but with trust and trustworthiness there will be greater participation and in turn an increase in the volume and indeed the veracity of data made available as a result.
The challenges of meaningfully consenting to personal data sharing, meaning the collection and processing of personal data, are well-known.5 Much of people’s interaction with technology relies on giving consent for data collection and processing via a medium such as a screen. When presented with privacy notices, it is necessary to take the time to consider the implications of terms and conditions and to overcome the barrier presented by the attention required to think explicitly about preferences. People need to think about what they really care about and foresee what their data might be used for – if they can imagine it. The term “decision fatigue”6 reflects something real: Lorrie Cranor and Aleecia MacDonald of Carnegie Mellon University researched7 the unfathomable burden of reading privacy notices that people typically experience and the resulting difficulty in being afforded the time to meaningfully react, understand and consent to them. People are simply too busy to take the time to read every consent notice on websites. And even if they did, could they truly anticipate how their data would be used?

And what if there is no screen? Ambient data collection, through for example closed circuit television and connected devices, is increasingly common. Getting to an acceptable default state is more urgent than ever as the world moves towards the creation of the metaverse where the metaphysical state of human–technology interaction becomes ever more seamless.

Other lawful bases for data collection and processing do exist in some jurisdictions, such as legitimate interest or performance of a contract under the European Union (EU) General Data Protection Regulation, but they have their own limitations. Courts the world over have been clear that notice and consent is the preferred lawful basis in certain scenarios. In situations where notice and consent has been deemed to be the only existing acceptable standard, that constraint can have limitations as described earlier.
Resultant mistrust
Today’s default state is not healthy. On the one hand, people are sometimes accepting and often left feeling disempowered; on the other hand, organizations struggle to access and process data that can meaningfully improve lives, health and even the planet.

People mistrust even the most responsible and ethical companies because the system – the data ecosystem – is so confusing to navigate.

As for the law, it struggles to keep up. Heavily weighted in favour of principles that lack the nuance of specific scenarios, regulation’s favourite tool is simply to ask: Can this entity collect your data? And individuals say “yes” without meaningfully understanding the benefits as well as the costs, and so on and so forth as they continue to “consent” without always meaningfully consenting.
A new approach
What if there was a better way? What if you could outsource the decision-making fatigue to a trustworthy third party? What if you could pre-consent to your preferences so that you did not need to continuously opt in? What if technology allowed you to outsource your decision-making even further – to a digitally automated agent, potentially using artificial intelligence (AI), which could actively make those decisions for you? All such scenarios require the enlisting of an intermediary.

Is the world ready for such a radical and human-centred approach to managing data relationships via a third-party data intermediary? Elements of such a sophisticated and nuanced data ecosystem already exist but the appropriate policy frameworks are far from being in place to make such a scenario viable at a systemic level.

In addition to asking this question, this report also explores the secondary effects of such a scenario through the examination of relevant use cases and asks what actions public and private sector actors can take when probing such issues for the benefit of building a more robust, human-centric and sustainable data ecosystem.
For the purposes of this paper, several assumptions are made.

The first assumption is that whatever the data sharing relationship, data rights holders will inherently mistrust each other without appropriate safeguards, positing that a data intermediary can potentially become that missing safeguard, depending on the data-sharing scenario and the characteristics of that intermediary. The assumption in all cases is that data rights holders have an interest in their data rights: for example, people care about information about them and companies care about the value of proprietary information.

Secondly, when exploring data intermediary possibilities, the relationships may be binary or multi-party in nature. An example of this is where data collected about people in a smart city environment can be used for the purposes of urban planning; while the people whose data was collected are themselves rights holders, the sharing takes place several times throughout a data value chain.8

Thirdly, it is worth noting that data is contextual, which means that non-personal data may become personal in nature depending on the context, for example, if combined with other datasets. This includes business-to-consumer (B2C) and business-to-business (B2B) relationships involving someone’s personal data, but business-to-government (B2G) scenarios are also relevant. If data intermediaries anonymize personal data and/or handle non-personal data, it may be recommended that they should have a process in place to test the robustness of their anonymization methodology. Nevertheless, given the difficulty of disassociating personal data from data sets that contain otherwise non-personal data, and the higher regulatory bar placed on the handling of personal data, personal data will be used as a proxy for all data.

And finally, there is no silver bullet approach: policy responses are as nuanced as the scenarios they respond to. It is assumed that the findings of this work as they pertain to personal data may be adjusted as relevant to apply to the treatment of different scenarios, including exclusively non-personal data-sharing scenarios, such as B2B sharing of proprietary data generated from non-personal data sources or unknown future use cases. Indeed, it is intended that this paper be made available to contribute to future work by others in this space.
Data intermediaries can take many forms; but what they share is a primary purpose of facilitating and managing data relations between data rights holders. By definition, it is assumed that data intermediaries are always third party in nature, as witnesses to the primary data sharing transaction.
To facilitate trust between data rights holders, at the most basic level data intermediaries may
communicate shared incentives, establish reputation or receive third-party verification; and have
assurance structures in place to mitigate risk to both the intermediary and the rights holders.
In addition, they can take on different roles for different kinds of data rights holders.
BOX 1 The role of data intermediaries for different data rights holders
People & society
A data intermediary can play a significant role in enabling people to be more in control of their personal data, determining what personal data is shared with which participants and for what purposes. They can vet parties that would receive the data to determine if they are “trusted” based on a set of externally published standards and criteria, thereby removing the obligation from the individual and thus removing the deficiency of the notice and consent mechanism common in data protection regimes. A data intermediary can leverage economies of scale to implement technologies to enable greater protection of personal data through real-time anonymization, pseudonymization9 or other privacy enhancing technologies and services. Conversely, the data intermediary could also verify and confirm the identity of the individual, thereby providing additional guarantees that the information being shared belongs to the individual and has not been misappropriated or obtained by other means.

Data intermediaries could also provide a variety of services, including that of matchmaker between supply and demand for data. They could engage in security, authentication and fraud prevention activities, such as performing verification services on the participants and the data being introduced based on a range of parameters, from potential copyright infringement to information security scanning of malicious code.

Businesses & private sector organizations
A data intermediary can act as a conduit to gain greater access to permissioned personal data. It can also enable greater sharing of that data between private corporations and organizations. Private entities could benefit from the use of a data intermediary as a method of third-party verification that complies with a set of base standards, such as those determined by a sector or industry: this is already the case in relation to information security and the tracking of illegal activity online. A data intermediary can also help participants navigate laws, regulations and other complex data privacy requirements, thereby effectively outsourcing some of these services to the intermediary. It is for this reason that there has been a boom in so-called “regtech” whereby third-party processes manage information compliance. Such third parties could be classified as private data intermediaries.

In a similar vein, scientific research institutions have been proponents of data trust models for a number of years, given that the data trust can act as a trustworthy conduit to manage access to data that otherwise would be inaccessible for purposes other than research.10 This paper examines data trusts in more detail later.

Government & public sector bodies
Although there is growing momentum to enable greater sharing of public data by government bodies, this remains sporadic and, where personal data is involved, complex and limited. Open data policies seek to streamline access to publicly held data but often fall short. The World Economic Forum’s recent work on empowered data societies11 sheds light on this topic through the example of improving access to publicly held data in the City of Helsinki. One finding of that work is that citizen-held data can be a rich source of relevant information for government service provision and can enhance people’s lives by delivering value for societies if conducted in a human-centric manner. A data intermediary can help ensure trust in such a scenario.
The above are just a few main examples of the usefulness of data intermediaries. Some of the more advanced competencies of data intermediaries may include:

– Storing individuals’ personal data within a personal data space or a vault so that data processing can happen within that space without the transmission of personal data to any parties outside the space; rather, the insights from the data are transmitted in a manner similar to federated data learning models (see the sketch below). Echoes of this idea appear in a proposal for Common European Data Spaces.12

– Advising individuals on uses of their data, including tracking who uses their data and for what purpose.

– Strengthening individuals’ negotiation power when influencing the terms of the data use(s), negotiating a “fee” for the data exchange, or solving disputes.

– Leveraging individuals’ personal data for social impact, such as to contribute to academic or scientific research.

– Providing added-value services to participating members, such as data anonymization and/or aggregation, benchmarking services, security and fraud prevention.

– Acting as a data aggregation and/or pseudonymization and/or anonymization layer.

– Acting as a proxy for consent to offer individual control to the data subject.

If data intermediaries can add so much value, why are they not used more often? One possible answer is the complexity of the data value chain within which data intermediaries operate by definition and the policy environment surrounding both it and the data ecosystem as a whole.
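To make the first competency above concrete – data processed inside a personal data space, with only derived insights leaving it – the following is a minimal Python sketch. It is illustrative only: the PersonalDataVault class, the owner identifier and the step-count query are hypothetical names invented for this example, not an implementation of any particular data space proposal.

```python
from dataclasses import dataclass, field
from statistics import mean
from typing import Callable, Dict, List


@dataclass
class PersonalDataVault:
    """Hypothetical personal data space: raw records never leave the vault."""
    owner_id: str
    _records: List[Dict] = field(default_factory=list)

    def add_record(self, record: Dict) -> None:
        self._records.append(record)

    def run_query(self, insight_fn: Callable[[List[Dict]], float]) -> float:
        # The computation is shipped to the data rather than the data to the
        # computation; only the derived insight crosses the vault boundary.
        return insight_fn(self._records)


def average_daily_steps(records: List[Dict]) -> float:
    """An approved, aggregate-only query a research partner might submit."""
    return mean(r["steps"] for r in records)


if __name__ == "__main__":
    vault = PersonalDataVault(owner_id="user-123")
    for steps in (4200, 8100, 6300):
        vault.add_record({"steps": steps})

    # Only the aggregate insight (6200.0) is transmitted, never the raw records.
    print(vault.run_query(average_daily_steps))
```

The essential property is that run_query returns only an aggregate, echoing federated-learning-style designs in which insights, not personal data, move between parties.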
In describing the difference between the data ecosystem and a data value chain, the former might be termed as all data, all transactions and the global space within which data exists and is processed, whereas the latter is a value chain of sorts. Data exists in the data ecosystem by default, but once data is collected and processed it enters a data value chain and that value chain is as long and as infinite as the life of the data.

Open Data Watch’s data value chain model below describes the four major stages of the life cycle of data: collection, publication, uptake and impact. In addition, as data is an infinite resource and (absent external constraints) can be reused infinite times in infinite ways, the model also contains a feedback loop. This feedback loop is something everyone is familiar with: it is the means by which functional data sharing takes place and how technology knows what to serve back. It has implications for online advertising, profiling and, taken to the extreme, is key to dark patterns (which combine heretofore disparate data sets for purposes of manipulating the user in a non-transparent manner). But importantly, without this feedback loop it would be impossible for people to interact with today’s technologies in any meaningful fashion. In other words, the feedback loop is a neutral feature of the data value chain but may be open to manipulation. Disposing of data closes the feedback loop.13
[Figure: The data value chain (Open Data Watch) – collection, publication, uptake and impact, with a feedback loop.]
Introducing a data intermediary into the data value chain can fundamentally alter the flow of data in the transaction by disrupting at least one point in the chain.

Under the notice and consent model, a person consents to the collection and processing of their data at the very beginning of the data value chain. The data then flows through the data value chain, guided by the permissions set before the data entered the chain.

A data intermediary could alter this process in several fundamental ways. If the purpose of a data intermediary is to effectively accompany personal data by adding a layer of permissioning onto the accompanying metadata (or use metadata as a proxy), that permissioning effectively follows the data (technically it acts to determine the use of the data) throughout the entirety of the data value chain and will trigger changes on a case-by-case basis depending on what the permissioning allows for. A similar model is in use in permissioned and permissionless blockchains. (A minimal sketch of such metadata-borne permissioning follows the list below.)

Below are some different variations of permissioning scenarios in the data value chain using data intermediaries:

1. Notice & consent: This is the default state whereby people consent to the collection and processing of personal data. There are alternative lawful bases for data collection and processing but in all cases a pre-determination is made that the data can lawfully enter the data value chain.

2. Transferred permissioning: The data intermediary could take the data into a brand-new data value chain by relying on the permissions from previous unrelated incidents of data collection and processing by the same person. This alters the collection phase of the data value chain (identify, collect, process) by leapfrogging past specific notice and consent.

3. Pre-permissioning using digital identity: This mimics transferred permissioning above, except now the power of digital identity is introduced. For example, if someone’s digital identity stored their general preferences for data collection and processing, then any time that digital identity interacted with relevant scenarios those preferences and permissioning could be taken forward by a data intermediary to conduct brand new transactions. The value of this is that the person does not need to be asked more than once what their preferences are; but an obvious downside is that the use cases may be very different from each other and consent is being inferred, which may reduce individual agency and lead to unintended outcomes.

4. Automated decision-making by a digital agent: In this scenario a data intermediary digital agent takes on the role of decision-maker. Consent is automated as before but this time, using AI, the data intermediary agent decides autonomously what kind of data permissioning a person might like. This opens the door to even more possible uses of that data. This type of scenario disrupts the normal flow of data in the data value chain at all stages and again can carry both wonderful opportunities and considerable risks. The key to success here lies in the quality of the automated decision-making and the underlying algorithm.

5. Replenishing and automating across multiple data value chains: Once automated decision-making starts to occur, a type of synthetic data precedent arises. This means that a pattern of data use emerges that infers further use cases. If this could be harnessed at a systemic level with appropriate policy safeguards, the data and its associated permissions could be recycled over and over and look slightly different every time but should reflect the preferences of the user. The move is towards a fully automated system of personal data collection and processing to overcome notice and consent limitations. This is a scary and amazing space and arguably not so different from a world absent of any data protection and privacy requirements: the difference here is that there is a system, ideally with backstops, designed in a human-centric manner that therefore retains the preferences of the user and exerts limits accordingly. In fact, there is no reason AI agents could not be programmed to be conservative if that is what is reflective of the user’s preferences. In addition, such a system would require clear rules to avoid a conflict of interest on the part of the digital agent.
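As referenced above, the following is a rough illustration of permissioning travelling with the data: a permission object is attached to a data asset and the intermediary checks it before each stage of the value chain. The stage names follow the Open Data Watch stages cited earlier; the Permission and DataAsset structures, the purposes and the identifiers are hypothetical and exist only for this example.

```python
from dataclasses import dataclass
from typing import FrozenSet

# The four Open Data Watch stages referenced in the text.
STAGES = ("collection", "publication", "uptake", "impact")


@dataclass(frozen=True)
class Permission:
    """Permissioning carried in the metadata that accompanies the data."""
    rights_holder: str
    allowed_purposes: FrozenSet[str]   # e.g. {"urban-planning"}
    allowed_stages: FrozenSet[str]     # subset of STAGES


@dataclass
class DataAsset:
    payload: dict
    permission: Permission


def intermediary_gate(asset: DataAsset, stage: str, purpose: str) -> bool:
    """The intermediary's check before the data moves through the next stage."""
    ok = (stage in asset.permission.allowed_stages
          and purpose in asset.permission.allowed_purposes)
    print(f"{stage:<12} purpose={purpose:<20} -> {'allow' if ok else 'block'}")
    return ok


if __name__ == "__main__":
    asset = DataAsset(
        payload={"trips": 14},
        permission=Permission(
            rights_holder="user-123",
            allowed_purposes=frozenset({"urban-planning"}),
            allowed_stages=frozenset({"collection", "publication", "uptake"}),
        ),
    )
    for stage in STAGES:
        intermediary_gate(asset, stage, "urban-planning")        # "impact" is blocked
        intermediary_gate(asset, stage, "targeted-advertising")  # always blocked
```

A production intermediary would additionally need revocation, audit logging and dispute handling, which are deliberately omitted here.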
When it comes to personal data, most data protection and privacy regimes do not currently allow for many of the above scenarios. In most jurisdictions that use the notice and consent model, consent needs to be specific and meaningful in order for the data to be considered to have been lawfully obtained. Best practice for now is therefore to avoid inferring consent where it is not explicitly and meaningfully given, regardless of the jurisdiction. Notwithstanding that other lawful grounds for the processing of personal data already exist beyond notice and consent, this paper looks at what the appropriate data intermediary backstops would need to be in order to make the above a reality. Inherent in this is the use of both public and private policy levers.
For human-computer interaction, researchers have developed a definition of online trust as an evolution of its offline counterpart. In the real world, “trust is the social capital that can create cooperation and coordination.”15 In the cyberworld, trust becomes “an attitude of confident expectation in an online situation of risk that one’s vulnerabilities will not be exploited.”16 This is at the core of the confusion of current human–technology interaction, where data collection is so ubiquitous as to make people feel at risk of being vulnerable. To solve this, intermediary third parties can be helpful.

Much can be learned from data intermediary models that are already in use in commercial and academic spheres today, whether they share personal data or not. The section below examines some of the most relevant and trusted data intermediary models already in existence at the B2B level.

– Data stewards

Organizational leaders such as the chief data officer may hold a designated data steward role, or teams may be empowered to ensure that data is leveraged in a responsible way. The data steward’s role is to manage data rights and data reuse, identifying opportunities for productive cross-sector collaboration and responding proactively to external requests for functional access to data, insights or expertise. Stewards are active in both the public and private sector, promoting trust within and outside their organization on how data is being used. In some cases, the data steward can be an entity with duties to carry out the interests of a group of data rights holders, a community,17 or the entity holding the data.

To establish and demonstrate their trustworthiness, data stewards may take on a professional role, including verifiable ethics obligations or certification.18,19 Outside organizations must always perceive the data steward as trustworthy. In the case of B2G data sharing, the data steward could even facilitate relationships between the private and public sector. Thus, the data steward can both lead responsible data management within their organization and increase the trustworthy perception of their sector and facilitate new relationships.

– Digital fiduciary

A digital fiduciary takes on the mantle of duties of care and loyalty but in a somewhat different manner to a data steward and other related fiduciaries. Much as a doctor is charged with taking care of patient health or a lawyer with legal affairs, the digital fiduciary is responsible for assisting individual clients in managing their digital selves. At a minimum, this means that a digital fiduciary upholds its duty of care by doing no harm to its clients and upholds its duty of loyalty by not having any conflicts of interest. Under a more expansive definition, a digital fiduciary upholds its duty of care by protecting and enhancing the individual’s digital experiences and upholds its duty of loyalty by actively promoting the individual’s interest.20 The digital fiduciary can be an individual or an entity, a private or public (governmental) body and, if private, a for-profit or not-for-profit enterprise.

Fiduciary duties can be defined, implemented and enforced in a variety of ways, including via: a new legal framework, existing contract law, voluntary certification, or a professional association with licensing and related assurance infrastructure (like for physicians or lawyers).

– Data trust

A data trust is a repeatable framework of agreements based on trust or contract law, allowing data rights holders to delegate control of their data to a trustee.

If the data trust employs trust law, the trustee is bound by fiduciary duties of loyalty and care to act in the interest of the beneficiary. The trust pools individuals’ power and provides an agent to negotiate their interests, suited to managing individuals’ asymmetric relationships with companies in a complex technical environment. Upholding duties of loyalty requires the data trust to be independent and may preclude the data trust from being a for-profit company. Although trust law does not exist in all countries, fiduciary duties are more common globally.21 A data trust can be designed for different levels of beneficiary participation, delegating various degrees of decision-making power to the trustee.22 A data trust contract then is “a contract among one or more controllers of data (the ‘entrusters’) and a third party under which the entrusters empower the third party (the ‘data trustee’) to make certain decisions about use or onward supply of data (the ‘entrusted data’) on their behalf, in the furtherance of stated purposes that may benefit the entrusters or a wider group of stakeholders (such entrusters or stakeholders being referred to as the ‘beneficiaries’).”23
Some common features emerge that start to build out conditions for the third-party intermediary being independent, having a set of duties in their performance, being a dedicated asset and with clear rules of the game. Considering the difference also in various duties of care from model to model, does it make a difference whether the intermediary is public or private in nature? Can a data intermediary be truly independent, especially in relation to the services it may offer and the financial incentive to perform? The following section explores some further characteristic options, especially as they relate to human–technology interaction and the collection and processing of personal data.

– Public data intermediaries

A public body or government agency could take on the role of an intermediary, especially as it relates to data coming from public bodies. Therefore, it can also act as an aggregator or gateway for such information. Such an intermediary could play an even greater role in making the data more easily accessible, identifiable, searchable and usable, including coordinating interoperable systems, especially across the public sector at least. Therefore, the role of a public body is arguably greater if it is an aggregator of multiple sources of public data. Another role it could play is to act as a super-intermediary, setting the national standard, data architecture and data standards with which all organizations would be required to comply. This will require deep expertise in privacy, data and technology, and therefore upskilling of the staff and/or hiring of a “data steward” with the required skillsets.

However, whether a public body can be said to be “trusted” will be dependent on the role of government in any given country, its level of control, access and use of surveillance laws and related technologies. Although a super-intermediary may enable vast sharing of data between multiple participants, enabling economies of scale and a consistent interoperable approach even across borders, if there is no trust in the system, in the government and its underlying intentions, there may not be active use, unless under the force of law. This would then impact the veracity of data being shared and could in turn stifle innovation.

– Private for-profit data intermediaries

Whether and how a for-profit commercial entity can successfully serve its clientele under voluntary fiduciary duties of care and loyalty remains open to debate among stakeholders.29 A key driver of the success of this model is how the intermediary derives economic value to be able to perform and make this service available.
Human-centricity
“Human-centricity means focussing on something variously called (self-)sovereignty, self-determination, self-governance, autonomy, agency or the like, in terms of the people involved with the generation of data. These concepts derive from the internationally-recognized concepts of human rights. A human-centric approach is one that makes central the following: that people have the right to determine, without any kind of coercion or compulsion, what happens to them.”32

Autonomy and agency are core tenets of human-centricity and fit in with the aims of restoring trust to human–technology interaction. Human-centric design is a well-researched and used space but human-centricity has typically taken a backseat to a rights-based approach when it comes to data protection and privacy norms, especially when it comes to regulation.
Fiduciary duty
A more highly developed area of consideration is fiduciary duty. A fiduciary typically abides by two basic types of duties: care and loyalty. In turn, these can be further subdivided into four specific duties:

– The general tort-like duty of care = do no harm to others

– The fiduciary duty of care = act prudently towards the entrustor

– The “thin” fiduciary duty of loyalty = have no conflicts of interest between duties and clients

– The “thick” fiduciary duty of loyalty = promote the entrustor’s best interests.
Can the use of data intermediaries establish a sort of “digital self-determination”40 by helping people navigate technologies and data ecosystem models without losing sight of what it means to be human, in terms of agency and expectations? Our digital identities may hold the key to allowing us to determine how we can start to navigate the data ecosystem around us in a more sophisticated manner.
Digital identity
A digital ID is the electronic equivalent of an individual’s identity card. It is a way to provide verified personally identifying information of an individual for software to read and process. Both online and offline environments can adopt digital identity. And it can also act as a key by storing and deploying permission.

Carefully designed and properly managed, digital ID can also enhance privacy protection and reduce the rise of identity fraud, since each time only minimum information is needed for authentication for the specific purpose. Some biometric-based digital ID systems have already been adopted in financial transactions and for a cash-free shopping experience. Such authentication and authorization processes can be completed in real time and free of hassle.

Good digital identity has five key components as defined by a multistakeholder group curated by the World Economic Forum: useful, inclusive, secure, offers choice, fit for purpose.41 Figure 2 shows the importance of identity in everyday lives.
[Figure 2: Identity in everyday lives. Digital identity connects people, devices, entities and things across: healthcare (for users to access insurance and treatment, to monitor health devices and wearables, and for care providers to demonstrate their qualifications); telecommunications (to monitor devices and sensors transmitting data such as energy usage, air quality and traffic congestion); financial services (to open bank accounts and carry out online financial transactions); and e-commerce (to shop, conduct business transactions and make secure payments). Source: World Economic Forum, 2018, Identity in a Digital World: A new chapter in the social contract.]
Authentication: Processes that determine if authenticators used (e.g. fingerprints, passwords) to claim an identity are valid. Sometimes digital identity goes beyond authentication. Authentication is a security process that compares attributes to confirm a claim. In principle, there is no need to know who the person is. In digital identity, there may be a need to link the person to their identity and that may require identity verification technologies.

Profile: May include inherent data attributes (such as biometrics) or assigned attributes (such as names or national identifier numbers).

History: Credit or medical histories, online purchasing behaviours.

Inferences: Judgements or decisions made based on authentication processes, profiles and histories (e.g. a bank decides the attractiveness of an individual for a loan).
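A loose way to picture how these four components might sit together is the following Python sketch; the DigitalIdentity class and its fields are hypothetical and are not drawn from any specific identity standard.

```python
from dataclasses import dataclass, field
from typing import Dict, List


@dataclass
class DigitalIdentity:
    """Illustrative grouping of the four components described above."""
    # Profile: inherent attributes (biometrics) or assigned ones (names, IDs).
    profile: Dict[str, str]
    # History: credit or medical histories, online purchasing behaviours.
    history: List[str] = field(default_factory=list)
    # Inferences: judgements derived from profile, history and authentication.
    inferences: Dict[str, str] = field(default_factory=dict)

    def authenticate(self, presented: Dict[str, str]) -> bool:
        """Authentication: compare presented authenticators against the profile.

        In principle the verifier need not learn who the person is, only that
        the claimed attributes match.
        """
        return all(self.profile.get(k) == v for k, v in presented.items())


if __name__ == "__main__":
    identity = DigitalIdentity(
        profile={"fingerprint_hash": "ab12", "national_id": "XY-999"},
        history=["loan repaid 2021"],
    )
    print(identity.authenticate({"fingerprint_hash": "ab12"}))  # True
    identity.inferences["creditworthiness"] = "eligible"        # e.g. a bank's judgement
```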
Policy considerations by type of data intermediary:

Traditional intermediaries and user consent (e.g. web browsers, apps, mobile devices):
– Data protection and privacy
– Security requirements
– Data minimization
– Certification of issuers, verifiers
– Recourse and liability

Personal data stores, on-device data storage and more advanced data intermediaries (e.g. smart devices, agents):
– Credential interoperability (technical, legal levels)
– Legal acceptance of digital ID
– Trust frameworks linked to attributes, exchange of credentials

Next level of data intermediaries (embedded in body, devices, homes, cities, etc.):
– Create definitions and thresholds of ownership, delegation, liability
– Prescribe transparency, auditability, predictability
– Allow for scalable (rule-based vs granular per data items) approaches to scope of data agency
– Create sandboxes for experimentation
Digital identity can allow for the selection of preferences and the making of certain choices in advance, such as “pre-consent”, avoiding doubling of efforts. This already happens in device usage: when setting up a new phone, for example, users can predetermine privacy settings before using any app. Their identity is usually inherently connected to their devices. Similarly, through the use of cookies, browsers can remember which user is which through a set of identifiers.

Digital identity then can be the key to unlocking a less ethically concerning but arguably equally impactful scenario as an AI-enabled digital agent. Digital identity allows the digital agent to recognize that the data belongs to a specific user and consult the permissions that that user has authorized (effectively data processing scenarios that the user has pre-consented to) and act accordingly in line with the user’s wishes. Crucially, consent can be given in advance for a myriad of use cases and that consent can be attached to the user’s digital ID.
The COVID-19 pandemic has led to a heightened focus on the power of medical data, specifically so-called vaccine passports. These passports by nature serve as a form of digital identity. Commercial entities serve as a type of centralized data intermediary in several jurisdictions. Given the sensitivity of this type of health data, in many cases governments have procured third-party contractors to administer and manage such systems. Unsurprisingly, strict security and privacy criteria are central to such systems in most cases, not least because a public policy health concern relies on increasing trust in the system.

Such vaccine passports are used when travelling between jurisdictions and at a local level, such as when entering dining establishments or other places where proof of vaccination status is necessary. Importantly, these intermediaries provide a means of verifying status without sharing health data with the establishment per se, in a sort of zero-knowledge proof scenario whereby the trusted data intermediary verifies that the data subject is vaccinated but does not share any other information. This avoids unwanted secondary effects of the establishment sharing the data any further.

However, at a collective level, vaccine data is an incredible public health asset. The United Kingdom Government in particular has acknowledged this44 and has suggested that anonymization, pseudonymization and data shielding techniques could be harnessed in a controlled environment to allow for the reuse of that highly sensitive data. In such cases, notice and consent is not required per se for the reuse of the data but the intermediary processes the data undergoes must be done in a controlled environment so that the findings of the data set are made available rather than the data itself.
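The verification pattern described above – confirming a single fact without handing over the underlying record – can be sketched roughly as follows. This is not a real zero-knowledge proof: an HMAC stands in for whatever signature or credential scheme an actual intermediary would use, and the record store, key and identifiers are invented for the example.

```python
import hashlib
import hmac
from dataclasses import dataclass

# Hypothetical health records held by the trusted intermediary, never by the venue.
HEALTH_RECORDS = {
    "user-123": {"vaccinated": True, "vaccine": "brand-x", "dob": "1990-01-01"},
}

INTERMEDIARY_KEY = b"demo-signing-key"  # stands in for real credential infrastructure


@dataclass(frozen=True)
class Attestation:
    subject: str
    claim: str          # the only fact disclosed, e.g. "vaccinated=True"
    signature: str


def attest_vaccination(subject_id: str) -> Attestation:
    """Intermediary confirms vaccination status without sharing the record."""
    vaccinated = HEALTH_RECORDS[subject_id]["vaccinated"]
    claim = f"vaccinated={vaccinated}"
    sig = hmac.new(INTERMEDIARY_KEY, f"{subject_id}|{claim}".encode(),
                   hashlib.sha256).hexdigest()
    return Attestation(subject=subject_id, claim=claim, signature=sig)


def venue_accepts(att: Attestation) -> bool:
    """The venue checks the attestation; it never sees dates of birth or vaccine brand."""
    expected = hmac.new(INTERMEDIARY_KEY, f"{att.subject}|{att.claim}".encode(),
                        hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, att.signature) and att.claim == "vaccinated=True"


if __name__ == "__main__":
    attestation = attest_vaccination("user-123")
    print(venue_accepts(attestation))  # True; no other health data is disclosed
```

The point is the shape of the exchange: the establishment learns only the single claim and a way to check that it came from the intermediary.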
To automate the data intermediary process there are some additional concerns about machine decision-making that people may inherently distrust due to the machine’s lack of empathy.45 In addition, as well as perceived harm, as the Future of Privacy Forum points out, harms associated with automatic algorithmic decision-making can vary.46

So how to instil trust? It comes down to backstops of governance including provisions for recourse and mechanisms for redress. The rules of a banking transaction – the execution of standardized and consistent behaviour throughout the transaction – act as a de facto data intermediary because the data is handled through a specific process with rigorous backstops. This example plays out especially in the payments industry, where people rely on trusted third-party technology to handle money and the data that represents the value of that money. Nowhere is this truer than in blockchain technology and cryptocurrency, where the value of assets is intangible and inherently and inextricably fully dependent on trusted data.

Many different technologies could potentially serve a role as an intermediary; but some of the most interesting and relevant are those acting as software agents. A software agent is defined by four key hallmarks: autonomy, social ability, responsiveness and proactiveness.47

Excitingly, digital agents may negotiate access to data above and beyond a simple binary gated function. Using sophisticated algorithms may allow for decisions that emulate agency and autonomy in as close a way to human decision-making as possible.
Autonomy: Agents should be able to perform the majority of their problem-solving tasks without the direct intervention of humans or other agents; and they should have a degree of control over their own actions and their own internal state.

Social ability: Agents should be able to interact, when they deem appropriate, with other software agents and humans in order to complete their own problem solving and to help others with their activities where appropriate.

Responsiveness: Agents should perceive their environment (which may be the physical world, a user, a collection of agents, the internet, etc.) and respond in a timely fashion to changes that occur in it.

Proactiveness: Agents should not simply act in response to their environment; they should be able to exhibit opportunistic, goal-directed behaviour and take the initiative where appropriate.
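Read as an interface, the four hallmarks map naturally onto the surface a software agent exposes. The sketch below is purely illustrative – the class and method names are invented here and do not correspond to any established agent framework.

```python
from abc import ABC, abstractmethod
from typing import Any, Dict, List


class SoftwareAgent(ABC):
    """Illustrative interface mapping the four hallmarks above to methods."""

    @abstractmethod
    def decide(self, task: Dict[str, Any]) -> str:
        """Autonomy: resolve most tasks without direct human intervention."""

    @abstractmethod
    def communicate(self, peers: List["SoftwareAgent"], message: str) -> None:
        """Social ability: interact with other agents and humans when appropriate."""

    @abstractmethod
    def perceive(self, event: Dict[str, Any]) -> None:
        """Responsiveness: observe the environment and react to changes in time."""

    @abstractmethod
    def propose_actions(self) -> List[str]:
        """Proactiveness: take goal-directed initiative, not just react."""
```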
In order to be truly at the service of the individual, a trusted digital agent (TDA) that automates permissions for people and effectively manages their data across different services needs to respect a certain number of rules. Below is an outline of a prototype concept of how such a TDA might work.
Rules of the game for Valexander, a friendly, trustworthy TDA

The TDA will base its decisions mostly upon:

– Previous consents and preferences of the person, as well as the full context of such consents (who, what, when, why, etc.).

– Previous consents and preferences of the large amount of people that agent serves.

– Information about the person (age, gender, objectives, etc.).

– Information about the services it exchanges data to and from:
  – How is that information described and provided? For instance, for the health sector there should be a registry so that health services (public and private) register that information.
  – This will guarantee fair access to data about the services needed by TDAs, explainability of the TDA’s decision and foster competition among TDAs.

Such a TDA needs to guarantee:

– TDAs should be neutral and independent in regard to the digital services that will use the person’s data, in order to prevent any conflict of interest and ensure the TDAs only serve the interests of the person.
  – The TDA should therefore be able to perform a basic check that compliance exists before any data sharing occurs, with the caveat that a verifiable third party likely conducted the compliance itself.
  – Such checks could be traced using smart contracts on the chain, for instance.

– TDAs should have contracts with people guaranteeing it serves their best interests.

– Auditability, explainability of its processes:
  – The TDA needs to be able to explain why it shared data with one service and not with another.

– The person can manage and decide its preferences on the data, reset any profile the TDA is supposed to use, and needs to be able to reverse an automatic decision made by the TDA or made in consequence of the TDA’s decision.

– Governance structures (public-private-people partnerships) need to be mandated or created to decide and standardize:
  – When human interaction is necessary
  – The automatic decisions that can be reversed and how
  – Human interaction for some data sharing

– It is essential to ensure a person can change their TDA and that there is competition among TDAs:
  – People can easily switch from one TDA to another without losing their preferences or profile. TDAs can differentiate on the quality of their AI but not on the data they access about the person or about the services.
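To complement the box above, here is a minimal Python sketch of a TDA applying a few of those rules: a compliance check before sharing, a lookup against the person's pre-consented preferences, and an audit trail that lets the agent explain its decision. The class, field and service names are hypothetical; governance, reversal of decisions and smart-contract tracing are left out.

```python
from dataclasses import dataclass, field
from typing import Dict, List, Tuple


@dataclass
class Service:
    name: str
    sector: str
    compliance_verified: bool  # attested by a verifiable third party, per the box above


@dataclass
class TrustedDigitalAgent:
    """Minimal sketch of a TDA applying the 'rules of the game' above."""
    person_preferences: Dict[str, bool]          # sector -> willing to share?
    consent_history: List[str] = field(default_factory=list)
    audit_log: List[str] = field(default_factory=list)

    def decide_share(self, service: Service, purpose: str) -> Tuple[bool, str]:
        # 1. Basic compliance check before any data sharing occurs.
        if not service.compliance_verified:
            reason = f"blocked: {service.name} lacks third-party compliance attestation"
        # 2. Previous consents and stated preferences of the person.
        elif not self.person_preferences.get(service.sector, False):
            reason = f"blocked: no pre-consent for the {service.sector} sector"
        else:
            reason = f"shared with {service.name} for '{purpose}' (pre-consented sector)"
            self.consent_history.append(f"{service.name}:{purpose}")
        # 3. Auditability/explainability: every decision is logged with its reason.
        self.audit_log.append(reason)
        return reason.startswith("shared"), reason

    def explain_last_decision(self) -> str:
        return self.audit_log[-1] if self.audit_log else "no decisions yet"


if __name__ == "__main__":
    tda = TrustedDigitalAgent(person_preferences={"health": True, "advertising": False})
    clinic = Service("city-clinic", "health", compliance_verified=True)
    broker = Service("ad-broker", "advertising", compliance_verified=True)

    print(tda.decide_share(clinic, "book appointment"))   # shared
    print(tda.decide_share(broker, "profile building"))   # blocked
    print(tda.explain_last_decision())
```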
In the case of a sophisticated approach like the one above, TDA interoperability must be mandatory in order for this system to function. But Valexander is just one example: this paper’s role is to unearth the opportunities and risks of TDAs as the world moves towards trusted digital agency that is not just interesting from a technical and policy perspective but may become essential in one form or another.
Designing for online interfaces and interactions relies upon existing heuristics. Such heuristics have remained largely unchallenged despite developments in both the underpinning business models of online platforms and data protection law. Even when designing to support user consent, the rule of minimal distraction (that a designer should seek to ensure the user is not distracted or noticeably redirected from their principal activity/goal) remains a tenet.49 However, when the locus of consent is distributed or redirected, as in the case of data intermediaries, this then requires substantial rethinking of how to approach the design of such interactions. At its most basic, a priori consent requires that the user be cognizant of the transaction, informed of its implications, and capable of agreeing to the terms.

One might assume that if a data intermediary is sought, then this is a voluntary choice (and ultimately revocable); therefore, the moral role of interaction designers is not to replicate consent but simply to scaffold understanding and promote agency,50 so as to ensure that any signal of assent is sufficiently supported. This notion of informational sufficiency is highly contextual but broadly includes: (a) how much a user should understand and the presentation of this information; (b) what aspects of the system or the data transfers should be highlighted/brought to the surface and how/when this might occur; and (c) when and how to alert users to changes in system state. Another way to consider this is to first ask: “How much do I need to know to ensure I am neither surprised nor upset by the use of my data?” The second question to ask is: “How soon, and in what way, would I wish to know this?”

Another way to look at this issue is through human data interaction (HDI), a normative framework intended to reduce the overarching issues into three principles: agency, legibility and negotiability.51 HDI pushes individuals beyond user awareness and control, extending this to include questions over how a user might interrogate the system in order to support their understanding and then how the user might allow the system to exert control over how their data is used. Arguably, even if a data intermediary distributes consent, the user should still be unsurprised by what happens, be able to interrogate the model, and have the tools available that allow them to act, if they so choose. So, the system should be accessible, interrogable, intelligible and controllable.

Finally, the use of a data intermediary, to overcome the limitations of notice and consent, does not do away with the core components of notice and consent but merely displaces them. Informing, agency and revocation (awareness and control) are still central to the functioning of an effective intermediary. Equally, agreeing to trust such a system with data requires a priori assent but with the additional burden of informational sufficiency, as with any software product. However, given the normative nature of such a system, it is also necessary to consider how to design the onboarding/assent process to be one more akin to an engagement with any offline intermediary. While such relationships are notoriously difficult to model through systems design, one interesting concept is that of building in latency, or the affordance of delay, in the law and the design of computational systems.52 And the “ongoing pursuit of seamless user experiences forecloses opportunities for engagement with the text, meaningful reflection, suspicion and interrogation, thereby limiting agency and autonomy,”53 raising the importance of building moments of latency into interaction design, particularly in the build-up to assent.
The design for which a digital agent framework becomes universally accessible and desirable must follow a traditional approach to achieving ease of use. To a greater degree when dealing with digital identity and personal data, there exists a challenge of trust and participation, which makes it vital to achieve a low rate of attrition. Beginning with the user, there must be a high degree of individual control and open knowledge developed into the framework to ease concerns of surveillance, misuse and security vulnerabilities.

To the first point of surveillance, a data intermediary could be designed as a pass-through mechanism without knowledge of the data exchanged, where no access to the data is required for the service. Producers of data will be sceptical of each point of interaction between producer and consumer, thus creating a need for open design.

With a blurred reality of liability and consequences for data misuse, a decentralized exchange system such as blockchain must be incorporated to enhance security and limit siloed control. The responsibility of intermediaries to act on behalf of both parties creates a need to establish well-checked decentralized transaction and decision-making processes throughout the entire exchange. Whether government, enterprise, private company or individual, trust must be earned and security proven through the design of a framework that includes the following attributes:

Useful: A portable and responsive design that functions across platforms and is acceptable to less tech-savvy users.

Inclusive: A universal, non-discriminatory and accessible tool that allows ease of use and inhibits exclusion, and whose design prevents surveillance.

Secure: A trusted and open framework that is auditable and designed with a dashboard that provides notifications of all data access points.

Choice: A user-centric and user-managed design where alternatives are provided through informed consent.

Purpose: The accuracy and sustainability of design that encourage use across services over time, with predictable outcomes.

The EU’s Data Governance Act proposes a framework for the governance of data intermediaries, including the obligation to have neutral and independent data intermediaries, interoperability of data intermediaries, registration and specific governance organizations.

Initiatives are emerging to unite data intermediaries and public and private service providers to form such governance organizations and start building those rules for automatic human-centric data sharing. For example, in the European Union, aNewGovernance unites leaders of such TDAs and organizations in the skills (education, employment, etc.) and mobility sectors. In India, Sahamati does the same for the banking and the healthcare sector. Both are producing governance rules and are working on concrete use cases to help build such human-centric data networks.
The consensual sharing of data rests on the balance of incentives (such as for innovation, profit or philanthropy) and disincentives (such as privacy concerns and proprietary interests or other disincentives such as external regulatory intervention). In many cases, regulation intervenes to bridge this trust gap by demanding a level of data protection and privacy be adhered to. While that may tackle disincentives, in most markets there remains a lack of regulatory support for data intermediaries.

Effective trustworthy data intermediaries, which opt in or out on behalf of people, might ease the subjective need for strict legislation in specific industries and for specific use cases and instead allow for a more harmonized and holistic approach with multiple applications. The appeal of TDAs is that they are similarly simplistic and complex: when a TDA can navigate any data sharing scenario, the sky is the limit for the opportunity – and the risk.
Due to the risks that data intermediaries can pose to fundamental rights – next to their benefits if implemented correctly – it seems appropriate to explore having certain provisions for data intermediaries enshrined in law.

– Transparency and neutrality

Transparent data trusts may be more neutral than others. One way this can be achieved is by guaranteeing that the monetization of the service mainly derives from the management of the data and possibly the provision of added-value services and not from using the data itself. The EU’s Data Governance Act contains provisions of this nature and echoes the ePrivacy Directive60 where providers of electronic communications services may transport data but may not harness it for their own purposes, including commercial use.
As a consequence of the GDPR’s focus on the protection of the European Union’s fundamental right to privacy, it contains a strict purpose limitation for granting consent, which currently hinders many use cases of data altruism. Another area with potential is that of cross-border data transfers: with the right amendments to the law, could an adequacy decision (Art. 45 GDPR) be issued in favour of a data trust?

Data intermediaries will need to consider where they are located, their place of legal establishment, corporate structure, and independence. They will also need to consider the impacts of legislative movements to localize the residency of data intermediaries and require representatives in-country, among other similar requirements, which could prejudice those use cases, even when ostensibly motivated to protect personal data.

Finally, it is critical to emphasize the need to enable and facilitate global data flows while maintaining high standards for privacy and security, as the free flow of data is the backbone of any data sharing economy.
Meanwhile, as useful as regulation is, businesses also have a role to play in developing responsible data intermediaries. While some aspects of how business will ultimately drive the design of digital agents have already been discussed, responsible businesses may wish to explicitly consider the following:

– Standards

Widespread standards are a precondition for efficient and well-functioning data intermediary systems. Standardized machine-readable formats and communication protocols allow for the automation of the execution of services offered by data intermediaries and thus allow them to scale. The private sector has a crucial role to play in the adoption of standards: what industry as a whole uses ultimately becomes endorsed at a systemic level. A government, in turn, may endorse it later, either explicitly or implicitly; at the very least, standards are passively tolerated.

– Certification/licensing schemes

Certification or licensing schemes, such as certificates of conformity, are a well-established co-regulatory measure and an acknowledged option for the regulation of data intermediaries.65 A certification could work as follows: the legislation would define a set of core criteria that all certified intermediaries should meet in order to demonstrate their neutrality. This set of core criteria could include the absence of conflicts of interest, no competition with data users (e.g. no development of own data apps in competition with others, so as to avoid any risk of self-preferencing) and a commitment not to discriminate between companies that would like to offer data services (openness obligation).66 Certification under these criteria could then be either voluntary or compulsory.

– Law enforcement and access requests

The inevitable pressure to either resist or comply with lawful access or intelligence requests for data presents a host of challenges and implications, especially in light of three trends: (i) law enforcement and intelligence community responsibilities to safeguard national interests against domestic and transnational threats; (ii) increasing restrictions on cross-border data flows, based in part on concerns with those lawful access/surveillance responsibilities and authorities; and (iii) an increasing desire to localize and tap into data to develop revolutionary technologies like AI.

In addition, the fact that daily lives are increasingly lived online leads to requests between private parties to access data via mechanisms like a subpoena.

Businesses using data intermediaries will therefore need to consider:

– Whether to encrypt data in a way that only the authorized recipients can access it, such that not even the intermediary itself can access the data in an intelligible fashion (which may, however, prevent the data intermediary from offering some services that require access to the data in the clear); an illustrative sketch of such recipient-only encryption appears after these considerations;

– Whether to seek legislative relief from, and protection against, lawful access requests; or

– Whether to create policies and procedures to handle such requests (which is a legal obligation under the GDPR and other privacy laws globally).

– Collaboration

Finally, businesses may wish to consider learning from non-traditional allies, such as peers with different business models who could benefit from the use of TDAs. Peer-to-peer learning can provide opportunities for the application of TDAs. Those applications in turn inform the broader policy and governance debate.
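The first consideration above (encrypting data so that only authorized recipients can read it) can be illustrated briefly. The following is a minimal sketch, not a prescription: it assumes a Python environment with the third-party cryptography package installed, and it applies RSA-OAEP directly to a short payload for brevity, whereas real deployments would normally wrap a symmetric content key. The point it demonstrates is that the intermediary only ever relays ciphertext it cannot interpret; all function and variable names are hypothetical.

```python
from cryptography.hazmat.primitives import hashes, serialization
from cryptography.hazmat.primitives.asymmetric import padding, rsa

OAEP = padding.OAEP(
    mgf=padding.MGF1(algorithm=hashes.SHA256()),
    algorithm=hashes.SHA256(),
    label=None,
)

# The authorized recipient generates a key pair and publishes only the public key.
recipient_private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
recipient_public_key_pem = recipient_private_key.public_key().public_bytes(
    encoding=serialization.Encoding.PEM,
    format=serialization.PublicFormat.SubjectPublicKeyInfo,
)

def encrypt_for_recipient(payload: bytes, public_key_pem: bytes) -> bytes:
    """Run by the data producer: only the holder of the private key can decrypt."""
    public_key = serialization.load_pem_public_key(public_key_pem)
    return public_key.encrypt(payload, OAEP)

def relay(ciphertext: bytes) -> bytes:
    """Run by the intermediary: it forwards (and may log) bytes it cannot read."""
    return ciphertext

# Producer -> intermediary -> recipient
ciphertext = encrypt_for_recipient(b"blood pressure: 120/80", recipient_public_key_pem)
delivered = relay(ciphertext)
plaintext = recipient_private_key.decrypt(delivered, OAEP)
assert plaintext == b"blood pressure: 120/80"
```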
In many ways, a lot of things are already going wrong. Users are carved up as products and their data is used in ways they are uninformed about – or feel uninformed about; in ways that could be inconsistent with their values or preferences; or in unexpected ways that the application did not disclose. In worst-case scenarios, digital agents could lead to the non-transparent use of data, including in ways that harm the data subject.

At the system level, without sufficient diversity, people may find the opposite – that they have, or perceive themselves to have, a reduced spectrum of choices or agency. This is due to the echo chamber effect of group think.

On the flip side, a lot could go right:

– A balance of control, in which any user understands the decision they are making as to voluntarily providing or withholding their data, thanks to their understanding of the application’s policy and the accountability of the host company, OR the scaling of the user’s permission sets according to their skill in and understanding of technology (see the illustrative sketch below).
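The second half of that point, permission sets that scale with a user's skill and understanding of technology, can be pictured as a simple tiered model. The sketch below is purely illustrative, with hypothetical tier names and scopes: defaults stay narrow for novice users, and any wider sharing requires an explicit, informed opt-in.

```python
from enum import Enum

class Proficiency(Enum):
    NOVICE = 1
    INTERMEDIATE = 2
    EXPERT = 3

# Hypothetical default sharing scopes per proficiency tier.
DEFAULT_SCOPES = {
    Proficiency.NOVICE: {"share_aggregates"},
    Proficiency.INTERMEDIATE: {"share_aggregates", "share_pseudonymized"},
    Proficiency.EXPERT: {"share_aggregates", "share_pseudonymized", "share_identified"},
}

def permission_set(proficiency: Proficiency, explicit_consents: set[str]) -> set[str]:
    """Start from the tier's defaults; widen only with the user's informed consent."""
    return DEFAULT_SCOPES[proficiency] | explicit_consents

# A novice shares only aggregates by default but may explicitly opt in to more.
print(permission_set(Proficiency.NOVICE, set()))
print(permission_set(Proficiency.NOVICE, {"share_pseudonymized"}))
```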
In developing the rules of the game for trusted data intermediaries, which give rise to a potentially automated regime of personal data sharing at a system level in a manner that overcomes the limitations of notice and consent regimes, it is the voice and presence of people that matter most. This presence can be amplified by taking a human-centric approach to the issue and placing people at the centre of such a step change in data policies.

But it also requires a multistakeholder approach to get right. It is only by listening – to people, to understand their experience and desires; to businesses, to understand their innovations and constraints; to scholars, who can isolate commonalities between models; and of course to governments, who aim for evidence-based policy-making from a unique vantage point – that it is possible to start to understand the rich tapestry of the implications of data intermediaries, especially trusted digital agents, in different scenarios.

The concept of trusted digital agency is effectively in policy “beta”67 mode and therefore requires testing from all stakeholders. Only when the concept is tested will it be possible to unearth the solutions that society will demand to advance towards trusted digital agency. That will be the key to holistic, systemic policy-making that leverages technological advancement for human-centric, pro-innovation purposes in areas such as international data transfers, healthcare research and diagnostics, innovation itself, and a safer and more inclusive online world.
Donald Bullers
Global Technical Lead, Elastos Foundation
Jan Huizeling
Vice-President, Digital Labs, Yara International
Robert Kain
Chief Executive Officer and Co-Founder, LunaPBC
Richard Kibble
Global Head, Data Privacy, Alcon
Kimmy Bettinger
Project Specialist, Data Policy
Evîn Cheikosman
Policy Analyst, Crypto Impact and Sustainability
Accelerator
Tenzin Chomphel
Project Coordinator, Data Policy
Sheila Warren
Deputy Head, Centre for the Fourth Industrial
Revolution Network
Danielle Carpenter
Editor
Laurence Denmark
Design
See also de Werra, J., 2016, “ADR in Cyberspace: The Need to Adopt Global Alternative Dispute Resolution Mechanisms
for Addressing the Challenges of Massive Online Micro-Justice”, Swiss Review of International & European Law 2016,
289-306, https://ssrn.com/abstract=2783213.
39. See the report Data trusts: legal and governance considerations by BPE Solicitors, Pinsent Masons and Chris Reed at
Queen Mary University of London, 2019, https://theodi.org/wp-content/uploads/2019/04/General-legal-report-on-
data-trust.pdf, p. 41: “[…] dispute review boards (“DRB”), are ordinarily seen under construction contracts and exist for
the length of a particular project. These are put in place by contractual arrangement and governed by the International
Chamber of Commerce Board Rules. The model however could equally be applicable to disputes arising out of a data
trust if similar DRB provisions were to be put into the terms of use for the providing or licensing of data.”
40. International Network on Digital Self-Determination, https://idsd.network/.
41. Source: World Economic Forum, 2018, Identity in a Digital World: A new chapter in the social contract,
https://www.weforum.org/reports/identity-in-a-digital-world-a-new-chapter-in-the-social-contract.
42. World Economic Forum, 2018, Identity in a Digital World: A new chapter in the social contract, https://www.weforum.org/
reports/identity-in-a-digital-world-a-new-chapter-in-the-social-contract.
43. Kinston, J. & Ng, I., 2021, “Personal Data Servers will help take back digital ID from big tech”, Wired, https://www.wired.
co.uk/article/personal-data-servers.
44. Centre for Data Ethics and Innovation, 2021, Unlocking the value of data: Exploring the role of data intermediaries – An
exploration of the role intermediaries could play in supporting responsible data sharing, https://assets.publishing.service.
gov.uk/government/uploads/system/uploads/attachment_data/file/1004925/Data_intermediaries_-_accessible_version.
pdf.
45. Turner, C., David, S., Ahuja, A. & Wulfsohn, G., 2016, Accenture Labs Ethical algorithms for “sense and respond” systems,
Accenture, https://www.academia.edu/31436602/Accenture_Labs_Ethical_algorithms_for_sense_and_respond_systems.
– MyData Operators, which empower individuals by improving their right to self-determination regarding their personal data, at https://mydata.org/;
– Decode, a consortium of 15 organizations from across the European Union, at https://www.decodeproject.eu/;
– The Solid project at the Massachusetts Institute of Technology, at https://solid.mit.edu/;
– RadicalxChange (RxC), a global movement for next-generation political economies, at https://radicalxchange.org/.
55. European Parliament resolution of 20 October 2020 with recommendations to the Commission on a Digital Services Act:
adapting commercial and civil law rules for commercial entities operating online (2020/2019(INL)), Art. 17,
https://www.europarl.europa.eu/doceo/document/TA-9-2020-0273_EN.html.
56. Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural
persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive
95/46/EC (General Data Protection Regulation), https://eur-lex.europa.eu/eli/reg/2016/679/oj.
57. See Whitt, R., 2021, “Hacking the SEAMs: Elevating Digital Autonomy and Agency for Humans”, Colorado Technology
Law Journal, Vol. 19, Issue 1, 137, 202 (2021).
58. Ibid.
59. European Parliament, 2020, Regulation of the European Parliament and of the Council on European data governance
COM (2020) 767.
60. European Parliament and European Council, 2002, Directive 2002/58/EC of the European Parliament and of the Council
concerning the processing of personal data and the protection of privacy in the electronic communications sector
(Directive on privacy and electronic communications) as amended, https://eur-lex.europa.eu/eli/dir/2002/58/oj.
61. European Parliament, 2020, Regulation of the European Parliament and of the Council on European data governance
COM (2020) 767 final.
62. Cf. Data Ethics Commission of the Federal Government of Germany, 2019, Opinion of the Data Ethics Commission,
p. 134, https://www.bmjv.de/SharedDocs/Downloads/DE/Themen/Fokusthemen/Gutachten_DEK_EN_lang.pdf?__
blob=publicationFile&v=3.
63. InfoCuria, n.d., Case-law, https://curia.europa.eu/juris/documents.jsf?num=C-311/18.
64. World Economic Forum, 2021, A Roadmap for Cross-Border Data Flows: Future-Proofing Readiness and Cooperation in
the New Data Economy, https://www.weforum.org/whitepapers/a-roadmap-for-crossborder-data-flows-future-proofing-
readiness-and-cooperation-in-the-new-data-economy.
65. European Parliament, 2020, Regulation of the European Parliament and of the Council on European data governance
COM (2020) 767.
66. European Commission, 2020, EU Commission impact assessment accompanying the proposal for a Regulation on
data governance (Data Governance Act) from 25 November 2020, SWD (2020) 295 final, p. 26, https://ec.europa.eu/
newsroom/dae/document.cfm?doc_id=71225.
67. Beta mode is the test and trial period for a new piece of software when rapid learning and experimentation occur.
68. Open Data Watch, 2018, “The Data Value Chain: Moving from Production to Impact”, https://opendatawatch.com/
publications/the-data-value-chain-moving-from-production-to-impact/.
69. EQS Group, 2021, “What Is RegTech?”, https://www.eqs.com/en-us/compliance-knowledge/blog/what-is-regtech/.