Carnegie Endowment for International Peace

Report Part Title: Incorrect Solutions for Online Privacy Harms


Report Title: Will India’s Proposed Data Protection Law Protect Privacy and Promote Growth?
Report Author(s): Anirudh Burman
Carnegie Endowment for International Peace (2020)

Stable URL: http://www.jstor.com/stable/resrep24293.5

Incorrect Solutions for Online Privacy Harms

Problems With Consent as a Cornerstone of Data Protection

The key regulatory approach adopted in the Personal Data Protection Bill seeks to protect consumers from uses of data that could be harmful to them. The bill does not, however, identify specific harmful practices. Instead, it makes user consent an important part of the data protection framework. To do so, it mandates that personal data can only be collected after providing notice and obtaining consent.49 Such consent must be free, informed, clear, and specific, and there must be provisions that allow users to withdraw it.50 In addition, other features, such as time limits on data retention and disclosure requirements, are intended to regulate how personal data can be used by data fiduciaries. The bill therefore focuses on adequate disclosure to individuals as a mechanism for preventing harm to them.

In addition, the bill aims to reduce the information asymmetry between consumers and data fiduciaries regarding the use of personal data. It does so by limiting the purposes of data processing and by giving users the right to access their personal data and the right to know how it will be used. Users can also correct their personal data stored with data fiduciaries. The bill requires that data fiduciaries give notice of these rights to consumers before collecting their data.51 This notice must provide, among other information, the purposes of data collection, the categories of personal data collected, the source of collection, the persons with whom such data may be shared, and information about grievance redress.52
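
As a concrete illustration of these disclosure requirements, the following Python sketch shows one hypothetical way a data fiduciary might represent such a notice as structured data. The field names and sample values are invented for illustration and are not drawn from the bill’s text.

    from dataclasses import dataclass

    # Hypothetical representation of the notice described above; the field
    # names are illustrative, not the bill's drafting. They mirror the listed
    # disclosures: purposes, categories, source, recipients, grievance redress.
    @dataclass
    class PrivacyNotice:
        purposes: list[str]          # why the data is being collected
        data_categories: list[str]   # kinds of personal data collected
        source_of_collection: str    # e.g., directly from the user
        shared_with: list[str]       # persons the data may be shared with
        grievance_contact: str       # how to seek grievance redress

    notice = PrivacyNotice(
        purposes=["account creation", "service personalization"],
        data_categories=["name", "email address", "device identifiers"],
        source_of_collection="collected directly from the user",
        shared_with=["payment processor", "analytics vendor"],
        grievance_contact="grievance.officer@example.com",
    )

Under the bill, a notice covering at least these elements would have to be presented to the user before any data is collected.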

The proposed Data Protection Authority (DPA) will oversee data fiduciaries’ compliance with these obligations.53

The Srikrishna committee regarded these provisions as foundational to the legislation:


The notice and choice framework to secure an individual’s consent is the bulwark on which data processing practices in the digital economy are founded. It is based on the philosophically significant act of an individual providing consent for certain actions pertaining to her data.54

The committee’s report states that while consent is the basis for the digital economy, existing practices of consent are broken. Based on this assumption, it proposes to empower the DPA to inquire into cases where data fiduciaries or processors have “violated any of the provisions of this Act or the rules prescribed.”55

The report and the bill acknowledge that users are not capable of providing meaningful consent, and yet—somewhat paradoxically—they build on the premise that stronger consent mechanisms can lead to better outcomes.56 The report argues that consent is usually obtained through complicated agreements that individuals do not read. Those who do read the agreements often cannot understand them, and even comprehensible agreements cannot be negotiated.57 But rather than move away from a consent-based framework, the bill incorporates a preventive principle of consent—that is, it concludes that since individuals are incapable of consenting in a meaningful manner, consent must be regulated.

The bill’s approach also fails to recognize the ways in which existing legal frameworks that already regulate consent have failed. As stated earlier, since the 1970s, legal frameworks have predominantly been aimed at ensuring consent-based data protection. This legal regime shaped the data collection practices of technology firms. But securing consent has become meaningless as a basis for data protection, not just because of the problems with the idea of meaningful consent but also because sweeping technological changes have rendered the idea increasingly obsolete. It is important, therefore, to ask whether doubling down on a consent-based framework is likely to protect personal data in India.

The Srikrishna committee accepted that consent on the internet is “broken”:

A preponderance of evidence points to the fact that the operation of notice and consent on
the internet today is broken. Consent forms are complex and often boilerplate. . . . Any
enumeration of a consent framework must be based on this salient realisation: on the
internet today, consent does not work.58

However, the committee goes on to state that the issue is practical rather than conceptual. In this view, the problem is not with consent per se but with how consent-based data protection has been conceived. According to the committee, a better consent architecture is likely to be more effective at protecting privacy.59 The provisions in the bill, however, do not radically alter existing consent frameworks. They continue to rest on the assumption that consent is the best mechanism for protecting personal data, provided it is supplemented by additional requirements for how it is to be given: explicitly, freely, and with the ability to withdraw it.60

It is not clear how the Srikrishna committee reached the conclusion that strengthening the consent
framework would lead to better data protection. Its report presents no empirical evidence to show
how this revised framework would be more effective.

Since the ability to give consent depends on whether a person is knowledgeable about what he or she is consenting to, any empirically grounded consent framework should seek to ascertain how far Indian users value their informational privacy and how they make trade-offs between the benefits of consenting to digital services and the risks to their privacy.61 The bill does not base its analysis of this issue on any empirical studies that could answer this question. There is, however, evidence from other jurisdictions showing that users have a fairly low threshold for consenting to give away information about themselves.

One study from 2011 found that users of a large software company spent a median of just six seconds reading the end user license agreement presented during installation of new software.62 It further found that no more than 8 percent of users read the license agreement in full.

Similar user behavior has been observed in other studies. An IBM survey found that, though users
think companies should be more heavily regulated for data management, 71 percent of them were
still willing to give up privacy to get access to the technology they sought, and only 16 percent had
ever walked away from a company because of data misuse.63 Researchers who tracked the online
behavior of more than 48,000 visitors to ninety online software companies found that “only 2 out of
every 1,000 retail software shoppers access the end-user license agreement [EULAs].”64 The study
found that “EULAs were accessed in only 63 of the 131,729 visits to software retailers (0.05% of all
such visits) and in 44 visits to freeware companies (0.15%).”65 The study goes on to cite research highlighting that increased disclosure does not necessarily increase readership of contract terms.66

Other research has found that some users tend to uninstall a software program if, after installation, they are presented with a notice about the company’s data practices alongside the end user license agreement. Even among users who opted to download the software despite these notices, many later wished they had not done so.67

If users do not use consent agreements to protect their online privacy, should a legal framework
enforce a consent-based regime, particularly in the absence of clear evidence that it is likely to work?

In addition, a consent-based framework may intensify existing problems. As one paper points out, a notice-and-consent framework designed along the lines of the EU’s General Data Protection Regulation (GDPR), as the bill is, is likely to exacerbate the cognitive problems involved in giving meaningful consent.68 The Srikrishna committee also noted that users must contend with an overabundance, not a scarcity, of disclosure-related information about consent under existing frameworks.69 If current consent mechanisms lead to information overload and consent overload, the idea of “stronger” consent proposed in the bill is likely to make these problems worse. The proposed framework would therefore provide more information to consumers (consent agreements would have to contain more disclosures and cover more rights and obligations, and fresh consent would be required for each fresh purpose) without necessarily increasing data privacy.

In addition, the existence of high penalties in the GDPR for violating notice and consent requirements has been critiqued on the basis that it is likely to make technology companies more risk-averse, leading to consent agreements that have stronger opt-in clauses and are even more legalistic in nature.70 The bill also proposes high monetary penalties for violations.71 This could work to the detriment of both users and companies. Increased consent requirements could further desensitize users to consent agreements. Firms, meanwhile, may find that users trust them less if they feel misled, even when the firm has complied with legal requirements.72

Alessandro Acquisti, a professor of information technology and public policy, points out that an
overreliance on consent would create its own costs that could undermine the goals of data protection.
He writes:

Additional costs . . . comprise the social losses due to ‘incoherent privacy policies’: amidst a complex array of legislative and self-regulatory initiatives, both consumers and firms are uncertain about the level of protection afforded to, or required for, various types of personal data. This uncertainty is costly in itself, in that it forces data subjects and data holders to invest resources into learning about the admissibility of a given data practice. It also creates costly second order effects, in that it may lead both data subjects and data holders to inefficiently under- or over-invest in data protection.73

The proposed notice-and-consent framework may therefore be counterproductive. It may not actually prevent harms from online activity but instead exacerbate moral hazard: users could place increased reliance on regulation and become more careless in their online behavior. Additionally, cognitive loads on users may increase. This could, in turn, make consent requirements futile for protecting personal data. If the proposed notice-and-consent framework cannot achieve its stated objective of implementing a preventive privacy framework, its costs for a country like India would outweigh its benefits.

Limitations on Data Processing

The bill proposes various limitations on data processing. These are rooted in the idea that consumers have little knowledge of how their data are being processed. The bill proposes that data should be processed only for specific, clear, and lawful purposes;74 that those purposes be reasonable;75 that they be limited to the purposes users have consented to;76 and that only data necessary for such purposes should be collected.77 In addition, data storage limitations require that data be deleted once the purpose of their collection has been fulfilled.78

The rationale behind these requirements is that preventive limits on personal data are likely to result
in better individual control over the use of one’s personal data and reduce the scope for personal
harm. The Srikrishna committee report states:

If abuse of power is to be prevented, it is critical that the data fiduciary is obliged to use the personal data entrusted to it by the data principal only for the purpose for which the principal reasonably expects it to be used. This is the germ of the collection and purpose limitation principles.79

These provisions are agnostic to the kinds of harms that may occur due to the processing of data.
Instead of being narrowly tailored toward reducing specific harms, they impose significant preventive
obligations with respect to data processing.

However, some of these requirements are at odds with the evolving nature of the digital economy, and compliance with them could lead to productivity losses for India. For one, they seem out of step with the increasing adoption of machine-learning technologies that rely on large datasets to provide services. Big data has been defined as “high-volume, high-velocity and high-variety information assets that demand cost-effective, innovative forms of information processing for enhanced insight and decision making.”80 The difference between conventional analytics and big data or machine-learning analytics is that “programs don’t linearly analyze data in the way they were originally programmed. Instead they learn from the data in order to respond intelligently to new data and adapt their outputs accordingly.”81
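
To illustrate the distinction drawn in the quotation above, consider the following minimal Python sketch. It contrasts a rule fixed at programming time with a rule whose threshold is derived from example data; the income figures and the midpoint heuristic are invented purely for illustration and do not represent any real credit model.

    # Toy contrast between a pre-programmed rule and behavior learned from
    # data. All figures are invented for illustration.

    # Conventional program: the decision logic is fixed when the code is written.
    def fixed_rule(monthly_income: float) -> bool:
        return monthly_income >= 50_000  # hard-coded threshold

    # Learning sketch: the threshold is derived from labeled examples, so the
    # program's output depends on the data it has seen, not on a fixed rule.
    def learn_threshold(examples: list[tuple[float, bool]]) -> float:
        approved = [income for income, ok in examples if ok]
        rejected = [income for income, ok in examples if not ok]
        # Place the cutoff midway between the two groups' average incomes.
        return (sum(approved) / len(approved) + sum(rejected) / len(rejected)) / 2

    history = [(30_000, False), (42_000, False), (65_000, True), (90_000, True)]
    cutoff = learn_threshold(history)  # 56,750 for this sample data

    def learned_rule(monthly_income: float) -> bool:
        return monthly_income >= cutoff

    # The two rules already disagree on an income of 55,000: the learned rule's
    # behavior was not foreseeable before the data arrived.
    print(fixed_rule(55_000), learned_rule(55_000))  # True False

Purpose limitation assumes that the uses of data can be enumerated at the time of collection; as even this toy example shows, a learned model’s treatment of a given input is an artifact of the data it was trained on rather than of logic specified in advance.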

The predictions derived from big data often cannot be foreseen. The “opacity” of data processing, the use of large quantities of data, and “the use of new types of data” are what set big-data analytics apart from conventional analytics.82 Limiting the use of data to predefined purposes might hamper such innovations. For example, the Norwegian Data Protection Authority points out that “it is possible that a person’s Facebook activities are built into an algorithm that determines whether she will obtain a mortgage from the bank.”83 While such a use may in some cases violate a purpose limitation, it could also benefit potential seekers of credit: a financial service provider with access to such data could reach out to an underserved individual with an offer of credit. In such a case, the benefits of having a purpose limitation would have to be weighed against the cost of the opportunity foregone: increased access to credit. In India, this has important implications for meeting national economic objectives such as financial inclusion.

Additionally, the bill limits the purposes of data collection to those that a user might “reasonably
expect.”84 As highlighted earlier, a distinctive feature of big data is the difficulty in understanding
how personal data may be used in decisionmaking by algorithms. While the Srikrishna committee
report refers to a harms-based test, the bill does not incorporate any such requirement.

Similar concerns have been raised with regard to other provisions of the bill. For example, there is a possibility of conflict between the provision for algorithmic accountability and the use of other emerging technologies such as blockchain.85 A blockchain is a “digital database containing information that can be simultaneously used and shared within a large decentralized, publicly accessible network.”86 Blockchain technology is increasingly used in businesses such as e-stamping, logistics, and payment systems.87
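
A minimal sketch of the data structure behind this definition may help: blocks are appended to a chain, and each block commits to its predecessor by including its hash, so no participant can silently alter or delete an earlier record. This is a toy illustration under stated assumptions, not any particular blockchain platform’s design.

    import hashlib
    import json

    # Toy hash-linked chain: each block stores its predecessor's hash, so any
    # change to a past record invalidates every later block and is detectable
    # by anyone holding a copy of the chain.
    def make_block(data: dict, prev_hash: str) -> dict:
        body = {"data": data, "prev_hash": prev_hash}
        digest = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
        return {**body, "hash": digest}

    def verify(block: dict, expected_prev_hash: str) -> bool:
        body = {"data": block["data"], "prev_hash": block["prev_hash"]}
        digest = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
        return block["prev_hash"] == expected_prev_hash and block["hash"] == digest

    genesis = make_block({"record": "first entry"}, prev_hash="0" * 64)
    second = make_block({"record": "second entry"}, prev_hash=genesis["hash"])

    # Any participant can run this check; no central node's approval is needed,
    # which is why assigning a single accountable fiduciary is difficult.
    print(verify(second, genesis["hash"]))  # True

The same property that makes such a chain trustworthy (append-only, tamper-evident records replicated across many parties) is what makes both central accountability and the erasure of personal data written into it difficult, the tension discussed in the next paragraph.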

If personal data are stored on blockchain-based databases, such uses would be subject to the requirements of the bill. For example, the bill would require a central node or person to be accountable for the operation of the blockchain as a data fiduciary. However, certain kinds of blockchain designs, such as decentralized blockchains, have no central issuer or controller. The use of such systems could lead to difficulties in assigning accountability for data processing.88 Blockchain is increasingly being used in significant economic sectors in India, such as the TReDS platform for trade invoice discounting and the digitization of land records.89 While these are largely government platforms, the technology can also be used by private players for protecting intellectual property and enforcing contracts. The provisions of the bill could potentially limit the uses of this technology.

The bill, therefore, seeks to regulate the way technology is used without narrowly identifying harms
that could arise from its use. In doing so, it would circumscribe many beneficial uses of emerging
technologies.

Alternative Solutions for Protecting Online Privacy

The notice-and-consent framework, as well as the limitations on data processing discussed above, assumes that the privacy costs consumers bear under the proposed framework are lower than the benefits of protecting their privacy. This may not be the case. First, consumers incur opportunity costs in getting informed about their privacy; for example, there are significant costs to becoming properly informed about potential privacy risks by perusing companies’ privacy policies.90 Second, investment in privacy-enhancing technologies is also a cost for consumers.91 And third, consumers who prevent their data from being processed forego the benefits that accrue from such processing.

A stocktaking of these costs and benefits would be of great relevance to emerging economies such as
India. The government’s report on fintech states that the use of emerging technologies, such as
artificial intelligence and blockchain, could help address significant issues of access to finance for
large sections of society, particularly small businesses.92

A country like India—with low levels of access to credit, insurance, and other financial services—may make very different trade-offs between the need for such access on the one hand and the need for informational privacy on the other. By constraining the scope for innovation, the bill arguably overprotects informational privacy at a significant cost to the economy.

The bill does not actually state what harms its notice-and-consent provisions and data collection limitations are meant to protect users from. As a result, these provisions are not narrowly tailored to guard against specific harms. Further, they carry a serious risk of restricting innovations that could significantly benefit India.

The argument here is not that data fiduciaries should be allowed to use personal data without consent but rather that regulating consent is not an effective way to protect personal data. Principles of consumer protection in other economic activities, such as finance, usually prohibit specific kinds of contractual provisions and require the disclosure of specific kinds of activities to consumers. Such regulation is narrowly tailored to the kinds of conduct that could cause harm to consumers.

This regulatory approach is followed in many other sectors. For example, the EU’s directive on unfair
contractual terms states that

A contractual term which has not been individually negotiated shall be regarded as unfair if,
contrary to the requirement of good faith, it causes a significant imbalance in the parties’
rights and obligations arising under the contract, to the detriment of the consumer.93

This wide language is constrained by requiring that the unfairness of the contract be assessed “taking into account the nature of goods and services . . . the circumstances attending the conclusion of the contract . . . [and] to all other terms of the contract.”94 This cross-sectoral directive requires member states to put in place measures protecting consumers from such unfair terms.95

To follow this approach, the bill would have to move from a positive-list to a negative-list approach. This would mean that, where users have willingly consented to the use of their data, the privity of such contracts must be respected. Certain exceptional circumstances or contractual provisions may be deemed too harmful for consumers, and a regulatory agency could be given the power to periodically determine what such terms are. For the rest, no limitations or liability should be imposed on the use of personal data if consumers have willingly consented to its use.

This would imply that the provisions requiring detailed notice and consent could be dropped. While data would still have to be processed with user consent, the limitations on purpose, fairness of processing, and data storage would no longer be necessary. This approach could stand a better chance of protecting user privacy in a cost-effective manner.

New Compliance Costs and Their Economic Impact

The preventive approach adopted in the Personal Data Protection Bill is reflected in other provisions that significantly increase compliance requirements for all firms that process data in India. Since “processing” is defined expansively (to include, for example, the mere collection of personal data), these requirements would apply to all businesses.96 However, the costs of these requirements for businesses have not been assessed. The key implications of these provisions for Indian businesses are set out below.

A Significant Increase in Compliance Costs

The bill proposes requirements that all entities processing data will have to comply with. These include data-minimization requirements, notice-and-consent requirements, privacy by design, organizational and management requirements, transparency requirements, security safeguards, localization requirements, and the creation of grievance-redress systems. Significant data fiduciaries would have to implement these and other compliances: data protection impact assessments, appointments of data protection officers, record-keeping requirements, and data audits.

While the bill proposes exemptions for small entities and for certain purposes, such as journalism and research—as well as heightened requirements for significant data fiduciaries—most obligations will apply to all businesses, irrespective of the types of risks and the probability of harm involved.

In addition, the bill proposes user rights modeled on the GDPR. Users will have the right to port their data for a fee, to seek information on how their data has been used, and to correct it.97 Users will also have the right to ask firms to delete their personal data (that is, the right to be forgotten).98 Finally, the bill proposes a data localization regime, with tiered restrictions depending on whether the data are merely personal, sensitive personal, or critical personal.99
