Jeeyun Sophia Baik
Annenberg School for Communication and Journalism, University of Southern California, Los Angeles, CA, 90089, USA.
© 2020. This manuscript version is made available under the CC-BY-NC-ND 4.0
license http://creativecommons.org/licenses/by-nc-nd/4.0/
Abstract
Conducting a case study on the California Consumer Privacy Act (CCPA), this paper analyzes
divergent frames of privacy argued by different stakeholders. While the United States has allowed
corporate self-regulation of consumer privacy, California became the first state to introduce its
own privacy law in June 2018. In early 2019, California held public forums on CCPA, which then
became a battleground for various stakeholders to discuss data privacy regulations. Examining 105
public comments made by 99 speakers in 7 CCPA public forums, this study identified that
corporate representatives and consumer advocates differed in seven major areas: (1) a purpose of
CCPA, (2) definitions of personal information and consumer, (3) operationalization of opt-out, (4)
non-discrimination rules, (5) economic ramifications, (6) consumer literacy, and (7) a comparison
with other privacy frameworks. The findings suggest that corporate speakers follow the frame of
privacy as a commodity, while consumer speakers seek the frame of privacy as a right.
Keywords: Data privacy; consumer rights; California Consumer Privacy Act; CCPA;
communication policy
1. Introduction
Data privacy has become one of the most pressing communication policy issues these days
(Bennet, 2019; Yeh, 2018). The European Union (EU)
replaced its 1995 Data Protection Directive with the General Data Protection Regulation (GDPR)
in May 2018, requiring companies to have explicit consent from EU residents for processing and
movement of their personal data. In the United States, California became the first state that
introduced its own privacy law, the California Consumer Privacy Act (CCPA), in June 2018. The
legal steps worldwide reflect a public outcry over increasing data breaches and misuses in the
digital era. One of the high-profile cases that provoked great public attention to the issues of data
privacy is the Cambridge Analytica scandal disclosed in March 2018: personal data of millions
of Facebook users were collected by a UK firm without users’ consent to target voters in the
2016 US presidential election. Data privacy means one's ability to control the whereabouts of their
personal information and its flow (Reidenberg, 1999). Yet, digital platforms have made important decisions on the collection
and flow of personal data, and the public is increasingly going through “digital resignation,”
feeling unable to control their data and just “accepting” the loss of control (Draper and Turow,
2019).
In this context, the recent rulemaking of CCPA has been a critical juncture where diverse
stakeholders engage in legally defining what data privacy should entail in California. The reason
this study focuses on the California case is because the CCPA is the first state privacy law
introduced in the United States with the earliest enforcement date of January 1, 2020. Also,
California is a symbolic state that houses many technology companies in Silicon Valley,
including headquarters of giant corporations such as Facebook, Google and Twitter. Therefore,
CCPA has been considered either a benchmark for a federal law or a trigger for a patchwork of
state laws (Howley, 2018). As such, this study suggests the importance of looking at California
as a main case of the US privacy rulemaking while acknowledging that California has its distinct
local politics. It is well documented that when there is a higher standard for privacy protection
policies, there emerges a “trading up” or a “race to the top” tendency to minimize compliance
issues (Bennett and Raab, 2003). That is, corporations whose operations are not limited to the
state of California would likely apply a privacy policy standard compliant with CCPA to other
states as well. Moreover, key rhetorical strategies used by stakeholders such as corporations and
privacy advocates will not vary too much from state to state, as the core issues around privacy
exist in and beyond California. Even though the current study is not aimed at generalizing every
finding to the other states or the United States as a whole per se, investigating CCPA public
forum comments is expected to shed light on the dynamics that contribute to shaping the US data
privacy rulemaking more broadly.
This study analyzes how data privacy is made sense of by different stakeholders in
California, examining comments presented at seven CCPA public forums1 held by the
California’s Attorney General (AG) Office. Few studies closely looked at how divergent the
understanding of various stakeholders is regarding this timely change and what it means for
privacy regulation in detail. Since the emerging regulatory effort on privacy is differently
understood by various stakeholders, it is crucial to understand the diverse perspectives of key
stakeholders in order to develop and implement
effective policies. This study closely investigates the following main research questions: (1)
which stakeholders are actively engaged, and (2) what arguments are provided in the privacy
rulemaking of CCPA.
1 You can visit https://www.oag.ca.gov/privacy/ccpa/prelim and access the documents online.
2. Literature review
Everyday lives and activities are increasingly transformed into retrievable digital traces (Hintz et al., 2018). Zuboff (2015) suggests that the intensified
data-oriented logic brought us “surveillance capitalism” where “populations are targets of data
extraction” for revenue production and market control (p. 86). The issue at stake is that one’s
right to privacy has been increasingly treated as a commodity (Sevignani, 2013). Papacharissi
(2010) explains that our personal information is transformed “into currency” and our privacy is
transformed “into a commodity,” resulting in a tradeoff between “our right to privacy” and
“access to social services.” Smith, Dinev, and Xu (2011) aptly describe how “privacy as human-
right” and “privacy as commodity” differ. The human-right approach considers privacy to be
“integral to society’s moral value system,” but the commodity approach considers privacy
applied to “consumer behavior” assigning an “economic value” to privacy (Smith et al., 2011,
p.993). Fornaciari's (2018) study on the US media coverage of privacy from the 1900s to 2017 also
shows that the news framing of privacy has shifted from “privacy as dignity” to “privacy as
commodity” over the past decades. Under this commodity frame, the creator of data - a user - is
not necessarily equal to the monetary beneficiary of the data - a digital platform.
The commodity frame is tied to the way the United States has imposed responsibility on
individuals for privacy management and protection while permitting corporate self-regulation.
Within the commodity frame, individuals are expected to rationally opt out of undesirable
platforms and implement privacy-protecting tools when necessary. However, it is not always
easy for individuals to withdraw from these platforms when the digital service has become a
public infrastructure everyone widely needs and uses (Madden et al., 2017). Corporations have
implemented self-regulatory measures such as privacy policies, informed consent, and terms of
service (Drezner, 2004; Fernback and Papacharissi, 2007). Yet, the privacy policies are generally
misleading with lengthy jargon, leaving individuals without practical choices (Fernback and
Papacharissi, 2007; Obar and Oeldorf-Hirsch, 2018). Informed consent is also mostly “one-
sided, non-negotiated, and non-negotiable” (Sadowski, 2019, p. 7). Therefore, the growing
public demand for data privacy implies the need to address a power asymmetry between
individuals and the corporations that collect their data.
In the United States, there is no comprehensive privacy law that shields consumers from
the “collection, processing, and sale of their personal data by the private sector” due to its
longstanding “sectoral approach” (Yeh, 2018, p. 286). The US Congress has adopted privacy
laws in a topic-by-topic fashion based on political needs of the moment, tackling complaints on
financial/credit information with the Gramm-Leach-Bliley Act (GLBA) and the Fair Credit
Reporting Act (FCRA) and on medical information with the Health Insurance Portability and
Accountability Act (HIPAA), for example. It never developed anything comprehensive that
reaches across all types of data, and the US legal framework remains far from the "rights-
based” system in Europe (Edenberg and Jones, 2019; Newman, 2008). However, as GDPR
applies to any organizations holding personal data on EU residents, the US companies whose
consumer base is transnational started to respond to the compliance issues (Bennett and Raab,
2003, 2018). CCPA was soon introduced in the United States and it has been called the US
equivalent of GDPR. The US corporations then promptly put resources into lobbying against the
California law, as the industry was concerned that CCPA would become a de facto national
standard.
This study focuses on public forums of CCPA, as the consultation phase is generally
perceived as “an opportunity for stakeholders” to influence policy rules “in a relatively open
fashion” (Minkkinen, 2019, p. 989). Scholars have examined consultation processes for net
neutrality around the world (Shepherd, 2019), competition regulation in Canada (Rajabiun and
Middleton, 2015) and data protection in the EU (Minkkinen, 2019). In the case of CCPA, its
public forums held in early 2019 were a part of California’s preliminary rulemaking process,
after the bill was introduced the previous year. This is because the bill was passed in a "lightning-
fast” fashion when local lawmakers, selected privacy advocates and business leaders were
pressured to avoid a pricey ballot action that had been put forward by a real estate mogul
(Romm, 2019). The preliminary public comments were considered for the formal rulemaking
activities that occurred in early December 2019 (see Figure 1). As CCPA went into effect on
January 1, 2020, the public comments made in early 2019 carried greater expectations of
actually being reflected in the rulemaking, compared to the public comments
made in December 2019, which was less than a month from the enforcement date.
California has been historically the leading state for one’s right to privacy. The US
Constitution does not contain the word "privacy," although the Fourth Amendment on search and
seizure addresses unwarranted government surveillance. On the contrary, the California State
Constitution has included an explicit right to privacy since 1972, which protects individuals
against privacy violations by both public and private entities
(Betzel, 2005). In 2003, California also implemented the first state data breach notification
system in the country (Lockwood, 2019). In this context, CCPA is an attempt to put forward this
rights-based approach of California, acknowledging that California law has "not kept pace with"
developments in the collection and use of personal information.
There is a local politics specific to California that enabled the passage of CCPA, while
various American states have vastly different political stances and not all of them would have the
political will to pass similar statutes, with exceptions such as New York and Washington. The
early version of CCPA was first introduced in February 2017, but it was moved to "the inactive
file" in September 2017 (King et al., 2018), stalled by giant tech companies that wielded strong
institutional power on the local rulemaking process. However, a ballot initiative introduced in
October 2017 by a real estate developer, Alastair Mactaggart, prompted the passage of CCPA in
June 2018. It’s because regulators and businesses chose to make a compromise and passed
CCPA, concerned with the ballot proposition’s more aggressive approach and associated costs
(Confessore, 2018).
The 2018 version of CCPA defined personal information as “information that identifies,
relates to, describes, is [reasonably]2 capable of being associated with, or could reasonably be
linked, directly or indirectly, with a particular consumer or household” (Goldman, 2019, p. 3),
which remained almost identical when it was enforced in January 2020. CCPA delineates the
rights of Californians (1) to know the types of data collected and whether and to whom the data
is sold, (2) to request deletion of data, (3) to decline the sale of data, (4) to access their data, and
(5) to not be discriminated against in service or price upon exercising their privacy rights (Rothstein
and Tovino, 2019). Though it has some limitations such as no private right of action guaranteed
except for data breaches, CCPA has been acknowledged as a step to elevate data privacy as a
right in the United States.
2 The word "reasonably" was added in the version enforced on January 1, 2020.
3. Method
This study analyzed comments made at public forums of the California Consumer
Privacy Act (CCPA) held between January and March 2019 (see Table 1). The study considers
public hearings as sites of citizen participation and political performances (Checkoway, 1981;
Cole and Caputo, 1984; Lama-Rewal, 2018). The study collected publicly available transcripts of
the seven public forums uploaded to the AG office website. At the beginning of each meeting, it
was announced that the whole meeting would be transcribed and become publicly available. The
speakers were not required to identify who they are, but most of them introduced themselves.
There were 105 public comments in total made by 99 individuals. The author attended six of the
seven public forums (except the San Francisco meeting), and the six fieldnotes accompanied this
analysis. The Institutional Review Board of the university the author is affiliated with approved
the study.
The speakers comprised attorneys (from corporate entities or consumer privacy organizations),
company representatives, consumer activists,3 individuals, industry associations, nonprofit
organizations, government agencies, and academics (see Table 2).
3 A consumer activist is either a member of a consumer advocacy organization or a person who self-identified as a consumer activist.
The study conducted a qualitative thematic analysis (Braun and Clarke, 2006) on the
public comments, guided by the grounded theory approach (Glaser and Strauss, 1967). The
author did two rounds of coding using the NVivo software. In the first round of coding, the
author closely read transcripts to familiarize herself with the data, created codes as they emerged,
and categorized the highlighted quotes into the codes. In total, there emerged 30 codes after the
first round. In the second round of coding, the codes were grouped into themes, depending on
similarities and/or differences among comments made by various speaker types, according to the
research question. After the whole coding process, the themes were reviewed and clearly
defined. This resulted in seven major themes which reflect divergent perspectives held by
different stakeholder groups.
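For readers who want to see the shape of this two-round procedure in concrete terms, the following minimal Python sketch illustrates how first-round codes can be grouped into broader themes and tallied by speaker type. The codes, labels, and excerpts here are hypothetical placeholders; the actual analysis was conducted manually in NVivo on the 105 public comments and is not reproduced by this sketch.

from collections import defaultdict

# Hypothetical coded excerpts: (speaker type, first-round code) pairs standing
# in for the NVivo output described above. None of these are real data points.
coded_excerpts = [
    ("industry association", "economic_cost"),
    ("corporate attorney", "broad_definition"),
    ("consumer activist", "data_breach"),
    ("consumer activist", "privacy_tax"),
    ("individual", "need_for_education"),
]

# Second round: first-round codes are grouped into broader themes,
# mirroring the move from 30 emergent codes to 7 major themes.
code_to_theme = {
    "economic_cost": "economic ramifications",
    "broad_definition": "definitions of personal information and consumer",
    "data_breach": "purpose of CCPA",
    "privacy_tax": "non-discrimination rules",
    "need_for_education": "consumer literacy",
}

# Tally how often each speaker type invokes each theme, which is the kind of
# corporate-side vs. consumer-side comparison the findings below draw on.
theme_counts = defaultdict(lambda: defaultdict(int))
for speaker_type, code in coded_excerpts:
    theme_counts[speaker_type][code_to_theme[code]] += 1

for speaker_type, themes in sorted(theme_counts.items()):
    print(speaker_type, dict(themes))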
4. Findings
The first theme concerns what CCPA is intended for. One major response from companies,
industry associations and corporate attorneys was that
corporations agree with the basic purpose of CCPA, emphasizing their care for consumer
privacy. A corporate attorney said, “the companies…care very much about respecting the
privacy rights of consumers” (Attorney A, FR). Others commented, “[we] strongly support
CCPA’s goals of providing Californians with better transparency and control over data”
(Industry Association B, FR), and “we support California’s commitment to protecting the
privacy” (Industry Association D, SAC). The corporate speakers acknowledged the importance
However, the corporate speakers raised concerns over unintended consequences that can
rather harm consumer privacy. One claimed, “some of the aspects of the law, while well-
intentioned, will have unintended consequences for consumers, businesses and advertisers that
will inadvertently undermine rather than enhance consumer privacy” (Industry Association E,
SD). Verification of consumer requests to access or delete the data was particularly of concern.
A speaker said, the “difficulty in determining which requests are legitimate and which are
fraudulent puts consumers and their data at risk from unauthorized requests" (Industry
association). The verification concern was also tied to the uneven
resources among different businesses. An attorney pointed out that companies in the financial
services industry may already have authentication processes, but other companies may only hold
limited or pseudonymous data. These speakers requested the law to distinguish different types of
companies so that the companies with fewer resources or working only with pseudonymous data
will not need to collect more data for the sake of verification. Corporate speakers also pointed to
the definition of "household," which remains unclear. The speakers worried, "a household could be a family or could
be strangers sharing an apartment. Without clarity...the law could lead to information being
shared to the wrong individual, for example, by scorned partners or roommates” (Attorney A,
FR). They considered such possibilities “very concerning” and “very dangerous” (Industry
association I, SAC). The unclear definition of "household" was brought up as an area where
further clarification is needed.
Speakers on the other side, including consumer activists, individuals and nonprofit
organizations, however, provided quite different arguments. They often introduced recent data
breaches, emphasizing the purpose of CCPA. A consumer activist said, “24 million financial
documents...from the nation’s largest banks has been disclosed…Facebook recently revealed
another major breach of public trust” (Activist J, SAC). An individual shared he has been
recently “notified by eight or ten different large institutions” where the protection of his personal
and financial data “has been compromised” (Individual K, LA). Consumer comments also
showed a public distrust of corporations. One speaker said, “mobile carriers were so concerned
about privacy and consumer trusts that they sold our location data to third parties" (Activist L,
SAC). Another individual expressed her concern that there will be no way to make an expensive
change in the business if there is no threat of investigation (Individual NN, ST). The public
comments show that corporate representatives focused on unintended consequences that may
emerge in future enforcement of CCPA, while consumer speakers rather highlighted past and
ongoing harms to consumers.
Regarding the definitions of personal information and consumer, corporate
speakers asserted that current definitions are too broad or unclear while consumer-side speakers
maintained they are clear and should be broad. One corporate speaker talked about the “CCPA’s
incredibly broad definition of personal information, which includes all IP addresses and so much
more” (Industry association P, SAC), and a corporate attorney asked whether it is “just
information that has been provided by the consumer” or it needs “to cover all data that is held by
the organization” (Attorney Q, LA). The comments suggested that broad definitions can cause
economic costs. A corporate speaker mentioned, “it will be very expensive...to comply with the
consumer requests because of the broad definition of personal information" (Industry association).
Some corporate speakers specifically questioned treating IP addresses as personal
information. It was said, "IP addresses change over time… IP address is not actually personal
information to the extent the way the Internet works today” (Company S, SF).
What “consumer” means was also a hot topic. A company representative said, “it
representative claimed that “the operational costs of including employees and others who do not
have a true consumer relationship with the business, would be staggering” (Industry association
P, SAC). They generally alluded to the economic costs the broad definitions would impose upon
businesses.
On the other hand, consumer activists viewed that the definitions in CCPA are clear and
should be broad. One said, “the law is actually very clear about the types of information that are
considered personal information” (Activist U, LA). An activist in Sacramento also shared the
view and said, “We have heard a lot of people asking for limiting the categories of personal
information and identifiers beyond what was intended. I think it’s quite clear” (Activist V, SAC).
Another activist further touched upon IP address, saying “there is zero justification for excluding
IP address since it can easily be linked to a specific person or household” (Activist J, SAC). All
in all, corporations and consumers expressed very different interpretations of the definitions of
key terms. While companies perceived that broad and unclear definitions would increase their
economic costs, consumers considered that the broad definitions, as clearly written in CCPA, can
better protect their privacy.
Public comments paid great attention to how to operationalize opt-out of data sale.
Corporate speakers asked for a change of CCPA’s all-or-nothing opt-out. A corporate attorney
said that CCPA “doesn’t explicitly permit a business to allow a consumer a choice of what they
are opting out of” (Attorney A, FR). She framed her suggestion as a provision of “more choices”
to consumers by allowing the "option to opt-out of certain sales" (Attorney A, FR). Another
corporate attorney elaborated that "such flexibility would provide consumers greater control" over their data.
However, consumer activists considered that a real way to provide consumers with “more
choices” is a universal opt-out choice. A consumer activist at the Sacramento meeting suggested
considering:
mechanisms to make sure that choices are scalable and persistent. It’s not really practical
for consumers to opt out every single website they go to… industry opt-outs today are…
The consumers perceived the available opt-out choices as impractical. In effect, it is not just a
matter of incessant labor to click an opt-out button but a problem inherent in a business model
that monetizes on personal information. The prevalent business model is working as the
“apparatus of total surveillance” in ordinary people’s everyday lives, leading the individuals to
(Brunton and Nissenbaum, 2019). To maintain their business model, platforms have often used
deceptive privacy policies and settings. Facebook, for example, was fined heavily by the FTC in
2019, and one of the charges was that it misrepresented users’ ability to control the use of facial
recognition technology with their accounts by turning on the setting by default (“FTC Imposes
$5 Billion Penalty,” 2019). Considering these issues, it was not surprising that several consumer
commenters requested the default choice to be opt-in, not opt-out of data sale. Opt-in by default
is considered much stricter as it means every consumer has to actively opt in before a corporation
can collect and process the consumer's data (Smith et al., 2011). A nonprofit speaker further
urged the law to take an opt-in approach for children’s data in particular (Nonprofit W, ST).
There were some middle grounds suggested though. An industry association speaker
brought up a tiered-approach, suggesting opt-out for less sensitive data and opt-in for more
sensitive data (Industry Association B, FR). However, the tiered-approach was not always
welcomed, and a huge gap was evident in the public comments between corporations and
consumers in terms of how to operationalize opt-out and if the opt-out is the best choice for
consumers.
What will constitute discrimination against consumers was addressed by public comments too.
CCPA explicitly states that consumers should not be discriminated against upon exercising their privacy
rights: denying goods or services, charging different prices, or providing a different quality of
service are considered discrimination (AB-375, 2018). Yet there is an exception “if that price or
difference is directly related to the value provided to the consumer by the consumer’s data” (AB-
375, 2018). Corporate speakers wondered whether their current loyalty programs would be
considered discriminatory under the law. One speaker argued that consumers value
and expect loyalty-based discount programs and that the loyalty programs "allow businesses to
maintain and foster positive relationships with consumers” (Industry Association E, SAC&SD).
Another speaker explained that "80% of all Americans belong to some sort of loyalty program."
Corporate speakers also asked the law to allow companies to charge fees on consumers.
One said, “it is important that CCPA’s nondiscrimination provisions do not prevent publishers
model” (Industry association F, SAC). Another speaker reminded the audience that “CCPA
states that the business may offer a different price, rate, level or quality of service to an opted-out
consumer” (Industry association Y, SF). CCPA permits corporations to provide different prices
to consumers who opt out when the value is reasonably calculated, yet how to do the calculation
remains unclear. In response, consumer activists demanded that corporations regularly disclose
their revenues from consumer data sale and sharing. An activist said, "the website must be
explicit as to how [the incentive] is calculated. Companies must prove the charge is correlated to
the value of the consumer’s data” (Activist J, SAC). Another suggested, “the only way...is to
require companies perhaps quarterly, but certainly at least once a year, to submit to the Attorney
General’s Office the revenue they receive from the sale of consumers’ data” (Activist U, LA).
An attorney on behalf of Californians for Consumer Privacy also asserted that "financial incentives
and discounts offered by businesses should be tied to the average value" of consumers' data to the business
(Attorney Z, SAC). Consumers countered the corporate voices that hoped for loose applications
of nondiscrimination rules with a stricter rule that would make the corporations regularly submit
their data revenues. Consumer speakers further stressed that privacy should remain affordable and
available to every consumer. An activist warned of the rise of a "privacy tax" and elucidated what the
concern entails:
It seems to provide the opportunity for businesses to create a privacy tax, especially on
California…Online services are all but essential in the 21st century...[We] must ensure
that people are not…priced out of access to online services without being forced to…
The commenter clearly pointed out the reality that online services function as a public infrastructure
and shared the concern over the tradeoff between privacy and access to those services. One citizen talked
about his own experience of having surrendered his personal information as he can’t pay for
privacy (Individual BB, ST); an activist urged that “any incentives that companies do choose to
provide consumers cannot set up a situation where mid income and low income consumers are
forced to sell their data... in order to use a website or service” (Activist U, LA). Another
consumer activist illustrated the high cost of living in California and hypothesized:
If each and every one of those companies is going to charge me an amount to opt out of
the sale of my data, that’s going to add up over the course of a year, and we get into a
situation where we potentially run a risk of kind of two tier privacy law: One that works
for the rich…and one that for poorer people, often frankly people who like me. (Activist
CC, SF)
All these comments at CCPA public forums reflected the public concern over what Papacharissi
(2010) called “privacy as a luxury commodity” only the wealthy can afford.
Corporate speakers were particularly concerned about the economic ramifications
of CCPA. First, they pointed out some California-specific situations where many start-ups are
located. An attorney said, “it is frequently the case that start-ups grow at large, sometimes 10 or
20 times within one or two years…” and “get to a point where they exceed the $25 million gross
revenue threshold without realizing that they have exceeded it” (Attorney DD, FR). He
continued, “because their growth is so quick, they may not be prepared or in compliance with
CCPA until a significant amount of time after they reach that threshold” (Attorney DD, FR).
The commenters further expressed their concerns over negative impacts of CCPA
potentially imposed upon small businesses in California while big corporations are likely to
survive the compliance cost. A company representative mentioned, “compared with larger
companies, smaller businesses face significant expenses in complying with consumer requests”
(Company EE, LA). Another point was made that enforcing the law on big corporations can
better help generate the change CCPA is intended for. A company representative explained, "the
stakes are much larger and the larger players at the Apple scale have the ability to enforce those…"
Moreover, corporate speakers underscored the importance of consumer data for the
digital economy. One commenter stated, “consumer data is integral to the value exchange that
exists behind the free ad-supported online ecosystem” and explicated, “small- and medium-size
businesses and self-employed individuals rely upon consumer data to improve products and
services” (Company EE, LA). Another commenter shared the same idea, saying “businesses
have a vested interest in collecting inferences on consumers to improve and inform their own
services...” (Company FF, SF). The current business model’s contribution to economic growth
was further emphasized. A speaker claimed, “for decades on-line, data-driven advertising has
powered the growth of the Internet by funding innovative tools and services for consumers”
(Industry Association GG, SAC). Another commenter explained that the online ad-based
industry “supports over 478,000 full-time jobs across the state” (Industry association F, SAC).
The corporate representatives tried to highlight that the current business practice relying on the
free flow of consumer data is economically good for both consumers and the society as a whole.
The value of innovation was frequently mentioned by the corporate camp as something that
would be in jeopardy under CCPA as well. A corporate attorney said, "we fear that innovation is
going to be stifled” (Attorney HH, SF). However, what innovation really means was not clarified
by these corporate speakers. In business, innovation generally means both “product innovation”
and “process innovation,” but most fundamentally innovation is perceived as “change” (Neely
and Hii, 1998). While corporate speakers claimed the importance of consumer data in improving
(and possibly innovating) consumer products and services, there was little discussion of how
corporations will be able to innovate the “process” of the internet industry, moving away from
unwarranted collection and use of personal data. If the word “innovation” was used only to
maintain the status quo in favor of business interests, the very rhetorical strategy used by
corporations may be rather in opposition to the necessary “change” a true innovation should
entail.
There were public comments that stressed the importance of consumer literacy. A few
consumer attorneys urged finding comprehensive ways to help consumers fully understand
the law. One said, “it’s incredibly important that consumers have an easy and clear way to opt
out of the sale of their personal information” (Attorney Z, SAC). Another attorney stated, “[we]
have to make sure that any notifications to consumers are knowing and conspicuous...so that the
consumers fully know...that when they see something what their rights are and how to act…”
Individuals at the public forums did express the need for education. One senior citizen said:
After listening to the comments so far, which I understood may be half, I am here largely…
it all the way through an opt-out procedure...So we need help. (Individual LL, SAC)
Another senior citizen similarly urged the AG office to “ensure us meaningful choices, simple
and transparent, to opt out of the sale to third parties of our information” (Individual K, LA).
Consumer activists provided more detailed accounts to improve the law’s practicality. An
activist suggested CCPA require corporations to “disclose what data is collected and why and
with whom the data is shared on its website in a publicly-accessible way,” as consumers are less
likely to actively request such information (Activist L, SAC). He hoped the law would reshape the
online environment in a way that consumers "don't actually need to request the information"
because it would already be disclosed.
A few company employees in fact acknowledged the necessary change of the ecosystem
as a whole. A consumer-privacy-app developer said that “a big pop-up that says we accept this,
isn’t really enough...to a world where everybody continues to do all the same things...consumers
would like to see [the problem] solved that has developed into an ecosystem...” (Company M,
LA). Another speaker further mentioned the issue of individual autonomy under CCPA. He said
providing consumers with information about how their data are used by business can “help offset
the impact of influence campaigns, can empower consumers to make their own decisions”
(Company FF, SF). Enhanced public literacy of CCPA was deemed critical for a meaningful
implementation of the law.
A few consumer-side comments further addressed that CCPA is lacking “public” voices.
An activist claimed that the public forum is missing the user-side of privacy when it’s really
important for CCPA to work for users (Activist MM, ST). One citizen at the Los Angeles forum
explicitly urged “to keep the interest of the public rather than Silicon Valley companies and
oligarchs in mind” when crafting the rules (Individual K, LA). The commenters problematized
that CCPA rulemaking is dominated by corporate interests while most consumers lack literacy
about the law and its rulemaking process.
Public comments made references to other existing privacy frameworks. CCPA's success
was often discussed in relation to GDPR, and corporate speakers
frequently wondered whether complying with GDPR would be enough to comply with CCPA.
There are some key differences between GDPR and CCPA. To name a few, GDPR takes opt-in
by default while CCPA takes opt-out by default; GDPR includes publicly available information
as personal information where CCPA doesn’t; GDPR forbids the processing of “sensitive” data
and CCPA does not mention such a limitation; GDPR requires certain businesses to hire data
protection officers when CCPA doesn’t mandate it; CCPA has an explicit clause on non-
discrimination but GDPR doesn’t (Kessler, 2019, p. 114). A corporate attorney said, “companies
have spent millions trying to comply with this law [GDPR] already,” implying the economic cost
would go up if CCPA requires something different from GDPR (Attorney OO, RI). Other
attorneys suggested “safe harbor” for GDPR-compliant businesses (Attorney HH, SF) or same
lessons for CCPA. A company representative explained, “GDPR has created unexpected
during the verification process of consumer requests (Company S, SF). An industry association
also explained, “it was recently reported that over 70 percent of small businesses covered by that
law [GDPR] are not in compliance, and that was after many years of discussion and ample time
to ramp up” (Industry association P, SAC). This alluded to the problem CCPA would have if
rushed, considering GDPR faced challenges even after its longer period of preparation.
Meanwhile, consumer speakers were more vocal about the weakness of CCPA compared
to GDPR. A consumer activist said, “the right has already been successfully implemented in
Europe under GDPR” so “it [CCPA] is clearly possible” (Activist J, SAC). CCPA was deemed
having the capacity to regulate companies as GDPR already showed precedents. A consumer
attorney called CCPA “a GDPR light,” because “CCPA puts the onus on the consumer to come
forward and do something to alert the business that they want something done” (Attorney KK,
SD). He pointed out that CCPA is more favorable to corporations compared to GDPR.
In addition, public comments addressed a future privacy framework that is being debated
in the United States: a comprehensive federal privacy law. The comments fell under two
perspectives. Some argued that CCPA needs to be strong as it will guide a federal-level
framework. A company employee said, “CCPA can be an example for even the federal-level
regulatory system and we shouldn’t weaken the law as an example for others to follow”
(Company JJ, ST). The others suggested putting foremost efforts on federal-level regulation, a
view expressed by a company representative as well (Company SS, LA). Likewise, CCPA was
strategically positioned in relation to existing and/or forthcoming privacy frameworks.
5. Discussion
CCPA public comments showed that there largely exist two distinct perspectives of
corporate-side speakers and consumer-side speakers. The key areas of difference suggest that the
negotiation of data privacy regulations in California reflects the longer history of contestation
between the frame of privacy as a “commodity” and the frame of privacy as a “right”
(Fornaciari, 2018; Smith et al., 2011). The corporate camp generally took the commodity frame,
most concerned about economic ramifications of CCPA, and tended to consider data privacy
under the new law to be against “innovation” integral to an economic success of the state. On the
contrary, the consumer camp mainly took the rights frame, urging data privacy to be practically
available and to not discriminate individuals. That is, corporations showed their interests in
maintaining the status quo of the current internet business model heavily based on consumer data
monetization, while consumers expected CCPA to be a starting point of gaining control over
their personal information and making privacy an inalienable right that cannot be bargained away.
The commodity frame ends up treating populations as "targets of data extraction" for revenue
production and market control (Zuboff, 2015, p. 86), when the rights frame considers people as
rights-bearing subjects. The corporate claim that privacy regulation threatens innovation is also
problematic. Cohen (2013) argues that contrary to the corporate understanding of innovation
“as the absence of regulatory constraint” in privacy policy discourse, it is “modulation, not
privacy” which threatens innovative practices (pp.1919-1920). She explains that pervasive
surveillance and modulation “seek to mold individual preferences and behavior in ways that
reduce the serendipity and the freedom to tinker on which innovation thrives" (p. 1920).
However, corporate speakers mostly treated privacy and innovation as trade-offs of each other,
emphasizing the economic burdens possibly introduced by CCPA and not being able to move
beyond their long-held frame of privacy as a commodity. In doing so, corporations failed to
acknowledge the power asymmetry consumers experience, and their rhetoric
manifests that their frame eventually views people as "data targets," contradicting their
professed care for consumer privacy.
The consumer camp, in contrast, foregrounded the inequality that can disproportionately
affect one's privacy affordability. Consumer
advocates raised concerns over “privacy tax” as CCPA may allow corporations to offer different
prices or services to consumers who opt out of data sale. They warned that only the people who
can economically afford extra fees or neglect incentives could enjoy privacy rights. This pertains
to the issue of privacy being “a luxury commodity” along the commodification of privacy rights,
which normalizes a divide between the “privacy rich” and the “privacy poor” (Papacharissi,
2010; Arora, 2019). Historically privacy has been “more accessible to those wealthy enough to
build walls that shield them from public view” (Draper, 2019, p. 212), and this can continue in
the 21st century as “privacy-discount plans may force consumers to make difficult choices
between privacy and other necessities” like groceries or public transportation (Elvy, 2017,
p.1405).
Moreover, it is not only one’s socio-economic status but associated differences in literacy
that would have discriminatory effects. Consumer-side speakers urged CCPA to be friendlier to
consumers and as Marwick and boyd (2018) pointed out, “achieving privacy is especially
difficult for those who are marginalized in other areas of life” because “the ability to achieve
privacy often requires the privilege to make choices and create structures that make such
freedom possible” (pp. 1157-1158). A person with a low socio-economic status may be less
likely to seek one’s data privacy right due to lack of resources such as education, time or money.
Yet the outcome of not being able to actively pursue privacy rights would be most detrimental to
the marginalized people as they can be profiled into “financially vulnerable” market segments
and become unfairly targeted for “dubious financial products such as payday loans…or debt
relief services” (Madden et al., 2017, p. 77). ). As such, unless we think of privacy as a “right”
instead of a commodity, we can’t rightfully address various human conditions that impact one’s
access to privacy.
Such concerns over discrimination, however, will not be fully alleviated by a solution like
the data-revenue reporting suggested in consumer
comments at CCPA public forums. This is because "any effort to assign a dollar value to our
millions of data points scattered across the internet” may be “inherently flawed” (Warzel, 2019).
More importantly, this data-revenue-reporting approach can help sustain the frame of privacy as
a commodity by considering personal data as something that can be calculated for monetary
value. Thus, delineating “data revenue” in economic terms would not fit well with the consumer
camp’s pivotal focus on framing privacy as one’s fundamental “right” in the digital era.
6. Conclusion
This paper sought to examine which stakeholders are actively engaged in the US data
privacy rulemaking and what arguments are provided in the case of the California Consumer
Privacy Act (CCPA). It was clear in the analysis of all the public comments that corporate actors
tend to frame data privacy against innovation, while consumer advocates are more likely to
frame data privacy against discrimination, following the trajectory of contestation between
privacy as a commodity and privacy as a right. The gap identified in the study shows how
different the understanding of privacy and its regulations is depending on the two frames each
stakeholder holds in the contemporary data-driven economy as its logic governs more and more
aspects of everyday life.
The key here is which frame the authority will eventually uphold in regulating data
practices in the private sector. CCPA, as enforced after reflecting the public comments, seems to
sustain the frame of privacy as a commodity. For example, even though CCPA as of January
2020 provides more detailed explanations of its non-discrimination clause compared to its 2018
version, it allows a business to offer “a price or service difference if it is reasonably related to the
value of the consumer’s data” (CCPA, 2020). CCPA says the business shall “use and document a
reasonable and good faith method for calculating the value of consumer’s data,” but the very
statement itself is a reflection of a commodity frame, treating personal data as a commodity that
can have monetary value. CCPA is going through modifications until July 1, 2020 as its
enforcement can be delayed by up to six months, technically allowing room for further
changes. However, it is very unlikely that the specific clause would be starkly shifted, as the
modification released in March 2020 retains the same phrase. Therefore, California's
interpretation of privacy right is being limited to “managing corporate access to personal data”
and “framing digital privacy as an economic issue” (Draper, 2019, p.191), instead of providing
(2019), public consultation is not an “ideal forum for stakeholder dialogue” as the commenters
talk toward the state authority, and it “only constitute the public part of the lobbying and policy-
making process,” often dominated by the industry players (p. 1001). This was also the case for
CCPA public forums and there were fewer speakers from the consumer-side. It may be because
the general public lacks an awareness of the specific regulation per se or because public forums
were held on weekdays, likely discouraging participation of people whose jobs do not concern
privacy issues. The findings of this study thus may fall short of identifying the full range of
arguments held by ordinary consumers.
This study still offers valuable insights into the divergence manifested between
corporations and consumers in the recent privacy rulemaking process in California. The
California law will be fully enforced starting in July 2020, and CCPA would have real impacts
ahead of any future federal regulations in the United States. Most importantly, the key finding of
this study on differing frames maintained by corporate and consumer stakeholders hints at
divergent perspectives underpinning the current digital economy that will continue to challenge
ongoing privacy enforcement and emerging regulatory measures for privacy in the United States.
References
Arora, P., 2019. Decolonizing Privacy Studies. Telev. New Media 20, 366–378.
Bennet, J., 2019. Opinion | Do You Know What You’ve Given Up? N. Y. Times.
Bennett, C.J., Raab, C.D., 2018. Revisiting the governance of privacy: Contemporary policy
instruments in global perspective.
Bennett, C.J., Raab, C.D., 2003. The Governance of Privacy: Policy Instruments in Global
Perspective. Routledge.
Braun, V., Clarke, V., 2006. Using thematic analysis in psychology. Qual. Res. Psychol. 3, 77–
101.
Checkoway, B., 1981. The Politics of Public Hearings. J. Appl. Behav. Sci. 17, 566–582.
Cohen, J.E., 2013. What Privacy Is For. Harv. Law Rev. 126, 1904–1933.
Cole, R., Caputo, D.A., 1984. The Public Hearing as an Effective Citizen Participation
Mechanism: A Case Study of the General Revenue Sharing Program. Am. Polit. Sci. Rev.
78, 404–416.
Confessore, N., 2018. The Unlikely Activists Who Took On Silicon Valley — and Won. N. Y. Times.
Couldry, N., Yu, J., 2018. Deconstructing datafication’s brave new world. New Media Soc. 20,
4473–4491.
Draper, N.A., 2019. The Identity Trade: Selling Privacy and Reputation Online. NYU Press.
Draper, N.A., Turow, J., 2019. The corporate cultivation of digital resignation. New Media Soc.
Drezner, D.W., 2004. The Global Governance of the Internet: Bringing the State Back In. Polit. Sci. Q.
Edenberg, E., Jones, M.L., 2019. Analyzing the legal roots and moral core of digital consent.
Elvy, S.-A., 2017. Paying for Privacy and the Personal Data Economy. Columbia Law Rev. 117,
1369.
Fernback, J., Papacharissi, Z., 2007. Online privacy as legal safeguard: the relationship among
consumer, online portal, and privacy policies. New Media Soc. 9, 715–734.
Forbes Technology Council, 2018. How Will California’s Consumer Privacy Law Impact The
Fornaciari, F., 2018. What is Privacy Anyway? A Longitudinal Study of Media Frames of Privacy.
Glaser, B.G., Strauss, A.L., 1967. The Discovery of Grounded Theory. Aldine, Chicago.
Hintz, A., Dencik, L., Wahl-Jorgensen, K., 2018. Digital citizenship in a datafied society. Polity
Press.
Kessler, J., 2019. Data Protection in the Wake of the GDPR: California’s Solution for Protecting
King, J.S., Vakili, A., Jacobson, J.B., 2018. Frequently Asked Questions About the California
Consumer Privacy Act of 2018 (CCPA) [WWW Document]. K&L Gates. URL
http://m.klgates.com/frequently-asked-questions-about-the-california-consumer-privacy-
Lama-Rewal, S.T., 2018. Public Hearings as Social Performance: Addressing the Courts,
Leprince-Ringuet, D., 2019. What is the CCPA? Everything you need to know about the
California Consumer Privacy Act right now [WWW Document]. ZDNet. URL
https://www.zdnet.com/article/california-consumer-privacy-act-everything-you-need-to-
Lockwood, B., 2019. California’s Consumer Privacy Act and the Future of Privacy in the U.S.
(accessed 4.16.20).
Madden, M., Gilman, M., Levy, K., Marwick, A., 2017. Privacy, Poverty, and Big Data: A
Matrix of Vulnerabilities for Poor Americans. Wash. Univ. Law Rev. 95, 53–125.
Marwick, A.E., Boyd, D., 2018. Understanding Privacy at the Margins. Int. J. Commun. 12, 9.
Middleton, C., 2018. GDPR USA: Why tech industry now lobbying against consumer privacy.
Minkkinen, M., 2019. Making the future by using the future: A study on influencing privacy
protection rules through anticipatory storylines. New Media Soc. 21, 984–1005.
Moscovici, S., 2015. La psychanalyse, son image et son public [Psychoanalysis, its image and its public].
Neely, A., Hii, J., 1998. Innovation and business performance: a literature review. The Judge
Institute of Management Studies, University of Cambridge.
Newman, A., 2008. Protectors of privacy: Regulating personal data in the global economy. Cornell University Press.
Obar, J.A., Oeldorf-Hirsch, A., 2018. The biggest lie on the Internet: ignoring the privacy
policies and terms of service policies of social networking services. Inf. Commun. Soc.
1–20.
Rajabiun, R., Middleton, C., 2015. Public Interest in the Regulation of Competition: Evidence
Reidenberg, J.R., 1999. Resolving conflicting international data privacy rules in cyberspace.
Romm, T., 2019. ‘There’s going to be a fight here to weaken it’: Inside the lobbying war over
California’s landmark privacy law. Wash. Post.
Rothstein, M.A., Tovino, S.A., 2019. California Takes the Lead on Data Privacy Law. Hastings Cent. Rep. 49, 4–5.
Sadowski, J., 2019. When data is capital: Datafication, accumulation, and extraction. Big Data
Soc. 6.
Smith, H.J., Dinev, T., Xu, H., 2011. Information Privacy Research: An Interdisciplinary Review.
MIS Q. 35, 989–1015.
Turow, J., 2012. The Daily You: How the New Advertising Industry Is Defining Your Identity
and Your Worth. Yale University Press.
Warzel, C., 2019. Opinion | Congress Wants Data Transparency, but It Still Doesn’t Understand
Westin, A.F., 1966. Science, privacy, and freedom: Issues and proposals for the 1970’s. Part I--
The current impact of surveillance on privacy. Columbia Law Rev. 66, 1003–1050.
Wood, D.M., Ball, K., 2013. Brandscapes of control? Surveillance, marketing and the co-
construction of subjectivity and space in neo-liberal capitalism. Mark. Theory 13, 47–67.
Yeh, C.-L., 2018. Pursuing consumer empowerment in the age of big data: A comprehensive
regulatory framework for data brokers. Telecommun. Policy, SI: Interconnecting 42,
282–292.
Zuboff, S., 2015. Big other: surveillance capitalism and the prospects of an information
civilization. J. Inf. Technol. 30, 75–89.