Today I filed the following comments on the CCPA with the California Attorney General.
March 8, 2019
VIA Email
California Department of Justice
ATTN: Privacy Regulations Coordinator
300 S. Spring St.
Los Angeles, CA 90013
Re: Comments on Assembly Bill 375, the California Consumer Privacy Act of 2018
Dear Attorney General Becerra,
I helped conceive of the high-level policy goals of the privacy initiative that was withdrawn from the ballot with the passage of AB 375. Here I provide comments to give context and explain those goals, in hopes that this helps your office as it contemplates regulations for the CCPA.
Strong policy support for the initiative
As you interpret the CCPA, please bear in mind that the initiative would have passed because Americans care about privacy. In multiple surveys, Americans have indicated support for stronger privacy law and dramatic enforcement. Americans have rarely been able to vote directly on privacy, but when they do, they overwhelmingly support greater protections. One example comes from a 2002 voter referendum in North Dakota where 73% of citizens voted in favor of establishing opt-in consent protections for the sale of financial records.[1]
A series of surveys performed at Berkeley found that Americans wanted strong penalties for privacy transgressions. When given options for possible privacy fines, 69% chose the largest option offered, “more than $2,500,” when “a company purchases or uses someone’s personal information illegally.” When probed for nonfinancial penalties, 38% wanted companies to fund efforts to help consumers protect their privacy, while 35% wanted executives to face prison terms for privacy violations.
Information is different
The CCPA is unusually stringent compared to other regulatory law because information is different from other kinds of services and products. When a seller makes an automobile or a refrigerator, the buyer can inspect it, test it, and so on. It is difficult for the seller to change a physical product. Information-intensive services, however, are changeable and abstract; because we have no physical experience with information, consumers cannot easily see their flaws and hazards the way one can see an imperfection in a car's hood.
Because information services can be changed, privacy laws tend to become stringent. Information companies have a long history of changing digital processes to trick consumers and to evade privacy laws in ways that physical product sellers simply could not.[2]
Some of the CCPA’s most derided provisions (e.g., its application to household-level data) respond to specific industry evasions made possible because information is different from physical products. Here are common examples:
- Sellers claim not to sell personal data to third parties, but then state that they “may share information that our clients provide with specially chosen marketing partners.”[3] For this reason, the initiative tightened definitions and required more absolute statements about data selling. Companies shouldn’t use words like “partner” or “service provider” to describe third-party marketers.
- Companies have evaded privacy rules by mislabeling data as “household-level information.” For instance, the Direct Marketing Association (DMA) long argued that phone numbers were not personal data because they were associated with a household.
- Many companies use misleading, subtle techniques to identify people. For instance, retailers asked consumers for their ZIP codes and combined them with names from credit card swipes to perform reverse lookups at data brokers.[4]
- Information companies use technologies such as hash-matching to identify people using “non-personal” data; a sketch of this technique follows this list.[5]
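To make the hash-matching technique concrete, here is a minimal sketch in Python (names and data are hypothetical) of how two parties can match records on hashed email addresses, so that data which looks “non-personal” nonetheless identifies a specific person:

```python
import hashlib

def normalize_and_hash(email: str) -> str:
    """Lowercase and trim an email address, then hash it with SHA-256."""
    return hashlib.sha256(email.strip().lower().encode("utf-8")).hexdigest()

# A broker's file keyed by "non-personal" hashed identifiers (hypothetical).
broker_segments = {
    normalize_and_hash("jane.doe@example.com"): "luxury-auto-intender",
}

# A retailer hashes its own customer list the same way and joins on the hash.
segment = broker_segments.get(normalize_and_hash("Jane.Doe@example.com"))
print(segment)  # -> "luxury-auto-intender": the hash re-identified the person
```

No cleartext email ever changes hands, yet the join works perfectly, which is why hashed identifiers function as personal data.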
Careful study of information-industry tricks informed the initiative and resulted in a definitional landscape that attempts to prevent guile. Those complaining about it need only look to the industry’s own actions to understand why these definitions are in place. For your office, this means that regulations must anticipate guile and opportunistic limitations of Californians’ rights.
The advantages of privacy markets
Creating markets for privacy services was a major goal of the initiative. The ability to delegate opt out rights, for instance, was designed so that Californians could pay a for-profit company (or donate to a non-profit such as the EFF) to obtain privacy services.
There are important implications of this: first, the market-establishing approach means that more affluent people will have more privacy. This sounds objectionable at first, but it is a pragmatic and ultimately democratizing pro-privacy strategy. A market for privacy cannot emerge without privacy regulation to set a floor for standards and to make choices enforceable. Once privacy services emerge, they will become inexpensive very quickly, because they are information services and can scale. For instance, credit monitoring and fraud alert services are only available because of rights given to consumers in the Fair Credit Reporting Act that can be easily invoked by third-party privacy services. These services have become very inexpensive and are used by tens of millions of Americans.
Some will argue that the CCPA will kill “free” business models and that this will be iniquitous. This reasoning underestimates the power of markets and presents “free” as the only viable model for news. The reality is much more complex. Digital-advertising-supported services do democratize news access, but they also degrade quality. One cost of the no-privacy, digital advertising model is fake news. Enabling privacy will improve quality, and this could have knock-on effects.
Second, the market strategy relieves pressure on your office: the AG does not have to solve all privacy problems. (That is an impossible standard to meet, and insisting on perfection has prevented us from having any privacy.)
Instead, the AG need only set ground rules that allow pro-privacy services to function effectively. A key ground rule that you should promote is a minimally burdensome verification procedure, so that pro-privacy services can scale and easily deliver opt out requests. For instance, in the telemarketing context, the FTC made enrolling in the Do-Not-Call Registry simple because it understood that complicating the process would result in lower enrollment.
There is almost no verification to enroll in the Do-Not-Call Registry, and this is a deliberate policy choice. One can enroll by simply calling from the phone number to be enrolled, or by visiting a website and completing a round-trip email confirmation. This means that online, a consumer can enroll any phone number, even one that is not theirs, so long as they provide an email address. The FTC does not verify that the email address belongs to the holder of the phone number.
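To illustrate how lightweight this is, here is a minimal Python sketch of a round-trip email confirmation flow (hypothetical; not the FTC's actual system). The only thing verified is control of the email address:

```python
import secrets

pending: dict[str, tuple[str, str]] = {}  # token -> (phone, email)
registry: set[str] = set()                # enrolled phone numbers

def request_enrollment(phone: str, email: str) -> str:
    """Issue a one-time token; a real system would email it as a link."""
    token = secrets.token_urlsafe(16)
    pending[token] = (phone, email)
    return token

def confirm_enrollment(token: str) -> bool:
    """Following the emailed link enrolls the number. Note that nothing
    here ties the email address to the phone number being enrolled."""
    entry = pending.pop(token, None)
    if entry is None:
        return False
    phone, _email = entry
    registry.add(phone)
    return True
```

The design choice is deliberate: the round trip deters bulk abuse at almost zero cost to honest registrants.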
The low level of verification in the Do-Not-Call Registry is a reflection of two important policy issues: first, excessive verification imposes transaction costs on consumers, and these costs are substantial. Second, the harm of false registrations is so minimal that it is outweighed by the interest in lowering consumer transaction costs. Most people are honest and there is no evidence of systematic false registrations in the Do-Not-Call Registry. More than 200 million numbers are now enrolled.
The AG should look to the FTC’s approach and choose a minimally invasive verification procedure for opt out requests that assumes 1) that most Californians are honest people who will not submit opt out requests without authority, and 2) that verification stringency imposes a real, quantifiable cost on consumers. That cost is likely to outweigh sellers’ interest in preventing false registrations. In fact, excessive verification could kill the market for privacy services and deny consumers the benefit of the right to opt out. A reasonable opt out method would be one where a privacy service delivers a list of identifiable consumers to a business, for instance through an automated system or simply a spreadsheet of names and email addresses.
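As a hypothetical sketch of the spreadsheet approach, a privacy service might deliver a batch of opt outs as a simple machine-readable file (names and addresses invented):

```python
import csv

# A hypothetical batch of consumers who authorized the service to opt them out.
opt_outs = [
    {"name": "Jane Doe", "email": "jane.doe@example.com"},
    {"name": "John Roe", "email": "john.roe@example.com"},
]

# Write a plain CSV that any business can ingest or open in a spreadsheet.
with open("ccpa_opt_out_requests.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=["name", "email"])
    writer.writeheader()
    writer.writerows(opt_outs)
```

Nothing more elaborate is needed; the point is that delivery should be cheap enough for services to process opt outs at scale.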
The AG should look to Catalog Choice as a model for opt outs. Catalog Choice has carefully collected all the opt out mechanisms for paper mail marketing catalogs. A consumer can sign up on the site, identify catalogs to opt out from (9,000 of them!), and Catalog Choice sends either an automated email or a structured list of consumers to sellers to effectuate the opt out. This service is free. Data feeds from Catalog Choice are even recognized by data brokers as a legitimate way for consumers to stop unwanted advertising mail. Catalog Choice performs no verification of consumer identity. Again, this is acceptable because the harm of a false opt out is negligible, and because deterring that harm would make it impossible for anyone to opt out efficiently.
I served on the board of directors of Catalog Choice for years and recall no incidents of fraudulent opt outs. The bigger problem was with sellers who simply would not accept opt outs. A few would summarily deny them for no reason other than that allowing people to opt out harmed their business model, or they would claim that Catalog Choice needed a power of attorney to communicate a user’s opt out. The AG should make a specific finding that a power of attorney or any other burdensome procedure is not necessary for delivering verified opt out requests.
The AG should assume that sellers will use guile to impose costs on opt out requests and to deter them. Recall that when consumer reporting agencies (CRAs) were required to create a free credit report website, they used technical measures to block people from linking to it, so that consumers had to type the URL manually. CRAs also set up confusing, competing sites to draw consumers away from the free one. The FTC ultimately had to amend its rule to require all “free” report sites to carry a disclosure pointing to the official free site.
The definition of sell
The definition of sell in the CCPA reflects the initiative’s broad policy goal of stopping guile in data “sharing.”
From a consumer perspective, any transfer of personal information to a third party for consideration is a sale (subject to exceptions for transactional necessity, etc.). But the information industry has interpreted “sale” to mean only transfers for money. That is an unfounded, ahistorical interpretation.
The initiative sought to reestablish the intuitive contract law rule that any transfer for value is the “consideration” that makes a data exchange a sale. In the information industry’s case, that valuable consideration is often a barter exchange. For instance, in data cooperatives, sellers input their own customer list into a database in exchange for other retailers’ data.[6] Under the stilted definition of “sale” promoted by the information industry, that is not data selling. But from a consumer perspective, such cooperative “sharing” has the same effect as a “sale.”
Recent reporting about Facebook makes these dynamics clearer in the online platform context.[7] Properly understood, Facebook sold user data to application developers. If application developers enabled “reciprocity” or caused “engagement” on the Facebook platform, Facebook would give them access to personal data. From a consumer perspective, users gave their data to Facebook, and Facebook transferred user data to third parties in exchange for activity that gave economic benefit to Facebook. That’s a sale. The AG should view any transfer of personal information for value, including barter and other in-kind exchange, as a sale for “valuable consideration” under the CCPA. Doing so will make the marketplace more honest and transparent.
Disclosures that consumers understand
Over 60% of Americans believe that if a website has a privacy policy, it cannot sell data to third parties.[8]
I have come to the conclusion, based on a series of six large-scale consumer surveys and on Alan Westin’s extensive survey work, that the term “privacy policy” is inherently misleading. Consumers do not read privacy policies. They see a link to the privacy policy and conclude, “this website must have privacy.” My work is consonant with Westin’s, who, over decades of surveys, repeatedly found that most consumers think businesses handle personal data in a “confidential way.” Westin’s findings imply that consumers falsely believe there is a broad norm against data selling.
In writing consumer law, one cannot take a lawyer’s perspective. Consumers do not act or think like lawyers. Lawyers think the issue is as simple as reading a disclosure. But to the average person, the mere presence of a “privacy policy” means something substantive. It looks more like a quality seal (e.g., “organic”) than an invitation to read.
This is why the initiative and the CCPA go to such extraordinary measures to inform consumers with “Do not sell my personal information” disclosures. Absent such a clear and dramatic disclosure, consumers falsely assume that sellers have confidentiality obligations.
The CCPA is trying to thread a needle between respecting commercial speech interests and disabusing consumers of data selling misconceptions. These competing interests explain why the CCPA is opt-out for data selling: it attempts to minimize impingement on commercial free speech (in the form of data selling) while also informing consumers of businesses’ actual practices.
Let me state this again: the government’s interest in commanding the specific representation “Do not sell my personal information” is compelling because the statement is necessary 1) to disabuse consumers of the false belief that services are prohibited from selling their data, and 2) to tell consumers directly that they must take action and exercise the opt out under the CCPA. It would indeed make more sense from a consumer perspective for the CCPA to require affirmative consent. But since that may be constitutionally problematic, the CCPA has taken an opt out approach, along with a strong statement to help consumers understand their need to act. Without a visceral, dramatic disclosure, consumers will not know that they need to act to protect their privacy. Your regulatory findings should recite these value conflicts and the need for compelled speech to correct a widespread consumer misconception.
Data brokers and opting out
Vermont law now requires data brokers to register, and its registry should help Californians locate opt out opportunities. However, the AG can further assist this effort by requiring a standardized textual disclosure that is easy to find with search engines. Standardization is important because businesses tend to develop arbitrary terminology that has no meaning outside the industry. Text is important because words are easier to search for than images, and because logo-based “buttons” carry arbitrary or even conflicting semiotic meanings.
Non-discrimination norms
Section 1798.125 (“§125”) of the CCPA is the most perplexing, yet it is harmonious with the initiative’s overall intent to create markets. My understanding of §125 is that it seeks 1) to prevent platforms such as Facebook from offering a price that is widely divergent from costs. For instance, Facebook claims its average revenue per user (ARPU) in North America is about $100 per year, or roughly $8–10 per month. The CCPA seeks to prevent Facebook from charging fees greatly in excess of that amount; thus, the AG could look to ARPU as a peg for defining unreasonable incentive practices. And 2) to prevent the spread of surveillance-capitalism business models into areas where information usually is not at play, for instance at bricks-and-mortar businesses.
One area to consider under §125 is the growing number of businesses that reject cash payment. These businesses are portrayed as progressive, but the practice is actually regressive: consumers spend more when they use plastic, the practice excludes the unbanked, it subjects consumers to more security breaches, and it imposes a roughly 3% fee on all transactions. Consumers probably do not understand that modern payment systems can reidentify them and build marketing lists. The privacy implications of digital payments are neither disclosed nor mitigated, and as such, bricks-and-mortar businesses that demand digital payment may be coercive under the CCPA.
Pro-privacy incentives
Privacy laws present a paradox: schemes like the GDPR can induce companies to use data more rather than less. This is because the GDPR’s extensive data mapping and procedural rules may end up highlighting unrealized information uses. The CCPA can avoid this by creating carrots for privacy-friendly business models, something that the GDPR does not do.
The most attractive carrot for companies is an exception that broadly relieves them of CCPA duties. The AG should make the short-term, transient-use exemption the most attractive and usable one. That exception should be interpreted broadly and be readily usable by those acting in good faith. For instance, short-term uses should be interpreted to include retention of up to 13 months, so long as the data are not repurposed. The broad policy goals of the CCPA are met where an exception gives companies strong pro-privacy incentives, and there is no better incentive than encouraging companies to collect only the data they need for transactions, and to keep those data only as long as needed for anti-fraud, seasonal sales-trend analysis, and other service-related purposes. For many businesses, this period is just over one year.
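To make the proposed rule concrete, here is a minimal sketch, assuming a hypothetical record format with a timezone-aware `collected_at` field, of how a business could enforce a 13-month retention window:

```python
from datetime import datetime, timedelta, timezone

# ~13 months: a full year of seasonal comparison plus a month of overlap.
RETENTION = timedelta(days=396)

def purge_expired(records: list[dict]) -> list[dict]:
    """Keep only transaction records still inside the retention window.

    Assumes each record carries a timezone-aware 'collected_at' datetime;
    expired records are dropped rather than repurposed.
    """
    cutoff = datetime.now(timezone.utc) - RETENTION
    return [r for r in records if r["collected_at"] >= cutoff]
```

A rule this simple is the point: a good-faith business should be able to claim the exemption with a one-line retention policy rather than a GDPR-style compliance apparatus.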
Respectfully submitted,
/Chris Hoofnagle
Chris Jay Hoofnagle*
Adjunct full professor of information and of law
UC Berkeley
*Affiliation provided for identification purposes only
[1] North Dakota Secretary of State, Statewide Election Results, June 11, 2002.
[2] Hoofnagle et al., Behavioral Advertising: The Offer You Can’t Refuse, 6 Harv. L. & Pol’y Rev. 273 (2012).
[3] Jan Whittington & Chris Hoofnagle, Unpacking Privacy’s Price, 90 N.C. L. Rev. 1327 (2011).
[4] Pineda v. Williams-Sonoma Stores, Inc., 51 Cal. 4th 524 (2011).
[5] https://www.clickz.com/what-acxiom-hash-figured-out/31429/ and https://developer.myacxiom.com/code/api/endpoints/hashed-entity
[6] From Nextmark.com: “co-operative (co-op) database:
a prospecting database that is sourced from many mailing lists from many different sources. These lists are combined, de-duplicated, and sometimes enhanced to create a database that can then be used to select prospects. Many co-op operators require that you put your customers into the database before you can receive prospects from the database.”
[7] Chris Hoofnagle, Facebook and Google Are the New Data Brokers, Cornell Digital Life Initiative (2018) https://www.dli.tech.cornell.edu/blog/facebook-and-google-are-the-new-data-brokers
[8] Chris Jay Hoofnagle and Jennifer M. Urban, Alan Westin’s Privacy Homo Economicus, 49 Wake Forest Law Review 261 (2014).