FTC Hearings on Competition & Consumer Protection in the 21st Century – Topic 12: Consumer Privacy

Comments of the International Center for Law & Economics

Introduction

Digital privacy and data security are important ongoing concerns for lawmakers, particularly in light of recent, high-profile data breaches and allegations of data misuse. Understandably, in the wake of such incidents advocates regularly call for tighter restrictions on data collection and use. But, as we detail below, privacy is a highly complex topic comprising a wide variety of differing, and often conflicting, consumer preferences. While undoubtedly in need of ongoing assessment in the face of new challenges, the US federal government’s sectoral, tailored model of privacy regulation remains the soundest method of regulating privacy.

Although the US does not have a single, omnibus privacy regulation, this does not mean that the US does not have “privacy law.” In the US, there already exist generally applicable laws at both the federal and state level that provide a wide scope of protection for individuals, including consumer protection laws that apply to companies’ data use and security practices, as well as protections developed in the common law (property, contract, and tort) and criminal codes. In addition, there are specific regulations pertaining to certain kinds of information, such as medical records, personal information collected online from children, and credit reporting, as well as to the use of data in a manner that might lead to certain kinds of illegal discrimination.

Before engaging in a deeply interventionist regulatory experiment—such as intervening in the design of algorithms or imposing strict privacy regulations in contravention of revealed consumer preferences—there should be empirically justifiable reasons for doing so. In the language of economics, there should be demonstrable market failures in the provision of “privacy” (however we define that term) before centralized regulation co-opts the voluntary choices of consumers and firms in the economy.

It may well be the case that some consumers, abstractly speaking, would prefer perfect privacy and security. It is also a certainty that, faced with tradeoffs—including the price of services, the number of features, the pace of innovation, and ease of use and convenience—consumers are willing to settle for some lesser degree of privacy and security.

The responsibility of lawmakers who wish to write rules that optimize that set of tradeoffs is twofold. First, there must be a demonstration that actual failures to provide optimal privacy and security exist, relative to consumers’ revealed preferences. Second, there must also be a demonstration that new legislation will not introduce new costs that dwarf the value it is designed to create.

As we detail below, the available evidence suggests that, at least at this time, there is no demonstrable failure in the market’s provision of privacy protection or the existing legal regime’s ability to regulate it. Moreover, the experimental and theoretical literature also demonstrates that many of the proposed regulatory interventions are at best useless, and at worst destructive.

Getting regulation right is always difficult, but it is all the more so when confronting evolving technology, inconsistent and heterogeneous consumer demand, and intertwined economic effects that operate along multiple dimensions—all conditions that confront online privacy regulation:

[S]ecuring a solution that increases social welfare[] isn’t straightforward as a practical matter. From the consumer’s side, the solution needs to account for the benefits that consumers receive from content and services and the benefits of targeting ads, as well as the costs they incur from giving up data they would prefer to keep private. Then from the ad platform’s side, the solution needs to account for the investments the platform is making in providing content and the risk that consumers will attempt to free ride on those investments without providing any compensation—in the form of attention or data—in return. Finally, the solution must account for the costs incurred by both consumers and the ad platform including the costs of acquiring information necessary for making efficient decisions.

Given the complications confronting privacy regulation, and the limits of our knowledge regarding consumer preferences and business conduct in this area, the proper method of regulating privacy is, for now at least, the course that the Commission has historically taken, and which has, generally, yielded a stable, evenly administered regime: case-by-case examination of actual privacy harms and a minimalist approach to ex ante, proscriptive or prescriptive regulations, coupled with narrow legislation targeted at unambiguously problematic uses of personal information. Following this approach will allow authorities to balance flexibility and protection.

This approach to privacy protection matches the United States’ historic preference for light-touch regulation when dealing with highly dynamic markets. The Internet in the United States grew up around an ethos of “permissionless innovation” in which firms were free to experiment with business models and service offerings, and consumers were essentially free to interact with those services they found valuable relative to the costs, both in terms of money and, relevant here, in terms of personal data.

This environment has been and continues to be essentially based on “opt-out.” Many (if not most) services on the Internet are offered on the basis that user data can, within certain limits, be used by a firm to enhance its services and support its business model, thereby generating benefits to users. To varying degrees (and with varying degrees of granularity), services offer consumers the opportunity to opt out of this consent to the use of their data, although in some cases the only effective way to opt out is to refrain from using a service at all. Over time, online services have generally increased the extent of user control over the use of user data, and the types of controls have evolved as both technology and consumer preferences have changed. This trend appears to mirror general consumer preferences with respect to privacy, and this evolution of business practice has concomitantly shaped user expectations regarding privacy online.

The Commission has generally evidenced admirable restraint and assessed the relevant trade-offs, recognizing that the authorized collection and use of consumer information by data companies confers enormous benefits, even as it entails some risks. Indeed, the overwhelming conclusion of decades of intense scrutiny is that the application of ex ante privacy principles across industries is a fraught exercise, as each industry—indeed, each firm within an industry—faces a different set of consumer expectations about its provision of innovative services and offering of privacy protections.

This background reality does not mean that privacy practices and their regulation should never be debated, nor that a more prescriptive regime should never be considered. But any such efforts must begin with the collective wisdom of the agencies, scholars, and policy makers that have been operating in this space for decades, and with a deep understanding of the business realities and consumer welfare effects involved.
