
Showing 9 of 97 Results in Data Security

European Commission Tentatively Finds US Commitments ‘Adequate’: What It Means for Transatlantic Data Flows

TOTM

Under a draft “adequacy” decision unveiled today by the European Commission, data-privacy and security commitments made by the United States in an October executive order signed by President Joe Biden were found to comport with the EU’s General Data Protection Regulation (GDPR). If adopted, the decision would provide a legal basis for flows of personal data between the EU and the United States.

Read the full piece here.

Data Security & Privacy

Biden’s Data Flows Order: Does It Comport with EU Law?

TOTM

European Union officials insist that the executive order President Joe Biden signed Oct. 7 to implement a new U.S.-EU data-privacy framework must address European concerns about U.S. agencies’ surveillance practices. Awaited since March, when U.S. and EU officials reached an agreement in principle on a new framework, the order is intended to replace an earlier data-privacy framework that was invalidated in 2020 by the Court of Justice of the European Union (CJEU) in its Schrems II judgment.

Read the full piece here.

Data Security & Privacy

There Isn’t Such a Thing as Free Privacy Protection

Popular Media

Hardly a day goes by without the Federal Trade Commission announcing plans to clamp down on the tech industry. Its latest foray, a proposal for far-reaching rules to counter the bogeyman of “commercial surveillance,” comes like a great dark cloud: essentially hazy but portentous and sweeping.

Read the full piece here.

Data Security & Privacy

ICLE Comments on FTC ANPR on Commercial Surveillance and Data Security

Regulatory Comments

Executive Summary

The Federal Trade Commission (“FTC”) has issued an Advance Notice of Proposed Rulemaking (“ANPR”) on “Commercial Surveillance and Data Security,”[1] initiating a proceeding intended to result in binding rules regarding “the collection, aggregation, analysis, retention, transfer, or monetization of consumer data and the direct derivatives of that information.”[2]

There is reason to believe that streamlined and uniform federal data-security or privacy regulations could be both beneficial and within the FTC’s competence and authority. But the approach suggested by the ANPR—simultaneously sweeping and vague—appears very likely to do more harm than good. Most notably, the ANPR evinces an approach that barely acknowledges either the limits of the FTC’s authority or the tremendous consumer benefits produced by the information economy.

The FTC is uniquely positioned to understand the complexities entailed in regulating privacy and data security. It has expertise and experience in both consumer-protection and competition matters. With regard to privacy and data security, in particular, it has decades of experience bringing enforcement actions for violations of the FTC Act’s prohibition of deceptive and unfair practices. Its enforcement experience also has been bolstered by its statutory mission to conduct economic and policy research, which has, not incidentally, comprised numerous hearings, workshops, studies, and reports on issues pertinent to data policy.

The ANPR does not build on the Commission’s experience and expertise as it could, however, and its dearth of economic analysis is especially striking. Moreover, the Commission’s authority is not unbounded, and neither are its resources. Both limitations are salient when the Commission considers adopting substantive—or “legislative”—regulations under either Section 18 or Section 6 of the FTC Act. As we discuss below, the current proceeding is deficient on both substantive and procedural grounds. Absent an express grant of authority and the requisite resources from Congress, the Commission would be ill-advised to consider, much less to adopt, the kinds of sweeping data regulations that the Commercial Surveillance ANPR appears to contemplate.

A.      The FTC Must Provide More Detail Than Is Contained in the ANPR

The ANPR states that it was issued pursuant to the Commission’s Section 18 authority,[3] which both grants and restrains the FTC’s authority to adopt regulations with respect to “unfair or deceptive acts or practices in or affecting commerce” (“UDAP”).[4] Rulemaking under Section 18 of the FTC Act[5] requires that the Commission follow a careful process. As a preliminary matter, it must identify for both Congress and the public an area of inquiry under the Commission’s jurisdiction; the Commission’s objectives in the rulemaking; and regulatory alternatives under consideration.[6] Unfortunately, the Commission has not met these obligations in this ANPR.

Under Section 18, the Commission may adopt “rules which define with specificity acts or practices which are unfair or deceptive acts or practices in or affecting commerce”[7] under Section 5 of the FTC Act. Section 18 imposes express procedural requirements, in addition to those set out for this ANPR. These include, but are not limited to, requirements for a Notice of Proposed Rulemaking (“NPRM”). Section 18 also incorporates by reference the procedures prescribed by the Administrative Procedure Act.[8]

As noted, Section 18’s requirements for an ANPR are brief and preliminary but they are nonetheless real. In contravention of the requirements of Section 18, this ANPR does not clearly describe any “objectives which the Commission seeks to achieve,” and it provides no indication of “possible regulatory alternatives under consideration by the Commission.”[9] Instead, it provides a laundry list of putative harms, and it fails to identify even the most basic benefits that may be associated with diverse commercial-data practices. It does not describe the Commission’s current assessment of, or position on, those practices. And it provides no sense of the direction the Commission intends to take regarding potential rules.

Failing to identify the Commission’s objectives or proposals under consideration, this ANPR fails in its basic purpose to “invite… suggestions or alternative methods for achieving [the] objectives.”[10]

B.       The Commission Must Undertake a Cost-Benefit Analysis that Defines Harms, Identifies Benefits, and Weighs the Two

Any rules the Commission issues under a Section 18 proceeding must emerge from a cost-benefit analysis.[11] Both the potential harms and the benefits of challenged conduct must be well-defined, and they must be weighed against each other. Even at this early stage of the process, the FTC is obligated to provide more than a suggestion that some harm might be occurring, and to provide more than a hint of how it might handle those harms.

This is also good procedure for policymaking more generally, irrespective of the Commission’s statutory obligations under Section 18. Before engaging in a deeply interventionist regulatory experiment—such as imposing strict privacy regulations that contravene revealed consumer preferences—the Commission should publicly state empirically justified reasons to do so. In other words, there should be demonstrable market failures in the provision of “privacy” (however we define that term) before centralized regulation co-opts the voluntary choices of consumers and firms in the economy, and before it supplants the ability to redress any residual, cognizable harms through law enforcement with broad, economywide, ex ante rules.

Thus, a vital threshold question for any rules issued under this proceeding is whether and why markets operating without specific privacy regulation generate a suboptimal provision of privacy protection. Without this inquiry, it is unclear whether there are problems requiring regulatory intervention and, if so, what they are. Without knowing their purpose, any rules adopted are likely to be ineffective, at best, and harmful, at worst. They may increase costs for consumers and businesses alike, chill innovation, mandate harmful prescriptions for alleged privacy harms while failing to address the most serious and persistent harms, or exacerbate the risks of harm—or all of the above.

Particularly in the United States, where informational privacy is treated both legally and socially as more of a consumer preference (albeit, perhaps, a particularly important one) than a fundamental right,[12] it is difficult to determine whether our current regime produces the “right” amount of privacy protection. That cannot be determined by observing that some advocates and consumers who are particularly privacy-sensitive opine that there should be more, or more of a certain sort; nor is it enough that there have been some well-publicized violations of privacy and cases of demonstrable harm. Indeed, the fact that revealed preferences in the market tend toward relatively less privacy protection is evidence that advocates may be seeking to create a level and a type of privacy protection for which there is simply no broad-based demand. Absent a pervasive defect that suggests a broad disconnect between revealed and actual preferences, as well as a pattern of substantial net harm, the Commission should be extremely cautious before adopting preemptive and sweeping regulations.

At a minimum, the foregoing indicates that the Commission must undertake several steps before this ANPR is close to satisfying the requirements of Section 18, not to mention good government:

  • First, the Commission must proffer an adequate definition of “commercial surveillance.” While the ANPR is framed around this ominous-sounding term,[13] it is functionally defined in a way that is both sweeping and vague. It appears to encompass virtually all commercial uses of “consumer data,” albeit without providing a workable definition of “consumer data.”[14] If the Commission is contemplating a general data regulation, it should say so and enumerate the objectives such a regulation would serve. In the current ANPR, the Commission has done neither.
  • Second, the Commission must do more than merely cite diverse potential harms arising from what it terms “commercial surveillance.” The Commission has a long history of pursuing privacy and data-security cases, and it should rely on this past practice to define with specificity the types of harms—cognizable as injuries under Section 5—that it intends to pursue.

The Commission must also adequately account for the potential harms to innovation and competition that can arise from the adoption of new privacy and data-security regulations. Resources that firms invest in compliance cannot be invested in product development, customer service, or any of a host of other ends. And compliance with overly broad constraints will often curtail or deter the sort of experimentation that is at the heart of innovation.

Moreover, there is a potential tension between privacy and data security, such that mandates to increase privacy can diminish firms’ ability to ensure data security. The EU’s experience with the General Data Protection Regulation (“GDPR”) has demonstrated some of this dynamic.[15] These realities must be incorporated into the Commission’s assessment.

  • Third, the Commission must do more than merely nod to potential benefits that the modern data-driven economy provides to consumers. The clear benefits that arise from information sharing must be considered. Since the dawn of the Internet, free digital services have created significant consumer surplus. This trend continues today: Research using both survey and experimental methods has consistently found substantial benefits for consumers from sharing information in exchange for free (or subsidized) digital products. Moreover, productive conduct and consumer benefits are not limited to free digital products and services. Myriad products and services—from health care to finance to education—are made more efficient, and more widely available, by the commercial use of various forms of consumer data.

C.      The ANPR Must Account for the Effect of Any ‘Commercial Surveillance’ Rules on Consumer Welfare and Competition

The Commission is obligated to consider the likely effects of data regulation on consumers and competition. That ought to be a requirement for regulation generally, but it is an express, statutory requirement for unfairness regulation under Section 18 of the FTC Act. The Commission is uniquely well-situated to meet that mandate by virtue of its distinctive, dual competition and consumer-protection missions. Indeed, the Commission’s antitrust-enforcement experience dates to the agency’s inception. In addition, the Commission can access the considerable expertise of its Bureau of Economics, which employs experts in both industrial organization and consumer-protection economics. Yet much of that expertise appears absent from the ANPR.

This ANPR does not specify, or even sketch, the data regulations being contemplated by the Commission. Neither does it specify the Commission’s goals in the rulemaking or alternative regulatory approaches under consideration, although both are required by statute. Consequently, one cannot assess the net effects of any proposed “commercial surveillance and data security” rule on competition or consumers, because there simply is no proposed rule to assess.

The economic literature, however, does suggest caution:

  • First, as a general matter, regulations that impose substantial fixed costs on regulated firms tend to burden smaller firms and entrants more than they do large firms and incumbents.[16]
  • Second, studies of specific domestic-privacy and data-security requirements underscore the potential for unintended consequences, including competitive costs.[17]
  • Third, empirical studies of the effects of general data regulations in foreign jurisdictions, such as the EU’s GDPR, suggest that such regulations have indeed led to substantial competitive harms.[18]

The literature on the effects of GDPR and other data regulations is particularly instructive. Although it is neither definitive nor complete, it has thus far found slender (at best) benefits to competition or consumers from data regulations and considerable costs and harms from their imposition. Further experience with and study of data regulations could yield a more nuanced picture. And, again, the FTC is well-positioned to contribute to and foster a greater understanding of the competitive effects of various types of data regulation. Doing so could be greatly beneficial to policymaking, competition, and consumer welfare, precisely because specific data practices can produce substantial benefits, harms, or a complex admixture of the two. But documented harms and speculative benefits of regulation recommend caution, not blind intervention.

D.      Conclusion

The Commission should take account of a further reality: the rules it contemplates will be created in an environment filled with other privacy regulators. Although the United States does not have a single, omnibus, privacy regulation, this does not mean that the country does not have “privacy law.” Indeed, generally applicable laws providing a wide range of privacy and data-security protections already exist at both the federal and state level. These include consumer-protection laws that apply to companies’ data use and security practices,[19] as well as those that have been developed in common law (property, contract, and tort) and criminal codes.[20] In addition, there are sector-specific regulations pertaining to particular kinds of information, such as medical records, personal information collected online from children, and credit reporting, as well as regulations prohibiting the use of data in a manner that might lead to certain kinds of illegal discrimination.[21]

Despite the FTC’s noted experience in a certain slice of privacy regulation, Congress has not made the FTC the central privacy regulatory body. Neither has Congress granted the Commission the resources likely required for such a regulator. Congress has wrestled with complex tradeoffs in several areas and has allowed—through design and otherwise—various authorities to emerge. Where Congress has provided for privacy regulation, it has tailored the law to address specific concerns in specific sectors, or with respect to specific types of information. Moreover, in each case, it has balanced privacy and security concerns with other policy priorities. That balancing requires technical expertise, but it also entails essentially political judgments about the relative value of diverse policy goals; in that latter regard, it is a job for Congress.

There are, as well, questions of resource allocation that may attend an express statutory charge. We cannot gainsay the importance of the FTC’s privacy and data-security enforcement work under Section 5 of the FTC Act. At the same time, we cannot help but notice a misfit between the Commission’s congressionally allocated resources and the obligations that are entailed by data regulations of the scope contemplated in the ANPR. By way of contrast, we note that, since the compliance date of the Health Insurance Portability and Accountability Act (“HIPAA”) privacy rule, the U.S. Department of Health and Human Services (“HHS”) Office of Civil Rights (“OCR”) has investigated and resolved nearly 30,000 cases involving HIPAA-covered entities and their business associates; for appropriate cases of knowing disclosure or obtaining of protected health information, OCR has referred more than 1,500 cases to the U.S. Department of Justice (“DOJ”) for criminal prosecution.[22]

In his dissent from the issuance of this ANPR, former Commissioner Noah Phillips noted the massive and complicated undertaking it initiates:

Legislating comprehensive national rules for consumer data privacy and security is a complicated undertaking. Any law our nation adopts will have vast economic significance. It will impact many thousands of companies, millions of citizens, and billions upon billions of dollars in commerce. It will involve real trade-offs between, for example, innovation, jobs, and economic growth on the one hand and protection from privacy harms on the other. (It will also require some level of social consensus about which harms the law can and should address.) Like most regulations, comprehensive rules for data privacy and security will likely displace some amount of competition. Reducing the ability of companies to use data about consumers, which today facilitates the provision of free services, may result in higher prices—an effect that policymakers would be remiss not to consider in our current inflationary environment.[23]

This is particularly true given the Commission’s long history of work in this area. The Commission has undertaken decades of investigations and a multitude of workshops and hearings on privacy and related topics. This ANPR nods to that history, but it does not appear to make much use of it, possibly because much of it contains lessons that pull in different directions. Overall, that impressive body of work does not remotely point to the need for a single, comprehensive privacy rule. Rather, it has demonstrated that privacy regulation is complicated. It is complicated not just as a technical matter, but also because of the immense variety of consumers’ attitudes, expectations, and preferences with respect to privacy and the use of data in the economy.

The Commercial Surveillance ANPR poses 95 questions, many of which will find some answers in this prior history if it is adequately consulted. The Commission has generally evidenced admirable restraint and assessed the relevant tradeoffs, recognizing that the authorized collection and use of consumer information by companies confers enormous benefits, even as it entails some risks. Indeed, the overwhelming conclusion of decades of intense scrutiny is that the application of ex ante privacy principles across industries is a fraught exercise, as each industry—indeed each firm within an industry—faces a different set of consumer expectations about its provision of innovative services and offering of privacy protections.

These considerations all militate in favor of regulatory restraint by the FTC as a matter of policy. They also require restraint, and an emphasis on established jurisdiction, given the Supreme Court’s recent “major questions” jurisprudence.[24] As noted in the statements of several commissioners, West Virginia v. EPA[25] clarifies the constitutional limits on an agency’s authority to extend the reach of its jurisdiction via regulation. In brief, the broader the economic and political sweep of data regulations the Commission might propose, the more likely it is that such regulations exceed the FTC’s authority. If the “major questions doctrine” is implicated, the burden is on the agency to establish the specific grant of authority that is claimed.[26] Moreover, the Court was clear that a merely colorable claim of statutory implementation is inadequate to establish the authority to issue sweeping regulations with major economic and political implications.[27]

Download the full comments here.

 

[1] Trade Regulation Rule on Commercial Surveillance and Data Security, 87 Fed. Reg. 51273 (Aug. 22, 2022) (to be codified at 16 C.F.R. Ch. 1) [hereinafter “ANPR” or “Commercial Surveillance ANPR”].

[2] Id. at 51277.

[3] Id. at 51276.

[4] That is, “unfair or deceptive acts or practices in or affecting commerce,” as they are prohibited under Section 5 of the FTC Act, 15 U.S.C. § 45(a)(1).

[5] 15 U.S.C. § 57a.

[6] 15 U.S.C. § 57a(b)(2)(A).

[7] 15 U.S.C. § 57a(a)(1)(B).

[8] 15 U.S.C. § 57a(b)(1) (“When prescribing a rule under subsection (a)(1)(B) of this section, the Commission shall proceed in accordance with section 553 of title 5.”).

[9] 15 U.S.C. § 57a(b)(2)(A)(i).

[10] 15 U.S.C. § 57a(b)(2)(A)(ii).

[11] See Section III, infra (regarding the role of cost-benefit analysis under Magnuson-Moss and the statutory requirements of Section 18).

[12] Except, of course, when it comes to government access to private information, i.e., under the Fourth Amendment.

[13] See, e.g., ANPR, supra note 1 at 51273-75.

[14] The purported definition of consumer data in the ANPR, and the scope of activities around consumer data, are so overbroad as to encompass virtually the entirety of modern economic activity: “the collection, aggregation, analysis, retention, transfer, or monetization of consumer data and the direct derivatives of that information. These data include both information that consumers actively provide—say, when they affirmatively register for a service or make a purchase—as well as personal identifiers and other information that companies collect, for example, when a consumer casually browses the web or opens an app. This latter category is far broader than the first.” Id. at 51277.

[15] See, e.g., Coline Boniface, et al., Security Analysis of Subject Access Request Procedures, in Privacy Technologies & Policy: 7th Annual Privacy Forum (Maurizio Naldi, et al. eds., 2019).

[16] See, e.g., James Campbell, Avi Goldfarb & Catherine Tucker, Privacy Regulation and Market Structure, 24 J. Econ. & Mgmt. Strategy 47 (2015); Alex Marthews & Catherine Tucker, Privacy Policy and Competition, Econ. Stud. at Brookings (December 2019), available at https://www.brookings.edu/wp-content/uploads/2019/12/ES-12.04.19-Marthews-Tucker.pdf.

[17] See, e.g., Jin-Hyuk Kim & Liad Wagman, Screening Incentives and Privacy Protection in Financial Markets: A Theoretical and Empirical Analysis, 46 RAND J. Econ. 1 (2015).

[18] See, e.g., Jian Jia, Ginger Zhe Jin & Liad Wagman, The Short-run Effects of the General Data Protection Regulation on Technology Venture Investment, 40 Marketing Sci. 661 (2021).

[19] See, e.g., FTC Act, 15 U.S.C. § 45(a) et seq.

[20] See Privacy-Common Law, Law Library — American Law and Legal Information, http://law.jrank.org/pages/9409/Privacy-Common-Law.html (last visited Oct. 16, 2022).

[21] See, e.g., Comments of the Association of National Advertisers on the Competition and Consumer Protection in the 21st Century Hearings, Project Number P181201, available at https://docplayer.net/93116976-Before-the-federal-trade-commission-washington-d-c-comments-of-the-association-of-national-advertisers-on-the.html: [T]he Health Insurance Portability and Accountability Act (“HIPAA”) regulates certain health data; the Fair Credit Reporting Act (“FCRA”) regulates the use of consumer data for eligibility purposes; the Children’s Online Privacy Protection Act (“COPPA”) addresses personal information collected online from children; and the Gramm–Leach–Bliley Act (“GLBA”) focuses on consumers’ financial privacy; the Equal Employment Opportunity Commission (“EEOC”) enforces a variety of anti-discrimination laws in the workplace including the Pregnancy Discrimination Act (“PDA”) and Americans with Disabilities Act (“ADA”); the Fair Housing Act (“FHA”) protects against discrimination in housing; and the Equal Credit Opportunity Act (“ECOA”) protects against discrimination in mortgage and other forms of lending. Id. at 6.

[22] Dep’t Health & Human Servs., Health Information Privacy, Enforcement Highlights, https://www.hhs.gov/hipaa/for-professionals/compliance-enforcement/data/enforcement-highlights/index.html (HHS Office of Civil Rights, last reviewed Sep. 14, 2022).

[23] ANPR at 51293 (Dissenting Statement of Comm’r Noah J. Phillips).

[24] See W. Virginia v. Env’t Prot. Agency, 142 S. Ct. 2587, 2595 (2022) (citing a line of cases including Utility Air Regulatory Group v. EPA, 573 U.S. 302 (2014); Gonzales v. Oregon, 546 U.S. 243 (2006); Whitman v. American Trucking Assns., Inc., 531 U.S. 457, 468 (2001); and FDA v. Brown & Williamson Tobacco Corp., 529 U.S. 120, 159 (2000)).

[25] Id.

[26] See id. at 2613 (citing William Eskridge, Interpreting Law: A Primer on How to Read Statutes and the Constitution 288 (2016)).

[27] Id. at 2608-09.

Data Security & Privacy

Comments of the International Center for Law & Economics, ‘Ensuring Responsible Development of Digital Assets’

Regulatory Comments. Docket No. TREAS-DO-2022-0018; submitted Nov. 3, 2022.

I. Introduction

We thank the U.S. Treasury Department for the opportunity to participate in this Request for Comment on “Ensuring Responsible Development of Digital Assets.”[1] Our response most directly addresses part “B” of the Request for Comments, focusing particularly on the following questions:

  • “What additional steps should the United States government take to more effectively deter, detect, and disrupt the misuse of digital assets and digital asset service providers by criminals?” (B1)
  • “Are there specific areas related to AML/CFT and sanctions obligations with respect to digital assets that require additional clarity?” (B2)
  • “What additional steps should the U.S. government consider to address the illicit finance risks related to mixers and other anonymity-enhancing technologies?” (B7)
  • “What steps should the U.S. government take to effectively mitigate the illicit finance risks related to DeFi?” (B8)

Agencies whose primary function is law enforcement are chiefly concerned with the effectiveness of that mission and may not have the resources to properly consider the costs of actions that appear to promise results. We thus welcome the whole-of-government approach to the responsible development of digital assets adopted in Executive Order 14067, which invites a rigorous assessment of costs and benefits across various policy objectives.[2] The principal policy objectives set out in the Executive Order cover both law-enforcement and national-security concerns, while supporting technological advances and promoting access to safe and affordable financial services. Given the Order’s broad scope, some ways of pursuing its diverse policy objectives may be in tension. Our aim in this response is to shed light on two important areas of such tension.

First, policymakers must determine which entities in the crypto ecosystem are the most appropriate targets for law-enforcement and national-security efforts. We suggest that the costs of targeting crypto’s infrastructural or “base” layer may disproportionately impede the attainment of other policy objectives.

Second, it is important to determine the appropriate policy response to privacy-enhancing crypto technologies. The goals that Treasury seeks to advance—consumer and investor protection, promotion of access to finance, support of technological advances, and reinforcement of U.S. leadership—all point in favor of facilitating responsible use of privacy-enhancing technologies, including so-called “privacy coins.”

II. Targeting the ‘Base Layer’

Crypto’s “base layer” is in some important ways analogous to the basic infrastructure of the Internet and of traditional finance. We understand the base layer to include:

  • The “infrastructural” participants of blockchain networks—e.g., miners, validators, and node operators;[3] and
  • Service providers that directly serve the former—e.g., private relay operators like Flashbots and specialized node-hosting services[4] like Infura, Alchemy, or even Google.[5]

One approach to preventing and counteracting undesirable activity “on top” of crypto’s infrastructure layer would be to impose legal duties on base-layer participants to mitigate such activity, particularly where they may, even in some remote sense, have facilitated it. This approach will often be inappropriate, however, either because it is bound to be ineffective or because it will impose disproportionate costs relative to its benefits.

Infrastructural participants of blockchain networks are often not in the best position to apply rules like anti-money-laundering (“AML”) and combating-the-financing-of-terrorism (“CFT”) obligations because they do not have direct relationships with end users. They therefore do not possess the information needed for compliance and, even if they do act, cannot offer redress to affected users. Moreover, in open networks like Ethereum and Bitcoin, imposing legal duties on U.S.-based actors (e.g., miners or validators) is very likely to be ineffective, as many network participants will be located in other jurisdictions. Finally, some base-layer participants may simply find it impossible to comply with some legal duties, which could prompt them to leave U.S. jurisdiction.

Recent enforcement actions arising from the strict-liability duty not to facilitate transactions with entities sanctioned by the U.S. Treasury Department help to illustrate the concerns that attend imposing such duties on base-layer participants. In August 2022, a number of Ethereum addresses deployed by Tornado Cash were added to the Specially Designated Nationals and Blocked Persons List (“SDN”).[6] Following this designation—out of an abundance of caution and adopting an expansive interpretation of the law—some base-layer participants of Ethereum (validators, block builders, proposers, and relay operators) began to filter out transactions that interacted with SDN-listed Ethereum addresses, so that they would not contribute to including those transactions on the blockchain. While it appears that a fairly large segment of the base layer joined in this effort, it has been—and will very likely remain—ineffective at stopping transactions with sanctioned entities from being included on the blockchain.

One reason the filtering effort has been ineffective is that it was focused on blockchain addresses, which is what base-layer participants have access to. But sanctioned entities can create new addresses and use other methods to obfuscate their identities in transactions. The scope of filtering could theoretically be broadened, also using on-chain analysis, but this would likely be overinclusive.[7] It would therefore threaten to harm other users; potentially leave filtering base-layer operators less competitive than non-filtering ones; and likely hasten the development of changes to Ethereum to bypass such filtering.
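The limitation is easy to see in a stylized sketch (our illustration, not drawn from any actual client or relay implementation; all addresses below are hypothetical placeholders): a denylist can match only the addresses it already knows, so a sanctioned actor who moves to a freshly generated address passes the filter.

```python
# Stylized address-based filtering of the kind some base-layer operators
# adopted after the Tornado Cash designation. Addresses are hypothetical.
SDN_DENYLIST = {"0xSanctionedContract1", "0xSanctionedContract2"}

def should_include(tx: dict) -> bool:
    # A transaction is dropped if either endpoint matches a listed address.
    return tx["from"] not in SDN_DENYLIST and tx["to"] not in SDN_DENYLIST

listed = {"from": "0xAlice", "to": "0xSanctionedContract1"}
evasive = {"from": "0xAlice", "to": "0xFreshlyGeneratedAddress"}  # same actor

assert not should_include(listed)   # known address: filtered
assert should_include(evasive)      # new address: sails through the filter
```

Broadening the denylist through on-chain analysis runs directly into the overinclusiveness problem noted above.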

There are, to be sure, examples of situations where it would be difficult to use a new address to circumvent filtering. Some designated blockchain addresses (e.g., the addresses of autonomous smart contracts deployed by Tornado Cash) are not controlled by anyone and thus cannot “move” to new addresses on their own. But even where a smart contract is autonomous, its original deployers—or, in the case of open-source code, anyone—could copy the code and deploy a new smart contract that would perform the same functions as the original. The need to redeploy smart contracts to new addresses often would create significant friction and costs for all who relied on the original smart contract, but as we will note in a moment, there are also cases where redeployment may not be necessary.

Even if the scope of filtering is broadened, filtering efforts may remain ineffective for another reason: even a relatively small number of validators—including those located outside the United States—can ensure that any transaction is eventually included on the blockchain, albeit with some delay. The extent of that delay will be inversely proportional to the share of non-filtering validators among all validators, as the illustration below suggests. Importantly, the Ethereum addresses included on the Tornado Cash SDN list largely do not represent the kinds of smart contracts that require rapid communication.[8]
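The relationship between the non-filtering share and the expected delay can be made concrete with a back-of-the-envelope model (ours, assuming proposers are drawn roughly uniformly at random each slot and Ethereum's post-merge 12-second slot time; the percentages are hypothetical):

```python
# Back-of-the-envelope model: expected delay before a filtered transaction is
# included, assuming one proposer is drawn uniformly at random per slot and a
# 12-second slot time (Ethereum post-merge). Shares below are hypothetical.
SLOT_SECONDS = 12

def expected_delay_seconds(non_filtering_share: float) -> float:
    # Slots until a non-filtering proposer follow a geometric distribution
    # with success probability p, so the mean is 1 / p slots.
    if not 0.0 < non_filtering_share <= 1.0:
        raise ValueError("share must be in (0, 1]")
    return SLOT_SECONDS / non_filtering_share

for share in (0.50, 0.10, 0.01):
    secs = expected_delay_seconds(share)
    print(f"{share:.0%} non-filtering validators -> ~{secs:,.0f}s expected delay")
# 50% -> ~24s; 10% -> ~120s; 1% -> ~1,200s (about 20 minutes)
```

Under this stylized model, even if 99 percent of validators filtered, a transaction would still be included within roughly 20 minutes, on average.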

With more time-sensitive transactions—e.g., smart contracts used to liquidate on-chain collateral—delays could significantly affect utility. In cases where such delays could harm users, there would be a strong incentive to swiftly redeploy contracts to new addresses. Moreover, were the addresses of such time-sensitive smart contracts ever included on the SDN list, it would likely prompt changes to the Ethereum protocol to render base-layer filtering impossible. Indeed, development work in this direction was already underway prior to the Tornado Cash designation and may have accelerated in its aftermath. The proposed changes would involve the introduction of privacy-enhancing solutions to Ethereum, which we will discuss in the next section.

Here, we wish to focus on what these technical changes could mean for U.S. sanctions law if a determination is made that it is, indeed, illegal (on a strict-liability basis, i.e., irrespective of intent) for a U.S.-based Ethereum validator to propose (or perhaps even “attest to”) a block containing transactions with sanctioned entities.[9] If changes to the Ethereum protocol render the contents of transactions hidden from validators, then those validators could never be certain that they are in compliance with the prohibitions. This would effectively force validators (and other base-layer operators) to leave the United States. Ethereum would likely continue to function and remain accessible to U.S.-based users, but the technological and economic position that the United States currently holds in the base layer of the ecosystem would be diminished significantly.

To this point, our comments have concerned targeting the base layer for undesirable activity that happens “on top” of it—i.e., for facilitating the actions of others. It is, however, also possible for base-layer participants to engage in illicit activity in their own right. In such cases, it would certainly be appropriate that they be a target of law enforcement. For example, node operators could use their privileged access to private information about pending securities or commodities transactions in ways that would constitute market manipulation under the Securities Exchange Act or the Commodity Exchange Act.[10] Validators could also engage in potentially illegal market manipulation through some forms of “MEV extraction.”[11]

An alternative to targeting the base layer is to target the application layer—i.e., services built on top of the base layer, with the primary function of interacting with end users.[12] Of particular interest in this space are services that intermediate between crypto assets and the rest of the financial system—i.e., “on-ramps” and “off-ramps.”[13] Due to their user-facing role, such services tend to already possess—and can more easily acquire—information needed for effective compliance with legal obligations related to user activity, such as AML/CFT and sanctions obligations. Because these services have direct relationships with users, they also can ask for additional information and provide redress opportunities in certain cases—e.g., where a user is mistakenly flagged as high risk by automated tools. Moreover, crypto on- and off-ramps are already regulated as money transmitters in the United States and under analogous regulatory regimes in certain other jurisdictions.[14]

Targeting the base layer of permissionless blockchain networks may have symbolic value, but it is unlikely to achieve genuine law-enforcement or national-security goals. Imposing rules with which it would be impossible for base-layer operators to comply will simply push those operators to other jurisdictions. More effective targeting of the base layer is possible in permissioned blockchain networks, but requiring blockchain networks to be permissioned would run counter to the goal of reinforcing U.S. financial and economic leadership. It would amount to giving up on the promise of permissionless blockchains like Ethereum and Bitcoin. Finally, targeting the base layer is unnecessary, as the application layer presents a more appropriate target for legal obligations.

A.   Recommendation addressing ‘specific areas related to AML/CFT and sanctions obligations with respect to digital assets that require additional clarity’ (RFC question B2)

As we noted above, base-layer efforts to filter transactions with sanctioned entities are currently ineffective and are likely to become impossible, given in-progress technological developments. We also noted that the application layer is the more appropriate target for sanctions law. The primary effect of the prevailing uncertainty surrounding the potential legal exposure of base-layer participants of public blockchains like Ethereum and Bitcoin has been to threaten U.S. technological and economic leadership in digital assets.

The U.S. Treasury Department’s Office of Foreign Assets Control (“OFAC”) could address this uncertainty by offering a public statement—perhaps in its sanctions FAQs—that it does not regard any of the following as the prohibited facilitation of a transaction with a designated entity, either on public blockchains in general or, at least, on Ethereum and Bitcoin:

  1. To include such a transaction in a block, by mining or validating (“proposing”);
  2. To accept such a transaction and the block in which it was included as valid for the purposes of adding new blocks referencing the first block; or
  3. To receive and retransmit such a transaction for inclusion by potential miners or validators (e.g., by a node, a block builder, or a relay).

We stress that this issue is independent from any evaluation of either the propriety or legality of sanctioning any particular entity, or of the inclusion of addresses of autonomous smart contracts on the SDN list.[15]

III. Privacy-Enhancing Technologies

Ethereum and Bitcoin—the most widely used public blockchains—were not designed with user privacy in mind. Pseudonymity of blockchain addresses is easily broken, for example, whenever a user discloses their identity to make a purchase. The effect of breaking pseudonymity is that the other party will likely be able to discover the entirety of that user’s past activity on the blockchain. It is akin to a user giving someone access to their entire history of bank or credit-card transactions. The risk of so massive a breach of financial privacy—potentially exposing users to targeting by thieves and fraudsters—is inimical to the goal of “access to safe and affordable financial services” that President Biden set out in Executive Order 14067.[16]
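To illustrate how complete that exposure is, the sketch below (our illustration; the address and API key are placeholders, and we assume Etherscan's public transaction-list endpoint) retrieves an address's entire transaction history in a single query:

```python
# Illustration only: once a real-world identity is linked to an address, that
# address's entire public history is one API call away. The address and API
# key below are placeholders.
import requests

ADDRESS = "0x0000000000000000000000000000000000000000"  # hypothetical address
API_KEY = "YourEtherscanApiKey"                          # placeholder

resp = requests.get(
    "https://api.etherscan.io/api",
    params={
        "module": "account",
        "action": "txlist",   # full normal-transaction history for ADDRESS
        "address": ADDRESS,
        "sort": "asc",
        "apikey": API_KEY,
    },
    timeout=30,
)
for tx in resp.json().get("result", []):
    print(tx["hash"], tx["from"], "->", tx["to"], tx["value"], "wei")
```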

The lack of privacy on blockchains like Ethereum and Bitcoin has proven convenient for law enforcement, which has leveraged it to prosecute crimes.[17] But it would be mistaken to regard the current level of transparency as a benchmark either for “responsible” public blockchains or for services built atop them. Safe and accessible public blockchains of the near future—including planned changes to Ethereum—will not offer the same transparency on which today’s criminals and law enforcement alike rely.

It is useful to examine the now-sanctioned Tornado Cash within this context. Tornado Cash was arguably the most effective “on-chain” tool to protect user privacy.[18] For some use cases, users can enjoy similar privacy-protecting effects by routing their transactions through regulated exchanges like Coinbase, FTX, or Binance, but this comes at the expense of having to trust one of those third parties. The tradeoffs involved in going “off-chain” to achieve “on-chain” privacy include additional risk, friction, and delays, which could at least partially negate the point of using a public permissionless blockchain. If public blockchains are an innovation worth preserving and supporting, as the Executive Order implies, then a solution should be found that does not erase their primary salutary features.

Fortunately, there are technological solutions to preserve user privacy that simultaneously enable effective mitigation of illicit activity. One such solution is selective disclosure.[19] Even where the pseudonymous identifiers of senders and recipients—or the contents of a blockchain message (transaction)—are hidden, users may nonetheless be able to selectively disclose in a non-falsifiable way that, for example, they control the account from which a certain transaction was made. This would allow on- and off-ramp services between crypto-assets and the rest of the economy to serve as gatekeepers that perform appropriate AML/CFT or sanctions screening of customers who wish to exchange their “private coins” for fiat currency or other goods. To be sure, service providers and law enforcement would likely have access to less information under this sort of blockchain analysis than they do today, especially regarding the transactions of parties other than the customer in question (although service providers may have access to disclosed transactions from many customers). As we noted above, however, the current level of transparency poses a regrettable risk to user privacy and safety and thus cannot serve as a normative benchmark.
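As a rough illustration of one such mechanism (ours, not a description of any particular protocol's implementation; it assumes the Python eth_account library, and the account and challenge string are hypothetical), a user can prove control of an address by signing a verifier-chosen challenge, disclosing nothing about other addresses or transactions:

```python
# Minimal sketch of selective disclosure via a signed challenge. Assumes the
# eth_account library; the account and challenge string are hypothetical.
from eth_account import Account
from eth_account.messages import encode_defunct

user = Account.create()  # stands in for the key already controlling an address
challenge = encode_defunct(text="offramp-screening-challenge-12345")

# User side: sign the verifier's challenge with the key for the address.
signed = Account.sign_message(challenge, private_key=user.key)

# Verifier side (e.g., an off-ramp service): recover the signer's address and
# compare it to the address the customer claims to control.
recovered = Account.recover_message(challenge, signature=signed.signature)
assert recovered == user.address  # proves control; nothing else is disclosed
```

An off-ramp could pair such a proof with its ordinary AML/CFT screening, which is the gatekeeping role the comments describe.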

Tornado Cash, Zcash, and Monero all offer forms of selective disclosure.[20] While the transaction volume in these protocols is small relative to Ethereum or Bitcoin, it would be worthwhile to devote resources toward developing rules and guidance—especially for money transmitters and financial institutions—on how to facilitate transactions with those protocols responsibly. A pragmatic reason for this investment is that public blockchains and the services built on them are moving in the direction of increased privacy. Thus, the issue of privacy cannot be adequately addressed by blunt instruments like sanctioning an entire protocol, as happened with Tornado Cash. Even today, the hypothetical prohibition of Ethereum or Bitcoin would cause immense economic damage. Soon, such action could jeopardize the stability of the global economy.

As public blockchains grow, they will become more attractive both for lawful uses and for illicit uses. While illicit use may remain small as a percentage of total transactions, the volume of illicit transactions will likely rise in absolute numbers.[21] The anticipated improvements in crypto privacy will cause significant tension for the prevailing law-enforcement and national-security approaches to digital assets. In this context, Treasury’s Digital Asset Action Plan may not be entirely adequate.[22]

It is, to start, puzzling why the Digital Asset Action Plan adopted the label “anonymity-enhancing technologies,” rather than the commonly used “privacy-enhancing technologies.”[23] This focus on “anonymity” rather than “privacy” directs attention away from the tension among important policy objectives set out in Executive Order 14067. Privacy, and the aim to strengthen it (while also countering illicit activities), is mentioned 10 times in the Executive Order. Anonymity is not mentioned.

The Action Plan itself also refers to the goal of strengthening privacy several times. It is notable, however, that Priority Action 5 (“Holding Accountable Cybercriminals and Other Illicit Actors”) does not. It is in this section that the Action Plan singles out “mixing services” as an area of “primary concern.” Treasury’s recent enforcement actions—notably the branding of Tornado Cash as a “notorious (…) mixer”[24]—suggest that the term “mixing services” is meant to refer to some of the popular privacy-enhancing technologies upon which law-abiding Americans and foreign nationals alike have been relying.

In other words, rather than balancing the goals of strengthening privacy and mitigating illicit finance, as set out in the Executive Order, Priority Action 5 suggests a near-exclusive focus on the latter.[25] Furthermore, it is hard to avoid the impression that, in a further departure from the Executive Order, the Action Plan treats strengthening privacy as chiefly a research concern (and thus assigns it primarily to the National Science Foundation) and not an issue to be given considerable weight in law-enforcement or national-security missions.

A.   Recommendation: ‘additional steps the U.S. government should consider to address the illicit finance risks related to mixers and other anonymity-enhancing technologies’ (RFC question B7)

Given the value of both preserving and strengthening financial privacy, as well as the pragmatic concern that the largest public blockchains are moving in the direction of greater privacy, we suggest that a more constructive law-enforcement approach is needed with respect to the already-deployed privacy-enhancing technologies. This approach could include reversing the designation of Tornado Cash, combined with offering guidance for money transmitters and financial institutions on how to approach transactions with tools like Tornado Cash in a responsible manner. These guidelines could rely, among other mechanisms, on selective-disclosure functionalities built into privacy-enhancing tools.

 

[1] Ensuring Responsible Development of Digital Assets; Request for Comment, TREAS-DO-2022-0018-0001, 87 FR 57556, U.S. Dep’t of the Treasury (Sep. 20, 2022), https://www.federalregister.gov/d/2022-20279.

[2] Executive Order on Ensuring Responsible Development of Digital Assets, White House (Mar. 9, 2022), https://www.whitehouse.gov/briefing-room/presidential-actions/2022/03/09/executive-order-on-ensuring-responsible-development-of-digital-assets (hereinafter, “Executive Order”).

[3] Mikolaj Barczentewicz, Base Layer Regulation, Regulation of Crypto-Finance, https://cryptofinreg.org/projects/base-layer-regulation. Some operators (e.g., Infura) act as infrastructural network participants in their own right (e.g., as node operators) and also offer services to other infrastructural participants.

[4] Id.

[5] Amit Zavery & James Tromans, Introducing Blockchain Node Engine: Fully Managed Node-Hosting for Web3 Development, Google Cloud (Oct. 27, 2022), https://cloud.google.com/blog/products/infrastructure-modernization/introducing-blockchain-node-engine.

[6] U.S. Treasury Sanctions Notorious Virtual Currency Mixer Tornado Cash, U.S. Dep’t of the Treasury (Aug. 8, 2022), https://home.treasury.gov/news/press-releases/jy0916.

[7] @ElBarto_Crypto, Twitter (Aug. 13, 2022, 8:21 AM), https://twitter.com/ElBarto_Crypto/status/1558428428763815942 (“[W]hile only 0.03% of addresses received ETH from tornado cash, almost half the entire ETH network is only two hops from a tornado cash receiver.”).

[8] All but one of the designated Ethereum addresses deployed by Tornado Cash represent smart contracts, but the SDN list also includes Ethereum addresses—associated with other sanctioned entities—that do not represent smart contracts.

[9] For an argument that it is not illegal, see Rodrigo Seira, Amy Aixi Zhang, & Dan Robinson, Base Layer Neutrality: Sanctions and Censorship Implications for Blockchain Infrastructure, Paradigm (Sep. 8, 2022), https://www.paradigm.xyz/2022/09/base-layer-neutrality.

[10] Mikolaj Barczentewicz & Anton Wahrstätter, How Transparent Is Ethereum and What Could This Mean for Regulation?, Regulation of Crypto-Finance, https://cryptofinreg.org/projects/public-data-supervision.

[11] Mikolaj Barczentewicz & Alexander F. Sarch, Shedding Light in the Dark Forest: A Theory of Liability for Cryptocurrency “MEV” Sandwich Attacks, available at SSRN (Oct. 5, 2022), https://papers.ssrn.com/sol3/papers.cfm?abstract_id=4187752.

[12] Autonomous smart contracts that do not rely on off-chain cooperation and are not controlled by anyone are not part of the application layer, as we understand it here. From the perspective of asserting legal control, they are functionally part of the base layer (e.g., “removing” such a smart contract from the blockchain would require the cooperation of an overwhelming majority of validators). Also, strictly speaking, end users may interact with some base-layer participants directly, e.g., by submitting transactions to a node’s remote-procedure-call (RPC) interface.

[13] See also Miles Jennings, Regulate Web3 Apps, Not Protocols, a16z (Sep. 29, 2022), https://a16zcrypto.com/web3-regulation-apps-not-protocols.

[14] Application of FinCEN’s Regulations to Certain Business Models Involving Convertible Virtual Currencies, Financial Crimes Enforcement Network (May 9, 2019), https://www.fincen.gov/resources/statutes-regulations/guidance/application-fincens-regulations-certain-business-models.

[15] There has been some controversy regarding the legality of sanctioning the autonomous smart contracts deployed by Tornado Cash. See Paul Grewal, Sanctions Should Target Bad Actors. Not Technology., Coinbase (Sep. 8, 2022), https://www.coinbase.com/blog/sanctions-should-target-bad-actors-not-technology; Jerry Brito & Peter Van Valkenburgh, Coin Center Is Suing OFAC Over Its Tornado Cash Sanction, Coincenter (Oct. 12, 2022), https://www.coincenter.org/coin-center-is-suing-ofac-over-its-tornado-cash-sanction; Steve Engel & Brian Kulp, OFAC Cannot Shut Down Open-Source Software, Dechert LLP (Oct. 18, 2022), https://ipfs.io/ipfs/QmTC9q5yidSWoM2HZwyTwB3VbQLVbG5cpDSBTaLP8voYNX.

[16] Executive Order, supra note 2, at Sec. 1. On cryptocurrencies’ promise for financial inclusion, including in situations especially needing privacy (e.g., domestic violence, authoritarian regimes), see, e.g., Alex Gladstein, Finding Financial Freedom in Afghanistan, Bitcoin Magazine (Aug. 26, 2021), https://bitcoinmagazine.com/culture/bitcoin-financial-freedom-in-afghanistan; Charlene Fadirepo, Why Bitcoin Is a Tool for Social Justice, CoinDesk (Feb. 17, 2022), https://www.coindesk.com/layer2/2022/02/16/why-bitcoin-is-a-tool-for-social-justice; How Cryptocurrency Meets Residents’ Economic Needs in Sub-Saharan Africa, Chainalysis (Sep. 29, 2022), https://blog.chainalysis.com/reports/sub-saharan-africa-cryptocurrency-geography-report-2022-preview.

[17] See, e.g., Andy Greenberg, Inside the Bitcoin Bust That Took Down the Web’s Biggest Child Abuse Site, Wired (Apr. 7, 2022), https://www.wired.com/story/tracers-in-the-dark-welcome-to-video-crypto-anonymity-myth.

[18] For an explanation of Tornado Cash’s functionality, see Alex Wade, Michael Lewellen, & Peter Van Valkenburgh, How Does Tornado Cash Work?, Coincenter (Aug. 25, 2022) https://www.coincenter.org/education/advanced-topics/how-does-tornado-cash-work.

[19] See also Peter Van Valkenburgh, Open Matters: Why Permissionless Blockchains Are Essential to the Future of the Internet, Coincenter (December 2016) https://www.coincenter.org/open-matters-why-permissionless-blockchains-are-essential-to-the-future-of-the-internet.

[20] Zooko Wilcox & Paige Peterson, The Encrypted Memo Field, Electric Coin Co. (Dec. 5, 2016), https://electriccoin.co/blog/encrypted-memo-field; View Key, Moneropedia, https://www.getmonero.org/resources/moneropedia/viewkey.html; Wade, Lewellen, & Van Valkenburgh, supra note 18.

[21] Crypto Crime Trends for 2022: Illicit Transaction Activity Reaches All-Time High in Value, All-Time Low in Share of All Cryptocurrency Activity, Chainalysis (Jan. 6, 2022), https://blog.chainalysis.com/reports/2022-crypto-crime-report-introduction.

[22] Action Plan to Address Illicit Financing Risks of Digital Assets, U.S. Dep’t of the Treasury (Sep. 20, 2022), https://home.treasury.gov/system/files/136/Digital-Asset-Action-Plan.pdf.

[23] A query for “anonymity-enhancing technologies” in the Google Scholar database returns about 40 results, while a query for “privacy-enhancing technologies” returns more than 30,000 results. See https://scholar.google.com/scholar?q=%22anonymity-enhancing+technologies%22 (accessed Oct. 28, 2022); https://scholar.google.com/scholar?q=%22privacy-enhancing+technologies%22 (accessed Oct. 28, 2022).

[24] U.S. Department of the Treasury, supra note 6.

[25] U.S. Department of the Treasury, supra note 22.

Data Security & Privacy

A webinar briefing on the Credit Card Competition Act


Senators Richard Durbin (D-Ill.) and Roger Marshall (R-Kan.) recently introduced the Credit Card Competition Act, which would effectively enable merchants to route credit card transactions over a network other than the main one affiliated with the card. The sponsors say that this will increase “competition” and reduce costs for merchants, who will pass on the savings to consumers.

But are Durbin and Marshall being overly optimistic? Have they perhaps missed some predictable but unintended consequences that might cause their act to harm rather than help consumers?

We hope you will join our esteemed colleagues Julian Morris and Todd Zywicki for a timely discussion of this proposed legislation.

Financial Regulation & Corporate Governance

ICLE Executive Summary of FTC Commercial Surveillance ANPR Comments

Regulatory Comments A preview of the response ICLE is preparing to the FTC's 95-question advance notice of proposed rulemaking on "commercial surveillance and data security."

I. Introduction

The Federal Trade Commission (“FTC”) has issued an Advance Notice of Proposed Rulemaking (“ANPR”) on “Commercial Surveillance and Data Security,”[1] initiating a proceeding intended to result in binding rules regarding “the collection, aggregation, analysis, retention, transfer, or monetization of consumer data and the direct derivatives of that information.”[2] Under some possible approaches, streamlined and uniform federal data-security or privacy regulations could be both beneficial and within the FTC’s competence and authority. But the approach suggested by the ANPR—simultaneously sweeping and vague—appears very likely to do more harm than good. Most notably, the ANPR evinces an approach that barely acknowledges either the limits of the FTC’s authority or the tremendous consumer benefits produced by the information economy.

The FTC is uniquely positioned to understand the complexities entailed in regulating privacy and data security. It has expertise and experience in both consumer protection and competition matters; with regard to privacy and data security, in particular, it has decades of experience bringing enforcement actions for violations of the FTC Act’s prohibition of deceptive and unfair practices; and its enforcement experience has been bolstered by its statutory mission to conduct economic and policy research, which has, not incidentally, comprised numerous hearings, workshops, studies, and reports on issues pertinent to data policy.

The Commission’s authority is not unbounded, however, and neither are its resources. Both limitations become salient when the Commission considers adopting substantive—or “legislative”—regulations under either Section 18 or Section 6 of the FTC Act. As we discuss below, the current proceeding is deficient on both substantive and procedural grounds. Absent an express grant of authority and the requisite resources from Congress, the Commission would be ill-advised to consider, much less adopt, the kinds of sweeping data regulations that the Commercial Surveillance ANPR appears to contemplate.

II. The FTC Must Provide More Detail than Is Contained in the ANPR

The ANPR states that it was issued pursuant to the Commission’s Section 18 authority,[3] which both grants and restrains the FTC’s authority to adopt regulations with respect to “unfair or deceptive acts or practices in or affecting commerce” (“UDAP”).[4] Rulemaking under Section 18 of the FTC Act[5] requires that the Commission follow a careful process: as a preliminary matter, it must identify, for Congress and the public, an area of inquiry under the Commission’s jurisdiction; the Commission’s objectives in the rulemaking; and the regulatory alternatives under consideration.[6] Unfortunately, the Commission has not met these obligations in this ANPR.

Under Section 18, the Commission may adopt “rules which define with specificity acts or practices which are unfair or deceptive acts or practices in or affecting commerce”[7] under Section 5 of the FTC Act. Section 18 imposes express procedural requirements beyond those governing this ANPR, including but not limited to requirements for a Notice of Proposed Rulemaking (“NPRM”). Section 18 also incorporates by reference the procedures prescribed by the Administrative Procedure Act.[8]

As noted, Section 18’s requirements for an ANPR are brief and preliminary, but they are nonetheless real. In contravention of those requirements, this ANPR does not clearly describe any “objectives which the Commission seeks to achieve,” and it provides no indication of “possible regulatory alternatives under consideration by the Commission.”[9] Instead, it provides a laundry list of putative harms, and it fails to identify even the most basic benefits that may be associated with diverse commercial-data practices. It does not describe the Commission’s current assessment of, or position on, those practices. And it provides no sense of the direction the Commission intends to take regarding potential rules.

Because it identifies neither the Commission’s objectives nor the regulatory proposals under consideration, the ANPR fails in its basic purpose: to “invite… suggestions or alternative methods for achieving [the] objectives.”[10]

III. The Commission Must Undertake a Cost-Benefit Analysis that Defines Harms, Identifies Benefits, and Weighs the Two

Any rules the Commission issues under a Section 18 proceeding must emerge from a cost-benefit analysis. Both the potential harms and the benefits of challenged conduct must be well-defined, and they must be weighed against each other. Even at this early stage of the process, the FTC is obligated to provide more than a suggestion that some harm might be occurring, and to provide more than a hint of how it might handle those harms.
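The required balancing can be made concrete with a stylized net-welfare condition. This is a sketch of our own, not a formula the statute or the ANPR prescribes, and every symbol in it is illustrative:

\[
\Delta W \;=\; \underbrace{H_{\text{avoided}}}_{\text{cognizable harms the rule prevents}} \;-\; \underbrace{\left( C_{\text{compliance}} + C_{\text{innovation}} + S_{\text{forgone}} \right)}_{\text{compliance costs, deterred innovation, and forgone consumer surplus}} \;>\; 0
\]

Each term is an expectation over uncertain outcomes, which is why both sides must be well-defined before they can be weighed; a rule whose harms-avoided term remains speculative while its cost terms are well-documented cannot satisfy the condition.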

Irrespective of the statutory obligations to do so, this is also good procedure for policymaking. Before engaging in a deeply interventionist regulatory experiment—such as imposing strict privacy regulations in contravention of revealed consumer preferences—there should be empirically justified reasons for doing so. In other words, there should be demonstrable market failures in the provision of “privacy” (however we define that term) before centralized regulation co-opts the voluntary choices of consumers and firms in the economy, and before it supplants the ability to redress any residual, cognizable harms through law enforcement with broad, economywide, ex ante rules.

Thus, a vital threshold question for any rules issued under this proceeding is whether, and why, markets operating without specific privacy regulation generate a suboptimal level of privacy protection. Without this inquiry, it is unclear what problems the rules are needed to address. And without knowing their purpose, any rules are likely to be ineffective, at best, and harmful, at worst. They may increase costs for consumers and businesses alike; mandate harmful prescriptions for alleged privacy harms; or exacerbate the risks of harm—or all of the above. Whether such costs would be accompanied by concrete gains material to most consumers is likewise an open question.

Particularly in the United States, where privacy is treated both legally and socially as more of a consumer preference (albeit perhaps a particularly important one) than a fundamental right,[11] it is difficult to determine whether our current regime produces the “right” amount of privacy protection. That determination cannot rest, however, on the fact that privacy advocates and particularly privacy-sensitive consumers think there should be more protection, or protection of a certain sort; nor is it enough that there have been some well-publicized violations of privacy and cases of demonstrable harm. Indeed, the fact that revealed preferences in the market tend toward relatively less privacy protection is evidence that advocates may be seeking to create a level of privacy protection for which there is simply no broad-based demand. Absent a pervasive defect that suggests a broad disconnect between revealed and actual preferences, as well as a pattern of substantial net harm, the Commission should be extremely cautious before adopting preemptive and sweeping regulations.

At a minimum, the foregoing indicates that the Commission must undertake several steps before this ANPR is close to satisfying the requirements of Section 18, not to mention good government:

  • First, the Commission must proffer an adequate definition of “commercial surveillance.” While the ANPR is framed around this ominous-sounding term,[12] the term is functionally described in a way that is both sweeping and vague. It appears to encompass virtually all commercial uses of “consumer data,” albeit without providing a workable definition of “consumer data.”[13] If the Commission is contemplating a general data regulation, it should say so and enumerate the objectives such a regulation would serve. In the current ANPR, the Commission has done neither.
  • Second, the Commission must do more than merely cite diverse potential harms arising from what it terms “commercial surveillance.” The Commission has a long history of pursuing privacy and data-security cases, and it should rely on this past practice to define with specificity the types of harms—cognizable as injuries under Section 5—that it intends to pursue.
    The Commission must also adequately account for the potential harms to innovation and competition that can arise from the adoption of new privacy and data-security regulations. Resources that firms invest in compliance cannot be invested in product development, customer service, or any of a host of other ends. And compliance with overly broad constraints will often curtail or deter the sort of experimentation that is at the heart of innovation.
    Moreover, there is a potential tension between privacy and data security, such that mandates to increase privacy can diminish the ability of firms to ensure data security. The EU’s experience with the General Data Protection Regulation (“GDPR”) has demonstrated some of this dynamic.[14] These realities must be incorporated into the Commission’s assessment.
  • Third, the Commission must do more than merely nod to potential benefits that the modern data-driven economy provides to consumers. The clear benefits that arise from information sharing must be considered. Since the dawn of the Internet, free digital services have created significant consumer surplus. This trend continues today: Research using both survey and experimental methods has consistently found substantial benefits for consumers from sharing information in exchange for free (or subsidized) digital products. Moreover, productive conduct and consumer benefits are not limited to free digital products and services. Myriad products and services—from health care to finance to education—are made more efficient, and more widely available, by the commercial use of various forms of consumer data.

IV. The ANPR Must Account for the Effect of Any ‘Commercial Surveillance’ Rules on Consumer Welfare and Competition

The Commission is obligated to consider the likely effects of data regulation on consumers and competition. That ought to be a requirement for regulation generally, but it is an express statutory requirement for regulation under Section 18 of the FTC Act. The Commission is uniquely well-situated to meet that mandate by virtue of its distinctive, dual competition and consumer-protection mandates. Indeed, the Commission’s antitrust-enforcement experience dates to the agency’s inception. In addition, the Commission can access the considerable expertise of its Bureau of Economics, which employs experts in both industrial organization and consumer-protection economics.

The Commercial Surveillance ANPR does not specify, or even sketch, the data regulations the Commission is contemplating, as the statute requires. Consequently, one cannot assess the net effects of any proposed “commercial surveillance and data security” rule on competition or consumers, because there simply is no proposed rule to assess.

The economic literature, however, does suggest caution:

  • First, as a general matter, regulations that impose substantial fixed costs on regulated firms tend to burden smaller firms and entrants more than large firms and incumbents.[15]
  • Second, studies of specific domestic-privacy and data-security requirements underscore the potential for unintended consequences, including competitive costs.[16]
  • Third, empirical studies of the effects of general data regulations in foreign jurisdictions, such as the EU’s GDPR, suggest that such regulations have indeed led to substantial competitive harms.[17]

The literature on the effects of GDPR and other data regulations is particularly instructive. Although it is neither definitive nor complete, it has thus far found slender (at best) benefits to competition or consumers from data regulations and considerable costs and harms from their imposition. Further experience with and study of data regulations could yield a more nuanced picture. And, again, the FTC is well-positioned to contribute to and foster a greater understanding of the competitive effects of various types of data regulation. Doing so could be greatly beneficial to policymaking, competition, and consumer welfare, precisely because specific data practices can produce substantial benefits, harms, or a complex admixture of the two. But documented harms and speculative benefits of regulation recommend caution, not blind intervention.

V. Conclusion

The Commission should take account of a further reality: the rules it contemplates will be created in an environment filled with other privacy regulators. Although the United States does not have a single, omnibus, privacy regulation, this does not mean that the country does not have “privacy law.” Indeed, there already exist generally applicable laws at both the federal and state level that provide a wide scope of protection for individuals, including consumer-protection laws that apply to companies’ data use and security practices,[18] as well as those that have been developed in common law (property, contract, and tort) and criminal codes.[19] In addition, there are sector-specific regulations pertaining to particular kinds of information, such as medical records, personal information collected online from children, and credit reporting, as well as regulations prohibiting the use of data in a manner that might lead to certain kinds of illegal discrimination.[20]

Despite the FTC’s noted experience in a certain slice of privacy regulation, Congress has not made the FTC the central privacy regulator. Congress has wrestled with complex tradeoffs in several different areas and has allowed—through design and otherwise—various authorities to emerge in this area. Thus, given the Supreme Court’s recent “major questions” jurisprudence,[21] any rules that emerge from this process must be properly constrained.

In his dissent from the issuance of this ANPR, Commissioner Noah Phillips noted the massive, complicated undertaking it initiates:

Legislating comprehensive national rules for consumer data privacy and security is a complicated undertaking. Any law our nation adopts will have vast economic significance. It will impact many thousands of companies, millions of citizens, and billions upon billions of dollars in commerce. It will involve real trade-offs between, for example, innovation, jobs, and economic growth on the one hand and protection from privacy harms on the other. (It will also require some level of social consensus about which harms the law can and should address.) Like most regulations, comprehensive rules for data privacy and security will likely displace some amount of competition. Reducing the ability of companies to use data about consumers, which today facilitates the provision of free services, may result in higher prices—an effect that policymakers would be remiss not to consider in our current inflationary environment.[22]

This is particularly true given the Commission’s long history of work in this area. The Commission has undertaken decades of investigations and a multitude of workshops and hearings on privacy and related topics. This ANPR nods to that history, but it does not seem to make much use of it, possibly because much of that history contains lessons that pull in different directions. Overall, that impressive body of work does not remotely point to the need for a single, comprehensive privacy rule. Rather, it has demonstrated that privacy regulation is complicated. It is complicated not just as a technical matter, but also because of the immense variety of consumers’ attitudes, expectations, and preferences with respect to privacy and the use of data in the economy.

The Commercial Surveillance ANPR poses 95 questions, many of which will find some answers in this prior history if it is adequately consulted. The Commission has generally evidenced admirable restraint and assessed the relevant tradeoffs, recognizing that the authorized collection and use of consumer information by companies confers enormous benefits, even as it entails some risks. Indeed, the overwhelming conclusion of decades of intense scrutiny is that the application of ex ante privacy principles across industries is a fraught exercise, as each industry—indeed each firm within an industry—faces a different set of consumer expectations about its provision of innovative services and offering of privacy protections.

[1] Trade Regulation Rule on Commercial Surveillance and Data Security, 87 Fed. Reg. 51273 (Aug. 22, 2022) (to be codified at 16 C.F.R. Ch. 1) [hereinafter “ANPR” or “Commercial Surveillance ANPR”].

[2] Id. at 51277.

[3] Id. at 51276.

[4] That is, “unfair or deceptive acts or practices in or affecting commerce,” as they are prohibited under Section 5 of the FTC Act, 15 U.S.C. § 45(a)(1).

[5] 15 U.S.C. § 57a.

[6] 15 U.S.C. § 57a(b)(2)(A).

[7] 15 U.S.C. § 57a(a)(1)(B).

[8] 15 U.S.C. § 57a(b)(1) (“When prescribing a rule under subsection (a)(1)(B) of this section, the Commission shall proceed in accordance with section 553 of title 5.”).

[9] 15 U.S.C. § 57a(b)(2)(A)(i).

[10] 15 U.S.C. § 57a(b)(2)(A)(ii).

[11] Except, of course, when it comes to government access to private information, i.e., under the Fourth Amendment.

[12] See, e.g., ANPR at 51273-75.

[13] The purported definition of consumer data in the ANPR, and the scope of activities around consumer data, are so overbroad as to encompass virtually the entirety of modern economic activity: “the collection, aggregation, analysis, retention, transfer, or monetization of consumer data and the direct derivatives of that information. These data include both information that consumers actively provide—say, when they affirmatively register for a service or make a purchase—as well as personal identifiers and other information that companies collect, for example, when a consumer casually browses the web or opens an app. This latter category is far broader than the first.” Id. at 51277.

[14] See, e.g., Coline Boniface et al., Security Analysis of Subject Access Request Procedures, in Privacy Technologies and Policy: APF 2019 (Maurizio Naldi et al. eds., 2019); see also Peter Swire & DeBrae Kennedy-Mayo, The Effects of Data Localization on Cybersecurity, Georgia Tech Scheller College of Business Research Paper No. 4030905 (2022), available at https://ssrn.com/abstract=4030905 or http://dx.doi.org/10.2139/ssrn.4030905.

[15] See, e.g., James Campbell, Avi Goldfarb & Catherine Tucker, Privacy Regulation and Market Structure, 24 J. Econ. & Mgmt. Strategy 47 (2015); Alex Marthews & Catherine Tucker, Privacy Policy and Competition, Econ. Stud. at Brookings (Dec. 2019), available at https://www.brookings.edu/wp-content/uploads/2019/12/ES-12.04.19-Marthews-Tucker.pdf.

[16] See, e.g., Jin-Hyuk Kim & Liad Wagman, Screening Incentives and Privacy Protection in Financial Markets: A Theoretical and Empirical Analysis, 46 RAND J. Econ. 1 (2015).

[17] See, e.g., Jian Jia, Ginger Zhe Jin & Liad Wagman, The Short-run Effects of the General Data Protection Regulation on Technology Venture Investment, 40 Marketing Sci. 661 (2021).

[18] See, e.g., FTC Act, 15 U.S.C. § 45(a) et seq.

[19] See Privacy – Common Law, Law Library: American Law and Legal Information, http://law.jrank.org/pages/9409/Privacy-Common-Law.html (last visited Oct. 16, 2022).

[20] See, e.g., Comments of the Association of National Advertisers on the Competition and Consumer Protection in the 21st Century Hearings, Project Number P181201, available at https://docplayer.net/93116976-Before-the-federal-trade-commission-washington-d-c-comments-of-the-association-of-national-advertisers-on-the.html:

[T]he Health Insurance Portability and Accountability Act (“HIPAA”) regulates certain health data; the Fair Credit Reporting Act (“FCRA”) regulates the use of consumer data for eligibility purposes; the Children’s Online Privacy Protection Act (“COPPA”) addresses personal information collected online from children; the Gramm–Leach–Bliley Act (“GLBA”) focuses on consumers’ financial privacy; the Equal Employment Opportunity Commission (“EEOC”) enforces a variety of anti-discrimination laws in the workplace, including the Pregnancy Discrimination Act (“PDA”) and the Americans with Disabilities Act (“ADA”); the Fair Housing Act (“FHA”) protects against discrimination in housing; and the Equal Credit Opportunity Act (“ECOA”) protects against discrimination in mortgage and other forms of lending.

Id. at 6.

[21] See, e.g., W. Virginia v. Env’t Prot. Agency, 142 S. Ct. 2587, 2595 (2022).

[22] ANPR at 51293 (Dissenting Statement of Comm’r Noah J. Phillips).

Continue reading
Data Security & Privacy

Gus Hurwitz on FTC Consent Decrees

Presentations & Interviews ICLE Director of Law & Economics Programs Gus Hurwitz joined Steptoe & Johnson LLP’s The Cyberlaw Podcast to discuss (among other topics): The FTC is likely . . .

ICLE Director of Law & Economics Programs Gus Hurwitz joined Steptoe & Johnson LLP’s The Cyberlaw Podcast to discuss, among other topics, the FTC’s use of consent decrees.

The full podcast episode is embedded below.

Continue reading
Antitrust & Consumer Protection

US-EU Data-Privacy Framework

TL;DR On Oct. 7, President Joe Biden signed an executive order to implement the U.S.-EU data-privacy framework.

Background…

On Oct. 7, President Joe Biden signed an executive order to implement the U.S.-EU data-privacy framework. The order had been awaited since March, when U.S. and EU officials reached an agreement in principle on a new framework, which EU officials insist must address concerns about surveillance practices by U.S. agencies. An earlier data-privacy framework was invalidated in 2020 by the Court of Justice of the European Union (CJEU) in its Schrems II judgment.

But…

The European Commission will now consider whether to issue an “adequacy decision” for the U.S. This is urgent, because national data-protection authorities in the EU have been using a strained interpretation of the EU General Data Protection Regulation (GDPR) to prosecute various workarounds that companies have employed to transfer data between the U.S. and the EU. Like prior U.S.-EU arrangements, the order is likely to be challenged before the EU courts, but preliminary legal analysis suggests that this one has a greater chance of being upheld.

Read the full explainer here.

Continue reading
Data Security & Privacy