Showing 9 of 90 Publications in Data Security

ICLE Executive Summary of FTC Commercial Surveillance ANPR Comments

Regulatory Comments A preview of the response ICLE is preparing to the FTC's 95-question advance notice of proposed rulemaking on "commercial surveillance and data security."

I. Introduction

The Federal Trade Commission (“FTC”) has issued an Advance Notice of Proposed Rulemaking (“ANPR”) on “Commercial Surveillance and Data Security,”[1] initiating a proceeding intended to result in binding rules regarding “the collection, aggregation, analysis, retention, transfer, or monetization of consumer data and the direct derivatives of that information.”[2] Under some possible approaches, streamlined and uniform federal data-security or privacy regulations could be both beneficial and within the FTC’s competence and authority. But the approach suggested by the ANPR—simultaneously sweeping and vague—appears very likely to do more harm than good. Most notably, the ANPR evinces an approach that barely acknowledges either the limits of the FTC’s authority or the tremendous consumer benefits produced by the information economy.

The FTC is uniquely positioned to understand the complexities entailed in regulating privacy and data security. It has expertise and experience in both consumer protection and competition matters; with regard to privacy and data security, in particular, it has decades of experience bringing enforcement actions for violations of the FTC Act’s prohibition of deceptive and unfair practices; and its enforcement experience has been bolstered by its statutory mission to conduct economic and policy research, which has, not incidentally, comprised numerous hearings, workshops, studies, and reports on issues pertinent to data policy.

The Commission’s authority is not unbounded, however, and neither are its resources. Both limitations become salient when the Commission considers adopting substantive—or “legislative”—regulations under either Section 18 or Section 6 of the FTC Act. As we discuss below, the current proceeding is deficient on both substantive and procedural grounds. Absent an express grant of authority and the requisite resources from Congress, the Commission would be ill-advised to consider, much less adopt, the kinds of sweeping data regulations that the Commercial Surveillance ANPR appears to contemplate.

II. The FTC Must Provide More Detail than Is Contained in the ANPR

The ANPR states that it was issued pursuant to the Commission’s Section 18 authority,[3] which both grants and restrains the FTC’s authority to adopt regulations with respect to “unfair or deceptive acts or practices in or affecting commerce” (“UDAP”).[4] Rulemaking under Section 18 of the FTC Act[5] requires that the Commission follow a careful process: as a preliminary matter, it must identify, for Congress and the public, an area of inquiry under the Commission’s jurisdiction; the Commission’s objectives in the rulemaking; and regulatory alternatives under consideration.[6] Unfortunately, the Commission has not met these obligations in this ANPR.

Under Section 18, the Commission may adopt “rules which define with specificity acts or practices which are unfair or deceptive acts or practices in or affecting commerce”[7] under Section 5 of the FTC Act. Section 18 imposes express procedural requirements, in addition to those set out for this ANPR. These include but are not limited to requirements for a Notice of Proposed Rulemaking (“NPRM”). Section 18 also incorporates by reference the procedures prescribed by the Administrative Procedure Act.[8]

As noted, Section 18’s requirements for an ANPR are brief and preliminary but they are nonetheless real. In contravention of the requirements of Section 18, this ANPR does not clearly describe any “objectives which the Commission seeks to achieve,” and it provides no indication of “possible regulatory alternatives under consideration by the Commission.”[9] Instead, it provides a laundry list of putative harms, and it fails to identify even the most basic benefits that may be associated with diverse commercial-data practices. It does not describe the Commission’s current assessment of, or position on, those practices. And it provides no sense of the direction the Commission intends to take regarding potential rules.

Failing to identify the Commission’s objectives or proposals under consideration, this ANPR fails in its basic purpose to “invite… suggestions or alternative methods for achieving [the] objectives.”[10]

III. The Commission Must Undertake a Cost-Benefit Analysis that Defines Harms, Identifies Benefits, and Weighs the Two

Any rules the Commission issues under a Section 18 proceeding must emerge from a cost-benefit analysis. Both the potential harms and the benefits of challenged conduct must be well-defined, and they must be weighed against each other. Even at this early stage of the process, the FTC is obligated to provide more than a suggestion that some harm might be occurring, and to provide more than a hint of how it might handle those harms.

Irrespective of the statutory obligations to do so, this is also good procedure for policymaking. Before engaging in a deeply interventionist regulatory experiment—such as imposing strict privacy regulations in contravention of revealed consumer preferences—there should be empirically justified reasons for doing so. In other words, there should be demonstrable market failures in the provision of “privacy” (however we define that term) before centralized regulation co-opts the voluntary choices of consumers and firms in the economy, and before it supplants the ability to redress any residual, cognizable harms through law enforcement with broad, economywide, ex ante rules.

Thus, a vital threshold question for any rules issued under this proceeding is whether and why markets operating without specific privacy regulation generate a suboptimal provision of privacy protection. Without this inquiry, it is unclear what problems the rules are needed to address. Without knowing their purpose, any rules are likely to be ineffective, at best, and harmful, at worst. They may increase costs for consumers and businesses alike; mandate harmful prescriptions for alleged privacy harms; or exacerbate the risks of harm—or all of the above. Whether such costs would be accompanied by concrete gains material to most consumers also remains an open question.

Particularly in the United States, where privacy is treated both legally and socially as more of a consumer preference (albeit perhaps a particularly important one) than a fundamental right,[11] it is difficult to determine whether our current regime produces the “right” amount of privacy protection. That privacy advocates and particularly privacy-sensitive consumers would prefer more protection, or protection of a certain sort, is surely not sufficient to make that determination; nor is it enough that there have been some well-publicized privacy violations and cases of demonstrable harm. Indeed, the fact that revealed preferences in the market tend toward relatively less privacy protection is evidence that advocates may be seeking to create a level of privacy protection for which there is simply no broad-based demand. Absent a pervasive defect that suggests a broad disconnect between revealed and actual preferences, as well as a pattern of substantial net harm, the Commission should be extremely cautious before adopting preemptive and sweeping regulations.

At a minimum, the foregoing indicates that the Commission must undertake several steps before this ANPR is close to satisfying the requirements of Section 18, not to mention good government:

  • First, the Commission must proffer an adequate definition of “commercial surveillance.” While the ANPR is framed around this ominous-sounding term,[12] it is functionally described in a way that is both sweeping and vague. It appears to encompass virtually all commercial uses of “consumer data,” albeit without providing a workable definition of “consumer data.”[13] If the Commission is contemplating a general data regulation, it should say so and enumerate the objectives such a regulation would serve. In the current ANPR, the Commission has done neither.
  • Second, the Commission must do more than merely cite diverse potential harms arising from what it terms “commercial surveillance.” The Commission has a long history of pursuing privacy and data-security cases, and it should rely on this past practice to define with specificity the types of harms—cognizable as injuries under Section 5—that it intends to pursue.
    The Commission must also adequately account for the potential harms to innovation and competition that can arise from the adoption of new privacy and data-security regulations. Resources that firms invest in compliance cannot be invested in product development, customer service, or any of a host of other ends. And compliance with overly broad constraints will often curtail or deter the sort of experimentation that is at the heart of innovation.
    Moreover, there is a potential tension between privacy and data security, such that mandates to increase privacy can diminish the ability of firms to ensure data security. The EU’s experience with the General Data Protection Regulation (“GDPR”) has demonstrated some of this dynamic.[14] These realities must be incorporated into the Commission’s assessment.
  • Third, the Commission must do more than merely nod to potential benefits that the modern data-driven economy provides to consumers. The clear benefits that arise from information sharing must be considered. Since the dawn of the Internet, free digital services have created significant consumer surplus. This trend continues today: Research using both survey and experimental methods has consistently found substantial benefits for consumers from sharing information in exchange for free (or subsidized) digital products. Moreover, productive conduct and consumer benefits are not limited to free digital products and services. Myriad products and services—from health care to finance to education—are made more efficient, and more widely available, by the commercial use of various forms of consumer data.

IV. The ANPR Must Account for the Effect of Any ‘Commercial Surveillance’ Rules on Consumer Welfare and Competition

The Commission is obligated to consider the likely effects of data regulation on consumers and competition. That ought to be a requirement for regulation generally, but it is an express statutory requirement for rulemaking under Section 18 of the FTC Act. The Commission is uniquely well-situated to meet that requirement by virtue of its distinctive, dual competition and consumer-protection mandates. Indeed, the Commission’s antitrust-enforcement experience dates to the agency’s inception. In addition, the Commission can draw on the considerable expertise of its Bureau of Economics, which employs experts in both industrial organization and consumer-protection economics.

The Commercial Surveillance ANPR does not specify, or even sketch, the data regulations being contemplated by the Commission as required by statute. Consequently, one cannot assess the net effects of any proposed “commercial surveillance and data security” rule on competition or consumers, because there simply is no proposed rule to assess.

The economic literature, however, does suggest caution:

  • First, as a general matter, regulations that impose substantial fixed costs on regulated firms tend to burden smaller firms and entrants more than large firms and incumbents.[15]
  • Second, studies of specific domestic-privacy and data-security requirements underscore the potential for unintended consequences, including competitive costs.[16]
  • Third, empirical studies of the effects of general data regulations in foreign jurisdictions, such as the EU’s GDPR, suggest that such regulations have indeed led to substantial competitive harms.[17]

The literature on the effects of GDPR and other data regulations is particularly instructive. Although it is neither definitive nor complete, it has thus far found slender (at best) benefits to competition or consumers from data regulations and considerable costs and harms from their imposition. Further experience with and study of data regulations could yield a more nuanced picture. And, again, the FTC is well-positioned to contribute to and foster a greater understanding of the competitive effects of various types of data regulation. Doing so could be greatly beneficial to policymaking, competition, and consumer welfare, precisely because specific data practices can produce substantial benefits, harms, or a complex admixture of the two. But documented harms and speculative benefits of regulation recommend caution, not blind intervention.

V. Conclusion

The Commission should take account of a further reality: the rules it contemplates will be created in an environment filled with other privacy regulators. Although the United States does not have a single, omnibus, privacy regulation, this does not mean that the country does not have “privacy law.” Indeed, there already exist generally applicable laws at both the federal and state level that provide a wide scope of protection for individuals, including consumer-protection laws that apply to companies’ data use and security practices,[18] as well as those that have been developed in common law (property, contract, and tort) and criminal codes.[19] In addition, there are sector-specific regulations pertaining to particular kinds of information, such as medical records, personal information collected online from children, and credit reporting, as well as regulations prohibiting the use of data in a manner that might lead to certain kinds of illegal discrimination.[20]

Despite the FTC’s noted experience in a certain slice of privacy regulation, Congress has not made the FTC the central privacy regulator. Congress has wrestled with complex tradeoffs in several different areas and has allowed—through design and otherwise—various authorities to emerge in this area. Thus, particularly in light of the Supreme Court’s recent “major questions” jurisprudence,[21] any rules that emerge from this process must be properly constrained.

In his dissent from the issuance of this ANPR, Commissioner Noah Phillips noted the massive, complicated undertaking it initiates:

Legislating comprehensive national rules for consumer data privacy and security is a complicated undertaking. Any law our nation adopts will have vast economic significance. It will impact many thousands of companies, millions of citizens, and billions upon billions of dollars in commerce. It will involve real trade-offs between, for example, innovation, jobs, and economic growth on the one hand and protection from privacy harms on the other. (It will also require some level of social consensus about which harms the law can and should address.) Like most regulations, comprehensive rules for data privacy and security will likely displace some amount of competition. Reducing the ability of companies to use data about consumers, which today facilitates the provision of free services, may result in higher prices—an effect that policymakers would be remiss not to consider in our current inflationary environment.[22]

This is particularly true given the Commission’s long history of work in this area. The Commission has undertaken decades of investigations and a multitude of workshops and hearings on privacy and related topics. This ANPR nods to that history, but it does not seem to make much use of it, possibly because much of it contains lessons that pull in different directions. Overall, that impressive body of work does not remotely point to the need for a single, comprehensive privacy rule. Rather, it has demonstrated that privacy regulation is complicated. It is complicated not just as a technical matter, but also because of the immense variety of consumers’ attitudes, expectations, and preferences with respect to privacy and the use of data in the economy.

The Commercial Surveillance ANPR poses 95 questions, many of which will find some answers in this prior history if it is adequately consulted. The Commission has generally evidenced admirable restraint and assessed the relevant tradeoffs, recognizing that the authorized collection and use of consumer information by companies confers enormous benefits, even as it entails some risks. Indeed, the overwhelming conclusion of decades of intense scrutiny is that the application of ex ante privacy principles across industries is a fraught exercise, as each industry—indeed each firm within an industry—faces a different set of consumer expectations about its provision of innovative services and offering of privacy protections.

[1] Trade Regulation Rule on Commercial Surveillance and Data Security, 87 Fed. Reg. 51273 (Aug. 22, 2022) (to be codified at 16 C.F.R. Ch. 1) [hereinafter “ANPR” or “Commercial Surveillance ANPR”].

[2] Id. at 51277.

[3] Id. at 51276.

[4] That is, “unfair or deceptive acts or practices in or affecting commerce,” as they are prohibited under Section 5 of the FTC Act, 15 U.S.C. § 45(a)(1).

[5] 15 U.S.C. § 57a.

[6] 15 U.S.C. § 57a(b)(2)(A).

[7] 15 U.S.C. § 57a(a)(1)(B).

[8] 15 U.S.C. § 57a(b)(1) (“When prescribing a rule under subsection (a)(1)(B) of this section, the Commission shall proceed in accordance with section 553 of title 5.”)

[9] 15 U.S.C. § 57a(b)(2)(A)(i).

[10] 15 U.S.C. § 57a(b)(2)(A)(ii).

[11] Except, of course, when it comes to government access to private information, i.e., under the Fourth Amendment.

[12] See, e.g., ANPR at 51273-75.

[13] The purported definition of consumer data in the ANPR, and the scope of activities around consumer data, are so overbroad as to encompass virtually the entirety of modern economic activity: “the collection, aggregation, analysis, retention, transfer, or monetization of consumer data and the direct derivatives of that information. These data include both information that consumers actively provide—say, when they affirmatively register for a service or make a purchase—as well as personal identifiers and other information that companies collect, for example, when a consumer casually browses the web or opens an app. This latter category is far broader than the first.” Id. at 51277.

[14] See, e.g., Coline Boniface, et al., Security Analysis of Subject Access Request Procedures, in Privacy Technologies and Policy: APF 2019 (Maurizio Naldi, et al. eds., 2019); see also Peter Swire & DeBrae Kennedy-Mayo, The Effects of Data Localization on Cybersecurity, Georgia Tech Scheller College of Business Research Paper No. 4030905 (2022).

[15] See, e.g., James Campbell, Avi Goldfarb & Catherine Tucker, Privacy Regulation and Market Structure, 24 J. Econ. & Mgmt. Strategy 47 (2015); Alex Marthews & Catherine Tucker, Privacy Policy and Competition, Econ. Stud. at Brookings (Dec. 2019).

[16] See, e.g., Jin-Hyuk Kim & Liad Wagman, Screening Incentives and Privacy Protection in Financial Markets: A Theoretical and Empirical Analysis, 46 RAND J. Econ. 1 (2015).

[17] See, e.g., Jian Jia, Ginger Zhe Jin & Liad Wagman, The Short-run Effects of the General Data Protection Regulation on Technology Venture Investment, 40 Marketing Sci. 661 (2021).

[18] See, e.g., FTC Act, 15 U.S.C. § 45(a) et seq.

[19] See Privacy—Common Law, Law Library — American Law and Legal Information (last visited Oct. 16, 2022).

[20] See, e.g., Comments of the Association of National Advertisers on the Competition and Consumer Protection in the 21st Century Hearings, Project Number P181201.

[T]he Health Insurance Portability and Accountability Act (“HIPAA”) regulates certain health data; the Fair Credit Reporting Act (“FCRA”) regulates the use of consumer data for eligibility purposes; the Children’s Online Privacy Protection Act (“COPPA”) addresses personal information collected online from children; the Gramm–Leach–Bliley Act (“GLBA”) focuses on consumers’ financial privacy; the Equal Employment Opportunity Commission (“EEOC”) enforces a variety of anti-discrimination laws in the workplace, including the Pregnancy Discrimination Act (“PDA”) and the Americans with Disabilities Act (“ADA”); the Fair Housing Act (“FHA”) protects against discrimination in housing; and the Equal Credit Opportunity Act (“ECOA”) protects against discrimination in mortgage and other forms of lending.

Id. at 6.

[21] See, e.g., W. Virginia v. Env’t Prot. Agency, 142 S. Ct. 2587, 2595 (2022).

[22] ANPR at 51293 (Dissenting Statement of Comm’r Noah J. Phillips).


Gus Hurwitz on FTC Consent Decrees

Presentations & Interviews ICLE Director of Law & Economics Programs Gus Hurwitz joined Steptoe & Johnson LLP’s The Cyberlaw Podcast to discuss (among other topics): The FTC is likely . . .



US-EU Data-Privacy Framework

TL;DR On Oct. 7, President Joe Biden signed an executive order to implement the U.S.-EU data-privacy framework.


On Oct. 7, President Joe Biden signed an executive order to implement the U.S.-EU data-privacy framework. The order had been awaited since March, when U.S. and EU officials reached an agreement in principle on a new framework, which EU officials insist must address concerns about surveillance practices by U.S. agencies. An earlier data-privacy framework was invalidated in 2020 by the Court of Justice of the European Union (CJEU) in its Schrems II judgment.


The European Commission will now consider whether to issue an “adequacy decision” for the U.S. This is urgent, because national data-protection authorities in the EU have been using a strained interpretation of the EU General Data Protection Regulation (GDPR) to prosecute various workarounds that companies have employed to transfer data between the U.S. and the EU. Like prior U.S.-EU arrangements, the order is likely to be challenged before the EU courts, but preliminary legal analysis suggests that this one has a greater chance of being upheld.

Read the full explainer here


How Not to Use Industrial Policy to Promote Europe’s Digital Sovereignty

TOTM The concept of European “digital sovereignty” has been promoted in recent years both by high officials of the European Union and by EU national governments. . . .

The concept of European “digital sovereignty” has been promoted in recent years both by high officials of the European Union and by EU national governments. Indeed, France made strengthening sovereignty one of the goals of its recent presidency in the EU Council.

Read the full piece here.


The Role of Transaction Cost Engineering in Standards Adoption: Evidence from Internet Security

Scholarship Abstract The growing economic importance of technical standards has heightened the need for a better understanding of why they succeed or fail. While existing literature . . .


The growing economic importance of technical standards has heightened the need for a better understanding of why they succeed or fail. While existing literature has scrutinized the role of public governance, particularly in the realms of regulation, antitrust, and intellectual property, to date legal scholars have largely overlooked the role of private organizational and contractual lawyering in determining the path of technical standardization.

In this Article, we explore this dimension through a case study of the effects of private organizational governance and contracting practices on the fortunes of a nascent Internet security standard. The standard, known as Resource Public Key Infrastructure (“RPKI”), is designed to increase the trustworthiness of information about Internet routing. Through analysis of private organizational and contractual documents, semi-structured interviews with participants in the Internet operations industry, and attendance and participation in key industry conferences, we gained an embedded perspective on the role that private lawyering played in shaping would-be adopters’ perceptions and decisions regarding the technical standard.

According to our interviewees, contract and organizational bureaucracy mattered greatly. Notably, we found that the terms of contractual agreements prevented some potential adopters from experimenting with the technology and deterred others from proposing that their organizations adopt the technology. This was due to the perceived costs of involving organizational lawyers in technology-adoption decisions. In addition, contract terms deterred actors from increasing the functional value of the standard via complementary innovation and the development of complementary information services. Remarkably, even the basic mechanisms for presenting and assenting to contract terms chilled prospects for adoption. Regarding organization, we found that stark differences of governance and mission between key North American and European nonprofits contributed to different patterns of adoption. Taken together, these findings reveal the continuing importance of old-school transaction-cost engineering even in the most technical realms of Internet operation and standardization.


Commerce Committee Fails to Correct Major Deficiencies in House Privacy Bill

TOTM Having earlier passed through subcommittee, the American Data Privacy and Protection Act (ADPPA) has now been cleared for floor consideration by the U.S. House Energy and Commerce Committee. Before the . . .

Having earlier passed through subcommittee, the American Data Privacy and Protection Act (ADPPA) has now been cleared for floor consideration by the U.S. House Energy and Commerce Committee. Before the markup, we noted that the ADPPA mimics some of the worst flaws found in the European Union’s General Data Protection Regulation (GDPR), while creating new problems that the GDPR had avoided. Alas, the amended version of the legislation approved by the committee not only failed to correct those flaws, but in some cases it actually undid some of the welcome corrections that had been made to the original discussion draft.

Read the full piece here.


Why the EU’s Rushed ‘Travel Rule’ for Crypto Should Be Struck Down

Popular Media We appear to be reaching an end stage in negotiations between the European Parliament and the Council of the European Union on a plan to extend the EU’s financial-surveillance . . .

We appear to be reaching an end stage in negotiations between the European Parliament and the Council of the European Union on a plan to extend the EU’s financial-surveillance regime over the cryptocurrency industry. Alas, lawmakers were in such a rush that they appear not to have noticed that the hastily crafted legislative package violates fundamental tenets of the EU’s founding treaties.

Read the full piece here.


European Proposal for a Data Act: A First Assessment

Scholarship INTRODUCTION AND BACKGROUND On 23 February 2022, the European Commission unveiled its proposal for a Data Act (DA).[1] As declared in the Impact Assessment,[2] the . . .


On 23 February 2022, the European Commission unveiled its proposal for a Data Act (DA).[1] As declared in the Impact Assessment,[2] the DA complements two other major instruments shaping the European single market for data, namely the Data Governance Act[3] and the Digital Markets Act (DMA),[4] and is a key pillar of the European Strategy for Data, in which the Commission announced the establishment of EU-wide common, interoperable data spaces in strategic sectors to overcome legal and technical barriers to data sharing.[5] The DA also represents the latest effort of European policymakers to ensure the free flow of data through a broad array of initiatives that differ among themselves in scope and approach: some interventions are horizontal, others are sector-specific; some mandate data sharing, others envisage measures to facilitate voluntary sharing; some introduce general data rights, others allow asymmetric data-access rights.

Notably, the General Data Protection Regulation (GDPR) enshrined a general personal-data portability right for individuals,[6] the Regulation on the free flow of non-personal data facilitated business-to-business data sharing practices,[7] the Open Data Directive aimed to put government data to good use for private players,[8] and the Data Governance Act attempted to harmonise conditions for the use of certain public-sector data and to further promote the voluntary sharing of data by increasing trust in neutral data intermediaries that will help match data demand and supply in the data spaces.[9] Sector-specific legislation on data access has also been adopted or proposed to address identified market failures in areas such as the automotive sector,[10] payment service providers,[11] smart-metering information,[12] electricity network data,[13] intelligent transport systems,[14] renewables,[15] and the energy performance of buildings.[16]

Against this background, given that the DA is a horizontal legislative initiative fostering data sharing by unlocking machine-generated data and overcoming vendor lock-in, an issue of coherence with existing and forthcoming EU data-related legislation emerges.

The premise of such regulatory intervention is that an ever-increasing amount of data is generated by machines or processes based on emerging technologies, such as the Internet of Things (IoT), and is used as a key component of innovative services and products, in particular for developing artificial intelligence (AI) applications.[17] The ability to gather and access different data sources is crucial for IoT innovation to thrive. IoT environments are possible only insofar as all sorts of devices can be interconnected and can exchange data in real time. Therefore, access to data and data-sharing practices are pivotal factors for unlocking competition and incentivising innovation.

From this perspective, the proposal for a DA represents the latest episode in a long thread of European Commission interventions. Since the 2015 Digital Single Market Communication, the Commission has emphasised the central role played by big data, cloud services, and the IoT for the EU’s competitiveness, pointing out that the lack of open and interoperable systems and services, and of data portability between services, represents a barrier to the development of new services.[18] The issue of (limited) access to machine-generated data was raised in the 2017 Communication on the European Data Economy,[19] where the Commission envisaged some potential interventions that are now advanced by the DA, as well as in more recent Commission Communications on a common European data space and a European strategy for data.[20] In particular, the latter indicated the “issues related to usage rights for co-generated data (such as IoT data in industrial settings)” as a priority area for legislative intervention.[21]

Moreover, the IoT economy has been the subject of a recent sector inquiry, which offered a comprehensive insight into the current structure of IoT environments and the competitive dynamics that are shaping their development.[22] In particular, the Commission underlined the role of digital ecosystems within which a huge number of IoT interactions take place, and identified the most widespread operating systems and general voice assistants as the key technological platforms that connect different hardware and software components of an IoT business environment, increase their complementarity, and provide a single access point to diverse categories of users.[23] Against this backdrop, interoperability is deemed to play a crucial role in improving consumer choice and preventing lock-in to providers’ products.

To contribute to the current policy debate, this paper will provide a first assessment of the tabled DA and will suggest possible improvements for the ongoing legislative negotiations. The paper is structured as follows. Section 2 deals with the problems addressed and the objectives pursued by the legislative initiative. Section 3 analyses the scope of the new data access and sharing right for connected devices. Section 4 then investigates the provisions aimed at favouring business-to-government data sharing in the public interest. Section 5 deals with the rules that tackle the vendor lock-in problem in data processing services by facilitating switching between cloud and edge services. Section 6 analyses the requirements set forth regarding interoperability. Finally, Section 7 concludes by addressing the governance structure. Each section briefly summarises the DA proposal and then makes a first assessment with suggestions for improvements.

[1] European Commission, ‘Proposal for a Regulation of the European Parliament and of the Council on harmonised rules on fair access and use of data (Data Act)’ COM(2022) 68 final.

[2] Commission Staff Working Document, Impact Assessment Report accompanying the Proposal for a Regulation on harmonised rules on fair access to and use of data (Data Act) SWD(2022) 34 final, 1.

[3] Regulation (EU) 2022/868 on European data governance (Data Governance Act) [2022] OJ L 152/1.

[4] Regulation (EU) 2022/1925 on contestable and fair markets in the digital sector (Digital Markets Act) [2022] OJ L 265/1.

[5] European Commission, ‘A European strategy for data’ COM(2020) 66 final.

[6] Regulation (EU) 2016/679 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC, [2016] OJ L 119/1, Article 20.

[7] Regulation (EU) 2018/1807 on a framework for the free flow of non-personal data in the European Union, [2018] OJ L 303/59.

[8] Directive (EU) 2019/1024 on open data and the re-use of public sector information, [2019] OJ L 172/56.

[9] Data Governance Act, supra note 3.

[10] Regulation (EU) 2018/858 on the approval and market surveillance of motor vehicles and their trailers, and of systems, components and separate technical units intended for such vehicles, amending Regulations (EC) No 715/2007 and (EC) No 595/2009 and repealing Directive 2007/46/EC, [2018] OJ L 151/1.

[11] Directive (EU) 2015/2366 on payment services in the internal market, [2015] OJ L 337/35, Article 67.

[12] Directive (EU) 2019/944 on common rules for the internal market for electricity and amending Directive 2012/27/EU, [2019] OJ L 158/125; and Directive 2009/73/EC concerning common rules for the internal market in natural gas and repealing Directive 2003/55/EC, [2009] OJ L 211/94.

[13] Regulation (EU) 2017/1485 establishing a guideline on electricity transmission system operation, [2017] OJ L 220/1; and Regulation (EU) 2015/703 establishing a network code on interoperability and data exchange rules, [2015] OJ L 113/13.

[14] Directive 2010/40/EU on the framework for the deployment of Intelligent Transport Systems in the field of road transport and for interfaces with other modes of transport, [2010] OJ L 207/1.

[15] Proposal for a Directive amending Directive (EU) 2018/2001, Regulation (EU) 2018/1999 and Directive 98/70/EC as regards the promotion of energy from renewable sources, and repealing Council Directive (EU) 2015/652, COM(2021) 557 final.

[16] Proposal for a Directive on the energy performance of buildings (recast), COM(2021) 802 final.

[17] On the economic value of data, see Jan Krämer, Daniel Schnurr, and Sally Broughton Micova, ‘The Role of Data for Digital Markets Contestability: Case Studies and Data Access Remedies’ (CERRE Report, September 2020).

[18] European Commission, ‘A Digital Single Market Strategy for Europe’, COM(2015) 192 final, 14.

[19] European Commission, ‘Building a European Data Economy’, COM(2017) 9 final, 12-13.

[20] European Commission, ‘A European strategy for data’, supra note 5, 10; and European Commission, ‘Towards a common European data space’, COM(2018) 232 final, 10.

[21] European Commission, ‘A European strategy for data’, supra note 5, 13, and 26.

[22] European Commission, ‘Final Report – Sector inquiry into consumer Internet of Things’ COM(2022) 19 final.

[23] Commission Staff Working Document accompanying the ‘Final Report – Sector inquiry into consumer Internet of Things’ SWD(2022) 10 final.

Data Security & Privacy

Privacy, Crypto, and EU Financial Surveillance

TOTM

European Union lawmakers appear close to finalizing a number of legislative proposals that aim to reform the EU’s financial-regulation framework in response to the rise of cryptocurrencies. Prominent within the package are new anti-money laundering and “countering the financing of terrorism” rules (AML/CFT), including an extension of the so-called “travel rule.” The travel rule, which currently applies to wire transfers managed by global banks, would be extended to require crypto-asset service providers to similarly collect and make available details about the originators and beneficiaries of crypto-asset transfers.

