European Proposal for a Data Act: A First Assessment

INTRODUCTION AND BACKGROUND

On 23 February 2022, the European Commission unveiled its proposal for a Data Act (DA).[1] As declared in the Impact Assessment,[2] the DA complements two other major instruments shaping the European single market for data, namely the Data Governance Act[3] and the Digital Markets Act (DMA),[4] and is a key pillar of the European Strategy for Data, in which the Commission announced the establishment of EU-wide common, interoperable data spaces in strategic sectors to overcome legal and technical barriers to data sharing.[5] The DA also represents the latest effort of European policymakers to ensure the free flow of data through a broad array of initiatives which differ in scope and approach: some interventions are horizontal, others are sector-specific; some mandate data sharing, others envisage measures to facilitate voluntary sharing; some introduce general data rights, others allow asymmetric data access rights.

Notably, the General Data Protection Regulation (GDPR) enshrined a general personal data portability right for individuals,[6] the Regulation on the free flow of non-personal data facilitated business-to-business data sharing practices,[7] the Open Data Directive aimed to put government data to good use for private players,[8] and the Data Governance Act attempted to harmonise the conditions for the use of certain public sector data and to further promote the voluntary sharing of data by increasing trust in neutral data intermediaries that will help match data demand and supply in the data spaces.[9] Sector-specific legislation on data access has also been adopted or proposed to address identified market failures, such as in the automotive sector,[10] payment services,[11] smart metering,[12] electricity network data,[13] intelligent transport systems,[14] renewables,[15] and the energy performance of buildings.[16]

Against this background, given that the DA is a horizontal legislative initiative fostering data sharing by unlocking machine-generated data and overcoming vendor lock-in, an issue of coherence with existing and forthcoming EU data-related legislation emerges.

The premise of such regulatory intervention is that an ever-increasing amount of data is generated by machines or processes based on emerging technologies, such as the Internet of Things (IoT), and is used as a key component for innovative services and products, in particular for developing artificial intelligence (AI) applications.[17] The ability to gather and access different data sources is crucial for IoT innovation to thrive: IoT environments are possible only insofar as all sorts of devices can be interconnected and can exchange data in real time. Therefore, access to data and data sharing practices are pivotal factors for unlocking competition and incentivising innovation.

From this perspective, the proposal for a DA represents the latest episode in a long line of European Commission interventions. Since the 2015 Digital Single Market Communication, the Commission has emphasised the central role played by big data, cloud services, and the IoT for the EU’s competitiveness, pointing out that the lack of open and interoperable systems and services, and of data portability between services, represents a barrier to the development of new services.[18] The issue of (limited) access to machine-generated data was raised in the 2017 Communication on the European Data Economy,[19] where the Commission envisaged some potential interventions which are now advanced by the DA, as well as in more recent Commission Communications on a common European data space and a European strategy for data.[20] In particular, the latter indicated the “issues related to usage rights for co-generated data (such as IoT data in industrial settings)” as a priority area for legislative intervention.[21]

Moreover, the IoT economy has been the subject of a recent sector inquiry which offered comprehensive insight into the current structure of IoT environments and the competitive dynamics shaping their development.[22] In particular, the Commission underlined the role of digital ecosystems within which a huge number of IoT interactions take place, and identified the most widespread operating systems and general voice assistants as the key technological platforms that connect the different hardware and software components of an IoT business environment, increase their complementarity, and provide a single access point to diverse categories of users.[23] Against this backdrop, interoperability is deemed to play a crucial role in improving consumer choice and preventing lock-in to providers’ products.

To contribute to the current policy debate, this paper will provide a first assessment of the tabled DA and will suggest possible improvements for the ongoing legislative negotiations. The paper is structured as follows. Section 2 deals with the problems addressed and the objectives pursued by the legislative initiative. Section 3 analyses the scope of the new data access and sharing right for connected devices. Section 4 then investigates the provisions aimed at favouring business-to-government data sharing in the public interest. Section 5 deals with the rules which tackle the vendor lock-in problem in data processing services by facilitating switching between cloud and edge services. Section 6 analyses the interoperability requirements. Finally, Section 7 concludes by addressing the governance structure. Each section briefly summarises the DA proposal and then makes a first assessment with suggestions for improvements.

[1] European Commission, ‘Proposal for a Regulation of the European Parliament and of the Council on harmonised rules on fair access to and use of data (Data Act)’ COM(2022) 68 final.

[2] Commission Staff Working Document, Impact Assessment Report accompanying the Proposal for a Regulation on harmonised rules on fair access to and use of data (Data Act) SWD(2022) 34 final, 1.

[3] Regulation (EU) 2022/868 on European data governance (Data Governance Act) [2022] OJ L 152/1.

[4] Regulation (EU) 2022/1925 on contestable and fair markets in the digital sector (Digital Markets Act) [2022] OJ L 265/1.

[5] European Commission, ‘A European strategy for data’ COM(2020) 66 final.

[6] Regulation (EU) 2016/679 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC, [2016] OJ L 119/1, Article 20.

[7] Regulation (EU) 2018/1807 on a framework for the free flow of non-personal data in the European Union, [2018] OJ L 303/59.

[8] Directive (EU) 2019/1024 on open data and the re-use of public sector information, [2019] OJ L 172/56.

[9] Data Governance Act, supra note 3.

[10] Regulation (EU) 2018/858 on the approval and market surveillance of motor vehicles and their trailers, and of systems, components and separate technical units intended for such vehicles, amending Regulations (EC) No 715/2007 and (EC) No 595/2009 and repealing Directive 2007/46/EC, [2018] OJ L 151/1.

[11] Directive (EU) 2015/2366 on payment services in the internal market, [2015] OJ L 337/35, Article 67.

[12] Directive (EU) 2019/944 on common rules for the internal market for electricity and amending Directive 2012/27/EU, [2019] OJ L 158/125; and Directive 2009/73/EC concerning common rules for the internal market in natural gas and repealing Directive 2003/55/EC, [2009] OJ L 211/94.

[13] Regulation (EU) 2017/1485 establishing a guideline on electricity transmission system operation, [2017] OJ L 220/1; and Regulation (EU) 2015/703 establishing a network code on interoperability and data exchange rules, [2015] OJ L 113/13.

[14] Directive 2010/40/EU on the framework for the deployment of Intelligent Transport Systems in the field of road transport and for interfaces with other modes of transport, [2010] OJ L 207/1.

[15] Proposal for a Directive amending Directive (EU) 2018/2001, Regulation (EU) 2018/1999 and Directive 98/70/EC as regards the promotion of energy from renewable sources, and repealing Council Directive (EU) 2015/652, COM(2021) 557 final.

[16] Proposal for a Directive on the energy performance of buildings (recast), COM(2021) 802 final.

[17] On the economic value of data, see Jan Krämer, Daniel Schnurr, and Sally Broughton Micova (2020), ‘The role of data for digital markets contestability’, CERRE Report, https://cerre.eu/wp-content/uploads/2020/08/cerre-the_role_of_data_for_digital_markets_contestability_case_studies_and_data_access_remedies-september2020.pdf.

[18] European Commission, ‘A Digital Single Market Strategy for Europe’, COM(2015) 192 final, 14.

[19] European Commission, ‘Building a European Data Economy’, COM(2017) 9 final, 12-13.

[20] European Commission, ‘A European strategy for data’, supra note 5, 10; and European Commission, ‘Towards a common European data space’, COM(2018) 232 final, 10.

[21] European Commission, ‘A European strategy for data’, supra note 5, 13, and 26.

[22] European Commission, ‘Final Report – Sector inquiry into consumer Internet of Things’ COM(2022) 19 final.

[23] Commission Staff Working Document accompanying the ‘Final Report – Sector inquiry into consumer Internet of Things’ SWD(2022) 10 final.

The Overlooked Systemic Impact of the Right to Be Forgotten: Lessons from Adverse Selection, Moral Hazard, and Ban the Box

Abstract

The right to be forgotten, which began as a part of European law, has found increasing acceptance in state privacy statutes recently enacted in the U.S. Commentators have largely analyzed the right to be forgotten as a clash between the privacy interests of data subjects and the free speech rights of those holding the data. Framing the issues as a clash of individual rights largely ignores the important scholarly literatures exploring how giving data subjects the ability to render certain information unobservable can give rise to systemic effects that can harm society as a whole. This Essay fills this gap by exploring what the right to be forgotten can learn from the literatures on the implications of adverse selection, moral hazard, and the emerging policy intervention known as ban the box.

Optimizing Cybersecurity Risk in Medical Cyber-Physical Devices

Abstract

Medical devices are increasingly connected, both to cyber networks and to sensors collecting data from physical stimuli. These cyber-physical systems pose a host of new, deadly security risks that traditional notions of cybersecurity struggle to take into account. Previously, we could predict how algorithms would function as they drew on defined inputs. But cyber-physical systems draw on unbounded inputs from the real world. Moreover, with wide networks of cyber-physical medical devices, a single cybersecurity breach could pose lethal dangers to masses of patients.

The U.S. Food and Drug Administration (FDA) is tasked with regulating medical devices to ensure safety and effectiveness, but its regulatory approach—designed decades ago to regulate traditional medical hardware—is ill-suited to the unique problems of cybersecurity. Because perfect cybersecurity is impossible and every cybersecurity improvement entails costs to affordability and health, designers need standards that balance costs and benefits to inform the optimal level of risk. FDA, however, conducts limited cost-benefit analyses, believing that its authorizing statute forbids consideration of economic costs.

We draw on statutory text and case law to show that this belief is mistaken and that FDA can and should conduct cost-benefit analyses to ensure safety and effectiveness, especially in the context of cybersecurity. We describe three approaches FDA could take to implement this analysis as a practical matter. Of these three, we recommend an approach modeled after the Federal Trade Commission’s cost-benefit test. Regardless of the specific approach FDA chooses, however, the critical point is that the agency must weigh costs and benefits to ensure the right level of cybersecurity. Until then, medical device designers will face continued uncertainty as cybersecurity threats become increasingly dangerous.

Issue Brief: The EU Artificial Intelligence Act

As currently drafted, the text of the EU's proposed Artificial Intelligence Act would define virtually all software as AI.

INTRODUCTION

European Union (EU) legislators are considering legislation—the Artificial Intelligence Act (AIA), the original draft of which was published by the European Commission in April 2021[1]—that aims to ensure the safety of AI systems in uses designated as “high risk”. As originally drafted, however, the AIA’s scope was not at all limited to AI; it would instead cover virtually all software. EU governments seem to have realized this problem and are trying to fix the proposal, while some pressure groups have pushed to move the draft in the opposite direction.

The AIA proposal is currently under consideration by specialized committees of the European Parliament. The parliamentary stage began with a long disagreement among the various committees regarding who should have decisive influence over the Parliament’s position on the bill. With that disagreement now resolved, discussions on the legislation’s merits are ongoing.

The purpose of this brief is to inform debate on the proposal’s fundamental features: its scope and the key provisions setting out prohibited AI practices (related to so-called “subliminal techniques” and “social scoring”).

Read the full issue brief here.

[1] Proposal for a Regulation of the European Parliament and of the Council Laying Down Harmonised Rules on Artificial Intelligence (Artificial Intelligence Act) and Amending Certain Union Legislative Acts, European Commission, (Apr. 21, 2021), available at https://perma.cc/RWT9-9D97.

Guiding Principles and a Legislative Checklist for Consumer Privacy Regulation

State legislatures are now tackling consumers’ digital privacy. Given the Internet’s inherently international character, a federal bill setting a national standard for digital privacy would be ideal. Yet, in the absence of federal legislation, state governments are seeking to address consumer privacy. Unfortunately, overly broad and burdensome regulatory obligations pose a real and immediate risk to digital innovation. Ensuring a globally robust market requires balancing consumer privacy and legitimate information exchange between consumers and digital-services companies.

The attached guiding principles and legislative checklist from the Reason Foundation and the International Center for Law & Economics seek to help legislators and stakeholders narrowly tailor state consumer-privacy policy to address concrete consumer harms while preventing disproportionately punitive responses that obstruct market performance.

Read the full checklist here.

In Harm’s Way: Why Online Safety Regulation Needs an Independent Reviewer

The attached was originally published by the Institute of Economic Affairs.

Summary

  • The draft Online Safety Bill presents a significant threat to freedom of speech, privacy, and innovation. “Safety” has been prioritized over freedom. The bill’s proponents wrongly assume it is possible to remove “bad” content without negatively impacting on the “good” and that platforms, not users, are responsible for “harms.”
  • The bill’s inclusion of “legal but harmful” speech – along with defining unlawful speech as any content that the platform merely has “reasonable grounds to believe” is unlawful – risks state-mandated automated censorship of lawful online speech. The duties to “have regard” to freedom of expression and privacy are far weaker than the “safety” duties.
  • The bill threatens innovation and competition within the U.K. economy by imposing byzantine duties that will inevitably be harder and more costly for start-ups and smaller companies to comply with, while discouraging companies from operating in the United Kingdom, limiting access to online services.
  • The bill provides extraordinary discretion to the Secretary of State and Ofcom to design “codes of conduct” that will define “legal but harmful” content. They will also have the power to impose additional requirements, such as age verification, and to undermine end-to-end encryption. The regulator will also have significant leeway over which types of content and which platforms to target.
  • If the Government is unwilling to fundamentally rewrite the bill, there is a clear need for serious, independent scrutiny mechanisms to prevent regulatory and ministerial overreach.
  • An Independent Reviewer of Online Safety Legislation, modelled partly on the Independent Reviewer of Terrorism Legislation, could provide some accountability.
  • The Independent Reviewer would need to be properly resourced and empowered to scrutinize the activities of the Secretary of State and Ofcom and communicate findings to policymakers and the general public.
  • An Independent Reviewer, properly empowered and resourced, could stand up for freedom of expression, privacy and innovation while being a bulwark against future authoritarian demands.

Read the full paper here.

Privacy and Security Implications of Regulation of Digital Services in the EU and in the US

Written for the Transatlantic Technology Law Forum (TTLF) Working Paper Series, ICLE Senior Scholar Mikołaj Barczentewicz assesses privacy and security risks raised by U.S. and EU legislative proposals to regulate digital platforms.

The attached is part of the Transatlantic Technology Law Forum’s (TTLF) Working Paper Series, which presents original research on technology- and business-related law and policy issues of the European Union and the United States. TTLF is a joint initiative of Stanford Law School and the University of Vienna School of Law.

Abstract

The goal of this project is to assess the data privacy and security implications of the “new wave” of legislation on digital services—both in the United States and in the EU. In the European Union, the proposals for the Digital Services Act and the Digital Markets Act include provisions that have potentially significant security and privacy implications, like interoperability obligations for online platforms or provisions for data access for researchers. Similar provisions, e.g., on interoperability, are included in bills currently being considered by the U.S. Congress (e.g., in Rep. David Cicilline’s American Choice and Innovation Online Act and in Sen. Amy Klobuchar’s American Innovation and Choice Online Act). Some stakeholders are advocating that the EU and U.S. legislatures go even further than currently contemplated in a direction that could potentially have negative security and privacy consequences—especially on interoperability. I aim to assess whether the legislative proposals in their current form adequately address potential privacy and security risks, and what changes to the proposed legislation might help to alleviate those risks.

Introduction

Increasing information privacy and security through the law is notoriously difficult, even if that is the explicit goal of legislation. Thus, perhaps we should instead expect the law at least not to unintentionally decrease the level of privacy and security. Unfortunately, pursuing even seemingly unrelated policy aims through legislation may have that negative effect. In this paper, I analyze several legislative proposals from the EU and from the United States belonging to the new “techlash” wave. All of those bills purport to improve the situation of consumers or the competitiveness of digital markets. However, as I argue, they would all have negative and unaddressed consequences in terms of information privacy and security.

On the EU side, I consider the Digital Services Act (DSA) and the Digital Markets Act (DMA) proposals. The DSA and the DMA have been proceeding through the EU legislative process with unexpected speed and given what looks like significant political momentum, it is possible that they will become law. On the U.S. side, I look at Rep. David Cicilline’s (D-R.I.) American Choice and Innovation Online Act, Rep. Mary Gay Scanlon’s (D-Pa.) Augmenting Compatibility and Competition by Enabling Service Switching (ACCESS) Act, Sen. Amy Klobuchar’s (D-Minn.) American Innovation and Choice Online Act, and Sen. Richard Blumenthal’s (D-Conn.) Open App Markets Act.

I chose to focus on three regulatory solutions: (1) mandating interoperability, (2) mandating device neutrality (the possibility of sideloading applications), and (3) compulsory data access (by vetted researchers or by authorities). The first two models are shared by most of the discussed legislative proposals, other than the DSA. The last one is included only in the DSA.

Read the full paper here.

Issue Brief: The Great Transatlantic Data Disruption

A new issue brief published jointly by ICLE and the Progressive Policy Institute looks at looming threats to transatlantic data flows between the U.S. and EU that power an estimated $333 billion in annual trade of digitally enabled services.

(This issue brief is a joint publication of the International Center for Law & Economics and the Progressive Policy Institute)

Executive Summary

Data is, logically enough, one of the pillars supporting the modern digital economy. It is, however, not terribly useful on its own. Only once it has been collected, analyzed, combined, and deployed in novel ways does data obtain its highest utility. That is to say, a large part of the value of data is its ability to flow throughout the global connected economy in real time, permitting individuals and firms to develop novel insights that would not otherwise be possible, and to operate at a higher level of efficiency and safety.

Although the global transmission of data is critical to every industry and scientific endeavor, those data flows increasingly run into barriers of various sorts when they seek to cross national borders. Most typically, these barriers take the form of data-localization requirements.

Data localization is an umbrella term that refers to a variety of requirements that nations set to govern how data is created, stored, and transmitted within their jurisdiction. The aim of data-localization policies is to restrict the flow of data across a nation’s borders, often justified on grounds of protecting national security interests and/or sensitive information about citizens.

Data-localization requirements have in recent years been at the center of a series of legal disputes between the United States and the European Union (EU) that potentially threaten the future of transatlantic data flows. In October 2015, in a decision known as Schrems I, the Court of Justice of the European Union (CJEU) overturned the International Safe Harbor Privacy Principles, which had for the prior 15 years governed customer data transmitted between the United States and the EU. The principles were replaced in February 2016 by a new framework agreement known as the EU–US Privacy Shield, until the CJEU declared that agreement, too, invalid in a July 2020 decision known as Schrems II. (Both complaints were brought by Austrian privacy advocate Max Schrems.)

The current threatened disruption to transatlantic data flows highlights the size of the problem caused by data-localization policies. According to one estimate, transatlantic trade generates upward of $5.6 trillion in annual commercial sales, of which at least $333 billion is related to digitally enabled services.[3] Some estimates suggest that moderate increases in data-localization requirements would result in a €116 billion reduction in exports from the EU.

One difficulty in precisely quantifying the full impact of strict data-localization practices is that the list of industries engaged in digitally enabled trade extends well beyond those that explicitly trade in data. This is because “it is increasingly difficult to separate services and goods with the rise of the ‘Internet of Things’ and the greater bundling of goods and services. At the same time, goods are being substituted by services … further shifting the regulatory boundaries between what is treated as goods and services.” Thus, there is reason to believe that the true value of digitally enabled trade to the global economy is underestimated.

Moreover, as we discuss infra, there is reason to suspect that data flows and digitally enabled trade have contributed a good deal of unmeasured economic activity that partially offsets the lower-than-expected measured productivity growth seen in both the European Union and the United States over the last decade and a half. In particular, heavy investment in research and development by firms globally has facilitated substituting the relatively more efficient work of employees at firms for unpaid labor by individuals. And global data flows have facilitated the creation of larger, more efficient worldwide networks that optimize time use by firms and individuals, and the development of resilient networks that can withstand shocks to the system like the COVID-19 pandemic.

In the Schrems II decision, the court found that provisions of U.S. national security law and the surveillance powers it grants to intelligence agencies do not protect the data of EU citizens sufficiently to justify deeming U.S. law as providing adequate protection (known as an “adequacy” decision). In addition to a national “adequacy” decision, the EU General Data Protection Regulation (GDPR) also permits firms that wish to transfer data to the United States to rely on “standard contractual clauses” (SCCs) that guarantee protection of citizen data. However, a prominent view in European policy circles—voiced, for example, by the European Parliament—is that, after Schrems II, no SCC can provide a lawful basis for data transfers to the United States.

Shortly after the Schrems II decision, the Irish Data Protection Commission (IDPC) issued a preliminary draft decision against Facebook that proposed to invalidate the company’s SCCs, largely on the same grounds that the CJEU used when invalidating the Privacy Shield. This matter is still pending, but a decision from the IDPC is expected imminently, with the worst-case result being an order that Facebook suspend all transatlantic data transfers that depend upon SCCs. Narrowly speaking, the IDPC decision only immediately affects Facebook. However, if the draft decision is finalized, the SCCs of every other firm that transfers data across the Atlantic may be subject to invalidation under the same legal reasoning.

Although this increasingly restrictive legal environment for data flows has been building for years, the recent problems are increasingly breaking into public view, as national DPAs grapple with the language of the GDPR and the Schrems decisions. The Hamburg DPA recently issued a public warning that the use of the popular video-conference application Zoom violates GDPR. The Portuguese DPA issued a resolution forbidding its National Institute of Statistics from transferring census data to the U.S.-based Cloudflare, because the SCCs in the contract between the two entities were deemed insufficient in light of Schrems II.

The European Data Protection Supervisor has initiated a program to “monitor compliance of European institutions, bodies, offices and agencies (EUIs) with the ‘Schrems II’ Judgement.” As part of this program, it opened an investigation into Amazon and Microsoft to determine whether Microsoft’s Office 365 and the cloud-hosting services offered by both Amazon and Microsoft are compatible with GDPR post-Schrems II. Max Schrems, who brought the original complaint against Facebook, has, through his privacy-activist group, submitted at least 100 complaints as of August 2020, which will undoubtedly result in scores of cases across multiple industries.

The United States and European Union are currently negotiating a replacement for the Privacy Shield agreement that would allow data flows between the two economic regions to continue. But EU representatives have warned that, in order to comply with GDPR, nontrivial legislative changes will likely be necessary in the United States, particularly in the sensitive area of national-security monitoring. In effect, the European Union and the United States are being forced to rethink the boundaries of national law in the context of a digital global economy.

This issue brief first reviews the relevant literature on the importance of digital trade, as well as the difficulties in adequately measuring it. One implication of these measurement difficulties is that the impact of disruptions to data flows and digital trade is likely to be far greater than even the large effects discovered through traditional measurement suggest.

We then discuss the importance of network resilience, and the productivity or quasi-productivity gains that digital networks and data flows provide. After a review of the current policy and legal challenges facing digital trade and data flows, we finally urge the U.S. and EU negotiating parties to consider longer-term trade and policy changes that take seriously the role of data flows in the world economy.

Read the full issue brief here.

Digital Duty to Deal, Data Portability, and Interoperability

Abstract

In this chapter, we discuss the development of the duty to deal doctrine in antitrust law, its application to the digital economy, and proposals for specific duties to deal, such as data portability and interoperability.

Part I outlines the development of the duty to deal doctrine in antitrust law. The development of the doctrine in the United States will be compared to that in the European Union. Popular economic justifications for the doctrine and key cases will be explored. Part II then situates this doctrine within the digital economy, focusing on the importance of getting the contours of the doctrine right in that economy. As we shall see, the law and economics of the duty to deal caution against its application to dynamic, digital markets. This will be illustrated by looking at cases where it has been applied. Part III focuses on two specific categories of duties to deal: data portability and interoperability.
