
Mikolaj Barczentewicz on the EDPB’s ‘Pay or Consent’ Opinion

Presentations & Interviews

ICLE Senior Scholar Mikolaj Barczentewicz took part in a virtual panel hosted by Arthur Cox LLP on the European Data Protection Board’s recent opinion concerning whether “pay or consent” platform models comply with the EU’s General Data Protection Regulation (GDPR). Video of the full panel is embedded below.

Data Security & Privacy

Google Previews the Coming Tussle Between GDPR and DMA Article 6(11)

TOTM

Among the less-discussed requirements of the European Union’s Digital Markets Act (DMA) is the data-sharing obligation created by Article 6(11). This provision requires firms designated under the law as “gatekeepers” to share “ranking, query, click and view data” with third-party online search engines, while ensuring that any personal data is anonymized.

Given how restrictively the notion of “anonymization” has been interpreted under the EU’s General Data Protection Regulation (GDPR), the DMA creates significant tension without pointing to a clear resolution. Sophie Stalla-Bourdillon and Bárbara da Rosa Lazarotto recently published a helpful analysis of the relevant legal questions on the European Law Blog. In this post, I will examine Google’s proposed solution.

Read the full piece here.

Data Security & Privacy

US Shouldn’t Follow the European Union’s Confusing Data Privacy Model

Popular Media

While myriad attempts to pass federal privacy legislation have all fizzled out in recent years, House Energy and Commerce Committee Chairwoman Cathy McMorris Rodgers (R-WA) and Senate Commerce Committee Chairwoman Maria Cantwell (D-WA) recently introduced the American Privacy Rights Act of 2024 as a compromise approach that could pass both chambers.

But before it does, it’s worth considering whether the United States really wants to follow the European privacy model, which has led to much confusion and consumer harm.

Read the full piece here.

Data Security & Privacy

Mikołaj Barczentewicz on the EDPB’s Pay or Okay Ruling

Presentations & Interviews

ICLE Senior Scholar Mikołaj Barczentewicz was a guest on the Mobile Dev Memo podcast to discuss the European Data Protection Board’s recent ruling on the so-called “pay or okay” business model, and whether it complies with the requirements of the EU’s General Data Protection Regulation (GDPR). Audio of the full interview is embedded below.

Data Security & Privacy

Confronting the DMA’s Shaky Suppositions

TOTM

It’s easy for politicians to make unrealistic promises. Indeed, without a healthy skepticism on the part of the public, they can grow like weeds. In the world of digital policy, the European Union’s Digital Markets Act (DMA) has proven fertile ground for just such promises. We’ve been told that large digital platforms are the source of many economic and social ills, and that handing more discretionary power to the government can solve these problems with no apparent side effects or costs.

Read the full piece here.

Antitrust & Consumer Protection

EU Authorities on ‘Pay or Consent’: Mid-April 2024 Update

Popular Media

Due to Meta’s adoption of a “pay or consent” model for Facebook and Instagram, the model became a key issue not only under EU privacy law but also under the new digital regulations: the Digital Services Act (DSA) and the Digital Markets Act (DMA). Given the barrage of “pay or consent”-related news in recent months, I thought it would be a good idea to take stock of where we are now.

Read the full piece here.

Data Security & Privacy

A Report Card on the Impact of Europe’s Privacy Regulation (GDPR) on Digital Markets

Scholarship

Abstract

This Article will evaluate the consequences of the General Data Protection Regulation (“GDPR”) implemented by the European Union (“EU”) in 2018. Despite its aim to bolster user privacy, empirical evidence from thirty-one studies suggests a nuanced impact on market dynamics. While GDPR modestly enhanced user data protection, it also triggered adverse effects, including diminished startup activity, innovation, and increased market concentration. Further, this Article will uncover a complex narrative where gains in privacy are offset by compliance costs disproportionately affecting smaller firms. This Article will also highlight the challenges of regulating highly innovative markets, which is particularly important given subsequent EU regulations, such as the Digital Markets Act (“DMA”) and Digital Services Act (“DSA”). As other jurisdictions consider similar privacy regulations, the GDPR experience is a cautionary tale.

Read at SSRN.

Data Security & Privacy

Does the DMA Let Gatekeepers Protect Data Privacy and Security?

TOTM

It’s been an eventful two weeks for those following the story of the European Union’s implementation of the Digital Markets Act. On April 18, the European Commission began a series of workshops with the companies designated as “gatekeepers” under the DMA: Apple, Meta, Alphabet, Amazon, ByteDance, and Microsoft. And even as those workshops were still ongoing, the Commission announced noncompliance investigations against Alphabet, Apple, and Meta. Finally, the European Parliament’s Internal Market and Consumer Protection Committee (IMCO) held its own session on DMA implementation.

Many aspects of those developments are worth commenting on, and you can expect more competition-related analysis on Truth on the Market soon. Here, I will focus on what these developments mean for data privacy and security.

Read the full piece here.

Data Security & Privacy

Lessons from GDPR for AI Policymaking

Scholarship

Abstract

The ChatGPT chatbot has not just caught the public imagination; it is also amplifying concern across industry, academia, and government policymakers interested in the regulation of Artificial Intelligence (AI) about how to understand the risks and threats associated with AI applications. Following the release of ChatGPT, some EU regulators proposed changes to the EU AI Act to classify AI systems like ChatGPT that generate complex texts without any human oversight as “high-risk” AI systems that would fall under the law’s requirements. That classification was controversial, with other regulators arguing that technologies like ChatGPT, which merely generate text, are “not risky at all.” This controversy risks disrupting coherent discussion and progress toward formulating sound AI regulations for Large Language Models (LLMs), AI, or ICTs more generally. Despite nascent efforts by OECD.AI and the EU, it remains unclear where ChatGPT fits within AI, and where AI fits within the larger context of digital policy and the regulation of ICTs.

This paper aims to address two research questions around AI policy: (1) How are LLMs like ChatGPT shifting the policy discussions around AI regulations? (2) What lessons can regulators learn from the EU’s General Data Protection Regulation (GDPR) and other data protection policymaking efforts that can be applied to AI policymaking?

The first part of the paper addresses the question of how ChatGPT and other LLMs have changed the policy discourse in the EU and other regions around regulating AI and what the broader implications for these shifts may be for AI regulation more widely. This section reviews the existing proposal for an EU AI Act and its accompanying classification of high-risk AI systems, considers the changes prompted by the release of ChatGPT and examines how LLMs appear to have altered policymakers’ conceptions of the risks presented by AI. Finally, we present a framework for understanding how the security and safety risks posed by LLMs fit within the larger context of risks presented by AI and current efforts to formulate a regulatory framework for AI.

The second part of the paper considers the similarities and differences between the proposed AI Act and GDPR in terms of (1) organizations being regulated, or scope, (2) reliance on organizations’ self-assessment of potential risks, or degree of self-regulation, (3) penalties, and (4) technical knowledge required for effective enforcement, or complexity. For each of these areas, we consider how regulators scoped or implemented GDPR to make it manageable, enforceable, meaningful, and consistent across a wide range of organizations handling many different kinds of data as well as the extent to which they were successful in doing so. We then examine different ways in which those same approaches may or may not be applicable to the AI Act and the ways in which AI may prove more difficult to regulate than issues of data protection and privacy covered by GDPR. We also look at the ways in which AI may make it more difficult to enforce and comply with GDPR since the continued evolution of AI technologies may create cybersecurity tools and threats that will impact the efficacy of GDPR and privacy policies. This section argues that the extent to which the proposed AI Act relies on self-regulation and the technical complexity of enforcement are likely to pose significant challenges to enforcement based on the implementation of the most technologically and self-regulation-focused elements of GDPR.

Innovation & the New Economy