
Showing 9 of 147 Results in Privacy

Google Previews the Coming Tussle Between GDPR and DMA Article 6(11)

TOTM

Among the less-discussed requirements of the European Union’s Digital Markets Act (DMA) is the data-sharing obligation created by Article 6(11). This provision requires firms designated under the law as “gatekeepers” to share “ranking, query, click and view data” with third-party online search engines, while ensuring that any personal data is anonymized.

Given how restrictively the notion of “anonymization” has been interpreted under the EU’s General Data Protection Regulation (GDPR), the DMA creates significant tension without pointing to a clear resolution. Sophie Stalla-Bourdillon and Bárbara da Rosa Lazarotto recently published a helpful analysis of the relevant legal questions on the European Law Blog. In this post, I will examine Google’s proposed solution.

Read the full piece here.

Data Security & Privacy

US Shouldn’t Follow the European Union’s Confusing Data Privacy Model

Popular Media

While myriad attempts to pass federal privacy legislation have all fizzled out in recent years, House Energy and Commerce Committee Chairwoman Cathy McMorris Rodgers (R-WA) and Senate Commerce Committee Chairwoman Maria Cantwell (D-WA) recently introduced the American Privacy Rights Act of 2024 as a compromise approach that could pass both chambers.

But before it does, it’s worth considering whether the United States really wants to follow the European privacy model, which has led to much confusion and consumer harm.

Read the full piece here.

Data Security & Privacy

Consumer Privacy, Information Sharing, and Consumer Finance: Tradeoffs and Opportunities

Scholarship

Abstract

Concerns over the ownership, use, security, and flows of consumer data are not new. Yet the dominance of the Internet and electronic payments has elevated such concerns to a high level. Traditionally, there was perceived to be a tradeoff between the flow of information necessary for the consumer financial system to work well (such as to solve the information asymmetries inherent in credit-granting decisions) and consumers’ control over their data and ability to keep their information private. Data security approaches historically pursued a static “fortress” model that rested on the ability of consumers to keep private a small amount of information the consumer uniquely knew, such as a PIN or password.

Today, however, it is becoming apparent that this static model is no longer viable and can be expected to grow less viable with the growth of artificial intelligence and machine learning. But such approaches have costs as well—not only are they often more cumbersome, but when the fortress walls are breached, these systems can be slower to adapt and can result in increased harm to consumers on the back end. Some have suggested that we respond to these emergent threats by trying to build taller and thicker fortress walls and other static defenses, such as the use of biometric identification. The approach suggested here, by contrast, attempts to model what a more dynamic approach to information security would look like and how such a system would be dependent on more data flows rather than fewer. I suggest some areas of current and proposed regulation that should be reexamined in light of the analysis presented here.

Read at SSRN.

Data Security & Privacy

Mikołaj Barczentewicz on the EDPB’s Pay or Okay Ruling

Presentations & Interviews

ICLE Senior Scholar Mikołaj Barczentewicz was a guest on the Mobile Dev Memo podcast to discuss the European Data Protection Board’s recent ruling on the so-called “pay or okay” business model, and whether it complies with the requirements of the EU’s General Data Protection Regulation (GDPR). Audio of the full interview is embedded below.

Data Security & Privacy

Confronting the DMA’s Shaky Suppositions

TOTM

It’s easy for politicians to make unrealistic promises. Indeed, without a healthy skepticism on the part of the public, they can grow like weeds. In the world of digital policy, the European Union’s Digital Markets Act (DMA) has proven fertile ground for just such promises. We’ve been told that large digital platforms are the source of many economic and social ills, and that handing more discretionary power to the government can solve these problems with no apparent side effects or costs.

Read the full piece here.

Antitrust & Consumer Protection

Comments to UK Information Commissioner’s Office on ‘Pay or Consent’

Regulatory Comments

I thank the ICO for the opportunity to submit comments on “pay or consent.” My focus will be on the question of how to deal with consent to personal data processing needed to fund the provision of a service that does not fit the legal basis of contractual necessity.[1]

Personalised Advertising: Contractual Necessity or Consent?

Under the GDPR, personal data may only be processed if one of the lawful bases from Article 6 applies. They include, in particular, consent, contractual necessity, and legitimate interests. When processing is necessary for the performance of a contract (Article 6(1)(b)), then that is the basis on which the controller should rely. One may think that if data processing (e.g., for targeting ads) is necessary to fund a free-of-charge service, that should count as contractual necessity. I am unaware of data protection authorities disputing this in principle, but there is a tendency to interpret contractual necessity narrowly.[2] Notably, the EDPB decided in December 2022 that Facebook and Instagram shouldn’t have relied on that ground for personalisation of advertising.[3] Subsequently, the EDPB decided that Meta should also not rely on the legitimate interests basis.[4]

The adoption of a narrow interpretation of contractual necessity created an interpretative puzzle. If we set aside the legitimate-interests basis under Article 6(1)(f), in many commercial contexts we are left with consent (Article 6(1)(a)) as the only option. This is especially true where consent is required not by the GDPR but under national laws implementing the ePrivacy Directive (Directive 2002/58/EC), including the UK Privacy and Electronic Communications Regulations (PECR). That is the case for technologies like cookies and browser storage. Importantly, though, these technologies are not always needed for personalised advertising. Perhaps the biggest puzzle is how to deal with consent to processing needed to fund the provision of a service that does not fit the narrow interpretation of contractual necessity.

Consent, as we know from Articles 4(11) and 7(4) GDPR, must be “freely given.” In addition, Recital 42 states that: “Consent should not be regarded as freely given if the data subject has no genuine or free choice or is unable to refuse or withdraw consent without detriment.” The EDPB provided self-contradictory guidance by first saying that withdrawing consent should “not lead to any costs for the data subjects,” but soon after adding that the GDPR “does not preclude all incentives” for consenting.[5]

Despite some differences, at least the Austrian, Danish, French, German (DSK), and Spanish data protection authorities generally acknowledge that paid alternatives to consent may be lawful.[6] Notably, the Norwegian Privacy Board—in a Grindr appeal—also explicitly allowed that possibility.[7] I discuss below the conditions those authorities focus on in their assessment of “pay or consent” implementations.

The CJEU and ‘Necessity’ to Charge ‘An Appropriate Fee’

In its Meta decision from July 2023, the EU Court of Justice weighed in, though in the context of third-party-collected data, by saying that if that kind of data processing by Meta does not fall under contractual necessity, then:

(…) those users must be free to refuse individually, in the context of the contractual process, to give their consent to particular data processing operations not necessary for the performance of the contract, without being obliged to refrain entirely from using the service offered by the online social network operator, which means that those users are to be offered, if necessary for an appropriate fee, an equivalent alternative not accompanied by such data processing operations.[8]

Intentionally or not, the Court highlighted the interpretative problem stemming from a narrow interpretation of contractual necessity. The Court said that even if processing does not fall under contractual necessity, it may still be “necessary” to charge data subjects “an appropriate fee” if they refuse to consent. Disappointing some activists, the Court did not endorse the EDPB’s first comment I cited (that refusal to consent should not come with “any costs”).

Even though the Court did not explain this further, we can speculate that it was not willing to accept the view that all business models simply have to be adjusted to a maximally prohibitive interpretation of the GDPR. The Court may have attempted to save the GDPR from a likely political backlash against an attempt to use the regulation to deny Europeans a choice of free-of-charge services funded by personalised advertising. Perhaps the Court also noted that other EU laws rely on the GDPR’s definition of consent (e.g., the Digital Markets Act) and that this gives an additional reason to be very cautious about interpreting the concept in ways that are not in line with current expectations.

Remaining Questions

Several questions will likely be particularly important for future assessments of “pay or consent” implementations under the GDPR and ePrivacy/PECRs. The following list may not be exhaustive but aims to identify the main issues.

How Specific Should the Choice Be?

The extent to which service providers batch consent to processing for different purposes, especially if users cannot (in a “second step”) adjust consent more granularly, is likely to be questioned. This is problematic because giving users complete freedom to adjust their consent could also defeat the purpose of having a paid alternative.

In a different kind of bundling, service providers may make the paid alternative to consent more attractive by adding incentives like access to additional content or the absence of ads (including non-personalised ads). On the one hand, this means that service providers incentivise users not to consent, making consent less attractive. This could be seen as reducing the pressure to consent and making the choice more likely to be freely given. On the other hand, a more attractive paid option could be more costly for the service provider and thus require a higher price.

What Is an ‘Appropriate’ Price?

The pricing question is a potential landmine for data protection authorities, who are decidedly ill-suited to deal with it. Just to show one aspect of the complexity: setting as a benchmark the service’s historical average revenue per user (ARPU) from (personalised) advertising may be misleading. Users are not identical. Wealthier, less price-sensitive users, who may be more likely to pay for a no-ads option, are also worth more to advertisers. Hence, the loss of income from advertising may be higher than just “old ARPU multiplied by the number of users on a no-ads tier,” suggesting a need to charge the paying users more than historical ARPU merely to retain the same level of revenue. Crucially, the situation will likely be dynamic due to subscription “churn” (users canceling their subscriptions) and other market factors. The economic results of the “pay or consent” scheme may continue to change, and setting the price level will always involve business judgment based on predictions and intuition.
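The revenue arithmetic described above can be sketched with purely hypothetical numbers (the user counts and per-user values below are illustrative assumptions, not data about any actual service):

```python
# Hypothetical illustration: why pricing a no-ads tier at historical
# average revenue per user (ARPU) can understate the revenue actually lost.
# All figures are assumed for the sake of the example.

n_users = 100        # total user base
avg_arpu = 10.0      # historical average ad revenue per user (EUR)
n_subs = 10          # users assumed to switch to the paid, no-ads tier
sub_ad_value = 30.0  # assumed ad value of those users (above average,
                     # since wealthier users are worth more to advertisers)

old_revenue = avg_arpu * n_users      # 1000.0 EUR before the paid tier
naive_estimate = avg_arpu * n_subs    # 100.0 EUR: "old ARPU x subscribers"
actual_loss = sub_ad_value * n_subs   # 300.0 EUR of ad revenue actually lost

# Price per subscriber needed merely to keep total revenue unchanged:
break_even_price = actual_loss / n_subs  # 30.0 EUR, triple historical ARPU
```

On these assumed numbers, the break-even subscription price is three times historical ARPU, and that is before accounting for churn and the other dynamic factors mentioned above.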

Some authorities may be tempted to approach the issue from the perspective of users’ willingness to pay, but this also raises many issues. First, the idea of price regulation by privacy authorities, capping prices at a level defined by the authorities’ view of what is acceptable to a user, may face jurisdictional scrutiny. Second, taking users’ willingness to pay as a benchmark implicitly assumes a legally protected entitlement to access the service at a price they like. In other words, it assumes that users are entitled to specific private services, like social media services.[9] This is not something that can simply be assumed; it would require a robust argument—and arguably constitute a legal change that is appropriate only for the political, legislative process.

Imbalance

Recital 43 of the GDPR explains that consent may not be free when there is “a clear imbalance between the data subject and the controller.” In the Meta decision, the EU Court of Justice admitted the possibility of such an imbalance between a business with a dominant position, as understood in competition law, and its customers.[10] This, too, may be a difficult issue for data protection authorities to deal with, both for expertise and competence reasons.

The Scale of Processing and Impact on Users

Distinct from market power (dominance), though sometimes conflated with it, are the issues of the scale of processing and its impact on users. An online service provider, e.g., a newspaper publisher, may have relatively little market power but may be using a personalised advertising framework (e.g., an RTB scheme facilitated by third parties[11]) that is very large in scale and with more potential for a negative impact on users than an advertising system internal to a large online platform. A large online platform can offer personalised advertising to its business customers (advertisers) while sharing little or no information about who the ads are being shown to. Large platforms have economic incentives to keep user data securely within the platform’s “walled garden,” not sharing it with outsiders. Smaller publishers participate in open advertising schemes (RTB), where user data is shared more widely with advertisers and other participants.

Given the integration of smaller publishers in such open advertising schemes, an attempt by data protection authorities to set a different standard for consent just for large platforms may fail as based on an arbitrary distinction. In other words, however attractive it may seem for the authorities to target Meta without targeting the more politically powerful legacy media, this may not be an option.

[1] The comments below build on my ‘“Pay or consent:” Personalized ads, the rules and what’s next’ (IAPP, 20 November 2023) < https://iapp.org/news/a/pay-or-consent-personalized-ads-the-rules-and-whats-next/ >.

[2] On this issue, I highly recommend the article by Professor Martin Nettesheim on ‘Data Protection in Contractual Relationships (Art. 6 (1) (b) GDPR)’ (May 2023) < https://papers.ssrn.com/sol3/papers.cfm?abstract_id=4427134 >.

[3] https://www.edpb.europa.eu/news/news/2023/facebook-and-instagram-decisions-important-impact-use-personal-data-behavioural_en

[4] https://www.edpb.europa.eu/news/news/2023/edpb-urgent-binding-decision-processing-personal-data-behavioural-advertising-meta_en

[5] https://www.edpb.europa.eu/sites/default/files/files/file1/edpb_guidelines_202005_consent_en.pdf

[6] David Pfau, ‘PUR models: Status quo on the European market’ (BVDW, October 2023) < https://iabeurope.eu/knowledge_hub/bvdws-comprehensive-market-overview-pur-models-in-europe-legal-framework-and-future-prospects-in-english/ >; for the view of the Spanish authority, see https://www.aepd.es/prensa-y-comunicacion/notas-de-prensa/aepd-actualiza-guia-cookies-para-adaptarla-a-nuevas-directrices-cepd

[7] https://www.personvernnemnda.no/pvn-2022-22

[8] https://curia.europa.eu/juris/document/document.jsf?mode=lst&pageIndex=1&docid=276478&part=1&doclang=EN&text=&dir=&occ=first&cid=163129

[9] See also Peter Craddock, ‘Op-ed: “Pay or data” has its reasons – even if you disagree’, https://www.linkedin.com/pulse/op-ed-pay-data-has-its-reasons-even-you-disagree-peter-craddock

[10] See para [149]. This is also referenced in the Joint EDPB-EDPS contribution to the public consultation on the draft template relating to the description of consumer profiling techniques (Art.15 DMA) (September 2023), page 14.

[11] https://en.wikipedia.org/wiki/Real-time_bidding

Data Security & Privacy

EU Authorities on ‘Pay or Consent’: Mid-April 2024 Update

Popular Media

Due to Meta’s adoption of a “pay or consent” model for Facebook and Instagram, the model became a key issue not only under EU privacy law but also under the new digital regulations: the Digital Services Act (DSA) and the Digital Markets Act (DMA). Given the barrage of “pay or consent”-related news in recent months, I thought it would be a good idea to take stock of where we are now.

Read the full piece here.

Data Security & Privacy

A Report Card on the Impact of Europe’s Privacy Regulation (GDPR) on Digital Markets

Scholarship

Abstract

This Article will evaluate the consequences of the General Data Protection Regulation (“GDPR”) implemented by the European Union (“EU”) in 2018. Despite its aim to bolster user privacy, empirical evidence from thirty-one studies suggests a nuanced impact on market dynamics. While GDPR modestly enhanced user data protection, it also triggered adverse effects, including diminished startup activity, innovation, and increased market concentration. Further, this Article will uncover a complex narrative where gains in privacy are offset by compliance costs disproportionately affecting smaller firms. This Article will also highlight the challenges of regulating highly innovative markets, which is particularly important given subsequent EU regulations, such as the Digital Markets Act (“DMA”) and Digital Services Act (“DSA”). As other jurisdictions consider similar privacy regulations, the GDPR experience is a cautionary tale.

Read at SSRN.

Data Security & Privacy

Does the DMA Let Gatekeepers Protect Data Privacy and Security?

TOTM

It’s been an eventful two weeks for those following the story of the European Union’s implementation of the Digital Markets Act. On April 18, the European Commission began a series of workshops with the companies designated as “gatekeepers” under the DMA: Apple, Meta, Alphabet, Amazon, ByteDance, and Microsoft. And even as those workshops were still ongoing, the Commission announced noncompliance investigations against Alphabet, Apple, and Meta. Finally, the European Parliament’s Internal Market and Consumer Protection Committee (IMCO) held its own session on DMA implementation.

Many aspects of those developments are worth commenting on, and you can expect more competition-related analysis on Truth on the Market soon. Here, I will focus on what these developments mean for data privacy and security.

Read the full piece here.

Data Security & Privacy