
Antitrust at the Agencies Roundup: Spring Has Sprung

TOTM

Last week was the occasion of the “spring meeting”; that is, the big annual antitrust convention in Washington, D.C. hosted by the American Bar Association (ABA) Antitrust Section. To engage in a bit of self-plagiarism (efficient for me, at least), I had this to say about it last year…

Read the full piece here.

Antitrust & Consumer Protection

DOJ’s Case Against Apple: Beware of Forcing ‘Efficiencies’

TOTM

The U.S. Justice Department’s (DOJ) recent complaint charging Apple with monopolizing smartphone markets is, according to Assistant U.S. Attorney General Jonathan Kanter, intended as a contribution to the agency’s “enduring legacy of taking on the biggest and toughest monopolies in history.”

Unfortunately, the case has fundamental weaknesses in its assessment of both Apple’s alleged monopoly power and the “exclusionary” nature of its business strategies. These infirmities have been discussed at length by, among others, Alden Abbott, Herbert Hovenkamp, and Randall Picker.

What appears to have flown under the radar, however, is the DOJ’s flawed understanding of the goals and scope of what it calls “our system of antitrust laws.”

Read the full piece here.

Antitrust & Consumer Protection

Children’s Online Safety and Privacy Legislation

TL;DR

Background: There has been recent legislative movement on a pair of major bills related to children’s online safety and privacy. The Kids Online Safety Act (KOSA), whose House version is H.R. 7891, has 62 cosponsors in the U.S. Senate. Meanwhile, the Children and Teens’ Online Privacy Protection Act (COPPA 2.0), whose House version is H.R. 7890, also has bipartisan support within the U.S. Senate Commerce Committee. At the time of publication, these and a slate of other bills related to children’s online safety and privacy were scheduled to be marked up April 17 by the U.S. House Energy and Commerce Committee.

But… If enacted, these bills’ primary effect is likely to be a reduction in the free online content available to minors. Raising the regulatory burdens on online platforms that host minors, as well as restricting creators’ ability to monetize their content, are both likely to yield greater investment in identifying and excluding minors from online spaces, rather than in creating safe and vibrant online ecosystems and content that cater to them. In other words, these bills could cost minors the many benefits of internet usage. A more cost-effective way to address potential online harms to teens and children would be to encourage parents and minors to make use of available tools to avoid those harms, and to dedicate more resources to prosecuting those who use online platforms to harm minors.

KEY TAKEAWAYS

RAISING THE COST TO SERVE MINORS COULD LEAD TO THEIR EXCLUSION

If the costs of serving minors surpass the revenues that online platforms can generate from serving them, those platforms will invest in excluding underage users, rather than creating safe and vibrant content and platforms for them. 

KOSA would substantially increase the costs that online platforms bear to serve minors. The bill would require a “high impact online company” to exercise “reasonable care” in its design features to “prevent and mitigate” certain harms. These include certain mental-health disorders and patterns indicating or encouraging compulsive use by minors, as well as physical violence, cyberbullying, and discriminatory harassment. Moreover, KOSA would require all covered platforms to implement default safeguards to limit design features that encourage minors’ use of the platforms and to control the use of personalized recommendation systems.

RESTRICTING TARGETED ADVERTISING LEADS TO LESS FREE CONTENT

A significant portion of internet content is delivered by what economists call multisided platforms. On one side of the platform, users enjoy free access to content, while on the other side, advertisers are granted a medium to reach users. In effect, advertisers subsidize users’ access to online content. Platforms also collect data from users in order to serve them targeted ads, the most lucrative form of advertising. Without those ads, there would be less revenue to fund access to, and creation of, content. This is no less true when it comes to content of interest to minors.

COPPA 2.0 would expand the protections granted by the Children’s Online Privacy Protection Act of 1998 to users under age 13 to also cover those between 13 and 17 years of age. Where the current law requires parental consent to collect and use persistent identifiers for “individual-specific advertising” directed to children under age 13, COPPA 2.0 would require the verifiable consent of the teen or a parent to serve such ads to teens. 

Obtaining verifiable consent has proven sufficiently costly under the current COPPA rule that almost no covered entities make efforts to obtain it. COPPA has instead largely prevented platforms from monetizing children’s content, which has meant that less of it is created. Extending the law to cover teens would generate similar results. Without the ability to serve them targeted ads, platforms will have less incentive to encourage the creation of teen-focused content.

DE FACTO AGE-VERIFICATION REQUIREMENTS

To comply with laws designed to protect minors, online platforms will need to verify whether their users are minors. While both KOSA and COPPA 2.0 disclaim establishing any age-verification requirements or requiring the collection of any data not already collected “in the normal course of business,” both establish constructive-knowledge standards for violations (i.e., “should have known” or “knowledge fairly implied on the basis of objective circumstances”). Online platforms will thus need to be able to identify which of their users are minors in order to comply with the prohibitions on serving them personalized recommendations (KOSA) or targeted advertising (COPPA 2.0).

Age-verification requirements have been found to violate the First Amendment, in part because they aren’t the least-restrictive means to protect children online. As one federal district court put it: “parents may rightly decide to regulate their children’s use of social media—including restricting the amount of time they spend on it, the content they may access, or even those they chat with. And many tools exist to help parents with this.”

A BETTER WAY FORWARD

Educating parents and minors about those widely available practical and technological tools to mitigate the harms of internet use is a better way to protect minors online, and would pass First Amendment scrutiny. Another way to address the problem would be to increase the resources available to law enforcement to go after predators. The Invest in Child Safety Act of 2024 is one such proposal to give overwhelmed investigators the necessary resources to combat child sexual exploitation.

For more on how to best protect minors online, see “A Law & Economics Approach to Social Media Regulation” and “A Coasean Analysis of Online Age-Verification and Parental-Consent Regimes.” 

Innovation & the New Economy

Clearing the Telecom Logjam: A Modest Proposal

TOTM

In this “Age of the Administrative State,” federal agencies have incredible latitude to impose policies without much direction or input from Congress. President Barack Obama fully pulled off the mask in 2014, when he announced “[w]e are not just going to be waiting for legislation,” declaring “I’ve got a pen, and I’ve got a phone.” Subsequent presidents have similarly discovered that they had pens and phones, too.

Read the full piece here.

Telecommunications & Regulated Utilities

Confronting the DMA’s Shaky Suppositions

TOTM

It’s easy for politicians to make unrealistic promises. Indeed, without a healthy skepticism on the part of the public, they can grow like weeds. In the world of digital policy, the European Union’s Digital Markets Act (DMA) has proven fertile ground for just such promises. We’ve been told that large digital platforms are the source of many economic and social ills, and that handing more discretionary power to the government can solve these problems with no apparent side effects or costs.

Read the full piece here.

Antitrust & Consumer Protection

Comments to UK Information Commissioner’s Office on ‘Pay or Consent’

Regulatory Comments

I thank the ICO for the opportunity to submit comments on “pay or consent.” My focus will be on the question of how to deal with consent to personal data processing needed to fund the provision of a service that does not fit the legal basis of contractual necessity.[1]

Personalised Advertising: Contractual Necessity or Consent?

Under the GDPR, personal data may only be processed if one of the lawful bases from Article 6 applies. They include, in particular, consent, contractual necessity, and legitimate interests. When processing is necessary for the performance of a contract (Article 6(1)(b)), then that is the basis on which the controller should rely. One may think that if data processing (e.g., for targeting ads) is necessary to fund a free-of-charge service, that should count as contractual necessity. I am unaware of data protection authorities disputing this in principle, but there is a tendency to interpret contractual necessity narrowly.[2] Notably, the EDPB decided in December 2022 that Facebook and Instagram shouldn’t have relied on that ground for personalisation of advertising.[3] Subsequently, the EDPB decided that Meta should also not rely on the legitimate interests basis.[4]

The adoption of a narrow interpretation of contractual necessity created an interpretative puzzle. If we set aside the legitimate-interests basis under Article 6(1)(f), then in many commercial contexts we are left with consent as the only option (Article 6(1)(a)). This is especially true where consent is required not by the GDPR but under national laws implementing the ePrivacy Directive (Directive 2002/58/EC), including the UK Privacy and Electronic Communications Regulations (PECR), which govern technologies like cookies and browser storage. Importantly, though, such technologies are not always needed for personalised advertising. Perhaps the biggest puzzle is how to deal with consent to processing that is needed to fund the provision of a service but does not fit the narrow interpretation of contractual necessity.

Consent, as we know from Articles 4(11) and 7(4) GDPR, must be “freely given.” In addition, Recital 42 states: “Consent should not be regarded as freely given if the data subject has no genuine or free choice or is unable to refuse or withdraw consent without detriment.” The EDPB’s guidance on this point is self-contradictory: it first says that withdrawing consent should “not lead to any costs for the data subjects,” but soon after adds that the GDPR “does not preclude all incentives” for consenting.[5]

Despite some differences, at least the Austrian, Danish, French, German (DSK), and Spanish data protection authorities generally acknowledge that paid alternatives to consent may be lawful.[6] Notably, the Norwegian Privacy Board, in an appeal concerning Grindr, also explicitly allowed for that possibility.[7] I discuss below the conditions those authorities focus on when assessing “pay or consent” implementations.

The CJEU and ‘Necessity’ to Charge ‘An Appropriate Fee’

In its Meta decision of July 2023, the EU Court of Justice weighed in, though in the context of third-party-collected data, saying that if that kind of data processing by Meta does not fall under contractual necessity, then:

(…) those users must be free to refuse individually, in the context of the contractual process, to give their consent to particular data processing operations not necessary for the performance of the contract, without being obliged to refrain entirely from using the service offered by the online social network operator, which means that those users are to be offered, if necessary for an appropriate fee, an equivalent alternative not accompanied by such data processing operations.[8]

Intentionally or not, the Court highlighted the interpretative problem stemming from a narrow interpretation of contractual necessity. The Court said that even if processing does not fall under contractual necessity, it may still be “necessary” to charge data subjects “an appropriate fee” if they refuse to consent. Disappointing some activists, the Court did not endorse the EDPB’s first comment I cited (that refusal to consent should not come with “any costs”).

Even though the Court did not explain this further, we can speculate that it was not willing to accept the view that all business models must simply be adjusted to a maximally prohibitive interpretation of the GDPR. The Court may have been attempting to save the GDPR from the political backlash likely to follow any attempt to use it to deny Europeans the choice of free-of-charge services funded by personalised advertising. Perhaps the Court also noted that other EU laws rely on the GDPR’s definition of consent (e.g., the Digital Markets Act), which gives an additional reason to be very cautious about interpreting that concept in ways that depart from current expectations.

Remaining Questions

Several questions will likely be particularly important for future assessments of “pay or consent” implementations under the GDPR and ePrivacy/PECRs. The following list may not be exhaustive but aims to identify the main issues.

How Specific Should the Choice Be?

The extent to which service providers bundle consent to processing for different purposes is likely to be questioned, especially if users cannot (in a “second step”) adjust their consent more granularly. But there is a tension here: giving users complete freedom to adjust their consent could also defeat the purpose of having a paid alternative.

In a different kind of bundling, service providers may make the paid alternative to consent more attractive by adding incentives, such as access to additional content or the absence of ads (including non-personalised ads). On the one hand, this means that service providers are incentivising users not to consent, making the consent option relatively less attractive. This could be seen as reducing the pressure to consent and making the choice more likely to be freely given. On the other hand, a more attractive paid option could be more costly for the service provider and thus require a higher price.

What Is an ‘Appropriate’ Price?

The pricing question is a potential landmine for data protection authorities, who are decidedly ill-suited to deal with it. Just to show one aspect of the complexity: setting as a benchmark the service’s historical average revenue per user (ARPU) from (personalised) advertising may be misleading. Users are not identical. Wealthier, less price-sensitive users, who may be more likely to pay for a no-ads option, are also worth more to advertisers. Hence, the loss of income from advertising may be higher than just “old ARPU multiplied by the number of users on a no-ads tier,” suggesting a need to charge the paying users more than historical ARPU merely to retain the same level of revenue. Crucially, the situation will likely be dynamic due to subscription “churn” (users canceling their subscriptions) and other market factors. The economic results of the “pay or consent” scheme may continue to change, and setting the price level will always involve business judgment based on predictions and intuition.
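To illustrate one way the naive benchmark can mislead, here is a minimal numeric sketch in Python. The figures and the two-tier split of users are entirely hypothetical assumptions for illustration, not data about any actual service:

# Hypothetical illustration: why "historical ARPU x number of
# subscribers" can understate the ad revenue a no-ads tier forgoes.

users = 1_000_000

# Assume 10% of users are high-value to advertisers (wealthier,
# less price-sensitive) and generate most of the ad revenue.
high_value, low_value = 100_000, 900_000
ad_value_high, ad_value_low = 50.0, 5.0  # annual ad revenue per user

total_ad_revenue = high_value * ad_value_high + low_value * ad_value_low
historical_arpu = total_ad_revenue / users  # 9.50

# Assume the paid tier disproportionately attracts high-value users.
sub_high, sub_low = 50_000, 10_000
subscribers = sub_high + sub_low

naive_loss = historical_arpu * subscribers  # 570,000
actual_loss = sub_high * ad_value_high + sub_low * ad_value_low  # 2,550,000

# Subscription price needed merely to replace the forgone ad revenue:
break_even_price = actual_loss / subscribers  # 42.50, vs. ARPU of 9.50

print(f"Historical ARPU:   {historical_arpu:.2f}")
print(f"Naive loss:        {naive_loss:,.0f}")
print(f"Actual loss:       {actual_loss:,.0f}")
print(f"Break-even price:  {break_even_price:.2f}")

In this stylised example, a no-ads tier priced at the historical ARPU of 9.50 would recover less than a quarter of the 2,550,000 in advertising revenue actually forgone. And because subscription churn and market conditions shift over time, any “appropriate fee” would be a moving target requiring ongoing business judgment.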

Some authorities may be tempted to approach the issue from the perspective of users’ willingness to pay, but this also raises many issues. First, the idea of price regulation by privacy authorities, capping prices at a level defined by the authorities’ view of what is acceptable to users, may face jurisdictional scrutiny. Second, taking users’ willingness to pay as a benchmark implicitly assumes a legally protected entitlement to access the service at a price they like. In other words, it assumes that users are entitled to specific private services, such as social media.[9] This cannot simply be assumed; it would require a robust argument, and it would arguably constitute a legal change appropriate only for the political, legislative process.

Imbalance

Recital 43 of the GDPR explains that consent may not be free when there is “a clear imbalance between the data subject and the controller.” In the Meta decision, the EU Court of Justice admitted the possibility of such an imbalance between a business with a dominant position, as understood in competition law, and its customers.[10] This, too, may be a difficult issue for data protection authorities to deal with, both for expertise and competence reasons.

The Scale of Processing and Impact on Users

Distinct from market power (dominance), though sometimes conflated with it, are the issues of the scale of processing and its impact on users. An online service provider, e.g., a newspaper publisher, may have relatively little market power but may use a personalised-advertising framework, such as a real-time bidding (RTB) scheme facilitated by third parties,[11] that is very large in scale and has more potential for negative impacts on users than an advertising system internal to a large online platform. A large online platform can offer personalised advertising to its business customers (advertisers) while sharing little or no information about who the ads are being shown to. Large platforms have economic incentives to keep user data securely within the platform’s “walled garden,” not sharing it with outsiders. Smaller publishers, by contrast, participate in open advertising schemes (RTB), where user data is shared more widely with advertisers and other participants.

Given the integration of smaller publishers in such open advertising schemes, an attempt by data protection authorities to set a different standard for consent just for large platforms may fail as based on an arbitrary distinction. In other words, however attractive it may seem for the authorities to target Meta without targeting the more politically powerful legacy media, this may not be an option.

[1] The comments below build on my ‘“Pay or consent:” Personalized ads, the rules and what’s next’ (IAPP, 20 November 2023) < https://iapp.org/news/a/pay-or-consent-personalized-ads-the-rules-and-whats-next/ >.

[2] On this issue, I highly recommend the article by Professor Martin Nettesheim on ‘Data Protection in Contractual Relationships (Art. 6 (1) (b) GDPR)’ (May 2023) < https://papers.ssrn.com/sol3/papers.cfm?abstract_id=4427134 >.

[3] https://www.edpb.europa.eu/news/news/2023/facebook-and-instagram-decisions-important-impact-use-personal-data-behavioural_en

[4] https://www.edpb.europa.eu/news/news/2023/edpb-urgent-binding-decision-processing-personal-data-behavioural-advertising-meta_en

[5] https://www.edpb.europa.eu/sites/default/files/files/file1/edpb_guidelines_202005_consent_en.pdf

[6] David Pfau, ‘PUR models: Status quo on the European market’ (BVDW, October 2023) < https://iabeurope.eu/knowledge_hub/bvdws-comprehensive-market-overview-pur-models-in-europe-legal-framework-and-future-prospects-in-english/ >; for the view of the Spanish authority, see https://www.aepd.es/prensa-y-comunicacion/notas-de-prensa/aepd-actualiza-guia-cookies-para-adaptarla-a-nuevas-directrices-cepd

[7] https://www.personvernnemnda.no/pvn-2022-22

[8] https://curia.europa.eu/juris/document/document.jsf?mode=lst&pageIndex=1&docid=276478&part=1&doclang=EN&text=&dir=&occ=first&cid=163129

[9] See also Peter Craddock, ‘Op-ed: “Pay or data” has its reasons – even if you disagree’, https://www.linkedin.com/pulse/op-ed-pay-data-has-its-reasons-even-you-disagree-peter-craddock

[10] See para [149]. This is also referenced in the Joint EDPB-EDPS contribution to the public consultation on the draft template relating to the description of consumer profiling techniques (Art. 15 DMA) (September 2023), page 14.

[11] https://en.wikipedia.org/wiki/Real-time_bidding

Data Security & Privacy

EU Authorities on ‘Pay or Consent’: Mid-April 2024 Update

Popular Media

Due to Meta’s adoption of a “pay or consent” model for Facebook and Instagram, the model became a key issue not only under EU privacy law, but also under the EU’s new digital regulations: the Digital Services Act (DSA) and the Digital Markets Act (DMA). Given the barrage of “pay or consent” news in recent months, I thought it would be a good idea to take stock of where we now stand.

Read the full piece here.

Data Security & Privacy

The Missing Element in the Google Case

TOTM

Through laudable competition on the merits, Google achieved a usage share of nearly 90% in “general search services.” About a decade later, the government alleged that Google had maintained its dominant share through exclusionary practices violating Section 2 of the Sherman Antitrust Act. The case was tried in U.S. District Court in Washington, D.C. last fall, and the parties made post-trial filings this year.

Read the full piece here.

Antitrust & Consumer Protection

Lazar Radic on DMA Implementation

Presentations & Interviews

ICLE Senior Scholar Lazar Radic joined Associazione Copernicani and moderator Carlo Alberto Carnevale Maffè for a discussion (in Italian) of the European Union’s Digital Markets Act and the European Commission’s noncompliance investigations of Alphabet, Apple, and Meta. Video of the full discussion is embedded below.

Antitrust & Consumer Protection