Children’s Online Safety and Privacy Legislation

TL;DR

Background: There has been recent legislative movement on a pair of major bills related to children’s online safety and privacy. H.R. 7891, the Kids Online Safety Act (KOSA), has a Senate companion with 62 cosponsors. Meanwhile, the Senate companion to H.R. 7890, the Children and Teens’ Online Privacy Protection Act (COPPA 2.0), also enjoys bipartisan support within the U.S. Senate Commerce Committee. At the time of publication, these and a slate of other bills related to children’s online safety and privacy were scheduled to be marked up April 17 by the U.S. House Energy and Commerce Committee.

But… If enacted, the primary effect of these bills is likely to be less free online content for minors. Raising the regulatory burdens on online platforms that host minors and restricting creators’ ability to monetize their content are both likely to yield greater investment in identifying and excluding minors from online spaces, rather than in creating safe and vibrant online ecosystems and content that cater to them. In other words, these bills could lead to minors losing the many benefits of internet usage. A more cost-effective way to address potential online harms to teens and children would be to encourage parents and minors to make use of available tools to avoid those harms, and to dedicate more resources to prosecuting those who use online platforms to harm minors.

KEY TAKEAWAYS

RAISING THE COST TO SERVE MINORS COULD LEAD TO THEIR EXCLUSION

If the costs of serving minors surpass the revenues that online platforms can generate from serving them, those platforms will invest in excluding underage users, rather than in creating safe and vibrant content and platforms for them.
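This cost-revenue calculus can be made concrete with a stylized sketch in Python. Every figure below is a hypothetical assumption chosen for illustration; none is drawn from the bills or from any platform’s actual economics.

    # Stylized, hypothetical model of a platform's serve-or-exclude decision.
    # Every number below is an illustrative assumption, not real data.

    ad_revenue_per_minor = 2.00       # annual ad revenue per minor user (assumed)
    compliance_cost_per_minor = 3.50  # duty-of-care, audits, default safeguards (assumed)

    net_value = ad_revenue_per_minor - compliance_cost_per_minor

    if net_value < 0:
        # Serving minors is a net loss, so the rational response is to invest
        # in identifying and excluding them, rather than serving them.
        print(f"Exclude minors: ${-net_value:.2f} expected loss per minor user")
    else:
        print(f"Serve minors: ${net_value:.2f} expected gain per minor user")

Whenever the compliance cost exceeds the revenue a minor generates, the exclusion branch wins, whatever the particular numbers turn out to be.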

KOSA would substantially increase the costs that online platforms bear for serving minors. The bill would require a “high impact online company” to exercise “reasonable care” in its design features to “prevent and mitigate” certain harms. These harms include certain mental-health disorders and patterns indicating or encouraging compulsive use by minors, as well as physical violence, cyberbullying, and discriminatory harassment. Moreover, KOSA would require all covered platforms to implement default safeguards to limit design features that encourage minors’ use of the platforms and to control the use of personalized recommendation systems.

RESTRICTING TARGETED ADVERTISING LEADS TO LESS FREE CONTENT

A significant portion of internet content is delivered by what economists call multisided platforms. On one side of the platform, users enjoy free access to content, while on the other side, advertisers are granted a medium to reach users. In effect, advertisers subsidize users’ access to online content. Platforms also collect data from users in order to serve them targeted ads, the most lucrative form of advertising. Without those ads, there would be less revenue to fund access to, and creation of, content. This is no less true when it comes to content of interest to minors.
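The size of that advertising subsidy can be illustrated with a simple back-of-the-envelope calculation. The CPM (cost per thousand impressions) figures below are hypothetical assumptions; actual rates vary widely by platform and audience.

    # Hypothetical comparison of targeted vs. untargeted ad revenue on a
    # multisided platform. All CPM values are illustrative assumptions only.

    impressions = 1_000_000  # monthly ad impressions on minor-focused content (assumed)
    targeted_cpm = 4.00      # revenue per 1,000 targeted impressions (assumed)
    contextual_cpm = 1.50    # revenue per 1,000 untargeted impressions (assumed)

    targeted_revenue = impressions / 1_000 * targeted_cpm
    contextual_revenue = impressions / 1_000 * contextual_cpm

    # The gap is the subsidy that disappears when targeted ads are barred,
    # leaving less revenue to fund free content for that audience.
    print(f"Targeted revenue:   ${targeted_revenue:,.2f}")
    print(f"Contextual revenue: ${contextual_revenue:,.2f}")
    print(f"Forgone subsidy:    ${targeted_revenue - contextual_revenue:,.2f}")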

COPPA 2.0 would expand the protections granted by the Children’s Online Privacy Protection Act of 1998 to users under age 13 to also cover those between 13 and 17 years of age. Where the current law requires parental consent to collect and use persistent identifiers for “individual-specific advertising” directed to children under age 13, COPPA 2.0 would require the verifiable consent of the teen or a parent to serve such ads to teens. 

Obtaining verifiable consent has proven sufficiently costly under the current COPPA rule that almost no covered entities make efforts to obtain it. COPPA has instead largely prevented platforms from monetizing children’s content, which has meant that less of it is created. Extending the law to cover teens would generate similar results. Without the ability to serve them targeted ads, platforms will have less incentive to encourage the creation of teen-focused content.

DE FACTO AGE-VERIFICATION REQUIREMENTS

To comply with laws designed to protect minors, online platforms will need to verify whether their users are minors. While both KOSA and COPPA 2.0 disclaim establishing any age-verification requirements or the collection of any data not already collected “in the normal course of business,” they both establish constructive-knowledge standards for liability (i.e., “should have known” or “knowledge fairly implied on the basis of objective circumstances”). Online platforms will need to be able to identify which of their users are minors in order to comply with the prohibitions on serving them personalized recommendations (KOSA) or targeted advertising (COPPA 2.0).
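In practice, that compliance logic collapses into an age check on every user, as the following hypothetical sketch of a platform’s request handling illustrates. All function and field names here are invented for illustration; neither bill prescribes any particular implementation.

    # Hypothetical sketch (all names invented) of why a constructive-knowledge
    # standard functions as a de facto age-verification mandate: to withhold
    # personalized recommendations (KOSA) or targeted ads (COPPA 2.0) from
    # minors, a platform must first classify every user's age.

    def handle_request(user: dict) -> str:
        if not user.get("age_verified", False):
            # Under a "should have known" standard, relying on self-reported
            # age is legally risky, so unverified users get routed to a
            # verification step (ID upload, credit card, face scan, etc.).
            return "prompt for age verification"
        if user["age"] < 18:
            # Verified minors receive the restricted default experience:
            # no personalized recommendations, no targeted advertising.
            return "serve restricted default experience"
        return "serve personalized experience"

    # Usage with hypothetical users:
    print(handle_request({"age_verified": False}))
    print(handle_request({"age_verified": True, "age": 15}))
    print(handle_request({"age_verified": True, "age": 30}))

Note that the first branch applies to every user, adult or minor: a platform cannot know who needs the restricted experience without checking everyone.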

Age-verification requirements have been found to violate the First Amendment, in part because they aren’t the least-restrictive means to protect children online. As one federal district court put it: “parents may rightly decide to regulate their children’s use of social media—including restricting the amount of time they spend on it, the content they may access, or even those they chat with. And many tools exist to help parents with this.”

A BETTER WAY FORWARD

Educating parents and minors about the widely available practical and technological tools that mitigate the harms of internet use is a better way to protect minors online, and one that would pass First Amendment scrutiny. Another way to address the problem would be to increase the resources available to law enforcement to go after predators. The Invest in Child Safety Act of 2024 is one such proposal to give overwhelmed investigators the necessary resources to combat child sexual exploitation.

For more on how to best protect minors online, see “A Law & Economics Approach to Social Media Regulation” and “A Coasean Analysis of Online Age-Verification and Parental-Consent Regimes.”