Geoffrey Manne, Kristian Stout & Ben Sperry on Platform Regulation
RealClearMarkets – ICLE President Geoffrey Manne, Director of Innovation Policy Kristian Stout, and Associate Director of Legal Research Ben Sperry were cited in an op-ed in RealClearMarkets about finding a better balance between harms and liability in the context of Section 230. You can read the full piece here.
In a thoughtful paper, Who Moderates the Moderators?: A Law & Economics Approach to Holding Online Platforms Accountable Without Destroying the Internet, Geoffrey Manne, Kristian Stout, and Ben Sperry reject the major platforms’ repeated assertion that any lessening of their Section 230 immunity necessarily will radically alter their present business models, or “destroy the Internet.” As they put it: “Counting the cost of defending meritorious lawsuits as an avoidable and unfortunate expense is tantamount to wishing away our civil-justice system. That is unlikely to be a defensible position in any regard, but it is certainly not defensible solely in the context of online platforms.” They add: “The current Section 230 doesn’t just reduce the liability risk of intermediaries for user-generated content; it removes it virtually entirely.”
Messrs. Manne, Stout, and Sperry, in a way that builds on similar work by media and copyright lawyer Neil Fried, employ a “law and economics” cost-benefit approach in supporting adoption of a negligence-like rule based on a “reasonableness” standard of care. In their view, this would allow imposing some degree of intermediary liability on platforms without opening the floodgates to unmeritorious litigation. They make clear that their proposal doesn’t contemplate suits against the platforms for the underlying illegal or tortious conduct of users, but rather requires that the platforms take “reasonable steps to curb such conduct.” Significantly, they highlight an exception to the general reasonableness rule for so-called communications torts like libel. Like offline publishers subject to the judge-made liability rule in New York Times v. Sullivan, online providers would not be liable for communications torts arising out of user-generated posts unless they knew, or should have known, the content was defamatory.
I’m not convinced the recommendations in the Who Moderates the Moderators? paper, with its caveats, go far enough in the reform direction. But they are a good starting point for considering, aside from whatever the Supreme Court might do in Gonzalez, a proper framework for meaningfully reducing the platforms’ current immunity and making them more accountable for their moderation actions.