

There’s Nothing “Conservative” About Trump’s Views on Free Speech and the Regulation of Social Media

TOTM Despite the simplistic narrative tying President Trump’s vision of the world to conservatism, there is nothing conservative about his views on the First Amendment and how it applies to social media companies.

Yesterday was President Trump’s big “Social Media Summit” where he got together with a number of right-wing firebrands to decry the power of Big Tech to censor conservatives online. According to the Wall Street Journal . . .

Read the full piece here.

Continue reading
Data Security & Privacy

10 Reasons Why the California Consumer Privacy Act (CCPA) Is Going to Be a Dumpster Fire

TOTM Last year, real estate developer Alastair Mactaggart spent nearly $3.5 million to put a privacy law on the ballot in California’s November election. He then negotiated a deal with state lawmakers to withdraw the ballot initiative if they passed their own privacy bill. That law — the California Consumer Privacy Act (CCPA) — was enacted after only seven days of drafting and amending.

Last year, real estate developer Alastair Mactaggart spent nearly $3.5 million to put a privacy law on the ballot in California’s November election. He then negotiated a deal with state lawmakers to withdraw the ballot initiative if they passed their own privacy bill. That law — the California Consumer Privacy Act (CCPA) — was enacted after only seven days of drafting and amending. CCPA will go into effect six months from today.

Read the full piece here.

Continue reading
Data Security & Privacy

Response to McGeveran’s The Duty of Data Security: Not the Objective Duty He Wants, Maybe the Subjective Duty We Need

Scholarship This response argues that, in his effort to locate a clear duty in existing data security law, McGeveran has identified a standard that is, in all meaningful ways, one of subjective (not objective) reasonableness, and that therefore offers no clarity at all.

William McGeveran’s recent article, The Duty of Data Security, is a significant contribution to ongoing debates about the duty that firms holding electronic information about consumers owe to ensure the security of that data. It also supports the opposite conclusion from the one McGeveran draws. McGeveran frames the article as identifying a clear duty of data security. This response argues that, in his effort to locate a clear duty in existing data security law, he has identified a standard that is, in all meaningful ways, one of subjective (not objective) reasonableness, and that therefore offers no clarity at all. There is likely room for disagreement with both McGeveran’s argument and my response to it. The ultimate purpose of this response, however, is to recognize this aspect of the duty McGeveran has identified and to reframe it in the familiar terms of objective versus subjective reasonableness. That distinction is both useful and important, and it has gone unremarked in two decades of discussion about data security obligations.

Read the full response here.

Continue reading
Data Security & Privacy

The FTC’s Flawed Data Security Enforcement Program and Suggestions for Reform (FTC hearings, Comment 8)

Written Testimonies & Filings FTC Hearings on Competition & Consumer Protection in the 21st Century. Comments of the International Center for Law & Economics: The FTC’s Flawed Data Security Enforcement Program and Suggestions for Reform. Hearing #9 (Dec. 11-12, 2018). Submitted May 31, 2019.

Comments of the International Center for Law & Economics

The ongoing need for data security, underscored by high-profile breaches, raises several pressing issues. One of the core problems in this area, however, is not simply that firms have inadequate data security, but that lawmakers have, to date, broadly failed to offer a viable standard by which firms can guide their conduct.

The flawed strategy that the FTC currently deploys to deal with data security issues is a prime example. In brief, the Commission’s over-reliance on enforcement by consent decree has created a quasi-regulatory approach to data security, eschewed the fundamentally useful aspects of a true common-law approach to developing liability rules, and, as a consequence, provided little record of what actually amounts to liability for “unreasonable” data security. A true standard would include such components as an assessment of whether the tortfeasor exercised reasonable care, a thorough analysis of causation, an economically grounded computation of harm, and a showing that harm is likely absent a given level of care.
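
One conventional way to give such a standard analytical content, offered here only as an illustrative sketch and not as anything the comments themselves propose, is the negligence calculus familiar from the law and economics of torts, often summarized as the Hand formula: a defendant’s care is unreasonable when the burden of additional precautions, B, is less than the expected harm those precautions would prevent, that is, the probability of loss, P, multiplied by its magnitude, L:

B < P × L

On this framing, “reasonable” data security is the level of care at which further precautions would cost more than the reduction in expected breach harm they produce, a determination that requires exactly the kind of causal and economic analysis described above.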

Given these failings, the FTC should consider implementing reforms that might bring its decisional practice closer to the common law tradition. These include giving more weight to economic analysis (notably by allowing the FTC’s Bureau of Economics to play a greater role in data security proceedings), adopting modest measures that would increase the transparency of the FTC’s data security decisions (thereby increasing legal predictability), bringing greater judicial review to data security proceedings, and incentivizing firms to better communicate their data security activities.

Click here to read full comments.

Continue reading
Antitrust & Consumer Protection

GDPR After One Year: Costs and Unintended Consequences

TOTM GDPR is officially one year old. How have the first 12 months gone?

GDPR is officially one year old. How have the first 12 months gone? As you can see from the mix of data and anecdotes below, it appears that compliance costs have been astronomical; individual “data rights” have led to unintended consequences; “privacy protection” seems to have undermined market competition; and there have been large unseen — but not unmeasurable! — costs in foregone startup investment. So, all in all, about what we expected.

Read the full piece here.

Continue reading

ICLE Comments, Australian Competition and Consumer Commission’s Digital Platforms Inquiry

Regulatory Comments The analysis in the Australian Competition and Consumer Commission’s Preliminary Report for the Digital Platforms Inquiry is inadequate in several ways. There is a real danger that if the policy recommendations outlined in the preliminary report were to be adopted, Australian consumers would be severely harmed.

Executive Summary

The analysis in the Australian Competition and Consumer Commission’s Preliminary Report for the Digital Platforms Inquiry is inadequate in several ways, most notably:

  • It mischaracterizes the relationship between changes in the economics of media advertising and the rise of digital platforms such as Facebook and Google.
  • Its analysis of the dynamics of media diversity is misguided.
  • Its competition analysis assumes its results and makes unsupportable claims about the division of advertising markets.
  • It is recklessly unconcerned with the freedom of speech consequences of its recommendations.
  • It fails to recognize, and proposes to supplant, the ongoing social negotiation over data privacy.
  • It provides a poor analytic base on which to make policy recommendations, as it applies a static, rather than dynamic, approach to its analysis.

There is a real danger that if the policy recommendations outlined in the preliminary report were to be adopted, Australian consumers would be severely harmed.

Click here to read the full comments.

Continue reading
Antitrust & Consumer Protection

When “Reasonable” Isn’t: The FTC’s Standard-less Data Security Standard

Scholarship Although the FTC is well-staffed with highly skilled economists, its approach to data security is disappointingly light on economic analysis. The unfortunate result of this lacuna is an approach to these complex issues lacking in analytical rigor and the humility borne of analysis grounded in sound economics.

Summary

Although the FTC is well-staffed with highly skilled economists, its approach to data security is disappointingly light on economic analysis. The unfortunate result of this lacuna is an approach to these complex issues lacking in analytical rigor and the humility borne of analysis grounded in sound economics. In particular, the Commission’s “reasonableness” approach to assessing whether data security practices are unfair under Section 5 of the FTC Act lacks all but the most superficial trappings of the well-established law and economics of torts, from which the concept is borrowed.

In actuality, however, the Commission’s manufactured “reasonableness” standard, which, as its name suggests, purports to evaluate data security practices under a negligence-like framework, amounts in effect to a rule of strict liability for any company that collects personally identifiable data. This is manifestly not what Section 5 intends.
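
A stylized sketch of the distinction (not drawn from the paper itself, and using hypothetical notation) may be helpful: let x be a firm’s level of care, x* the level of care the law requires, P(x) the probability of a breach given that care, and H the resulting harm. Under a negligence rule, expected liability is 0 when x ≥ x* and P(x) × H when x < x*; under strict liability, expected liability is P(x) × H at every level of care, however diligent. A “reasonableness” standard whose content cannot be known in advance collapses the first rule into the second, because no attainable level of care reliably places a firm on the no-liability side of the line.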

In its recent LabMD opinion, the Commission describes its approach as “cost-benefit analysis.” But simply listing out (some) costs and benefits is not the same thing as analyzing them. Recognizing that tradeoffs exist is a good start, but it is not a sufficient end, and “reasonableness” — if it is to be anything other than the mercurial preferences of three FTC commissioners — must contain analytical content.

Persistent and unyielding uncertainty over the contours of the FTC’s data security standard means that companies may be required to accept the reality that, no matter what they do short of the extremes, liability is possible. Worse, there is no way reliably to judge whether conduct (short of obvious fringe cases) is even likely to increase liability risk.

The FTC’s recent LabMD case highlights the scope of the problem and the lack of economic analytical rigor endemic to the FTC’s purported data security standard. To be sure, other factors also contribute to the lack of certainty and sufficient rigor (i.e., matters of process at the agency), but at root sits a “standardless” standard, masquerading as an economic framework.

This paper explores these defects, paying particular attention to the FTC’s decision in LabMD and subsequent district court proceedings in the case.

Continue reading
Antitrust & Consumer Protection

Geoffrey Manne at FTC Hearing #9: Data Security

Presentations & Interviews ICLE founder and president Geoffrey Manne participated in FTC Hearing #9: Data security on the panel entitled, FTC Data Security Enforcement, on Wednesday, December 12, . . .

ICLE founder and president Geoffrey Manne participated in FTC Hearing #9: Data Security, on the panel entitled “FTC Data Security Enforcement,” on Wednesday, December 12, 2018, at the FTC Constitution Center Auditorium in Washington, DC.

The data security hearings included five panel discussions and additional discussion of research related to data breaches and data security threats. The first day’s panel discussions examined incentives to invest in data security and consumer demand for data security. Discussions on the second day focused on data security assessments, the U.S. framework related to consumer data security, and the FTC’s data security enforcement program.

Read the full transcript here.

Continue reading
Data Security & Privacy

ICLE urges NTIA to avoid heavy-handed privacy regulation that would stifle innovation and limit consumer choice

Regulatory Comments ICLE submitted comments to the National Telecommunications and Information Administration (NTIA) on Developing the Administration’s Approach to Consumer Privacy.

Last week, ICLE submitted comments to the National Telecommunications and Information Administration (NTIA) on Developing the Administration’s Approach to Consumer Privacy. Scholars Geoffrey Manne, Kristian Stout, and Dirk Auer urge the agency to avoid legislation mandating tight controls on private companies’ use of consumer data akin to the EU’s General Data Protection Regulation (GDPR).

Although the US does not have a single, omnibus privacy regulation, this does not mean that the US lacks “privacy law.” There already exist generally applicable laws at both the federal and state levels that provide a wide scope of protection for individuals, including consumer protection laws that apply to companies’ data use and security practices, as well as protections developed in the common law (property, contract, and tort) and in criminal codes.

In addition, there are specific regulations pertaining to certain kinds of information, such as medical records, personal information collected online from children, and credit reporting, as well as to uses of data that might lead to certain kinds of illegal discrimination.

Getting regulation right is always difficult, but it is all the more so when confronting evolving technology, inconsistent and varied consumer demand, and intertwined economic effects — all conditions that confront online privacy regulation. Given this complexity, and the limits of our knowledge regarding consumer preferences and business conduct in this area, ICLE’s evaluation suggests that the proper method of regulating privacy is, for now at least, the course that the Federal Trade Commission (FTC) has historically taken: case-by-case examination of actual privacy harms, without ex ante regulations, coupled with narrow legislation targeted at problematic uses of personal information.

Many (if not most) services on the Internet are offered on the basis that user data can, within certain limits, be used by a firm to enhance its services and support its business model, thereby generating benefits to users. To varying degrees (and with varying degrees of granularity), services offer consumers the opportunity to opt out of this use of their data, although in some cases the only effective way to opt out is to refrain from using the service at all.

Critics of the US approach to privacy sometimes advocate for a move to an opt-in regime (as is the case under the GDPR). But the problem is that “‘[o]pt-in’ provides no greater privacy protection than ‘opt-out’ but imposes significantly higher costs with dramatically different legal and economic implications.” In staunching the flow of data, opt-in regimes impose both direct and indirect costs on the economy and on consumers, reducing the value of certain products and services not only to the individual who does not opt in, but to the broader network as a whole. Not surprisingly, these effects fall disproportionately on the relatively poor and the less technology-literate.

U.S. privacy regulators have generally evidenced admirable restraint and assessed the relevant tradeoffs, recognizing that the authorized collection and use of consumer information by data companies confers enormous benefits, even as it entails some risks. Indeed, the overwhelming conclusion of decades of intense scrutiny is that the application of ex ante privacy principles across industries is a fraught exercise as each firm faces a different set of consumer expectations about its provision of innovative services, including privacy protections.

This does not mean that privacy regulation should never be debated, nor that a more prescriptive regime should never be considered. But any such efforts must begin with the collective wisdom of the agencies, scholars, and policymakers who have been operating in this space for decades, and with a deep understanding of the business realities and consumer welfare effects involved.

Read the full comments here.

Continue reading
Data Security & Privacy