Spectrum & Wireless
Popular Media
It’s easy to look at the net neutrality debate and assume that everyone is acting in their self-interest and against consumer welfare. Thus, many on the left denounce all opposition to Title II as essentially “Comcast-funded,” aimed at undermining the Open Internet to further nefarious, hidden agendas. No matter how often opponents make the economic argument that Title II would reduce incentives to invest in the network, many will not listen because they have convinced themselves that it is simply special-interest pleading.
But whatever you think of ISPs’ incentives to oppose Title II, the incentive for the tech companies (like Cisco, Qualcomm, Nokia and IBM) that design and build key elements of network infrastructure and the devices that connect to it (i.e., essential input providers) is to build out networks and increase adoption (i.e., to expand output). These companies’ fundamental incentive with respect to regulation of the Internet is the adoption of rules that favor investment. They operate in highly competitive markets, they don’t offer competing content and they don’t stand as alleged “gatekeepers” seeking monopoly returns from, or control over, what crosses over the Interwebs.
Thus, it is no small thing that 60 tech companies — including some of the world’s largest, based both in the US and abroad — that are heavily invested in the buildout of networks and devices, as well as more than 100 manufacturing firms that are increasingly building the products and devices that make up the “Internet of Things,” have written letters strongly opposing the reclassification of broadband under Title II.
There is probably no more objective evidence that Title II reclassification will harm broadband deployment than the opposition of these informed market participants.
These companies have the most to lose from reduced buildout, and no reasonable nefarious plots can be constructed to impugn their opposition to reclassification as consumer-harming self-interest in disguise. Their self-interest is on their sleeves: More broadband deployment and adoption — which is exactly what the Open Internet proceedings are supposed to accomplish.
If the FCC chooses the reclassification route, it will most assuredly end up in litigation. And when it does, the opposition of these companies to Title II should be Exhibit A in the effort to debunk the FCC’s purported basis for its rules: the “virtuous circle” theory that says that strong net neutrality rules are necessary to drive broadband investment and deployment.
Access to all the wonderful content the Internet has brought us is not possible without the billions of dollars that have been invested in building the networks and devices themselves. Let’s not kill the goose that lays the golden eggs.
Regulatory Comments
“…A serious assessment of the need for new privacy legislation, and the right way to frame it, would not begin by assuming the premise that a particular framework is necessary. Specifically, before recommending any new legislation, the NTIA should do – or ensure that someone does – what the Federal Trade Commission has steadfastly refused to do: carefully assess what is and is not already covered by existing U.S. laws…”
“Existing laws might well be inadequate to deal with some of the specific challenges raised by Big Data. But until they are more carefully examined, we will not know where the gaps are. Even those who might insist that there would be no harm in redundancy should agree that we must learn from the lessons of past experience with these laws. Moreover, it is essential to understand what existing law covers because either (a) it will co-exist with any future privacy law, in which case companies will have potentially conflicting…”
TOTM
Today the D.C. Circuit struck down most of the FCC’s 2010 Open Internet Order, rejecting rules that required broadband providers to carry all traffic for edge providers (“anti-blocking”) and prevented providers from negotiating deals for prioritized carriage. However, the appeals court did conclude that the FCC has statutory authority to issue “Net Neutrality” rules under Section 706(a) and let stand the FCC’s requirement that broadband providers clearly disclose their network management practices.
Read the full piece here.
Written Testimonies & Filings
“Twenty years ago, Democrats and Republicans agreed on the need to refocus communications competition policy on promoting competition in an era of convergence, focusing on effects rather than formalism. Unfortunately, that focus was lost in the sausage-making process of legislation – and the FCC has been increasingly adrift ever since. The FCC has not waited for Congress to act, and has instead found creative ways to sidestep the formalist structure of the Act. It is high time for Congress to reassert its authority and to craft a new act focused on the effects of competition as a durable basis for regulation.
The antitrust statutes have not been fundamentally modified in over a century because Congress has not needed to do so: antitrust law has evolved on top of them through a mix of court decisions and doctrinal development articulated by the antitrust agencies. At the heart of this evolution of common law has been one guiding concern: effects on consumer welfare, seen through the lens of law and economics. The same concern and same analytical lens should guide the re-write of the Communications Act that is, by now, two decades overdue.
While refocusing competition regulation on effects, Congress should give equal focus to minimizing remaining barriers to competition. In particular, that means minimizing regulatory uncertainty (and, in particular, avoiding any return to mostly archaic Title II regulations); maximizing the amount of spectrum available; simplifying the construction and upgrading of wireless towers to maximize the capacity of wireless broadband; and promoting infrastructure policy at all levels of government that makes deployment cost-effective….”
Popular Media
For those in the DC area interested in telecom regulation, there is another great event opportunity coming up next week.
Join TechFreedom on Thursday, December 19, the 100th anniversary of the Kingsbury Commitment, AT&T’s negotiated settlement of antitrust charges brought by the Department of Justice that gave AT&T a legal monopoly in most of the U.S. in exchange for a commitment to provide universal service.
The Commitment is hailed by many not just as a milestone in the public interest but as the bedrock of U.S. communications policy. Others see the settlement as the cynical exploitation of lofty rhetoric to establish a tightly regulated monopoly — and the beginning of decades of cozy regulatory capture that stifled competition and strangled innovation.
So which was it? More importantly, what can we learn from the seventy-year period before the 1984 break-up of AT&T, and from the last three decades of efforts to unleash competition? With fewer than a third of Americans relying on traditional telephony, and with Internet-based competitors increasingly driving competition, what does universal service mean in the digital era? As Congress contemplates overhauling the Communications Act, how can policymakers promote universal service through competition, by promoting innovation and investment? What should a new Kingsbury Commitment look like?
Following a luncheon keynote address by FCC Commissioner Ajit Pai, a diverse panel of experts moderated by TechFreedom President Berin Szoka will explore these issues and more. The panel includes:
Space is limited so RSVP now if you plan to attend in person. A live stream of the event will be available on this page. You can follow the conversation on Twitter on the #Kingsbury100 hashtag.
When: Thursday, December 19, 2013
11:30 – 12:00 Registration & lunch
12:00 – 1:45 Event & live stream
The live stream will begin on this page at noon Eastern.
Where: The Methodist Building, 100 Maryland Ave NE, Washington, D.C. 20002
Questions? Email [email protected]
Scholarship
Increasingly, wired and wireless networks are converging in architecture and function. For example, the further fiber moves toward the customer, the more wireless capabilities are available in cellular networks. As wireless offers more bandwidth, it can deliver video and other functions previously thought to require more substantial broadband pipes. The question then arises: to what extent are wireless offerings substitutable for wireline services, and vice versa? The 2013 Aspen Institute Roundtable on Spectrum Policy (AIRS), “Spectrum Policy for the Wired Network,” met on November 13-15, 2013 to consider which spectrum policies would best foster the goals of a robust, reliable and effective communications system in the United States.
The 24 leading communications policy experts who met at the Aspen Wye River Conference Center in Queenstown, Maryland began by looking at the characteristics of network architecture, both wired and wireless, that are relevant to a robust communications network. In the course of this exploration, the group considered public goods that need to reach consumers, and the desire for consumer choice among competitive services. They also investigated which elements of the wired network public policy requires, and for which of these wireless services can substitute. The overall goal was to discover how spectrum services and spectrum policy can advance overall communications policy goals, e.g., robust, reliable, and effective communications with choice where possible.
As the following report details, the discussions were lively and knowledgeable. Throughout the report, the Roundtable rapporteur, Geoff Manne, sets forth a number of recommendations that he gleaned from the conference dialogue, specifically concerning the issues of rural communications, public services, and competition. While these recommendations generally reflect the sense of the meeting, there were some opponents to the viewpoints recorded and there were no votes taken. Accordingly, participation in the dialogue should not be construed as agreement with any particular statement in the report by the participant or his or her employer.
Popular Media
The debates over mobile spectrum aggregation and the auction rules for the FCC’s upcoming incentive auction — like all regulatory rent-seeking — can be farcical. One aspect of the debate in particular is worth highlighting, as it puts into stark relief the tendentiousness of self-interested companies making claims about the public interestedness of their preferred policies: The debate over how and whether to limit the buying and aggregating of lower frequency (in this case 600 MHz) spectrum.
Popular Media
I have a new post up at TechPolicyDaily.com, excerpted below, in which I discuss the growing body of (surprisingly uncontroversial) work showing that broadband in the US compares favorably to that in the rest of the world. My conclusion, which is frankly more cynical than I like, is that concern about the US “falling behind” is a manufactured debate. It’s a compelling story that the media likes and that plays well for (some) academics.
Before the excerpt, I’d also like to quote one of today’s headlines from Slashdot:
“Google launched the citywide Wi-Fi network with much fanfare in 2006 as a way for Mountain View residents and businesses to connect to the Internet at no cost. It covers most of the Silicon Valley city and worked well until last year, as Slashdot readers may recall, when connectivity got rapidly worse. As a result, Mountain View is installing new Wi-Fi hotspots in parts of the city to supplement the poorly performing network operated by Google. Both the city and Google have blamed the problems on the design of the network. Google, which is involved in several projects to provide Internet access in various parts of the world, said in a statement that it is ‘actively in discussions with the Mountain View city staff to review several options for the future of the network.’”
The added emphasis is mine. It is added to draw attention to the simple point that designing and building networks is hard. Like, really really hard. Folks think that it’s easy, because they have small networks in their homes or offices — so surely they can scale to a nationwide network without much trouble. But all sorts of crazy stuff starts to happen when we substantially increase the scale of IP networks. This is just one of the very many things that should give us pause about calls for the buildout of a government-run or government-sponsored Internet infrastructure.
Another of those things is whether there’s any need for that. Which brings us to my TechPolicyDaily.com post:
In the week or so since TPRC, I’ve found myself dwelling on an observation I made during the conference: how much agreement there was, especially on issues usually thought of as controversial. I want to take a few paragraphs to consider what was probably the most surprisingly non-controversial panel of the conference, the final Internet Policy panel, in which two papers – one by ITIF’s Rob Atkinson and the other by James McConnaughey from NTIA – were presented that showed that broadband Internet service in the US (and Canada, though I will focus on the US) compares quite well to that offered in the rest of the world. […]
But the real question that this panel raised for me was: given how well the US actually compares to other countries, why does concern about the US falling behind dominate so much discourse in this area? When you get technical, economic, legal, and policy experts together in a room – which is what TPRC does – the near consensus seems to be that the “kids are all right”; but when you read the press, or much of the high-profile academic literature, “the sky is falling.”
The gap between these assessments could not be larger. I think that we need to think about why this is. I hate to be cynical or disparaging – especially since I know strong advocates on both sides and believe that their concerns are sincere and efforts earnest. But after this year’s conference, I’m having trouble shaking the feeling that ongoing concern about how US broadband stacks up to the rest of the world is a manufactured debate. It’s a compelling, media- and public-friendly, narrative that supports a powerful political agenda. And the clear incentives, for academics and media alike, are to find problems and raise concerns. […]
Compare this to the Chicken Little narrative. As I was writing this, I received a message from a friend asking my views on an Economist blog post that shares data from the ITU’s just-released Measuring the Information Society 2013 report. This data shows that the US has some of the highest prices for pre-paid handset-based mobile data around the world. That is, it reports the standard narrative – and it does so without looking at the report’s methodology. […]
Even more problematic than what the Economist blog reports, however, is what it doesn’t report. [The report contains data showing the US has some of the lowest cost fixed broadband and mobile broadband prices in the world. See the full post at TechPolicyDaily.com for the numbers.]
Now, there are possible methodological problems with these rankings, too. My point here isn’t to debate over the relative position of the United States. It’s to ask why the “story” about this report cherry-picks the alarming data, doesn’t consider its methodology, and ignores the data that contradicts its story.
Of course, I answered that question above: It’s a compelling, media- and public-friendly, narrative that supports a powerful political agenda. And the clear incentives, for academics and media alike, are to find problems and raise concerns. Manufacturing debate sells copy and ads, and advances careers.
TOTM
Susan Crawford recently received the OneCommunity Broadband Hero Award for being a “tireless advocate for 21st century high capacity network access.” In her recent debate with Geoffrey Manne and Berin Szoka, she emphasized that there is little competition in broadband or between cable broadband and wireless, asserting that the main players have effectively divided the markets. As a result, she argues (as she did here at 17:29) that broadband and wireless providers “are deciding not to invest in the very expensive infrastructure because they are very happy with the profits they are getting now.” In the debate, Manne countered by pointing to substantial investment and innovation in both the wired and wireless broadband marketplaces, and arguing that this is not something monopolists insulated from competition do. So, who’s right?