NET NEUTRALITY OR NET NEUTERING Should Broadband Internet Services Be Regulated?

edited by

Thomas M. Lenard Randolph J. May

NET NEUTRALITY OR NET NEUTERING: SHOULD BROADBAND INTERNET SERVICES BE REGULATED?

edited by

Thomas M. Lenard and Randolph J. May The Progress & Freedom Foundation

THE PROGRESS & FREEDOM FOUNDATION

Springer

Library of Congress Control Number: 2006924591

ISBN-10: 0-387-33929-9 ISBN-13: 978-0-387-33929-0

e-ISBN-10: 0-387-33928-0 e-ISBN-13: 978-0-387-33928-3

© 2006 Springer Science+Business Media, LLC. All rights reserved. This work may not be translated or copied in whole or in part without the written permission of the publisher (Springer Science+Business Media, Inc., 233 Spring Street, New York, NY 10013, USA), except for brief excerpts in connection with reviews or scholarly analysis. Use in connection with any form of information storage and retrieval, electronic adaptation, computer software, or by similar or dissimilar methodology now known or hereafter developed is forbidden. The use in this publication of trade names, trademarks, service marks and similar terms, even if they are not identified as such, is not to be taken as an expression of opinion as to whether or not they are subject to proprietary rights. Printed in the United States of America. 9 8 7 6 5 4 3 2 1 springer.com

Contents

Foreword
Randolph J. May and Thomas M. Lenard

1 Distribution, Vertical Integration and The Net Neutrality Debate
Thomas M. Lenard and David T. Scheffman

2 Network Neutrality and Competition Policy: A Complex Relationship
Christopher S. Yoo

3 Are "Dumb Pipe" Mandates Smart Public Policy? Vertical Integration, Net Neutrality, and the Network Layers Model
Adam Thierer

4 The Importance of Open Networks in Sustaining The Digital Revolution
Mark Cooper

5 Local Broadband Access: Primum Non Nocere or Primum Processi? A Property Rights Approach
Bruce M. Owen and Gregory L. Rosston

6 Open Access Arguments: Why Confidence is Misplaced
Joseph Farrell

About the Authors

Foreword

Randolph J. May and Thomas M. Lenard
The Progress & Freedom Foundation

Most of the papers in this book were originally presented at a June 2003 Progress & Freedom Foundation conference entitled, "Net Neutrality or Net Neutering: Should Broadband Internet Services Be Regulated." As we now publish the suitably updated collection of papers, along with two others, the title remains entirely appropriate. For while calls to mandate rights of access to the broadband networks of cable operators, telephone companies, and other facilities-based broadband providers might ebb and flow, as we write this, the tide is running high. So persistent are calls for mandatory network access rights in the communications world that a book that explores the various facets of Net Neutrality is not likely to be soon outdated.

The Policy Statement released by the Federal Communications Commission in September 2005 in its long-running proceedings to establish an appropriate regulatory framework for cable operator and telephone company-provided broadband services describes the bundle of "rights" commonly understood to be encompassed under the rubric of Net Neutrality: (1) consumers are entitled to access the lawful Internet content of their choice; (2) consumers are entitled to run applications and services of their choice; (3) consumers are entitled to connect their choice of legal devices that do not harm the network; and (4) consumers are entitled to competition among network providers, application and services providers, and content providers.1 These rights are generally supported by a coalition that includes consumer groups, such as Consumers Union and the Consumer Federation of America, non-facilities-based Internet Service Providers such as Earthlink, and suppliers of Internet content such as Yahoo, Amazon and Google.

1 Appropriate Framework for Broadband Access to the Internet over Wireline Facilities, FCC 05-151, CC Docket No. 02-33, September 23, 2005.


In a broad sense, the Net Neutrality debate is about whether law and regulation should dictate completely "open" or "dumb" broadband networks or whether the degree of "openness" should be left to the discretion of the network operator in light of marketplace imperatives. At the present time, the FCC's statement of the four Net Neutrality principles is characterized as "guidance," not rules in the sense of positive law. But the agency concludes its policy statement by observing: "To foster creation, adoption and use of Internet broadband content, applications, services and attachments, and to ensure consumers benefit from the innovation that comes from competition, the Commission will incorporate the above principles into its ongoing policymaking activities."2

Perhaps not surprisingly, it did not take long for the FCC to make good on its promise that it would incorporate the Net Neutrality principles into its ongoing policymaking activities. When the FCC approved the mergers of SBC Communications, Inc. and AT&T Corp. and Verizon Communications Inc. and MCI, Inc. in October 2005, it incorporated into its approval a condition requiring that the merger applicants "conduct business in a way that comports with the Commission's Internet policy statement issued in September."3 So, within two months of their promulgation, the FCC found the first occasion to incorporate the Net Neutrality principles "into its ongoing policymaking activities."

The FCC is not alone in considering whether and how to respond to the ongoing calls for Net Neutrality. In light of the technological and marketplace changes that have taken place since passage of the Telecommunications Act of 1996, Congress is in the process of considering revisions to our communications law. The first two legislative proposals of any consequence—including one by the staff of the House Energy and Commerce Committee, which has jurisdiction in this area—both contain guaranteed network access rights of the type embodied in the FCC principles. So those who seek mandatory access rights to broadband networks are actively pushing their cause. Yet, as timely as the Net Neutrality issue is today, it is by no means new.

2 Id., at 3.
3 News Release, "FCC Approves SBC/AT&T and Verizon/MCI Mergers," October 31, 2005. The FCC characterized the conditions it imposed, including the one relating to Net Neutrality, as "voluntary commitments." Of course, the applicants were anxious to have the Commission approve the proposed mergers without any further delay. For two articles explaining how the FCC uses—or, perhaps put more bluntly, sometimes abuses—the merger approval process to impose "voluntary" conditions that do not directly relate to any claimed competitive impacts uniquely associated with the proposed merger, see Randolph J. May, Telecom Merger Review: Reform the Process, NATIONAL LAW JOURNAL, May 30, 2005, at 27; Randolph J. May, Any Volunteers?, LEGAL TIMES, March 6, 2000, at 62.


Whether there should be mandated access rights of one form or another is a recurring question in "network" industries in general and the communications sector in particular.4 While the call for mandated network access assumes different names at different times, the change in terminology should not confuse the underlying issues at stake.

For roughly the first three quarters of the twentieth century, the nation's telecommunications marketplace was dominated by AT&T. Before the 1984 breakup of the integrated Bell System in compliance with the antitrust consent decree in U.S. v. AT&T,5 no one seriously disputed AT&T's market power in the local telephone market. Thus, when the FCC fashioned its landmark Computer II regime in the early 1980s, as the previously separate communications and data processing markets began to converge to enable the creation of a new online services market, it was not surprising that the new regime imposed on AT&T a non-discrimination requirement and safeguards intended to enforce it.6

The new online services, such as Telenet, Tymnet and CompuServe, were almost entirely dependent upon the local transmission facilities of AT&T for transport of the then newly emerging applications, such as e-mail, and data storage and retrieval, that combined some form of computer processing with basic transmission into what were called enhanced services. There was widespread agreement that, given its market power, AT&T had both the incentive and the ability to discriminate against its newly emerging enhanced services competitors. So, the FCC drew what it characterized at the time as a "bright line" between what it called "basic" and "enhanced" services,7 and it mandated that AT&T could offer enhanced services only through a fully separate subsidiary. This so-called structural safeguard was the means by which the agency enforced the requirement that AT&T's competitors were entitled to access AT&T's basic local network facilities on the same terms and conditions as AT&T itself.

4 A common feature of "network industries" as that term is used here is that these industries exhibit increasing returns to scale in consumption. This characteristic is commonly called "network effects".
5 United States v. AT&T, 552 F. Supp. 131 (D.D.C. 1982), aff'd sub nom. Maryland v. United States, 460 U.S. 1001 (1983).
6 See Second Computer Inquiry, Final Decision, 77 F.C.C. 2d 384 (1980).
7 "Basic service" was defined as the offering of a "pure transmission capability over a communications path that is virtually transparent in terms of its interaction with customer supplied information." 77 F.C.C. 2d at 420. "Enhanced services" were defined as services that combine "basic service with computer processing applications that act on the format, content, code, protocol or similar aspects of the subscriber's transmitted information, or provide the subscriber with additional, different, or restructured information, or involve subscriber interaction with stored information." 77 F.C.C. 2d at 387.


The Computer II decision and its open access requirement were predicated on the existence of AT&T's market power. But even in 1980 there was a hint of the anticipated changes in the market environment that were on the horizon. While referring to the "existing ubiquity" of AT&T's network, the Commission stated that "technological trends suggest that hard-wired access provided by the telephone company will not be the only alternative... ."8

In 1984, AT&T divested itself of the local operating companies, and the Computer II structural safeguards were applied to the "Baby Bells." In 1986, in the Computer III proceeding, the FCC replaced the structural separation requirement with a new set of non-structural safeguards intended to enforce the non-discrimination mandate. At the heart of this new network access regime was a set of requirements applicable to the local telephone companies called "Open Network Architecture" or ONA.9 ONA required the local companies to unbundle elements of their networks as a means of ensuring that competitive information service providers had access to the network on the same terms and conditions as the operating company. This was followed, under the Telecommunications Act of 1996, by another new regime of network access mandates—Unbundled Network Elements or UNEs. The intent was to prevent the Bell companies from using their presumed market power to disadvantage competitors who were still presumed to need access to the Bells' networks in order to compete. The effort of the FCC to implement this particular access regime led to an eight-year saga that met judicial reversal at every turn.10

Thanks in large part to the technological advances spurred by the digital revolution, the Commission's suggestion in the Computer II proceeding that alternatives would develop to the "last mile" networks of the local telephone companies has proved true. We are well into a rapid transition from a narrowband world into one in which access to broadband services is increasingly ubiquitous. There may be debate concerning the current competitiveness of the broadband marketplace and the extent of market power of any of the various broadband providers. But it is very difficult to argue with the FCC's assertion in 2002, when it initiated the rulemaking proposing to reclassify telephone company-provided broadband services as information services, that there are now "very different legal, technological and market circumstances" than when the agency "initiated its Computer Inquiry line of cases."11

8 77 F.C.C. 2d at 468.
9 Third Computer Inquiry, Report and Order, 104 F.C.C. 2d 958 (1986).
10 The long and short of it is that at least three times courts held that the FCC regulations required excessive network unbundling that was not consistent with the statutory directive contained in the 1996 Act. For some of the history of the long-running UNE litigation in its final throes, see United States Telecom Ass'n v. FCC, 359 F.3d 554 (D.C. Cir. 2004).


In the three years since that FCC observation (one of many, of course) the pace of technological and marketplace change has continued to accelerate. Broadband networks have vastly more bandwidth available than previously and, as the FCC recently observed, this greater bandwidth encourages the introduction of services "which may integrate voice, video, and data capabilities while maintaining high quality of service."12 The Commission goes on to add that, in a digital world "it may become increasingly difficult, if not impossible, to distinguish 'voice' service from 'data' service, and users may increasingly rely on integrated services using broadband facilities delivered using IP rather than the traditional PSTN (Public Switched Telephone Network)."13

One only has to scan the daily newspaper—either the one in hand or online—to see the alacrity with which cable operators, telephone companies, satellite operators and wireless service providers all are racing to offer integrated packages of voice, video, and Internet access services. Other potential broadband operators, such as power companies, lurk on the sidelines as potential competitors. It is in this rapidly evolving competitive environment that calls for Net Neutrality mandates—really nothing more than old-fashioned Computer II-like non-discrimination access requirements—continue to be made.

We hope that this book's collection of papers proves useful to policymakers, communications industry participants, and others as they consider the various aspects of the Net Neutrality debate. The essays by Tom Lenard and David Scheffman; Christopher Yoo; Adam Thierer; and Bruce Owen and Greg Rosston set forth the arguments against the imposition of Net Neutrality mandates in today's environment, while Mark Cooper's paper argues strenuously in favor of imposing such requirements. In his essay, Joe Farrell examines arguments on both sides and explains why he has doubts about the positions of those either favoring or opposing Net Neutrality mandates. In the end, relying on what he still sees as the marketplace uncertainties, and with no apologies for the "consciously inconclusive tone" of his paper, he suggests we "could apply the open access rule to one, but not both, of the two main broadband pipes." In that way, "we would in some sense be making the right choice as to one, rather than hoping to get both right but fearing to get both wrong."

11 Appropriate Framework for Broadband Access to the Internet over Wireline Facilities, Notice of Proposed Rulemaking, 17 F.C.C. Rcd. 3019, 3038 (2002).
12 IP-Enabled Services, Notice of Proposed Rulemaking, 19 F.C.C. Rcd. 4863, 4876 (2004).
13 Id.


With all due respect to the virtues of doubts, in our view, the arguments against a Net Neutrality mandate are substantially stronger than those on the other side. Regardless, however, of anyone's current beliefs or doubts, we are convinced that these essays collectively provide a wealth of information and a diversity of views that will contribute to the understanding of the Net Neutrality issue. They discuss the history of access mandates; the current and projected state of the broadband marketplace; the impact of access mandates on investment in new broadband facilities, applications and content; the relationship of Net Neutrality mandates to the preservation of property rights and an open marketplace of ideas; the costs and benefits associated with regulating or not; and much more. In short, after perusing the essays contained in this book, even if he or she still harbors some of the doubts that lead Joe Farrell to adopt a "consciously inconclusive tone," the reader surely will be in a much better position to make up his or her own mind in the important debate about Net Neutrality. If we are right on this score, the book will have achieved the goal which we set for ourselves.

Finally, we want to thank Marie Ryan, PFF Research Assistant, and Michael Pickford, former PFF Research Associate, for valuable research and editorial assistance. Marie Ryan, Amy Smorodin, PFF's Director of Communications, and Brooke Emmerick, Special Events and Publishing Coordinator, all deserve thanks for their work in formatting and producing the final version of this book. And thanks too to Jane Creel, PFF's Director of Finance and Operations, and Ray Gifford, PFF's President, for providing help in all the usual ways that are required to support a project like this from start to finish. While much credit is due to the PFF staff, and, of course, to the contributing authors for producing what we are confident will be a valuable contribution to the Net Neutrality debate, the responsibility for any errors remains our own.

Randolph J. May
Thomas M. Lenard
Washington, DC

Chapter 1 Distribution, Vertical Integration and the Net Neutrality Debate

Thomas M. Lenard and David T. Scheffman The Progress & Freedom Foundation

I. INTRODUCTION

The issue of whether to adopt a mandatory net neutrality policy—i.e., to subject broadband providers to an open-access requirement that would prohibit them from discriminating against content providers—is one of the most important and controversial Internet policy issues before the FCC and the Congress. Despite the high levels of penetration achieved in recent years,1 broadband is still at a relatively early stage of development. Indeed, the long-hoped-for promise of broadband has yet to arrive and it is still not clear what "it" will be when it does arrive. Broadband providers, content providers and technology companies of various kinds are all placing expensive and risky bets on new technologies and "programming models" and will be doing so for many years to come. Obviously, the regulatory environment affects incentives to make these investments and to implement business models that could be successful and beneficial to consumers. Moreover, the issue of whether to impose a net neutrality requirement is not just a "regulatory" or a "telecom" issue, important as these issues are. Internet policy can have a significant effect on the broader macro-economy as well, because the Internet is a key element of the information and communication technology (ICT) sector, which has been the principal factor in the extraordinary performance of the U.S. economy during the last decade.2

1 Federal Communications Commission, High-Speed Services for Internet Access: Status as of December 31, 2004 (July 2005).


This paper argues that the case for net neutrality regulation is weak for a number of reasons:

• Broadband—and what the actual contours of viable business models will look like—is still in its infancy. The issue here is nothing like the issue of traditional telecom regulation, which was concerned with established, relatively stable products and services.



• Broadband is a distribution business and arrangements that are not neutral with respect to the products being distributed—in this case, content and applications—are typical of distribution businesses. In fact, "non-neutral" business models are likely to be necessary to provide sufficient incentives to invest, both in content and the distribution infrastructure itself. Such investments, probably large and risky, are going to be required to develop business models that are viable and achieve some of the apparent promise of an eventual broadband era.



• Classical telephony was an electronic communication distribution business. Broadband is already—and in the future will be much more of—a media business. The economics of the media business are fundamentally different from the economics of classic telephony. Media distribution businesses provide content or programming in order to add subscribers in a way that is profitable. Put differently, inclusion is more characteristic of these businesses than exclusion. This does not mean that there are no conceivable exclusion issues, but now is certainly not the time to focus on alleged issues of "fairness" or abstract conceptions of "competition." The primary objective should be to not hinder the development of successful business models that can achieve some of the promise of broadband.

2 For an analysis of the impact of IT on past and future growth and productivity, see Dale W. Jorgenson, Mun S. Ho and Kevin J. Stiroh, Projecting Productivity Growth: Lessons from the U.S. Growth Resurgence, Federal Reserve Bank of New York, March 15, 2002.


• Even with two major providers in local broadband markets, there is evidence of a lot of competition. Under these conditions, it is difficult to envision circumstances where it would be in the interest of a broadband provider to foreclose access to any valuable content or application. Moreover, there is substantial investment in new broadband technologies, suggesting that additional competition is on the way.



• Content providers have multiple outlets for their products, because the market for content is national (or even international) in scope. This makes it even more unlikely that content that is valuable to consumers could be foreclosed from the market.



• Even if broadband was a monopoly, which it clearly is not, the case for net neutrality regulation would not be automatic. Monopolies often have the incentive to behave efficiently with respect to the vertical decisions that would be affected by such regulation.3 While there are exceptions to this conclusion, these exceptions do not appear to be applicable to the current markets for broadband and content, because they are not monopolies.

The outline of the paper is as follows: Section II briefly outlines the current regulatory debate and the arguments made by net neutrality proponents. Those arguments depend importantly on the competitive environment for broadband, which we describe in Section III. In Section IV, we discuss some characteristics of the distribution business generally, and how they apply to the broadband sector and the net neutrality debate. In Section V, we discuss some circumstances in which access regulation might be justified and argue that these circumstances are not applicable in the current broadband environment. Section VI offers some conclusions.

3 The economic arguments demonstrating this are well summarized in Joseph Farrell and Philip J. Weiser, Modularity, Vertical Integration, and Open Access Policies: Toward a Convergence of Antitrust and Regulation in the Internet Age, 17 HARVARD J. OF L. AND TECH. 86.

II. THE REGULATORY DEBATE

At the present time, different broadband technologies are subject to different regulatory regimes, although this is about to change. DSL (digital subscriber line), because it is part of the legacy telecommunications regulatory world, has been subject to common-carrier, open-access obligations. Its closest competitor, cable modem, has not been subject to such obligations generally, although the Federal Trade Commission (FTC) imposed some open-access obligations on Time Warner as a condition of approving its merger with AOL. New technologies that might be capable of providing broadband access are also exempt from such obligations. The Supreme Court's recent decision in the Brand X case upheld the FCC's policy of declining to impose open-access regulation on cable broadband. The FCC subsequently indicated its intention to issue an order that would remove the existing obligations on DSL and place the two services on an equal footing.4

The principal commercial supporters of an open-access requirement are content and application companies, such as Amazon, Microsoft, Google, Yahoo and Vonage, who express concern that broadband providers may discriminate against them and in favor of their own content. Two recent cases involving denial of access to independent VoIP providers have served to make the net neutrality issue more concrete. In one, Madison River Communications, a rural local exchange carrier, was found to have blocked VoIP calls. Vonage complained to the FCC and the company has agreed to refrain from this practice in the future and to pay a $15,000 fine. The second case involves Clearwire, a company that is starting to provide broadband services using WiMax in about 15 cities. Clearwire has signed a deal making Bell Canada its exclusive VoIP partner. As part of the deal, Bell Canada is investing $100 million in Clearwire.

The principal academic proponents of open access are technologists and legal scholars led by Stanford Professor Lawrence Lessig.5 According to their view, the current "end-to-end" architecture of the Internet has been critical to its success. That architecture is often described as consisting of four layers—the physical, logical, applications and content layers. The physical layer consists of the "pipes" through which all this information flows. These pipes are "dumb" and should remain so.

4 In October 2005, SBC and Verizon agreed to comply with net neutrality principles as a condition of obtaining approval from the FCC of their mergers with AT&T and MCI, respectively.
5 Mark A. Lemley and Lawrence Lessig, The End of End-to-End: Preserving the Architecture of the Internet in the Broadband Era, 48 UCLA LAW REVIEW 4 (April 2001).


Resting on top of the physical layer is the logical layer, which relies on protocols like TCP/IP, which are open and non-proprietary and also should remain so. All of this is to facilitate innovation at the edge of the network—in the applications and content layers—which is where the "intelligence" is and should remain concentrated. An open-access requirement is needed to maintain this end-to-end architecture in order to continue to facilitate innovation in the future. Lessig refers to this network architecture as an "innovation commons," which should be freely accessible by anyone adhering to its protocols and standards. As he has written, "[m]y concern is not whether the technology that "pipe" owners use is proprietary or not. My concern is how those technologies alter the environment for innovators and developers at the edge of the network."6 Broadband providers can pretty much do what they want as long as "they do so in ways that do not interfere with other network functionality, conflict with network values, or create negative externalities for the Internet generally. The Internet was meant to be extended. So long as any extension respects Internet values, I have no problem with it."

The economic argument in favor of open access to broadband providers is similar to arguments for open-access requirements in other network industries, including traditional telephony, electricity and even software: Last-mile broadband providers have market power, which they will use to discriminate against content that is not their own.7 Consumers will be deprived of choices they should have and some content providers will be unable to access their customers. All this will be adverse to economic efficiency.

While the existence of market power—generally, in the form of either a monopoly or a duopoly—is a necessary condition for regulatory intervention to be welfare enhancing, it is by no means sufficient. As discussed in Section V, below, monopolists often will do the right (i.e., the efficient) thing with respect to vertical arrangements. Moreover, the difference between markets with two and three sellers—often thought to represent a stark dividing line between non-competitive and competitive markets—is not always so clear. Markets dominated by two sellers can be characterized by intense competition, especially when those markets are rapidly evolving and customers are "sticky"—i.e., likely to be long-term subscribers. Indeed, that seems to be the case with local markets for broadband, which currently are dominated by two providers—the local telephone and cable companies. As the next section suggests, there is a significant likelihood that new entrants will emerge to compete with these companies, especially if investment incentives are not stifled by regulation.

6 Lawrence Lessig, Coase's First Question, REGULATION, The Cato Institute, Fall 2004.
7 See Mark Cooper, this volume.


However, even in the absence of new competitors, the cable and telephone companies are competing aggressively with each other in an attempt to increase their market shares by improving service (higher speeds) and lowering prices. Evidence of this competition is summarized in the next section.

III. CURRENT COMPETITIVE ENVIRONMENT FOR BROADBAND

As we have discussed, the market for broadband is still at an early stage and it is as yet unclear how it will develop or even how the "market(s)" of the future will be defined. It is clear, however, that most providers operating in what we currently think of as local markets for broadband face fairly intense competition for customers. The FCC collects data on broadband providers by zip code, which provide a partial picture of the level of competition within this sector.8 Ninety-five percent of zip codes have at least one high-speed Internet provider offering services; 83 percent have two or more; 67 percent have three or more; and 52 percent have four or more providers. The number of competitors is rising over time. For example, the number of zip codes with four or more competitors increased by more than four percentage points from the end of 2003 to the end of 2004.

The number of providers operating in a zip code undoubtedly overstates the number of choices available to individual customers—particularly, residential customers. Residential customers that have a choice typically have a choice between two providers—the cable company and telephone company. This is consistent with the FCC data (see table below) showing that (as of the end of 2004) cable and DSL controlled almost 93 percent of the market. Business customers may have more choices. There are also a number of new technologies attempting to gain a foothold in the market and become the third or fourth broadband pipe.

Technology              Lines (millions)   Percent of Total
ADSL                         13.82              36.5
Other Wireline                1.47               3.9
Coaxial Cable                21.36              56.4
Fiber                         0.70               1.8
Satellite or Wireless         0.55               1.5
Total                        37.89             100.0

Source: FCC, High-Speed Services for Internet Access: Status as of December 31, 2004

8 Federal Communications Commission, High-Speed Services for Internet Access: Status as of December 31, 2004 (July 2005).
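As a quick aid to reading the table above, the "Percent of Total" column follows directly from the reported line counts. The short Python sketch below simply recomputes those shares; the line counts are the FCC figures quoted in the text, while the one-decimal rounding convention is an assumption made here for display.

```python
# Illustrative only: recompute the "Percent of Total" column of the FCC
# high-speed line table from the reported line counts (in millions).
lines_millions = {
    "ADSL": 13.82,
    "Other Wireline": 1.47,
    "Coaxial Cable": 21.36,
    "Fiber": 0.70,
    "Satellite or Wireless": 0.55,
}

total = sum(lines_millions.values())  # ~37.9 million high-speed lines

for technology, lines in lines_millions.items():
    share = 100.0 * lines / total
    print(f"{technology:<22} {lines:>6.2f}  {share:>5.1f}%")

print(f"{'Total':<22} {total:>6.2f}  {100.0:>5.1f}%")
# The summed total (37.90) differs slightly from the published 37.89,
# reflecting rounding in the source figures.
```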


As the number of broadband subscribers has increased, so has the amount of money spent on high-speed Internet access. According to the Telecommunications Industry Association, spending on high-speed Internet access services in the U.S. reached $17 billion in 2004, a 30.5-percent increase over 2003. TIA further predicted that consumer spending on high-speed Internet access services will reach $24.8 billion in 2007.9 These figures include spending on DSL, cable, fixed wireless, FTTH (fiber to the home), satellite and 3G (third-generation) wireless.

A. Cable Modem and DSL

Most broadband today is either cable modem service provided by the local cable company or DSL service provided by the telephone company. Cable broadband infrastructure utilizes a hybrid-fiber-coaxial (HFC) architecture and is most commonly used for residential service. DSL is a copper-based technology that allows the telephone carrier to add certain electronics to the copper loop used for voice so that it can serve as a conduit for both voice and high-speed data traffic. There are a number of variations of DSL service. The primary residential DSL service is asymmetric DSL (ADSL). As of the end of 2004, cable accounted for 21.36 million high-speed lines (56 percent of the total) and ADSL for 13.82 million lines (37 percent of the total). "Other wireline," which includes symmetric DSL services the telephone companies provide to business customers, accounts for 1.47 million lines (3.9 percent of the total).10

Recently, DSL has been growing slightly more rapidly and has taken some market share from cable. From December 2003 to December 2004, the market share of ADSL grew by 2.8 percentage points, while the share of cable fell by almost 2 percentage points. Other wireline technologies and fiber each lost less than one percentage point of market share, while satellite and wireless gained 0.2 percentage points.

In the competition for market share, cable and DSL providers are pricing aggressively. DSL has historically been cheaper than cable broadband, as cable companies have been more inclined to increase services while holding prices constant rather than cut prices.

9 Spending on High-Speed Internet Access to Rise 31% to $17 Billion in 2004, TIA Predicts, Telecommunications Industry Association Press Release, February 26, 2004.
10 Id.

The price gap has since been shrinking, however, as the average price of cable broadband service worldwide is now $30.43 a month, compared to $28.66 a month for DSL. Between Q2 2004 and Q4 2004, DSL prices dropped an average of 5 percent, while cable prices fell an average of 10 percent over the same period.11 Recently, cable and DSL providers have undertaken another round of price cuts and special offers, instigated by SBC Communications offering DSL service for $14.95 a month. This has led to a string of promotions by major rivals, including Comcast offering service for $14.95 a month for three months in selected areas, Cox Communications offering high speed service for $24.95 a month for three months, down from $39.99, and Verizon offering DSL for $19.95 a month for three months, $10 off the regular price.12 Verizon has also begun offering a slower DSL service for $14.95 a month (on a one-year contract) with speeds of 168 kbps down and 128 kbps up, faster than dial-up connections but below the speeds of Verizon's regular DSL, which offers 3 Mbps for downloads.13

B. Fiber

The intense competition underway for broadband customers is also reflected in the major fiber investments being made by the telephone companies. Fiber now accounts for about 700,000 high-speed lines or 1.8 percent of the total.14 It is growing as a broadband solution and as a competitor to cable and DSL because it offers substantially more capacity than copper-based solutions. Several major carriers have FTTH (fiber-to-the-home) deployments of various magnitudes underway.15 Verizon now offers a 5 Mbps/2 Mbps service for $39.95 a month and a 15 Mbps/2 Mbps service for $49.95 a month. By the end of 2005, Verizon plans to have FTTP (fiber-to-the-premises) services in 20 Metroplex cities.16

11 eMarketer, What Broadband Service Offers the Best Value?, February 25, 2005.
12 Dionne Searcey, The Price War For Broadband Is Heating Up, THE WALL STREET JOURNAL ONLINE, June 29, 2005.
13 Justin Hyde, Verizon to Offer Slower DSL to Gain Broadband Users, Reuters, August 22, 2005.
14 Federal Communications Commission, High-Speed Services for Internet Access: Status as of December 31, 2004 (July 2005).
15 Of the three types of FTTH, the most common architecture used is Passive Optical Network (PON) technology. This technology allows multiple homes to share a passive fiber network in which the plant between customers and the head end consists entirely of passive components (no electronics are needed in the field). The other architectures used are home run fiber, or point-to-point fiber, in which subscribers have dedicated fiber strands, and hybrid PONs, which are a combination of home run and PON architecture.


SBC has announced plans for its Project Lightspeed, a fiber network using both FTTP and fiber-to-the-node (FTTN) technologies. By the end of 2007, SBC expects to reach 17 million households with FTTN technology and nearly one million with FTTP. Through Project Lightspeed, SBC plans to deploy IPTV service trials for the fourth quarter of 2005. The project is expected to carry a three-year deployment cost of approximately $4 billion.17 Some carriers are constructing fiber-to-the-curb (FTTC) facilities, which employ copper lines for the final 500 feet from a pedestal where the fiber runs to the subscriber premises. These lines permit carriers to provide high-speed data in addition to high-definition video services. Through its FTTC deployment, BellSouth hopes to offer IPTV and video services by 2006. At the end of 2004, BellSouth had over 1 million FTTC subscribers and could add between 150,000 and 200,000 more in 2005.18

The costs of these investments are substantial. With estimated costs of $1,364 per household in urban areas and $2,705 per household in rural areas, the total U.S. price tag for FTTH networks would come to $233 billion.19
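A rough sense of how a nationwide figure of this size arises can be had from the per-household costs just cited. The sketch below is a back-of-the-envelope illustration only: the per-household costs come from the text, while the total of roughly 115 million U.S. households and the even urban/rural split are assumptions introduced here for illustration, not figures from the underlying estimate.

```python
# Back-of-the-envelope check on the ~$233 billion nationwide FTTH estimate.
# Per-household costs come from the text; the household counts below are
# illustrative assumptions, not figures from the underlying study.
COST_PER_HOUSEHOLD_URBAN = 1_364   # dollars (from the text)
COST_PER_HOUSEHOLD_RURAL = 2_705   # dollars (from the text)

households_urban = 57_500_000      # assumed
households_rural = 57_500_000      # assumed

total_cost = (households_urban * COST_PER_HOUSEHOLD_URBAN
              + households_rural * COST_PER_HOUSEHOLD_RURAL)

print(f"Estimated nationwide FTTH build-out: ${total_cost / 1e9:.0f} billion")
# Under these assumptions the result is roughly $234 billion, in line with
# the approximately $233 billion cited above.
```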

C. Satellite and Wireless

At the present time, satellite and wireless account for only about 550,000 lines, or about 1.5 percent of the total, but this situation could easily change. High-speed Internet access over satellite remains a nascent technology. DIRECWAY, the main U.S. satellite broadband provider, passed the 250,000-subscriber mark in Q2 2005. Satellite providers primarily serve home offices and small businesses not currently served by wireline or cable broadband, particularly in rural areas where running cable is cost-prohibitive or technologically difficult. DIRECWAY is available nationwide and costs $600 for installation and equipment and $60 a month for the service.20

16 Federal Communications Commission, Availability of Advanced Telecommunications Capability in the United States, Fourth Report to Congress (September 9, 2004). Prices for Verizon FTTH when bundled with phone service. If FTTH alone, prices are $5 more. See Stephen Lawson, Verizon Weaves Fiber-Based Broadband Offering, PC WORLD, July 21, 2004, available at: .
17 SBC Press Release, SBC Communications to Detail Plans for new IP-Based Advanced Television, Data and Voice Network, November 11, 2004.
18 Light Reading, Analysts See Tellabs Win at BellSouth, March 29, 2005.
19 Taking Fiber to the Home, Red Herring Research, February 2004, available at: .


Starband, another satellite broadband provider, charges a one-time fee ranging from $99.99 to $599 for equipment and a monthly fee ranging from $49.99 to $99.99.21

Terrestrial wireless, also in its early stages, comes in a variety of flavors and can be either fixed or mobile. Wireless technologies have a lot of potential to make substantial inroads in the market. Their success will depend in part on the government's spectrum policy—how much spectrum is made available and under what conditions. Wireless Fidelity (Wi-Fi), which can provide data at speeds of up to 54 Mbps, operates in the unlicensed 2.4 and 5 GHz radio bands. Wi-Fi allows wireless devices, such as a laptop computer or personal digital assistant (PDA) to send and receive data from any location within range of a Wi-Fi-equipped base station or access point (AP)—typically around 300 feet.22 Major telecom providers, including T-Mobile and Verizon, are installing networks of these "hotspots," sometimes in partnership with businesses, such as McDonald's and Starbucks. A number of municipalities are also planning Wi-Fi networks.

Wi-Fi is a technology best suited for short distances. WiMax, which operates over much greater distances, offers a better possibility of becoming the third broadband pipe into the home. WiMax employs a point-to-multipoint architecture operating between 2 GHz and 66 GHz and is capable of transmitting network signals covering in excess of 30 miles of linear service area, providing multiple shared data rates of up to 75 Mbps.23 Clearwire, a wireless broadband company using pre-WiMax technology to provide residential high-speed access, currently serves 15 areas and has three more planned. Clearwire uses OFDM (Orthogonal Frequency Division Multiplexing) transmission protocol on licensed 2.5 GHz spectrum, providing a more secure connection than Wi-Fi. OFDM eliminates the need for a direct line of sight between the transmitter and receiver and allows access to the Internet through "plug-and-play" modem devices. The monthly prices for OFDM services vary by carrier and speeds and range from $24.95 to $129.99.24

20 Federal Communications Commission, Availability of Advanced Telecommunications Capability in the United States, Fourth Report to Congress, September 9, 2004.
21 Information regarding prices for Starband, available at: .
22 Federal Communications Commission, Availability of Advanced Telecommunications Capability in the United States, Fourth Report to Congress (September 9, 2004).
23 Ibid.
24 Federal Communications Commission, Availability of Advanced Telecommunications Capability in the United States, Fourth Report to Congress (September 9, 2004).


In addition, after years of promises, a new generation of wireless broadband services—sometimes referred to as third-generation (3G) services—is now coming to the market.25 These mobile wireless broadband services allow high-speed access to the Internet and other data services from a cell phone, a PDA or a wireless-equipped laptop. A laptop equipped with the appropriate laptop card can get broadband without having to be near a Wi-Fi hotspot. Verizon Wireless began offering high-speed mobile Internet access in selected areas in October of 2003. As of July 2005, Verizon Broadband Access was available in 53 markets serving one-third of the U.S. population. Verizon plans to cover half of the U.S. population by the end of 2005. At the end of August, Verizon announced it was lowering its price for wireless broadband from $79.99 to $59.99. Both Sprint and Cingular charge about $80 for their services.26 Sprint also is improving its wireless broadband network, offering service to 14 metropolitan areas covering 92 million people by Q3 2005, with expansion planned for Q4 2005 as well as 2006. Cingular has announced the roll-out of its own wireless broadband network and will be up and running in 15 to 20 cities by the end of 2005, offering services to 40 million Americans.27 While Verizon and Sprint are offered over a cellular network architecture using Evolution-Data Optimized technology (EV-DO), Cingular uses universal mobile telephone system (UMTS). UMTS has one big advantage—it is compatible with most overseas 3G services, while EV-DO is not.

As is the case with fiber, updating the carriers' wireless networks to provide these advanced services is not cheap. Eastern Research estimates that upgrading U.S. wireless carriers' networks to 2.5G and 3G systems may cost in excess of $100 billion.28

25 See, for example, Seth Schiesel, For Wireless, The Beginnings of a Breakout, THE NEW YORK TIMES, January 13, 2005, E1.
26 Dionne Searcey, Verizon Wireless Cuts Broadband Price by 25%, THE WALL STREET JOURNAL ONLINE, August 29, 2005.
27 Ben Charny, 3G Wars Heat Up, July 20, 2005; and Sprint Touts High-Speed Wireless Service, CNET News, July 7, 2005.
28 Cost Optimization in Radio Access Networks as Mobile Carriers Migrate to 3G, Eastern Research, 2004.

D. Broadband Over Powerline


Broadband Over Power Line (BPL) is also a candidate for being the third (perhaps the fourth if WiMax or some other form of wireless is the third) broadband pipe into the home. BPL travels on medium-voltage transmission lines and can offer symmetric speeds up to 3 Mbps, with next-generation chipsets being developed to provide up to 100 Mbps. BPL is seen as an important new technology because the infrastructure is already in place to connect even the most rural areas to a viable broadband solution. The FCC opened the door to deployment of BPL in October 2004 by adopting technical standards.29 There have been a number of BPL pilot projects, including in Allentown, Pennsylvania and Potomac, Maryland. BPL is now being rolled out on a commercial basis in Cincinnati, Ohio by Cinergy. Manassas, Virginia has also begun to offer residential and commercial BPL service at a cost of $28.95 a month for residential and $39.95 a month for commercial service, offering asymmetric speeds of at least 300 kbps.30 These prices look less competitive as the cable and DSL providers lower their prices. The success of BPL may depend in part on its treatment by state regulatory commissions, which regulate electric utilities.31

IV. BROADBAND AS A DISTRIBUTION BUSINESS

As the preceding section indicates, the broadband sector is intensely competitive and there is a lot of downward pressure on prices. Broadband providers are of necessity spending hundreds of billions of dollars building out both wireline and wireless networks in order to offer a range of new and better services. These providers are betting that the demand for these services—which will be driven by the demand for the available content—will be sufficient to enable them to recoup their investments. The question is what kinds of business models will providers need to employ in order for this to happen.

Broadband is a distribution business with many characteristics in common with other distribution businesses. Consumers don't have a demand for broadband by itself; they want the goods and services—the content—that it provides.

29 Federal Communications Commission, ET Docket No. 04-37, Report and Order, October 28, 2004.
30 Thomas Hoffman, Plugged In: Broadband over Power Lines Goes Live, March 14, 2005, available at: .
31 The most contentious issues would revolve around the allocation of costs between the regulated electricity services and the unregulated broadband services, both provided over the same pipe.


In this section, we lay out some general propositions about the business of distribution that apply to the net neutrality debate. These propositions suggest the following: Behavior that might be viewed as discriminatory is common in distribution industries (despite the lack of market power). Nevertheless, distributors generally will not find it in their interest to block their customers from accessing goods and services that they find valuable. Finally, the ability to adopt business models that bundle distribution with content is critical to providing adequate incentives and perhaps funding to invest in both industries in the first place. An example of this is the Clearwire-Bell Canada deal, discussed above, in which Clearwire gave Bell Canada exclusive rights to distribute VoIP over its network in exchange for a $100-million investment.

Proposition 1: Distribution is generally bundled with content.

Consumer goods are primarily purchased from retailers. Retailers are generally the distributors (rather than the manufacturers) of the products they sell. When consumers purchase products from retailers they are purchasing the products and the distribution services (i.e., retailing) being provided. Similarly, when a consumer purchases a service, such as plumbing, the service comes "bundled" with the "distribution" (that is, providing the service at the consumer's residence). Finally, with respect to electronic media (e.g., radio and television), content and distribution are, of course, bundled. The analogy with broadband is obvious. Content is the product or service; the broadband connection and related transmission and access services are the distribution.

A little reflection makes clear that consumers overwhelmingly purchase products and services that are bundled with distribution. Typically the seller provides distribution in terms of bricks and mortar (retailers), delivery of services to the home (household services), or arranging shipment to the consumer (mail order and Internet commerce). That this is the case is hardly surprising. Consumers do not generally have an independent demand for distribution; rather, consumers have a demand for the goods and services made accessible by distribution.

Proposition 2: Typically, the distributor sells the bundle (rather than just distribution services).

Again, most consumer products are purchased through retailers. Retailers are the distributors, and choose the products that they sell. When consumers purchase directly from a "manufacturer" (for example, through mail or phone order), typically the "manufacturer" sells a bundle that includes transportation.


Again, this is because consumers do not generally have a demand for distribution, as opposed to the goods and services provided via distribution.

Proposition 3: At least some viable business models will require broadband to be bundled with content.

We would expect that broadband and content would be bundled based on general propositions about the bundling of consumer goods with distribution. The short history of broadband suggests that bundling may well be required to create a viable business model. Current and likely future trends in subscribership, absent significant advances in content, do not justify the large investments that are being made in broadband deployment. What is needed is a bundled product offering that is sufficiently attractive for consumers to become subscribers at rates that support the large infrastructure investments that are and will continue to be required. The bundled product offering is going to have to be "put together" and sold by the broadband provider. This may involve broadband providers going further than currently in being content creators. More importantly, it probably will require various contractual arrangements with content creators and providers. Indeed, vertical integration into programming was a key element in the development of cable television.32

There may be other analogies to cable television. We may well see different tiers of service—a "basic" broadband service with sufficiently attractive bundled content to drive basic broadband subscribership, and additional "premium" content that only a subset of subscribers will purchase. Some version of this scenario underlies the concerns of the net neutrality proponents. Specifically, the prospect of broadband providers becoming significant content providers raises concerns that they will favor their own content.

Proposition 4: Broadband has some relatively unusual features (value of incremental content/subscribers).

One thing that consumers can be expected to demand from broadband service is broad access to content. While increases in subscribership may be driven by breakthroughs in the creation of bundled broadband/content packages, most if not all consumers will also demand broad access to the Internet.

32 See Bruce Owen and Gregory Rosston, this volume.


The economics of broadband also make it strongly in the provider's interest to offer such broad access. This is because the incremental cost of subscribers (once they are wired) is quite small, so content that will drive incremental subscribership is likely to be profitable (depending on the full costs of the incremental content). We see this phenomenon in cable, where some of the largest cable providers are vertically integrated into content, but they offer competitive content as well. The negotiations and terms of offering of competitive content are sometimes "not pretty," but as we explain below, this is necessarily a by-product of competitive markets. Other things equal, it is always in a vendor's interest to favor its own products. But other things aren't equal. Competitors' content can increase subscribership, which is net beneficial.

Proposition 5: Competitive markets are typically not "fair and nondiscriminatory" and this is generally a benefit to consumers.

It is quite common for vendors that sell consumer goods and services directly to consumers to sell their own products and services along with those of other vendors. Supermarkets are one obvious example. Supermarkets sell their own products and services (private label, bakery, etc.) and sell the products and services of many other vendors. Supermarkets make decisions as to what products to sell and where they are placed on the shelves. Furthermore, supermarkets typically charge manufacturers for shelf space, in fees commonly called "slotting allowances." Supermarkets also receive various forms of incentives for special displays in prominent locations (e.g., the coveted end-aisle and checkout areas). Other sorts of deals and promotional allowances influence the allocation of shelf space. Of course, another very important determinant of shelf placement is how much a product will sell and the profit derived by the supermarket from its sale.

The typical supermarket clearly does not have market power. Nonetheless, the typical supermarket engages in practices that, from a classic regulatory perspective, might be characterized as unfair and discriminatory. These practices are key to the messy system by which a competitive marketplace allocates scarcity (shelf space, "consumer attention" span, etc.). There is not a shelf-space constraint in broadband, although there are technical capacity constraints. Nonetheless, we see an analogous messy competitive (and arguably unfair and discriminatory) process in Internet commerce—e.g., payments for first screen preferences, payments for buttons, pop-ups, etc. This is competition for access to potential customers—roughly analogous to payments for shelf space, shelf placement and special displays.


The Clearwire-Bell Canada deal—$100 million for an exclusive contract—is an example of this.

Finally, there are congestion problems. And this is occurring before there is substantial use of streaming video (which presumably will occur once a viable broadband business model is developed). Under congested conditions, efficiency may require charging a positive price for access to the network. This would be inconsistent with net neutrality principles and might also appear discriminatory from a regulatory perspective.

V. INNOVATION AT THE EDGES AND INNOVATION AT THE CORE

Net neutrality proponents focus on the harm that compromising the end-to-end principle would cause to innovation, which they maintain occurs at the edges of the network. On the other hand, there is a striking lack of concern about the effect their proposals might have on incentives to invest and innovate in the network itself—the physical and logical layers that form the "core" of the network. Our previous discussion shows what should be obvious—that it is critical for the future of the Internet to permit business models to evolve that support innovation both at the edges and at the core. As discussed in Section III, broadband providers already are spending literally hundreds of billions of dollars on the core—upgrading their networks in order to offer their customers better, higher-speed service. As discussed in Section IV, bundling these advanced distribution services with content may be necessary to create a viable business model and the appropriate incentives to invest in content—i.e., innovation at the edges. Instituting public utility-type regulation for the Internet would surely interfere with the development of such business models and be detrimental to incentives to invest and innovate in both the network and content.

A primary concern among net neutrality advocates, certainly one that Lessig emphasizes, is that innovation will suffer if access isn't open. Applications and content innovators will not only be deprived of a way to get their new products to consumers, but knowing this will be discouraged from innovating further. It is difficult to envision this happening in the current broadband environment. First, there is intense competition in local markets even with only two providers. A provider who denies access to content or applications that consumers find valuable will reduce the demand for its services. In addition, the market for content and applications is not the local broadband market.


The market for content is national, or even international, in scope, and content providers face a number of potential buyers. Innovations that are valuable to consumers will surely find a way to get to them.

Lessig suggests as a model for the Internet the electricity grid, where any appliance that meets basic standards can be plugged in and work. But, electricity provides a good example of exactly what not to do. The lack of investment and innovation in the electricity grid has for a number of years now been a widely recognized national problem.^^ In addition, the electricity grid does not work the way Lessig wants the Internet to work. Generators—analogous to content providers—pay for the privilege of injecting electrons into the grid and customers pay for the privilege of withdrawing them. Up until now, content providers have not paid for the privilege of sending their content over the Internet. Consumers typically pay a fixed price to the broadband provider for the privilege of withdrawing content, irrespective of volume. This model may change over time in both respects. Content providers may, under some models, pay for access to the network, just as grocery manufacturers pay for shelf space at the supermarket. They may pay for exclusive access, as in the Clearwire-Bell Canada arrangement. And, in a world in which capacity is not infinite, those consumers that use the network more intensively can be expected to pay for it.

Lessig's discussion suggests that if access to the network is auctioned off, this is somehow discriminatory. Obviously, access to the electricity grid needs to be "auctioned off"—i.e., sold in the market—because at times there is excess demand for the available capacity. Similarly, broadband is not in infinite supply and access to it might need to be priced. Moreover, covering the fixed costs of the grid by charging different prices to different users at different times is likely to be the efficient solution. As indicated above, such price discrimination, while efficient, is not consistent with principles of net neutrality.

^^ See Eric Hirst, Transmission Investment: All Talk and Little Action, PUBLIC UTILITIES FORTNIGHTLY, July 2004, and Eric Hirst and Brendan Kirby, Transmission Planning and the Need for New Capacity, prepared for National Transmission Grid Study, U.S. Department of Energy, December 2001.


VI.


EXCLUSIVE VERTICAL ARRANGEMENTS: ARE THERE MEANINGFUL EXCEPTIONS TO THE EFFICIENCY RULE?

The discussion in the previous sections indicates how common it is for distribution businesses—which, in Internet parlance, would be called "platforms"—to bundle distribution with "content" and vertically integrate into the content business in ways that might appear to be discriminatory when viewed from a regulatory perspective. Such behavior is characteristic of distribution businesses that operate in highly competitive markets and, indeed, appears to be integral to the development of viable distribution business models. Business models that incorporate these characteristics are likely to be particularly valuable for distribution businesses that require large investments—as is obviously the case with Internet infrastructure. While it is clear that vertical arrangements of virtually any kind in competitive markets are not cause for concern, it is also the case—as the Chicago School of antitrust analysis has taught us—that vertical arrangements, even of the exclusive variety, are frequently efficient even under monopoly. This is because, under any market structure, the platform provider has a strong incentive to maximize the value of the platform to consumers. To the extent that any general open-access requirement precluded these efficient arrangements, investment incentives and consumer welfare would suffer.

E.

Internalizing Complementary Efficiencies

The conditions under which exclusive vertical arrangements by a platform monopolist are efficient are described in the recent Farrell-Weiser paper, in which they introduce the term "internalizing complementary efficiencies" (ICE) to describe the phenomenon:^^

ICE claims that even a monopolist has incentives to provide access to its platform when it is efficient to do so, and to deny such access only when access is inefficient.^^

^^ Joseph Farrell and Philip J. Weiser, Modularity, Vertical Integration, and Open Access Policies: Toward a Convergence of Antitrust and Regulation in the Internet Age, 17 HARV. J. OF L. AND TECH. 86.

35 Id., at 89.


Thus, for example, Microsoft—which arguably has a monopoly in the personal computer operating system platform—still benefits from an efficient applications market, because an efficient applications market increases the value of the Windows operating system. Similarly, broadband providers benefit from having applications and content markets that maximize value to their consumers. Anything that detracts from user value will also reduce the demand (and hence the price that can be charged) for the platform. This logic also extends to the decision of whether to vertically integrate into adjacent markets: The platform monopolist will (with the exceptions noted below) have the incentive to do whatever maximizes user value. It typically will not be in the platform monopolist's interest to try to monopolize the adjacent market and exclude competitors' applications, because it already has a monopoly in the platform market and can charge consumers the monopoly price regardless of what happens in the applications market. The route to higher profits for the platform owner is a more valuable platform—and better applications and content contribute to that. Farrell and Weiser cite examples of platform owners deciding to stay out of the applications market in order to encourage more entry and competition by others, because such a strategy translates into increased value for the platform. In sum, "the platform monopolist gains from an efficient applications market—whether that be unbridled competition, integration without independents, licensing of a limited set of independents, or some attempt to combine these or other structures."^^
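The ICE logic can be made concrete with a small numerical illustration. The figures below are invented for exposition and are not drawn from Farrell and Weiser; they simply show why excluding an efficient independent application tends to shrink the surplus the platform owner can extract.

```python
# Toy illustration of the ICE intuition described above (hypothetical numbers).
# A consumer values the platform-plus-application bundle at 100; supplying the
# application costs 20, whoever provides it.
bundle_value = 100.0
app_cost = 20.0

# Open platform: a competitive applications market prices the application at
# cost, so the platform monopolist can charge up to the remaining surplus.
platform_profit_open = bundle_value - app_cost              # 80

# Exclusion: the monopolist shuts out independents and supplies a worse
# application itself; assume the inferior application lowers the bundle's
# value by 15. The monopolist now keeps the whole (smaller) margin.
platform_profit_closed = (bundle_value - 15.0) - app_cost    # 65

print(platform_profit_open, platform_profit_closed)          # 80.0 65.0
```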

F.

Exceptions to ICE

There are, however, exceptions to this rule. Platform monopolists do not always have the incentive to do the efficient thing. Farrell and Weiser note that while: ICE is often a persuasive argument...its logic admits several cogent exceptions. Unfortunately, regulators and commentators seldom do justice to the nuances of this principle: some ignore ICE, while others embrace it and underestimate its exceptions. Only by addressing both

36 Id., at 102.


ICE and its exceptions can regulators make full use of economics in analyzing open access requirements.^^

Farrell and Weiser list eight exceptions to the ICE logic. The question is whether any of these exceptions is likely to be important for the Internet platform under the conditions that currently prevail in the relevant markets for broadband and content. We discuss each of the exceptions in turn and show that none is likely to be applicable to the current broadband market.

Exception 1: The platform monopolist is subject to price regulation, but may be able to take monopoly profits in adjacent markets.

This exception is not relevant for the broadband market, because it is not currently subject to price regulation. This exception provides an additional reason not to adopt price regulation, should it ever be contemplated.

Exception 2: Participating in the adjacent market can facilitate price discrimination.

Price discrimination can (as Farrell and Weiser note) be either helpful or harmful and it is likely to be difficult for any regulator to make the distinction with any confidence. Indeed, our discussion in Section IV indicates that distribution industries are likely to be characterized by beneficial price discrimination as well as pricing that may look discriminatory but, in fact, is not.

Exception 3: A competitor in the adjacent market can threaten the primary monopoly.

This is essentially what the courts found in the Microsoft antitrust case—that Microsoft had undermined Netscape's browser, because Netscape constituted a threat to the dominance of the Windows operating system. Similarly, this is the argument Lessig makes with a couple of examples. If, for example, video producers were to broadcast their programs over the Internet, this would cut into the cable companies' core business. So, the cable companies might not want to make this service available to their broadband customers and might then block their cable modem customers from receiving the new competitive video service. A similar situation

37 Id., at 89.


could occur with VoIP, which is competitive with the telephone companies' core business. This might give them a strong incentive to prevent their DSL customers from using such a service.^^

This exception is potentially important and could be of some concern, but, it would seem, only if the platform provider is truly a monopolist. This apparently was the case with Madison River, which does not face broadband competition in at least some of its markets. In these circumstances, competitive VoIP services could threaten Madison River's voice monopoly and blocking access to these services could be in the company's interest, even if inefficient. It would not necessarily be inefficient if, for example, Madison River decided to offer its own VoIP service and bundling that service with distribution could be done at a lower cost.

Given the discussion of the broadband market structure above, the Madison River case is an exception. The overwhelming majority of local broadband markets are not monopolies. A relatively pessimistic scenario is that local broadband markets will continue to be dominated by two strong players—cable modem and DSL—competing vigorously for subscribers. Under this scenario, excluding content that consumers want would not seem to be a good business strategy. It is hard, for example, to envision the Microsoft campaign against Netscape if Microsoft had even one significant operating system competitor. When streaming video starts to be competitive with television as an entertainment medium, the cable companies won't be able to stop it, much as they might want to, because the streaming video providers will have other alternatives—at a minimum, the telephone companies' broadband services. If the cable companies were foolish enough to try to block the new service, they would see a rapid exodus of customers. Similarly, if the DSL providers were to block VoIP, they would lose a lot of business to the cable companies.

Clearwire is not blocking independent VoIP services in order to protect a primary monopoly, because it has no primary monopoly to protect. If the VoIP service that Clearwire provides through its agreement with Bell Canada is inferior to the independent VoIP services, such as Vonage, consumers have an alternative. They can go with a competing broadband provider. If that turns out to be the case, Clearwire's decision to sign an exclusive agreement with Bell Canada will turn out to have been a mistake and the company will try to rectify it.

^^ Lawrence Lessig, THE FUTURE OF IDEAS, Chapter 3 (Vintage October 22, 2002).


Exception 4: Transactions costs or other bargaining problems keep the platform provider and the applications developer from reaching agreement.

This brings up the question of what the relevant market is for content. As discussed in the previous section, from the vantage of content and applications providers, the relevant market is not the local market for last-mile broadband, but rather a national market for applications and content. Indeed, content providers face a sizeable number of cable and telephone companies. It is hard to imagine that transactions costs or bargaining problems with all of them, or even any significant portion of them, would block an innovation that consumers would find valuable.^^

Exception 5: The platform monopolist is incompetent.

Perhaps, but not likely less competent over time than the regulator, who would be called on to pass judgment on the monopolist's competence. Moreover, whatever the possible merits of this exception in a monopoly situation—and the argument does not appear to be a strong one—those merits disappear when there is some competition. We would expect that incompetence would be swiftly punished in the market.

Exception 6: Fear of open access regulation discourages the platform from voluntarily providing access in the first place.

The strategy of denying access in order to retain the option to do so in the future would, again, only seem viable if the platform is a true monopoly. In the case of a duopoly, denying access to content that adds to consumer value would cede a significant advantage to the competition. It would seem to be an unlikely strategy in an environment in which there are at least two strong competitors and the number of competitive technologies is likely to grow over time.

Exception 7: The platform provider is concerned that opening access in one area may increase its obligations elsewhere.

As with the previous exception, even if a firm wanted to deny access for regulatory strategy reasons, to do so for any significant application or

^^ This point is made cogently by Christopher Yoo in this volume.


content would be a very costly strategy in a market with two or more competitors.

Exception 8: Cases of incomplete complementarity—i.e., where the application in question has uses independent of the platform.

An example of this would be applications for broadband platforms that can also be accessed using narrowband transport. This would be the case, for example, with online retailers. But, again, it is difficult to envision cases where it would be in the broadband platform's interest to monopolize such an application if it faced competition in the broadband market. Doing so would reduce consumers' willingness to pay for the platform that is restricting access and shift demand to its competitor(s).

VII.

CONCLUSION

In the end, however, the net neutrality debate is not really about the static efficiency issues with which the ICE exceptions are concerned. It is really about dynamic efficiency—creating a system with appropriate incentives to invest and innovate both in the network itself and in content and applications in order to deliver to consumers the new services they want and for which they are willing to pay. Both sides in the net neutrality debate base their arguments on the need to foster investment and innovation.

Proponents of open access are concerned that broadband providers have market power, which they can use to discriminate against content providers. If this occurs, the end-to-end principle on which the Internet was built will be violated. This will be bad for innovation in applications and content, which occurs at the edges of the network. But, the evidence indicates that market power concerns are largely hypothetical. The market for high-speed Internet is quite competitive now and will be even more so in the future. It does not make sense to be concerned with hypothetical market power issues when it is not yet clear that there is a viable business model for broadband—i.e., a business model that will enable investors to recoup the very large sums that are being and will be expended. At most, this calls for a case-by-case analysis of alleged blocking, to attempt to sort out those cases that truly are anti-competitive.

Broadband is a distribution business and our discussion suggests there is a lot to be learned from other distribution businesses. The economics of distribution suggests that we should expect broadband to be bundled with


content, sometimes under exclusive arrangements. We should also expect that broadband providers will not exclude content that consumers want, because to do so will reduce the demand for their product. It is typical for distribution businesses to engage in practices that are not consistent with non-discriminatory open-access principles. But these practices are also an integral part of the business models needed to support the investment expenditures that we hope will make the promise of broadband a reality.

Chapter 2 Network Neutrality and Competition Policy: A Complex Relationship

Christopher S. Yoo Vanderbilt University Law School

I.

INTRODUCTION

The broadband industry has reached a crossroads. After avoiding the issue for years,^^ the Federal Communications Commission eventually decided that the Internet access services of cable broadband providers should be classified as "information services" rather than "telecommunications services."^^ This determination, which removes these services from the common carriage requirements of Title II of the Communications Act, was recently affirmed by the Supreme Court,^^ and the FCC promptly

^^ See Nat'l Cable & Telecomms. Ass'n v. Gulf Power Co., 534 U.S. 327, 348-51 (2002) (Thomas, J., concurring in part and dissenting in part) (criticizing the FCC for its reticence to address the proper regulatory classification of cable modem service). The FCC's reluctance to address these issues may end up limiting its latitude in determining how broadband should be regulated. Even though the FCC has since concluded that cable modem service is more properly regarded as an "information service," the Ninth Circuit has declined to accord Chevron deference to the FCC's rulings on the grounds that it is bound by stare decisis to adhere to its earlier determination that cable modem service is a "telecommunications service." See Brand X Internet Servs. v. FCC, 345 F.3d 1120 (9th Cir. 2003), cert. granted, 125 S. Ct. 654, 655 (2004). This appears inconsistent with Chevron's recognition that agency interpretations of statutes should be permitted to change over time. See Chevron USA Inc. v. Natural Res. Def. Council, 467 U.S. 837, 863-64 (1986).
^^ See Inquiry Concerning High-Speed Access to the Internet Over Cable and Other Facilities, Declaratory Ruling and Notice of Proposed Rulemaking, 17 F.C.C.R. 4798 (2002).
55 Nat'l Cable & Telecomms. Ass'n v. Brand X Internet Services, 125 S. Ct. 2688 (2005).


classified the broadband Internet access services provided by telephone companies as information services.^^ Having largely failed to take the Internet into account when enacting the Telecommunications Act of 1996, Congress is preparing to undertake its second major overhaul of the communications laws in less than a decade.^^ And notwithstanding the classification of broadband Internet services as information services, the FCC continues to consider whether it should impose some common carrier-type open access and nondiscrimination requirements on broadband operators in response to a chorus of commentators asking the agency to require that all broadband network owners adhere to certain principles of network neutrality.^^

At their core, network neutrality proposals stem from the concern that network owners will use their claimed control over last-mile broadband technologies to discriminate against nonproprietary Internet service providers (ISPs) and unaffiliated content and applications. According to these advocates, mandating interoperability is essential if the environment for competition and innovation on the Internet is to be preserved.^^

I believe that the current debate over network neutrality has overlooked several key insights. As an initial matter, the leading network neutrality proposals overstate the threat posed by vertical integration in the broadband industry. Although commentators differ widely on many aspects of vertical integration theory, there is widespread agreement that certain structural preconditions must be satisfied before vertical integration can plausibly threaten competition. An empirical analysis reveals that these preconditions are not met with respect to the broadband industry. Even more importantly, one of the core insights of vertical integration theory is that any chain of production can maximize economic welfare

^^ See Appropriate Framework for Broadband Access to Internet Over Wireline Facilities, Report and Order and Notice of Proposed Rulemaking, FCC 05-150, CC Docket No. 02-33, September 23, 2005. ^^ See, e.g., Stephen Labaton, What U.S. Businesses Are Looking for During Bush's 2nd Term: New Telecom Rules, INT'L HERALD TRIB., NOV. 5, 2004, at 19.

^^ See the Policy Statement containing "net neutrality" principles issued in the above-referenced cable and wireline broadband docketed proceedings released in September 2005. Policy Statement, FCC 05-151, CC Docket No. 02-33, September 23, 2005.
^^ See, e.g., Lawrence Lessig, THE FUTURE OF IDEAS, 34-48, 147-75 (2001); Mark A. Lemley & Lawrence Lessig, The End of End-to-End: Preserving the Architecture of the Internet in the Broadband Era, 48 UCLA L. REV. 925 (2001); Lawrence B. Solum & Minn Chung, The Layers Principle: Internet Architecture and the Law, 79 NOTRE DAME L. REV. 815, 851, 878 (2004); Kevin Werbach, A Layered Model for Internet Policy, 1 J. TELECOMM. & HIGH TECH. L. 37, 65-67 (2002); Timothy Wu, Network Neutrality, Broadband Discrimination, 2 J. ON TELECOMM. & HIGH TECH. L. 141 (2003).


only if every level of production is competitive. In other words, any chain of production will only be as efficient as its least competitive link, which in the case of broadband is undoubtedly the last mile. This insight suggests that the major network neutrality proposals are focusing on the wrong policy problem. In attempting to preserve and encourage competition and innovation in applications, content, and ISP services, these proposals are focused on increasing competition in those segments of the broadband industry that are already the most competitive and the most likely to remain that way. Instead, basic economic principles suggest that the better course would be to eschew attempting to foster competition in ISP services, content, and applications and instead to pursue regulatory options that would promote more competition in last-mile technologies. Restated in terms of the existing models of "layered competition," the major network neutrality proposals advocate regulating the logical layer in a way that promotes competition in the application and content layers. Instead, the focus of public policy should be to promote competition at the physical layer, which remains the level of production that is currently the most concentrated and the most protected by barriers to entry. The irony is that network neutrality is likely to have the perverse effect of retarding, if not forestalling, the emergence of greater competition at the physical layer. The standardization implicit in compelled interoperability tends to reinforce and entrench the sources of market failure in lastmile technologies. The traditional justification for regulating wireline communications networks is that the presence of large, up-front sunk costs creates large supply-side economies of scale that cause markets for telecommunications services to collapse into natural monopolies. Interestingly, allowing networks to differentiate the services they offer can mitigate whatever tendency towards natural monopoly that may be present by allowing multiple last-mile technologies to coexist notwithstanding the presence of unexhausted returns to scale. Permitting variations in the protocols and network infrastructure employed by each network might enable smaller providers to overcome the cost disadvantages inherent in the smaller scale of their operations. They could tailor their networks to the needs of smaller subgroups that place a particularly high value on one particular type of network service and charge those subgroups more for those services, in much the same manner that specialty stores survive in a world dominated by one-stop shopping. Allowing network owners to differentiate their offerings would promote economic welfare by increasing the degree of price competition among last-mile providers. It would also increase utility more directly by allowing network owners to respond to the underlying heterogeneity in


consumer preferences by varying the services they offer. Conversely, network neutrality can prevent the realization of these sources of economic efficiency. Even worse, it has the inevitable effect of introducing a regulation-induced bias in favor of certain types of applications and against others. Mandating universal interoperability may be effective in promoting the applications that currently dominate the Internet, such as e-mail and web browsing, which operate solely at the network's edge. It is, however, ill suited to the more bandwidth-intensive applications emerging today, which often depend on a greater degree of innovation in the network's core. For example, allowing networks to differentiate themselves might make it possible for three different types of last-mile networks to coexist by serving the needs of a different subgroup: one optimized for conventional Internet applications such as e-mail and website access, another incorporating security features to facilitate e-commerce, and a third employing routers that prioritize packets in the manner needed to facilitate time-sensitive applications such as Internet telephony, generally known as "voice over Internet protocol" (VoIP). Conversely, mandating interoperability commodifies bandwidth in ways that sharply limit opportunities to compete on dimensions other than price, which in turn reduces the network owners' ability to satisfy the underlying heterogeneity of consumer preferences and reinforces the advantages enjoyed by the largest and most established players.

Network neutrality is also particularly inappropriate when entry by alternative network technologies is technologically and economically feasible. This is because compelled access requirements represent something of a policy anomaly. By rescuing competing firms from having to supply the relevant input for themselves, compelled access destroys the incentives for those who need access to networks to invest in alternative network technologies. As a result, compelled access can have the perverse effect of entrenching any supposed bottleneck facility by forestalling the emergence of the alternative network technologies. This is particularly problematic in technologically dynamic industries, such as broadband, in which the prospects of developing new means for circumventing or competing directly with the alleged bottleneck are the highest. For example, DSL and cable modem providers are currently engaged in a spirited competition for new customers. At the same time, a host of other technologies are waiting in the wings, including such innovative services as satellite broadband, fixed terrestrial wireless, mesh networks, WiFi, and third-generation mobile wireless devices (3G), just to name a few. Those unable to obtain access to a broadband technology


represent the natural strategic partners to provide the financing necessary to deploy these technologies. Other commentators have invoked the burgeoning literature on network economic effects as an alternative justification for regulatory intervention.^^ Network economic effects exist when the value of network access depends on the number of other users connected to the network, rather than the network's technological characteristics or price. As a result, a user's decision to join a network increases the value of the network for others. The fact that new users cannot capture all of the benefits generated by their adoption decisions has led many theorists to regard network economic effects as a kind of externality that causes overall network utilization to drop below efficient levels. Some commentators also argue that network externalities can turn network access into a competitive weapon. By refusing to interconnect with other networks, network owners can force users to choose one network to the exclusion of others and induce them to flock to the largest network. In short, network economic effects can create demand-side economies of scale analogous to the supply-side economies of scale caused by the presence of sunk costs. The current debate has overlooked a number of critical considerations that make it implausible that network economic effects are likely to harm competition. Even more importantly for the debates surrounding network neutrality, the economic literature recognizes that network differentiation can ameliorate the anticompetitive effects of the demand-side economies of scale associated with network economic effects in much the same manner as it can mitigate the problems caused by supply-side economies of scale. Imposing network neutrality would prevent such competition from emerging and would instead force networks to compete solely in terms of price and network size, considerations that give the largest players a decisive advantage. As a result, mandating network neutrality could have the perverse effects of reinforcing the sources of market failure and of dampening incentives to invest in the alternative network capacity that remains the most sustainable long-run solution to the problems of broadband policy. In other words, mandating network neutrality raises the real danger that regulation would become the source of, rather than the solution to, market failure. This is not to say that network differentiation represents a panacea. For example, in order for network differentiation to ameliorate supply-side 60

See, e.g., Jerry A. Hausman et al., Residential Demand for Broadband Telecommunications and Consumer Access to Unaffiliated Internet Content Providers, 18 YALE J. ON REG. 129 (2001).


and demand-side economies of scale in the manner I have described, the underlying preferences of network users must be sufficiently heterogeneous, otherwise users will tend to flock to the largest network regardless of whether multiple differentiated networks exist. Thus, although it is possible for network differentiation to counteract the tendency towards market failure, it will not necessarily have that effect in all cases. It also bears emphasizing that the potential harms associated with compelling network neutrality as a regulatory matter are not in any way inconsistent with recognizing that most network owners will adhere to network neutrality as a matter of choice. Interoperability clearly offers benefits to both providers and consumers, and network designers should hesitate before deviating from those central precepts. Indeed, I would expect that most industry participants would voluntarily design their technologies to be fully interoperable and compatible in the vast majority of cases even in the absence of regulation. At the same time, circumstances do exist in which the basic principles of economic welfare would be better served by allowing last-mile broadband networks to deviate from principles of universal interoperability. Adoption of any of the major network neutrality proposals currently pending before the FCC would effectively foreclose these benefits from being realized.

The balance of this paper is organized as follows. Part I demonstrates the close relationship between network neutrality and the economics of vertical integration. It also examines the structure of the broadband industry, concluding that the preconditions needed for vertical integration to pose a threat to competition do not exist. Part II analyzes the potential welfare benefits of allowing last-mile providers to deviate from complete interoperability. Allowing last-mile providers to use vertical integration to differentiate their networks would allow the realization of certain efficiencies and would permit them to offer a broader range of services better attuned to consumers' preferences. Even more importantly, I show how requiring all broadband networks to use nonproprietary protocols can actually reduce competition by reinforcing the economies of scale already enjoyed by large telecommunications providers. Part III critiques some of the leading network neutrality proposals. Part IV analyzes the proper role of regulation, concluding that regulatory authorities will be more effective at promoting entry by new network platforms than they would be in ascertaining whether a particular exclusivity arrangement would promote or hinder competition. Even more importantly, promoting entry has embedded within it a built-in exit strategy. Once a sufficient number of broadband


network platforms exist, regulatory intervention will no longer be necessary.

II.

THE INTERRELATIONSHIP BETWEEN NETWORK NEUTRALITY AND VERTICAL INTEGRATION

This part examines the insights that vertical integration theory provides into the network neutrality debate. Section A describes the structure of the broadband industry and demonstrates how network neutrality is designed to redress the supposed problems caused by vertical integration. Section B reviews vertical integration theory and shows how it is now widely recognized that vertical integration can create economic harms only if certain structural preconditions are met. An empirical analysis reveals that these structural preconditions are not satisfied with respect to the broadband industry. This in turn undermines claims that the types of vertical integration that network neutrality is designed to foreclose pose a serious policy concern.

A.

Two Conceptions of the Structure of the Broadband Industry

The major network neutrality proposals have embedded within them two, rather different conceptions of the vertical structure of the broadband industry. Multiple ISP access proposals implicitly conceive of providers being organized in a traditional, three-step chain of distribution, in which the ISPs act as a wholesaler and the last-mile providers play the role of the retailer. Other approaches conceive of the broadband industry as consisting of a series of layers. I will discuss each in turn.

1.

The Conventional Vertical Market Structure Implicit in Multiple ISP Access

Although the structure of the broadband industry may at times seem mysterious, when viewed from a certain perspective it is in fact quite ordinary.^^


Its basic organization differs little from that of the typical manufacturing industry, which is divided into a three-stage chain of production. The first and last stages are easiest to understand. The manufacturing stage is occupied by companies that create the actual products to be sold. The retail stage consists of those companies responsible for the final delivery of the products to end users. Although it is theoretically possible for retailers to purchase products directly from manufacturers, in practice logistical complications often give rise to an intermediate stage mediating between manufacturers and retailers. Firms operating in this intermediate stage, known as wholesalers, purchase goods directly from manufacturers, assemble them into complete product lines, and distribute them to retailers.

Despite claims that the Internet is fundamentally different from other media, the broadband industry can easily be mapped onto this three-stage vertical chain of distribution. The manufacturing stage consists of those companies that generate the webpage content and Internet-based services that end users actually consume. The wholesale stage is occupied by the ISPs and backbone providers, which aggregate content and applications and deliver them to retailers. Finally, last-mile providers, such as DSL and cable modem systems, represent the retailers who deliver the content and service packages assembled by the ISPs to end customers. The proponents of multiple ISP access in essence are concerned that vertical integration between the retail and wholesale levels of this chain of distribution will allow network owners to use their control of the retail stage to harm competition in the wholesale stage.

2.

The "Layered'' Approach Implicit in Connectivity Principles

Recent scholarship has increasingly turned to a somewhat different way to conceive of the structure of the broadband industry known as the "layered" approach.^^ The version of the layered approach that has gained the most popularity disaggregates networks into four horizontal layers that cut across different network providers.^^

^^ The following discussion is adapted from Christopher S. Yoo, Vertical Integration and Media Regulation in the New Economy, 19 YALE J. ON REG. 171, 182, 250-51 (2002).
^^ The layered model is related to the Open Systems Interconnection (OSI) model developed by the International Standards Organization (ISO) in the 1980s, which divides networks into seven different layers: application, presentation, session, transport, network, data link, and physical. Some of the distinctions between those layers have greater relevance for technologists than for policy analysts.


The bottommost layer is the "physical layer," which consists of the hardware infrastructure that actually carries and routes the communications. The second layer is the logical layer, which is composed of the protocols responsible for routing particular communications within the network. The third layer is the applications layer, comprised of the particular programs used by consumers. The fourth layer is the content layer, which consists of the particular data being conveyed.

Figure 1. The Layered Model of Broadband Architecture

Content Layer (e.g., individual e-mail, webpages, voice calls, video programs)
Applications Layer (e.g., web browsing, e-mail, VoIP, streaming media, database services)
Logical Layer (e.g., TCP/IP, domain name system, telephone numbering system)
Physical Layer (e.g., telephone lines, coaxial cable, backbones, routers, servers)

The distinction between the layers can easily be illustrated in terms of the most common Internet application: e-mail. Assuming that the particular e-mail in question is sent via DSL, the physical layer consists of the telephone lines, e-mail servers, routers, and backbone facilities needed to convey the e-mail from one location to another. The logical layer consists of the SMTP protocol employed by the network to route the e-mail to its destination. The application layer consists of the e-mail program used, such as Microsoft Outlook. The content layer consists of the particular e-mail sent.

^^ Werbach, supra note 59, at 37, 57-64; Richard S. Whitt, A Horizontal Leap Forward: Formulating a New Communications Public Policy Framework Based on the Network Layers Model, 56 FED. COMM. L.J. 587, 624 (2004).


Network neutrality is motivated by a concern that last-mile providers will use their ability to control the physical layer to reduce competition in the application and content layer by entering into exclusivity arrangements with content and applications providers and by replacing the nonproprietary protocol currently used on the Internet—known as the transmission control protocol/Internet protocol (TCP/IP)—with a proprietary, noninteroperable set of protocols. Network neutrality is designed to short-circuit this dynamic by mandating that last-mile providers adhere to nonproprietary protocols and to open their networks to all applications and content on a nondiscriminatory basis.
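For readers who think in code, the four-layer decomposition and the e-mail example above can be restated as a simple data structure. This is only an illustrative sketch; the layer names and examples come from Figure 1, while the dictionary representation itself is mine.

```python
# Illustrative only: the four-layer model applied to one e-mail sent over DSL.
# Layer names and examples follow Figure 1; nothing here is a formal protocol
# specification.
email_over_dsl = {
    "content":      "the particular e-mail message being conveyed",
    "applications": "the e-mail client, e.g., Microsoft Outlook",
    "logical":      "SMTP carried over TCP/IP to route the message",
    "physical":     "telephone line (DSL), e-mail servers, routers, backbone",
}

for layer in ("content", "applications", "logical", "physical"):
    print(f"{layer:>12} layer: {email_over_dsl[layer]}")
```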

B. Market Structure and Vertical Integration

Vertical integration has long been one of the most contentious topics in economic theory, having spawned an extensive debate between what have become known as the Chicago and post-Chicago Schools of antitrust law and economics.^^ A full analysis of the scope of this controversy exceeds the scope of this chapter. For our purposes, it suffices to note that both sides agree that certain structural preconditions must be satisfied before vertical integration can plausibly harm competition. First, the vertically integrated firm must have market power in its primary market, because a firm that lacks market power has nothing to use as leverage. Second, the market into which the firm seeks to vertically integrate (called the secondary market) must also be concentrated and protected by barriers to entry. If no such barriers to entry exist, any attempt to raise price in the secondary market will simply attract new competitors until the price drops back down to competitive levels. The broad acceptance that these structural preconditions now enjoy is demonstrated by the fact that they are enshrined in the Merger Guidelines promulgated by the Justice Department and the Federal Trade Commission to evaluate vertical mergers.^^

Applying these principles to the broadband industry strongly suggests that the FCC should not erect the per se bar to vertical integration implicit in network neutrality. Considering first the requirement that the primary market be concentrated, the Merger Guidelines employs a measure of concentration known as the Hirschman-Herfindahl index (HHI) that has become the standard measure of concentration under modern competition policy. HHI is calculated by adding the square of the market share of each competitor.^^

^"^ The discussion that follows is based on Yoo, supra note 61, at 185-205. ^^ See U.S. Department of Justice & Federal Trade Commission, Non-Horizontal Merger Guidelines, §§ 4.131-.133, 4.21,4.212, available at: .


The result is a continuum that places the level of concentration on a scale from just above 0 (in the case of complete market deconcentration) to 10,000 (in the case of monopoly). The Guidelines indicate that the antitrust authorities are unlikely to challenge a vertical merger unless HHI in the primary market exceeds 1800,^^ which is the level of concentration that would result in a market comprised of between five and six competitors of equal size.

Determining whether the market is concentrated depends on market definition, which in turn requires the identification of the relevant product and geographic markets. Defining the relevant product market is relatively straightforward: The empirical evidence indicates that broadband represents an independent product market that is distinct from narrowband services.^^ Defining the relevant geographic market has proven more problematic.^^ Many analyses have mistakenly assumed that the relevant geographic market is the local market in which last-mile broadband providers meet end users. Because these markets are typically dominated by two players—the incumbent cable operators offering cable modem service and the incumbent local telephone companies offering DSL service—defining the geographic market in this manner yields HHIs well in excess of 4000.^^

^^ For example, a market of four firms with shares of 30%, 30%, 20% and 20% would have an HHI of 30² + 30² + 20² + 20² = 2600.
^^ Non-Horizontal Merger Guidelines, supra note 65, § 4.131. Note that the relevant threshold for vertical mergers is more lenient than the HHI thresholds applicable to horizontal mergers. Under the Horizontal Merger Guidelines, markets with HHIs between 1000 and 1800 are regarded as "moderately concentrated" and thus "potentially raise significant competitive concerns." U.S. Department of Justice & Federal Trade Commission, 1992 Horizontal Merger Guidelines § 1.51(b), available at: . Because vertical mergers are less likely than horizontal mergers to harm competition, the Merger Guidelines apply a more lenient HHI threshold to vertical integration. Non-Horizontal Merger Guidelines, supra note 65, § 4.0. The Merger Guidelines also reserve the possibility of challenging a vertical merger at HHI levels below 1800 if "effective collusion is particularly likely." Id. § 4.213.
^^ Applications for Consent to Transfer of Control of Licenses & Section 214 Authorizations by Time Warner, Inc. and America Online, Inc., Transferors, to AOL Time Warner Inc., Transferee, Memorandum Opinion and Order, 16 F.C.C.R. 6547, 78-88 (2001); Jerry A. Hausman et al., Cable Modems and DSL: Broadband Internet Access for Residential Customers, 91 AM. ECON. REV. 302, 303-04 (2001).
^^ Yoo, supra note 61, at 253-54.
^^ Amendment of Parts 1, 21, 73, 74 and 101 of the Commission's Rules to Facilitate the Provision of Fixed and Mobile Broadband Access, Educational and Other Advanced Services in the 2150-2162 and 2500-2690 MHz Bands, Notice of Proposed Rule Making and Memorandum Opinion and Order, 18 F.C.C.R. 6722, 6774-75 (2003); Hausman et al., supra note 60, at 155; Rubinfeld & Singer, supra note 59, at 649.
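A minimal sketch of the HHI arithmetic may help fix ideas. It reproduces the Guidelines' four-firm example from the footnote and shows why the 1800 threshold corresponds to a market of between five and six equally sized competitors; the code is my own illustration, not part of the Guidelines.

```python
# Sketch of the HHI calculation described above. Shares are in percentage
# points, so the index runs from near 0 up to 10,000 for a pure monopoly.
def hhi(shares_in_percent):
    return sum(share ** 2 for share in shares_in_percent)

# The footnote's example: four firms with shares of 30%, 30%, 20%, and 20%.
print(hhi([30, 30, 20, 20]))        # 2600

# A market split evenly among n firms has an HHI of 10,000 / n, so the 1800
# threshold falls between five equal firms (2000) and six (about 1667).
print(10000 / 5, round(10000 / 6))  # 2000.0 1667
```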


The problem with this analysis is that network neutrality proposals are designed to limit the exercise of market power not in the final downstream market in which last-mile providers meet end users, but rather in the upstream market in which last-mile providers meet ISPs and content/application providers. This is easily seen if one hypothesizes a broadband market that is totally vertically disintegrated. Preventing last-mile providers from offering ISP services, content, or applications would not cause any changes in the fundamental economic relationship between last-mile providers and end users, which would remain a de facto duopoly. Compelled vertical disintegration would, however, substantially change the bargaining power between last-mile providers and ISPs and content/application providers. Thus, if network neutrality proposals are to have any effect at all, it is by changing the economics in the upstream market in which last-mile providers meet ISPs and providers of Internet content and applications.

In contrast to the end-user market that has represented the focus of prior analyses, these markets are national in scope. Major web-based providers, such as Amazon.com or eBay, are focused more on the total customers they are able to reach nationwide than they are on their ability to reach customers located in any specific metropolitan area. They would, of course, prefer to be able to reach all consumers nationwide. The fact that they may be unable to reach certain customers is of no greater concern, however, than the fact that manufacturers of particular brands of cars, shoes, or other conventional goods are not always able to obtain distribution in all parts of the country. The fact that some manufacturers may be cut off from certain customers should not cause economic problems so long as those manufacturers are able to obtain access to a sufficient number of customers located elsewhere. The proper question is thus not whether the broadband transport provider wields market power over broadband users in any particular city, but rather whether that provider has market power in the national market for obtaining broadband content.

When the relevant geographic market is properly framed as being national in scope, it becomes clear that the market is too unconcentrated for vertical integration to pose a threat to competition.^^ The HHI is 987, well below the 1800 threshold for vertical integration to be a source of economic concern. In addition, the two largest broadband providers (Comcast

^^ The following discussion updates earlier data previously presented in Yoo, supra note 61, at 253-59; and Christopher S. Yoo, Would Mandating Broadband Network Neutrality Help or Hurt Competition?: A Comment on the End-to-End Debate, 3 J. ON TELECOMM. & HIGH TECH. L. 23, 50-53 (2004).


and SBC) control only 20% and 14% of the national market respectively. Absent collusion or some other impermissible horizontal practice (which would be a basis for sanction independent of concerns about vertical integration), the national broadband market is sufficiently unconcentrated to vitiate concerns about the vertical integration in the broadband industry.

Figure 2. Last-Mile Broadband Subscribers as of Year End 2004

Provider                        Technology     Subscribers (thousands)   Share   HHI
Comcast Cable Communications    cable modem                      6,992     20%   384
SBC Communications              DSL                              5,104     14%   204
Time Warner Cable               cable modem                      3,913     11%   120
Verizon Communications          DSL                              3,600     10%   102
Cox Communications              cable modem                      2,590      7%    53
BellSouth                       DSL                              2,096      6%    34
Charter Communications          cable modem                      1,884      5%    28
Earthlink                       mixed                            1,364      4%    15
Adelphia Communications         cable modem                      1,360      4%    14
Cablevision Systems             cable modem                      1,316      4%    14
Qwest Communications            DSL                              1,000      3%     8
Bright House Networks           cable modem                        725      2%     4
Covad Communications            DSL                                533      1%     2
Sprint                          DSL                                492      1%     2
Mediacom Communications         cable modem                        367      1%     1
Insight Communications          cable modem                        331      1%     1
Alltel                          DSL                                243      1%     0
RCN                             cable modem                        222      1%     0
Hughes Direcway                 satellite                          220      1%     0
Citizens Communications         DSL                                212      1%     0
Cable One                       cable modem                        178      0%     0
Century Tel                     DSL                                143      0%     0
Cincinnati Bell                 DSL                                131      0%     0
Other                                                              700      2%     1
Total                                                           35,697    100%   987
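As a rough cross-check of the figure just cited, the short sketch below recomputes the index from the subscriber counts in Figure 2. It is my own back-of-the-envelope calculation rather than the author's worksheet, and it assumes the residual "Other" category represents many small firms, so that category is left out of the squared terms.

```python
# Back-of-the-envelope recomputation of the national broadband HHI from the
# Figure 2 subscriber counts (in thousands). Simplifying assumption: the
# "Other" bucket is excluded from the squared terms because it aggregates
# many small providers rather than a single firm.
named_providers = {
    "Comcast": 6992, "SBC": 5104, "Time Warner Cable": 3913, "Verizon": 3600,
    "Cox": 2590, "BellSouth": 2096, "Charter": 1884, "Earthlink": 1364,
    "Adelphia": 1360, "Cablevision": 1316, "Qwest": 1000, "Bright House": 725,
    "Covad": 533, "Sprint": 492, "Mediacom": 367, "Insight": 331,
    "Alltel": 243, "RCN": 222, "Hughes Direcway": 220, "Citizens": 212,
    "Cable One": 178, "CenturyTel": 143, "Cincinnati Bell": 131,
}
other = 700
total = sum(named_providers.values()) + other
index = sum((subs / total * 100) ** 2 for subs in named_providers.values())
# Comes out within a point or so of the 987 reported in Figure 2; the small
# gap reflects rounding in the printed table.
print(round(index))
```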


In addition, the precondition that the secondary markets be concentrated and protected by entry barriers is also not met. As the FCC has recognized, the market for ISPs has long been quite competitive, and entry into ISP services has historically been quite easy.^ As of the end of 2004, the HHI for ISPs appears to be below 800.^ Similarly, the markets for applications and content have long been the most competitive segments of the entire industry, marked by low levels of concentration and low barriers to entry. The failure to satisfy these structural preconditions renders implausible any claims that vertical integration in the broadband industry constitutes a threat to competition.

III.

THE POTENTIAL BENEFITS OF NETWORK DIVERSITY

Conventional economic theory thus indicates that allowing last-mile providers to vertically integrate is unlikely to harm competition. In this Part, I analyze how allowing last-mile broadband providers to deviate from the principles of network neutrality can actually enhance competition. Section A discusses how allowing network owners to deviate from complete interoperability can increase economic welfare by increasing the diversity of products available. Conversely, imposing network neutrality as a regulatory matter may actually have the effect of reducing innovation and limiting consumer choice by skewing the Internet towards certain types of applications and away from others. Section B analyzes the impact that connectivity principles can have on the concentration of last-mile technologies, which looms as a far more central threat to the competitive performance of the Internet than does the robustness of competition among content and applications providers. Specifically, it details how standardizing network protocols can reinforce the supply-side and demand-side economies of scale that are the primary source of the tendency toward concentration in last-mile technologies. By forcing broadband providers to compete solely on price and network size, network neutrality reinforces the advantages already enjoyed by the largest players. Conversely, network diversity can provide new last-mile platforms, such as

^ Applications for Consent to Transfer of Control of Licenses and Section 214 Authorizations from Tele-Communications, Inc., Transferor, to AT&T Corp., Transferee, Memorandum Opinion and Order, 14 F.C.C.R. 3160, 3206 (1999); see also Yoo, supra note 61, at 259.
2 See Alex Goldman, Top 22 U.S. ISPs by Subscriber: Q4 2004, available at: .


3G, with a strategy for survival. Section C briefly examines the economic efficiencies that can result from vertical integration.

These arguments should not be misconstrued as favoring noninteroperability as a general matter. On the contrary, I would expect most network owners will voluntarily adhere to a basic architecture based on TCP/IP. Maintaining interoperability provides network owners with substantial financial advantages that in most cases should prove so overwhelming that mandating network neutrality would have no real effect. Imposing network neutrality as a regulatory matter would, however, foreclose last-mile providers from employing proprietary technologies even in those cases in which doing so would yield substantial economic benefits. The lack of a plausible case that the use of such proprietary technologies would harm competition suggests that even though most network owners will adhere to network neutrality, imposing it as a regulatory matter would provide no tangible benefits and would impose harm by preventing network owners from pursuing certain strategies that would be economically beneficial.

A.

The Tradeoff Between Network Standardization and Product Variety

One of the biggest shortcomings of the current debate is that it has largely ignored how network neutrality can harm economic welfare by limiting the variety of products.^ The predominance of price theory, in which the sole source of economic welfare is economic surplus (i.e., the difference between reservation prices and the actual prices charged), has caused commentators studying the economics of broadband networks to overlook the potential benefits associated with product differentiation. Simply put, allowing network owners to employ different protocols can foster innovation by allowing a wider range of network products to exist. Conversely, compulsory standardization can reduce consumer surplus by limiting the variety of products available. In the words of two leading commentators on network economics, "market equilibrium with multiple incompatible products reflects the social value of variety.""^

^ The following discussion is based on Christopher S. Yoo, Beyond Network Neutrality, 19 HARV. J. L. & TECH. (forthcoming 2005); Yoo supra note 71, at 56-59. ^ Michael L. Katz & Carl Shapiro, Systems Competition and Network Effects, 8 J. ECON. PERSP. 93, 106 (1994); accord Joseph Farrell & Garth Saloner, Standardization and Variety,

40

Net Neutrality Viewed from this perspective, the pressure towards proprietary standards may not represent some sinister attempt by last-mile providers to harm competition. Instead, it may represent nothing more than the natural outgrowth of the underlying heterogeneity of consumer preferences. It is for this reason that economic theorists have uniformly rejected calls for blanket prohibitions of exclusivity arrangements and other means for differentiating network services.^ Indeed, the advent of broadband technologies has also largely coincided with a number of fundamental changes that have increased the heterogeneity of the demands that users are placing on the Internet that have placed increasing pressure on the continued adherence to a uniform, TCP/IPbased architecture. Although the forces are somewhat complex, a few examples illustrate the forces driving this fundamental shift.^

1.

The Shift from Institutional to Mass-Market Users

The termination of NSF support for backbone services in 1995 eliminated the few remaining restraints on the commercialization of the Internet. The Internet's transformation from a network designed primarily to facilitate academic interchange into a medium of mass communications has made management of the Internet considerably more complicated. The Internet was once only charged with bringing together a relatively small number of fairly sophisticated, institutional users who generally shared a broad set of common goals. It now must mediate among an increasingly disorderly onslaught of private users each pursuing increasingly divergent objectives. This has greatly complicated traffic management, as the variability in usage patterns has increased. At the same time, the influence of overlapping institutional norms and relationships has dwindled. This shift has also created pressure to simplify the demands imposed on end users by incorporating more of those functions into the core network.

20 EcoN. LETTERS 71 (1986); SJ. Liebowitz & Stephen E. Margolis, Should Technology Choice Be a Concern of Antitrust Policy?, 9 HARV. J. L. & TECH. 283, 292 (1996). ^ See, e.g., David Balto, Networks and Exclusivity: Antirust Analysis to Promote Network Competition, 7 GEO. MASON L. REV. 523 (1999); David S. Evans & Richard Schmalensee, A Guide to the Antitrust Economics of Networks, ANTITRUST, Spr. 1996, at 36; Carl Shapiro, Exclusivity in Network Industries, 1 GEO. MASON L. REV. 673, 678 (1999). ^ The discussion that follows draws on the analysis offered by Marjory S. Blumenthal & David D. Clark, Rethinking the Design of the Internet: The End-to-End Arguments vs. the Brave New World, 1 ACM TRANSACTIONS ON INTERNET TECH 70 (2001).

Christopher S. Yoo 2.

41

The Emergence of Bandwidth-Intensive Applications

By contemporary standards, early Internet applications, such as e-mail, web access, newsgroups, and file transfer, placed fairly modest demands on the network. Overall file sizes were relatively small, and delays of a second or two typically went unnoticed. The commercialization of the Internet has spurred the development of applications which place greater demands on network services. Bandwidth-intensive applications, such as multimedia websites and music downloads, are placing increasing pressure on network capacity, as is the increase in telecommuting and home networking. Equally important is the emergence of applications that are less tolerant of variations in throughput rates, such as streaming media, on-line gaming, and Internet telephony, also known as voice over Internet protocol (VoIP). These concerns have led many network providers to make the terms of interconnection vary to some extent with bandwidth usage. For example, many last-mile providers either forbid end users to use bandwidth-intensive applications or instead require that they pay higher charges before doing so. Similarly, backbone providers often base the amounts they charge for interconnection on volume-related considerations. Backbones that exchange traffic of roughly equal value enter into "peering" arrangements that are similar to telecommunications arrangements known as "bill and keep." Under peering arrangements, the originating backbone collects and retains all of the compensation for the transaction notwithstanding the fact that other backbones also incur costs to terminate the transaction. So long as the traffic initiated and terminated by each backbone is roughly equal in value, peering allows backbones to forgo the costs of metering and billing these termination costs without suffering any adverse economic impact. Peering is less economical, however, in cases where the value of the traffic being terminated is not reciprocal. As a result, smaller-volume backbones are often required to enter into "transit" arrangements in which they must pay larger backbones compensation for terminating their traffic. The growing importance of time-sensitive applications is also placing pressure on system designers to employ routers that can discriminate among packets and to assign them different levels of priority, depending upon the source of the packet or the nature of the application being run. This represents a marked departure from TCP/IP, which manages packets on a "first come, first served" basis and routes them without regard to the nature of the communications being transmitted.
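To make the contrast concrete, the following sketch (in Python) shows how a policy-based router of the kind just described might forward latency-sensitive packets ahead of bulk traffic, in contrast to first-come, first-served handling. The two-queue design, the application labels, and the packet fields are illustrative assumptions offered for exposition, not a description of any actual router or of any particular provider's practice.

import collections

# Illustrative (assumed) set of latency-sensitive application labels.
LATENCY_SENSITIVE = {"voip", "gaming", "streaming"}

class PriorityRouter:
    """Forwards latency-sensitive packets ahead of bulk traffic."""

    def __init__(self):
        self.fast = collections.deque()  # e.g., VoIP, on-line gaming
        self.bulk = collections.deque()  # e.g., e-mail, file transfer

    def enqueue(self, packet):
        queue = self.fast if packet["app"] in LATENCY_SENSITIVE else self.bulk
        queue.append(packet)

    def dequeue(self):
        # Strict priority: bulk traffic waits whenever fast traffic is queued.
        if self.fast:
            return self.fast.popleft()
        return self.bulk.popleft() if self.bulk else None

class FifoRouter:
    """First-come, first-served handling, analogous to plain TCP/IP routing."""

    def __init__(self):
        self.queue = collections.deque()

    def enqueue(self, packet):
        self.queue.append(packet)

    def dequeue(self):
        return self.queue.popleft() if self.queue else None

# A VoIP packet that arrives after an e-mail packet is nonetheless sent first.
router = PriorityRouter()
router.enqueue({"app": "email", "payload": "..."})
router.enqueue({"app": "voip", "payload": "..."})
assert router.dequeue()["app"] == "voip"

The particular scheduling discipline is beside the point; what matters is that any such prioritization requires the router to inspect, and act on, information about the nature of the traffic it carries.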


3. The Growth in Distrust of Other Endpoints


As noted earlier, the Internet's reliance on TCP/IP has dictated that all packets be routed without regard to their source. The anonymity of this system of transmission was implicitly built on the presumption that the other endpoints in the system were relatively trustworthy and were cooperating in order to achieve common goals. The rise of e-commerce has created the need for increased levels of confidence in the identity of the person on the other end of the connection. At the same time, end users have become increasingly frustrated by intrusions thrust upon them by other end users. Although some examples, such as spam, are relatively innocuous, others are considerably more malicious, such as viruses, worms, Trojan horses,^ pornographic websites masquerading as less objectionable content, and programs that mine cookies for private information. Although end users are in a position to undertake measures to protect themselves against these harms, some Internet providers are interposing elements into the body of their network to shield end users from such dangers.

4. The Needs of Law Enforcement


The demands of law enforcement represent another factor that is driving the Internet away from the anonymous, fully interoperable architecture that existed in the narrowband era. For example, the Communications Assistance for Law Enforcement Act ("CALEA") requires that all telecommunications carriers configure their networks in a way that permits law enforcement officials to place wiretaps on telephone calls.^ Emerging Internet telephone systems, such as VoIP, are not easily rendered wiretap compatible. In contrast to the architecture of conventional telephone networks, which requires that all voice traffic pass through a discrete number of network gateways, VoIP technologies rely upon the decentralized structure inherent in the Internet. Furthermore, even if law enforcement officials found an appropriate location to intercept VoIP traffic, the packet anonymity inherent in TCP/IP would make it extremely difficult for them to separate the telephony-related packets from the other packets in the data stream. As a result, the FCC has recently opened
^ Trojan horses are malicious pieces of code concealed within programs that perform beneficial functions. ^ 47 U.S.C. § 1002(a).



a proceeding to address how to reconcile VoIP with CALEA.^ Similarly, states' desire to impose sales taxes on Internet transactions may prompt them to push for changes to the architecture of the Internet to permit them to conduct some degree of monitoring of on-line commercial activity. Any solution to either problem would almost certainly require a deviation from the content and application transparency that is inherent in TCP/IP.

5. The Impact of the Shifts in Demand


The current forces that are motivating network providers to consider introducing increasing levels of intelligence into their core networks provide an apt illustration of this dynamic. As discussed earlier, consumers' demand for more time-sensitive applications, such as VoIP and streaming media, may be providing much of the impetus away from standardization. Refusing to allow network owners to introduce routers that can assign different priority levels to packets based on the nature of the application being run would have the effect of precluding consumers from enjoying the benefits of certain types of applications. The current ubiquity of TCP/IP makes it seem like an appropriate default rule and appears to justify placing the burden on those who would deviate from it. A moment's reflection makes clear how adherence to the Internet's nonproprietary structure may actually impede innovation. Indeed, some models indicate that the deployment of proprietary network standards may actually prove more effective in promoting innovation and the adoption of socially optimal technologies.^^ There is thus considerable irony in the network neutrality proponents' insistence that allowing Internet providers to introduce intelligence into their core networks would skew innovation and that technological humility demands adherence to an end-to-end architecture. The decisions to concentrate intelligence at the edges of the network and to require packet nondiscrimination would themselves skew the market towards certain applications and away from others. The choice is thus not between neutrality and non-neutrality in the overall direction of innovation. Mandating either

^ Communications Assistance for Law Enforcement Act and Broadband Access and Services, Notice of Proposed Rulemaking, 19 F.C.C.R. 15676 (2004). ^ Michael L. Katz & Carl Shapiro, Product Introduction with Network Externalities, 40 J. INDUS. ECON. 55, 73 (1992); Michael L. Katz & Carl Shapiro, Technology Adoption in the Presence of Network Externalities, 94 J. POL. ECON. 822, 825, 838-39 (1986).


would have the inevitable effect of determining technological winners and losers.

B. Network Neutrality and Competition in the Last Mile


On a more fundamental level, network neutrality advocates' focus on innovation in content and applications appears to be misplaced.^^ Application of the basic insights of vertical integration theory reveals that markets will achieve economic efficiency only if each stage of production is competitive. In other words, any vertical chain of production will only be as efficient as its most concentrated link. The intuition underlying this insight can be easily discerned from the thought experiment outlined above, which imagines how competitive the broadband industry would be if regulators required that it be completely vertically disintegrated. Complete vertical disintegration would not increase consumer choice among last-mile providers. If anything, to the extent that it prevents network owners from realizing the available efficiencies, it might have the effect of reducing the number of last-mile options. In addition, the last-mile providers' bargaining leverage against ISPs and content and application suppliers would remain the same. Viewing the issues in this manner reveals that the major network neutrality proposals are focusing on the wrong policy problem. These proposals direct their efforts towards encouraging and preserving competition among ISPs and content/application providers, which operate in the industry segments that are already the most competitive and the least protected by entry barriers. Instead, broadband policy should focus on increasing the competitiveness of the most concentrated level of production, which in the case of broadband is the last mile. The central questions of broadband policy are thus more properly framed in terms of how best to foster competition in alternative network technologies operating in the last mile. The current degree of concentration in the physical layer has traditionally been attributed to both supply-side and demand-side considerations.^^ The supply-side consideration is the fact that building the physical network of wires needed to provide DSL and cable modem service requires the incurrence of substantial sunk costs. The presence of high sunk costs in turn gives rise to a tendency towards
^^ Yoo, supra note 61, at 241-42. ^^ The following discussion is based on Yoo, supra note 3; and Yoo, supra note 71, at 248-49. For a briefer discussion applying a similar analysis to another type of electronic communications, see Yoo, Rethinking Free, Local Television, supra note 13, at 1603 & n.61.



natural monopoly. On the demand side is a series of considerations generally termed network economic effects. Network economic effects exist when the value of a network is determined by the number of other people connected to that network. The more people that are part of the network, the more valuable the network becomes. This dynamic in turn can create considerable demand-side economies of scale that will reinforce the tendency towards concentration. What has been largely overlooked in the current debates is how allowing networks to differentiate the services they offer can mitigate the forces traditionally thought to induce market concentration in communications networks. Conversely, measures that limit networks' ability to differentiate their services only serve to reinforce these tendencies. There is thus a real possibility that imposing network neutrality may actually worsen rather than alleviate the central policy problem confronting the broadband industry.

1. Declining Average Costs and Supply-Side Economies of Scale


The supply-side considerations that cause last-mile services to exhibit a tendency towards natural monopoly can most easily be understood by focusing on the shape of the average cost curve.^^ If the average cost curve is decreasing, firms with the largest volumes can provide services the most cheaply, which in turn allows them to underprice their smaller competitors. The price advantage allows the largest players to capture increasingly large shares of the market, which reinforces their cost advantage still further. Eventually the largest firm will gain a sufficient cost advantage to drive all of its competitors out of the market. Whether average cost is increasing or decreasing is determined by the magnitude of the sunk costs. On the one hand, the ability to spread sunk costs over increasingly large volumes places downward pressure on average cost. For example, spreading a $100 million sunk-cost investment across 1 million customers would require allocating an average of $100 in sunk costs to each customer. If the same sunk-cost investment were spread over 10 million customers, each consumer would have to pay only an average of $10 in order to cover sunk costs. The larger the sunk costs relative to the overall demand, the more pronounced these scale economies will be, although the marginal impact of this effect will decay exponentially as production increases.
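Stated compactly (the notation is mine, not the chapter's): if S denotes the sunk investment, q the number of customers served, and v(q) the variable cost per customer, then average cost is

\[
AC(q) \;=\; \frac{S}{q} + v(q),
\]

so the sunk-cost component of average cost falls, at a diminishing rate, as q grows, from $100 per customer at q = 1 million to $10 per customer at q = 10 million in the example above.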


At the same time, the scarcity of factors of production and the principle of diminishing marginal returns typically cause variable costs to increase as volume increases. Whether average cost is rising or falling at any particular point is determined by which of these two effects dominates the other. When the necessary sunk-cost investments are large, the former effect tends to loom as the more important and causes average cost to decline. Because entry by new broadband networks tends to require large sunk-cost investments, the market for last-mile providers is generally expected to exhibit a natural tendency towards concentration. What network neutrality advocates have failed to recognize is how allowing last-mile broadband providers to differentiate their product offerings can help prevent declining-cost industries from devolving into natural monopolies.^^ It is not unusual for small-volume producers to survive against their larger rivals even in the face of unexhausted economies of scale by targeting those customers who place the highest value on the particular types of products or services they offer, as demonstrated by the survival of high-cost, low-volume specialty stores in a world increasingly dominated by lower-cost, higher-volume discounters. Although consumers of these small-volume producers will pay more for these specialized products, it is difficult to see how these consumers are worse off. The value that they derive from the specialized product necessarily exceeds the amount they must pay for it; otherwise they simply would not agree to the transaction. Indeed, it is the ability to use prices to signal the intensity of their preferences that allows the particular low-volume version to be available for purchase at all. Last-mile providers have a number of avenues open to them for differentiating their networks. One way is by entering into exclusivity arrangements with respect to content, as demonstrated by the role played by such arrangements in helping direct broadcast satellite (DBS) provider DirecTV emerge as a viable alternative to cable television. For example, DirecTV offers an exclusive programming package known as "NFL Sunday Ticket" that allows sports fans to watch the entire NFL schedule and not just the games being shown by the broadcast networks in their service area. Many cable customers have been frustrated by their inability to purchase NFL Sunday Ticket through their local cable operators. If regulators were to view this exclusivity arrangement solely in static terms, they might be tempted to increase consumer choice by requiring that the programming package also be made available to cable subscribers. The problems underlying such a reaction become manifest when one recalls that the central problem confronting the television industry is the local cable operators' historic dominance over multichannel video distribution. The



market reaction has already demonstrated how permitting exclusivity arrangements can drive the deployment of alternative retail delivery networks. Conversely, requiring that such programming be made available to cable as well as DBS customers would run the risk of eliminating one of the primary inducements to shift from cable to DBS, which would in turn only serve to entrench the local cable operator still further. Another way that last-mile providers can differentiate the services they provide is by optimizing the architecture of their networks for different types of applications. To offer an illustration in the context of broadband, it is theoretically possible that three different broadband networks could co-exist notwithstanding the presence of unexhausted economies of scale. The first network could be optimized for conventional Internet applications, such as e-mail and website access. The second network could incorporate security features designed to appeal to users focusing on e-commerce. The third network could employ policy-based routers that prioritize packets in a manner that allows for more effective provision of time-sensitive applications such as VoIP. If this were to occur, the network with the largest number of customers need not enjoy a decisive price advantage. Instead, each could survive by targeting and satisfying those consumers who place the highest value on the types of service they offer. The example I have sketched illustrates how imposing network neutrality could actually frustrate the emergence of platform competition in the last mile. Simply put, protocol standardization tends to commodify network services. Limiting networks' ability to compete in terms of content or quality of services effectively forces networks to compete on price, which in turn accentuates the advantages enjoyed by the largest players and reinforces the market's tendency towards concentration. Conversely, increasing the dimensions along which networks can compete by allowing them to deploy a broader range of architectures may make it easier for multiple last-mile providers to co-exist.

2. Network Externalities and Demand-Side Economies of Scale


Other commentators have argued that network neutrality must be mandated as a regulatory matter in order to redress the competitive problems posed by network economic effects. For reasons that I have discussed in detail elsewhere, such claims are subject to a number of important analytical


limitations and qualifications.^^ A few brief comments on two of the more salient limitations will suffice to make my point. First, for reasons analogous to the similar requirement with respect to vertical integration, the existing theories require that the network owner have a dominant market position before network economic effects can even plausibly harm competition. The classic illustration of this phenomenon is the development of competition in local telephony during the 1890s, made possible by the expiration of the initial telephone patents.^^ After the Bell System's market share was cut in half, it attempted to rely on network economic effects to reverse its losses. Specifically, it refused to interconnect with the upstart independent telephone companies, hoping that its greater network size would make it sufficiently more attractive to consumers to give it a decisive advantage. This effort ultimately failed, however, since the independent companies that comprised the other half of the industry were able to forestall any negative impact from network economic effects by allying to form a network that was similar in size to the Bell network. In the end, it was control of certain patents critical to providing high-quality long distance service, and not network economic effects, that allowed the Bell System to return to dominance. The clear implication is that the presence of a single competitor of roughly the same size as the network owner will likely be sufficient to eliminate any such anticompetitive problems. Second, the argument that network economic effects create externalities that lead to market failure is wholly inapplicable in the context of telecommunications networks. This is because any externalities that may exist will necessarily occur within a physical network that can be owned.^^ Thus, although individual users may not be in a position to capture all of the benefits created by their demand for network services, the network owner will almost certainly be in a position to do so. Any benefits created by network participation can thus be internalized and allocated through the interaction between the network owner and network users.^

^^ Yoo, supra note 61, at 278-82; Daniel F. Spulber & Christopher S. Yoo, Access to Networks: Economic and Constitutional Connections, 88 CORNELL L. REV. 885, 924-33 (2003). ^^ Roger Noll & Bruce M. Owen, The Anticompetitive Uses of Regulation: United States v. AT&T, in THE ANTITRUST REVOLUTION 290, 291-92 (John E. Kwoka, Jr. & Lawrence J. White eds., 1989). ^^ The literature refers to network externalities that occur in the context of a physical network as "direct network externalities." Michael L. Katz & Carl Shapiro, Network Externalities, Competition, and Compatibility, 75 AM. ECON. REV. 424, 424 (1985). ^^ See S. J. Liebowitz & Stephen E. Margolis, Are Network Externalities a New Source of Market Failure?, 17 RES. LAW & ECON. 1, 11-13 (1995); S. J. Liebowitz & Stephen E. Margolis, Network Externality: An Uncommon Tragedy, 8 J. ECON. PERSP. 133, 137, 141-44 (1994).



The commentary on network economic effects thus does not support the contention that imposing network neutrality is necessary to protect competition. Even if such problems were to exist, it is far from clear that imposing network neutrality would help. Quite the contrary, the literature indicates that compelling interoperability could make matters worse. This is because allowing last-mile providers to differentiate their networks can mitigate the problems resulting from any demand-side economies of scale created by network economic effects that may exist. Simply put, allowing networks to tailor their services to the needs of different groups of customers can offset the economic advantages enjoyed by larger networks in much the same manner as differentiation can offset the supply-side economies of scale. Targeting those customers who place a particularly high value on a particular type of service makes it possible for smaller networks to survive despite the greater inherent appeal of larger networks.^^ Conversely, mandating that all broadband networks employ nonproprietary protocols can foreclose network owners from using differentiation to mitigate the pressures towards concentration. Preventing network owners from varying the services that they offer forces networks to compete solely on price and network size, further reinforcing and accentuating the benefits already enjoyed by the largest players. As a result, network neutrality runs the danger of becoming the source of, rather than the solution to, market failure.

C. Economic Efficiencies from Vertical Integration


In addition to finding common ground on the structural preconditions necessary for vertical integration to harm competition, both Chicago and post-Chicago School theorists agree that vertical integration can yield substantial cost efficiencies.^^ The potential for vertical integration to enhance economic welfare is reflected in the Merger Guidelines, which explicitly recognize that efficiencies may exist that permit a vertical merger
^^ Joseph Farrell & Garth Saloner, Standardization and Variety, 20 ECON. LETTERS 71 (1986); Katz & Shapiro, supra note 4, at 106; Liebowitz & Margolis, supra note 4, at 292. ^^ See Yoo, supra note 61, at 192-200 (reviewing efficiencies resulting from vertical integration identified by Chicago School commentators); id. at 204 (reviewing the acknowledgement by post-Chicago theorists that vertical integration can yield substantial efficiencies).


to go forward even when the market structure raises the possibility of anticompetitive effects.^^ As I have discussed at some length elsewhere, aspects of the broadband industry make it likely that allowing a greater degree of vertical integration could yield substantial economic efficiencies.^^ For example, ISPs minimize traffic by "caching," a process in which the ISP gathers information from popular websites and stores it locally. Once it has done so, the ISP can provide access to the content without tying up resources outside of the ISP's proprietary system. Like all systems involving fixed costs, however, caching systems must spread their costs over as large a number of subscribers as possible in order to be economically viable. If other ISPs are allowed access to cable modem and DSL systems, each ISP's caching costs will be spread across fewer subscribers, a result that would raise the cost of providing high-quality service. Worse yet, the unaffiliated ISPs would either have to create caching systems of their own, which would duplicate costs and waste resources, or would simply provide consumers with a lower quality product. Neither alternative seems particularly attractive. In addition, allowing ISPs to integrate with cable modem systems would also enable broadband providers to take advantage of the available economies of scope. For example, requiring open access would prevent cable modem systems from realizing the transaction cost economies associated with marketing, billing, and servicing both products together. Joint provision can be particularly important when the overall performance of the final product depends upon inputs provided by two different companies and when consumers have trouble distinguishing which of the two companies is responsible for any performance inadequacies. In such cases, the two companies may simply blame each other for the system's poor performance. Customers that experience unsatisfactory performance with an emerging technology may simply choose to drop the product without attempting to identify the cause of the poor performance. In such cases, allowing a single company to provide both complementary services better enables it to ensure the overall performance of the system. As the Supreme Court implicitly recognized in an early cable television case, such concerns are particularly important in the case of new products, such

^^ Non-Horizontal Merger Guidelines, supra note 65, §§ 4.135, 4.24. In addition, the Guidelines give more weight to expected efficiencies in the case of vertical integration than with respect to a horizontal merger. Id. § 4.24. ^^ Yoo, supra note 61, at 260-64; see also Joseph Farrell & Philip J. Weiser, Modularity, Vertical Integration and Open Access Policies: Towards a Convergence of Antitrust and Regulation in the Internet Age, 17 HARV. J. L. & TECH. 85, 97-105 (2004).



as cable modem service, since an emerging industry's "short and long-term well-being depend[s] on the success of the first systems sold."23 The presence of large, up-front fixed costs also leaves both network owners and content/application providers vulnerable to a range of opportunistic behavior that vertical integration can substantially mitigate. Since each user and each ISP does not internalize all of its costs, each has inadequate incentives to conserve bandwidth. In addition, the last-mile provider will eventually have to make additional capital investments to upgrade its system to accommodate increases in traffic. Theoretically, simply forcing each ISP to bear the full costs of its usage could solve such problems. To the extent that the price of open access is limited to marginal cost, however, the existence of such externalities also gives ISPs the incentive to free ride on the last-mile provider by avoiding making any contribution to the additional capital costs that the ISP itself is responsible for creating. Even if regulators attempt to allocate such fixed costs fully, the allocation of fixed costs has proven quite difficult and even arbitrary. Furthermore, ratemaking authorities have had little success setting the appropriate cost of capital to reflect the true ex ante risks once the market has arrived in the ex post world. Moreover, the migration from narrowband to broadband has effected a fundamental structural change in the role played by last-mile providers.^^ Under a narrowband architecture, most residential and small business customers connected to the Internet by using a dial-up modem to place a conventional telephone call. The local telephone company simply connected the local telephone call to the offices of the ISP. As a result, the last-mile provider could serve as a mere pass-through. It did not need to maintain any packet-switching capability of its own. The situation is rather different with respect to broadband technologies. Because both DSL and cable modem providers use the same infrastructure to provide two different types of service (either cable television combined with cable modem service or local telephone service combined with DSL), both types of providers must maintain equipment to segregate the two different communication streams. As a result, last-mile broadband providers must maintain a packet-switched network in their main facilities to hold
23 United States v. Jerrold Elecs. Corp., 187 F. Supp. 545, 557 (E.D. Pa. 1960), aff'd, 365 U.S. 567 (1961) (per curiam); see also Bruce M. Owen & Gregory L. Rosston, Cable Modems, Access and Investment Incentives 19 (Dec. 1998) (unpublished manuscript), available at: ; J. Gregory Sidak, An Antitrust Rule for Software Integration, 18 YALE J. ON REG. 1, 9-10 (2001). ^^ See Yoo, supra note 71, at 33-34.


and route the stream of data packets after they have been separated from other types of communications. Thus, under a broadband architecture, last-mile providers no longer serve as mere pass-throughs. They must instead necessarily perform the same routing functions previously carried out by ISPs. Indeed, some last-mile broadband providers have negotiated their own interconnection agreements with backbone providers and require all of their customers to use their own proprietary ISP, thereby supplanting the role of independent ISPs altogether. The migration of Internet users from narrowband to broadband technologies has thus had the inevitable effect of reducing the viability of many independent ISPs and encouraging last-mile providers to bundle their offerings with ISP services. The fact that last-mile broadband providers must necessarily maintain packet-switched networks within their primary facilities makes it unsurprising that last-mile broadband providers would find it more economical to provide ISP services themselves. The existence of these efficiencies is demonstrated most dramatically by the manner in which the multiple ISP access mandated during the AOL-Time Warner merger has been implemented. Contrary to the original expectations of the FTC, the unaffiliated ISPs that have obtained access to AOL-Time Warner's cable modem systems under the FTC's merger clearance order have not placed their own packet network and backbone access facilities within AOL-Time Warner's headend facilities. Instead, traffic bound for these unaffiliated ISPs exits the headend via AOL-Time Warner's backbone and is handed off to the unaffiliated ISP at some external location. It is hard to see how consumers benefit from such arrangements, given that they necessarily use the same equipment and thus provide the same speed, services, and access to content regardless of the identity of their nominal ISP.^^ The fact that these unaffiliated ISPs have found it more economical to share AOL-Time Warner's existing ISP facilities rather than build their own strongly suggests that integrating ISP and last-mile operations does in fact yield real efficiencies. The absence of consumer benefits underscores the extent to which compelled access represents something of a competition policy anomaly. When confronted with an excessively concentrated market, competition policy's traditional response is to deconcentrate the problematic market, either by breaking up the existing monopoly or by facilitating entry by a competitor. Compelled access, in contrast, leaves the concentrated market

^^ Columbia Telecommunications Corporation, Technological Analysis of Open Access and Cable Television Systems, December 2001, at 22-23, available at: .



intact and instead simply requires that the bottleneck resource be shared. Such an approach may be justified if competition in the concentrated market is infeasible, as was generally believed to be the case with respect to local telephone service until recently. Simply requiring that the monopoly be shared is inappropriate when competition from new entrants is technologically and economically achievable.^^

IV. CRITIQUE OF THE PRINCIPAL NETWORK NEUTRALITY PROPOSALS


Allowing last-mile broadband providers to employ proprietary protocols and enter into exclusivity arrangements thus offers the promise of enabling the realization of economic efficiencies and allowing the deployment of products that better satisfy consumer preferences. Even more importantly, network differentiation can alleviate the supply-side and demand-side economies of scale that represent the primary theoretical sources of market failure in the broadband industry. Preventing last-mile providers from using proprietary protocols and exclusivity arrangements to differentiate the network can thus have the perverse effect of forestalling the emergence of alternative network platforms and of entrenching the existing oligopoly into place. Nonetheless, a number of advocates continue to offer their support for network neutrality. This part will address three of these proposals: the end-to-end argument championed most prominently by Lawrence Lessig, the "connectivity principles" backed by the HTBC, and the "layered model" currently being advanced by MCI. Upon close analysis, it becomes clear that each suffers from some fundamental conceptual problems.

^^ The feasibility of platform competition underscores the problems with viewing previous efforts to standardize and compel access to the local telephone service as precedent for imposing network neutrality on the Internet. See Lessig, supra note 59, at 147-51; Lemley & Lessig, supra note 59, at 934-36, 938. Most steps to mandate access to local telephone networks were justified by the fact that competition in local telephony was believed impossible at the time. Such arguments do not apply to broadband, in which platform competition has emerged as a real possibility. See Yoo, supra note 3.



A. The End-to-End Argument


Many network neutrality advocates have drawn much of the inspiration for their regulatory proposals from the "end-to-end argument" first advanced by Jerome Saltzer, David Reed, and David Clark in 1981.^^ Simply put, the end-to-end argument counsels against introducing intelligence into the core of the Internet and in favor of restricting higher levels of functionality to the servers operating at the edges of the network. The "pipes" that constitute the core of the network should be kept "dumb" and should focus solely on passing along packets as quickly as possible. The fundamental logic of the end-to-end argument is most easily understood by examining the core illustration offered by Saltzer, Reed, and Clark to articulate it: careful file transfer, in which a file stored on the hard drive of computer A is transferred to the hard drive of computer B without errors. Roughly speaking, this function can be divided into five steps:
1. Computer A reads the file from its hard disk and passes it to the file transfer program.
2. The file transfer program running on computer A prepares the file for transmission by dividing it into packets and hands off the packets to the data communication network.
3. The data communication network moves the packets from computer A to computer B.
4. The file transfer program running on computer B reassembles the packets into a coherent file.
5. The file transfer program saves the file onto computer B's hard disk.
Errors can emerge at any step in this process. Computer A can misread the file from the hard disk. The file transfer program on Computer A can introduce mistakes when copying the data from the file. The communication network can drop or change bits in a packet or lose a packet altogether. The file transfer program on Computer B can also produce errors when converting the packets back into a coherent file.
^ J.H. Saltzer et al., End-to-End Arguments in System Design, 2 ACM TRANSACTIONS ON COMPUTER SYS. 277 (1984) (revised version of paper first presented in 1981).



Computer B can miswrite the file to its hard disk. The transfer can also be jeopardized by larger-scale hardware or software failures. Saltzer, Reed, and Clark compared two different approaches to managing the risk of such errors. One approach is to perform error checking at each intermediate step along the way. The other approach is known as "end-to-end check and retry." Under this approach, no error checking is performed at any of the intermediate steps. Instead, the only error checking occurs when the terminating end of the process (computer B) verifies the accuracy of the file transfer with the initiating end (computer A) after the entire transaction has been completed. As a result, Saltzer, Reed, and Clark concluded that system designers should adopt a presumption in favor of the latter approach. They based their argument on two insights. First, no matter how many intermediate error checks are introduced, the terminating end of the file transfer must still verify the transaction with the originating end after all of the steps have been completed. The fact that such end-to-end verification is necessary no matter what other intermediate reliability measures are built into the system renders any additional measures redundant and raises further doubts as to the justifiability of incurring the burdens imposed by those additional measures. Second, intermediate error checking should properly be regarded as an engineering tradeoff between reliability and performance. Errors can be reduced, but only at the cost of introducing a degree of redundancy that will have the inevitable effect of slowing the network down. They emphasize that different applications vary in their tolerance for unreliability as well as their demand for speed. Imposing reliability checks in low-level subsystems that are common to all applications may have the uneconomical result of forcing all applications to incur the performance costs even if the increase in reliability does not provide particular applications with commensurate benefits. Together these insights suggest that system designers should avoid designing higher-level functions into the core of the network. Instead, the Internet should presumptively be engineered with any such functions concentrated in the servers that operate at the network's edge. Saltzer, Reed, and Clark applied the same basic rationale to other system functions, such as delivery guarantees, secure transmission of data, duplicate message suppression, and transaction management. Network neutrality proponents contend that the end-to-end argument justifies prohibiting Internet providers from introducing additional degrees of intelligence into their core networks. In short, all of the intelligence should be restricted to the servers operating at the edge of the network.
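The check-and-retry logic can be made concrete with a short sketch in Python. The checksum choice, the function names, and the retry limit are illustrative assumptions rather than anything specified by Saltzer, Reed, and Clark; the point is simply that verification happens once, between the two endpoints, rather than at every intermediate step.

import hashlib

def careful_file_transfer(read_source, transmit, max_retries=3):
    """End-to-end check and retry: only the endpoints verify correctness.

    read_source() returns the file's bytes from computer A's disk.
    transmit(data) carries the bytes across the network, lets computer B
    write its copy to disk, and returns the digest that B computes from
    what it actually wrote.  No intermediate step does any error checking.
    """
    for _ in range(max_retries):
        data = read_source()
        expected = hashlib.sha256(data).hexdigest()  # computed at the sending end
        reported = transmit(data)                    # computed at the receiving end
        if reported == expected:
            return True
        # Any failure -- in the disk read, the copy, the network, or the
        # reassembly -- shows up here as a mismatch and triggers a retry of
        # the whole transfer rather than step-by-step recovery.
    return False

# Toy stand-in for the network plus computer B.
def fake_transmit(data):
    received = bytes(data)  # imagine corruption could occur on this hop
    return hashlib.sha256(received).hexdigest()

assert careful_file_transfer(lambda: b"example payload", fake_transmit)

Intermediate checks could still be layered inside the network as a performance optimization, but, as the authors stress, they would not eliminate the need for this final end-to-end verification.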


They also argue that the end-to-end argument supports mandating that broadband network owners employ protocols like TCP/IP that ensure that the core of the network remains relatively transparent and dumb. A close analysis of the end-to-end argument reveals that network neutrality proposals are based on an overreading of Saltzer, Reed, and Clark's work that expands it far outside its proper scope. In fact, a careful examination of the rationale underlying the end-to-end argument reveals that it is fundamentally incompatible with network neutrality advocates' attempt to turn the end-to-end argument into a regulatory mandate. Although the end-to-end argument does support a presumption against introducing higher-level functions into the network's core, it does not provide any support for elevating this presumption into an inviolable precept. Conceding that it is "too simplistic to conclude that the lower levels should play no part in obtaining reliability," Saltzer, Reed, and Clark's original article articulating the end-to-end argument squarely concludes that "the end-to-end argument is not an absolute rule, but rather a guideline that helps in application and protocol design analysis." In fact, the cost-performance tradeoff underlying the end-to-end argument requires "subtlety of analysis" and can be "quite complex."^^ Indeed, a later article by the same authors responding to calls for allowing the core of the Internet to exercise a greater level of functionality explicitly recognizes that "[t]here are some situations where applying an end-to-end argument is counterproductive" and concludes that the proper approach is to "take it case-by-case." The end-to-end argument is thus more properly regarded as merely "one of several important organizing principles for systems design" rather than as an absolute. Although Saltzer, Reed, and Clark suggest that deviations from it will be rare, they acknowledge that "there will be situations where other principles or goals have greater weight."^^ Other technologists have drawn similar conclusions. One of the original authors of the end-to-end argument, writing with Marjory Blumenthal, candidly acknowledges that "the end-to-end arguments are not offered as an absolute" and that "[t]here are functions that can only be implemented in the core of the network." Indeed, they argue that the developments described in Part I have made the case for introducing greater intelligence into the Internet's core all the more compelling.

^ Id., at 280, 284, 285. To take but one example, the desirability of end-to-end depends in part on the length of the file. If a system drops one message per one hundred messages sent, the probability that all packets will arrive correctly decreases exponentially as the length of the file (and thus the number of packets composing the file) increases. Id., at 280-81. ^^ David P. Reed et al., Commentaries on "Active Networking and End-to-End Arguments," IEEE NETWORK, May/June 1998, at 69, 69 n.1, 70.



"an end-to-end argument isn't appropriate in the first place."^^ Samrat Bhattacharjee, Kenneth Calvert, and Ellen Zegura conclude that the endto-end argument "do[es] not rule out support for higher-level functionality within the networks" and instead simply requires that the costs and benefits inherent in the engineering tradeoff be carefully evaluated. Indeed, there are services that depend on information that is only available inside the network and thus cannot exist without relying to some degree of what has been called "active networking. "^^ Dale Hatfield acknowledges that the desire to improve the security, manageability, scalability, and reliability of the Internet may justify introducing greater intelligence into the core of the network. As a result, Hatfield argues against allowing regulation that prevents network owners from deviating from the end-to-end architecture and instead simply warns that deviations from the end-to-end argument should be undertaken with extreme care.^^ At this point, the incongruity of invoking the end-to-end argument as support for network neutrality as a regulatory mandate should be apparent. Far from justifying an absolute prohibition against placing intelligence in the core of the network, the end-to-end argument stands squarely opposed to such a simplistic approach.^^ Simply put, a close analysis of the end-toend argument reveals that it does not support the proposition for which many network neutrality proponents cite it. Indeed, as Marjory Blumenthal has noted, this incongruity demonstrates the extent to which network neutrality advocates' embrace of the end-to-end argument has left the realm of cost-benefit analysis and has instead entered the realm of ideology.^^ As a result, it is critical that network neutrality proposals not evade critical analysis by masquerading as nothing more than the application of sound engineering principles.

^^ Blumenthal & Clark, supra note 6, at 71, 80. Jerome Saltzer apparently concurs. See Id., at 102n.l9. ^^ Samrat Bhattacharjee et al.. Active Networking and the End-to-End Argument, in PROCEEDINGS OF THE INTERNATIONAL CONFERENCE ON NETWORK PROTOCOLS 220, 221 (1997);

Samrat Bhattacharjee et al., Commentaries on "Active Networking and End-to-End Arguments," IEEE NETWORK, May/June 1998, at 66. ^^ Dale N. Hatfield, Preface, 8 COMMLAW CONSPECTUS 1, 3 (2000). ^^ Although the end-to-end argument only supports a case-by-case approach to network design, it is arguable that such cases will prove so rare that the costs of evaluating the merits of each individual case exceed the benefits of doing so. Such categorical balancing is particularly perilous in industries, such as broadband, that are in a state of technological and economic flux. ^^ Marjory S. Blumenthal, End-to-End and Subsequent Paradigms, 2002 L. REV. MICH. ST. U. DET. C.L. 709, 710.


The foregoing discussion also casts a new and somewhat ironic light on Lessig's observation that "code is law."^^ Lessig's point was that the architecture enshrined in the Internet's communications protocols can have as dramatic an impact on competition and innovation as direct regulation. Network neutrality advocates have failed to appreciate that this admonition cuts both ways. While it is true that allowing Internet providers to impose proprietary protocols could have a significant impact on innovation and competition, forbidding them from doing so could have equally dramatic effects. Either decision necessarily involves policymakers in the unenviable task of picking technological winners and losers, a fact that undercuts claims that elevating the end-to-end argument into a regulatory mandate represents the proper way to show humility about the shape that the Internet may assume in the future.^^ Not only does government-imposed network neutrality contradict the letter of the end-to-end argument; it turns Lessig's admonition on its head. Lessig intended the statement to indicate how the architecture of the Internet could provide a privately provided substitute for many of the functions previously served by law. Indeed, Lessig warned of the dangers of allowing the government to dictate the standards that must be included in their code.^^ It would be a strange inversion of this argument to give the phrase "code is law" literal rather than figurative meaning and to sanction greater governmental control over the architecture of the Internet.

B. HTBC's Connectivity Principles


Other proposals have shifted their attention away from preserving ISP competition and have instead focused on preserving competition among content and applications providers. For example, Professors Timothy Wu and Lawrence Lessig have proposed a network neutrality regime that would prohibit last-mile providers from imposing any restrictions on end users' ability to run the applications, attach the devices, and access the content of their own choosing except those restrictions that are necessary to comply with a legal duty, prevent physical harm to the network, prevent interference with other users' connections, ensure quality of service, and

^^ Lawrence Lessig, CODE AND OTHER LAWS OF CYBERSPACE 6 (1999).

^^ See Lessig, supra note 59, at 35, 39. ^^ See Lawrence Lessig, The Limits in Open Code: Regulatory Standards and the Future of the Net, 14 BERKELEY TECH. L.J. 759, 764-67 (1999).



prevent violations of security.^^ A recent speech by then-FCC Chairman Michael Powell sounded similar themes.^^ The High Tech Broadband Coalition has advanced a similar proposal that would impose a series of "connectivity principles" on all last-mile broadband providers. This proposal would require that all last-mile broadband providers give end users unrestricted access to all content and allow them to run any applications and attach any devices they desire, so long as these efforts do not harm the providers' network, enable theft of services, or exceed the bandwidth limitations of the particular service plan.^ The HTBC's proposal has drawn the support of a group composed primarily of software and content providers known as the Coalition of Broadband Users and Innovators (CBUI).^ Network neutrality proponents assert that such nondiscrimination is essential to promoting innovation and to preserving consumer choice in content and applications. These proposals are motivated by a concern that last-mile providers will use their control of the physical layer to reduce competition in the application and content layer by deviating from the TCP/IP protocols currently employed in the logical layer and replacing them with a proprietary, noninteroperable set of protocols. The connectivity principles are designed to short-circuit this dynamic by mandating that last-mile providers adhere to nonproprietary protocols and to open their networks to all applications and content on a nondiscriminatory basis, which in turn would preserve applications and content providers' access to end users.

^^ Ex parte Submission in CS Docket No. 02-52 at 12-15, Appropriate Regulatory Treatment for Broadband Access to the Internet Over Cable Facilities, F.C.C. filed Aug. 22, 2003, CS Dkt. No. 02-52, available at: ; Lessig, supra note 59, at 156-58; Wu, supra note 59, at 165-72. ^^ Michael K. Powell, Preserving Internet Freedom: Guiding Principles for the Industry, 3 J. ON TELECOMM. & HIGH TECH. L. 5 (2004).

^^ Comments of the High Tech Broadband Coalition at 6-9, Appropriate Regulatory Treatment for Broadband Access to the Internet Over Cable Facilities (F.C.C. filed June 17, 2002) (CC Dkt. No. 02-52), available at: . ^^ Ex parte Communication from the Coalition of Broadband Users and Innovators at 3-4, Appropriate Regulatory Treatment for Broadband Access to the Internet Over Cable Facilities (F.C.C. filed Jan. 8, 2003) (CS Dkt. No. 02-52), available at: . CBUI includes such notable content and software providers as Microsoft, Disney, Amazon.com, eBay, and Yahoo!, as well as the Media Access Project, the Consumer Electronics Association, and the National Association of Manufacturers.


Viewing these proposals from the perspective of vertical integration theory reveals that they suffer from some fundamental conceptual flaws. As discussed earlier, the structural preconditions that must be satisfied before vertical integration between last-mile providers and content/applications providers can plausibly threaten competition are not satisfied. In addition, these proposals are attempting to protect and promote competition in the segments of the industry that are already the most competitive and the least protected by entry barriers, which underscores the extent to which these proposals have misframed the central policy problem confronting the broadband industry. Moreover, the connectivity principles fail to take into account the extent to which the broadband industry is being confronted with the fundamental pricing problem that arises with respect to all shared facilities that are subject to congestion. Under the standard measures of economic performance, welfare is maximized when customers use the shared facility only up to the point where the benefits from consuming an additional unit of the shared facility no longer exceed the costs of allowing them to consume an additional unit. Firms that charge prices that are not sensitive to the amount of bandwidth used will soon confront a quandary. Since the cost to customers of incremental use is zero, they will increase their use of the facility until the marginal utility of an additional use is zero. The congestion costs of the additional uses are not zero, however. The result is an equilibrium in which the amount of use is economically excessive. This problem can be solved by employing a two-part pricing scheme in which the customer pays both a fixed amount for its connection as well as a per-use amount that is set equal to the congestion costs associated with an incremental use of the system. This would result in a pricing system in which the amount paid would vary with the actual bandwidth used.^^ Unfortunately, this solution involves a number of administrative difficulties. Experience in other communications-related industries suggests that the costs of monitoring and billing on a per-use basis may be prohibitively expensive.
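The two-part scheme just described can be written compactly as follows (the notation is mine, not the chapter's): a customer using q units of bandwidth would pay

\[
T(q) \;=\; F + p \cdot q,
\]

where F is the fixed connection charge and the per-unit price p is set equal to the congestion cost that an incremental unit of use imposes on other users. With p set at that level, customers expand their usage only so long as its value to them exceeds the congestion it creates, which is the welfare-maximizing stopping point described above; with p equal to zero, usage expands until its marginal value is zero even though its congestion cost is not.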

"^^ Jeffrey K. MacKie-Mason & Hal R. Varian, Pricing Congestible Network Resources, 13 IEEE J. ON SELECTED AREAS IN COMMUNICATIONS 1141 (1995).



In the absence of a precisely calibrated per-use pricing system, many last-mile providers have turned to rough proxies by prohibiting applications and equipment associated with more intensive bandwidth use. For example, some last-mile providers have prohibited customers from streaming media, from attaching content servers, or from serving as an ISP. Other last-mile providers have begun offering two-tier plans that charge more to users who want to use bandwidth-intensive applications, such as home networking and virtual private networking.^ These pricing innovations and equipment/allocation restrictions represent an economically rational response to the congestion problems associated with broadband service. The proposals advanced by the proponents of connectivity principles would have the unfortunate effect of unduly limiting last-mile providers' ability to address this fundamental problem.

C. MCI's Layered Model


A public policy paper authored by Richard Whitt of MCI has expanded upon the layered model discussed in Part I.A.2 and advanced another proposal that has emerged as a focal point in the discussions about network neutrality.^^ Although the MCI policy paper covers a broad range of issues, its core proposal can be boiled down into two main principles. First, it builds on ideas advanced by other commentators to argue that the FCC should regulate in a manner that respects and maintains the integrity of the layers of the broadband industry. In particular, MCI offers a series of examples purporting to demonstrate why the FCC should not use regulation of the physical and logical layer in order to protect competition at the content and applications layers.^ Second, MCI's proposal argues in favor of subjecting last-mile broadband providers to a wholesale access requirement similar to the unbundled network access requirement currently imposed on incumbent local exchange carriers under the Telecommunications Act of 1996. MCI bases this aspect of its proposal on the supposed need to prevent last-mile broadband providers from "leveraging" their supposed market power in a way that will harm the applications layer. Although MCI's analysis has its merits,^^ it suffers from a number of fundamental flaws. By invoking the notion of monopoly leverage, it relies
^^ Whitt, supra note 63. ^ Id., at 36-44. ^^ In particular, I agree with its call to reject the technology-specific approach embodied in the basic structure of the Communications Act of 1934, which treats each communications medium as a regulatory universe unto itself. Id., at 2-26. As I have noted elsewhere, the technologically oriented division reflected in the basic structure of the Communications Act of 1934 could provide a satisfactory basis for policy making only so long as the various technologies did not act as substitutes for one another, since the FCC could focus on each medium in isolation and could craft solutions tailored to the particular type of communications conveyed, as well as to the economics underlying the means of transmission. The emergence of technologies such as cable television and wireless telephony, which allowed consumers to receive both


on an economic theory that is analytically suspect. In addition, in arguing for subjecting last-mile providers to unbundled access requirements, the MCI proposal adopts an unnecessarily static vision that ignores the basic insights of classic property theory. Although MCI correctly identifies the promotion of competition among alternative network platforms as the proper focus for broadband policy, it ignores that compelling access to the existing platforms can forestall the emergence of precisely the type of platform competition that MCI seeks to promote. In other words, the regulatory response that MCI would impose in an attempt to redress market power might have the perverse effect of cementing the existing oligopoly into place. Under such circumstances, the intervention that MCI envisions would be the source of market failure, rather than its solution.

1. The Economic Critique of Leverage

As noted earlier, a consensus has emerged that certain structural preconditions must be met before a coherent leveraging claim could even be stated. Simply put, absent market power in the primary market, a firm has nothing to use for leverage. Chicago School theorists offered a more radical attack on the leverage theory of vertical integration. They argued that even when the structural preconditions identified in the first attack were met, vertical integration did not provide firms with any additional market power. In other words, although firms with monopoly power may have the ability to exercise leverage over vertically related markets, those firms typically lack the incentive to do so. This is because there is only one monopoly profit in any chain of production, and any monopolist can capture all of that profit without having to resort to vertical integration. All it has to do is simply price its goods at the monopoly level. Thus, even if firms can exercise leverage, they will generally find it unnecessary to do so.

voice and video communications through either wireline or wireless media, caused this neat, dichotomous universe to unravel. Moreover, the problem is about to get much worse. The impending shift of all networks to packet-switched technologies, in which all communications are reduced to data bits that can be transmitted through any network, promises to cause all of the distinctions based on the means of conveyance and the type of speech conveyed to collapse entirely. See Yoo, supra note 61, at 286-90.
47 See Robert H. Bork, THE ANTITRUST PARADOX 226-31, 372-73, 375 (1978); Richard A. Posner, ANTITRUST LAW: AN ECONOMIC PERSPECTIVE 173, 197 (1976); Ward S. Bowman, Jr.,

Tying Arrangements and the Leverage Problem, 67 YALE L. J. 19, 20-21 (1957); Aaron Director & Edward H. Levi, Law and the Future: Trade Regulation, 51 Nw. U. L. REV. 281, 290 (1956). Chicago School scholars recognized a number of exceptions to their critique of lever-


Post-Chicago theorists have identified a number of circumstances under which leverage can be profitable. The context-specific nature of these models provides little support for a blanket prohibition of vertical integration. Indeed, many of these models suggest that in many circumstances vertical integration may well be welfare enhancing.

2. The Problematic Nature of Compelled Access as a Remedy

Furthermore, scholars of competition policy generally agree that compelled access is, in many ways, quite problematic as a remedy. If regulators compel non-discriminatory access without putting any restrictions on the price charged, the monopolist will simply charge the full monopoly price. While such access would be beneficial to the monopolist's competitors, it provides no benefits to consumers, since the monopoly is left intact, and no improvements in price or output can be expected. Absent some regulation of the terms and conditions of access, compelled access represents something of an anomaly. As Professors Areeda and Hovenkamp note, the purpose of competition policy "is not to force firms to share their monopolies, but to prevent monopolies from occurring or to break them down when they do occur."

age. None of them are apposite to the context of broadband. See Yoo, supra note 59, at 189-91.
48 See, e.g., Oliver Hart & Jean Tirole, Vertical Integration and Market Foreclosure, BROOKINGS PAPERS ON ECONOMIC ACTIVITY: MICROECONOMICS 205 (1990); Louis Kaplow,

Extension of Monopoly Power Through Leverage, 85 COLUM. L. REV. 515 (1985); Janusz A. Ordover et al., Equilibrium Vertical Foreclosure, 80 AM. ECON. REV. 127 (1990); Michael H. Riordan, Anticompetitive Vertical Integration by a Dominant Firm, 88 AM. ECON. REV. 1232 (1998); Michael A. Salinger, Vertical Mergers and Market Foreclosure, 103 Q. J. ECON. 345 (1988); Steven C. Salop & David T. Scheffman, Raising Rivals' Costs, 73 AM. ECON. REV. 267 (1983); Michael D. Whinston, Tying, Foreclosure, and Exclusion, 80 AM. ECON. REV. 837 (1990); Ian Ayres, VERTICAL INTEGRATION AND OVERBUYING: AN ANALYSIS OF FORECLOSURE

VIA RAISED RIVALS' COSTS (Am. Bar Found., Working Paper No. 8803, 1988)

49 Hart & Tirole, supra note 48, at 212; Michael W. Klass & Michael A. Salinger, Do New Theories of Vertical Foreclosure Provide Sound Guidance for Consent Agreements in Vertical Merger Cases?, 40 ANTITRUST BULL. 667, 679-82 (1995); Michael H. Riordan & Steven C. Salop, Evaluating Vertical Mergers: A Post-Chicago Approach, 63 ANTITRUST L. J. 513, 522-27, 544-51, 564 (1995); Salinger, supra note 48, at 349-50; Ayres, supra note 48, at 17-20, 23-24.
50 The following discussion is based on Yoo, supra note 59, at 243-47, 268-69.
51 3A Phillip E. Areeda & Herbert Hovenkamp, ANTITRUST LAW 771b, at 174 (1996).


Thus, if an access remedy is to benefit consumers, it must necessarily include a requirement that the rates charged be reasonable. Any attempt at regulating rates would likely be extremely difficult to administer. Since the monopolist has already evinced a lack of willingness to deal with its competitor, the relationship is likely to be surrounded by disputes over the terms and conditions of the compelled access. As Professors Areeda and Hovenkamp have noted, once access is ordered, [t]he plaintiff is likely to claim that the defendant's price for access to an essential facility (1) is so high as to be the equivalent of a continued refusal to deal, or (2) is unreasonable, or (3) creates a 'price squeeze' in that the defendant charges so much for access and so little for the product it sells in competition with the plaintiff that the latter cannot earn a reasonable profit. The disputes, moreover, will not be limited just to price. The parties are likely to disagree on non-price terms and conditions as well. It goes without saying that rate regulation in declining cost industries has been plagued by complicated valuation and second-best pricing problems that have bordered on insurmountable. Previous attempts at imposing rate regulation on cable television have largely been a failure, as the variability in the quality of cable programming has frustrated efforts to impose meaningful rate regulation. The FCC's history with policing access regimes provides ample reason to question whether it is institutionally capable of executing this charge. For example, leased access to cable television systems has been plagued by precisely the type of problems predicted by Areeda and Hovenkamp. Simply put, the regulatory regime went almost entirely unused, with the various parties disagreeing vehemently on the reason for the regime's failure. Firms that sought leased access complained that local cable operators demanded excessively high prices and failed to bargain in good faith, while the cable operators claimed that the lack of leased access reflected a lack of demand for it. Even more spectacular has been the inability of

52 Id., 174e, at 227-28; see also id., 765c, at 103-04, 772, at 197.
53 See Thomas W. Hazlett & Matthew L. Spitzer, PUBLIC POLICY TOWARD CABLE TELEVISION

(1997); Gregory S. Crawford, The Impact of the 1992 Cable Act on Household Demand and Welfare, 31 RAND J. ECON. 422 (2000).
54 See Time Warner Entm't Co. v. FCC, 93 F.3d 957, 970 (D.C. Cir. 1996); 1990 Report on Cable Competition, supra note 207, at 5048, 177; Donna M. Lampert, Cable Television: Does Leased Access Mean Least Access?, 44 FED. COMM. L.J. 245 (1992).


the FCC and the state public utility commissions to use access requirements to foster competition in local telephone markets as mandated by the Telecommunications Act of 1996. The FCC's experience in policing other access regimes thus provides little reason to be optimistic that it will be able to manage the myriad problems associated with administering a regime of compelled access in this instance. It is telling that two distinguished scholars of network industries not particularly noted for deregulatory views have suggested that access regimes have proven so unworkable that they should be abandoned altogether. Compelled access regimes are thus extremely questionable from the standpoint of static efficiency, since it is far from clear whether they can deliver the requisite benefits in price and quantity needed to justify the enterprise. Even more profound is the impact that compelled access regimes have on dynamic efficiency. From the perspective of dynamic efficiency, the only viable way to solve the problems caused by monopoly bottlenecks is the appearance of a new entrant that directly competes with the bottleneck facility. Access regimes, however, may actually retard such entry. Access dampens investment in two ways that harm consumers. First, it is now well recognized that resources are most likely to receive the appropriate level of conservation and investment if they are protected by well-defined property rights. As Garrett Hardin pointed out in his pathbreaking work on the "Tragedy of the Commons," resources that are in effect jointly owned tend to be overused and receive suboptimal levels of investment. Hardin's insights apply with equal force to compelled access regimes. Since any benefits gained from investments in capital or research must be shared with competitors, forcing a monopolist to share its resources reduces its incentives to improve its facilities and pursue technological innovation. In addition, compelling access to an input also discourages other firms that need the input from entering into business alliances with potential alternative suppliers of the input. In effect, forcing a dominant provider to share an input rescues other firms from having to supply the relevant input for themselves. While such access would clearly benefit other ISPs by
55 See United States Telecom Ass'n v. FCC, 359 F.3d 554, 595 (D.C. Cir. 2004) (criticizing "the Commission's failure, after eight years, to develop lawful unbundling rules").
56 See Paul L. Joskow & Roger G. Noll, The Bell Doctrine: Applications in Telecommunications, Electricity, and Other Network Industries, 51 STAN. L. REV. 1249 (1999).
57 Garrett Hardin, The Tragedy of the Commons, 162 SCIENCE 1243 (1968); accord Harold Demsetz, Toward a Theory of Property Rights, AM. ECON. REV. (Papers and Proceedings), May 1967, at 347, 354-59.


rescuing them from having to make the capital investments that would have otherwise been required for them to secure carriage through other means, it would provide no tangible benefit to consumers, as price and output would remain at monopoly levels. Attempts at forcing prices below monopoly levels, however, would force regulators to referee a never-ending series of disputes over the terms and conditions of access, a role for which the FCC has historically proven ill-suited. Thus, access should not be compelled whenever the resource is available from another source, even if it is only available at significant cost and in the relatively long run. This is particularly true in technologically dynamic industries, in which the prospects of developing new ways either to circumvent or to compete directly with the bottleneck are the highest. The inevitable lag in adjusting regulation also raises the risk that regulations, such as access, that protect incumbents from new entry will continue to exist long after the justifications for enacting the regulation have disappeared. This suggests that compelling access to the physical layer would harm dynamic efficiency as well, by slowing the deployment of high-speed broadband services. The fact that any positive developments would need to be shared with competitors would represent a deviation from the well-defined property rights needed to provide last-mile providers with the incentive to engage in efficient levels of investment in their own technology. In addition, compelled access would also rescue unaffiliated ISPs and content/application providers from having to support the development of alternative broadband providers. These unaffiliated ISPs represent the natural strategic partners for DSL, satellite, and other broadband transport providers seeking to build services to compete directly with incumbent last-mile providers. Providing them with access to DSL and cable modem systems would remove any incentive to support such initiatives. This insight underscores the core problem in the broadband industry, which is the paucity of providers capable of delivering broadband transport services into the home. Network neutrality would do nothing to alleviate this central problem, since compelling access would not provide consumers with any additional options for broadband transport services. Likewise, allowing vertical integration will not make this problem any worse. On the contrary, network neutrality could well make the problem worse by preventing last-mile providers from realizing the available effi-

58 See, e.g., Stephen Breyer, REGULATION AND ITS REFORM 286-87 (1982); 2 Alfred E. Kahn, THE ECONOMICS OF REGULATION 127 (1971); Richard A. Posner, Natural Monopoly and Its Regulation, 21 STAN. L. REV. 548, 611-15 (1969).


ciencies and by depriving alternative broadband transport providers of their natural strategic partners.

3. The Potential Advantages of Interlayer Combinations and Competition

Economic theory also undercuts the other principal aspect of MCI's proposal, i.e., that the FCC should regulate the broadband industry to preserve the integrity of the layers. As discussed in Part II, prohibiting broadband providers from combining layers can prevent the realization of certain efficiencies and can prevent providers from offering innovative and differentiated services that are more responsive to consumer needs. Furthermore, MCI's principle of layer inviolability ignores the fact that providers operating at a different level often provide one of the primary sources of competition in the layered world. Indeed, as Timothy Bresnahan has noted, it is to be expected that providers operating at adjacent layers would exert constant pressure against one another by attempting to extend their dominance into the adjacent layer. In fact, regulatory intervention into the interfaces between the various layers might actually hurt competition by locking in the existing relationships in ways that decrease the level of vertical competition.

V. THE ROLE OF REGULATION

It is thus clear that permitting last-mile providers to deviate from the universal interoperability envisioned by the proponents of network neutrality may actually yield substantial economic benefits. Not only does differentiation potentially put networks in a better position to satisfy any underlying heterogeneity in consumer preferences; it also has the potential to alleviate the supply-side and demand-side economies of scale that are the sources of market failure that justify regulatory intervention in the first place.

59 Timothy F. Bresnahan, New Modes of Competition: Implications for the Future Structure of the Computer Industry, COMPETITION, INNOVATION AND THE MICROSOFT MONOPOLY: ANTITRUST IN THE DIGITAL MARKETPLACE 155, 167-69 (Jeffrey A. Eisenach & Thomas M.

Lenard eds., 1999).
60 See Yoo, supra note 3.


The case against network neutrality is further bolstered by the risk that regulation might itself induce market failure by causing the existing oligopoly in last-mile technologies to persist long after technological improvements have made real competition possible. If access to a bottleneck network were not compelled, those who did not want to pay anticompetitively excessive prices for network services would have the incentive to invest in alternative network capacity. Compelling access, on the other hand, would rescue those who would otherwise be financing the buildout of other last-mile technologies from having to undertake those investments. Network neutrality may thus have the effect of starving alternative broadband platforms of the resources they need to build out their networks. Although such a policy might have been reasonable during previous eras, when construction of new network platforms was infeasible, it is unjustifiable in an environment in which competition from alternative network platforms is a real option. The task confronting policy makers is made all the more difficult by the fact that making any difference would require policy makers to intervene at a fairly early stage in the technology's development, since governmental intervention after the market has settled on the optimal technology would serve little purpose. Although whether regulation or private ordering would provide the better means for determining the optimal technology is ultimately an empirical question, there are a number of considerations that suggest that public policy would be better served by relying on the latter. There are a number of salient examples where allowing competition among different protocols promoted a degree of experimentation. For example, during its early years the electric power industry went through an extended period of competition between standards based on direct current (DC) and alternating current (AC) that enhanced competition and promoted innovation in electrical appliances. Even now, the electrical power network is diverse enough to accommodate appliances designed to run on the predominant 110 volt standard as well as larger appliances

61 Bresnahan, supra note 59, at 200-03.
62 Bruce M. Owen & Gregory L. Rosston, LOCAL BROADBAND ACCESS: PRIMUM NON NOCERE OR PRIMUM PROCESSI? A PROPERTY RIGHTS APPROACH 11-12 (AEI-Brookings Joint Center for

Regulatory Studies Related Publication No. 03-19, Aug. 2003), available at: (citing Paul A. David & Julie Ann Bunn, Gateway Technologies and the Evolutionary Dynamics of Network Industries: Lessons from Electricity Supply History, in EVOLVING TECHNOLOGY AND MARKET STRUCTURE 121 (Arnold Heertje & Mark Perlman eds., 1990)). There is thus some irony in the fact that some network neutrality proponents point to the example of electric power as supporting the need for early governmental intervention. See Ex parte Submission, supra note 38, at 3; Wu, supra note 62, at 1165.


requiring 220 volts. Another example drawn from the telecommunications industry is the competition between TDMA and CDMA standards for mobile telephony. Rather than imposing a particular technological vision, the government has allowed these standards to compete in the marketplace. In addition, governmental processes are subject to a number of well-recognized biases. Regulatory decisions are all too often shaped by political goals that are not always consistent with good policy. In addition, policymakers may also find it tempting to give too little weight to the future benefits associated with the entry of alternative network capacity, which will no doubt seem uncertain and contingent, and to overvalue the more immediate and concrete benefits of providing consumers with more choices in the here and now. Indeed, the FCC has allowed short-term considerations to override longer-term benefits in the past. Public choice theory strongly suggests that the bias in favor of the former over the latter is no accident. There thus appears to be considerable danger that compelling access will forestall the buildout of 3G, fixed wireless, and other alternative broadband platforms. I acknowledge the possibility that last-mile broadband providers may be able to use the market power provided by the degree of concentration in local markets to harm competition. For example, it is conceivable that cable operators might prohibit cable modem customers from streaming video in order to protect their market position in the market for conventional television. At the same time, such a prohibition might also represent an understandable attempt to prevent high-volume users from imposing congestion costs on other users. Even network neutrality proponents acknowledge how difficult it can be to determine which is the case. In effect, policymakers are presented with a choice between two possible responses. On the one hand, they can trust their ability to distinguish between these two different situations and limit network neutrality to those situations in which deviations from full interoperability are motivated by anticompetitive considerations. The costs of doing so include the danger that regulators might err in making this determination as well as the risk that compelling access might delay entry by alternative last-mile technologies. On the other hand, regulators can adopt a more humble posture about their ability to distinguish anticompetitive from procompetitive
63 See Christopher S. Yoo, The Rise and Demise of the Technology-Specific Approach to the First Amendment, 91 GEO. L.J. 245, 272-75 (2003).
64 See Lessig, supra note 59, at 46-47, 167-76; Cooper, supra note 59, at 1050-52.


behavior and attempt to resolve the problem by promoting entry by alternative broadband platforms. Once a sufficient number of alternative last-mile providers exist, the danger of anticompetitive effects disappears, as any attempt to use an exclusivity arrangement to harm competition will simply induce consumers to obtain their services from another last-mile provider. In this case, the primary costs stem from delay. Because entry by new network platforms will not be instantaneous, there will necessarily be a period of time during which consumers may remain vulnerable to anticompetitive behavior. Choosing between these two approaches depends upon weighing their relative merits, with the understanding that each represents a second-best alternative. Although a formal analysis of the tradeoff exceeds the scope of my comments, my instinct is to favor the latter. It is motivated in part by my belief that regulatory authorities will be more effective at pursuing the goal of stimulating entry by new network platforms than they would be in ascertaining whether a particular exclusivity arrangement would promote or hinder competition. In addition, because the long-term benefits will be compounded over an indefinite period of time, they should dominate whatever short-run static inefficiency losses may exist. Perhaps most importantly, promoting entry has embedded within it a built-in exit strategy. Once a sufficient number of broadband network platforms exist, regulatory intervention will no longer be necessary. This stands in stark contrast with access-oriented solutions, which implicitly assume that regulation will continue indefinitely.

VI. CONCLUSION

The claim that guaranteeing interoperability and nondiscrimination would benefit consumers has undisputed intuitive appeal. The fact that interoperability has represented the historical norm may lead some to put the burden of persuasion on those who would move away from that architecture. However, a close examination of the economic tradeoffs underlying network neutrality reveals a number of countervailing considerations that may not be readily apparent at first blush. Not only does network neutrality risk reducing consumer choice in content and applications, it raises the even more significant danger of stifling the development of further com-
65 See Janusz Ordover & William Baumol, Antitrust Policy and High-Technology Industries, 4 OXFORD REV. ECON. POL'Y 13, 32 (1988); David J. Brennan, Fair Price and Public Goods: A Theory of Value Applied to Retransmission, 22 INT'L REV. L. & ECON. 347, 355 (2002).


petition in the last mile by forestalling the continued emergence of new broadband technologies. Although such an admonition would be well taken under any circumstances, it carries particular force in dynamic industries like broadband that are undergoing rapid technological and marketplace changes.

Chapter 3
Are "Dumb Pipe" Mandates Smart Public Policy? Vertical Integration, Net Neutrality, and the Network Layers Model

Adam Thierer
The Progress & Freedom Foundation

I. INTRODUCTION

We hear a lot of talk these days about "open" versus "closed" systems in the field of high-technology and Internet policy. Examples include: "open spectrum" versus privately-held wireless properties; "open source" versus proprietary software; and mandatory "open access" versus private (contractual) carriage for telecom or broadband networks. Oftentimes, this debate is also cast in terms of "dumb pipes" versus "intelligent networks." A purely dumb pipe, for example, would be a broadband network without any proprietary code, applications, or software included. An intelligent network, by contrast, would integrate some or all of those things into the system. One problem with this open-versus-closed or dumb-versus-smart system dichotomy is that it greatly oversimplifies matters. "Open" or "dumb" systems are almost never completely open or stupid; "closed" or "smart" systems are almost never completely closed or perfectly intelli-
* The author wishes to thank Andrew Odlyzko, Bruce Owen, Philip Weiser, Tim Wu, Jeffrey Eisenach, and Daniel Brenner for their comments and suggestions, and Thomas Pearson for his research assistance. This essay originally appeared in 3 JOURNAL ON TELECOMMUNICATIONS & HIGH-TECHNOLOGY LAW 275 (2005).


gent. Nonetheless, an important question raised by these debates is whether, as a matter of public policy, lawmakers should be mandating one type of business arrangement or system architecture over another. More specifically, debates over open versus closed systems raise the question of whether vertical integration within the communications and broadband marketplace is to be feared or welcomed. That question is receiving increasing attention in Internet policy circles today as numerous scholars begin to conceptualize this market in terms of layers. Most of these "network layers" models divide our increasingly packet-based Internet world into at least four distinct layers: (1) Content Layer; (2) Applications Layer; (3) Logical/Code Layer; and (4) Physical/Infrastructure Layer. The layers model is an important analytical tool that could help lawmakers rethink and eventually eliminate the increasingly outmoded policy paradigms of the past, which pigeonholed technologies and providers into discrete industrial regulatory categories. But should the layers model be taken a step further and be formally enshrined as a new regulatory regime? And should a layer-breaker be considered a law-breaker? Some scholars and policymakers appear to be moving in that direction with their advocacy of dumb pipe mandates that insist that providers essentially stay put in their primary layer of operation. For example, fearing the supposed ill effects of greater vertical integration in the broadband marketplace, some scholars and policymakers are advocating "net neutrality" mandates that would limit efforts by physical infrastructure owners to integrate into other layers, especially content. Net neutrality proposals illustrate how the layers model could be used to restrict vertical integration in this sector by transforming the concept into a set of regulatory firewalls between physical infrastructure, code or applications, and content. You can offer service in one layer, but not another. Variations on this theme have already been seen in the debate over Microsoft's integration of a web browser or media player into its Windows operating system and in the AOL-Time Warner merger. In both cases, fears about vertical integration into adjoining layers drove numerous open access regulatory proposals. Had the proposed Comcast-Disney merger moved forward, similar arguments likely would have been raised since the combined entity would have been a major player in the physical infrastructure, applications, and content layers. Undoubtedly, however, the proposed deal foreshadows similar combinations to come that will raise such policy issues. And recent rumblings about treating search engine

1 Michael Feazel and Brigitte Greenberg, Comcast Bids $66 Billion for Disney, 'Huge' Political Reaction Seen, COMM. DAILY, Feb. 12, 2004, at 2.


provider Google as a public utility as it grows larger provides another example of how layer-jumping could result in a regulatory response. This article argues, however, that far from being antithetical to innovation and competition, vertical integration can play a vital role in ensuring the development of a more robust broadband marketplace and should not be restricted through an overly rigid application of the network layers model or Net neutrality mandates. As broadband service providers (BSPs) and other Internet service and applications providers seek to expand and diversify their range of consumer offerings by integrating into other network layers, policymakers should not proscribe such layer-jumping. Rather, they should be agnostic with regard to the intelligence of broadband networks in general. Moreover, while the dumb pipe approach may have great merit as a business model and eventually become the approach many BSPs adopt over time, it should not be enshrined into law as a replacement regulatory regime. Added network "intelligence" in the form of bundled applications and services can provide the public with an expanded array of choices that make their Internet experience more user-friendly. More importantly, dumb pipe mandates might have a discouraging effect on competition in the creation of entirely new networks and services if these mandates come to be a formal prohibition on vertical integration between layers. For these reasons, a dumb pipe mandate would be quite dumb indeed. This article begins, in Section I, by laying out dumb pipe theory and the many variations on the network layers model. Section II attempts to draw a linkage between the network layers model, dumb pipe theory and emerging net neutrality regulatory proposals. After outlining these theories and proposals, the article shifts gears and critiques efforts to enshrine these principles into law. Section III discusses the potential disincentives to innovate and create entirely new broadband platforms that might accompany the adoption of dumb pipe mandates or net neutrality regulations. Section IV argues that if there is anything to dumb pipe theory, "openness" and (semi-) dumb pipes will likely prevail naturally in the marketplace, making government regulation a risky proposition. In particular, Section V warns that if past history is any guide, the potential for regulatory capture is quite real and worth considering before adopting such mandates. Questions are also raised regarding the applicability of property rights concepts within the field of broadband networks. Section VI discusses the importance of pricing flexibility and warns that if dumb pipe/net neutrality regulation prohibits pricing freedom, innovative business models and pricing methods may be preempted. Section VII discusses concerns about market power in the broadband marketplace and argues that the increasing con-


testability of communications markets makes Carterfone-like regulatory mandates unnecessary. Section VIII concludes by discussing some short-term developments worth watching that should help us gauge how policymakers might apply network layers models or dumb pipe mandates in the future. The article concludes that a dumb pipe mandate—whether applied through a network layers law or net neutrality mandates—would not constitute smart public policy. Such legal mandates are not needed to deter supposed "discrimination" or preserve the Net's "openness."

II. THE NETWORK LAYERS MODEL AND DUMB PIPE THEORY

Officials with MCI have been aggressively pushing a new study entitled A Horizontal Leap Forward: Formulating a New Public Policy Framework Based on the Network Layers Model. MCI's white paper is the most succinct articulation to date of the Internet protocol-based "layering concept" previously sketched out by academics Lawrence Lessig, Lawrence Solum and Minn Chung, Kevin Werbach, Philip J. Weiser, and Douglas Sicker, among others. Although there is some disagreement within this literature about how many layers can be identified, as the MCI white paper notes, most of these
2 Richard S. Whitt, A HORIZONTAL LEAP FORWARD: FORMULATING A NEW PUBLIC POLICY FRAMEWORK BASED ON THE NETWORK LAYERS MODEL (MCI Public Policy Paper, March

2004), available at .
3 See Lawrence Lessig, The Architecture of Innovation, 51 DUKE L.J. 1783 (2002), available at: ; Lawrence Lessig, THE FUTURE OF IDEAS: THE FATE OF THE COMMONS IN A CONNECTED WORLD

19-25 (Random

House 2001); Mark A. Lemley & Lawrence Lessig, The End of End-to-End: Preserving the Architecture of the Internet in the Broadband Era, 48 UCLA L. REV. 925 (2001).
4 Lawrence B. Solum & Minn Chung, THE LAYERS PRINCIPLE: INTERNET ARCHITECTURE AND

THE LAW (Univ. of San Diego Pub. Law and Legal Theory Research Paper No. 55, June 2003), available at: .
5 Kevin Werbach, A Layered Model for Internet Policy, 1 J. ON TELECOMM. & HIGH TECH. L. 37 (2002).
6 Philip J. Weiser, Regulatory Challenges and Models of Regulation, 2 J. ON TELECOMM. & HIGH TECH. L. 1 (2003).

7 Douglas C. Sicker & Joshua L. Mindel, Refinements of a Layered Model for Telecommunications Policy, 1 J. ON TELECOMM. & HIGH TECH. L. 69 (2002).


models divide our increasingly packet-based Internet world into at least four distinct layers:
1. Content Layer: speech, communications, text, music, video
2. Applications Layer: e-mail, word processors, Voice-Over Internet Protocol (VoIP), web browsers
3. Logical/Code Layer: TCP/IP, HTTP, FTP
4. Physical/Infrastructure Layer: DSL, cable, satellite, Wi-Fi, fiber optics
These layering models are important because they challenge traditional technological, legal, and regulatory assumptions about the way the communications marketplace operates. The traditional vertical "silo" model of communications industry regulation views each industry sector as a distinct set of entities that do not interact and which should be regulated under different principles. For example, telephone companies are governed under Title II of the Communications Act as common carriers. Wireless providers and broadcasters fall under Title III and receive licenses to operate "in the public interest," while cable providers operate under Title VI and face neither common carrier obligations nor licensing requirements but are governed by local franchising boards. Despite the rapid convergence of these formerly distinctive industry sectors, discrete regulatory regimes and policies continue to exist that are at odds with emerging technological realities. In particular, the rise of the packet-based Internet and high-speed broadband networks challenges traditional assumptions about the vertical silo model of regulation. In other words, although the communications/broadband marketplace is becoming one giant fruit salad of services and providers, regulators are still separating out the apples, oranges, and bananas and regulating them differently. The layers model is an important analytical tool that could help public policymakers rethink and eventually eliminate these increasingly outmoded regulatory paradigms. But should it remain merely an analytical framework, or should it be enshrined into law as the new regulatory paradigm for the communications marketplace? And more importantly, in replacing vertical silos with horizontal layers, will vertical integration between the layers become verboten? In early 2004, MCI issued a follow-up paper also authored by Richard Whitt, entitled Codifying the Network Layers Model, which begins to an-


swer some of these questions. In this latest piece, Whitt criticizes the Federal Communications Commission (FCC) for its recent push to classify broadband services provided by telephone and cable companies as "information services," effectively exempting them from traditional telecom/common carrier regulations. He proposes that cable and telco BSPs instead: (1) be required to make their networks available to rivals on a wholesale basis, or (2) not be allowed to vertically integrate into other layers. In this specific context of entities possessing the ability to leverage market power into otherwise competitive markets, policymakers generally have two choices: restrict (quarantine) the upstream dominant firm, or regulate that firm to some degree (which requires regulation of wholesale price and quality of access). While a restriction on vertical integration would more directly address the market dominance concerns, appropriate regulation designed to facilitate nondiscriminatory access at various layers appears sufficient in most cases to largely negate those concerns. Many forms of vertical integration can and do bring efficiency benefits to consumers, and a relatively small likelihood of harming competition. At the same time, layers analysis helps reveal those notable instances where powerful firms at one level should not be allowed to leverage that power unfairly into adjacent levels, causing significant damage to competition and innovation. Broadband transport provided by the incumbent LECs is one such instance meriting careful regulatory scrutiny. This clearly raises the prospect of the layering model becoming a series of formal regulatory firewalls or quarantines on some firms to encourage or even mandate a "dumb pipe" approach to the provision of communications and broadband services in the future. Layering proponents like Lessig often argue that "a dumb pipe is critical," meaning that it would be

8 Richard S. Whitt, Codifying the Network Layers Model: MCI's Proposal for New Federal Legislation Reforming U.S. Communications Law (MCI Working Paper, March 2004), available at: .
9 Appropriate Framework for Broadband Access to the Internet over Wireline Facilities, 67 Fed. Reg. 9232-9242 (proposed Feb. 28, 2002) (to be codified at 47 C.F.R. pt. 51).
10 WHITT, supra note 8, at 6, 7.


best for BSPs not to provide any integrated content or applications over the lines they own for fear of discrimination against independent suppliers. Lessig and most other proponents of layering models also stress that their models build on, and in some cases seek to protect, the "end-to-end" network design principle that has governed the Internet for so long. The end-to-end principle was first articulated by Jerome Saltzer, David P. Reed, & David D. Clark in 1984. As Lessig summarizes: The end-to-end argument says that rather than locating intelligence within the network, intelligence should be placed at the ends: computers within the network should perform only very simple functions that are needed by lots of different applications, while functions that are needed by only some applications should be performed at the edge. Thus complexity and intelligence in the network are pushed away from the network itself. Thus, the relationship between the layers model, the end-to-end principle, and "dumb pipe" or "stupid network" mandates becomes evident. As Solum and Chung note, "[t]he layers concept is implicit in the end-to-end argument," and from the two usually flows a series of assumptions about the wisdom of integrating additional intelligence into the core of the network. Until recently, however, the "dumb pipe" or "stupid network" thesis did not really have any clear public policy implications. It functioned more as an ideal to which the industry should aspire. For example, throughout the 1990s, technology guru and Telecosm author George Gilder repeatedly stressed the importance of dumb pipes, "dark fiber," and "stupid storage." In fact, one of Gilder's "20 Laws of the Telecosm" was "The Law of Conduits and Content":

11 Teri Rucker, Coalition Urges FCC to Craft Rule on Broadband Access, NAT'L J. TECH. DAILY (PM ED.), Apr. 24, 2003, available at: (quoting Lawrence Lessig). See also Simson Garfinkel, The End of End-to-End?, MIT TECH. REV., July/Aug. 2003, available at: .
12 Jerome H. Saltzer, David P. Reed, & David D. Clark, End-to-End Arguments in System Design, 2 ACM TRANSACTIONS ON COMPUTER SYS. 277 (1984).

13 Lessig, supra note 3, at 34.
14 Solum & Chung, supra note 4, at 19.


This law comes in the form of a commandment to divorce content from conduit. The less content a network owns the more content flows through it. If you are a content company, you want your content to travel on all networks, not just your own. If you are a conduit company, you want to carry everyone's content, not restrict yourself to your own. Companies that violate this rule . . . tear themselves apart. The dumber the network the more intelligence it can carry. More recently this perspective was echoed by Don Tapscott, a management consultant and author of Digital Capital: Harnessing the Power of Business Webs, when he argued in a Wall Street Journal column that, "[T]he rule is that content wants all the distribution it can get. And distribution wants all the content it can get." Similarly, former AT&T engineer David Isenberg was advancing this same thesis as far back as 1997 in a now-famous essay on the Rise of the Stupid Network: A new network "philosophy and architecture" is replacing the vision of an Intelligent Network. The vision is one in which the public communications network would be engineered for "always-on" use, not intermittence and scarcity. It would be engineered for intelligence at the end-user's device, not in the network. And the network would be engineered simply to "Deliver the Bits, Stupid," not for fancy network routing or "smart" number translation. Fundamentally, it would be a Stupid Network. In the Stupid Network, the data would tell the network where it needs to go. (In contrast, in a circuit network, the network tells the data where to go.) In a Stupid Network, the data on it would be the boss. But Gilder, Tapscott, and Isenberg were generally making the case for why dumb pipes and "stupid networks" made sense from an engineering or business perspective. Again, the question left unanswered was whether the dumb pipe approach was merely a conceptual tool and a business model, or whether it should become the central animating principle for future regulation of the entire broadband/Internet marketplace. As we turn

15 George Gilder, Telecosm: How Infinite Bandwidth Will Revolutionize Our World 269 (2000).
16 Don Tapscott, The Magic Kingdom as Content, WALL ST. J., Mar. 30, 2004, at B2.
17 David Isenberg, Rise of the Stupid Network, COMPUTER TELEPHONY, Aug. 1997 (emphasis in original), available at: .


to the debate over so-called "net neutrality," or "digital discrimination" regulation, we see that the latter may soon be the case.

III. DUMB PIPES LITE: THE NET NEUTRALITY PROPOSAL

Since the implementation of the Telecommunications Act of 1996, federal and state policymakers have been fixated on the question of how much access should be provided to the platforms owned by wireline telecom companies and cable operators. While incumbent local exchange carriers have faced an extensive array of infrastructure sharing mandates, cable operators have thus far escaped similar mandates to share their networks with rivals at regulated rates. In fact, federal regulators have essentially crafted an asymmetrical industrial policy that has quarantined cable operators from forced access regulations in order to ensure they become formidable rivals to the Baby Bells. As a result of this regulatory forbearance, the cable industry has made significant investments in network upgrades to develop a high-speed, two-way pipe to the home. Eighty-four billion dollars has been invested by the industry since 1996 to upgrade infrastructure, and the cable industry now controls 64 percent of the high-speed broadband market. But despite ongoing pleas by some policymakers and regulatory advocates for the application of structural open access mandates to both telco and cable operators, there are signs that the days of full-blown structural access may be numbered. On the cable side, federal regulators still show little interest in imposing such infrastructure sharing mandates, and no municipal government has thus far been able to gain the legal right to do so. Meanwhile, although still shackled with a host of unbundling and resale mandates, telco operators chalked up an important victory in March 2004 when the U.S. Court of Appeals for the District of Columbia handed down a blistering decision vacating most of the FCC's latest revision of
18 See generally Adam Thierer & Clyde Wayne Crews, Jr., WHAT'S YOURS IS MINE: OPEN ACCESS AND THE RISE OF INFRASTRUCTURE SOCIALISM (2003).
19 NATIONAL CABLE AND TELECOMMUNICATIONS ASSOCIATION, 2004 MID-YEAR INDUSTRY

OVERVIEW 2 (2004), available at: ; Adam Thierer, Cable Rates and Consumer Value, 53 TECHKNOWLEDGE, July 25, 2003, available at: .
20 Alex Salkever, Will Naked DSL Chill the Cable Guys?, BUSINESS WEEK ONLINE, February 27, 2004, available at: .


the rules. The Bush Administration did not seek a Supreme Court review of the rules, meaning many of the unbundling mandates may gradually disappear and be replaced by voluntary access and carriage agreements. But while these structural access regulations may be withering away, a new push is underway to impose behavioral access regulations on both telco and cable network operators. These net neutrality/digital nondiscrimination mandates have recently been advanced by several major software and e-commerce firms who have formed the Coalition of Broadband Users and Innovators (CBUI). CBUI petitioned the FCC to adopt rules ensuring that cable and telephone industry BSPs will not use their control of high-speed networks to disrupt consumer access to web sites or other services. In the name of preserving end-to-end openness on the Net, CBUI members argue the FCC must adopt preemptive "non-discrimination safeguards" to ensure Net users open and unfettered access to online content and services in the future. CBUI members claim such regulations are necessary because the current market is characterized by a cable-telco "broadband duopoly" that will "define the Internet for some time, and [allow] network operators to infringe or encumber the relationships among their customers or between their customers and destinations on the Internet." Consequently, CBUI members have proposed the FCC adopt what they regard as a "simple rule" to safeguard against online discrimination by BSPs. In a March 28, 2003, presentation before the agency, CBUI argued that, "[t]he FCC can and should be proactive and act in anticipation of future harm by taking simple, non-intrusive, measured steps." What exactly is the supposedly "simple rule" or "measured steps" that Net neutrality proponents would have the FCC (or potentially even state regulators) adopt for BSPs? In a January 8, 2003, filing to the FCC, CBUI requested that the FCC adopt regulations that guarantee Net users the ability to:

21 United States Telecom Ass'n v. FCC, 359 F.3d 554 (D.C. Cir. 2004).
22 Ex parte Filing of the Coalition of Broadband Users and Innovators, Appropriate Framework for Broadband Access to the Internet over Cable Facilities, Declaratory Ruling & Notice of Proposed Rulemaking, CS Docket 02-52, available at: .
23 Coalition of Broadband Users and Innovators, Discrimination on the Broadband Network: Why the FCC Should Adopt Connectivity Principles to Ensure Unfettered Consumer's Access to the Internet, Presentation to the FCC's Local & State Governments Advisory Committee 8 (Mar. 28, 2003) (transcript on file with author).


1. lawfully roam over the Internet;
2. run the applications they want using the equipment they choose;
3. gather, create, and share information;
4. connect to websites absent interference by network operators.

While the FCC has so far taken no action on the CBUI proposal, there are several proceedings pending at the agency to which a Net neutrality proposal could be attached. In addition, Net neutrality mandates could be imposed as a condition of merger approval in the future by either the FCC or antitrust officials at the Department of Justice. Meanwhile, state regulators have already outlined what they think a Net neutrality rule should look like. On November 12, 2002, the National Association of Regulatory Utility Commissioners (NARUC), which represents state regulatory agencies and officials, adopted a Resolution Regarding Citizen Access to Internet Content that claimed, "[p]roviders of broadband services or facilities have the technical capability to create a 'walled garden' or 'fenced prairie,' that is designed to attract customers to preferred content but that also could keep consumers from reaching content other than those of the providers' choosing." Moreover, the NARUC resolution continued: "It is conceivable that some providers of broadband service or facilities may have an incentive to restrict Internet access to favored news sources, and if they chose to do so, it could significantly harm free and open information exchange in the marketplace of ideas." Therefore, NARUC resolved that broadband wireline and cable modem users should:

24 Ex parte submission of the Coalition of Broadband Users and Innovators, supra note 22, at 3-4.
25 These FCC proceedings include: Inquiry Concerning High-Speed Access to the Internet Over Cable and Other Facilities, GN Docket 00-185; Appropriate Framework for Broadband Access to the Internet over Cable Facilities, CS Docket 02-52; Appropriate Framework for Broadband Access to the Internet over Wireline Facilities, CC Docket No. 02-33; Review of Regulatory Requirements for Incumbent LEC Broadband Telecommunications Services, CC Docket 01-337; Computer III Further Remand Proceedings, CC Dockets 95-20 & 98-10.
26 NAT'L ASS'N OF REG. UTIL. COMM'RS, RESOLUTION REGARDING CITIZEN ACCESS TO

INTERNET CONTENT (2002), available at: files/citizen_access.pdf>.
27 Id.

primum non nocere and primum processi? Should we act now to forestall the threat of exclusion, or should we be care-


ful and, first, do no harm? Lemley and Lessig (2000) and Bar et al. set out the arguments that lead them to conclude that waiting will be problematic for the future of the Internet. Essentially, they argue that thirty years of FCC regulation has kept the network owners (the ILECs that supply local analog, or low-speed Internet access) out of the Internet. The Internet has flourished during this time, they argue, precisely because innovative firms have had unfettered access to the network and users without the ILECs being able to influence network design. But dial-up analog connections to the Internet are identical to—indeed, are—voice connections; telephone companies have always controlled the technical standards, such as bandwidth, pertaining to these connections. Thus, indirectly, the necessity to rely on analog connections has until the recent advent of high-speed services very much affected the architecture of the Internet. Further, no FCC regulation has prevented the ILECs from offering Internet services, or Internet-like services, other than a requirement that such services be offered through a separate subsidiary. Present FCC regulations, to be phased out, provide for non-discriminatory access to telephone company high-speed platforms, but do not prevent the telephone companies from determining the technological characteristics of the services offered. Their argument is also based in part on the prediction that cable is the technology that is likely to dominate the next generation of Internet access and on the assumption that each local area will be served by a single local cable system. So consumers will have one main choice, and the local cable companies will not only become the dominant supplier of high-speed Internet access but also seek to extend their control of the network and influence network design to maximize their own profits. As we point out above, there is no basis to assume that cable or any other technology platform will monopolize the LBB service, or that if one does, it will be profitable for that monopolist to exclude equally or more efficient upstream or downstream suppliers. Even competition among LBB services, they contend, will not be sufficient to squelch the danger of vertical foreclosure of access. Bar et al. argue that even under the current FCC rules applicable to DSL, the closed cable system will reduce competition to "host" independent ISPs so that the ILEC will be less hospitable to its third party ISPs. In addition, they argue that a duopoly is not sufficient to provide the openness that allowed the Internet to

Bar, François, Stephen Cohen, Peter Cowhey, Brad DeLong, Michael Kleeman, and John Zysman, DEFENDING THE INTERNET REVOLUTION IN THE BROADBAND ERA: WHEN DOING NOTHING IS DOING HARM (Berkeley Roundtable on the International Economy, E-conomy Working Paper 12, August 1999).


flourish. For example, competition in wireless telephony surely increased with the introduction of additional PCS licensees to compete with the cellular duopolies. But that does not mean that an unregulated duopoly (or even monopoly) would produce worse results than a vertically regulated firm or set of firms. Nor does it mean that such a policy is the appropriate policy to provide incentives for competing access providers. Of course, a policy of primum processi shifts the burden on these points to advocates of deregulation, who must prove a negative: that the absence of regulation would not lead to harm. The "bad outcome" in the Lessig school scenario is that failing to restrict the ability of LBB operators to deny access will lead to exclusion of efficient suppliers. The claim is that this exclusion is not only harmful to current consumers because they are denied the choice that a competitive market would give them, but more importantly that such control would artificially dictate the path of investment and innovation for the future of the Internet. In other words, the risk is that a closed solution would lead to a path that is optimized for the cable and DSL provider and not necessarily optimized for long-term consumer welfare. For the reasons already given, we think this story falls far short of justifying the equivalent of "preventative detention" for current and coming LBB access platforms. The story does not account for the costs we have enumerated above, and appears to be based very largely on the assumption that the historical path of the Internet was optimal and that the history of the Internet is also its future. This assumption may be user-friendly, but it is unsupported by evidence. Access regulation involves an assessment of the risk of two kinds of error. The first, the focus of the Lessig school, is that worthy new services and innovations will be excluded either by monopolistic greed or by the selection of a centralized architecture. The second, which the Lessig school ignores, is that a vertically integrated monopolist will have incentives to produce more output than a vertically disintegrated industry would, and that (a) centrally controlled network(s) would, in the future, become more efficient than (an) "end-to-end" network(s). The preceding parentheses remind us of the possibility that competing, differentiated networks will be the most efficient outcome. Some public policy choices are close calls, because the costs and benefits of the alternative courses are equal. The net neutrality proposal is not one of these hard choices at this point in time. There is no evidence that the outcome proponents of net neutrality wish to avoid (hardware platform owners' controlling access and choosing content) is likely to happen, or that if it did happen it would be harmful to consumers. Rather than holding back the


North Sea, the dike into which net neutrals would insert their fingers is more likely to block the road to competition and innovation.


REFERENCES
AT&T Corp. v. Iowa Utilities Bd., 525 U.S. 366 (1999) (Justice Breyer, dissenting).
Bar, François, Stephen Cohen, Peter Cowhey, Brad DeLong, Michael Kleeman, and John Zysman, "Defending the Internet Revolution in the Broadband Era: When Doing Nothing is Doing Harm," Berkeley Roundtable on the International Economy, E-conomy Working Paper 12, (August 1999).
Coalition of Broadband Users and Innovators, Ex Parte Communication to the Commissioners of the Federal Communications Commission, CC Docket Nos. 02-33, 98-10 & 95-20, CS Docket No. 02-52, (November 18, 2002).
Coase, Ronald H., "The Problem of Social Cost," Journal of Law and Economics 3, (1960), pp. 1-44.
David, Paul A. and J. Bunn, "Gateway Technologies and the Evolutionary Dynamics of Gateway Industries: Lessons from Electricity Supply History," in Mark Perlman and A. Heertje (editors), Evolving Technology and Market Structure, Chicago: University of Chicago Press, 1987, pp. 121-156.
Farrell, Joseph and Phillip Weiser, "Modularity, Vertical Integration and Open Access Policies: Towards a Convergence of Antitrust and Regulation in the Internet Age," Institute of Business and Economic Research, Paper CPC 02-035, (2002).
FCC, High Speed Services for Internet Access: Status as of June, Industry Analysis and Technology Division, Wireline Competition Bureau, (2004).
FCC, In the Matter of Madison River Communications LLC and Affiliated Companies, File No. EB-05-IH-0110, Released March 3, 2005.
Lane v. Cotton, 1 Ld. Raym. 646, 654 (1701, per C.J. Holt).
Lessig, Lawrence, "The Internet Under Siege," Foreign Policy, (Nov./Dec. 2001a).

Bruce M. Owen and Gregory L. Rosston

193

Lessig, Lawrence, The Future of Ideas: the Fate of the Commons in a Connected World, New York: Random House 2001. Lessig, Lawrence, "The Government's Role in Promoting Broadband Deployment," Testimony before the Senate Commerce Committee, (October 1, 2002). Lessig, Lawrence, Free Culture, New York: Penguin Books, 2004. Lucas V. South Carolina Coastal Council, 505 US 1003 (1992). Munn V. Illinois 94 U.S. 113 (1876). New State Ice Co, v. Liebman, 285 U.S. 262 (1932). Noll, Roger G. and Bruce M. Owen, The Political Economy of Deregulation, American Enterprise Institute, (1983). Noll, Roger G. and Bruce M. Owen, United States v. AT&T: The Economic Issues, in Kwoka and White, eds.. The Antitrust Revolution, Scott Foresman, 2nd ed, (1994). Owen, Bruce M., "Forced Access to Broadband Cable,", FCC CS Docket No. 02-52 Appropriate Regulatory Treatment for Broadband Access to the Internet Over Cable Facilities, (June 17, 2002). Owen, Bruce M., The Internet Challenge to Television, Harvard University Press, (1999). Owen, Bruce M. and Gregory L. Rosston, "Cable Modems, Access and Investment Incentives," (December 1998). Pew Internet & American Life Project, March 2005 Survey, (2005). . Phillips, Charles F., Jr., The Regulation of Public Utilities , 2nd ed., Arlington, VA, 1988. Posner, Richard, Economic Analysis of Law, 6th ed. Aspen Publishers, 2002. Ramsey, Frank P., "A Contribution to the Theory of Taxation," Economic Journal (1927).

194

Net Neutrality

Rubinfeld, Daniel and Singer, Hal, "Open Access to Broadband Networks: A Case Study of the AOL-Time Warner Merger," Berkeley Technology Law Journal Vol. 16, No. 2, (Spring 2001), pp. 631-675. Schumpeter, Joseph, Capitalism, Socialism and Democracy, New York: Harper & Row, 1942. Speta, James B. "Handicapping the Race for the Last Mile: A Critique of Open Access Rules for Broadband Platforms," Yale Journal on Regulation, Vol. 17, No. 39, (2000). Spulber, Daniel F., and Christopher S. Yoo (2003), "Access to Networks: Economic and Constitutional Connections," SS_Cornell Law Review 4,_(May2003). Stigler, George, "The Theory of Economic Regulation," Bell Journal of Economics, Vol. 2, (1971), pp. 3-21. TahoeSierra Preservation Council, Inc. v. Tahoe Regional Planning Agency, 122 S. Ct. 1465 (2002). United States v. Terminal Railroad Association, 224 U.S. 383 (1912). Verizon Communications Inc. v. Law Offices Of Curtis V. Trinko, LLP, 540 U.S. 398 (2004). Woroch, Glenn, "Open Access Rules and the Broadband Race," L. Rev M.S.U.-D.C.L, (2002) I. Western Union Telegraph Co. v. Call Publishing Co., 181 U.S. 92, 98 (1901). Wu, T, "Network Neutrality and Broadband Discrimination," 2 Journal of Telecommunications & High Technology, (2003).

Chapter 6 Open Access Arguments: Why Confidence is Misplaced

Joseph Farrell
University of California, Berkeley

"I beseech you in the bowels of Christ, think it possible you may be mistaken."
- Oliver Cromwell

I. THE POLICY QUESTION

Should the FCC mandate some form of open access to stop a last-mile broadband Internet access provider from favoring some applications, ISPs, or other complements over others? Should the end-to-end modularity principle that has proven successful in the Internet be protected by regulation, or does that needlessly risk constraining innovation and diminishing incentives to build broadband networks? Like so many telecommunications problems, this one concerns the role of a last-mile provider in (potentially) more competitive activities.

Contrary to the claims of some opponents of open access regulation, I believe that broadband providers are likely to depart from modularity if allowed to do so. But this in itself does not show that regulation is desirable; such departures can have both good and bad consequences. Contrary to the tone of the debate (on both sides), the analytics are difficult and unsettled. Therefore I argue for treating this as a decision under severe uncertainty; but this does not simply mean a philosophical choice between presumptions or styles, as one might read Owen and Rosston to suggest.^ For instance, it pushes toward taking more seriously the benefits of asymmetric regulation, and toward vigorously benchmarking against other countries and within the US.

* This paper grew from a June 2003 presentation at a Progress & Freedom Foundation conference on open access, and I thank Mark Lemley and PFF conference participants, especially Pat DeGraba and Greg Rosston, for helpful discussion. Mark Rodini provided valued research assistance.

II. EVALUATING OUTCOMES OR INCENTIVES

The outcomes approach to economic policy attempts a direct cost-benefit analysis: what are the benefits and costs of modularity, and which is bigger? Both are big, and the comparison is hard. Moreover, the answer needs to be calculated conditioning on when the proposed policy would make a difference—for instance, when it would enforce modularity that otherwise wouldn't happen. The outcomes approach fails to take full advantage of the fact that prohibiting A from doing X enhances efficiency if and only if X is on average inefficient when A would otherwise do X.

A lot of the discussion I've seen consists of claims by proponents of open access regulation that broadband providers already depart from modularity, or obviously will do so, or that Cisco has encouraged them to do so; and opponents arguing that providers continue to offer modularity and would have no incentive to do otherwise. To first order, this is off topic: whether or not A has previously done X, or the likelihood that he will, says nothing about efficiency conditional on his choosing to do it.

That's too stark: a prohibition may matter even if indeed A would never choose to do X. There are costs of drafting, complying, and (in some odd sense) enforcing the prohibition; and it might accidentally chill behavior that differs importantly from X. But the direct costs are surely small relative to what's at stake. More importantly, it would be unwise to rely heavily on a prediction that broadband providers would never ever want to depart from modularity. There are credible economic reasons (both good and bad) for them to do so, and this resonates with the nature of the debate: if broadband providers knew they would always sustain modularity, they wouldn't be very worried about regulation to that effect, and would be chiefly concerned to help draft it well. Meanwhile, complementors would also be serene. Any opposition would be led by taxpayer groups, bemoaning the cost of draftsmen and ink. So the discussion should weigh the relative likelihood and effects of the various (good and bad) motives for non-modularity, not waste a lot of time on their total likelihood.

The incentives approach to policy is a non-ideologue's selective version of laissez-faire: it asks how well we can trust a private decision-maker's incentive to make efficient choices. This approach would not ask whether a departure from modularity is generally efficient, but whether a broadband provider has good incentives to choose between modularity and other vertical arrangements. In this way we can implicitly draw on the provider's often more detailed knowledge of the situation. If a broadband provider wants to depart from modularity, what does that tell us about the efficiency of its doing so?

^ Bruce Owen and Gregory Rosston, LOCAL BROADBAND ACCESS: PRIMUM NON NOCERE OR PRIMUM PROCESSI? A PROPERTY RIGHTS APPROACH, (Stanford Institute for Economic Policy Research Discussion Paper 02-37, July 2003), available at: .

III. VERTICAL INCENTIVES AND ICE

Telecommunications and antitrust have approached modularity very differently. In telecommunications, reverence for "one system" and efficiencies of integration gave way, during the 1950s (Hush-a-Phone) to 1980s (the AT&T breakup and Computer Inquiries), to a policy of protecting modularity, and especially of protecting interfaces between monopoly and competitive segments. The 1996 Telecommunications Act tries the high-wire trick of simultaneously advancing this tradition and allowing more integration.

In antitrust, pre-Chicago suspicion and hostility toward non-modular vertical arrangements gave way in the 1980s to Chicago-school acceptance of them. This acceptance stemmed partly from an improved outcomes analysis: a better understanding of the benefits and costs of non-modular arrangements. More fundamentally, however, the Chicago "one monopoly rent theorem" (OMRT) sharpened our incentives analysis. It claims that a monopoly will not choose to "leverage" inefficiently into a complement. Farrell and Weiser (2003) reformulate the argument as a broader claim that such a firm has incentives to encourage the most efficient mode of complementary organization or the one that most benefits customers: it internalizes complementary efficiencies (ICE).

However, post-Chicago economics finds that OMRT/ICE has many holes, perhaps too many to be a "theorem." Rather, it may be a useful principle for organizing an enquiry. It is not ideal in that regard: ideally, "exceptions" to an organizing principle can be confidently diagnosed, with few false positives or false negatives, whereas some exceptions to ICE are easy to suspect even if false, and hard to prove even if true. Nevertheless, ICE may be the best organizing principle we have for these issues.

A. The ICE Argument

ICE asserts that if a platform sponsor does, or allows to be done, anything that reduces customer value from applications, say by $1, then the demand curve for platform subscription falls by that $1, lowering platform profits by $1 per customer. Thus the sponsor's incentive in the applications market is to do only efficient things (if it captures applications profits) or pro-consumer things (if it does not). This argument suggests a lot of laissez-faire for vertical relations. ICE conventionally assumes a monopoly platform, and I follow that convention in this section; below, I discuss how the argument is affected by (limited) platform competition, such as we have in broadband.
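To make the dollar-for-dollar logic concrete, here is a minimal sketch in Python (the subscriber values and the $1 degradation are illustrative assumptions of mine, not figures from the chapter) of how a reduction in application value shifts the platform's demand curve one-for-one:

    # Hypothetical illustration of the ICE logic: a customer's willingness to
    # pay for the platform equals standalone connectivity value plus the
    # surplus the customer gets from applications running over it.

    def platform_willingness_to_pay(connectivity_value, application_surplus):
        """Maximum subscription price this customer would accept."""
        return connectivity_value + application_surplus

    connectivity_value = 30.0   # assumed value of raw connectivity, $/month
    application_surplus = 10.0  # assumed surplus from applications under modularity

    wtp_open = platform_willingness_to_pay(connectivity_value, application_surplus)
    # Suppose the sponsor does something that cuts application value by $1.
    wtp_degraded = platform_willingness_to_pay(connectivity_value, application_surplus - 1.0)

    # The demand curve shifts down by exactly the lost $1, so a monopoly sponsor
    # sacrifices $1 of potential subscription revenue per customer.
    print(wtp_open - wtp_degraded)  # 1.0

On this logic the sponsor hurts applications only when doing so brings at least an offsetting gain elsewhere; the holes discussed below are cases in which such gains can exist even when the conduct is inefficient.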

B. Holes in the ICE

Speta valuably brought ICE (though not by that name) into the open access debate.^ Speta also argues that a broadband provider will internalize the consumer value of open access (which he calls "indirect network effects"). But he overstates the strength of ICE, writing that "economic theory holds that a monopolist—which, by definition, would have the ability to impede competition in adjacent markets—generally will have no incentive to do so."^ A more accurate statement is that economists find ICE a good starting point. Farrell and Weiser describe eight holes in the ICE logic; here I focus on price discrimination and maintenance of platform-level market power.

^ Speta, James, The Vertical Dimension of Cable Open Access, 71 COLORADO LAW REVIEW 975 (2000).
^ Speta, James, Handicapping the Race for the Last Mile?: A Critique of Open Access Rules for Broadband Platforms, 17 YALE JOURNAL ON REGULATION 39 (2000).
^ Speta, James, The Vertical Dimension of Cable Open Access, 71 COLORADO LAW REVIEW 997 (2000). Speta recognizes some exceptions to ICE, but argues (without addressing the arguments given here) that they do not apply.
^ Farrell, Joseph and Philip Weiser, Modularity, Vertical Integration and Open Access Policies: Towards a Convergence of Antitrust and Regulation in the Internet Age, HARVARD JOURNAL OF LAW AND TECHNOLOGY 17(1), (2003), at 85-135, available at: .

IV. PRICE DISCRIMINATION AS HOLE IN ICE

Price discrimination is often far more profitable for a platform sponsor than uniform, nondiscriminatory pricing.^ It can thus help the sponsor capture the full benefits of buildout, which can be good.^ More precisely, monopoly price discrimination based on exogenous characteristics correlated with demand elasticity has Ramsey-like properties: it charges the most to those customers whose demands will be least distorted by paying more. For these legitimate reasons, although the person in the street still dislikes the sound of price "discrimination," I think it is fair to say that the Washington policy community, mainly lawyers who have absorbed some economics, thinks price discrimination (and complex pricing in general) is good or at least okay. An additional pragmatic argument is that it can be hard to stop price discrimination given market structure, though that may not apply when structure is under discussion.

But price discrimination cannot enhance efficiency unless it increases output by enough to make up for its allocative distortions; much price discrimination encourages inefficient rent-seeking (reluctantly staying over a Saturday night to get a lower fare); and oligopoly price discrimination is much less Ramsey-like than the monopoly version.^ So the person in the street may not be as wrong as the Washington consensus seems to think. It seems to be an open question whether broadband price discrimination, which is surely facilitated by a departure from modularity, is itself good or bad for efficiency or for consumers.

Moreover, because departures from modularity facilitate price discrimination, a sponsor often (though not always) inefficiently limits applications competition. That is a collateral-damage inefficiency motivated by price discrimination, not caused by price discrimination in the sense above. Therefore, in our present state of knowledge we should not be completely unconcerned either about allocative-efficiency consequences or about collateral damage to modularity, if we think that broadband access providers will seek to practice price discrimination. How likely is that?

Historically cable companies have not wanted to become (even high-priced) common carriers, charging a flat rate per megahertz-hour.^ Surely someone might try to control or charge differently for streaming video. Speta asks whether cable companies would interfere with broadband Internet operations in order to protect rents in cable programming.^ He argues that programming lacks barriers to entry, so there can be no such rents. But even if (questionably) programming is really so competitive, his argument misses the use of programming and other applications as a meter for price discrimination. Even though there are few inherent barriers to manufacturing punch cards, IBM could have wanted to compromise computer/punch-card modularity so as to help it meter computer use.

A second price discrimination strategy exploits the fact that some consumers value access to the unfiltered Internet more than do others. Thus a broadband provider might make unfiltered access a high-end product, offering a walled "Internet lite" to low-end users. Firms often intentionally cripple a low-priced version of a product so as to price discriminate; indeed, Shapiro and Varian describe this as a standard part of selling information.^

A third (and related) possibility is that a broadband provider may expect to get more money from Travelocity for steering would-be travelers to it than it loses from customers unhappy that they can't (as easily) get to Expedia.^ This isn't obviously anti-consumer: if too many customers liked Expedia too much then the strategy could backfire; and customers may get some of the benefit from Travelocity's placement payments (or contractual discounts).

^ This is sometimes attributed to a high ratio of fixed, or sunk, to variable costs. But pricing behavior should depend only on marginal costs and demand, not on fixed or sunk costs. However, if fixed or sunk costs are large, then, in long-run equilibrium, one can expect that each provider will be able to charge well above marginal cost, since entry to the point that destroys such ability will presumably be unprofitable.
^ See Hausman, Jerry and Jeffrey MacKie-Mason, Price Discrimination and Patent Policy, RAND JOURNAL OF ECONOMICS 19(2) (1988), at 253-265, for a discussion in an intellectual property context. Here, some commentators argue that there is fairly widespread availability of broadband already in the US, so that if there is a policy problem with the adoption of broadband it is on the demand side: there seems little logic to treating "availability" as a policy goal with no consideration of how attractively broadband is available. (Of course, one should be careful about policy opportunism if buildout truly relied on expectations inconsistent with ex post policy formation.)
^ See Borenstein, Severin, Price Discrimination in Free-Entry Markets, RAND JOURNAL OF ECONOMICS 16(3), (1985), and Holmes, Thomas, The Effects of Third-Degree Price Discrimination in Oligopoly, AMERICAN ECONOMIC REVIEW 79(1) (1988).

^ One should not get too tangled in exegesis about what is and is not "price discrimination": modularity tends to impose a common-carrier pricing model that differs from cable's traditional model, so cable may be tempted to compromise even efficient modularity to preserve the pricing model. The use of the PD phrase is neither here nor there, just convenient.
^ Speta, James, The Vertical Dimension of Cable Open Access, 71 COLORADO LAW REVIEW 975 (2000).
^ Shapiro, Carl and Hal Varian, INFORMATION RULES: A STRATEGIC GUIDE TO THE NETWORK ECONOMY (Harvard Business School Publishing 1999), available at: .
^ This may be especially true for the low-end customers unwilling to pay a higher price for unfiltered access. If current sensitivities persist, "favorable placement" of Travelocity would be more likely than a blatant "Expedia is 404" strategy. But if favorable placement is worth paying for, it presumably affects customers' behavior, and hence affects incentives.


But it isn't obviously pro-consumer either, and it obviously could threaten modularity.

In short, broadband access, like many high-technology and telecommunications products, probably has lucrative opportunities for price discrimination, and pursuing those opportunities may well conflict with the ICE logic of offering the best possible product to the marginal customer. Price discrimination may or may not be efficient in itself, but even then it often raises profits out of proportion to its efficiency contribution, so the seller may compromise efficiency in other respects, notably modularity, in pursuit of the ability to discriminate.

Not all means of price discrimination threaten modularity: a firm could simply offer different access speeds at different markups, as some do. Some firms, some of the time, may well eschew discrimination, or pursue it in a form compatible with modularity. But oligopolists' optimal price discrimination strategies depend on subtle elasticities, and their future estimates of these elasticities are surely hard to predict.

V. PLATFORM COMPETITION AND ICE

Many customers can choose between cable and DSL, as well as perhaps less popular options.^ There is also some intra-modal competition: DSL may be available through a DLEC.^ But many other customers have only one (or no) broadband option, and few have more than two independent-platform options (that have proved their appeal by attracting substantial numbers of customers): the market is highly concentrated. This might be competitive by the standards of regulators used to dealing with monopolies; it may not be all that competitive by ordinary industrial organization (or antitrust) standards.^ Rather than assess the state of platform-level competition, I will abbreviate it as duopoly+/-, and ask: so what?

Washington economics sometimes assumes that everything is bad if there is "monopoly," everything is good if there is "competition," and those categories are exhaustive. But broadband, like most of what Washington must address, is neither literal monopoly nor fully effective competition: it's in between. And some things work fairly well even with monopoly; some are problems even with competition; a few might behave even less conventionally.

ICE claims to work with monopoly, which makes it puzzling that Washington economics applies ICE much more readily to platforms that face competition. But it could make sense if (a) one takes seriously the possible exceptions to ICE, but (b) ICE is strengthened and/or its exceptions weakened by platform-level competition. Is (b) the case? What does a limited degree of competition (duopoly+/-) do to ICE?

^ Nguyen cites data from GAO (2000) that about half of those consumers who had broadband access available had a choice between cable and DSL, and the fraction has probably increased since then. Nguyen, Scott, "The Effects of Unbundling Regulations on Pricing, Innovation and Availability of Broadband Services," Senior Honors Thesis, Department of Economics, University of California, Berkeley, 2004.
^ There is little intra-modal competition in cable modems: the FCC reports that of 33,485 cable television communities, only 2.6% have been certified by the Commission as having effective competition from alternative cable or from satellite broadcasting with more than 15% market share. Federal Communications Commission, Annual Assessment of the State of Competition in the Market for the Delivery of Video Programming (Tenth Annual Report), January 2004, at 80, available at: .

A. Duopoly+/- Raises Platform-Level Demand Elasticity

This is the most conventional effect of platform competition. It underlies the widespread presumption that firms facing competition "can't" do inefficient things, because they will lose too many customers. One must take care with that argument, however, because losing a customer is less painful if prices are more competitive, and also because, if inefficiencies sufficiently spill over to rivals (see below), the argument doesn't apply.

^ Nguyen cites 2002 FCC data that more than a quarter of broadband-served zip codes had just one provider (comparable June 2003 data shows this to have dropped to less than one fifth). He bounds local HHIs using the optimistic assumption that if n providers (including DLECs) serve a zip code then the local HHI is 10,000/n. With these strongly HHI-reducing assumptions, he calculates that the weighted average HHI is almost 5,000. Another way to estimate local concentration is that if a typical local market has representatively about a 2/3 share held by the local cable company and almost all the rest by DSL, that would imply an HHI of about 5,500 if we treat DSL as a single provider. Because the HHI is convex in shares, variation among local markets that averages out to these two-thirds/one-third shares implies higher average HHI (think for example of the hypothetical case in which two-thirds of local markets are served only by cable and one-third only by DSL). Nguyen, Scott, "The Effects of Unbundling Regulations on Pricing, Innovation and Availability of Broadband Services," Senior Honors Thesis, Department of Economics, University of California, Berkeley, (2004).
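The concentration arithmetic in this footnote is easy to reproduce; the sketch below (using the footnote's illustrative two-thirds/one-third split, not new data) computes the corresponding HHI figures:

    # Herfindahl-Hirschman Index (HHI): the sum of squared market shares,
    # with shares in percentage points, so a pure monopoly scores 10,000.

    def hhi(shares_in_percent):
        return sum(s ** 2 for s in shares_in_percent)

    # Illustrative local market from the footnote: cable with roughly two-thirds,
    # DSL (treated as a single provider) with roughly one-third.
    print(round(hhi([66.7, 33.3])))  # 5558, i.e. "an HHI of about 5,500"

    # The bounding assumption that n equal-sized providers imply HHI = 10,000/n:
    for n in (1, 2, 3):
        print(n, round(hhi([100.0 / n] * n)))  # 10000, 5000, 3333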

B. Duopoly+/- Trims Fat

Competition might lower baseline profits and thus make a firm (or its managers) more sensitive to the loss in profit resulting from an inefficient choice of vertical form, if ICE applies. But this argument itself has two problems. First, it relies on a belief that firms fail to maximize profits if they are fat (plausible) and that inefficient modularity choices are a likely form of failure to optimize. Second, when ICE fails to apply, the argument strengthens the prediction that managers will depart from efficiency.

C. Duopoly+/- Provides Diversity

Competition supplies diversity: even if one provider closes its platform, another may not, so there is less to fear from any one provider's possible bad incentives (or incompetence). For instance, if we think that each firm is 80% likely to choose efficient vertical organization, the chance that no firm does so is 20% under monopoly, 4% under duopoly, and less than 1% with three firms—if their choices are statistically independent. But there are common failure modes. Also, the point cuts both ways: the existence of multiple firms limits the social losses from inefficiently regulating any one of them, as I discuss below.
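The 20%, 4%, and under-1% figures follow from simple independence arithmetic; a brief sketch (the 80% probability is the text's illustrative assumption, and statistical independence is the simplification noted above) reproduces them:

    # Probability that *no* platform chooses the efficient vertical organization,
    # assuming each of n firms independently chooses efficiently with probability p.

    def prob_all_choose_badly(p_efficient, n_firms):
        return (1.0 - p_efficient) ** n_firms

    p = 0.8  # the text's illustrative 80% chance that a given firm gets it right
    for n in (1, 2, 3):
        print(n, round(prob_all_choose_badly(p, n), 4))  # 0.2, 0.04, 0.008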

D. Does Platform Duopoly+/- Remove the Price Discrimination Motive?

Perfect competition removes the incentive/ability to price discriminate. But it would be bad Washington economics to infer that moderate amounts of competition do likewise: consider airlines, for instance. Modern economics has shown how price discrimination may even increase with (less than perfect) competition.^

^ Borenstein, Severin, Price Discrimination in Free-Entry Markets, RAND JOURNAL OF ECONOMICS 16(3), (1985), at 380-397. Borenstein, S. and N. Rose, Competition and Price Dispersion in the U.S. Airline Industry, JOURNAL OF POLITICAL ECONOMY 102(4) (August 1994), at 653-683. Holmes, Thomas, The Effects of Third-Degree Price Discrimination in Oligopoly, AMERICAN ECONOMIC REVIEW 79(1) (1988), at 244-250. Katz, Michael L., Vertical Contractual Relationships, in THE HANDBOOK OF INDUSTRIAL ORGANIZATION, R. Schmalensee and R.D. Willig (eds.), (North Holland Publishing, 1989). Stavins, J., PRICE DISCRIMINATION IN THE AIRLINE INDUSTRY: THE EFFECT OF MARKET CONCENTRATION (Federal Reserve Bank of Boston Series, Paper No. 96-7, 1996).


What counts here is not the amount of price discrimination, but the profit incentive to discriminate, relative to the profit impact of an efficiency loss. Duopoly+/- does not destroy price discrimination incentives, and even if it weakens them relative to incentives for efficiency (not proven, as far as I know) they could remain strong. Moreover, as noted above, price discrimination in oligopoly lacks certain efficiency properties of monopoly price discrimination.

VI. PLATFORM DUOPOLY+/- AND ICE: THE DARK SIDE

A firm may have an incentive to make inefficient vertical choices if those choices spill over to (actual or potential) competitors. This often requires that spillovers be (in a sense) more than 100%, but that may be more likely than it sounds.

Thus suppose that a firm's profits are a function f(x, y) of its own efficiency x and its rival's efficiency y; presumably f is increasing in x and decreasing in y. For simplicity write f(x, y) = x - cy, where c measures the strength of competition.^ Now the firm chooses a vertical policy parameter, t, that affects x and may affect y: x = a + t and y = b + st. Thus s measures the spillover of the firm's choice of t into the rival's efficiency level. Then the firm's profits are (a + t) - c(b + st) = (a - cb) + (1 - cs)t. The firm gains from an efficient policy (maximize t) if cs < 1, and gains from an inefficient policy (minimize t) if cs > 1. This will be a problem if and only if s > 1/c. Since presumably 0 < c < 1, that requires s > 1: a spillover of more than 100%. Can we therefore set the concern aside (that is, rule out s > 1)?

Unfortunately I think that inference is too optimistic. If a large firm can hamper modular innovation, it can partly replace it with proprietary innovation. That reduces its own efficiency, but probably reduces a small rival's by more, since a small rival is unlikely to do as much proprietary innovation.^ Once

*^ For firms that do not compete, c=0, or indeed c