
The Future of Telecommunications Industries

Arnold Picot (Editor)

The Future of Telecommunications Industries

With 78 Figures

Professor Dr. Dres. h.c. Arnold Picot
Munich University
Institute for Information, Organization and Management
Ludwigstraße 28
80539 München
Germany
[email protected]

ISBN-10 3-540-32553-0 Springer Berlin Heidelberg New York
ISBN-13 978-3-540-32553-6 Springer Berlin Heidelberg New York

Cataloging-in-Publication Data
Library of Congress Control Number: 2006921137

This work is subject to copyright. All rights are reserved, whether the whole or part of the material is concerned, specifically the rights of translation, reprinting, reuse of illustrations, recitation, broadcasting, reproduction on microfilm or in any other way, and storage in data banks. Duplication of this publication or parts thereof is permitted only under the provisions of the German Copyright Law of September 9, 1965, in its current version, and permission for use must always be obtained from Springer-Verlag. Violations are liable for prosecution under the German Copyright Law.

Springer is a part of Springer Science+Business Media
springer.com

© Springer Berlin · Heidelberg 2006
Printed in Germany

The use of general descriptive names, registered names, trademarks, etc. in this publication does not imply, even in the absence of a specific statement, that such names are exempt from the relevant protective laws and regulations and therefore free for general use.

Softcover design: Erich Kirchner, Heidelberg
SPIN 11677444

42/3153-5 4 3 2 1 0 – Printed on acid-free paper

Content

1 Welcome
1.1 Prof. Jeff Anderson, Georgetown University, School of Foreign Service, BMW Center for German and European Studies (CGES)
1.2 Prof. Dr. Arnold Picot, Münchner Kreis and Konrad Adenauer Visiting Professor (CGES), Georgetown University

2 Telecommunication Industries in Transition: Winning the Future through Innovation and Change
Chair: Prof. Dr. Arnold Picot, Münchner Kreis and Georgetown University
2.1 Dr. Thomas Ganswindt, Siemens AG
2.2 Discussion

3 All IP – All IT – All Wireless: The Drivers of Change
Chair: Prof. Dr. Jörg Eberspächer, Munich University of Technology
3.1 Gary A. Cohen, General Manager, Global Communications Sector, IBM Corp.
3.2 Christine Heckart, Juniper Networks Inc., Sunnyvale
3.3 Dr. Thomas Ganswindt, Siemens AG, München
3.4 Discussion

4 Technology as Driver of Change in Telecommunications
Prof. Robert Calderbank, Princeton University

5 Market Structures and Business Models of the Future. Consolidation or Persisting Turbulence – Monopolization or Fragmentation: What will Future Telco-Markets look like?
Chair: Prof. Dennis Lockhart, Georgetown University
5.1 Perspective from USA: Prof. Eli Noam, Columbia University, Graduate School of Business, Columbia Institute for Tele-Information (CITI)
5.2 Perspective from Europe: Dr. Karl-Heinz Neumann, General Manager WIK – Scientific Institute for Communication Services, Bad Honnef
5.3 Discussion

6 Determinants of Future Market Success
Chair: Prof. Dr. Thomas Hess, University of Munich
6.1 Prof. Michael Dowling, University of Regensburg
6.2 Eckart Pech, Detecon Inc., Reston
6.3 Prof. Dr. Rolf T. Wigand, University of Arkansas at Little Rock
6.4 Discussion

7 How Much and What Kind of Regulation Will be Needed in the Networked World of Tomorrow?
Chair: Prof. Dr. Arnold Picot, Münchner Kreis and Georgetown University
7.1 Prof. John W. Mayo, Georgetown University
7.2 Prof. Stephen C. Littlechild, Judge Institute, University of Cambridge, UK
7.3 J. Gregory Sidak, American Enterprise Institute
7.4 Prof. Justus Haucap, Ruhr-University of Bochum
7.5 Prof. Robert C. Atkinson, Columbia Institute for Tele-Information (CITI)
7.6 Stefan Doeblin, Chairman Network Economy Group

Appendix: List of Speakers and Chairmen

1

Welcome

1.1
Prof. Jeff Anderson
Georgetown University, School of Foreign Service, BMW Center for German and European Studies (CGES)

Ladies and Gentlemen: A very cordial welcome to all of you on behalf of Georgetown University and in my function as director of the BMW Center for German and European Studies at the School of Foreign Service. The main mission of the BMW Center, which celebrates its 15th anniversary this year, is to foster the transatlantic relationship through teaching, research and public outreach. As such we, in partnership with the Münchner Kreis, are very proud to host this important transatlantic symposium on the future of the telecommunications industry.

I would like to introduce the creative force behind this initiative, Prof. Arnold Picot. Prof. Picot is visiting us here at Georgetown this year as the Konrad Adenauer Professor. He is the 24th in a long line of distinguished German scholars to hold this chair, which will celebrate its 30th anniversary next year. Prof. Picot is professor in the department of business administration in the Institute for Information, Organization and Management at Munich University. He is chairman and member of various academic and scientific associations and advisory councils, among them the Münchner Kreis, a scientific working group at the Regulatory Authority for Telecommunications and Posts in Bonn, and the Center for European Economic Research in Mannheim. He was a member and chairperson of various expert commissions of the German Federal Ministries for Research and Technology and of the Department of Economy and Labor. His research fields are related to organization, management, information and communication, innovation and technology management, as well as basic theories of business administration, industrial organization and regulation. The prodigious output of his research efforts is reflected in over 400 publications. He is a leading international expert in his field, and we are extremely fortunate here at Georgetown to be able to call him one of our own this year.

Please join me in a warm welcome for Prof. Picot, who will introduce our keynote speaker this evening.


1.2
Prof. Dr. Arnold Picot
Münchner Kreis and Konrad Adenauer Visiting Professor (CGES), Georgetown University

Ladies and Gentlemen, on behalf of the Münchner Kreis, a supranational association for communications research, I extend a very warm welcome to all of you. We are very happy and honored that our joint transatlantic symposium on the future of telecommunications industries has found such a lively and favorable response, not only from interested people here in the Washington area but also from places in North America, Europe and even Japan.

The historic Riggs Library here on the Georgetown campus seems to have just the right capacity, as we can see, for the large audience that gathers here tonight and tomorrow. Riggs Library was named after Francis Riggs, a member of the famous Riggs family of industrialists and bankers that helped shape this country in the 18th and 19th centuries. In 1889, the year of this university's centennial, Francis Riggs donated 10,000 dollars to overcome a financial crisis of the university and to enable the building of this library, which served as the university library until the 1920s. I think this is an example of industry-university cooperation.

Our symposium is another example of cooperation between industry and academia. Such vibrant and dynamic industries as IT and telecommunications can best be understood and studied if experienced practitioners and specialized academics work together and exchange their knowledge. It is also important that expertise from major regions such as North America and Europe joins in order to deepen the exchange of experiences, ideas and insights. This symposium combines perspectives from industrial practice and from academic research originating in North America and in Europe.

I am especially grateful to Georgetown University and its BMW Center for German and European Studies, with its director Professor Anderson and his outstanding staff, for having generously supported and sponsored this event from the first moment the idea for such a symposium came up until today's and tomorrow's realization. Washington is an excellent location for such an undertaking, since in this area you find a number of large and important ICT corporations as well as many young innovative companies. Furthermore, Washington being this country's federal capital, the famous FCC and other relevant government agencies, many think tanks and even more lobbying organizations specializing in telecommunications are located in this area. I also want to thank all those organizations that have helped us spread our invitation, namely Georgetown University, the German Embassy, Deutsche Telekom, Siemens, Alcatel, the Columbia Institute for Tele-Information, Detecon and others.


This symposium's subject, "The Future of Telecommunications Industries", does not have a question mark at the end. There is no doubt: telecommunications industries have a future. I think they even have a very bright future. Communication is a crucial basis for the development of each individual's social identity. Communication is the foundation for orientation, for intellectual and commercial exchange, as well as for economic development and growth. The constant and enduring desire of humankind to expand the reach of communication, that is, to communicate over distance, explains the certainty of that future. That is what telecommunications is all about: enabling communication over distances.

From the beginning, humans have used their capacity for invention to develop and use tools for overcoming physical distances in communication. Personal messengers, written or printed documents, light and smoke signals are early examples. However, only during the last two centuries have we seen a technological revolution in telecommunications, mainly driven by the application of new knowledge in the fields of electricity and engineering. This revolution of the late 19th and the 20th century has brought about differentiated telecommunications industries such as carriers, equipment vendors, service specialists and others. Today this is one of the leading sectors of modern economies.

However, this technological revolution is still ongoing. It has recently gained new momentum due to exponentially increasing computer power, the digitization of all aspects of communications, and new transmission technologies, especially, but not exclusively, in the context of the Internet. Therefore, the question is not whether telecommunications industries have a future, but what kind of future old and new players will have, given the dynamic changes in technologies and markets with the various opportunities, challenges and discontinuities involved.

Our symposium will address major issues regarding this pivotal question. It will deal with the technological drivers of change; it will analyze changing market structures and business models; and it will discuss the nature of future regulation in telecom markets. Tomorrow's sessions will cover these subject areas. Tonight's keynote speech will open our conference and provide an insight into the future of telecommunications from an industrial perspective.

It is my honor and my great pleasure to introduce to you our keynote speaker, Thomas Ganswindt. Thomas Ganswindt is a member of the Corporate Executive Committee of Siemens AG. Siemens is one of the few real global giants in electrical engineering and electronics. On the Executive Committee of that corporation, Herr Ganswindt holds responsibility for all Siemens activities in the field of communication and information technology. This represents some 22 billion euros, that is, more than a quarter of Siemens' worldwide annual business of some 75 billion euros. Thus, seeing the order of magnitude, he is to a very large extent in charge of the present and especially of the future of this industry. Prior to his current top management position, Thomas Ganswindt held major executive management positions with Siemens' automation and transportation divisions and with Siemens' communications group. He earned a degree in mechanical engineering from the Technical University Berlin and was a researcher at one of the leading Fraunhofer institutes in Germany, the Institute for Production Systems and Design Technology. He is chairman of the D21 initiative, a major public-private partnership program in Germany promoting innovation, which also advises Chancellor Schröder in the field of innovation policy. And he is also, I might mention, the Vice Chairman of the Münchner Kreis. Please join me in welcoming with warm applause Thomas Ganswindt, whom I ask to take the floor.

Welcome (Second day of the symposium)

A very warm welcome, especially to all those who could only join our symposium this morning. My name is Arnold Picot. Together with Georgetown University and my colleague Jeff Anderson from the BMW Center for German and European Studies here at Georgetown, we host this conference. I am at the same time the chairman of the Münchner Kreis, a supranational association for communications research based in Munich that has been around in Europe for some 30 years now. It is an academic and non-profit organization with close contacts to industry and government.

This morning we start our sessions on the overall subject of the future of telecommunications industries, and it is my pleasure to introduce to you the chairman of this session, my colleague from the Technical University of Munich, Jörg Eberspächer. Jörg Eberspächer is a well-known specialist in Europe in the field of electrical engineering and communication networks. His institute specializes in this field, and in his work he concentrates especially on mobile communication, next-generation Internet and fault-tolerant networks. He received his degrees from the University of Stuttgart, where he also did a lot of research. Later on he spent 13 years at Siemens in research and development, including international standardization work, especially with respect to high-speed networks. He then joined the Technical University of Munich. He is, among other things, a co-author of a very important book on GSM, the famous system for mobile communication that was discussed quite a bit yesterday. He teaches regularly at Tongji University in Shanghai, and among other positions he is also a member of the board of the Münchner Kreis.

2

Telecommunication Industries in Transition: Winning the Future through Innovation and Change

Chair: Prof. Dr. Arnold Picot
Münchner Kreis and Georgetown University

2.1

Statement

Dr. Thomas Ganswindt
Siemens AG

First I would like to thank the Münchner Kreis and Georgetown University for the honor of holding the opening speech at our symposium "The Future of the Telecommunications Industry" here in Washington. It's a great pleasure to join you, and I look forward to the many contributions.

Figure 1: Siemens has a diversified portfolio


Before I turn to my theme today, I thought it would be helpful to say a few words about Siemens (Fig. 1). I presume that everyone here has heard of our company at some time or another. But many may not know exactly how our telecommunications activities fit into our overall company portfolio. Our name is known around the world – and stands for innovation, customer focus and global competitiveness. It also stands for a broad business portfolio that keeps us robust and flexible even in difficult times. We can tap unique synergies across this portfolio to offer an unmatched spectrum of products, systems, services and solutions.

Siemens is a global business with 430,000 employees, millions of customers, and hundreds of thousands of suppliers and business partners in over 190 countries. We can look back on nearly 160 years of company history – and on 160 years of innovation. Expanding out of our little workshop in Berlin – which people would call a garage start-up these days – we built up a global network of innovation: by discovering the electric dynamo principle, for example, and giving birth to electrical engineering; with the first electric train, elevator, metal filament lamp, or pure silicon. Today we lead the ranking for patents in Germany and Europe – and are among the top ten in the U.S. We hold around 48,000 patents worldwide. Much of what people take for granted today – in their homes, in factories, in hospitals, in communications, in lighting, in power generation, or in transportation – was developed by Siemens.

Our company is a world leader in the fields of Information and Communications, Automation and Control, Power, Transportation, Medical Solutions and Lighting. With sales of around 75 billion euros, we are one of the twenty-five biggest companies in the world. And we are one of the three biggest electrical engineering and electronics companies in the world.


Figure 2: Communications (COM) – Sales

Communications is the biggest operating Group at Siemens (Fig. 2). Our activities in this field cover fixed, mobile and enterprise networks, and mobile and cordless phones. The Siemens Communications Group generates nearly 18 billion euros a year and is one of the world’s three top providers in the telecommunications industry. Siemens is the only company in the industry that offers its customers a complete portfolio – ranging from end-user devices to complex network infrastructures for enterprises and operators, as well as all of the associated services.


Figure 3: Communications (COM) – Fixed networks …

And this broad portfolio ensures our enduring success in the market (Fig. 3). For example, every third telephone call made around the globe uses Siemens technology. Our customers include 180 mobile network operators in over 90 countries. And we have a leading position in so-called "real-time communication" and with IP-based solutions for enterprises.

And that, ladies and gentlemen, wraps up my brief glimpse of Siemens and our various telecommunications activities. Today and tomorrow we will be taking a closer look at the future of one of the world's most dynamic industries – telecommunications. Our presentations and discussions will center on current industry topics, on key trends, and on market developments. And above all, we will be concentrating on industry prognoses and the future.


Figure 4: Quotes K. Popper and W. Churchill

Sir Karl Popper, one of the greatest philosophers of the 20th century, once said: “The future is uncertain – otherwise we would surely know it.” (Fig. 4) Yet even considering all uncertainties, I believe it is essential to look forward and – with due caution and prudence, of course – develop a picture of the future. This caution and prudence is precisely what separates experts from dreamers. And it was Winston Churchill, if you recall, who so accurately characterized experts: “An expert is someone who can always explain afterwards why his prediction wasn’t right.” Today, I would like to tell you – in as realistic terms as possible – about how Siemens sees the future of communications. In fact, it is quite possible that predicting the future is easier for Siemens employees, per se. Why? For two reasons.


Figure 5: Siemens: Innovation is key to success (I)

First of all, it has to do with the history of our company (Fig. 5). We have been around for quite some time – nearly 160 years, in fact. Not many other companies can claim such a tradition. Not only that, but this long history has also been extremely fruitful. And even fewer companies can make this claim. Innovation has been our company's lifeblood from the very beginning – and has fueled our success over many generations.

Siemens works in technology fields that are highly complex and demanding. And in these fields, one key success factor is being fastest to market with new technology solutions that give our customers a competitive edge. To ensure that we can keep up this demanding pace and remain a trendsetter, more than 45,000 employees work in R&D centers to keep our pipeline full. Most of them are located in Germany, the United States and Britain – but we also have centers in China, India, Russia and other countries around the globe, working to develop products for local markets. And our people deliver. In the past fiscal year we submitted 8,200 invention reports and filed patent applications for about two-thirds of them.

It seems we have been pretty accurate with our forecasts about future developments and technologies. Right now at Siemens, over 75 percent of our products have been on the market less than five years. In other words, three-quarters of our products today were just in the planning and development stage a bit over five years ago. This rather high success rate for our innovations fuels our company's success. Or let me put it this way: five years ago, we had very concrete ideas of the innovative products, solutions and services we would be offering to our customers in 2005. And we were right in most cases.

But this time frame really is only an "average temperature" – so to speak – across our company's enormous spectrum of products, ranging from washing machines and high-speed trains to computed tomography systems, and from power plants and car navigation systems to telecommunication networks and devices. And you know just as well as I do: in the telecom field, five years can be half an eternity. The product cycles here are usually much shorter. Brutally short, I should add. The phones our parents used were on the market for 30 or 40 years – unchanged. Today, a cell phone has a life cycle of some six months or so – if we are lucky.

Figure 6: Siemens: Innovation is key to success (II)

And what is the second reason why it is easier for us to make accurate predictions about the future? It's simple: we make enormous efforts to find out today what could be important tomorrow. In other words, our accuracy is not the result of some magic recipe or even a reliable gut feeling – but is the product of a sophisticated and proven system. In simple terms: we develop a "picture of the future" that is as accurate as possible (Fig. 6).

These pictures are visions of where the world of technology is headed in the years to come. They are the product of two opposing approaches or perspectives, each of which reinforces the other. Let me explain: on the one hand, the pictures are extrapolations into the future based on the world of today. On the other hand, they are generated through retropolation back to the present, starting from the world of tomorrow. Extrapolation, the first perspective, may also be seen as "road-mapping" – in other words, projecting the technologies and products of today into the future. What we call retropolation involves imaginatively placing ourselves some ten, twenty or even thirty years or more into the future. Once an appropriate time frame has been selected for a specific business, a comprehensive scenario can be devised that incorporates all relevant factors, including the future development of social and political structures, environmental considerations, globalization, technological trends and new customer requirements. The trick then is to backtrack to the present from the "known" facts of the future scenario. In this way, it is possible to identify the kinds of challenges and problems that need to be overcome to get there – universal trends such as social developments, the growing scarcity of natural resources, global warming and other such factors. From this scenario for the future, we work back to see what products, solutions and services have to be available within a given time frame. We then use the results derived from these two approaches to shape the strategic planning of our business portfolio.

Planning for the future is a matter of survival for every company. And I believe a critical look at the future goes well beyond the interest of individual companies. In the end, it is a matter of what kind of world we want to live in tomorrow. That's why we don't study the future behind closed doors, but engage in productive dialogues with society. Just a few months ago, we published a comprehensive study on life in the year 2020 – focused initially on Europe. The 320-page study presents two scenarios for life in Europe in 2020. One thing is quite interesting: the scientists compared two fundamentally different scenarios for the future. Both are considered equally plausible. And, as I said, they differ radically from one another.

In the first scenario, a strong government cares for its citizens – and it has to, because the state's strong role is combined with a weak economy. A dynamic economy and innovation are not part of the picture. In this scenario, there aren't enough resources for making major advances in communications. At the same time, people are not willing to accept new technologies and permanent structural changes in their society.


In the second scenario, the world looks completely different. In this picture of the future, the role of the government is strictly limited to core tasks. Personal responsibility is given top priority. The possibilities for highly qualified workers are virtually unlimited. Technologies of every kind are welcomed and used. The economy is dynamic. Yet many people have great difficulties meeting the high demands and keeping pace with the speed of change.

These two opposing scenarios make it clear: our future depends decisively on the answers that we develop today to meet future challenges. I believe telecommunications offers many opportunities. And we have to remember: the future doesn't just happen. We are the ones who shape the future with innovations.

After this rather abstract reflection, I would like to turn to a more concrete picture of the future. And I would also like to take a more specific look at the dynamism and change that so dominate our industry. While this change was driven over the past decades by the market – and here I need only mention deregulation, privatization and globalization – change today is driven by technology. It is marked by innovation. We are well on the way toward an "always on" society. In the future we will constantly be linked with the Internet and thus always accessible via phone call, e-mail, SMS or video.

Figure 7: Defining convergence


And I believe one key technological development is especially important. One that will dominate telecommunications in the future and that will have to be fully accepted by manufacturers, operators and users alike. This trend is convergence (Fig. 7). The term convergence generally means the merging of technologies that were previously separate. Separate technical worlds are moving toward one another. This applies to fixed networks and mobile networks. For public and for private networks. For telecom and information technology. And naturally for devices as well.

Figure 8: The LifeWorks@Com vision – Mobility and Flexibility

Granted – all this is still a vision at the moment. And for this vision of a uniform communications experience, Siemens has coined the name LifeWorks@Com (Fig. 8). The basic idea is to make life with communications much easier and simpler – for the user and also for the network operator – whether it is a carrier or an enterprise that has its own communications system. The entire world of communications should function with a single interface – and the user should no longer have to think about choosing the appropriate network. This is an ambitious goal. But it is absolutely worth working for – with every possible effort. Because there is a great deal to be done yet. Let me give you a brief look at the communications landscape we have today.


Figure 9: Communication landscape today (I)

Right now we have a great variety of communication possibilities available (Fig. 9). If we want to get in touch with someone, we can use various devices – like computers, telephones or sometimes even a fax machine. We can convey our messages in many different ways – from traditional phone calls to SMS messages. And in our communication, we use completely different networks. Usually there is still a clear separation between data cables and access to telecommunications. The wide variety of choices increases our options.


Figure 10: Communication landscape today (II)

But when we first reach a voice mailbox in the fixed network, then reach no one on the cell phone, and finally the auto-reply function of someone's e-mail tells us that our desired communications partner isn't available – then we are annoyed at all the time we have lost and not at all charmed by the variety of communication channels (Fig. 10). Surveys show that the great variety of communication possibilities often overwhelms us. Even more: failed attempts to communicate with someone cost time and money. Productivity suffers. And what now?


Figure 11: LifeWorks@Com (I)

The best solution would be if we didn’t have to worry about reaching someone – because we know in advance that the way to contact people has been optimized for us. This is precisely the goal we are pursuing with LifeWorks@Com (Fig. 11).


Figure 12: LifeWorks@Com (II)

One technical prerequisite for a uniform communications experience is the so-called “presence functionality” – that is, the possibility to constantly be informed about the availability of the communications partner (Fig. 12). The prime requirement for this, of course, is to be “always on” – as we know it today from the Internet world. And that will indeed be possible when we make convergence a reality.
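
To make the idea tangible, here is a minimal sketch, in Python, of how a presence service of this kind might be modeled. Everything in it – the class, method and user names – is an illustrative assumption for this sketch, not a Siemens API.

```python
from dataclasses import dataclass, field
from enum import Enum
from typing import Callable, Dict, List


class Status(Enum):
    AVAILABLE = "available"
    BUSY = "busy"
    OFFLINE = "offline"


@dataclass
class PresenceService:
    """Toy presence registry: devices publish status, watchers are notified."""
    statuses: Dict[str, Status] = field(default_factory=dict)
    watchers: Dict[str, List[Callable[[str, Status], None]]] = field(default_factory=dict)

    def publish(self, user: str, status: Status) -> None:
        """A device reports its user's current availability."""
        self.statuses[user] = status
        for notify in self.watchers.get(user, []):
            notify(user, status)  # push the update to every subscriber

    def subscribe(self, user: str, callback: Callable[[str, Status], None]) -> None:
        """Register interest in a communication partner's availability."""
        self.watchers.setdefault(user, []).append(callback)

    def query(self, user: str) -> Status:
        """Unknown users count as offline."""
        return self.statuses.get(user, Status.OFFLINE)


svc = PresenceService()
svc.subscribe("alice", lambda user, s: print(f"{user} is now {s.value}"))
svc.publish("alice", Status.AVAILABLE)  # prints: alice is now available
```

The point of the sketch is the push model: rather than polling each network separately, every device publishes into one registry, which is what would let an "always on" client show a partner's availability across networks.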


Figure 13: Levels of convergence

This convergence is taking place on various levels (Fig. 13): First, on the network level. Here, the task is to meld and integrate fixed telephone networks, mobile networks and data networks. Second, on the device level. Here, the job is to integrate telephones and computers in one user device. And third, on the application level. Here, it is all about integrating voice applications such as mobile phone mailboxes with data applications.


Figure 14: Converged networks

The first convergence level – the network infrastructure – will enable the success of converged services in the future. IP will be the common basis (Fig. 14). There are no doubts about this. The telecommunications industry has developed many new processes for broadband communications – both on a mobile and fixed-line basis. An important goal in this regard is the seamless transfer between different technologies. Users shouldn’t notice which transmission standard their laptop, cell phone or PDA is using – whether it’s WLAN or WiMAX, UMTS or HSDPA. Several different standards will co-exist over the next few years, but fixed-line networks, mobile networks and the Internet will ultimately merge. And a main activity area for telecommunication equipment vendors will be to bring together separate networks such as company LANs, mobile communications networks and fixed-line networks into a single platform.


Figure 15: Converged devices: Example BlackBerry in cell phone

On the second level, with devices, I see more forms of convergence (Fig. 15). Entertainment electronics and communications systems will merge. Television and cell phone will meld into a single device. Video-on-demand will work via the PC. We will use our SmartHome to control entertainment, telephony, lighting, household appliances and building security. All this is still largely a vision of the future. But current developments are making it quite clear: these worlds are moving closer together and will be completely digitized and networked.

I want to limit my remarks today to trends in mobile phones. On the one hand, it is all about the scope of functions offered by the devices themselves. Over the past years, telephones have not only become far more efficient, but they have also taken on an enormous variety of functions. That is one result of the increasingly fierce competition. Today, mobile phones usually have cameras and many also function as MP3 players. Business cell phones often have highly sophisticated PDA functionality – just as some PDAs have been given communications functions. The combination of pocket computer and telephone is especially interesting if familiar PC applications can also be used on a mobile basis – and in much the same way users know from the PC world. The key factor here is a seamless interplay of both sides.


Convergence can offer benefits to the user only if it is supported all the way from the device through the services to the infrastructure. Today there are already mobile phones on the market that are equipped with the well-known BlackBerry functionality. They deliver e-mails automatically and in parallel to both the desktop and the mobile phone. And an impressive market for such devices is out there. We believe mobile access to e-mail will grow extremely fast.

Figure 16: Mobile data sales are booming in Western Europe

One could actually talk about the killer application that we have long wanted for UMTS (Fig. 16). The network operators will rub their hands: according to our analysts, the data share of the mobile network operators' total sales will be around one-third in 2008 – double today's level. An alternative to the technology of Research In Motion is development based on open standards. Such solutions can work with completely different mobile devices like smartphones and PDAs. As with BlackBerry devices, mail, calendar entries and addresses can be transmitted to mobile devices and synchronized, and data can be processed on some devices.


Figure 17: Innovation for converged devices

And the development of converged devices is being propelled by concrete innovations (Fig. 17). Siemens developers have built a demonstrator that makes it possible to change networks during a call. The caller uses a data card in a laptop or PDA to call via either the company network (Ethernet), a WLAN or the UMTS network. If the user leaves the office during the call, the Ethernet connection is cut off. The VoIP data packets then automatically take the most efficient route, depending on the availability of other networks. The unit also allows the UMTS network and WLANs to be used simultaneously to increase the transfer rate if large amounts of data need to be sent.

And next year, we will introduce a UMTS cell phone with an integrated WLAN. This requires a unit equipped with two chips. Siemens developers are even working on integrating different transmission standards on a single chip, with the Software Defined Radio (SDR) approach. With SDR, a hard-wired chip architecture will no longer decide which frequency a terminal can transmit or receive on; software installed in the unit will decide, so one cell phone can function in all networks. Furthermore, Siemens is also lead partner in a research project to develop a universal radio technology to supplement current standards after 2010. One goal is to achieve data transfer rates of up to 1 Gbit/s at distances under 100 meters, and approximately 100 Mbit/s for a broader radius.
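
The network-switching behavior of such a demonstrator can be pictured as a small selection routine. The following Python sketch is a rough illustration under an assumed preference order (wired over WLAN over cellular); the names and the bundling rule are inventions for this sketch, not the demonstrator's actual logic.

```python
# Assumed preference order: wired first, then WLAN, then cellular.
PREFERENCE = ["ethernet", "wlan", "umts"]


def select_route(available: set) -> str:
    """Pick the most efficient network that is currently reachable."""
    for network in PREFERENCE:
        if network in available:
            return network
    raise RuntimeError("no network available")


def bundle_for_bulk(available: set) -> list:
    """For large transfers, use every reachable network in parallel."""
    return [n for n in PREFERENCE if n in available]


nets = {"ethernet", "wlan", "umts"}
print(select_route(nets))     # ethernet -- in the office, wired wins
nets.discard("ethernet")      # the user unplugs and walks out mid-call
print(select_route(nets))     # wlan -- the call survives on the next-best network
print(bundle_for_bulk(nets))  # ['wlan', 'umts'] -- combined for throughput
```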


Figure 18: Integrated communications today

On the third level – applications – the job is to develop applications that function independently of the type of network and the communications architecture, and that offer the user a uniform environment (Fig. 18). One concrete example of this is the mobile PBX (Private Branch Exchange) solution, a kind of exchange for the mobile phone. This allows telephone conferences to be held on mobile phones as well. The job is to make mobile telephony just as convenient as fixed-network telephony. For service workers, for example, an incoming call can be automatically transferred to the next person in the team until someone accepts the job (a minimal sketch of this behavior follows below). In Sweden and Norway, more than 50,000 people already use such a service – primarily in companies with a high number of mobile employees.

Yet this is just the beginning. In the medium term, IT and telecommunications will converge. Telephony will become an IT application and can be integrated into electronically mapped business processes. Siemens already has an innovative product on the market that merges telecommunications and data communications in a single interface. With this solution – which we call HiPath OpenScape – we haven't reinvented the wheel, but have selected the world's best-known platform: Microsoft Outlook.
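
Here is the promised sketch of the team-forwarding behavior, in Python. The function, the team and the accept-callback model are assumptions made for this illustration, not part of the mobile PBX product.

```python
from typing import Callable, List, Optional


def route_to_team(team: List[str],
                  accepts: Callable[[str], bool]) -> Optional[str]:
    """Offer an incoming call to each team member in turn;
    return whoever accepts the job, or None if nobody does."""
    for member in team:
        if accepts(member):
            return member
    return None  # nobody accepted: divert to voicemail, say


field_service = ["anna", "ben", "carla"]
on_break = {"anna"}
print(route_to_team(field_service, lambda m: m not in on_break))  # ben
```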


HiPath OpenScape brings together telephone and e-mail communication, voice-controlled services, text messaging, calendar functions and instant messaging – a service that enables users to chat and exchange data in real time. Furthermore, it is possible to conduct network-independent video conferences with several participants, and for several individuals to jointly work on documents and files of all types. This is a particularly important aspect, since it substantially reduces the number of business trips required. HiPath OpenScape is easy to use, as all applications are administered via a uniform PC interface developed in cooperation with Microsoft. The system is currently only usable with Windows, but it will be adapted for Linux as well in the future.

Users define when and through which terminal they can be contacted, allowing them to remain inaccessible if necessary. The system also comes with a VIP function that gives preferential treatment to certain callers. If the person to be contacted is inaccessible, the system determines to whom or through what medium contact attempts should be forwarded. The system's biggest advantage is that users can be reached at all times at a single number, wherever they are. What's more, the system always selects the least expensive route.

Ladies and gentlemen, so much for my brief survey of the most important trend in our industry – convergence.

Figure 19: Drivers for convergence


Yet the picture wouldn't be complete without talking about two important drivers of this development toward an "always on" society (Fig. 19). I am talking about advances being made in broadband technology and software. New technologies open up new perspectives. In the future, people will be able to work at home with all the amenities they have at the office, for example, and managers will be able to monitor robots on production lines from their desks.

Figure 20: Broadband drives convergence (I)

All of these applications require broadband connections – in other words, transmission capacities of more than 200 kbit/s, as are offered by the Digital Subscriber Line (DSL), cable modems or satellite links (Fig. 20). The number of broadband users worldwide has risen from 100,000 in 1996 to 98.8 million at the end of 2003, and this trend is set to continue. In a report titled Broadband Worldwide 2004, market researchers at the New York-based company eMarketer Inc. predict that about 246 million private and commercial customers will be using broadband access by 2007.
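
As a rough sense of the pace behind those figures (simple arithmetic on the numbers quoted above, taking seven years from 1996 to the end of 2003):

\[
\left(\frac{98.8\times 10^{6}}{1.0\times 10^{5}}\right)^{1/7}-1 \;=\; 988^{1/7}-1 \;\approx\; 1.68,
\]

that is, roughly 170 percent compound annual growth in the number of broadband users over that period.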


The U.S. and Japan are the largest broadband markets today, with 27.6 million and 12.1 million customers respectively. DSL is the global leader: By the end of 2004, the number of DSL users is expected to rise to 86.5 million, increasing to 156.7 million by 2007. Second place is occupied by the cable modem, which has many users in the U.S., Canada, Belgium and the Netherlands; though only in the U.S. do modem users outnumber DSL customers. In Japan there were already 1.14 million high-speed fiber-to-the-home connections in March 2004 – double the number for 2003.

Figure 21: Broadband drives convergence (II)

Increasingly, complex data services can also be used on the move (Fig. 21). Although Third Generation (3G) networks are establishing themselves more slowly than expected, technologies such as General Packet Radio Service (GPRS) are on the increase. Forrester Research expects that GPRS will be standard for cell phones in Europe as early as 2005 and will be used by about 72 percent of cell phone owners. By 2008, 60 percent of all cell phone users are expected to be using mobile Internet services regularly.


Figure 22: Broadband drives convergence (III)

Companies are also seeing an increasing number of advantages in the always-on society (Fig. 22). For example, production data can now be evaluated in real time, and that makes it possible to reduce stock inventories and track orders more effectively. Data exchange also enables applications including remote maintenance of plants and machines. Ethernet – the IT standard for offices – is gaining in importance as a transmission medium. And the Industrial Ethernet variant – which has been modified to meet the needs of industrial companies – is being increasingly used in production instead of proprietary solutions. Among other things, this makes it easier to transmit data between production sites and administrative offices. Along with the advances in broadband technology, software is the decisive factor for converging networks, devices and applications.


Figure 23: Software drives convergence (I)

Software is the key to the future of telecommunications (Fig. 23). Software is, without a doubt, the most invisible technology of all. Yet the trends shaped by software developers and their applications have a profound impact on our lives. The trend from hardware to software in our industry cannot be overlooked. Just a few years ago, telephone exchange systems for network operators and companies dominated the market. Today we no longer deliver room-sized cabinets to our customers via container ships; the same technology is delivered on a CD. And this technology – which we call Softswitch – is far more powerful and efficient than ever before.

For example, the key component of our LifeWorks system is such a software-based switching station. It functions as a cross-network control and connection interface that directs and forwards incoming signals. A media-independent protocol plays a key role here, since it is the world's first protocol that can be used in all communication environments – fixed line, Internet and mobile wireless. Since all user access data is centrally stored on a single server, it is possible to determine at all times if and how anyone can be contacted, irrespective of the time, network, location or device. This innovation has helped us gain new customers. One of these is the New York-based company Cablevision, whose 100,000 subscribers can now use their TV cables to make phone calls thanks to Softswitch. The telephone companies BellSouth and SBC have also begun to introduce the Siemens system.

The trend on the device side is also clear: software is taking over ever more functions from hardware. Today, software accounts for over 70 percent of the value added of a mobile phone. Within six years, the number of program lines has grown five-fold. In a modern mobile phone there is more software than in NASA's Gemini space project in the 1960s. Software has made an astonishingly fast transition from its role as an academic tool to a major economic factor, and expectations for the productivity of program development grow from year to year. Over 30,000 software developers work at Siemens. The R&D costs for software total more than 3 billion euros a year. Software – usually hidden or embedded – is in a broad variety of products, from cell phones to cars to industrial automation systems.

Figure 24: Software drives convergence (II)

What’s driving the explosive shift away from hardware and toward software as the engine of innovation? Probably the most fundamental factor is the nose-diving cost of computing power (Fig. 24). In 1976, a Cray computer capable of 100 million

2 Telecommunication Industries in Transition

31

floating point operations per second cost the equivalent of about 13 million euros. Today, you can find the same computing power under the hood of an average car, and the price tag will be a modest 13 euros. In 1994, 1 Mbit (one million bits) of memory cost the equivalent of about 3 dollars and 26 cents. By 2003 it had dropped to approximately two cents. This trend means that devices ranging from cell phones to automotive infotainment systems and set-top boxes can have enough computing capacity to accommodate an operating system and a spectrum of application software indeed. These so-called “embedded” systems now account for a major part of the world software market. This market is growing steadily. U.S. market researchers at IDC report that worldwide spending on packaged software alone was 185 billion dollars in 2003 and is expected to reach some 260 billion dollars by 2008. The overall software market is, of course, much larger, because it includes software that companies write themselves. In contrast to the market for PC-based software, where growth is slowing, analysts expect explosive growth in embedded software – software used where most people don’t see it, as part of anything from a cell phone to an industrial control system.
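
Those two price points imply a strikingly steady decline. Assuming roughly 29 years between the 1976 Cray and today's 13-euro equivalent (a factor of one million), the halving time of the price is

\[
t_{1/2} \;=\; \frac{29\,\ln 2}{\ln 10^{6}} \;\approx\; \frac{29\times 0.693}{13.8} \;\approx\; 1.5\ \text{years},
\]

that is, about every 17 to 18 months; the memory figures (a factor of roughly 163 over nine years) give a similar halving time of about 15 months.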

Figure 25: Software drives convergence (III)


Today’s automobiles tell the story (Fig. 25). Embedded functions are migrating from hardware to software as microprocessors take over functions from electromechanical devices. In 2000, car manufacturers and suppliers spent about 25 billion euros on development and production of software designed for automobiles, according to Mercer Management Consulting. Mercer predicts spending will quadruple by 2010. By then, 35 percent of the value of the average car will come from electronics and software with 13 percent being software.

Figure 26: M2M communication is a huge growth market

Great potential is also being developed through advances in software used for machine-to-machine communication (Fig. 26). This is a major growth market. Today some 1.3 billion people use GSM-based telephones, but only 15 million machines are capable of communicating. Looking at these numbers, we are highly optimistic about this market's prospects. Market analysts see average annual growth rates of up to 50 percent in the machine-to-machine market. The greatest potential for embedded software is where there is a human interface. And this is naturally true in particular for our industry – telecommunications. Because telecommunications is ultimately the exchange of ideas and knowledge from person to person.
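
To put the quoted growth rate in perspective – illustrative arithmetic only, taking the upper bound of the analysts' range over ten years:

\[
15\ \text{million}\times 1.5^{10} \;\approx\; 15\times 57.7\ \text{million} \;\approx\; 865\ \text{million},
\]

so a decade of sustained 50 percent annual growth would bring the number of communicating machines within reach of today's GSM subscriber base.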


Ladies and gentlemen, let me briefly sum up my points.

Figure 27: Summary

The megatrend in our industry is convergence (Fig. 27). Networks, voice, data and applications will be merged. We are well on our way toward an "always-on" society. Technology will substantially change the way we live and work. Mobility at work and at home will play an ever greater role. And all these changes will be made possible by the triumph of broadband networks – especially mobile networks. These changes will also be made possible by the steadily growing efficiency and productivity of software. In particular, embedded software will further drive the digitization and networking of all devices in the office, on the move, and at home.

In my opening remarks I presented two quite different future scenarios for 2020. Naturally, a lot can happen by then. But I believe that – in view of the dynamic developments in telecommunications – we can spot some of the developments I sketched far in advance. I can't say for sure, of course. In this respect, the future remains uncertain. But I am certain that we should use events like our symposium – to discuss the future, and to set our paths toward that future. Because in the end, the future will be what we make it. And we should rely on our knowledge, our experience and also our basic intuition.


Figure 28: The Future of telecommunications

Otherwise it could turn out as an American once described (Fig. 28): "If there had been a computer in 1872, it would have predicted that by now there would be so many horse-drawn vehicles that the entire surface of the Earth would be 10 feet deep in horse manure." (Karl Kapp)

2.2

Discussion

Chair: Prof. Dr. Arnold Picot
Münchner Kreis and Georgetown University

Prof. Picot: Thank you very much, Mr. Ganswindt, for this very fascinating and multifaceted picture that you have drawn for us. I think this was an outstanding opening to our symposium. I also hope that we will have a lively discussion following this keynote speech. I would like to start with a very simple question. One of your slides showed us a picture of the future of communication networks involving PSTN, Public Switched Telephone Networks. Do you think that 10 or 20 years from now we will still have this specific kind of networking in telecommunications, or will these networks look quite different?

Dr. Ganswindt: If we talk about a timeframe of 10 or even 20 years, I think there will still be a PSTN network, and it will be used worldwide. Looking at our order intake, which is a good indicator of what is going on in the world, I know that we will still have a big business with proprietary PSTN solutions, especially in emerging countries, and so I am absolutely sure that all these operators will want to depreciate their systems over a long time period. So we will have these systems, and we should not underestimate that we are always talking about a real mass market – and it takes some time before you can change behavior in a mass market. Voice over IP is there, we will see it, and it will really ramp up and accelerate. But I am sure that ten years from now we will still have a PSTN network.

Prof. Picot: Thank you very much. Yes, please.

Prof. Bernd Brügge, Technical University Munich: I was fascinated by your last slide with the horse-drawn vehicles, and I would like to relate it to your visionary scenarios. I would like to know how you deal with paradigm breaks in a visionary scenario, because you cannot plan them. Take for example the success of SMS, or any other technology you didn't plan for but which turns out to be profitable for your company. How do you deal with that? Can you really afford to have all your people concentrate on convergence, or should you also have a team that deals with divergence?

Dr. Ganswindt: What you are talking about is what we call a disruption: something changes dramatically, something that you were not able to foresee. What we do in our organization is, first of all, listen to our employees and listen to our customers to understand what they see coming. However, in some cases that is not enough – in our mobile phone business, for instance – and therefore we established a special group of people. Those people sit around and think about what could be possible. They look at what is going on in the market, what the trends are in Japan, Korea, the United States, Europe and big cities, and what people like to do. By doing that, in some cases you can foresee what is going on. But things like SMS happened and will also happen in the future. We should be open enough to understand early on that such a new trend creates an opportunity for us as a company.

Laura Shereman, Argus International LLC: Part of what is really important for the future is standards. One of the things you didn't address was the importance of standards, and in particular how standards get developed when you have places like China which want their own standards. So I would be interested in hearing your views.

Dr. Ganswindt: That is an interesting remark. We had a short discussion two hours ago in the board of the Münchner Kreis about standards and the question of what is going on in the world. It is true that in the past we had two big centers in the world, the European center and the US center, and most standards were created by these two centers. They did it more or less together. Standards were the basis of telecommunication; without standards it would be impossible to create such a system as we have today. However, we have to accept that there is now a third force in the world, and that is China. The Chinese people and the Chinese government are very much interested in creating standards that will support their environment. Therefore, we have to accept that we now have three parties at the table: the US, the Europeans and the Chinese. I am absolutely sure that this will increase creativity and will definitely help emerging countries all over the world to take their share in the creation of standards. I think it is something that had to happen, due to the fact that most Asian countries, including China, are now in a phase of developing their own economies and their own industries, and from my point of view it is a good development. We as Europeans, and also you here in the United States, can take part. We are active in China, talking to the Chinese, and I know some of my competitors in the United States do the same. From my point of view it is a good development.

N.N.: I am interested to learn your views on the future of the mobile industry. Will wireless carriers be able to continue the relationship with their customers, or will mobile carriers become just platform providers?

Dr. Ganswindt: That is an interesting question. Looking at the development in the wireline network, I think we should look a little bit back in history. It shows that changing technology also brings opportunities to change the whole business model. Right now the mobile network is really a proprietary network, and the GSM network as we know it was created under the principles and rules of the old PSTN network. However, looking at the 3G network, things are changing. We are now in a packet world, and that will also allow companies to create new business models using this technology. I know that mobile operators – companies like Vodafone – recognize what happened in the wireline industry and now ask themselves what they have to do to make sure that they can somehow protect the mobile island they have today, where they are able to link their customers to their network and control the customer. As my colleague Anton Schaaf put it, the SIM card – the card in the mobile phone – is really a strong vehicle for linking the customer to the operator. In the wireline industry we lack this opportunity.

Prof. Calderbank: Right now in the mobile industry you have a small number of very fierce competitors. If you converge on IP and you make a direct wireless connection to the Internet, then you will invite in a much larger number of competitors. Are you worried about that development?

Dr. Ganswindt: Actually, I am not worried about that development, because I am part of the supplier industry. I think the mobile network operators are worried about it.

N.N.: If in the GPRS-type network it is just a router, then it is an opportunity for Cisco. If it is something that implements something proprietary about GSM, or something particular to GSM, then it is an easier market to defend.

Dr. Ganswindt: Looking at the technology, I think you need a little bit more than a router to transfer data over the air. This technology is the core of the mobile network, including the mobility functionality, which is somehow integrated in the network. So from my point of view this will remain. But the basic principle of transmitting data over the air will be a packet principle in the future, and that will create new opportunities. That is true. And for us as a supplier industry it is good if we have more opportunities in the network, because more competition also means more ideas – and finally more opportunities to sell equipment and solutions.

Mark Cooper, Consumer Federation of America: Let me take this in a little different direction. I have no doubt that the 1.3 billion people who are on today will grow into 2 billion people who can be always on. My concern is about the 4.7 billion people who are not on today remaining never on. The question is, in this vision, which is a very high-end vision: do you think about a basic functionality that people who are not in that market can afford? I know you can make a lot of money in a 2 billion person market. But I am not sure I want to live in that world with 2 billion people always on and 6 billion people who are not on at all.

Dr. Ganswindt: I have the opportunity to talk to those countries and those leaders who are really concerned about this so-called digital divide – between those people who will have the opportunity to be always on and those people who will not have it. There is a special task force of the United Nations that addresses this issue, and I have the opportunity to participate there. To be honest, I think we have the technology available. It is true that technology is not cheap, but it is much cheaper than it ever was before. It could also be used by emerging countries to make sure that they give their people an opportunity to be always on. We will see more and more technology that will allow that. I think it is a question of time. If you look at the growth rates and the acceleration of mobile subscribers worldwide right now, it is tremendous growth. It will not end with 2 billion or maybe 2.5 billion people; more people will have access to basic telecommunication, and the technology is available.

Jerry Adwin, Consultant and Government Relations, Washington: Taking the previous question in a slightly different direction: what are the prospects that broadband and all of this will reach out very far into the rural world, and how will those folks get it? The wireline people – whether phone or cable – may not want to go out that far. Can wireless ever really be a solution, or how will people in rural areas get that broadband?

Dr. Ganswindt: I would like to give you an example. We are very active in Africa, and in some African countries people also want to be always on, and they want to get access to broadband connections. There we have wireless technology available, and it is being used. With a technology like WiMAX we will have the opportunity to give people in those areas access to broadband connections as well.

N.N.: What is your view on the CDMA standard? Do you push it?

Dr. Ganswindt: First of all, we are not pushing CDMA. Actually, we are strongly fighting against it, because we are pushing the other standards, which are mainly GSM, 3G networks and WCDMA. As far as I understood your question – looking at China, for instance: China is one of the biggest growth markets regarding the infrastructure build-out of next-generation mobile networks. Right now the whole industry is waiting for the licenses; they have not yet been issued to the network operators in China. Once that happens, we will see tremendous growth of 3G networks in China. Maybe we will see one or two standards deployed in China. This is a big opportunity. The question of which technology will be used, or which technology will be selected by the Chinese government, is more a political decision than anything else. It depends strongly on the interests of the Chinese government. It is our belief today that they will assign one frequency, one license, to TD-SCDMA, because they want to create their own standard and also want to support their own local industry. But to be really honest, we did the same thing in the past. We did it in Europe, and you did it in the United States. So this is not something that is totally new; we did the same thing. And China is at a stage where it wants to create its own industry and has to make sure that it increases the standard of living of its people. And it does so via such an industry-oriented policy.

David Allen, Collab CPR: With your provocative quote from Winston Churchill at the beginning, you in some way anchored your whole vision here on the question of certainty – or we might say uncertainty – and how societies and cultures respond to this really fundamental human issue. It seems clear that Siemens and you have chosen to respond with a very large-scale R&D effort – 30,000 people in R&D alone. That is enormous. That is a real statement about how you try to move forward and cope with the inevitable uncertainties that we face. At the same time, this is a dialogue across the Atlantic, and we could certainly conclude that here, over the last couple of decades of liberalization, we have killed our main golden goose that laid all those innovative eggs. Perhaps there are other research efforts out there on the land, but they don't quite do what we might hope. I am suggesting a contrast between your choice and the way that we have addressed it. So let me frame a question here. As you look out to your 2020, do you imagine that you and Siemens will manage to hold on to this approach? After all, the US had this a couple of decades ago, and we also have to consider the uncertainty of what might happen in the future. But we can now truly frame a look across these two, I am suggesting, somewhat different responses in different societies and styles. Should we choose liberalization? Should you choose liberalization and maybe not have that research organization? That is much more provocative, but I am trying to encourage this look across the societies.

Dr. Ganswindt: It is difficult to answer. As you have said, it is difficult to predict what will happen. But we are a European company and have a very long tradition; a company that is more than 160 years old has a tradition. We faced several challenges in our history. We have learned in our company, and maybe even in our society, that via innovation we can stay competitive. Therefore we in our company strongly believe that putting plenty of effort into our R&D and innovation is the only way we can stay competitive in the world – a world that is coming closer together, and where European society, and that seems to be different from the United States, is a more networked society where people and societies come together. The French, the Germans and even the people from the UK live in one system. The only way we as a company

Telecommunication Industries in Transition

39

in China. This is a big opportunity. The question which technology will be used or which technology will be selected by the Chinese government is more a political decision than anything else. It depends strongly on the interest of the Chinese government. It is our belief today that they will submit one frequency, one license to TDS CDMA because they want to create their own standard because they also want to support their own local industry. But to be really honest we did the same thing in the past. We did it in Europe and you did it in the United States. So this is not something that is totally new. WE did the same thing. And China is in a stage where they want to create their own industry, where they have to make sure that they increase the standard of living for their people. And they do it via such an industry oriented policy. David Allen, Collab CPR: With your provocative quote from the beginning from Winston Churchill in some way you anchored your whole vision here on the question of certainty or we might say uncertainty and how societies and cultures respond to this really fundamental human issue. It seems clear that Siemens and you have chosen to respond with a very large scale R&D effort, 30.000 people alone in R&D. That is enormous. That is a real statement about how you try to move forward and cope with the inevitable uncertainties that we face. At the same time this is a dialogue across the Atlantic and certainly we could conclude that here in the last couple of decades of liberalization we have killed our main golden goose that laid all those innovative eggs. Perhaps there are other research efforts out there on the land but they don’t quite do what we might hope. I am suggesting a contrast between your choice and the way that we have addressed. So, let me frame a question here. As you look at to your 2020 do you imagine that you and Siemens will manage to hold on to this approach? After all the US a couple of decades ago had this and we also have to consider the uncertainty of what might happen in the future. But we can now truly frame it a look across these two, I am suggesting, somewhat different responses and different societies and styles. Should we choose the liberalization? Should you choose the liberalization and maybe not have that research organization? That is much more provocative but I am trying to encourage this look across the societies. Dr. Ganswindt: It is difficult to answer. As you have said it is difficult to predict what will happen. But as we are a European company and have a very long tradition. A company that is more than 160 years old has a tradition. We faced several challenges in our history. We learnt in our company and maybe even in the society that via innovation we can stay competitive. Therefore we in our company strongly believe that by putting plenty of effort into our R&D innovation is the only way how we can stay competitive in the world, in a world that is coming closer and where in the European society – and that seems to be different to the United States - we are a more networked society where people and societies come together. French, German and even the people from the UK live in one system. The only way how we as a company

40

Chair: Arnold Picot

can finally succeed is by putting all our efforts into R&D, into innovation. I think the system in Europe is changing dramatically. Since 60 years we have peace in Europe and now the people are really moving to one vision, the vision of one European society. N.N.: You talked about convergence and to achieve this. I think you need kinds of information exchange and maybe development in the whole industry. Do you have special strategies to avoid trouble with competition authorities or competition laws? Dr. Ganswindt: Oh, an interesting question. First of all I think it is necessary that there is an exchange of ideas, of concept in the whole industry and also between competitors and that is happening. In some cases we are forced by our customers to do it because our customers are very much interested that they find a way how to migrate the existing infrastructure into a new infrastructure. That is only doable if the suppliers are working together. In some cases they really force us to do this. I know that we always had this exchange in the past. And with standardization is an exchange of concept and architectures between competitors. I am not concerned about it and I don’t believe that this will ever create any trust issues. The whole industry is too competitive and there are too many players actually. Prof. Picot: I think we owe Thomas Ganswindt a very warm thank you and applause for this very excellent presentation and discussion. Thank you very much. With this I would like to invite you on behalf of the Georgetown University and Münchner Kreis to join us at the reception outside in the hall and in the president’s room opposite to this hall where we can continue our discussion and exchange of views and also get some food and drink in order to survive this strenuous exercise that you had to undergo. I promise that tomorrow everybody will have a seat. Once more thank you very much.

3

All IP – All IT – All Wireless: The Drivers of Change

Chair: Prof. Dr. Jörg Eberspächer Munich University of Technology

3.1

Statement

Gary A. Cohen General Manager, Global Communications Sector, IBM Corp. My goal this morning is to bring to this discussion the perspective I have gained from the work with our clients in the telecommunications as well as the media and entertainment industries. The subject of convergence is not about network equipment manufacturers. Instead it is about the telecommunications companies themselves. And about how these telecom companies – although they have been talking about convergence for many years – are now actually beginning to execute that convergence. Internet Protocol (IP) has been the enabler that allowed them to collaborate in a different way. Convergence – the telecom industry converging with Information Technology (IT), with electronics, with media industries – is about collaboration, collaboration of content, devices, technologies and systems, in order to provide clients with value for which they are willing to pay, and with services that are simple to use (Fig. 1).

Figure 1: IBM’s vision of convergence – any content, any network, any device: content and application creators feed a service delivery platform over the IP network, reaching customers from home/SoHo and SMEs to enterprises and the public sector

I am heading to the National Association of Broadcasters (NAB) convention in Las Vegas next week. Verizon is represented at NAB with one of their senior executives, who is also a keynote speaker. I don’t think that Verizon is now a media company that is going to do everything about media itself. The interesting discussion here is: how do industries and companies collaborate? How do they collaborate on content in IT, telecommunications and electronics, in order to be able to deliver a different kind of value proposition? In a recent survey we conducted together with the Economist Intelligence Unit, we found that the global telecommunications executives who participated have differing views when asked about the impact of different types of telecom convergence.

Figure 2: Multiple types of telecom convergence – impact of different types of telecom convergence within the next three years; telecom executives responding ‘very strong’ or ‘strong’: voice & data 88 %, fixed & mobile 77 %, telecoms & media 66 %, telco & IT 51 %, device 46 % (% of 112 responses; multiple answers permitted). Source: IBM/Economist Intelligence Unit, Global Telecoms Executive Online Survey, Oct 2004

When we work our way down this list of the types of convergence (Fig. 2), it gets more inclusive and this allows telecom service providers to become more sophisticated in the kinds of services they can provide. They want to move from the world in which they live today, where much of what they see as their core business is commoditizing, to a world in which they are able to provide differentiation with value. And this is as much about business models as it is about technology.

Figure 3: Legacy network architectures converging around IP – enterprise data and voice applications (corporate networks, PBX, voicemail, voice response, 1-800 and directory applications, portals, transactions) migrate from separate corporate networks, the Internet and the PSTN onto the service provider’s Next Generation Network (NGN) IP network

This is where IP comes in. IP is not about one or another circle on this chart (Fig. 3). IP is about masking the underlying infrastructure and providing services and capabilities above it. This allows telecommunications service providers to be indifferent to their model of connection. In my opinion, the whole idea of next generation networks is not as much about the word “network” as the name implies. Certainly it is made possible by networking. But it is about moving on top of the networks themselves, leveraging the capacity of networking. Being able to deliver services across all these different networks is also about standards. In the 1990s, IP had the most dramatic effect on the IT industry, because IP enabled people in many walks of life – consumers, businesses, suppliers, partners and employees – to connect to IT. Some were not able to connect before IP, because the applications behind proprietary network connections locked some people in and some people out. IP opened this up. But by doing so, it also began putting pressure on opening up IT systems and moving standards up into IT systems. The 1990s were very much about the opening of IT and standards. Standards have enabled the componentization of IT technologies – and the ability to connect components effectively in order to take the best capabilities and make them work together, and to make systems work despite the fact that they don’t come from one company.

Figure 4: Legacy networks transform to the Next Generation Network – value shifting to services: hundreds of horizontally integrated services and supporting infrastructure, based on open standards, Linux and commercial off-the-shelf (COTS) hardware and software components, compute-centric IP applications and common Internet protocols (HTTP, SSL, SIP, XML, Web Services) – a convergence of the networks towards an IT-style infrastructure

The internet, open systems and standards had a pivotal effect on IT companies and the IT industry, but in my view not as much on the network industries (Fig. 4). The layering of technologies – the service layer, the control layer and the transport layer – and networking built on a set of standards allow for collaboration, allow for componentization and allow something that is in many respects a shift in power. One could say that the commoditization of long distance service and of many fixed line services has shifted power from the telecommunications service providers to the consumers. That is what drove down the prices. In the IT industry of the late 1990s, the standards movement, the introduction of open systems and the componentization of capabilities drove more influence to the enterprises, which are the major buyers of IT, as well as to the consumers – and to some degree away from the traditional IT companies. I believe the same is going to happen with the network, as we are seeing IT standards moving into the network.
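To make the layering idea concrete, here is a minimal sketch – in Python, with invented class names rather than any actual IBM or carrier architecture – of service logic written against an abstract transport layer. The service code stays identical whether the bytes travel over DSL or UMTS, which is the indifference to the connection model described above.

    from abc import ABC, abstractmethod

    class Transport(ABC):
        """Transport layer: how the bytes move is hidden from the service."""
        @abstractmethod
        def send(self, destination: str, payload: bytes) -> None: ...

    class DslTransport(Transport):
        def send(self, destination: str, payload: bytes) -> None:
            print(f"[DSL]  {len(payload)} bytes -> {destination}")

    class UmtsTransport(Transport):
        def send(self, destination: str, payload: bytes) -> None:
            print(f"[UMTS] {len(payload)} bytes -> {destination}")

    class MessagingService:
        """Service layer: identical logic regardless of the network beneath."""
        def __init__(self, transport: Transport) -> None:
            self.transport = transport

        def deliver(self, user: str, text: str) -> None:
            self.transport.send(user, text.encode("utf-8"))

    # The same service runs unchanged over either access network.
    for transport in (DslTransport(), UmtsTransport()):
        MessagingService(transport).deliver("alice@example.com", "hello")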

Figure 5: Converged telecoms value chain – operators must decide where to play. Six business models (fixed ICT operator, home broadband service provider, integrated operator, wireless network operator, virtual operator, network utility) take full, partial or third-party roles across content & applications, portal & enabling infrastructure, network, service innovation, customer management, device and enterprise management

Many new and evolving business models exist in the telecommunications marketplace (Fig. 5): fixed ICT operators, home broadband service providers, integrated operators, wireless network operators, (wireline or wireless) virtual operators and network utilities. Telecommunications service providers become more and more focused on particular market segments. They must know where to play in the convergence value chain, from content and applications, infrastructure, service innovation, to customer management or the device. They have to examine their business models and determine whether to play a full or – with partners – partial role, and decide which non-core activities should be outsourced to third parties. The end customer doesn’t need to know about the parts. They want the integrated service and this is the challenge.

Figure 6: Convergence is heralding a new ‘ecosystem’ – telecoms must partner to create value. The converged ecosystem combines telecom assets (IP networks, customer relationships, access/bandwidth), consumer electronics (PCs, smart phones, PVRs, games consoles, faster and cheaper chip-set and battery technology), media (rich content, digital rights management, branding, packaging, broadcasting) and IT services (applications, middleware, web services, managed services)

Convergence is heralding a new ecosystem (Fig. 6). Convergence is not just about how a telecommunications service provider is becoming more than a telecommunications service provider. It is about how ecosystems of companies and industries change the way they work together. It is about how IT services around applications, middleware, web services and managed services; the rich content, digital rights management, branding, marketing and broadcasting of media companies; PCs, smart phones and game consoles; and the IP networks, customer relationship systems, access and broadband are all put together in a networked proposition. Configuring these elements is the challenge of convergence. So, this is not just about technology, although it can’t be done without technology. This is about standards as well as industry and market configurations. The surveys I mentioned earlier say that 46 % of future solutions in telecommunications will be developed by the telecommunications companies, 26 % of the content will be produced by media and entertainment companies, and 19 % by IT and computing companies. And without the standards, 0 % will actually be delivered to customers. The theme of this conference is about how we make it real for the end consumer. I think we make it real by breaking down the walls between artificially established industries.

3.2

Statement

Christine Heckart
Juniper Networks Inc., Sunnyvale

I work for Juniper Networks. Our company was born in 1998, after the internet era had begun, and all 3,500 people in the company spend basically 100 % of their time, every day, thinking about how we make the move to IP happen. Our entire strategy is predicated on accelerating the journey to the destination we were just talking about. If you look back at the last decade, there was a lot of debate about what the destination was. I don’t think there is any debate right now about the destination. However, it is not a foregone conclusion that we can get there, or that we can get there quickly. The way we look at the problem, and the way we spend our time and resources, concerns two parts of how we get to that destination: a technology piece and a business model piece. I think you have covered all of those, but I will summarize them in the following way. The move to put everything over IP has two components. One is to make IP capable of doing all the things that everybody just talked about. IP was never designed to do all of this. It was designed for a really different, store-and-forward world, for applications that require almost no predictability or intelligence from the network. The kinds of applications we just talked about require incredible intelligence from the network. Some applications – it doesn’t matter what the network does – are very tolerant of security problems, very tolerant of jitter, latency and loss, very tolerant of poor quality. But many of the applications we talked about, like interactive gaming, real-time communications and IP-TV, have no tolerance for that. You need integrated security. You need very tight control over things like jitter, latency and the performance characteristics of the underlying network. And you need extreme reliability. IP wasn’t designed for any of that. So, one of the first hurdles is how we create the standards. It needs to be standards-based, so that IP has the underlying capabilities and mechanisms to support all of these applications that we all want to have on the network. The second piece of the technology problem actually has nothing to do with IP. With IP we have, for the first time, a global network language, which is phenomenal, because for a hundred years we have had lots and lots of different languages. The more languages that are used, the harder it is for everybody to communicate.

It is the same in the network: the network has never used the same language as the underlying applications that use it. So the network has no idea what the application needs from it, and the application has no idea when it is going over a network. That makes it really hard to do everything we just talked about. It turns out that there are also a lot of standards bodies working on how you converge application standards. IP gave us the internet. The internet gave us XML and standard languages for how you create and present applications. It is our belief that only when the language of the applications can be understood by the network, and vice versa, can you really have the convergence we have been talking about. That is something that almost nobody gets yet, and there is almost no work going on to make it happen. There is lots of work going on by standards bodies like the W3C, which Gary showed, the Liberty Alliance and a bunch of other organizations to standardize at the application level. And there are so many standards bodies working on IP that there are too many IP standards to choose from. But there is almost nobody working on how you make the two languages work well together, so that the network understands what the application needs, and the applications understand that there is a network underneath which they can call up to request certain levels of service, quality, security or availability. Juniper has been working on this for a couple of years with IBM and with Siemens, and we are happy to have Alcatel joining. It is something we call the Infranet Initiative. Its entire purpose is to bring the applications and the network together in a more intelligent way, because until that happens there is no way that you can bring about the convergence that we all agree is the destination. That is the technology hurdle that has to be solved. The second hurdle is what Gary spoke of: the business model hurdle. There are two pieces to that. There is an enterprise model hurdle – how do you get businesses of all sorts to become more on-demand. I am not going to talk about that one, because I think it would take even longer than we have time for here and doesn’t necessarily go in the direction of what the other speakers have been talking about. But the service provider business model does. To give some insight into this: service providers have, for at least 100 years, based their business model on shared resources. It has been like the diamond industry: the way you control your profitability is by controlling the scarcity of the underlying resource. Overnight, thanks to all the deregulation that occurred and thanks to IP, they went from being in the diamond industry to being in the grain alcohol business. Bandwidth is all the same, and there is no scarcity to it. You can make it in your backyard.

So, how do you make money with grain alcohol? How do you make money in this standardized world where everything is IP, where there is no scarcity anymore and it is very hard to differentiate? And if the answer is, well, you can’t, then none of this can happen, because unless you can make money, nothing happens. That is just the way the world works. You have to be able to make money. If you go back again to IP, IP is the source of the problem and the source of the solution. You can’t differentiate IP services; IP was never designed to do that. So, one of the things that has to happen (and in fact the mechanisms are there, but we have to make use of them) is to help service providers learn how to package what is now a commodity – just like grain alcohol. How to package bandwidth. We need to package tailored services for very small communities of interest. So, you design services just for the gamers. And you design services just for software on demand. And you design services just for delivering IP-TV. And you charge differently, and you in fact deliver very different kinds of experiences based on the services being consumed. If we do this correctly, if the technology piece and the business model piece work hand in hand, then the network will know what application you are using, and the service provider can charge accordingly. Maybe not charge by the bit, because I don’t think that is very compelling to most consumers, but providers can in fact build their business model around packaging different kinds of services for different communities of interest. But you can’t do that without the underlying intelligence, and I think you began with the question “where does the intelligence lie?” I can tell you that I have been in the industry for over 15 years; I started out in the service provider world and grew up there. Since I have been in the industry there has always been a debate about where the intelligence is. I am going to tell you that this is a completely meaningless debate, because unless the intelligence is everywhere, unless the answer is “yes” – it is in the network and it is in the devices and it is in the applications – and in fact unless the intelligence is all the same and we are speaking the same language, you can’t do any of the things we just talked about doing. We have got past the debate about the destination, and I hope we can get past the debate about where the intelligence is, because if the networks are dumb or the devices are dumb or the users are dumb, we can’t do any of this anyway. So, everything has to be smart, and it also has to be standards-based. And then, when we solve the business model problem and the language problem, we actually can make all of this convergence happen.
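The Infranet Initiative’s mechanisms are not spelled out in this volume, but one existing, standardized hook that points in the direction Heckart describes is DiffServ marking (RFC 2474), where an application tags its own packets with the class of treatment it needs from the IP network. A minimal sketch in Python – the addresses are placeholders, and the IP_TOS socket option is platform-dependent (available on Linux, for example):

    import socket

    # DiffServ code points (RFC 2474): a standard way for an application
    # to tell the IP network what treatment its packets need.
    DSCP_EF = 46           # Expedited Forwarding: low jitter/latency, e.g. voice
    DSCP_BEST_EFFORT = 0   # tolerant store-and-forward traffic

    def open_marked_socket(dscp: int) -> socket.socket:
        sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
        # The DSCP occupies the upper six bits of the former IP TOS byte.
        sock.setsockopt(socket.IPPROTO_IP, socket.IP_TOS, dscp << 2)
        return sock

    voice = open_marked_socket(DSCP_EF)
    bulk = open_marked_socket(DSCP_BEST_EFFORT)
    voice.sendto(b"20ms voice frame", ("198.51.100.7", 4000))
    bulk.sendto(b"background file sync", ("198.51.100.7", 4001))

A DiffServ-aware network can then queue the voice packets ahead of the bulk traffic – and, as Heckart argues, a provider could package and price the two classes of service differently.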

3.3

Statement

Dr. Thomas Ganswindt
Siemens AG, München

Talking about “the drivers of change” means good news for me: We are on the move again – driving and not being driven. As you know, all players in our industry have been through some rough times. It was pretty much stop-and-go, with no clear progress. Now, however, a fresh wind is blowing across the information and communication landscape (Fig. 1).

Figure 1: Drivers of change

A recent study by Forrester market researchers estimates that 41 percent of European households will have a broadband connection by 2010. In an analysis published in the fall of 2004, Deloitte analysts predicted that two-thirds of the world’s 2,000 largest companies will be relying on Internet telephony (VoIP) by 2006. But what technologies are fueling this new momentum in the industry? Let me give you three examples.

Figure 2: Examples of new momentum (Fixed Mobile Convergence)

First: Fixed mobile convergence (Fig. 2)

The communications industry today is still largely divided between fixed network and mobile communications. Carriers found this strict separation to be a successful approach as long as the two technologies experienced steady and strong growth. But now both fixed and mobile carriers face declining margins – while the lack of unique selling points encourages users to switch brands frequently. Carriers are in a tight spot. The way out is to bring fixed and wireless network technologies together to form something called fixed mobile convergence. The advantages are easy to see. Carriers expand existing business by offering their customers attractive new voice, data and video services as both mobile and fixed network solutions. In addition, future network architectures will have a shared service and control level, which could lower the operating costs of fixed and mobile carriers. But what about the users? Thinking about the consequences of this technology shift, I could imagine that, in the future, everyone will have a single mobile terminal for all their telephony needs. They will be reached under a single number anytime and anywhere. People will need only one voice mail system and receive a single bill for all the services they use.
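A toy sketch of what a shared control level might do with that single number – one public identity mapped to whatever access network the user is attached to at the moment. The class and the addresses below are invented for illustration; this is not an actual carrier interface:

    class ConvergedRouter:
        """One public identity, reachable over whichever access is current."""
        def __init__(self) -> None:
            self.attachment = {}   # identity -> (network, address)

        def register(self, identity: str, network: str, address: str) -> None:
            self.attachment[identity] = (network, address)

        def route_call(self, identity: str) -> None:
            network, address = self.attachment[identity]
            print(f"Routing call for {identity} via {network} to {address}")

    router = ConvergedRouter()
    router.register("+49 89 5550101", "wlan", "sip:user@10.0.0.5")
    router.route_call("+49 89 5550101")    # reached over the office WLAN

    router.register("+49 89 5550101", "umts", "sip:user@umts.example.net")
    router.route_call("+49 89 5550101")    # same number, now on the mobile network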

Figure 3: Examples of new momentum (Real-time Enterprise)

Second: Real-time Enterprise (Fig. 3)

Managers have been clamoring for companies to turn themselves into enterprises that can communicate in real time, which is the only way they can respond to the increasing pace of market changes. Yet what does communication in a real-time enterprise really mean? The basic idea is to communicate faster and more efficiently as a way to speed up decision-making processes. I think all real-time initiatives begin with the challenge of selectively identifying reasonable information to avoid overload. The primary goal is to ensure that the right people have access to the right information at the right time. In the ordinary business world this means integrating data – such as the status of orders, the latest sales figures from branch offices or inventories of spare parts – into the company’s data flow without delay. Analyses of the information in the data pool can be retrieved at any time by anyone at the press of a button.
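To illustrate “without delay”: a toy publish/subscribe sketch in Python, with invented names standing in for whatever enterprise middleware actually carries such events. An order event updates a dashboard and the spare-parts inventory the moment it is published, rather than in a nightly batch run:

    from collections import defaultdict

    class EventBus:
        """Toy publish/subscribe hub standing in for enterprise middleware."""
        def __init__(self) -> None:
            self.subscribers = defaultdict(list)

        def subscribe(self, topic, handler):
            self.subscribers[topic].append(handler)

        def publish(self, topic, event):
            for handler in self.subscribers[topic]:
                handler(event)

    bus = EventBus()
    inventory = {"spare_part_4711": 3}

    # Dashboards and inventory react the moment the order system publishes.
    bus.subscribe("order.shipped",
                  lambda e: print(f"Dashboard: order {e['id']} shipped"))
    bus.subscribe("order.shipped",
                  lambda e: inventory.update({e["part"]: inventory[e["part"]] - 1}))

    bus.publish("order.shipped", {"id": 1001, "part": "spare_part_4711"})
    print("Spare parts left:", inventory["spare_part_4711"])   # -> 2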

I am convinced that a company’s competitive strength and growth potential depend on the extent to which business communications are integrated into core business processes. The goal is to fully integrate real-time communications into all core processes.

Figure 4: Examples of new momentum (Smart Home)

Third: Smart Home (Fig. 4)

The opportunities offered by broadband technology and network convergence help users manage the everyday challenges of life in a knowledge- and media-based society. The trend covers both business and private life. After all, it makes no sense to benefit from the advantages of modern communications at the office alone, while sacrificing the convenience they can bring within your own four walls. This approach is key to the Siemens Smart Home concept, which integrates the latest developments in information, telecommunications, electrical installation, household appliances and entertainment technology into a universal home solution and places them under one roof. We have set up a demo installation at our site in Munich to show that the possible scenarios are unlimited even today. Applications range from controlling blinds via a PDA and programming a DVD recorder from a cell phone to turning the family TV set into a turntable for a whole range of online services, including video on demand, gaming and Web surfing. By intelligently networking individual areas of life, we can make many everyday tasks easier in the future and create valuable time for the things that really matter. Communication today is still a rather cumbersome process. We are flooded with a constant stream of information, which – according to a Wall Street Journal study – has increased 64-fold over the past 30 years. To manage this flood of information, we use as many as ten different terminals, maintaining a separate address book for each one. We separate our communications into personal and business realms and communicate over such disparate media as the Internet, mobile networks and fixed networks.
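A minimal sketch of the unifying idea behind such a home solution – every appliance behind one small control surface, so that a single client (PDA, cell phone or TV remote) can replace the ten terminals just mentioned. The device names and commands are invented for illustration and are not the actual Siemens Smart Home interfaces:

    class Device:
        """Any appliance exposes the same minimal control surface."""
        def __init__(self, name: str) -> None:
            self.name = name

        def command(self, action: str, **params) -> None:
            print(f"{self.name}: {action} {params}")

    class SmartHome:
        """One gateway addresses every device, whatever bus it hangs on."""
        def __init__(self) -> None:
            self.devices = {}

        def register(self, device: Device) -> None:
            self.devices[device.name] = device

        def control(self, name: str, action: str, **params) -> None:
            self.devices[name].command(action, **params)

    home = SmartHome()
    for name in ("living_room_blinds", "dvd_recorder", "family_tv"):
        home.register(Device(name))

    # One client drives them all uniformly.
    home.control("living_room_blinds", "lower", position=80)
    home.control("dvd_recorder", "schedule", start="20:15", channel="ARD")
    home.control("family_tv", "open_portal", service="video_on_demand")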

Figure 5: Key messages

• Say goodbye to fragmentation
• No boundaries

To my first point: Say goodbye to fragmentation (Fig. 6)

The conflict between excessive information and fragmented means of communication is about to be resolved. Why so optimistic? Because the insular technological solutions we use today are becoming increasingly standardized.

Figure 6: Say goodbye to fragmentation

Fixed networks, mobile networks and the Internet are converging. The key is to find a seamless transition between individual transmission technologies and terminals. Users should not even notice the transmission standard being used by their laptops, cell phones and PDAs.

To my second point: No boundaries (Fig. 7)

Figure 7: No boundaries

Siemens has already come a long way in this direction, as demonstrated by the merger of the company’s fixed and mobile networks divisions to form Siemens Communications. The Group has also come up with a concept I presented to you yesterday – LifeWorks@Com – whose purpose is to accelerate the development of applications for establishing a universal form of communications. Universality is key, because boundaries limit the flow of information and hinder the ability of parties to work together efficiently. In the wake of globalization, teamwork has become a determining factor in productivity, and thus a competitive advantage. Analysts with the Gartner Group predict that by 2010 as many as 65 percent of employees will work in teams that meet independently of time and place. We will see.

3.4

Discussion

Chair: Prof. Dr. Jörg Eberspächer
Munich University of Technology

Prof. Eberspächer: I think we already started yesterday with interesting discussions, and I hope we will continue. Yesterday we also heard that, in some respects, it is all about uncertainty. What does information theory say about uncertainty? Uncertainty gives us a lot of information, because if you don’t know anything about the future, then a lot of information is conveyed by each event. So, I think today we can also assemble a lot of information. Our topic this morning is “All IP – All IT – All Wireless”, and this is more or less what “convergence” means. We will now go into that in detail. May I first introduce our panel members and speakers, starting with Christine. Christine Heckart is vice president of marketing for Juniper Networks. In this role, Christine is responsible for all areas of corporate marketing, working closely with the executive team on corporate strategy at Juniper Networks. Prior to Juniper she was president of the strategic consulting firm TeleChoice, where she worked with leading vendors and service providers worldwide on business and marketing strategy. She was named one of the top ten power thinkers and one of the 15 most powerful people in the industry by Network World. We will have a good discussion with you later! The next person is Gary Cohen. Gary Cohen comes from IBM. He brings more than 25 years of global leadership expertise, industry insight and IT customer knowledge to his current position as general manager, global communications sector at IBM. He has worldwide revenue, profit and customer satisfaction responsibility for IBM’s clients in several industries, including telecommunications, media and entertainment, and energy and utilities. He also leads IBM’s efforts in the emerging business areas of digital media as well as wireless broadband and sensing solutions. Mr. Cohen is a member of IBM’s senior leadership team. Before he was appointed general manager he held a number of other positions at IBM; among others I want to mention general manager of IBM’s pervasive computing division. All this information you can also read in the information booklet we have given to you. So, I go on to the next speaker, Jacques Dunogué. As you can see from his name, he comes from France. He is executive vice president at Alcatel, president of Alcatel Europe and South and a member of the executive committee. He was previously secretary general, responsible for international government and regulatory affairs at Alcatel, interfacing with Alcatel’s industrial participations and acting as a spokesman for Alcatel. In previous positions he was in the telecom sector, where he acted in particular as executive vice president marketing and business development and president of the business systems division.

Last but not least, Thomas Ganswindt. Some of you may not have attended the session last night, where he gave the introductory speech. Thomas Ganswindt earned a degree in engineering in Berlin and, after several positions, including some at Fraunhofer Institutes, joined Siemens. He has held several positions at Siemens, for example in the transportation systems group. In September 2001 he was named head of the Siemens group Information and Communication Networks. Now he is a member of the corporate executive committee of Siemens, and in this function he is responsible for the Siemens IC groups. Let us start now. Let me give a short introduction to this topic. As we have seen already, the word convergence is at the center of this symposium. I have assembled four aspects of “convergence” together with some trends covering this topic. I think we will discuss all of this in more detail later. The first aspect is IP convergence: “everything over IP, IP over everything”. Some believe that IP networks will be the universal infrastructure for all services in the future. Another point, which was not mentioned in detail yesterday, is that the Internet has also brought a lot of self-organization to every aspect of networking and services. Hence, self-organization and, as some people say, user empowerment is an important topic. The question we have to discuss is where the intelligence will be in the future – more in the network, or at the user side, in the terminals? The next aspect is the convergence of IT and telco. In our preparations for the symposium this was a very important point. Some people say it is all about IT-based services: whereas telecommunication is very necessary for the future, the key thing is IT. People from the telecommunications side see it differently, saying telecommunication is the driving force and that it also moves into classical IT-oriented areas like office and business applications. The third aspect – and it has been mentioned very often – is fixed mobile convergence. In some ways, the fixed network is a special case of the mobile network. But what about the broadband infrastructure? Can we have a wireless broadband infrastructure with speeds similar to the fixed infrastructure? The fourth aspect is especially important from the standpoint of the Münchner Kreis, because we bring together people from media and telecommunications, so telco and media convergence is in our focus. We know it and use it: streaming services exist, and video on demand is coming. On the other side, in the home, we see the entertainment systems being networked together. I think these are four aspects which may also stimulate our discussion later on. Now we will go to the first presentation.

Statement of Gary A. Cohen: (See number 3.1)

Prof. Eberspächer: Thank you for your presentation. I think the word standardization should be discussed later, because this last slide was wonderful, but also wonderfully complex. The next presentation, complementing what was said last night, will be given by Thomas Ganswindt, focusing on our theme of this morning.

Statement of Dr. Ganswindt: (See number 3.3)

Statement of Christine Heckart: (See number 3.2)

Prof. Eberspächer: Thank you. Twenty minutes are left for intensive discussion. I think we have identified a lot of areas. Maybe we should start with the standardization issue. For many years we did not focus much on standardization. Now it is obviously regaining importance, more than ever. May I put some water into the wine? When I see your wonderful diagram, Jacques, with all the well structured bodies and areas where standardization is going on – if you look into the details, you see a tremendous number of different standardization bodies, often in the same areas, producing a growing number of standards. Take one simple example, the IEEE 802.11x line of standards: nearly every week we get a new one. Even in the Internet new standards are coming up because of real-time needs and security. How can you really cope with that? If I take my GSM phone from 1990 – we have one big box in my institute’s museum – you can really use it today. It is a standard from 1998. What about IP or XML or whatever standards in the next six or ten years? Maybe some comments on that from Jacques?

Jacques Dunogué: There are many faces to it. You are right. Everybody sings the song of standardization, but everybody has in the back of their mind that they would like to become Microsoft or Cisco and set the standards. So, in fact, standardization is now increasingly part of the competition. In the end, you know, if you don’t get to the point where you start to agree on the standard, the market will probably not pick up. At the same time, everybody wants to be the first to implement, and the first to develop value on top of it, so that they have a differentiator. We all agree on standardization, but we are all competing on standardization as well. The practical implementation I see in the way we live every day. Ten or 15 years ago you had a core of standardization specialists in companies. Now standardization has become part of the normal work of our development engineers. The difficulty we have at the same time is to make sure that these people see what matters. You don’t have the time to go to the board and explain that we are discussing standardization and how the 18th bit in the frame will now be standardized, whether it should be one or zero, because nobody understands what they are talking about. At the same time, we need to extract from these people what is really strategic for the company and how to react to it. So, internally, in the way our organizations are set up, it is becoming an increasing challenge. But this is my view and my comment on standardization: standardization, yes, but it is still a competition, and managing it is becoming ever more complex.

N.N.: I have a question for the representative from Juniper. You spoke of the need for a language for the applications and networks to speak to each other. Does that require standards, or do you have a different conception of what that is?

Christine Heckart: It doesn’t require it, but it is a lot more cost effective if you base everything on standards. Jacques is right that there is always a drive to differentiate your business, because that is what businesses are based on. But the biggest problem right now with communications, at the IT or the network level, is that there are so many standards that it is not cost effective for any business to support them all. If you are in the technology field, as Juniper is, and your organization has to keep up with multiple competing standards, your resources are spread thin by the sheer number of standards. You can’t focus your resources cost effectively, and you really can’t do well in any one area. So, in one sense, everybody is doing the same thing, which makes it a little bit harder to differentiate; you have to build proprietary extensions, or you have to find other ways to add value, like having really good customer service – lots of different ways. The good side is that you can concentrate 100 % of your resources on implementing to that one standard, and if you are in the technology field, time to market is sometimes far more critical than a small technical differentiator. So, standards can actually be a more cost effective way for everybody to do business, and you can still find room to compete. If you don’t have standards, there is the additional cost and resource needed not only to support the additional standards but also to integrate them all and make them work together – something that IBM has spent a lot of time on and made a lot of money doing. The added complexity for everybody’s business, for the technology developers and the users, is the amount of inefficiency we are adding into the system by not agreeing on a simple set of underlying standards.

Dr. Ganswindt: I want to be a little bit provocative. Who is really interested in standards? To be honest, I think our customers are interested in standards. We as an industry are not really interested in standards, because standards drive commoditization and they finally drive declining margins, and therefore what Jacques said is absolutely true. We now have plenty of standardization initiatives. We have plenty of people talking about standardization, but there is no real leadership behind it. The supply industry is mostly interested in sending its R&D engineers with good ideas. Then we talk about these good ideas, but in the end no real single standard is created that will dominate the world. All of us are interested in building some proprietary solutions into a standard to make sure that we can keep our margins.

Gary Cohen: The IT industry may be more mature at this than the network industry. The main standards issue is the adoption of standards. What causes and what affects the pace of adoption? When adoption takes place, everyone lines up, because you can’t afford to miss the market and you can’t afford to be different from where the market is going. This is a statement about maturity. There is always more creation going on. What we do becomes standardized. Then we start building new innovation, which is not yet standardized. I don’t think we should get wrapped up in what the standard is, because there will not be a single standard. There will be an ever evolving set of standards – standards that may make it feel like they are disturbing providers, because they may drive a level of commoditization, but these standards are actually very good for providers if the providers are focused on innovation. What this does is raise the water line. In my view, when the water line goes up, there is more water, and that means the market is larger and my opportunities may be larger as well.

Prof. Eberspächer: Maybe we also have to distinguish between different areas and the IP layer. I think it was a real success to have only one standard, and you see we are now debating how to go to IPv6. But in video coding, for example, you get a new codec almost every day and you can adapt your…

Dr. Neumann: On the one hand, I hear that there is a greater need to have standards in place to realize all the benefits of the upcoming and converging technologies. On the other hand, we see that vendors have an ambivalent view of standards. Do you think it is necessary to come back to the old world of more mandatory standards set by regulatory institutions, or what are your alternative suggestions for managing the process?

Christine Heckart: I don’t know if I can answer the question on the process, because it is not an area in which I have a lot of real expertise and experience. But on the issue of standards: Juniper, at least, is a huge believer – the entire business and every product in it is based on open standards, and we still manage to differentiate. So, I personally don’t see those two things as being in conflict. And I agree with your statement, or the proposition, that the real key is how you get development behind one standard. As Gary was saying, the problem with standards is that there are too many to choose from and too many different bodies working on all of them, and at some point you have to converge. Either it is a messy process, or you have to put things in the market and see what takes hold. But when something takes hold and you let the market forces drive it, when you get everybody behind a standard – as everybody got behind IP, which is open, like Linux right now, like XML – that will create more economic opportunity for all parties involved than proprietary systems. It is not just in technology. You can point to any aspect of any part of our existence, whether it is DNA, which is a standard, or whether it is language. It would be very difficult to be having this conversation right now if I were speaking Swahili and you all understood a different language. We have to have one common way to speak to each other. In this room right now that is English, and you have to have the same thing in networking, or nothing can get done.

Jacques Dunogué: Just one remark on this standardization, or market-driven standardization. I fully agree with what has just been said. GSM in Europe was a big success and was imposed, or pushed, by a few actors and backed by governments. That is probably the last time we have seen this process. The major reason – beyond the philosophical debate – is simply time to market. GSM was indeed a big success, but how long did it take to get to the market? About ten years. That was the cycle we went through. Look at most of the innovations that have been brought to the market today. One example: we pushed a lot on ADSL, and there was a conflict of standards. The market decided what the standard was; at the time, our proposal won. But the next thing we had to do was share it with everybody, so that everybody would come on board – nobody imposed it, it just came through the market. I think GSM is probably the last case, and as much as we cherish it in Europe – it was a good example and so on – I don’t see that model coming back. Personally, I believe in the model of open competition, with something emerging because of market forces; it imposes itself, and then we have succeeded.

Prof. Eberspächer: Let us go to the next question.

Prof. Dowling: I have a question about vertical integration and convergence. I remember that in the early 1980s we also talked a lot about convergence, and that talk led to such mergers as AT&T with NCR and IBM with ROLM. Now we are still talking about convergence, but I am wondering if there are still advantages for vertically integrated companies. For example, IBM is exiting the PC business. Siemens is struggling in the handset business with mobile phones. I don’t know as much about Alcatel’s vertical integration. Is vertical integration still an advantage? Or will industry trends lead to a breakdown of the vertically integrated big companies and require more small companies to play a larger role?

Dr. Ganswindt: It is true that we are struggling in the handset business, but I don’t want to comment on that. We are strongly convinced that there is a huge advantage in being a vertically integrated company, as Siemens is. And we believe that particularly for the telecommunications business it is a huge advantage that you can also address vertical sectors, as I explained with the smart home concept. This is a concept where we combine different competences in our company to create a solution for the user. We are integrating the latest developments in information, telecommunications, electrical installation, household appliances and entertainment technology into a universal home solution. To realize this concept we had to bring our competences closer together. That was the reason why we have now defined a huge initiative in our company, in which so-called sector development boards combine the competences of other Siemens divisions with our telecommunications competence. We are now creating solutions for customers, because in the end telecommunication is only an enabler. It can help to drive a solution.

Gary Cohen: I tried to express in my comments that integration is critical. Integration is not a statement about a company. It is a statement about how capabilities can come together. That is why standards are so important. IBM has many capabilities under the IBM umbrella, but this is not always sufficient to satisfy our clients and their requirements. I think companies try to cluster capabilities together in order to establish a model of differentiation and one of value. But we don’t try to cluster capabilities together to close out other opportunities. As an example, in the telecommunications space we have established a network of partnerships we call the ‘Telecom Industry Network’. Currently, there are more than 750 members of that community. We think it is critical that such a community provides the facilities for our clients to build on more easily instantiated standards. As a result of those standards, we are helping many different kinds of providers of capabilities to build a more seamless world, a world requiring less integration effort from the telecom service provider.

Prof. Brügge, Technical University of Munich: Let me put a different twist on the discussion here: convergence also means monoculture. In some sense you are making everything uniform, and that means more vulnerable to the bad guys. I am talking about the threat of identity theft and various other security problems created by providing a single uniform infrastructure where you just integrate your services and base them on IP. I believe we have a major problem here. Every farmer knows that monoculture has its problems, and they seed different crops on their land every year. We are moving the other way. So, let me ask how you are addressing these problems within your companies and your integration efforts.

Prof. Eberspächer: Monoculture versus differentiation. Jacques Dunogué: If you get back to the user the user doesn’t want to informity of services. They want a variety. They want tomatoes, then they want potatoes, then they want an exotic fruit and they want all of this. However, it gets from the same ground. It gets out of the same earth. Maybe the temperature you cook it at is different or what ever. That is the image to monoculture. And it can be done in different ways. It can have a convergent service like for instance 3G which doesn’t work very well. But I want to have continued your service and you could offer it by giving me 3G which we have done in our headquarters. We are wired and have Wifi in 3G and all this stuff. Or you can do it by putting Wifi into the same terminal and then indifferently it could be connected to Wifi or it could be so. For me I don’t see convergence as a mono structural thing. It will happen in effect with diverse solutions, with diverse culture. There are needs to be the same ground and I fully agree with what has been said. If the ground is IP, okay, let’s still be it. I have been also quite a few years in Telekom. All those protocols I have seen so many now, are all bad. They are even worse than you can think. But when everybody agrees on one let’s use it. As Christine said it was not particularly designed for what we are using it for now. It was designed for the military to be very resilient networks and so on. And we are using it for TV and telephone. It doesn’t even know how to connect A to B. It doesn’t know what there is. It just sees a packet and says, well, this packet is going North or is going South, I sent it West. There are basic standards. There is the basic ground and then on this you will grow and I think convergence will give a lot of fruit. Otherwise it will not be successful. Prof. Eberspächer: I would suggest that for the next ten minutes we will shift a little bit away from standardization. What has not been covered very much up to now is the question of where intelligence is going. I think we have a clear trend in different areas. One is Peer-to-Peer technology for voice services, another is for file sharing. A lot of centralized services are now replaced by fully de-centralized services. What are you thinking about this, Christine for example? Where is intelligence really going? Christine Heckart: Your question is on intelligence and Peer-to-Peer. The one gentleman over here commented on how bad open systems are and ubiquitous because it creates more vulnerability which is absolutely true. And the gentleman here was talking about convergence and the tension between innovation and open standards. I don’t personally see the tension in the same way and I don’t see the tension that you are talking about in quite the same way and here is why. You can create a system where there is only one peer. It might be intelligent, it might be dumb. It doesn’t matter but it is just you by yourself. And there is zero vulnerability. And there might be tons of


innovation, but nobody gets to take advantage of it because it is just you. You know all about it, but unless you can share that idea and that innovation with somebody, unless there is a peer to talk to and some intelligent way to communicate, the value to anybody is zero. I can take the other extreme. Everybody wants open communication. The more communication there is, the more we all get along, the more money can flow, and the more everybody is happy and makes good business decisions and all that. It is basically impossible to create a completely open Peer-to-Peer network that is flexible and cheap and ubiquitous, so that everybody can talk to everybody, without also opening yourself up to some vulnerability. You can have complete invulnerability and zero value, or you can have openness, and then you have to ride a curve of how much risk you are willing to take. If you standardize, then you have the ability to focus everybody's resources on innovating in the right way. So if the underlying standards aren't what we are competing on, we are saying: let's use our resources to innovate around how we secure the networks better, or how we take the open networks and make them more intelligent, or how we allow machines to talk to each other so people aren't always involved, or how we enrich communications between people. If you shift the focus there, there is so much money, so many people, so much time and so much intelligence. You have to apply it somewhere. You can apply it to completely closed proprietary systems, and the value is less. Or you can apply it to open systems, base those innovations on top of open standards, and innovate in a way that makes use of them instead of coming up with competing ways of doing the same thing. If our resources are spent on a different way to do the same thing, we don't get to the end destination as quickly.

Prof. Eberspächer: I think the question is: are those decentralized systems not less vulnerable than the classical ones?

Gary Cohen: This is a statement about value versus risk. But it is also a statement about maturity and about establishing foundations to build on. I believe that standards are about establishing foundations for further innovation. You could say they are foundations on which to build new proprietary capabilities. I don't say that. I see the opportunity to establish foundational capabilities that will allow more people to participate in innovation: innovation around technology, innovation around business models, innovation around communications, innovation around societal issues. Participation in innovation comes from being able to connect and being able to share foundational practices, even if they are not always best, and then moving on from there. Intelligence is everywhere, and there are ways of configuring intelligence to accomplish goals and roles. There are times when you want absolute security and you want to have no communication. Depending on what you are trying to accomplish, that may be the best environment. But I think that what we will see, and what we are seeing from the IT world, is that the issue about intelligence is not where


the intelligence is, but how it is managed and how it is facilitated for the user. Therefore you can link it back to standards. Standards are not really about formalization; they are about enablement. They are about enabling an environment to operate in the way that is useful for users. At one time, people used to think that communications was going to be either Peer-to-Peer or Server-to-Server or Server-to-Client. The answer is: it is going to be every which way. The question in my mind is how it can be managed effectively. How can it be provided? And how can the experience of the user be focused on accomplishing the task, and not on spending time trying to figure out how to make the technology work?

Prof. Eberspächer: I think we have now come to an end. We will have discussions all day, and I am quite sure some will also cover the topics of this morning. I thank all the presenters, and thank you for the discussions.

4 Technology as Driver of Change in Telecommunications

Prof. Robert Calderbank
Princeton University

I was invited to speak about technology as the driver of change in telecommunications; how technology shapes the world of industry. As I look over my talk, I now see a second theme emerging: how the transformation of the telecommunications industry is changing our perception of what the important research issues in science and engineering are.

Figure 1: One Bell System – It Worked. 1876: Alexander Graham Bell invents the telephone. 1885: AT&T created to build and operate the long distance network; the first line from New York to Philadelphia has a capacity of one call. 1913: AT&T becomes a government approved monopoly. 1919: Rotary dial. 1925: Bell Laboratories founded; becomes THE focus of research in telecommunications. 1927: Transatlantic service via radio. 1951: Direct dial. 1984: AT&T breakup.

Let me start with a slide about United States history that follows the AT&T Corporation from the invention of the telephone to the breakup in 1984 (Fig. 1). This slide includes both regulatory events and innovations in service. It was a different world; thirty-two years between rotary and direct dial. The pace of service innovation is very different today, and one of the reasons is competition.


Throughout this timeline AT&T was so dominant that if you were a telecommunications engineer and you wanted to change this industry, then you would typically choose to work for Bell Labs, because AT&T provided a unique path to influence.

Figure 2 (The Changing Face of Telecommunications Research): vertical integration of service infrastructure, network/service/customer management and network infrastructure within a monopoly business model of guaranteed return promotes unfettered research; software research in part to improve efficiency of network/service/customer management; data centric research in part to improve efficiency of network/service/customer management; traditional communications research but with new emphasis on service value and cross layer design.

This next slide speaks to how telecommunications research has changed (Fig. 2). The Bell System is characterized by vertical integration of service infrastructure, network/service/customer management and network infrastructure. The business model is guaranteed return on investment, so if you spend $10M hunting a Nobel Prize in superconductivity then the return to the company is $11M. This really was a golden age for physics, and certainly a business model that promoted curiosity, because the more stones you turned over the greater the return. How does an inverted triangle represent telecommunications today? There is a top layer representing business solutions, where the main research component is software. There is a middle transport layer, where the research component is operations at massive scale; I will talk some more about that in due course. At the tip there is infrastructure and what we might think of as traditional communications research. Twenty-five years ago I would still have used a triangle to represent telecommunications, but it would not be inverted. The relative importance of


traditional communications research versus information and software has changed dramatically. My inverted triangle is shaded green – the color of money – with light shading representing commodity businesses and darker shading representing businesses that are able to support speculative research. The money is in networked applications. Notice the lack of hard boundaries between layers. Infrastructure players are pushing up into network management as they chase revenue for operational support and Qualcomm has even bought up 700 MHz spectrum to enable entry into streaming media services. Carriers are pushing up into enterprise applications because they are not making enough money on basic transport.

Figure 3: There is a tide of innovation in products and services; companies can influence where that tide reaches but cannot control the waves. In IP networks: from system based architecture to data base architecture; from service management in the network to service management out of the network; from billing cycle response time to real time.

We have heard a lot about innovation in products and services. When there was one Bell System, there was really one source of innovation. Today there is a tide of innovation, and I'll talk about where that tide comes from (Fig. 3). The figure on the left is Canute, a famous Viking king of England who was persuaded by his court that he was so powerful he could turn back the waves. He did the experiment and it didn't work too well. There is a tide of innovation in IP networking, and for now I will just focus on service management. It used to be that service logic resided in the network, at network control points or adjuncts. If you wanted new features, or if you just wanted to fix a bug, then


you approached the vendor and the solution always seemed to be $30M to be delivered in 18 months. This was not a model that encouraged service innovation. Today service logic is migrating out of network to take advantage of advances in computing.

Figure 4: Historic role of federally supported fundamental R&D in creating billion dollar segments of the IT industry. A key idea was to drop fine detail in QoS. Swamped: ISDN, and ATM to the desktop.

Where does this tide of innovation come from? One of the places is the federal government. Ed Lazowska, a professor of computer science at the University of Washington, has created a powerful series of train track pictures that trace how innovation flows from federally financed research to industry research, with the end result being the creation of new billion dollar industries (Fig. 4). At the bottom of the slide are two ideas that got swamped. I have picked on Bell System ideas like ISDN and ATM to the desktop, where the focus was central control, because the thinking was that very fine QoS was necessary. A key factor in the growth of the internet was to drop the fine detail in QoS to create a technology that could spread organically.


Figure 5: Redirection of research dollars at DARPA. John McCarthy: "American leadership in computer science and in applications has benefited more from the longer term work than from the deliverables." Ed Lazowska: "Virtually every aspect of information technology upon which we rely today bears the stamp of federally sponsored university research. The federal government is walking away from this role, killing the goose that laid the golden egg."

However, there are many demands on the federal budget, and we are seeing institutions like DARPA retreat from funding fundamental innovation (Fig. 5). This kind of short term thinking will weaken the US economy and diminish our place in the world. It is also bad business; I agree with John McCarthy that the economic value created by investment in long term research far outweighs that created by deliverables.


Figure 6: Does technology matter? John Sidgmore, former CEO, WorldCom/MCI: "Technology is not the carriers' battlefield. The real issue is the back office. This is where the battle will be won or lost over the next several years."

I was worried whether there would be enough questions, so I tried to be provocative by including a quote from John Sidgmore when he was Chief Operating Officer at WorldCom (Fig. 6): "Technology is not the carrier's battlefield. The real issue is the back office. This is where the battle will be won or lost over the next several years." And of course when he made this comment he was probably underestimating the importance to WorldCom of the courtroom.


Figure 7 (A Carrier Perspective on Network Services): all modes of access; hosting; content distribution; VoIP; IP VPN; IP switching and routing over an intelligent optical core; data repositories; communication services; and management of customers, services and networks, including security.

Let me rephrase the comment by John Sidgmore. What is of primary importance in the carrier perspective on networking is the management of customers, services and networks, and this includes security (Fig. 7). We need to remember that this involves technology, some of it at the research frontier in computer science.


Figure 8 (Data Centric Systems Architecture): data collectors and active probes publish real time and historical performance data onto a data distribution bus (including a lightweight publish/subscribe capability). On top of the bus sit the Network Operations Center (NOC), product/sales/Tier III support, network care reporting, anomaly detection, network management, capacity management with capacity planning reports, and network/customer traffic studies.

If we were to think of network operations as an iceberg, then what we see are Network Operations Centers or NOCs, product and sales support, and tools for network assessment and planning (Fig. 8). This is the tip of a data centric network architecture, where data is collected by active and passive probes, and published as real time or historical performance data. Lightweight diagnostic tools can then subscribe to this data and enable operations support functions by correlating across different data sources. The ice in this iceberg is a rich set of networking and software technologies operating at massive scale.
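To make the publish/subscribe pattern of Figure 8 concrete, here is a minimal sketch in Python. It is illustrative only, not the architecture of any particular carrier; the topic name, record fields and the threshold rule in the anomaly detector are all invented for the example.

    from collections import defaultdict

    class DataBus:
        """A lightweight in-memory publish/subscribe bus (illustrative only)."""
        def __init__(self):
            self.subscribers = defaultdict(list)  # topic -> list of callbacks

        def subscribe(self, topic, callback):
            self.subscribers[topic].append(callback)

        def publish(self, topic, record):
            for callback in self.subscribers[topic]:
                callback(record)

    bus = DataBus()

    # A lightweight diagnostic tool subscribes to real-time performance data
    # and flags links whose utilization exceeds a fixed threshold.
    def anomaly_detector(record):
        if record["utilization"] > 0.9:
            print("ALERT: link %s at %.0f%% utilization"
                  % (record["link"], 100 * record["utilization"]))

    bus.subscribe("realtime.performance", anomaly_detector)

    # A data collector or active probe publishes measurements onto the bus.
    bus.publish("realtime.performance", {"link": "nyc-phl-01", "utilization": 0.97})

The design point is the loose coupling: collectors need not know which diagnostic tools exist, and new tools can subscribe to the same data without touching the collection layer.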


Figure 9: What information is necessary to operate and compete, and how is this information obtained and applied? Network: how can data be efficiently transported across wide area networks? Systems and software: what types of storage and processing architectures will satisfy user needs? Data analysis: what information can be mined from the data, and what types of decisions can be supported? Visualization: what are the most effective ways to deliver information to decision makers? All of this at scale, over single and multiple databases and multiple modes, combined with application specific knowledge (fraud, customer focused operations, IP, ATM, Frame Relay, ...).

There is in fact an entire research agenda organized around what information is needed to operate and compete, and how this information is obtained and applied (Fig. 9). It requires excellence in marshaling data, in systems and software for storage and processing, in data analysis, and in data visualization. We need to find the right visual metaphors so that we can take advantage of the way our eyes identify anomalies. These functions form a chain, and excellence is required of every link. Once the chain is in place, it is possible to start from a data source and some application specific knowledge and to provide insight in days or weeks. Sometimes it is possible to build an initial prototype and to spiral in to a minimal set of systems requirements through a test and learn process that connects the programming team and the client. I want to be clear that I am describing a new research frontier; for example, data analysis at massive scale is really the future of statistics. This is a subject that began with sorting outliers in small data sets, and at that time it aspired to be a member in good standing of the mathematics community. That has completely changed today. The future revolves around understanding large dynamic data streams, and in telecommunications that includes building signatures on network components and customer traffic at massive scale. What is massive scale? There are about 350 million telephone numbers in North America, and so telephone conversation


defines a graph of order a billion nodes. Tracking that graph and predicting its evolution is one of the challenges of modern statistics. At JP Morgan there is a different graph, defined by financial transactions rather than phone conversations, but the computational challenge is the same.
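To give a flavor of what building signatures at massive scale involves, here is a small sketch of a count-min sketch, a standard streaming data structure that approximates per-edge counts for a graph far too large to store exactly. It is a generic textbook technique, offered as an illustration rather than the specific method used on the call graphs described above; the width, depth and hashing scheme are arbitrary choices.

    import hashlib

    class CountMinSketch:
        """Approximate counts for a massive stream using fixed memory."""
        def __init__(self, width=2048, depth=4):
            self.width, self.depth = width, depth
            self.table = [[0] * width for _ in range(depth)]

        def _buckets(self, key):
            for row in range(self.depth):
                digest = hashlib.md5(("%d:%s" % (row, key)).encode()).hexdigest()
                yield row, int(digest, 16) % self.width

        def add(self, key, count=1):
            for row, col in self._buckets(key):
                self.table[row][col] += count

        def estimate(self, key):
            # The estimate never undercounts; the error shrinks as width grows.
            return min(self.table[row][col] for row, col in self._buckets(key))

    sketch = CountMinSketch()
    for call in [("212-555-0100", "215-555-0199")] * 42:
        sketch.add("%s->%s" % call)              # one edge of the call graph
    print(sketch.estimate("212-555-0100->215-555-0199"))  # approximately 42

The memory footprint is width times depth counters, independent of how many distinct caller pairs stream past, which is what makes per-edge signatures feasible on a billion-node graph.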

Figure 10 (Technology Infusion – Internet Research Model): Research: domain specific languages that facilitate creation of signatures at massive scale. Systems: databases, data compression, and the balance between real time and offline data analysis. Operations: analysis of daily usage and content mix on an overlay network (PlanetLab at Princeton University).

There is a synergy between operations and systems and research (Fig. 10). The value of operations to computer science research is that the challenge of operating at massive scale illuminates the research frontier across the discipline, including databases, data compression, and the balance between analysis that is done in real time and analysis that is done offline. An example of this internet research model within academia is PlanetLab, an overlay networking project directed by Larry Peterson, one of my colleagues at Princeton University. Within this model, the value of computer science research to operations is the rapid infusion of innovation. It is customary to think about telecommunications research in terms of inventing the transistor and the long march through development, product, and finally services. My experience is that very few ideas make it through this long march. I think that there is a different internet model, and that this model is up and running at companies like Akamai and Google. You might question whether this is telecommunications research, but I would submit that both these companies are in the telecommunications business, and that the market valuations of Google and MCI show that information about communication patterns has more value than transport. I would argue that a new experimental network is needed to enable this research model within academia.

Figure 11 (Test and Learn – An Emerging Research Paradigm): Illuminating problems: the capability to monitor current events, leading to dialog that anticipates and frames the right questions and collaboratively provides insight. Data integration: the capability to capture, integrate and use diverse information across silos, processes and organizations at full scale. Prototyping solutions: the capability to build scalable, flexible prototypes that can be used immediately and then improved based on experience and evolving needs. The cycle runs from understanding and (re)defining the problem(s), to creating a solution and iterating ("test and learn"), to enhancing the infrastructure, to monitoring, controlling and anticipating the user's needs through data publishing, yielding better and quicker solutions each time.

There is also a new interactive research process which I have called test and learn (Fig. 11). Operations at massive scale illuminate an abstract research challenge, an initial prototype then attacks the general challenge within a specific operational context, and there is then a dialogue between research and operations in which the prototype is refined. The process spirals in to what in software development is called a minimal set of requirements. The value of minimality to software development is evolvability. When all the people who believe they need to sign off on a new system are involved in writing requirements, the result is too many requirements. Sometimes it is possible to build a first system, but a second system will be too complex and expensive if there are too many requirements.


Figure 12 (Mobile Broadband Access): the carrier picture of Figure 7 with wireless access added: hosting; content distribution; VoIP; wireless access; IP VPN; IP switching and routing over an intelligent optical core; data repositories; communication services; and management of customers, services and networks including security.

Now I would like to turn to mobile broadband access, in part because it is a personal research interest, and in part because several speakers have already talked about convergence of wireline and wireless communications (Fig. 12). I am one of the inventors of space-time codes for wireless systems that employ multiple antennas at the base station or mobile terminal. This is technology developed over the past five to ten years that enables high rates and reliability.


Figure 13 (Universal Broadband Access): Facilities based: primary line (fixed wireless, DSL, cable) and secondary line (Wi-Fi). Non-facilities based: primary line (cable lease, FTTH) and secondary line (BYOA). All represent an opportunity for Wi-Fi overlay networks. RIP HomeRF.

There are many different ways to provide broadband services, some facilities based, like DSL and cable, others, like Vonage, based on overlay networks (Fig. 13). All represent an opportunity for a Wi-Fi overlay network. There is a tide of IP innovation in wireless, and this tide has swamped alternative technologies such as HomeRF, which segregated voice and data in order to provide better quality voice. It might have been true that voice sounded incrementally better with HomeRF, but that was not enough to swim against the tide.


Figure 14: RF propagation, a.k.a. "Why Broadband Wireless is Challenging". Path loss limits the reach and quality of wireless links; reflections cause multipath fading, "ghosting" and noise; delay spread produces intersymbol interference errors; motion adds Doppler noise. The remedy for fading and path loss errors is redundancy and reordering: repeats, forward error correction codes, interleaving, and acknowledge/negative acknowledge. The remedy for intersymbol interference is adaptive equalization: determine what distortions are occurring by training on a known sequence, synthesize a filter using a tapped delay line with multiply-and-add topology, adjust the filter tap weights until the training sequence is received optimally, and adapt on each time slot to fine-tune the equalization.

The physics of radio propagation makes wireless communication much more of a challenge than wireline communication (Fig. 14). Today there are on the order of 30M devices enabled for Wi-Fi service. As individuals and enterprises deploy access points in offices and homes, the expectation is that by 2007 there will be on the order of 85M devices in the hands of 20M mobile workers. The challenge for Wi-Fi is that of migrating from an Ethernet cord substitute that supports email, IM and best effort data to a common air interface with the Quality of Service necessary to support voice, video conferencing, video and audio streaming and gaming.


Figure 15: Wireless LANs – where scheduling paradigms collide. A set of users (transmitters) wish to communicate with a radio node (receiver) using multiple access, so the users must share the communication channel. Problem: how to coordinate channel usage by the users so that the channel is used efficiently? The switched-circuit paradigm (cellular) uses reservation techniques with control at the base: FDMA, TDMA, CDMA. The packet data paradigm (WLAN-derived) uses contention techniques with control at user and base: packet protocols such as ALOHA and PRMA. The multimedia paradigm is a unified protocol/MAC with both contention and reservation features (e.g. 802.11e, 802.16).

Ethernet was conceived as a system that enabled multiple users to share a common wired broadband network by communicating bursts of time-insensitive packets (Fig. 15). A user is only allowed to transmit a packet after the channel is sensed not in use, and if two users should sense the channel clear and transmit simultaneously, the packets will collide destructively, and the two users must contend again for access. The strength of the Carrier-Sense Multiple Access – Collision Detection (CSMA-CD) algorithm is distribution of control to the individual users. However as loading and demand for real-time services increase, the efficiency of Ethernet systems drops off significantly and the standard solution is to migrate to higher speeds such as 100BaseT and Gigabit Ethernet. Current 802.11 systems were standardized as wireless extensions to Ethernet and employ a variant of the CSMA access protocol. The transmitter sends a Request to Send (RTS) to the receiver, thereby silencing its neighbors for a period of time. If the receiver is able to accept the transmission, it sends a Clear to Send (CTS) thereby silencing its neighbors and allowing communication with minimal interference. The challenge presented by radio communication is that the loading point where 802.11 systems become contention limited is lower than for wired Ethernet systems. This challenge cannot easily be finessed by migrating to higher speeds. Operation of 802.11 within the enterprise or within public spaces employs a base station, usually


called an access point (AP) that connects a community of subscriber modems to a wired backhaul network (usually Ethernet). A system management option called the Point Coordination Function (PCF) has the ability to suppress CSMA operation by instituting a polling process. The objective is to allow rapid access to the system, even under heavy loads. This means a wireless LAN is a place where two scheduling paradigms collide.
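The contention behavior described above is easy to caricature in a few lines of code. The following toy slotted simulation is in the spirit of ALOHA-style analysis, not a faithful model of the 802.11 MAC (no carrier sensing, no RTS/CTS, no backoff schedule); the station counts and transmit probability are arbitrary. It shows why efficiency rises and then collapses as more stations contend.

    import random

    def contention_efficiency(n_stations, p_transmit, n_slots=100000, seed=1):
        """Fraction of slots carrying exactly one transmission (a success).

        Slots with zero transmitters are idle; slots with two or more
        transmitters are collisions, and the packets must be retried.
        """
        random.seed(seed)
        successes = 0
        for _ in range(n_slots):
            transmitters = sum(random.random() < p_transmit
                               for _ in range(n_stations))
            if transmitters == 1:
                successes += 1
        return successes / n_slots

    # Efficiency peaks and then drops off as the number of contending
    # stations grows, which is why heavily loaded contention systems
    # become contention limited.
    for n in (2, 5, 10, 20, 50):
        print(n, round(contention_efficiency(n, p_transmit=0.1), 3))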

Figure 16: Successful perspectives on wired infrastructure. What constitutes utility in an infrastructure for delivering IP services, and how to figure out whether an incremental change is positive or negative (Kelly: distributed rate allocation in wired IP networks). Given a protocol, what is the optimization problem it solves (Low: TCP congestion control, leading to FAST TCP). Can this approach provide insight into the wireless MAC?

I would like to pose the question of how we might design distributed protocols so that the mobile terminals end up maximizing some notion of global utility as they pursue their individual self interest (Fig. 16). My point of view is that a protocol implemented in a cellular phone is actually the solution to a distributed optimization problem. And if that protocol is the solution, what was the problem? The protocol is optimizing something. What is it optimizing? We do not really understand how to ask and answer those questions in the wireless world. We have had some success in the wireline world. Frank Kelly and Steven Low started out looking at TCP in this way and ended up with an improvement to TCP which is being used to distribute data from CERN to the US physics community.
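For readers who want the flavor of this formulation, here is a minimal sketch of network utility maximization in the Kelly style, solved by a primal-dual iteration: each source adjusts its rate using only the total congestion price of the links on its path, yet the rates converge to the allocation that maximizes the sum of log utilities (proportional fairness). The two-link topology, capacities and step size are invented for illustration, and this is a simplified caricature of the Kelly/Low analyses rather than their actual algorithms.

    # Maximize sum(log(x_s)) subject to link capacity constraints.
    # Sources: A uses links 0 and 1; B uses link 0; C uses link 1.
    routes = {"A": [0, 1], "B": [0], "C": [1]}
    capacity = [1.0, 2.0]

    rates = {s: 0.1 for s in routes}
    prices = [1.0, 1.0]          # one congestion price per link
    step = 0.005

    for _ in range(50000):
        # Source update: with utility log(x), the best response to a
        # total path price q is x = 1/q (from setting d/dx log(x) = q).
        for s, path in routes.items():
            q = max(sum(prices[l] for l in path), 1e-3)
            rates[s] = min(1.0 / q, 10.0)        # best response, kept bounded
        # Link update: raise the price when demand exceeds capacity,
        # lower it (never below zero) when capacity is spare.
        for l in range(len(capacity)):
            load = sum(r for s, r in rates.items() if l in routes[s])
            prices[l] = max(0.0, prices[l] + step * (load - capacity[l]))

    # Converges to roughly A: 0.42, B: 0.58, C: 1.58, the proportionally
    # fair rates for this topology.
    print({s: round(r, 2) for s, r in rates.items()})

The striking feature is the decentralization: no source knows the topology or the other sources' demands, only the price of its own path, which is exactly the structure one would like a wireless MAC to inherit.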


Figure 17: What is possible with spatial diversity? Choosing between high rate and reliability: spatial multiplexing to maximize system throughput, space-time codes to guarantee worst case data rate. Choosing between high rate and latency: smart scheduling versus smart antennas. Having it all, with multiple rate/reliability points: embedded diversity to enable opportunistic communication when the channel is good and reliable communication with latency guarantees when it is less benign.

Finally, I wanted to touch on multiple antennas in wireless communication (Fig. 17). If there is more than one antenna at the base station, you can correlate what is sent over the different antennas and improve the reliability of transmission, or you can just blast out data streams from the different antennas and get more rate. This way of increasing rate is called spatial multiplexing, and the cost is decreased reliability and more latency. Correlation across antennas is called space-time coding. Improvements in reliability come at the cost of smaller rate, but the benefit is larger cells, or, to put it another way, a smaller cost of capital deployment. We know that space-time codes improve TCP throughput by creating a smoother channel. We are starting to be able to create parallel data streams with different levels of diversity, transforming diversity into a fine grained resource which can be allocated judiciously to provide opportunistic communication when the channel realization is good and reliable communication with latency guarantees when it is less benign.
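The simplest example of correlating what is sent over two antennas is the Alamouti scheme, the classic two-antenna space-time block code. The sketch below shows its encoding and linear decoding over a flat-fading channel with one receive antenna; it is a textbook illustration with randomly drawn channel gains and an arbitrary noise level, not a model of any deployed system.

    import numpy as np

    rng = np.random.default_rng(0)

    # Two QPSK symbols to send.
    s1, s2 = (1 + 1j) / np.sqrt(2), (1 - 1j) / np.sqrt(2)

    # Flat-fading gains from the two transmit antennas to the receiver.
    h1, h2 = (rng.normal(size=2) + 1j * rng.normal(size=2)) / np.sqrt(2)

    # Alamouti encoding over two symbol times:
    #   time 1: antenna 1 sends s1,        antenna 2 sends s2
    #   time 2: antenna 1 sends -conj(s2), antenna 2 sends conj(s1)
    noise = 0.01 * (rng.normal(size=2) + 1j * rng.normal(size=2))
    r1 = h1 * s1 + h2 * s2 + noise[0]
    r2 = -h1 * np.conj(s2) + h2 * np.conj(s1) + noise[1]

    # Linear combining recovers both symbols with diversity order two:
    # each symbol rides on |h1|^2 + |h2|^2, so both paths must fade
    # simultaneously for the link to fail.
    gain = abs(h1) ** 2 + abs(h2) ** 2
    s1_hat = (np.conj(h1) * r1 + h2 * np.conj(r2)) / gain
    s2_hat = (np.conj(h2) * r1 - h1 * np.conj(r2)) / gain
    print(np.round([s1_hat, s2_hat], 3))  # close to s1, s2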


Figure 18 (Discussion Points): More industry research in telecommunications than ever before, but the focus at IBM, MSFT and Google is on the higher layers. Is it possible to sustain excellence across all layers within a single research organization? Network/customer/service management is illuminating a new research frontier defined by massive scale: if you think you are at the research frontier and you are not operating something at scale, then you are not at the frontier. No shortage of ideas about how to improve the wireless physical layer; what is missing is a framework that captures system utility.

I thought I would end with some discussion points (Fig. 18). I would contend that, if you count properly, there is more industry research in telecommunications than ever before. I would say that IBM, Microsoft, Accenture and Google are engaged in telecommunications research. Their focus is on the higher layers, on the combination of software and information technology. I do wonder whether it is possible to sustain excellence across all layers within a single research organization. Does it even make sense to have a single research organization that is pushing the research frontier at the physical layer and pushing the frontier in software? I have talked about the challenge of operations and how it is illuminating a new research frontier. The point that I want to make is that it is necessary to operate at scale in order to be at the research frontier. That means that the research frontier is at Google rather than at the universities. I think this is a challenge for NSF, and I would like to see a new experimental research network with a focus on the higher layers and the challenge of operations at scale. Finally, in wireless, there is no shortage of ideas about how to improve the physical layer. We are less good at thinking about how to capture utility, and we need a framework for assessing the impact of changes at one level of the protocol stack on the other layers.

5 Market Structures and Business Models of the Future

Consolidation or Persisting Turbulence – Monopolization or Fragmentation: What will Future Telco-Markets look like?

Chair: Prof. Dennis Lockhart
Georgetown University

5.1 Perspective from USA

Broadband and Wireless: The Next Telecom Crises

Eli Noam, Columbia University, Graduate School of Business, Columbia Institute for Tele-Information (CITI)

Telecommunications debates often have a post-modernist flavor. Facts do not exist. It is all a matter of perspective. And perspective is a matter of economic interest. For telecom debates, the reigning orthodoxy for 25 years now has been competition. This perspective was advocated by new entrants, large corporate users, new technology companies, and free market advocates. For a while, this policy was successful. But more recently the sky seemed to be falling. The telecom sector was over-valued and over-expanded, and it subsequently contracted. According to the Economist magazine, ten times more money has been lost in the telecom crash than in the dotcom bubble. AT&T, the world's foremost telecom firm for a century, is no more, having been acquired by SBC. Why did this happen? And what are the implications for the future? The industry as a whole has been picking itself up again, though not at the pace of past years. But the problem is not immediate recovery; it is long-term health. And here, the perspective of most participants is one of denial. To some, the cause of the telecom crash is flawed government policy. To others, it is accounting malfeasance. Or the raw power of incumbents. Or the excesses of the financial sector. Or managerial failures. But the most popular explanation is that all of these factors, and more, came together in one gigantic "perfect storm" scenario. Since the chances for such a


confluence happening again are small, this scenario is comforting, because it suggests that the crash will not repeat itself soon. Yet the real problem goes much deeper than such a probabilistic scenario. The deeper problem is that the telecom industry as a whole has become unstable. It has moved from utility to volatility. From the model of the water company to the model of the airline. And its underlying dynamics are similar to those of airlines, a famously unstable industry: high fixed costs, low marginal costs. Competitive entry driving down prices. Expansion. Overinvestment. Continued price deflation. Financial losses. Credit spirals. Crash. In 2005, four of the six major airlines in America had filed for bankruptcy protection. Is this a harbinger for the telecom sector? And what should be the strategies for telecom firms to avoid such a fate? The first option is to follow the economics textbooks, i.e., cut prices and raise efficiency. The problem is that competitors do the same, and the resulting price deflation will make firms worse off than before. The second strategy is to escape commodification by differentiating the product. This means, in particular, technological innovation. The problem with this strategy is that consumers and business users cannot keep up with the pace of technological innovation in the electronic hardware sector ("Moore's Law"). Technology supply outpaces the demand to absorb it. Therefore, bursts of change are followed by breathing periods. For firms, then, innovation is a difficult strategy, expensive to create and difficult to sustain. The third strategy is much easier. It is to regain control over prices and reduce price competition. This means, in most cases, seeking an oligopolistic market structure. And this is what has happened. Where once competition abounded or was anticipated, consolidation has taken place. Many of the new entrants have failed. Incumbents have merged. And investors, having been burned badly, do not fund new network competitors, as that would challenge the incumbents. With such consolidation, prices rise or at least stop dropping. Already, large business users of telecommunications have lost many choices, and their bargaining power. Today's telecom industry recovery, however, is not the end of the story. We have merely seen the first cycle, and there will be others. One cycle will be in wireless, another in broadband Internet connectivity. Both follow the sequence discussed above: excitement; public policy support; platform competition; investment; multiple entrants; over-investment; saturation; dropping prices; failures; consolidation.


What, then, will the broadband market look like? That is an important question. In the United States, it is likely to consist of two major platforms provided, respectively, by the local cable and telecom companies, plus a few smaller platforms, primarily in wireless. Such a market structure can be described as one of "2.5" firms, a number that seems to prevail in many other telecom submarkets. The two major platforms will face the basic strategic options analyzed above. They can price compete, which would be a blood bath for both, considering the large investments they have undertaken. Or they can differentiate their product. But basically they provide a commodity product, the transmission of bits, on top of which run applications. They can bundle their commodity transmission with specialized applications. But they will not be able to control the entry of other applications providers, unless governments let them do so. The market structures for infrastructure and applications will move in different directions, following their underlying economies of scale. And that means, eventually, unbundling rather than bundling transmission with applications. For voice service, that trend is emerging. And it will eventually also reach cable TV, once the content providers can reach customers through the broadband route. The third and more likely option is that the two major platforms, after the market becomes saturated, will behave oligopolistically and cooperate. And that may mean sharing the fiber infrastructure. This will happen at first in rural low density areas and be blessed by regulators as a way to serve these regions. It is likely to spread to higher density areas for the last mile of infrastructure. On the content side, Internet-TV will lead to a boom-bust cycle for content providers. This follows an earlier cycle that ended on "Black October" of 2001, when three service providers folded. In wireless too, we are in a boom-bust cycle. The excitement over 3G and data services has given way to skepticism around the world. Service providers have consolidated in the US, from six major national footprints to four, with a few small participants. This number is not likely to be the equilibrium either, as growth rates slow or as data services such as WiFi siphon away some of the expected revenues. And in wireless, too, there will be sharing of infrastructure by competitors. Eventually, these consolidations will stabilize the wireless market after it goes through overcapacity and price deflation. But this will not be the last cycle for wireless either. What will come next? It may well be, on the infrastructure side, the next generation of software-defined radio, which will destabilize the wireless sector and enable the growth of sensor networks and low-power applications.


Are these boom-bust cycles a special problem? On one level, they are simply examples of the Schumpeterian process of creative destruction. We may feel sorry for the employees and small investors affected, but that is how high-risk sectors behave. But we must also take a broader perspective. These upheavals affect much more than telecommunications. Similar dynamics are taking place across the entire information sector. Computers. Semiconductors. Music. Advertising. E-commerce sites. Portals. Search engines. Newspapers. Radio. Consumer electronics. Wherever one looks, the information industries are going through price deflation followed by crashes and restructuring. In the good old days, each of these sectors prospered and suffered on its own. But now, with digital convergence and sprawling media, the synergies are both positive and negative. Downturns in one segment percolate and oscillate through the entire information sector. An overinvestment in Internet portals leads to a collapse of banner advertising prices, which affects advertising on cable channels and affects ISPs, which affects backbones, which affects telecom carriers, which affects equipment makers, whose lowered trans-Pacific prices lead to increased outsourcing, which affects consumer electronics makers and labor unions. At the same time, governments do not have many tools to deal with these instabilities. Keynesian demand policy is not the solution, because demand is not the problem. People consume more bits and connectivity than ever before. The problem is prices. Nor is monetary policy a solution, because the instabilities are not caused by credit or interest rates. One could argue that letting firms merge is the government's de facto policy for dealing with these instabilities: a de-emphasis of price competition in favor of investment protection. This dynamic of boom-bust instability happens on both sides of the Atlantic. Major European telecom companies have been mired in even more debt than the American ones. Consolidations have taken place in Europe, too. For example, in Brussels, regulators have let the five major music companies become four, when just a few years ago they did not allow similar mergers. What has changed is that the music industry has become volatile and has declined. Rather than reducing barriers, the EU Commission has permitted consolidation. But it is unlikely that such a policy will protect the music industry as it faces new business and distribution models. And there are international dimensions. Even as some countries may try to stabilize their markets and economies, other nations have no vested interests to protect in such a way, and will follow other paths. This leads to the conclusion that the information economy is an unstable economy, and will therefore probably lead to an unstable society. More volatility creates less cohesion and more disagreements. We can see that already in domestic political


discourse. These disagreements will spill across borders, with different countries taking different measures. Therefore, regrettably for a trans-Atlantic dialogue, the information economy is not likely to bring us together globally in policy, even as it links us more powerfully than ever.

Bibliography

Brock, G.W., 1981, The telecommunications industry: the dynamics of market structure, Harvard University Press.
DeLong, J.B. and L.H. Summers, 2001, The 'New Economy': background, historical perspective, questions, and speculations, Economic Review – Federal Reserve Bank of Kansas City 86(4), 29-60.
Espinosa-Vega, M.A. and J. Guo, 2001, On business cycles and countercyclical policies, Federal Reserve Bank of Atlanta Economic Review, 1-11.
Greenwald, B. and J.E. Stiglitz, 1993, Financial market imperfections and business cycles, Quarterly Journal of Economics 108, 77-114.
Katz, R., Weise, M. and D. Yang, 2002, The US wireless industry: consolidation of scenarios (Booz Allen Hamilton, New York).
Noam, E., May 19, 2004, The looming capacity shortage in international telecommunications and its impact on outsourcing, The Financial Times.
Petersen, B. and S. Strongin, 1996, Why are some industries more cyclical than others?, Journal of Business and Economic Statistics 14(2), 189-198.
Prescott, E.C., 1986, Theory ahead of business cycle measurement, Federal Reserve Bank of Minneapolis, Quarterly Review, 9-22.
Schumpeter, J.A., 1939, Business cycles: a theoretical, historical, and statistical analysis of the capitalist process, McGraw-Hill.


5.2 Perspective from Europe

Dr. Karl-Heinz Neumann
General Manager WIK – Scientific Institute for Communication Services, Bad Honnef

1. Introduction

It seems to be the job of economists to bring visions down to market realities, and as an economist I fear I have to argue in that tradition too, perhaps with a little less pessimism than Eli has demonstrated. On the other hand I must say that, hearing and watching the discussion in the industry panel this morning, I was impressed by the industry representatives' view of where the sector is going. Furthermore, everybody seems to share the same vision. There is, however, a risk involved when too many industry representatives, or all of them, from the banking side, from the vendor side, from the carrier side, share the same vision; we all know what the implications were four years ago during the internet bubble. There is a risk in having only one vision, and for an economist markets move faster when there are different visions of market development. It at least makes competition a bit easier if there is more than one vision. At the same time it is also interesting to see what is happening at the company level. I do not see that the companies in the sector are really moving towards integration. We have observed quite different business models over the last five years. Companies go back to their core competences. They concentrate their activities. I do not see many examples of firms in the sector demonstrating by their own behaviour that they are entering the convergent world as converged entities.

2. Market developments

Let us move to some market realities. A few figures in table 1, which shows the telecommunications service market, are interesting. In the US, you were not so lucky with the growth of the service industry in the last four years. Europe did a little better. But at least according to this source we see some convergence in growth rates, because IDATE forecasts a relevant recovery of the service sector in the US and some slowdown of growth rates in Europe.


Table 1: World Telecommunications Services Market by Region (Billion €)

                          2000            CAGR       2004            CAGR       2007
                          Value   Share   2000-04    Value   Share   2004-07    Value   Share
  Western Europe           202     22%     6.5%       260     22%     3.5%       288     21%
    France                  26             6.1%        33             5.7%        39
    Germany                 41             7.1%        54             3.0%        59
    Italy                   28             5.7%        35             3.7%        39
    United Kingdom          39             5.3%        48             2.7%        52
  North America            324     35%     1.0%       337     29%     2.6%       364     27%
    USA                    291             1.0%       303             2.6%       327
  Asia-Pacific             257     28%     9.6%       371     31%     6.3%       446     33%
    China                   37            16.4%        68             9.4%        89
    Japan                  159             4.8%       192             5.9%       228
  Rest of the World        136     15%    12.0%       214     18%     7.8%       268     20%
    Eastern Europe          43            18.2%        84             6.3%       101
    Latin America           63             7.1%        83             8.2%       105
    Africa-Middle East      30            11.9%        47             9.7%        62
  Total                    918    100%     6.5%      1182    100%     4.9%      1366    100%

Source: IDATE; DigiWorld 2004; WIK
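As a reader's aside (not part of the source tables), the growth rates in tables 1 and 2 are compound annual growth rates, which can be checked directly against the printed values:

    def cagr(start_value, end_value, years):
        """Compound annual growth rate between two values."""
        return (end_value / start_value) ** (1.0 / years) - 1.0

    # Western Europe in Table 1: 202 billion euro in 2000, 260 in 2004.
    print(round(100 * cagr(202, 260, 4), 1))  # 6.5 (percent per year)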

It is interesting to look at the various segments of the service market; I only have the figures for the European market available, in table 2. It is obvious that in revenue terms the mobile telephony market is larger than the wireline market, at least if we concentrate on telephony. If we sum up data and voice, there is still some small dominance of the wireline market.

Table 2: Western Europe Telecommunications Market Value (€ Million)

Market Value (€ Million)                2001      2002      2003      2004      2005    CAGR 2001-2005
  Fixed voice telephone services       88,597    88,720    87,574    86,584    85,587      -0.9%
    Traffic                            65,111    63,591    62,645    61,802    60,930      -1.6%
    Service rental                     23,486    25,130    24,929    24,782    24,657       1.2%
  Fixed data services (incl. internet) 37,921    42,486    46,381    50,341    54,711       9.6%
  Mobile telephone services            96,125   103,363   109,792   117,119   124,611       6.7%
  CaTV services                        11,805    12,726    13,538    14,203    14,733       5.7%
  Total                               234,448   247,295   257,285   268,247   279,642       4.5%

Market Share                            2001      2002      2003      2004      2005
  Fixed voice telephone services        37.8%     35.9%     34.0%     32.3%     30.6%
    Traffic                             27.8%     25.7%     24.3%     23.0%     21.8%
    Service rental                      10.0%     10.2%      9.7%      9.2%      8.8%
  Fixed data services                   16.2%     17.2%     18.0%     18.8%     19.6%
  Mobile telephone services             41.0%     41.8%     42.7%     43.7%     44.6%
  CaTV services                          5.0%      5.1%      5.3%      5.3%      5.3%
  Total                                100.0%    100.0%    100.0%    100.0%    100.0%

Source: EITO 2004


It is relevant to note that the market share of the mobile industry is growing steadily. At the same time it is remarkable that in Europe, and in some other parts of the world (quite different from the US), we still have significant transfer streams running from the wireline sector to the mobile sector due to overpriced mobile termination rates. Given the relative development of the two sectors, one should have strong doubts about whether that is really a proper approach to regulation. All European regulators are working on this problem in these months. Mobile termination generates transfers of billions of Euros or Dollars a year, by which the fixed line industry is still subsidising the mobile industry.

3. The fixed-line market

A few more words on the fixed line market and the implications of IP technology. We should not forget the dimension of costs. All-IP networks, at least according to the vendors, will have significantly lower costs than networks based on current technologies. The major problem is not the availability of the technology but the migration path of the incumbents and their still undepreciated asset base. Otherwise we would see quite a tremendous drop in retail prices again. Besides the level of costs, the cost structure is also changing. IP technology as applied to voice and other services will make usage dependent costs even more irrelevant than they already are today. Access becomes even more important than it is today. Cost structures will also change pricing structures. We will see much more capacity based charging, and flatrates as the dominant pricing schemes. We also heard this morning that wireline and wireless networks increasingly share, or have the ability to share, the same network elements and the same technology. I am, however, not quite sure whether in the end that will lead to one single network platform, or whether we would be better off if these network platforms competed against each other much more than they do today. We should not forget that today the fixed line network has a competitive disadvantage, and that the strongest feature of the mobile network is mobility. IP technology will bring much more mobility to the fixed line network; I only have to mention WiFi, Voice over IP and WiMAX. If the fixed network operators invest, and have the chance to invest, in these technologies, they will become much more competitive against their mobile competitors than they are today. To me it is not obvious that we will see a merger of fixed and mobile networks. It could also be a relevant development that convergent technologies put these networks in a position to compete more effectively against each other.


4. The mobile communications market

I agree with Eli Noam that we do not yet see a stable or viable market structure in the mobile market. In some European countries we have five operators in the market. In small national markets we do not have much cross border activity by mobile operators. As prices in mobile move closer towards relevant costs, the relevant economies of scale in that business become more obvious and become a relevant factor in shaping the market structure. The current market structure is also, in a way, the result of a little luxury: the ability of the mobile operators to price and position their service as a premium service. Because we are going to see market saturation in terms of subscriber growth, and also because competition is becoming stronger, the mobile industry is losing this ability. Therefore the genuine economic factors will have a stronger influence on the market structure in that industry. The mobile operators can slow down the consolidation process by succeeding in high margin data, messaging and content related services. At the moment the speed at which they are developing those services does not give reason for too much optimism in that direction. In Europe we will see consolidation in the mobile industry. We already had quite intensive models on the table two to three years ago, which were not realized at the time. I would expect a second wave of consolidation to start soon.

5. Triple play

A few words on triple play. Will households get their internet access, their cable TV and their voice service from the same provider, and are those services produced over the same network infrastructure? In the US there seems to me a clear trend in favor of triple play, which seems to be related to the fact that cable companies are strong players in the market and telcos have to react to the competitive activities of the cable companies. The picture in Europe is a little more mixed and dispersed. First of all, on average we have a lower cable penetration rate, although it is still relatively high. As already pointed out, it is very obvious that DSL dominates broadband markets in most member states in Europe (see figure 1). We do not have much penetration of digital TV yet, and cable telephony has gained some market relevance in only two countries (UK and Spain) so far (see figure 2).


Figure 1: EU15 – broadband lines by member state (July 2004), in millions, split into cable modem lines and other broadband access (DSL, satellite, WLL, FTTH, PLC). Source: 10th Implementation Report, EU; WIK

Figure 2: Subscribers (in thousands) of digital cable TV, interactive digital cable modem and cable telephony, by European country. Source: European Cable Communications Association (ECCA); Broadband Cable: Connecting Online


As I am most familiar with the German market let me give you this as an example because we have a unique development in Europe. We have a relatively high cable penetration of more than 60 % of all households. But we have less than 1 % penetration of internet access over cable, not less than 10 % but less than 1 %. That is by the way one of the reasons for our relative low penetration rate of broadband access in general. We don’t have relevant intermodal competition in the sector. I don’t want to be too skeptical here, but given the growth developments of broadband it may even be the case that the window of opportunity for cable is already closed. But it could also be turning around. One of the reasons for the limited relevance of cable results from the fact that the cable networks were separated from the fixed line incumbent quite lately and we have a fragmented cable industry structure. Furthermore, we have no real strategic investor in that business.

6. Capital market Given the experience the sector has faced with the capital market a few years ago, there are quite different views on whether or not capital market valuations can generate relevant and meaningful signals on business strategies. But companies have to take care of it. Capital markets restrict the abilities of firms to execute certain strategies or enable them to go for. In table 3 various types of carriers are ranked according to their market capitalization and market capitalization is calculated

Table 3: Industry Valuation by the Capital Market – Carrier Ranking and Valuation (2004)

Rank  Company                        Country     Market capitalization  Turnover        Turnover
                                                 (billion US-$)         (billion US-$)  Multiple
  1   Vodafone Group                 UK               159.2                 55.0          2.9
  2   Verizon Communications         USA              101.9                 67.8          1.5
  3   Nippon Telegraph & Telephone   Japan             89.4                103.0          0.9
  4   Deutsche Telekom               Germany           79.8                 68.0          1.2
  5   SBC Communications             USA               79.6                 40.8          1.9
  6   France Telecom                 France            60.5                 56.1          1.1
  7   Telecom Italia Mobile          Italy             46.9                 14.3          3.3
  8   Bellsouth                      USA               48.9                 22.6          2.2
  9   Telecom Italia                 Italy             47.0                 39.4          1.2
 10   BT Group                       UK                28.2                 33.9          0.8
 11   Singapore Telecom              Singapore         24.8                  6.1          4.1
 12   AT&T                           USA               15.7                 34.5          0.5
 13   China Unicom                   Hong Kong         13.1                  4.9          2.7
 14   Mobile Telesystems             Russia            11.2                  1.4          8.2

Source: Financial Times Top 500 (2004); WIK


This multiple, market capitalization divided by turnover, is used as an indicator of the capital market's assessment of business models and carrier strategies. Does the capital market appreciate convergent business models? Pure play mobile operators and operators in emerging markets are still the operators that get the best valuations in the capital markets. Pure play fixed line operators get the lowest ones. Why, then, should any mobile operator invest in wireline? And do pure play fixed line operators have a chance to buy any mobile operator? You can see from the turnover multiples reported here what the answer to these questions must be.

7. Long-term developments

Let me start with two quotes from Technology Futures (The Local Exchange Network in 2015), one on network development: “By the end of 2015, we will have transformed the local exchange from a narrowband network of circuit switches and copper cable to a broadband network of packet switches and fiber optics”, and one on demand for bandwidth: “The typical household of 2015 subscribes to broadband service at 24 Mb/s to 100 Mb/s, using it for traditional Internet activities, such as web surfing and downloading files, and new uses such as voice communications, device monitoring, and video streaming. … Medium and large businesses access the network directly with fiber at data rates from 2.4 Gb/s to 40 Gb/s. By 2015, most customers obtain voice and narrowband data service via wireless or VoIP on broadband channels”. I believe it is the general expectation in the industry that by the end of 2015 we will have transformed the local exchange from a narrowband network of circuit switches to a broadband network based on IP technology. In the longer term, fiber to the home is a key and very relevant network technology. The penetration of fiber even in the US and Asia is relatively low, and in Europe we still have only regionally focused activities in some countries; it is not yet a common phenomenon in Europe to invest in fiber to the home. For an economist, and for those who have an interest in how competition might develop, the fiber to the home business is also an open and perhaps critical development. There is the argument that fiber in the last mile will strengthen the existing market power of operators in the local loop. I am not quite sure whether that is a necessary consequence. Innovative models may help to solve the competitive problems. I even see a potential for competition which might be stronger than what we have today, at least in Europe. The model I am talking about is one where the basic, you could also say passive, infrastructure is financed and operated by a network-operator-independent utility provider, or an operator-independent basic infrastructure provider. The network operators then lease unbundled fiber pairs from the infrastructure operator and provide their services on that infrastructure. You can have quite a number of different access networks based on such a model. On that basis, the relevant and strong economies of scale at the basic passive infrastructure level can be exhausted, and at the same time that approach can be combined with competition not only at the service level but also at the network level. In my mind this could be an effective model to mobilize investments. It could also be relevant and attractive for the capital market. The model could be an alternative to incentivizing investment in fiber to the home by excluding unbundling and therefore competition.

8. Conclusion

For my concluding remarks I want to come back to Eli Noam's fundamental question: has telecom developed into a cyclical industry? I have framed that point as a question because I am not convinced that this hypothesis is valid. We have one observation point which is consistent with that approach, or supports it; we need at least a few more observation points to have more confidence in the hypothesis that telecom is a cyclical industry. What we can be sure about at this point in time is that the telecom industry is in search of a new industry structure equilibrium. Eli Noam would doubt whether the industry can ever find it. If the industry is, however, able to find a new structural equilibrium, we would hardly have a cyclical industry. Before we conclude on the cyclical structure of the industry, we should give the sector the chance to find a new and stable industry structure. Whether in the end the outcome of that market structure development process will lead us to a more competitive or a less competitive environment compared to today is another question. We have to consider the various segments of the market separately. I see, in particular in Europe, the mobile industry still on a consolidation and concentration path. The NGN development has the potential to decrease infrastructure competition. Competition in fiber network access needs innovative models for competition. If those models are viable, there is the possibility that even though the passive basic infrastructure may show stronger economies of scale than today, we might see scenarios with more competition in the relevant area than we observe today. In any case the NGN development will strengthen the role of service competition: we will no longer talk about resale competition but about real service competition. In Europe 25 incumbents and hundreds of mostly nationally operating entrants compete in the market. We do not observe much Europe-wide operation of carriers. That model was originally anticipated in Europe, but it has actually materialized only to a low degree; mainly in Eastern Europe can some integration at the incumbents' level be observed. The European Commission still tries to motivate cross-border carrier activities. At the operator level, however, we are still far away from a more integrated market structure such as in the U.S. Many observers are convinced that our market structures are not yet cost efficient and therefore not viable in the long term. There is, however, a long way to go in Europe before we have real European operators. In the mobile industry there are two dimensions of consolidation or concentration which will still become relevant: consolidation at the national level on the one hand, and consolidation at a European-wide level on the other. We only have a few cross-border operations in mobile. I expect that process to evolve in the next two years.

5.3 Discussion

Chair: Prof. Dennis Lockhart
Georgetown University

Prof. Picot: While Dennis Lockhart is taking the chair, I would like to introduce him briefly. Dennis Lockhart is a professor at the School of Foreign Service at Georgetown University, where he chairs the program on international business-government relations and global commerce and finance, part of the education at Georgetown's School of Foreign Service and the graduate schools. Prior to this he taught at Johns Hopkins University, and before that he held various senior management positions in the financial industry, including many years with Citibank/Citicorp (now Citigroup), which he represented around the globe, especially on export and import questions. He received his degrees from Stanford University, from Johns Hopkins and also from the Sloan School at MIT. Dennis, thank you very much for serving today; the floor is yours.

Prof. Lockhart: Arnold, thank you. I am probably the lowest value-added person in this conference, so I am going to add value by speaking as little as possible, by introducing our two distinguished panelists, and by making sure that those of you who have questions get a chance to express them. Our subject today is industry structure and business models. That was addressed to some extent earlier in the morning, so I am hoping our panelists will take those questions further. We may even get into the more arcane question of how exactly to define the industry today. We have two panelists whose bios you already have; I would just highlight a couple of things. Eli Noam is a professor of economics and finance at Columbia Business School and director of the Columbia Institute for Tele-Information (CITI), a research center that focuses on strategy, management and policy issues in telecommunications, computing and the electronic mass media. He has also been at Columbia Law School, at Princeton, particularly Princeton's Woodrow Wilson School, and at the University of St. Gallen in Switzerland. He has authored 20 books and 300 articles and was recently appointed to President Bush's Information Technology Advisory Committee. I will ask both panelists to elaborate a little further on their bios, particularly as regards their current interests and activities. Our other speaker is Dr. Karl-Heinz Neumann, General Manager and Director of WIK. He has acted as a member of the board and of the supervisory boards of several national and international telecommunications companies and has broad experience, with a particular interest in regulatory affairs, which I think will add a great deal of value here. Shall we start with Eli Noam?

Statement of Prof. Noam: (See number 5.1)

Prof. Lockhart: I have no questions. There will be some questions from the audience, but we would like to go to our second speaker first and then open the floor to questions at the end.

Statement of Dr. Neumann: (See number 5.2)

Prof. Lockhart: Let us open the floor to questions.

Julie Roundson, Telecommentator: My question is for Prof. Noam. I must say that your presentation was really quite depressing. I understand that; it is sort of reality. You are painting a picture of where I think we are going, and that is what my question is about. I am wondering why you didn't talk about the larger conclusion. For me the larger conclusion is that if firms are buying out other firms and consolidation occurs, they are left with a very high bill to pay for all this while still having to get consumer services out. What you didn't say is that if we are going in this direction, there are going to have to be things that support the buy-outs and the heavy debt; what I am talking about is pornography and gambling. You had a more optimistic outlook, but you didn't talk about that. With respect to what is happening in Korea, I understand that the high bandwidth is being used to download pornography. What I am trying to understand is whether this is the direction it could go: trying to find things that heavy users will pay a lot of money for, the kinds of things people get hooked on. I don't want to see it going in that direction, but I just wonder how all this is going to be paid for, and whether this may be a possible consequence. Thank you.

Prof. Noam: Gambling and pornography, today and in the future, will be accessed through broadband. This kind of content has always been an early application for user-controlled new media. Both have relatively low entry barriers. The likelihood of success for restrictive rules is therefore low. But to answer your broader point, whether such applications will somehow support the network infrastructure: the answer is, indirectly and unintentionally, and somewhat embarrassingly, yes. That has been true for many previous media, too.


Tony Maccalsky, Vice President for Regulatory Affairs at Verizon: My questions have to do specifically with looking forward at the trends you see in play. What kind of regulatory provisions do you both think will be necessary, here and in Europe? This will be relevant, for example, for the upcoming EC NGN policy workshop in June. And related to that, specifically since it is probably close to what Eli discussed: to what extent do things like unbundling, open interfaces and safety nets determine the competitive behavior that will be part of this forward-looking environment?

Dr. Neumann: Concerning regulatory provisions in Europe, I must say that although there is a very positive view towards broadband, there are still a lot of barriers to competition in broadband. We could have higher penetration if regulators in the major countries took care of bringing down those still existing barriers to competition. Germany is one example; I could also mention others. Fiber to the home, fiber in the last mile, would be a very powerful technological tool, but one with potentially far-reaching economic implications, implications towards a potential for more or for less competition. That is what regulators, in my mind, should really take care of. And I am not convinced that it is logical to restrict unbundling for a technology which by itself could make viable competition more difficult.

Prof. Noam: I did not mean to sound negative, just realistic. But it may certainly sound negative relative to the bubble years, in which some people argued that the economic laws for silicon were different from those for carbon. We have now returned to planet Earth. The industrial revolution of the 18th and 19th centuries was accompanied by significant upheavals, too, and observing these upheavals is not being anti-science and Luddite. The same changes happen with the digital revolution, and if we recognize them we may be better prepared. What are the regulatory implications of broadband? Perhaps the major issue will be access by information and applications providers. Since the pipelines converge not only technologically but also in terms of ownership and business, it is important to prevent gatekeeping power where market forces don't do it.

Andreas Mars, MCI: I want to follow up on the question of regulation. I believe that regulation will be a fundamental factor in determining the future of the industry. Regulators have to answer simple questions such as: will new entrants have access to essential facilities, and if so, at what rates? Will the subsidy from the wireline industry to the wireless industry continue? And then, on new technologies such as voice over IP: will new entrants or VoIP service providers be obligated to provide access to emergency services and interception? Will authorities mandate the standards? Thank you.


Christine Heckart: I just want to follow up on Tony's comment about the NGN. There have been discussions that if you are looking at NGN platforms and regulation (and perhaps that leads more specifically into this afternoon's program), you are actually led back to a discussion about the internet, because the evolution of the internet to accommodate new kinds of networking capabilities has always been contemplated. So has the idea of an open architecture that would accommodate the NGNs of the future, not just those specifically being discussed now in various forms, but others in general. My input is more commentary than a question, if you will excuse me: if you are looking at regulation, it is always helpful in this context to look at it in a more general sense and to take a more open, interoperable approach.

Dr. Neumann: Economically it is the question of whether or not those two services belong to the same relevant markets. In Europe regulators as of today have decided that they are still in different markets. So they come to the same conclusions as Eli does.

Prof. Lockhart: We have one final question.

N.N.: My question is: given the situation in Asian countries, and especially in Korea, don't you think that American or European people will follow the lifestyle of Asian countries, or that technology will be shaped accordingly?

Prof. Noam: It is of course useful to learn from other countries. Korea has a very progressive policy (perhaps because the Vice Minister spent a year at Columbia with us at CITI). Yes, one can learn from Korea, but it is also a different situation, with the government targeting the IT sector as part of the recovery from its economic shock a few years ago. The government's interaction with the industry is different. Therefore, while it is interesting to observe the Korean trends, we should not overvalue them. There was a period in which European penetration of VCRs was significantly higher than in the United States and people started to worry. In the end it did not weaken Hollywood, but helped it. Similarly, there was a period in which the Minitel was the rage in France, and Americans worried again. But France did not become a leader when it came to the Internet. Broadband penetration is increasing briskly in America, and a few percentage points of lower penetration for 2–3 years are not critical, historically speaking. In terms of culture and lifestyle, American trends are still globally influential. Asian contributions are rising with technological progress and incomes, and that seems to me both normal and desirable.

Prof. Lockhart: My guess is that this group of consumers now wants less communication and more food. Would you join me in thanking our panelists?

6 Determinants of Future Market Success

Chair: Prof. Dr. Thomas Hess
University of Munich

6.1 Statement

Prof. Michael Dowling
University of Regensburg

It is a pleasure to be here. It is a little scary, though, being on a panel with three professors and a consultant talking about what companies should be doing. But since most of the company representatives are too busy acquiring each other or trying not to be acquired, we will have to make do with professors and consultants talking about this. As Thomas mentioned, I teach innovation and technology management at the University of Regensburg. This is also an interesting twist, since I am an American living in Germany and we have a German living in America on the panel. My views and observations are based on having lived and taught in Germany for nine years. It shouldn't be too surprising that we were asked to give short statements about what we think are the important success factors. Here is mine: the key to success in telecommunications has been, and will continue to be, innovation. But I am going to follow it up with a question: where is the innovation going to occur in the future? That was part of the question I asked Thomas Ganswindt this morning. Do we really still see the large integrated firm as the solution to the innovation problem? We also heard this morning from Rob Calderbank, who had been with AT&T, how AT&T had been a great source of innovation through Bell Labs and how well that had worked. But, whether we like it or not, Bell Labs does not exist anymore. So we have to think about where innovation will come from in the future. Here is a quote from D. Dorman (CEO of AT&T), suggesting that it is not just the networks, but also the services bundled with the network, that are going to be important.


“The experience of the past couple of years has demonstrated that there is more to being a telecoms operator than simply owning a shiny new network. The best prospects are at the network's edges, not at its core, and revolve around providing complex services, not merely dumb capacity. The watchword now is transformation, not construction. Only by embracing this new reality will the industry find a way out of its troubles.”

We talk a lot about technology from the hardware side. But services have to be created on top of the hardware. What we really don't know very well, in my view, having looked at some of the academic research on this, is where service innovation really comes from. A lot of the players hope that there is going to be important innovation in this area, but as we have seen a couple of times, we have been surprised by things we didn't expect to be really innovative, like text messaging, while for other things we thought would be really innovative, like data over wireless, we are still waiting. One of the interesting things when you look at the telecommunications service companies is that they don't spend very much on R&D. Figure 1 shows some data I put together from 2002 on worldwide companies in the telecommunication service business. You can see that some are spending more than others, but all of these would be considered low-tech companies compared to manufacturing or computer firms, which typically spend above 10 % of sales on R&D and are considered “high tech”. For example, Microsoft spends about 15 to 18 % of sales on R&D and Intel is at about 15 %; IBM and Siemens spend around 10 %.

Figure 1: Relative R&D Expenditures of Leading Telecom Firms – 2002 (R&D as a percentage of sales, ranging from roughly 0.5 % to 4.5 %; firms shown include Telstra (Australia), BellSouth, NTT, Korea Telecom, Telia (Sweden), Deutsche Telekom, SBC, France Telecom, BT, Verizon, AT&T, Telefonica and Vodafone)


Arun Sarin, the CEO of Vodafone, said last year: “The telecommunications industry will enjoy a decade of ‘sweet innovation’ thanks to wireless technology.” But when we see what Vodafone is spending on R&D, he is not really putting his money where his mouth is. I really wonder where the service innovation is going to come from, and I think it will be interesting to see how that develops over the next few years. NTT in Japan is a very interesting counter-example, because relative to the other competitors they are spending quite a bit on investments in innovation. Right about this time two years ago the Münchner Kreis held a similar symposium in Tokyo. I was quite fascinated when we visited NTT's labs to see what new developments were going on. Inside the labs they had a number of hardware manufacturers working with them very closely on new service innovations. A paper written by an employee of NTT DoCoMo shows how they developed i-mode: they really adopted a community model of innovation with their hardware suppliers. My personal view is that this kind of interaction is going to become more the model of the future, and less so the vertically integrated firm that we heard about from Siemens this morning. The speaker from IBM talked more in the direction of a collaborative model. It is interesting to see how a community helped develop the i-mode innovation in Japan. One could say, well, it was only in Japan, and due to cultural differences it would not work in western countries. Nevertheless, I think this kind of collaborative model for developing service innovations to be put on the network is going to become more important in the future. We also heard right before lunch from Dr. Neumann about some of the problems that have to do with the development of broadband networks. I think that a lack of competition in general will hurt innovation. Eli Noam talked about a kind of “crashing and burning” of the industry and the downside of too much competition. But as a researcher on innovation, I think that competition is best for promoting innovation. I think “crashing and burning” in an industry is a good thing, because it allows new companies to develop new technologies that the more established players may not have thought of or may have actually tried to suppress. This point is clear in the data that Dr. Neumann presented: I see that a lack of competition between the two leading technologies (DSL and cable) has hindered innovation, and that lack of competition is, in my view, a bad thing in Germany. This is made clear by the development of DSL connections in Germany. There are some competitors, but Deutsche Telekom is still the dominating supplier and pretty much determines what is going on in this industry. There is only very limited cable competition, since for too many years the


cable network was also owned by Deutsche Telekom. So I think that is something we should be concerned about in Germany (see Figures 2 and 3).

Figure 2: Broadband Penetration Rates USA vs. Germany (subscribers per 100 households, 2002–2004, for DSL and cable in each country). Source: BITKOM (EITO)

Figure 3: Development of DSL Connections in Germany, 2000–2003 (Deutsche Telekom vs. competitors). Source: German Regulatory Agency (RegTP)

The story in the United States is much different, with thriving competition between the phone and the cable companies. This competition is illustrated by the story of my mother. She is 78 years old, and she discovered the Internet at the age of 70 as a way of connecting with her children: I live in Germany, I have a sister in Canada and a brother in Louisiana. She first became an e-mail user with a standard dial-up connection from her phone line to her computer. About two or three years ago she called me, very excited, because she had a new broadband Internet service that was a lot faster than her old service. I said, “That is great, Mom, so what did you do? Did you hook up with DSL through the phone company or did you get a cable modem?” She thought for a few seconds and said, “The wire comes out of the wall.” That is all she knew, and that is all she really cared about: the wire came out of the wall, went into her computer, and it was a lot faster. It turned out it was a cable modem. Competition will lead to more satisfied customers. Internationally speaking, we heard some comments about Korea today. In Figure 4 you see the low broadband prices per megabit per second in Korea, which lead to really interesting differences in what people are paying for bandwidth. A lack of competition will also hinder innovation and the development of new services. I remember when I moved to Germany in 1996, when the real Internet wave was just starting, I went to various conferences and meetings where people claimed that the reason Germans were not adopting the Internet was that they somehow had some kind of “Angst” about technology. This was clearly wrong. Germans were not “afraid” of the technology; the prices were simply too high. Back in 1996, if you wanted to have an Internet connection at home, there were no flatrates; if you used the connection extensively, you were paying hundreds of dollars a month. At the same time you could get a flatrate in the US for 20 dollars a month. As competition increased in Germany the prices came down rapidly, and now Internet usage rates are about the same across Europe, the US, and the major industrialized countries.

Figure 4: Broadband Connection Price per Mbit/s (in US-$, roughly 0 to 24 US-$; countries shown: Germany, France, Italy, Hong Kong, Austria, Sweden, Korea, Japan)

In summary, I think competition will lead to more innovation, and although that will hurt some companies, others will be helped and consumers will be better off. For example, I think it is great that the Bell companies are trying to go into the TV business. Who knows if it will be successful or not? But I think that the effort will lead to new and interesting innovations. And from the companies' point of view, we need to think about what is going to be the right model for innovation in the future. There are probably some kinds of innovations where the vertically integrated large-firm model has its advantages. However, lots of other innovations are going to come out of nowhere from start-up companies like Skype, or from collaborative models like the one used by NTT DoCoMo that we have seen in Japan. I think one of the big challenges for companies in the future, large and small, will be to figure out which model is best suited for them to be more innovative.


6.2 Statement

Eckart Pech
Detecon Inc., Reston

First of all, thank you for the opportunity to address you. For the Münchner Kreis this is a gorgeous location, and in spite of the sunshine outside I am very pleased to be here with you today. This is a great platform, and I hope it not only becomes a great event but maybe even an institution, held on a German-American basis.

Determinants of future market success: this is a very tough topic for a consultant to talk about. If there is one thing that I have learned in the past ten years in the telecom industry, it is the lack of predictability. The biggest examples of this are the hyped technologies that come up, then die out, and you won't hear anything about them anymore. All that remains are the false predictions and promises that came along with those hyped ideas. One example of a false prediction was the initial forecast of the number of subscribers on the German cellular network back in the late 80s and very early 90s. This prediction said there would be 5 million users in the year 2000; the actual figure was more in the 40 to 50 million range. False prediction number two: optimism regarding the penetration of mobile data services. I was involved in a lot of projects back in 2000. We were all very enthusiastic and predicted that none of the mobile operators really needed to be concerned about decreasing revenues, because in 2005, 40 % of their average revenue per user would be derived from data services. Today the real number is much less than that: it is actually just 20 %, and if we look deeper into those figures, 17 of those 20 percentage points are actually SMS messages. My challenge today is to stay vague enough that no one here in the room will remember what I have said some years from now. As a means to better forecast where the telecom industry will be moving in the next few years, we recently completed our Telco 2010 study (Fig. 1). We spoke to the wider environment of the telecom industry, from academia to industry leaders, to figure out what is going on now and what is going to happen in the telecom industries in the next few years. As an excerpt, I just want to go into a couple of things. We felt that three of the key drivers are society, telecom regulation and technology.
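Coming back to the mobile data prediction above: a quick back-of-the-envelope split of those ARPU percentages (my own arithmetic, using only the figures just quoted) makes the gap tangible:

    # Split of mobile operators' ARPU shares as quoted above (2005).
    predicted_data_share = 40.0  # % of ARPU predicted in 2000 for data services
    actual_data_share = 20.0     # % of ARPU actually derived from data services
    sms_points = 17.0            # percentage points of that share coming from SMS

    non_sms_data = actual_data_share - sms_points
    print(f"non-SMS data services: {non_sms_data:.0f} % of ARPU")  # 3 %
    print(f"shortfall vs. prediction: "
          f"{predicted_data_share - actual_data_share:.0f} points")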


Figure 1: Telco 2010 Study – Major Trends (Market, Socio-Political and Technology Trends). Societal changes, population mobility, a new regulatory environment and emerging technologies are trends that operators need to anticipate.

Markets and Society:
- Aging Society: create new technology interfaces for an aging society
- Mobilization: fast, reliable and secure communication access from anywhere
- Tribalism vs. Universalism: individual and regional differences must be considered
- Individualization: segmentation and mass customization on service and application levels
- Rising Affluence: more choice, calling for choice management and intelligent systems
- Technology Mastery: more technology, calling for intelligent applications and user interfaces

Politics and Regulation:
- Organization and Law: from telecom-specific to communication authority, law and regulation
- Markets: from separate to converged networks and services
- Methods: from retail/market, ex-ante regulation to wholesale/market, ex-post regulation

Technology Roadmap (NGN and new wireless technologies hold the greatest disruptive potential):
- Devices: multi-functionality
- Applications: smart home, smart office
- Platform applications: web services, ad hoc
- Transport: IP Mobile, IPv6
- Access: WiMax, 3G
- Network: DWDM

Source: Detecon Study Telco 2010 – Emerging Telecommunication Landscapes: The cards are reshuffled

Society: Society is aging. As a German I don't need to talk about that extensively, but even in countries like China, with its single-child family policy, the aging of society is an upcoming issue, and one we must anticipate when we develop new services. Mobilization has been spoken about for years now. The penetration of mobile data services and devices has definitely picked up, and it goes beyond communications. Just think about Apple and the iPod: what a tremendous success it already is. Just imagine the additional momentum this device will gain once it offers connectivity. Another issue is tribalism versus universalism. We have universal access to the internet in the markets which are actually connected. But we must bear in mind the ongoing digital divide, and there are still regional differences. These regional differences apply to telecom services as much as to local cuisine. The i-mode example was perhaps one of the most enlightening in that regard: although it was highly successful in Japan, you could hardly get a quote from any of the providers who launched this service elsewhere on how many subscribers they had after the first year. It is not as easy as it always sounds. Affluence, or simply information overload, is another consideration. I just learned that Google has access to 8 billion URLs. Just think about that! You also get a lot of links if you type something into Google. Choice really becomes an issue here.


Last, but not least, is technology mastery. This is also a generational issue, as is the rise of the smart home. One thing I figured out when I recently leaned back and thought about it is that I am still a paper guy: I like to print things. But I would bet that my son will be a screen guy, reading things on a screen instead of having them printed. An interesting phenomenon is developing in this context. I have heard stories that many people have begun to give up reading the user manuals of their latest electronic gadgets and instead ask their kids to study the gadgets and educate them later on. The kids are doing a pretty good job, too; that is what I keep hearing, at least.

The regulatory space: the real issue is to draw the line between content and transmission. Transmission is subject to telecom regulation, content more likely to copyright laws. We just heard a good example in the area of customer focus: one of the panelists shared the example of his mother, who did not know what kind of connection she had and only emphasized that the cable comes out of the wall. She does not really care whether it is a WiFi connection, WiMax, or whatever it is. Because of this, I believe that there will soon be separate approaches to services and access. The convergence of access networks will also make regulation much tougher than it is today, because it will become much harder to look at the networks as separate infrastructures when they may share various elements. As a result, instead of ex-ante regulation we expect ex-post regulation to become the dominant approach.

In the technology area, the greatest disruption will come through the introduction of next generation networks and will largely be driven by the wireless space. On the devices side, the concept of multifunctionality is a key driver. One very tangible example was the recent introduction by Kodak of a camera with a built-in WiFi modem; it will automatically connect to Kodak's Internet photo shop. On the application side, machine-to-machine communication will continue to gain importance. I recently bought a book which had an RFID chip in it. This chip is not yet connected to a smart home, but it could get connected some time soon. On the platform side, we will see the evolution of web services, meshed networks on the transport layer and next generation networks. On the access side, mobile technologies such as WiMax, and also 3G with its differentiating aspects relative to WiMax in terms of security and real mobility, will gain additional momentum. On the network side, we will see optical networks that can carry enormous bandwidth, petabytes and exabytes, figures that are hard to imagine, but needed just to digest the enormous growth of internet traffic, which is doubling every sixteen months.
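To put that doubling rate in perspective, here is a tiny sketch (my own illustration, not from the study) of the compound growth such a rate implies:

    # Illustration of the growth claim above: traffic doubling every 16 months.
    DOUBLING_PERIOD_MONTHS = 16

    def traffic_multiplier(months: float) -> float:
        """Factor by which traffic grows, assuming steady exponential growth."""
        return 2 ** (months / DOUBLING_PERIOD_MONTHS)

    for years in (1, 3, 5):
        print(f"after {years} year(s): traffic x {traffic_multiplier(12 * years):.1f}")
    # after 1 year(s): x 1.7; after 3: x 4.8; after 5: x 13.5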


Figure 2: Technology Enabler – NGN Architectures (Converged Mobile Data Network Architectures). NGN architectures provide operators the flexibility to meet the changing needs of subscribers with regard to demographics, mobility, and new technologies. [Diagram: 3GPP Release 5 IMS and control plane components (call control, media control, policy control, presence, location awareness, home subscriber and multimedia subscriber location control) link a packet core (IP/MPLS) carrying voice and data paths and multiple sessions to virtually any access network: the PSTN via media and signaling gateways, UTRAN/RAN, cable/MSO, DSL/FTTX, 802.16/802.20, WLAN/PAN/ad hoc networks, the Internet and value added services.]

- As networks evolve, they start to look more and more alike.
- The convergence of next-generation networks allows them to serve any type of access network one can imagine.
- The emergence of these networks is both disruptive and unstoppable.
- Service bundles will help retain valuable customers, while high degrees of personalization and customization will empower them.
- Value added services will generate new revenue streams, with service creation and service delivery playing key roles.
- Decreased development cycles will accelerate deployment capabilities and reduce costs.
- Presence, location, and awareness of environment will create new usage contexts.

Next Generation Networks will be another key driver of convergence and interoperability, and will thereby fuel the evolution of new value added services. There are three key elements I want to mention in that context (Fig. 2). We have the 3GPP Release 5 standard, which is a key element of next generation networks. Then we have the IP Multimedia Subsystem, or IMS, which allows the simultaneous provision of multimedia services; in that context, push-to-talk, which was a very popular service in the US, could evolve into media-rich services like push-to-video. And last but not least, the Session Initiation Protocol, SIP, which allows multimedia sessions to be established, modified and run. The impact of these next generation networks is certainly interoperability, network-agnostic services, handover between networks, ubiquitous access, location- and context-sensitive networks and ultimately an acceleration in time to product.
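As one concrete illustration of the SIP part: a session starts with an INVITE request. The sketch below is hypothetical (all names, hosts and tag/branch values are made up) and simplified; a real INVITE would normally carry an SDP body describing the offered media.

    # A minimal, hypothetical SIP INVITE, assembled as a plain string.
    # All names, hosts and tag/branch values are made up for illustration;
    # a real INVITE would normally carry an SDP body describing the media.
    invite = (
        "INVITE sip:bob@operator.example SIP/2.0\r\n"
        "Via: SIP/2.0/UDP alice-pc.example:5060;branch=z9hG4bK776asdhds\r\n"
        "Max-Forwards: 70\r\n"
        "From: Alice <sip:alice@operator.example>;tag=1928301774\r\n"
        "To: Bob <sip:bob@operator.example>\r\n"
        "Call-ID: a84b4c76e66710@alice-pc.example\r\n"
        "CSeq: 314159 INVITE\r\n"
        "Contact: <sip:alice@alice-pc.example>\r\n"
        "Content-Length: 0\r\n"
        "\r\n"
    )
    print(invite)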


Figure 3: Convergent Services – Smart Environments (New Services and Applications Enabled). Converging infrastructures, technologies and business models will enable smart environments with new applications and services. Penetration grows across four stages (2001 to 2010):

- Stage 1, Traditional Devices: stand-alone, single-purpose devices; communication only, i.e. no computation
- Stage 2, Device Boom: non-traditional device proliferation; integrated voice/data services; different devices for distinct market segments; wireless access
- Stage 3, Integrated Environment: broadband availability; integrated multimedia environment; wireless network; device-to-device interaction
- Stage 4, Truly Smart Environments: ad hoc, self-healing networks; single command and control device, e.g. voice, gestures (cameras/sensors); cognitive intelligence, virtual agents; contextual services

Allow me to look at the convergence of services (Fig. 3). In our opinion it is crucial to take a wider view on that subject: we believe it is necessary to think beyond just wireline, wireless and the internet, and to also anticipate machine-to-machine communication and the decreasing costs of chip sets. The whole concept is what we call smart environments. I don't think there is a standardized term for that as of today, but just imagine that the chip in my book talks to my home server.

More and more over time, we will be heading towards integration and interoperability based on increasingly intelligent devices communicating with each other. First approaches are already visible with ad hoc and meshed networks, which communicate much more easily. The last stage of that evolution will lead to so-called smart environments: self-healing networks that require much less configuration by the actual user and allow single command and control by means of a simple device, so that the technology really does not become a point of concern for the end user. A lot of research has already been done in that field: Microsoft with the Easy Living project, MIT with the Oxygen project, Stanford University with the Archimedes project, and I just learned that you can apply to live in a house in Berlin for four days to check out how Deutsche Telekom envisions this idea.


Figure 4: Challenges – Customer Service Survey Findings. NGN and the advanced, integrated telecom services and applications they enable will lead to increasingly complex customer care issues.

- Multiple data networks and devices: the majority of respondents use multiple devices and service providers for their telecom and advanced telecom services. Close to 68 % use two or more providers, with close to 14 % using four or more; 43.5 % use multiple devices for mobile data services, combining mobile phones, PDAs and/or wireless data cards.
- Troubleshooting issues: internet access and the use of data services while traveling were the issues most often encountered and reported. Respondents indicated that they have encountered problems with internet access, including an inability to access the internet (51.6 %) and/or internet access being slower than usual (41.4 %); an inability to use data services when traveling was encountered by 32.8 % of respondents.
- Integrated services and the single bill: respondents are already receiving single bills for most or all of their services, and most expect customer service from the billing service provider. Just over 22 % of respondents already receive a single bill for most or all of their telecommunications services; most respondents (74 %) prefer the billing service provider to handle customer service for all services they bill for.
- Multiple customer care options: respondents indicated a strong desire for traditional 800-number customer care, but self-care and e-mail are other options considered. Close to 95 % selected calling an 800 number as one of their top three choices, with close to 65 % selecting it as their first choice; self-care through the provider's website and e-mail were cited by 71.1 % and 56.8 % respectively as options, with 19.5 % choosing self-care and 7 % choosing e-mail as their first choice.

Source: Results of Detecon web survey 2005

All of these new technologies of course bear significant potential, but they do not come without challenges. I have picked one that sounds very operational and very unsexy: customer service (Fig. 4). Why did I pick that? Because it is an enormous cost driver. And why do I think it is important for operators to consider? We figured out that it is extremely beneficial to have all of those new services: users will have less and less of a challenge configuring networks in their home environments. But the provision comes at a certain cost, and from the initial introduction on, the new world is not that easy to handle on the operator's side. It is, by the way, already visible today. If some of you want to know more, have a look at the pages of Better Business or Planetfeedback.com and you can see how many customer complaints cellular operators receive over the provision of very basic mobile data services. That is an enormous cost driver. Not all of the people who have those problems actually call. But if they do, each of those calls to customer care costs the operator 50 US-$, not counting the free minutes they have to give away and the damage to reputation created by dissatisfied subscribers. A very tangible example is the upgrade that Deutsche Telekom has made from the original network to ISDN and DSL. People that stayed on the original POTS network place a call to customer care approximately every twelve years. People on an ISDN line call every six years, and people on a DSL line every three years. You can run the numbers yourself: how much more do you have to charge to make this a feasible business case for an operator? That is the other side of this game: the operator needs to make some money, and under the current set-up it is not looking that promising.

Just to quote some of the findings we had in the customer care space: people use multiple devices and multiple service providers. Therefore I believe that the unification of all this on the end-user side will be a big challenge. There will also be a challenge in serving the basic need for connectivity. And who is in charge of connectivity for the user? Is it the service provider? Is it the operator? That is really an enormously challenging area. We see this with the most basic of all applications, internet access: we want connectivity all the time, yet more than 50 % of actual users report problems with service availability. The single bill is something that we all wish for, but it does not necessarily go along with unified customer service. It seems natural that this will also be unified at some point, but at what cost? Last but not least is customer self-care. If on the one side I add much more complexity, there may be an opportunity to compensate for that by introducing customer self-care through a web interface. The adoption rate for this is still lower than 20 %, which is extremely low; it can by no means counterbalance the increasing number of service requests that you will see once you introduce new services. In total this is very remarkable: we promise less complexity, much more sophisticated service supply and much more choice for the subscriber, but on the other side there is much more complexity for the operator. Managing and maintaining data networks is not that easy. If we miss one word in a voice conversation, that is one thing; if we lose a whole packet of transferred data, that is a different story. Maintaining a satisfactory service level in spite of this increasing complexity remains a major challenge and poses questions for operators that currently have no solution. The other issue is the assurance of a satisfactory or even good experience on the customer side. We have learned this from other examples where new services have been introduced, and it will largely determine how and when these new services become successful and data services really become a mass-market phenomenon beyond the current SMS.
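Taking up the invitation to run the numbers, here is a minimal sketch (my own arithmetic, using only the call intervals and the 50 US-$ cost per call quoted above) of the expected customer care cost per subscriber per year:

    # Expected annual customer care cost per subscriber, from the figures
    # quoted above: one call every 12 years (POTS), 6 years (ISDN),
    # 3 years (DSL), at an assumed 50 US-$ per call.
    COST_PER_CALL = 50.0  # US-$; excludes free minutes and reputation damage

    call_interval_years = {"POTS": 12, "ISDN": 6, "DSL": 3}

    for service, interval in call_interval_years.items():
        annual_cost = COST_PER_CALL / interval
        print(f"{service}: {annual_cost:.2f} US-$ per subscriber per year")
    # POTS: 4.17, ISDN: 8.33, DSL: 16.67 US-$ per subscriber per year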


6.3 Statement

Prof. Dr. Rolf T. Wigand
University of Arkansas at Little Rock

The topic of our session, “Determinants of Future Market Success,” is certainly a tall order to fill, and I am sure this will make for an interesting discussion after our panel's presentations. Briefly, I would like to describe the environment in which our industry finds itself today. Some of these comments have been made in one way or another earlier, so I will move over those at a pretty good clip. It has become evident that the Internet has a puzzling way of colonizing other forms of communication and information distribution. Already, VoIP, or voice over Internet protocol, is solidly advancing its march toward destroying the traditional long-distance phone companies by routing calls from regular telephones over the Internet. Likewise, letters and faxes are being substituted by e-mail. More and more of our invoices arrive electronically and are settled in the same fashion. Internet-friendly digital pictures are pushing aside film and playing havoc with the traditional film industry. Music and movie downloading, legal and otherwise, is already a vastly adopted practice and will only spread further. We heard this morning that AT&T, as we used to know it, is gone. It is no surprise to you that my students and many young people no longer have landline phones. We have heard a number of times about various fascinating developments we are observing, such as voice over IP. To give a quick impression, here are some numbers from Skype, one of the VoIP applications: 2.3 million Skype users are online every time I use it. And here is a surprising finding: I checked yesterday at 4:30 p.m. and 9.9 million people had downloaded the software; by 7:22 a.m. this morning, more than 180,000 additional people had downloaded it, within this short time frame of some 14 hours. Vonage, which we have also heard about, is the biggest Internet phone company, and it has sucked away 500,000 customers from giants like Verizon and SBC, representing a US$ 100 million value in operating income for Vonage. But if we analyze and think about where that money came from, it probably had an actual operating value four or five times higher. As this example demonstrates, amazing things are happening, and Vonage is predicting it will double its customer base by next year. It would be fascinating to research this area in terms of disruptive technology, or to look at diffusion gaps, chasms and related developments of that sort.
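As a rough plausibility check of those Skype figures (my own arithmetic on the numbers quoted above, approximating the overnight window at 14 hours):

    # Approximate download rate implied by the Skype numbers quoted above.
    new_downloads = 180_000   # additional downloads reported overnight
    elapsed_hours = 14        # approximate window (4:30 p.m. to 7:22 a.m.)

    per_hour = new_downloads / elapsed_hours
    print(f"roughly {per_hour:,.0f} downloads per hour")  # ~12,857 per hour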


A few more things characterize the environment. No one really knows how wireless technology, optical networking or regulation will change the competitive landscape; we can only speculate about that. We know quite a bit about what users like in terms of services: friendliness seems to prevail, as well as flexibility and security, and they look for low cost and tangible benefits. As 3G and WiMax take hold, the division between low-cost telecom carriers seems to become blurred and to change quite quickly. Within this turbulent environment it is quite difficult to do a number of things, i.e. to foresee the impact of ever-increasing digitalization. Very fascinating, very creative innovations are happening, as we all know. It is difficult to predict the type and pace of new information and communication technologies and services. It is difficult to anticipate regulatory developments and conditions, and difficult to predict the time needed to achieve a target; actually, if you think about it, it is really unpredictable. Moreover, it is difficult to master the transition period for a company in terms of management, marketing and financing. What are we supposed to do? The question suggests that I am going to provide answers; I am not. I am raising more questions than anything else or, at best, giving some hints. I think we need to examine the association, probably a very tight association, between a service and its relevant infrastructure, and the offering of the service on whatever infrastructure supports it. In order to compete today, firms obviously need a scalable, reliable infrastructure, and they need flexible, open tools allowing the fast development of new services. Services, not technology, drive the network of the future; I think we are aware of that. Success is determined not by who has the best information and communication technologies, but by who provides customers with the advanced services they want. Of course there is convergence, which we should try to understand; this topic has come up a number of times during our symposium. Here are some points we need to consider along those lines. Convergence today is very visible in many mergers and acquisitions. No doubt, IT and telecommunications are merging. We realize that new strategic initiatives by firms, often developing new services, are frequently outside traditional core business areas. Accordingly, many firms are extending their activities to new business sectors. We recognize also that software drives convergence, and in all of these convergence developments the prime driver is process convergence. We need to examine devices, applications, networks, markets and regulation. I think the market is ultimately the arbiter in these uncertain developments, the leveler that determines what will happen in terms of future market success. What is really changing the face of information and communication technologies are the customers themselves. They are demanding and getting advanced services. Think about some of the services that they now have, expect, and take for granted: unified messaging, local number portability, voice and fax over the Internet, and more. More robust e-commerce capabilities have been popping up, and Internet access is simply expected from any device. Firms are actively restructuring their strategies to adapt to the current development and an overall context that is very much in flux. Maybe there is one lesson we can learn from history. I am reminded of the Lewis and Clark expedition, which went through an effort analogous to our industry's. In 1803 Lewis and Clark received from the US Congress a definite budget (I don't know if there was a time limit or not) to find the best possible passageway to the West Coast. There was no predetermined way to go. So the Lewis and Clark expedition explored valleys and found where the river was lowest to cross. When a mountain was too high, they backtracked and found a more passable way. There was a lot of trial and error; multiple parties went out, they reconvened, and a decision was made as to the best possible way to go. This was very much a heuristic effort, similar to what we are doing in this industry today. I think what we are doing today follows pretty much a model like this: we have a starting point, and we sort of have a strategic direction in mind. There may be an understanding of where we must be headed, say to the northwest, but we are not sure how we get there. We may have to go over mountains, backtrack, and go through trial and error, but we may never lose sight of our strategic direction, i.e. the northwest in our example, where we must end up. In conclusion then, given the current rather turbulent environment within the telecommunications industry, we must not lose our vision and broad strategic direction, such that we stay strategically positioned. It seems this is probably the best we can do in such an environment in flux, but we must seek and continuously ascertain this important strategic direction.

6.4 Discussion

Chair: Prof. Dr. Thomas Hess
University of Munich

Prof. Picot: The next session will be chaired by Thomas Hess. Thomas Hess is a professor at the University of Munich, where he holds the chair for Information Systems and New Media. He received degrees from the University of Göttingen, the University of St. Gallen and the University of Darmstadt. He also served for a couple of years as an assistant to the managing board of the Bertelsmann Group in Gütersloh, Germany. He is very active in many research areas, especially in the field of convergence between media, telecommunications and IT. I am very happy that he is here today to chair this next session. The floor is yours.

Prof. Hess: Arnold, thank you for your friendly words. Ladies and gentlemen, the next session will continue our ongoing subject. Today we have heard a lot about technologies: what the trends are, why we need broadband, why we should integrate services. We have heard a lot about market structure: the cycles of the industry and new approaches to services. But I think one important question is still missing: what are the options for the companies? What should they do? What are the success factors on the market side and on the company side? What are the capabilities companies should have? These are the questions for our panel. I am very happy to look for some answers to those questions together with my colleagues. We have a very good panel with three experts from academia and practice, and from both Germany and the States. I will introduce the panel with just a few words, in alphabetical order. First we have Prof. Michael Dowling. Michael has held the chair for innovation and technology management at the University of Regensburg in Germany since 1996, and he has a very strong background in the telco and IT industries, in academia and in consulting. He received his education at several very renowned schools, including Harvard and the University of Texas. Eckart Pech is the second panelist. He is president and CEO of Detecon Inc., a management consulting company specialized in telco management, and has been responsible for the Americas market since the beginning of 2003. He has a background in business management and information systems and is a very well-known expert in technology management.


Last but not least we have Professor Rolf Wigand. He holds the chair of information science and management at the University of Arkansas at Little Rock. Rolf is an internationally very well known expert in the field of information management, electronic business, alignment of industries and other subjects related to information technology and changes in industry. I think we have a very good panel to give some answers to the question, what should a company do? I am happy to give the word to Michael Dowling. Statement of Prof. Dowling: (See number 6.1) Prof. Hess: Thank you very much, Michael. Next turn is yours. Statement of Eckart Pech: (See number 6.2) Prof. Hess: Thank you, Mr. Pech. Here is the statement of Rolf Wigand. Statement of Prof. Wigand: (See number 6.3) Prof. Hess: I think we heard a lot about special subjects and how to manage a telco company in the next years. We heard about innovation and innovation processes. We heard about operational excellence and processes and we heard about the way to get new targets. Now, we have some time left for discussion. I will start with one point, a typical question of a management system, the question about how to structure a company. Maybe you know that Deutsche Telekom has changed its structure or it is changing the structure. What do you on the panel or in the audience think about this move? Is it necessary to change the structure of a company based on technology changes or can we work with the same structure that we have for ten or twenty years? Mr. Pech: I think the recent changes of organizational structures of operators go along with the disappearance of the classical siloed approach. In the past few years these organizations distinguished between wireline, wireless, IT services and internet services. In the future this will evolve into a layered approach. Basically it comes to a unified service and production layer that may distinguish between different customer segments. The requirements of a residential customer, of a SME or a corporate customer are certainly different and you want to communicate with them differently. For layers underneath that you of course want to leverage the capital and


operational expenditures if you are going to build and operate a unified network. Therefore we are going to see a structure where we have a production layer, on top of that a service layer, and then the customer-facing units segmented by target groups instead of products.

Prof. Hess: Thank you. Maybe there is a second statement from academia?

Prof. Dowling: I was going to say that you need restructuring so the consultants have something to do, because that is how you guys make the large fees! I used to work for McKinsey and I would agree with him. I think that a lot of telecommunications service providers restructured that way back in the 1980s, such as British Telecom. And you are seeing now from the manufacturing side as well that we are going to have much more of a customer focus. That has been the theme all day here, and I think the organizational structures have to reflect that.

Prof. Wigand: Maybe just a couple of observations. I fully agree, but I think that in terms of marketing and future market success there has to be a nearness to the customer as never before. Some of the things that we have to react to are almost on the fly, it seems. All of that is happening amid incredible competition. Another interesting thing with regard to structure is how to do R&D. We heard a little bit about that earlier; there is considerable discussion that companies are outsourcing R&D to India and elsewhere as well. What does that do to the structure and the dependence or independence of a company?

Mr. Pech: I just wanted to add one remark with regard to customer orientation. Another symptom we see goes along the lines of what we have learned this morning from Professor Calderbank: the whole momentum that you can create by leveraging the power of online communities to support your product development. Some operators have actually done that by embracing the concept of developer communities. When there was the initial need to come up with mobile data applications that would really be relished by subscribers, they figured out they couldn't do it themselves, and they encouraged the community of early adopters to participate in a joint exercise of developing those applications.

Prof. Hess: Thank you. Maybe let us have a second question about organizational structure, also in the field of make or buy. Some telcos had ideas about the production of content. We may have in mind the case of AOL and Time Warner, which is now only Time Warner. Do you think there is a trend toward integration into the content side, or is there no best way?


N.N.: I have seen them attempting that many times in order to establish a kind of walled-garden approach, as we have seen for example with the MSOs. They want to protect the customer; they understand that their core product, the network, is going to get commoditized. As a way to shield the other valuable thing they have in their portfolio, the customer relation, and to augment that relation and put a layer on top of the commodity, the transport, they try to embrace the whole concept of content and media. But on the other hand that is just not their core competence, and that is where the trouble lies. And the AOL Time Warner merger also showed that just buying one of those players is certainly not the right path.

Prof. Dowling: I would like to comment that the relation between the creation of content and its transmission is like the difference between hardware and software in the computer industry. It was interesting today that both Alcatel and Siemens said that they are cooperating with Microsoft. I think that such cooperative models are going to make a lot more sense in the future. Attempts in the past to combine hardware and software in computers have not worked that well. And the same may be true with media and transmission.

Prof. Bauer, Michigan State University: One comment and two short questions. The comment refers to Michael Dowling's presentation. I think when you just show the R&D intensity of the service providers you are missing part of the picture, because there is simply specialization in where R&D happens in the value chain. So figures from one set of firms in the value chain can be misleading. Regarding your collaborative innovation, I was wondering whether you see any cultural barriers for Europeans and Americans to adopt that model. Let me tell you a story: NATE is the second service that is like DoCoMo, in South Korea. It is very successful. Unlike DoCoMo, they have a platform that is portable across technical systems. They can sell in Europe and in the United States, yet they have not been successful so far, mostly because service providers do not want to engage in the same kind of revenue sharing that was at the heart of the South East Asian models. They have just enormous obstacles in terms of management culture to overcome this. The second comment or question refers to Prof. Wigand's remarks. One has to be very careful about what the competitive advantage really is. Look at Vonage for example. It seems to be fast growing. However, the fact is that Cablevision has come to the market with triple play and has signed up 85,000 VoIP customers in one month. Vonage may move very quickly between carriers who are single-play competitors. Those who can bundle very effectively and actually sell VoIP at a higher price than Vonage does are even more successful. The bundle is just very attractive.

Prof. Dowling: You are right about my R&D statistics. Still, I think there is an interesting difference between companies that are just providing services and others that are in service as


well as hardware businesses. I also think your point is a good one. I think in the future, to compete better, the companies are going to need to collaborate better. You are right that cooperation can be difficult to manage. I was just reading last week in Business Week that Motorola has come out with an i-mode phone, but they are having a hard time selling it because the phone companies want to get a piece of the revenue for downloading the music. They don't necessarily want to sell i-mode phones. Right now the few that are out there are not being subsidized like the other phones. They may have other revenue models, but they also may just be short-sighted. It may be a good strategy in the short term, but a bad strategy long term. I think they are going to have to be more adaptive, especially in Europe and the United States.

Prof. Wigand: This is a very good point you made, Johannes. I think we should be fully aware that what I presented is a snapshot of the moment, and one needs to see how things develop. I am also reminded of what happened with the bundling of popular software packages and how that changed the entire game and pricing structure. A good point, thank you.

Prof. Hess: Are there further questions?

N.N.: This is to Mr. Pech. He talked about wireless data, and maybe I just live in the isolated environs of Capitol Hill, but every other person has a Blackberry. I was just wondering if you could tell me a little bit about the development of that market and how it fits in with other mobile data developments.

Mr. Pech: I am generally a person who doesn't embrace technology at the first stage, but I have tried a whole range of devices to get wireless connectivity. What struck me about the Blackberry was the pure simplicity. It is a push email concept, so there is no configuration or anything else you really need to worry about. It is relatively secure. Of course, like anything today, it can get hacked, too. There was recently a case of a gentleman who worked for the CIA and used his private Blackberry for his work email, and it got hacked. But still, the simplicity is what actually gets most people to use it. That is also the reason why so many government officials enjoy it and why it is so popular in the professional community. But there is also a regional flavor to it. It seems to be a concept that works very well in the US but doesn't necessarily have the same level of popularity in other markets. That may also go along with the whole concept of live to work versus work to live. It has a lot to do with cultural aspects, too. Just to give you another example, the i-mode service is a good one for that. You would expect a similar usage pattern for i-mode in the US and in Japan. In both countries people commute on average 60 to 90 minutes to work every day. In Japan i-mode got so popular because people embrace new technologies


very quickly and enjoy them for various entertainment applications. That is something that American commuters do not necessarily look for. So there is a huge difference, and this is also why the Blackberry is such a popular device in the US market: it delivers what people like here, efficiency and simplicity.

Prof. Hess: There is still time for one question.

N.N.: This is not really a question, but I just want to comment on his words. I think he pointed out exactly what we really need to focus on in our telecommunications market, because in Korea the reason why we developed so much telecommunications technology is that people compete with each other. That is, if one company developed cell phones with an iPod, people just moved to that company. And then if another company developed a cell phone with a kind of video game or PlayStation and iPod, everybody just moved on to that company. One of my friends changed his telecommunications company five times. I think that drives a lot of development.

Prof. Eberspächer: I have a question for Michael. You said innovation is necessary for the whole story. I think this is clear and nearly trivial. The question is: how can we really make innovation happen? Do we need more small companies? Should the money go to the big companies, as we have heard this morning? Do we need the money at universities? Do we need new attitudes for creating innovation? Do we need education for this? And what is the difference between Asia and the United States here, for example? And what about the defense sector? Much money from this sector is going into innovation here in the U.S.

Prof. Dowling: I think the key is going to be diversity. In the past, there were only a few models for innovation in use, including, in the USA, a focus on military R&D and innovation by large companies. But those models have declined in performance and we are going to need more diversity. We need to support small-company innovation, and we don't need to have more applied university research. I think the challenge is to find the right mix. And that is really just a question, not an answer.

N.N.: Service innovation. Do you hire people to try and invent services, or do you have a network where you can in a sense monitor activity, discover quickly new things that are organically growing up, and then do more of what seems to work?


Prof. Dowling: Probably more the latter than the former would be my view. Yes, service innovation is not the same as R&D for manufacturing, but it is, as you are saying, discovering what can be provided over networks. We have to figure out what kind of structures and what kind of policies support that. But I think the vertically integrated firm alone will not promote service innovation. We have to have more diverse models.

N.N.: One example comes to my mind. I think we have all heard the story of the record companies hiring other companies to do market research for them by directly observing what is being downloaded illegally and using that information for their marketing campaigns. That is much more accurate than doing any other kind of marketing research. That nearness, that closeness, is tough to do, no doubt.

Prof. Hess: Thank you. We have heard a lot about organizational structures, about services, and about how to manage evolution processes. There were a lot of questions and some answers. Thank you to the panel, thank you to the audience.

7 How Much and What Kind of Regulation Will be Needed in the Networked World of Tomorrow?

Chair: Prof. Dr. Arnold Picot, Münchner Kreis and Georgetown University

7.1 The Role of Antitrust in a Deregulating Telecommunication Industry: The Economic Fallacies of Trinko

Prof. John W. Mayo, Georgetown University* (The paper could not be presented in person at the symposium)

I. Introduction

The interrelationships between economic policy instruments and their ultimate effectiveness are as complex and nuanced as they are critical to the formation of sound economic policies for the 21st century. In the macroeconomic arena, legitimate debates are ongoing regarding the effectiveness of monetary (fiscal) instruments in the face of particular fiscal (monetary) policies. In the microeconomic arena, however, the comparable policy instruments – regulation and antitrust – are often seen as alternative or substitutable policy instruments, never to be found in the same industry with one another. This perspective certainly shapes the policy agenda as society moves forward to consider how much and what kind of regulation will be needed in the networked world of tomorrow.

* McDonough School of Business, Georgetown University, 37th and O Streets, NW, Washington, DC 20057, e-mail: [email protected]
** Helpful comments on earlier versions of this article were received from Tim Brennan, David Kaserman and participants in the Australian Consumer and Competition Commission 2005 Regulatory Conference. Nonetheless, all remaining errors remain the sole responsibility of the author.


In this paper, we use the evolution of policy in the telecommunication industry as a specific case in which to make the point that such implicit tradeoffs between these instruments may be ill-considered. In particular, our examination of the evolution of regulatory and antitrust interventions in this industry reveals a clear role for the smart, and perhaps simultaneous, application of both of these policy instruments as tools to fully enable the development and maintenance of effective competition on a forward-going basis. We find, however, that the traditional, complementary role of antitrust in the telecommunications industry has recently been dealt a significant blow by the Supreme Court's Trinko opinion.1 In light of this setback, the paper identifies legislative remedies and options that may restore the beneficial role of antitrust oversight of the telecommunications industry.

The paper is organized as follows. Section II reviews the historical role of antitrust policy and enforcement in the largely regulated telecommunications industry of the twentieth century. Next, Section III provides a discussion of recent regulatory and antitrust decisions. Section IV describes several critical economic fallacies that underlie the Court's Trinko opinion. Legislative remedies are considered in Section V, while the potential criticisms of such legislation are considered in Section VI. Finally, Section VII concludes.

II. The History of Antitrust in the (largely) Regulated Telecommunications Industry

If antitrust and regulation were perfect, and perfectly applied, policy instruments, then application of either to an industry would suffice in ensuring effectively competitive outcomes. Indeed, conventional wisdom holds that public utility-style regulation should be applied in naturally monopolistic industries, while antitrust policies should be applied in industries subject to contrived monopolistic outcomes. In this situation, the simultaneous presence of both would in the best case be unnecessary and, given the costs associated with the implementation of any policy instrument, more likely a net drain on society. History has revealed, however, that neither of these instruments is either perfectly conceived or perfectly implemented. Consequently, we cannot, as a logical matter, rule out the potential for the simultaneous application of these policy instruments to improve economic welfare.

The logical potential for benefits from simultaneous application of regulation and antitrust, however, does not ensure that such benefits actually exist. Such a determination can only be made upon a specific inquiry into whether the application of antitrust in a regulated industry has improved economic welfare or may do so in the future.

1 Verizon Communications Inc. v. Law Offices of Curtis V. Trinko, LLP, U.S., No. 02-682, 1/13/04. (Hereafter, Trinko)


To be sure, such an exercise cannot be done abstractly. Rather, as noted by the Supreme Court, such antitrust analysis "must always be attuned to the particular structure and circumstances of the industry at issue."2 Accordingly, in this section, we briefly review the role of antitrust in the history of the largely regulated, and now deregulating, telecommunications industry. We conclude that the application of existing antitrust laws has been substantially beneficial to the development of competition in the industry and has not served to disrupt the regulatory process.3

At the outset, we note that perhaps no industry in American history has been more affected by antitrust than telecommunications. While more encyclopedic coverage of the history of antitrust in the telecommunication industry can be found elsewhere, an important thread to emerge from that historical perspective is the conclusion that both the threat and the application of antitrust enforcement have played decidedly procompetitive functions in the industry despite the ongoing presence of the consumer protections "ensured" by the presence of regulation.4 Here we eschew a comprehensive review. Rather, we point to a few substantive examples that suffice in drawing the inference that both the credible threat and the enforcement of antitrust have served pro-competitive roles in the evolution of the industry, despite its historically regulated heritage.

Early history: anticompetitive proclivities of those with control over vital resources. It is by this time well-documented that dominant control by a single provider over vital resources that are necessary to provide telecommunications has created a century-old history of anticompetitive practices by the organizations that have controlled these resources.5 These anticompetitive proclivities began virtually at the inception of the industry. Even before the expiration of the original patents of Alexander Graham Bell, "[t]he Bell system followed a policy of refusing to interconnect with independents."6 Indeed, it was only under imminent threat of antitrust prosecution that AT&T agreed to stop purchasing competing long-distance telephone companies and to connect competitors' lines to AT&T's network.7

Equipment interconnection and interface. The struggle to pro-competitively allow consumers the right to use their own equipment to interface with the public switched network took decades to succeed, only ending with the divestiture of AT&T.

2 Trinko, p. 2.
3 For a dissenting opinion, see MacAvoy (1996).
4 For more detailed reviews of specific cases, see Temin (1987), Kaserman and Mayo (1995), Noll (1995) and Spulber (2002).
5 See, e.g., Gabel (1994) and Weiman and Levin (1994).
6 Spulber (2002), p. 479.
7 See Temin (1987), p. 10. See also Gabel (1994), who notes that it was AT&T's apprehension regarding potential DOJ objections to its aggressive acquisition policy that led it to temper the means by which it made purchases. (p. 558)


While in hindsight the pro-competitive aspect of this open interconnection policy is transparent and obvious, the regulatory infrastructure, acting on its own, proved incapable of advancing this pro-competitive policy effort. For example, in Hush-a-Phone the Federal Communications Commission (FCC) allowed the Bell system to insulate itself from even innocuous coupling devices that attached to the customer's telephone.8 Even subsequent to the court's desire to see open interconnection, it was observed that "the Bell companies continued to interpret this exception as narrowly as possible and continued their general refusal to interconnect."9 Thus, in the face of advocacy by incumbent providers, the FCC has demonstrated a proclivity to insulate the regulated telecommunications system from competition.

Long distance. Attempts to obtain competitive entry by upstart long distance providers (e.g., MCI and Sprint) began as early as 1963. Naturally, the incumbent Bell system opposed this competitive entry. Less defensible was the fact that regulatory agencies at both the federal and state levels consistently supported the incumbent's position to deny or delay entry. In fact, it was not regulators but rather an appeals court that in the 1977 Execunet case broke open the ability of new long distance rivals to enter the long distance market.10 Even when entry was granted, the FCC continued to allow discriminatory dialing patterns that retarded entry and competitive growth of these new competitors until the antitrust court incorporated "equal access" in the Modification of Final Judgment in 1982.11

The Public Antitrust Case Against the Bell System. As noted, the emergence of competition in the long-distance arena was largely prompted by a pro-competitive court ruling that overturned the protectionist policies of the FCC. The ostensible opening of long-distance service to competition, however, did not eliminate the myriad tools available to the Bell System to stifle this nascent competition. The result was that, despite the nominal presence of a "pro-competitive" FCC, the Bell System was able to act with virtual impunity toward new entrants. As amply documented elsewhere, the vertically integrated AT&T was able to deny, denigrate and delay the provision of access facilities to new entrants. Even when access to local exchange facilities was made available, it was done "without regard to costs." And the FCC continued to permit dialing patterns that imposed significant asymmetric burdens on consumers of new entrants relative to those of the incumbent. MCI brought a private antitrust case against AT&T and prevailed.12

8 See Hush-A-Phone Corporation et al. v. United States of America and Federal Communications Commission, et al., 238 F. 2d (1956).
9 Kahn (1988), p. 142.
10 MCI Telecommunications Corp. v. F.C.C., 561 F.2d 365 (D.C. Cir. 1977).
11 For empirical evidence of the positive impact of equal access on the number of long-distance firms, see Kahai, Kaserman and Mayo (1995).
12 MCI Communications v. American Telephone and Telegraph Company, 708 F. 2d 1081 (1983).


Additionally, in 1974 the Department of Justice filed suit against the Bell System for violating the antitrust laws. In bringing the case, the Deputy Assistant Attorney General, William Baxter, was forced to wrestle with the potential immunities and "pro-competitive" impacts that might exist as a consequence of regulation of the Bell System at both the federal and state levels. His conclusion, both clear and strident, was that the regulatory apparatus, however well-intentioned, is simply incapable of adequately ensuring competitive behavior from an entity as complex as the vertically integrated Bell System. The incumbent's predictable claims of the procompetitive powers of regulators largely proved vacuous at trial and, with the trial not evolving as preferred by the Bell System, negotiations with the DOJ ensued, leading, ultimately, to an agreement to structurally separate the Bell System. The result was the elimination of the incentives of the owner of local exchange facilities to discriminate against downstream firms in the long-distance market and a veritable explosion of competition. Importantly, a lesson from the case, which has subsequently been dubbed "Baxter's Law," is that a viable overlay of antitrust policy is necessary to achieve competition even in regulated markets.

Interestingly, even strident critics of antitrust policy seem to concede its valuable role in promoting competition in this case.13 For instance, Crandall and Winston (2003), who offer a broad assault on the effectiveness of antitrust policy in general, acknowledge the value of the DOJ's case because, while the FCC could have established pro-competitive rules of interconnection that would have permitted new entrants to thrive, it did not do so. Rather, as Crandall and Winston note, the FCC "was trying to block MCI from competing in ordinary long distance services when the AT&T case was filed by the DOJ in 1974." Thus, they concede that "antitrust policy did not triumph in this case over restrictive practices by a monopolist to block competition, but instead it overcame anticompetitive policies by a federal regulatory agency." (emphasis added)

In summary, looking back at the history of the industry, we see that incumbent regulated firms have offered a variety of defenses for practices that have ultimately proven to be anticompetitive. Importantly, while each of these "justifications" for preserving incumbent telephone companies' monopoly positions has ultimately been shown to be unwarranted, they have, nonetheless, each at one time or another found a sympathetic audience with regulators. Most often, it has been the presence of courts, informed by pro-competitive antitrust principles, that has exposed these defenses as fraudulent. While a full discussion of these arguments is beyond the scope of the present paper, they include at least the following: (1) "We shouldn't be forced to interconnect because of degradation to the network;" (2) "We shouldn't be required to interconnect because a single provider is most efficient;" (3) "Our refusal to deal is not anticompetitive because 'the government made us do it';"

13 This broader debate on the effectiveness of antitrust policy can be found in Crandall and Winston (2003) and Baker (2003).


(4) "We're trying to interconnect in a non-discriminatory fashion but it's hard;" (5) "We couldn't be behaving anticompetitively because government regulation eliminates our ability to do so;" (6) "Our behavior must be interpreted as competitive because we had no anticompetitive reason to refuse to deal or discriminate against downstream rivals;" (7) "Technological change has rendered our monopoly impotent so our behavior can only be seen as competitive;" and, (8) "Even if we retain control over vital inputs and may engage in conduct to maintain this control in a manner that might be thought to be anticompetitive, society should permit these practices because of the 'larger benefits' to society from enhanced investment and technological change that are afforded by permitting the network to remain closed."14

Elements of these often fraudulent defenses linger even today throughout the regulatory infrastructure, providing prima facie evidence either: (1) that regulators are ill-equipped to discern fraudulent from bona fide arguments regarding competition; or, (2) that, despite the ability to see through arguments designed to insulate incumbents, "regulation is acquired by the industry and is designed and operated primarily for its benefit."15 Regardless of the explanation, however, it does speak to the important, pro-competitive role that may be played by the sound application of antitrust principles in this industry.

III. Recent Regulatory and Antitrust Developments

By all accounts, the Telecommunications Act of 1996 represented a compromise that granted the Regional Bell Operating Companies (RBOCs) the relief they sought from the line-of-business restrictions embedded in the Modification of Final Judgment in exchange for the full opening of local exchange networks and services to competition. The novel vision of this Act was that by doing so the twin goals of advancing competition and advancing deregulation could be achieved. The belief was that, with proper implementation of the Act, monopoly of the local network would end as a series of side-by-side vertically integrated competitors arose, each providing a panoply of services, with none denied the ability to do so by foreclosure from key elements of the local exchange network.16

14 For an extended discussion of the possible efficiency arguments that strengthen the case for "close" vertical relations, as well as the holes in these arguments, see Farrell and Weiser (2003).
15 See Stigler (1971), p. 3. See also Kahn (1990), who observes that "The essence of the regulatory approach…is the acceptance of a single company (or a select set of existing companies) as society's chosen instrument for performing the services in question." (Vol. II, p. 114) (emphasis in original)
16 See, e.g., Noll (1995), p. 528.


If the aims of the Act were achieved, the structure of the industry would become dramatically more competitive, and with an effectively competitive local exchange telecommunications market in place, the role of policy could recede to a standard antitrust backstop.17 The evolution of the industry and policy has, however, not unfolded as envisioned. In the marketplace, there was an initial enthusiasm for, and actual, entry into local exchange markets. This entry principally relied upon a business model of utilizing (initially, if not for an indeterminate period of time) leased components of the incumbents' extant network. Subsequent to the initial wave of entry, however, a combination of factors falling alternatively into the "shakeout" and "shakedown" categories conspired to dampen that enthusiasm.18 At the same time that the vitality from such "intramodal" competition has receded, the unanticipated (at least by 1996 standards) prospect of "intermodal" competition from wireless and cable-based telephony has begun to surface. The degree to which such intermodal competition serves to restrain the incumbent local exchange carriers' market power is currently a subject of considerable scrutiny and debate.19 Nonetheless, it is clear that the success of competition is at a particularly delicate moment in the history of the industry. Given this backdrop, the ability to preserve or advance the pro-competitive goals of the Act through the application of antitrust principles has arguably become increasingly important.

Unfortunately, while a clear, strong and effective antitrust backstop to anticompetitive backsliding is especially important at this juncture in the history of the telecommunications industry, recent antitrust enforcement has fallen short of this pro-competitive potential. Indeed, dispassionate observers have noted that the federal courts are "currently in complete disarray about the proper role that antitrust should play in bringing competition to local telephone service."20 A variety of cases, all involving in one way or another the use by an incumbent local exchange carrier (ILEC) of its dominant control over network facilities to disadvantage new entrants, has generated no consensus regarding the role that antitrust might play in complementing the Act's goals to promote competition.

The prospect that this unsettled question of the role of antitrust in the deregulating telecommunications industry would be resolved by the Supreme Court arose in the recent case of Verizon Communications v. Law Offices of Curtis V. Trinko, LLP.21 In this case, it was argued that Verizon's failure to provide network elements to AT&T as required by the Telecommunications Act under Section 251 constituted an attempt to monopolize in violation of Section 2 of the Sherman Act. The case came

17 As noted by Hovenkamp (2003, p. 631), "One natural consequence of deregulation has been the expanded application of antitrust laws."
18 See Burton, Kaserman and Mayo (2003) for a discussion.
19 For an econometric assessment, see Kahai, Kaserman and Mayo (2005).
20 Hovenkamp (2003), pp. 631-632.
21 Verizon Communications Inc. v. Law Offices of Curtis V. Trinko, LLP, U.S., No. 02-682, 1/13/04.


to the Supreme Court after a district court finding that the alleged practice was out of the reach of the antitrust laws because the Sherman Act generally does not impose a duty to cooperate with competitors. The Second Circuit, however, ruled that Verizon's behavior may create an antitrust liability either because of the denial of access to essential facilities or as a consequence of monopolistic leveraging tactics.

As in other cases involving the application of antitrust to regulated industries, the Court noted that there is an issue of whether the existence of regulation limits the applicability of antitrust enforcement.22 In this case, however, the Court was compelled to take into account the legislative guidance from the Telecommunications Act, which incorporates an antitrust "Savings Clause" that states: "[N]othing in this Act or the amendments made by this Act shall be construed to modify, impair, or supersede the applicability of any of the antitrust laws."23 The plain language of this clause, together with a clear legislative history of the intent of the clause, would seem to make the issue of the tandem applicability of antitrust and regulatory enforcement obvious.24 The Court, however, considers the "savings clause" as potentially limiting the applicability of antitrust: any antitrust application that "creates new claims" is seen by the Court to "modify" existing law and is therefore outside the reach of antitrust enforcement.

The Court then turns to the question of whether the allegations made regarding the anticompetitive conduct of Verizon are within the reach of existing antitrust principles. To frame this discussion, the Court first acknowledges that "it is settled law" that the crime of monopolization requires two elements: (1) the possession of monopoly power in the relevant market; and, (2) the willful acquisition or maintenance of that power as distinguished from growth or development as a consequence of a superior

22 For a discussion, see, e.g., Kaserman and Mayo (1995), pp. 588-594.
23 Section 601(b)(1), Telecommunications Act of 1996.
24 In its review of the legislative history of the Telecommunications Act, the 11th Circuit Court noted: "Our examination reveals that, rather than pre-emptive language, Congress specifically and directly stated that the [Telecommunications and Sherman] Acts were intended to be used in tandem to accomplish the congressional goals served by both acts – namely, the stimulation of competition." Covad Communications Company vs. BellSouth, August 2, 2002. As noted by Rep. Conyers in the development of the legislative history of the Telecommunications Act, "Application of the antitrust laws is the most reliable, time-tested means of ensuring that competition, and the innovation that it fosters, can flourish to benefit consumers and the economy." 142 Cong. Rec. H1145-06 (daily ed. February 1, 1996) (statement of Rep. Conyers). See also the President's statement upon the signing of the Telecommunications Act: "The Act's emphasis on competition is also reflected in its antitrust savings clause. This clause ensures that even for activities allowed under or required by the legislation, or activities resulting from FCC rulemaking or orders, the antitrust laws continue to apply fully." Statement by President William J. Clinton upon Signing S. 652, 32 Weekly Comp. Pres. Doc. 218 (February 8, 1996), reprinted in 1996 U.S.C.C.A.N. 228-1, 228-3. Thus, both the legislative and executive branches recognized that the antitrust laws would coexist alongside extant regulation.


product, business acumen, or historical accident.25 Rather than engaging in the corresponding two-part inquiry that has become the trademark of monopolization cases following Grinnell, however, the Court frames the case narrowly as turning on the lessons from established case law regarding "duty to deal." In this regard, the Court notes that the precedent is that firms generally do not have a duty to deal with competitors. Moreover, the Court is unpersuaded that the case at hand is covered by the "limited exception" provided by Aspen Skiing.26 The critical rationale the Court uses to exclude the applicability of Aspen to the telecommunications industry is that in Aspen the challenged conduct involved a deviation from an established cooperative arrangement between competitors. In contrast, Verizon (like all incumbent local exchange companies) had traditionally acted as a monopoly provider within its franchise territory and, as such, did not have an established pattern of cooperation with competitors. Thus, the Court views that the conduct in question – failure to provide particular network elements in a nondiscriminatory fashion – cannot be judged within established antitrust principles. The Court states that "[T]he defendant's prior conduct sheds no light upon the motivation of its refusal to deal – upon whether its regulatory lapses were prompted not by competitive zeal but by anticompetitive malice."27 Consequently, the Court concludes that "Verizon's alleged insufficient assistance in the provision of services to its rivals is not a recognized antitrust claim under the Court's existing refusal-to-deal precedents."28 The Court then goes the extra step of noting that the need to add Trinko to the "few existing exceptions from the proposition that there is no duty to aid competitors" is unwarranted in this case because "Antitrust analysis must always be attuned to the particular structure and circumstances of the industry at issue."29 In this regard, the Court notes its belief that the "existence of a regulatory structure designed to deter and remedy anticompetitive harm" limits the benefits of antitrust enforcement.

IV. Economic Fallacies of Trinko

Unfortunately, the Court's opinion in Trinko suffers from significant deficiencies, both when taken on its own merits and, even more problematically, in conjunction with the nation's aim to "uproot monopolies" in the telecommunications industry. These deficiencies fall into two general categories. First, the Court errs in its affirmative antitrust analysis, which it views as deterministic.

25 Trinko, p. 7, citing United States v. Grinnell Corp., 384 U.S. 563, 570-571 (1966).
26 Aspen Skiing Co. v. Aspen Highlands Skiing Corp., 472 U.S. 585, 601 (1985). (Hereafter Aspen)
27 Trinko, p. 9.
28 Trinko, p. 11.
29 Trinko, p. 11.


Second, the Court errs in its perception of the costs of antitrust enforcement and the benefits of regulation, at least as applied to the telecommunications industry. We treat these in turn.

A. The Court's Antitrust Analysis

1. The Court's framing of the case within the narrow context of a "duty to deal" is too restrictive. The behavior in question before the Court might have been accurately framed from a variety of alternative perspectives that would arguably fall under the reach of existing antitrust standards. For example, had the Court chosen to frame the alleged behavior in the context of "monopoly leveraging," then the discriminatory provision of network elements to downstream rivals would have found itself on sound legal and economic footing.30 Similarly, the Court may have reasonably chosen to see competitive local exchange companies (CLECs) – even those that relied on the purchase of unbundled network elements (UNEs) – as potential entrants into the network provisioning of local exchange telephone service. In this case, Verizon's conduct may have been seen as erecting economic barriers to entry in "willful maintenance" of its monopoly power.31 Finally, the Court's "duty to deal" perspective is especially curious in light of the fact that the alleged conduct was not a refusal to deal but rather that Verizon's poor provisioning of network elements harmed competition in the retail market by degrading the quality of inputs it provided to retail-stage rivals. In this regard, the case may have been more appropriately considered as a case of "sabotage," which is related to the established antitrust theory of "raising rivals' costs."32

2. Even within a duty to deal framework, both the Act and existing legal precedent make the alleged practice well within the reach of antitrust enforcement. The Telecommunications Act of 1996 establishes a clear duty to deal. For example, Section 251 of the Act states that ILECs have a "general duty" to interconnect and to provide "for the facilities and equipment of any requesting telecommunications carrier, interconnection…that is at least equal in quality to that provided by the local exchange carrier to itself…"33 Given, then, this clear obligation established in the Act, the only question is whether deviations are potentially anticompetitive or, alternatively, innocuous "regulatory lapses." Such an inquiry would surely seem entirely appropriate fodder for the antitrust courts.34

30 See, e.g., Kaplow (1985) and Sullivan and Jones (1992).
31 For a recent discussion, see Farrell and Weiser (2003), pp. 109-111.
32 See Beard, Kaserman and Mayo (2001), Salop and Scheffman (1983), and Krattenmaker and Salop (1986).
33 Telecommunications Act of 1996, Section 251.
34 In Section V infra, we address the issue of whether the merits of opening such conduct to antitrust scrutiny may be tempered either by the technical nature of such inquiries or by a flood of cases that bog down the courts.


Even ignoring the competitive benchmarks established by the Act, existing antitrust principles have already framed the refusal to deal issue in a way that would seem to clearly incorporate the alleged conduct by Verizon. Indeed, elsewhere the Court itself framed the duty to deal issue considerably more broadly than in Trinko, stating that the right to refuse to deal with competitors is not absolute, but rather "[E]xists only if there are legitimate competitive reasons for the refusal."35 Whether, then, an ILEC's behavior is motivated by "legitimate competitive reasons" or by monopolistic intent would seem to be exactly the sort of issue for which the antitrust statutes are designed.

3. Within the overly restrictive confines of a duty to deal framework, the Court relies on the narrowest of interpretations of how firm conduct may be accurately deduced. In particular, the Court found a critical distinction between "pre" and "post" behavior in the Aspen and Trinko cases. In Aspen, the defendant was found to be initially engaged in a cooperative venture with its competitor. This cooperative behavior ceased, and from this cessation the Court was able to infer anticompetitive intentions. In the case at hand, the Court correctly notes that Verizon had no prior track record of cooperation with competitors.36 Rather than focusing on the "post" behavior effects of Verizon's conduct, however, the Court merely notes that in the absence of a cooperative period between Verizon and its rivals, it is unable to judge whether the conduct was motivated by competitive zeal or anticompetitive malice. In taking this position, the Court ignores a myriad of alternative benchmarks (aside from prior cooperative behavior) against which it is possible to judge whether the alleged conduct willfully maintained Verizon's monopoly power. That is, the fact that there was no prior history of cooperation does not eliminate all benchmarks against which to judge the behavior. Indeed, there is in fact a rich literature in antitrust/regulatory economics regarding the monopoly-preserving incentives for such sabotage of rivals by regulated firms.37 For example, economic theory indicates that a regulated vertically integrated firm with monopoly power upstream will have an economic incentive to sabotage downstream rivals, with the consequence of preserving its monopoly power by excluding equally efficient rivals. Moreover, the historical patterns in this industry, as validated by the antitrust courts, have repeatedly demonstrated the propensity of incumbents with control over upstream facilities to engage in anticompetitive conduct through price and non-price anticompetitive exclusion.

The non-critical nature of the existence (or lack thereof) of prior cooperative arrangements can be gleaned not only from the economic literature but also from the

35 Eastman Kodak, 504 U.S. at 483 n. 32.
36 As noted, supra, however, this cooperative benchmark was established in the plain language of the Telecommunications Act.
37 See, e.g., Economides (1998), Beard, Kaserman and Mayo (2001) and Sappington and Weisman (2005).


courts. Indeed, the Eleventh Circuit has noted that prior cooperative arrangements "are not pre-requisites for a refusal-to-deal claim," observing that "Other cases, such as Otter Tail, involve a refusal to deal where there has been no prior arrangement, and a vertically integrated monopolist that refuses to deal with a customer to foreclose competition in a second market may violate section 2."38

4. Where a duty to deal is established, whether by falling under the set of exceptions or by an alternative legally recognized standard – here the Act – deviations in the form of the prices, terms and conditions charged to rivals become incongruous with the monopolist's obligations. Indeed, as noted in Terminal Railroad, where the duty to deal is established, the monopolist must deal "upon such just and reasonable terms and regulations, as will, in respect to use, character, and cost of service, place every such company upon as nearly an equal plane as may be with respect to expenses and charges as that occupied by the proprietary companies."39 Thus, deviations in the provision of inputs provided to one's own downstream operations and to competitors would, in the context of a properly established duty to deal, fall well within the reach of existing antitrust precedent.

B. The Court's Assessment of the Costs of Antitrust and the Benefits of Regulation

While the Trinko opinion makes clear that the Court views its "duty to deal" analysis as sufficient, the Court nonetheless adds supporting language by pointing to both the "existence of a regulatory structure" and the "realistic costs" associated with antitrust. Upon review, neither argument supports the exclusion of antitrust from the telecommunications industry.

Specifically, the Court opines that "Antitrust analysis must always be attuned to the particular structure and circumstances of the industry at issue."40 In this regard, the Court then notes that a factor "of particular importance" in this industry is "the existence of a regulatory structure designed to deter and remedy anticompetitive harm."41 A variety of considerations, however, temper, if not completely contradict, the Court's pro-regulation, anti-antitrust rhetoric. The "Savings Clause" unequivocally recognizes the dual role and applicability of antitrust and regulation, thus making the Court's observation of the "existence of a regulatory structure" irrelevant to the issue of the legitimate applicability of antitrust enforcement.

38 Covad Communications Company v. BellSouth, August 2, 2002, p. 12.
39 United States v. Terminal R.R. Ass'n of St. Louis, 224 U.S. 383, 411, 56 L. Ed. 810, 32 S. Ct. 507 (1912).
40 Trinko, p. 11.
41 Trinko, p. 12.


As demonstrated during the 1980 antitrust case against AT&T, the mere "existence" of regulation has been shown to be ineffective in dealing with complex anticompetitive practices. Indeed, this was a principal argument in the antitrust suit against the Bell system that led to its divestiture in 1984. Thus, while it may generally be true that "where such a structure exists, the additional benefit to competition provided by antitrust enforcement will tend to be small…",42 the Court fails to heed its own advice with respect to the telecommunications industry that "Antitrust analysis must always be attuned to the particular structure and circumstances of the industry at issue." The "particular structure and circumstances" in this industry have repeatedly demonstrated a propensity for anticompetitive exclusion.

In its only nod to the specifics of this industry's "structure and circumstances," the Court points out that the regulatory structure held out "entry into the potentially lucrative market" for long-distance service until the LEC had demonstrated "good behavior in the local market." But it has also been pointed out elsewhere that once allowed into the long distance market, the incentive for good behavior is severely diluted.43 More generally, it has been demonstrated in the literature that there are incentives within the existing regulatory structure for anticompetitive conduct.44 Indeed, it is the very existence of the regulatory structure that may, in certain circumstances, provide the impetus for anticompetitive behavior.45

The comfort the Court takes from the presence of regulation is squarely at odds with the dynamic that Congress intended in the passage of the Telecommunications Act. Specifically, both the plain language of the Act and its legislative history make it clear that the goal of the Act was not to rely more heavily on regulation in the future but less. As Sen. Thurmond stated, "[W]e must rely on the bipartisan principles of antitrust law in order to move as quickly as possible toward competition in all segments of the telecommunications industry, and away from regulation."46 In contrast to the clear intent of Congress, the Court seems to take considerable comfort in the presence of regulation. The Court points to financial penalties that may be imposed by regulators in the event of failure to deal. Unlike the antitrust courts, however, regulatory bodies may generally only impose fines consistent with actual – not treble – damages. The result is that the threat of antitrust liability swamps the threat from regulatory penalties.

42 Trinko, p. 12.
43 See, e.g., Schwartz (2001).
44 See, e.g., Beard, Kaserman and Mayo (2001), Sappington (2005).
45 See Beard, Kaserman and Mayo (2001).
46 142 Cong. Rec. S687-01 (daily ed. February 1, 1996) (Statement of Sen. Thurmond).


Beyond its arguably misplaced faith in the power of the regulatory structure to deter anticompetitive conduct, the Court also engages in a critique of antitrust as a procompetitive policy tool, pointing to "a realistic assessment of its costs." The Court sees the costs as stemming from two sources. First, the Court notes that applying the Sherman Act "can be difficult" because the means of illicit behavior are manifold. Thus, the Court would appear to argue that detection of violations is costly. As a matter of pure economic logic, the Court is surely correct: with sufficiently high costs of detection – relative to the deleterious consequences of the anticompetitive behavior – the optimal policy is non-enforcement. Substantive questions, however, very naturally arise regarding the Court's preference to avoid these costs. Specifically, because the Court ultimately is expressing a preference for regulatory, rather than antitrust, enforcement, the relevant question cannot be the absolute cost of antitrust enforcement but rather centers on the cost of antitrust enforcement relative to the costs of regulatory detection and enforcement. In this regard, it cannot be unequivocally asserted that the costs of antitrust enforcement exceed those of regulation. Moreover, while the Court correctly notes the potential for costs of enforcement, it fails to recognize the potential for pro-competitive benefits to arise from antitrust enforcement.

The second source of the costs associated with antitrust enforcement identified by the Court is seen to stem from the possibility of "mistaken inferences and the resulting false condemnations" by antitrust enforcers. In identifying these costs, the Court relies upon the assumption that antitrust will be misapplied. While such misapplications surely would impose costs, similar misapplication of regulation would surely prove equally damning. Thus, it would seem that the Court simply is expressing a general distrust of antitrust enforcement relative to regulation.

V. Restoring the Balance: A Proposal for an Antitrust Foundation for a Deregulating Telecommunications Industry

The Supreme Court's recent Trinko opinion has created a significant gap in the ability to accomplish the "pro-competitive" portion of the "pro-competitive, deregulatory" framework sought by the Telecommunications Act. Specifically, while the Court should arguably have seen Verizon's behavior as well within the reach of traditional antitrust principles, it did not do so. Accordingly, if, in fact, Congress does wish to have an active oversight and enforcement role for antitrust in the deregulating industry, a legislative remedy is now required. To date, the sole legislative proposal for remedying the lacuna created by Trinko has been offered by Representatives Sensenbrenner and Conyers. Specifically, in H.R. 4412, introduced in the 108th Congress, a proposal was made to clarify the Clayton Act by making it

[U]nlawful for an incumbent local exchange carrier or an affiliate to create or to preserve (or to attempt to create or to preserve) a monopoly in any part of commerce by using its network (or by providing a service over its network) to engage in an anticompetitive practice (which may include a failure to comply with either section


251(c) or 271 of the Communications Act of 1934 or with any agreement, rule, or order in effect under such section).

This clarification would, if passed, ensure that attempts to "create or preserve" a monopoly by local exchange companies through the use of their ownership of network facilities would be subject to antitrust liability. Importantly, the bill incorporates the possibility that failures to comply with the interconnection provisions of the Telecommunications Act [Sections 251 and 271] may constitute antitrust violations.

VI. Potential Criticisms Considered

The description here of a legislative remedy to the void created by Trinko can be anticipated to generate several criticisms. Upon careful scrutiny, none rises to the level of a serious impediment to the value added of such legislation.

First, as noted above, the prospect of a heightened role for antitrust oversight and, where necessary, enforcement can be expected to be met with the claim that such protections are unnecessary because a regulatory structure is already in place. As we have described, however, the existence of such a regulatory structure in the telecommunications industry has historically proven relatively impotent in eliminating anticompetitive problems. Indeed, historically, it has been the courts, often with pro-competitive antitrust principles in tow, that have overcome regulators' propensities to protect incumbent firms.

A second potential criticism is that the cases arising from enforcement of Section 251 of the 1996 Telecommunications Act by the courts would be "both technical and extremely numerous," thus creating the potential to bog down the courts. Here any dispassionate observer of the controversies before regulatory bodies over the past eight years must express a sympathetic notion that these regulatory debates not be extended to court. A corollary to this argument is that we don't want to turn the antitrust courts into regulatory bodies. The administrative load under the existing regulatory structure – without any new antitrust legislation – should not, however, be used as a reason to fail to apply antitrust. Rather, the sensible path is to construct competitive standards for cooperation that are rooted not in the minutia of the interconnection language of Section 251 but instead in antitrust principles that have proven robust for over a century.

Another likely criticism is that the proposed legislation does not reflect today's technologically dynamic industry, which has "eliminated the monopoly problem." This criticism raises two concerns. First, it raises the question of whether antitrust can be applied in technologically dynamic industries in a way that promotes rather than stifles competition. A full treatment of this debate is outside the scope of this paper, but the recent history of antitrust practice suggests that the statutory principles laid out decades ago have proven quite robust to the advancement of


technology.47 Second, the charge raises the factual issue of whether rapid technological change has rendered the historical monopoly power enjoyed by the ILECs moot. This is, of course, a factual matter and one that cannot be decided by mere assertion. Moreover, if the assertion is true, then it would seem that the incumbent firms would have nothing to fear from exposure to antitrust enforcement.

Finally, it may be argued that support of an antitrust principle to assure nondiscriminatory access to UNEs is unwarranted when regulators have increasingly found regulatory mandates for the provision of UNEs unnecessary. But this ignores the more general prophylactic effects of antitrust, even in a model of intermodal competition. Indeed, looking forward, it may reasonably be anticipated that claims of denial, denigration or delay of new-generation broadband facilities will arise that will warrant the prospect of viable antitrust scrutiny and enforcement.

47 See, for instance, Klein, Joel, "Rethinking Antitrust Policies for the New Economy," speech presented May 9, 2000. (Available at www.usdoj.gov/atr/public/speeches/4707.htm)

VII. Conclusion

In this paper, we have examined the historical and recent evolution of antitrust policy in the telecommunications industry. While the interplay of regulation and antitrust in this industry has historically created more competition than either policy acting alone, the Supreme Court's recent Trinko opinion has dealt a significant blow to the tandem application of the two policy instruments. We have critically examined the Court's Trinko opinion and found it wanting in a number of respects. Given the Court's position as the final interpreter of extant law, the paper points toward recent legislative efforts to remedy the lacuna created by Trinko.

References

Baker, Jonathan B., "The Case for Antitrust Enforcement," Journal of Economic Perspectives, Vol. 17, Fall 2003, pp. 27-50.
Beard, T. Randolph, David L. Kaserman, and John W. Mayo, "Regulation, Vertical Integration and Sabotage," Journal of Industrial Economics, Vol. 49, September 2001, pp. 319-334.
Burton, Mark, David L. Kaserman, and John W. Mayo, "Shakeout or Shakedown? The Rise and Fall of the CLEC Industry," in Michael A. Crew (ed.), Markets, Pricing, and Deregulation of Utilities, Kluwer Academic Publishers, 2002.
Crandall, Robert W. and Clifford Winston, "Does Antitrust Policy Improve Consumer Welfare?" Journal of Economic Perspectives, Vol. 17, Fall 2003, pp. 3-26.
Economides, N., "The Incentive for Non-Price Discrimination by an Input Monopolist," International Journal of Industrial Organization, Vol. 16, May 1998, pp. 271-284.
Farrell, Joseph and Philip J. Weiser, "Modularity, Vertical Integration and Open Access Policies: Towards a Convergence of Antitrust and Regulation in the Internet Age," Harvard Journal of Law & Technology, Vol. 17, Fall 2003, pp. 85-134.
Gabel, D., "Competition in a Network Industry: The Telephone Industry, 1894-1910," The Journal of Economic History, Vol. 54, September 1994, pp. 543-572.
Hovenkamp, Herbert, "Antitrust as Extraterritorial Regulatory Policy," Antitrust Bulletin, Vol. 48, Fall 2003, pp. 629-655.
Kahai, Simran, David L. Kaserman and John W. Mayo, "Deregulation and Predation in Long-Distance Telecommunications: An Empirical Test," Antitrust Bulletin, Vol. 40, Fall 1995, pp. 645-666.
Kahai, Simran, David L. Kaserman and John W. Mayo, "Estimating Monopoly Power in Regulated Markets: The Case of Local Exchange Telephone Service," Working Paper, September 2005.
Kahn, Alfred E., The Economics of Regulation, MIT Press, Cambridge, MA, 1988.
Kaplow, Louis, "Extension of Monopoly Power Through Leveraging," Columbia Law Review, Vol. 85, 1985, pp. 515-555.
Kaserman, David L. and John W. Mayo, Government and Business: The Economics of Antitrust and Regulation, Dryden Press, Fort Worth, TX, 1995.
Krattenmaker, Thomas G. and Steven C. Salop, "Anticompetitive Exclusion: Raising Rivals' Costs to Achieve Power over Price," Yale Law Journal, Vol. 96, December 1986, pp. 209-293.
MacAvoy, Paul W., The Failure of Antitrust and Regulation to Establish Competition in Long-Distance Telephone Service, MIT Press and AEI Press, Cambridge, MA, and Washington, DC, 1996.
Noll, R., "The Role of Antitrust in Telecommunications," Antitrust Bulletin, Vol. 40, Fall 1995, pp. 501-528.
Salop, Steven and David Scheffman, "Raising Rivals' Costs," American Economic Review, Vol. 73, May 1983, pp. 267-271.
Sappington, D. and D. Weisman, "Self-Sabotage," Journal of Regulatory Economics, Vol. 27, No. 2, March 2005, pp. 155-175.
Schwartz, Marius, "The Economic Logic for Conditioning Bell Entry into Long Distance on the Prior Opening of Local Markets," Journal of Regulatory Economics, Vol. 18, November 2000, pp. 247-288.
Spulber, D., "Competition Policy in Telecommunications," in Martin E. Cave, Sumit K. Majumdar and Ingo Vogelsang (eds.), Handbook of Telecommunications Economics, Elsevier, 2002.
Stigler, George J., "The Economic Theory of Regulation," Bell Journal of Economics and Management Science, Vol. 2, Spring 1971, pp. 3-21.
Sullivan, Lawrence A. and Anne I. Jones, "Monopoly Conduct, Especially Leveraging Power from One Product Market to Another," in T.M. Jorde and D.J. Teece (eds.), Antitrust, Innovation and Competitiveness, Cambridge University Press, 1992.
Temin, P., The Fall of the Bell System, Cambridge University Press, 1989.
Weiman, D. and R. Levin, "Preying for Monopoly? The Case of Southern Bell Telephone Company, 1894-1912," Journal of Political Economy, Vol. 102, February 1994, pp. 103-126.


7.2 Statement

Prof. Stephen C. Littlechild
Judge Institute, University of Cambridge, UK

I am honored to be invited to speak here but admit to some apprehension, because I am by no means an expert on telecommunications. My comments are more in the nature of cautionary remarks about what might realistically be expected of regulation, what it has or has not done in the past, and therefore how it might play a role in the future.

I am struck by how fast and how radically the whole telecommunications field has changed since I first looked at it. I met Eli Noam here today for the first time in nearly 40 years. We used to go to conferences in the late 1960s and early 1970s, and the things that we talked about at those conferences have not made an appearance today. And I don't think there is anything that was talked about today that would have appeared at those earlier conferences. So there has been a complete change, driven mainly by actual and prospective changes in technology. In contrast, I have spent most of the last 15 years in the energy sector, and although the policy framework there has changed – for example, to incorporate competition – I do not think the technology has changed at all significantly.

This has implications for regulation, because regulation does not move that fast. In the light of this I am not sure regulation is so appropriate for telecommunications. Regulation tends to focus on simple concepts, moves rather slowly, and is often concerned with yesterday's problems rather than tomorrow's. It is not necessarily driven by a wish to facilitate the kinds of technical and technological developments that we heard about today. It is also influenced by the concerns of particular companies that may feel they are losing out, or of particular consumer groups that may be harmed by particular developments. Those concerns, as much as anything, drive a lot of regulation, rather than more altruistic objectives. In fact, one of the main developments over the last 30 or 40 years has been a greater understanding, by economists at least, of what drives regulation.

The question I was given for this session is: How much and what kind of regulation will be needed in the networked world of tomorrow? Clearly it depends, as I have indicated, on the objectives of the policy. Many technological and economic developments are very difficult to foresee. We have also heard that the situation may differ from one country to another. But there are at least a couple of things one might say in the way of answering the question.


How much regulation will be needed? Less than we will actually get. What kind of regulation will be needed? More imaginative than we will actually get. So you can gather that I have a few rather skeptical remarks to make about regulation. I want to illustrate those remarks with some particular issues that I happen to have been involved in, though they are by no means representative of the broad range of regulatory issues of yesterday, today, or tomorrow. Then I want to suggest a few implications for policy.

1. Initial regulation of British Telecom

When British Telecommunications was privatized in the UK in 1984, the question was raised: do we need regulation? One thought was: perhaps we can rely on competition alone and don't need regulation. However, the view was taken that, although there might be competition in the future, it was not there yet, and until then some regulation was needed.

What really drove it was the feeling that you couldn't sell a company to investors if they didn't know what the company could do. They needed to know what prices it could charge, and to have some assurance that it could do this and shouldn't do that, so that potential investors could assess the company's prospects. Similarly for the customers: they had had a nationalized, publicly owned provider for as long as they could remember. It was presumed to act in their interest – whether it did or not is another matter. But if it were turned over to private-sector investors, wouldn't they simply put up the prices and exploit the customers? Answer: not if the government can provide some sort of reassurance. In order to make the policy feasible it was therefore necessary to provide some assurance to consumers as well.

The price cap that I proposed became known as RPI-X: the retail price index (RPI) minus a specified number X. This was intended as an alternative to US-style rate-of-return regulation, which economists at that time widely felt to be not conducive to efficiency or innovation, and to be rather intrusive. That was the genesis of the regulation.

When I proposed the price cap, I didn't want to go down in history as the man who invented a new price control. I wanted to help free up the industry, but I had to start by putting a price control in place, as the lesser of two evils. I thought it would be needed for maybe four or five years; then we would see how much competition had developed, and leave any further arrangements to the regulator. The presumption was that competition would develop – this was above all the kind of industry in which there could be innovation and competition. So it would be possible to get rid of the price cap. But in the event it is still there, more than 20 years later. Some say that the present control is not very serious and doesn't actually constrain the company very much. But it is no longer the only price control; there are now about 30 of them. They have mushroomed instead of disappearing.
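In stylized form (with notation introduced here purely for illustration), the mechanics of such a cap can be written in one line: the weighted average of the firm's prices, \( \bar{p}_t \), may rise by no more than inflation less the efficiency factor,

\[
\bar{p}_t \;\le\; \bar{p}_{t-1}\,\bigl(1 + \mathrm{RPI}_t - X\bigr).
\]

With inflation at, say, 5 % and X set at 3, nominal prices may rise by at most 2 % – a real-terms price cut of 3 % a year, however the firm chooses to rebalance individual tariffs within the capped basket.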


That raises the question: what went wrong? First, the government did not choose to promote competition as much as it might have done; in fact, it created a duopoly for the first seven years or so. Second, the price cap began to seem a rather useful device. It could protect particular groups of customers or companies. For example, small users might have been adversely affected if a reduction in the price of calls led to an increase in the monthly subscription. A cap on the subscription rate served to protect those users. These caps became more and more detailed, with sub-caps under the caps, and so on. Every time a potential problem arose, in came a new price cap. The regulatory body has shown a reluctance to abandon any of these price caps. New ones are introduced when there are problems to be solved, but those problems never seem to go away. Competition is always just around the corner, but it is never quite safe to get rid of the price caps just yet. So they tend to multiply.

2. Restructuring to create competition

Now move forward to the present. The original regulatory body, Oftel, has been replaced by a bigger, grander body called Ofcom, which covers broadcasting as well as telecommunications. Ofcom's policy is to remove unnecessary regulation. It finds that competition is healthy in many respects. For example, there are over 300 providers of business services, and new entrants supply over 70 % of international call minutes. On the other hand, BT still has a dominant share of many markets, including basic telephony: it has 82 % of the residential access market and nearly 80 % of wholesale call origination minutes. There is thus some competition, but not as much as expected after more than 20 years.

What is to be done about this? Ofcom's explanation is that there has been equality of access in principle but not in practice. What is needed now is real equality of access.

To explain, there have been various shifts in regulatory policy over the last 20 years. In 1984, at the time of BT's privatization, a competitor called Mercury was licensed. In order to persuade Mercury to invest in a number of less populated areas where it otherwise might not have done, no other entrants would be allowed into the market. This policy would not be adopted now. But in those days, when there wasn't even one competitor, a bird in the hand was worth two in the bush. That policy lasted for about seven years.

In 1991 there was a review of the duopoly policy. The government concluded that the duopoly restriction was now inappropriate, and decided to allow competition. Other companies could provide networks, so there would be competition from network providers. In the event, such entrants did not emerge in any large way to provide network competition.


In 1997 European directives put the focus on service-based competition using access to BT's networks. Clearly that depended on being able to get adequate access to those networks. Competitors were promised access but in practice did not get it on the same basis as BT did, so they were not able to compete on equal terms. The most specific issue was access to the 'last mile' of the local loop. Competitors could not get equal access to that, and argued that it was not worth building a new local loop. Ofcom's conclusion now is that to get competition in services there needs to be better access to the local loop.

That resonates with me, because that was the policy we adopted in the energy sector from the very beginning of privatization and competition back in 1989. We had a very clear separation between the monopoly networks and the potentially competitive services over them. The monopoly businesses had to provide separate accounts, and increasingly over time they had to set up a separate company, with separate management, staff, IT and other facilities. The networks were not allowed to discriminate between the companies that competed over them – that is, between the retail company in their ownership and other retail competitors. That proved to be very effective indeed. We have probably the most effective competition in generation and in retail supply of any country in the world. There are now no price controls of any kind on generation or on retail supply. It has thus been possible to avoid or remove regulation as a result of taking a firm line on structural separation.

It seems that this could be done in telecoms too. Separating businesses in this way would be an integral part of delivering the desired equivalence of access to the networks. Of course, where to draw the line between monopoly and competitive services may differ from one time and place to another. But the basic monopoly seems to be the existing local loop. One could separate that out – not necessarily requiring separate ownership, but requiring a separate entity that provides equal access to everyone. If new investments were required in different facilities, they could be subject to competition instead of regulation. It should then be possible substantially to reduce, if not completely remove, the regulation of the other services. In other words, there are ways of reducing or removing price regulation by taking different approaches from a structural perspective.

3. Mobile termination charges

The regulation of mobile termination charges started in the UK and now applies in Europe and other countries generally, and at increasingly severe levels. The most recent controls in the UK require the mobile operators to reduce their termination charges by 15 % a year in real terms. The regulator actually proposed a reduction of 12 % a year; the companies resisted this, and the issue went to the Competition Commission, which said: 12 % is not enough, it should be 15 %. This was after a period of four years during which the operators had been required to reduce their charges by 10 % a year. This kind of control has become standard, and many European countries are now going in the same direction, as are others such as Australia and New Zealand.

On the face of it, the presence of regulation is rather odd, since there are usually several competitors in each mobile market and there seems to be competition between them, certainly in the UK. The markets are characterized by new entry, price reductions, improvements in quality, innovation and customer satisfaction. What is regulation doing here? The explanation is that the termination rates are very high and presumably indicate some market power.

Whether high termination rates are really such a problem as they seem is more questionable. The normal concern about market power is that customers are exploited: they pay higher prices than they need to, and the money goes to the company. But in the mobile market this isn't quite what it first seems, because of what has become known in the UK as the 'waterbed effect'. Under the Calling Party Pays (CPP) system, each operator has a monopoly on access to its own subscribers, and so is able to set high termination charges. High termination revenues make additional subscribers very attractive. To attract them it may be worth subsidizing the monthly subscription (rental). It may also be worth subsidizing the handset, or even giving it away free, in order to attract customers. The additional customers in turn attract incoming calls on which the operator earns high termination revenues. The consequence is that customers pay higher termination charges but also lower subscription and handset charges. It is not clear that they are much worse off overall. And reducing termination charges by regulation may simply lead to increasing subscription and handset charges – hence the term 'waterbed effect'. It is not clear that regulation will make customers much better off.

However, regulators point to what economists call allocative inefficiency. In simple terms, the distorted prices mean that too few calls are made but too many subscribers are in the system. Resources are not used in the way that customers would most prefer, and there is a net cost to this. Is that really a problem? Allocative efficiency is not the kind of thing that regulators in general worry much about. Regulators nowadays worry more about incentives to reduce costs, what is known as 'productive efficiency'. Does allocative inefficiency cost much? The UK Competition Commission and the operators did a great deal of modeling of the market and the effects of a price control. The Competition Commission said that the improved allocative efficiency would be worth about £54.4 million per quarter. But Professor James Mirrlees of Cambridge, a Nobel Laureate in economics, whose expertise lies in such matters, and who was an adviser to one of the operators, took a different view. He argued that the benefit was £4.7 million per quarter, about a tenth of the Commission's figure. There are thus radically different estimates of whether there really is a problem in this market or not.

I have not studied the details of those calculations. What I do observe, however, is that in the US, which does not have the same level of termination charges, the calling rate is about two or three times what it is in the UK and Europe. The subscription rates are also higher, and the subscriber penetration rate is rather lower. For reasons that I will explain in a moment, I think that US prices are more competitive and cost-reflective. So there seems to be a considerable distortion in the UK market as a result of the system being used there. There is probably also some redistribution of income from the fixed operators to the mobile operators that could distort competition as well. This suggests that there is a problem in the UK, and elsewhere, and that something needs to be done about it.

The question I pose is: does one really need to remedy this by regulation, and specifically by regulation of the termination charges? That is the way the UK has gone and the way the rest of Europe is going. But there are two problems. First, it is intrusive and expensive to carry out this form of regulation. My estimate is that it cost the UK regulators and operators about £25 million to complete this price control – almost as much as the benefits that Professor Mirrlees calculated for it. Second, there doesn't seem to be any end to this. There is no prospect of the termination market becoming competitive as a result of the present policy, and therefore no prospect of getting rid of that price control. Moreover, if US prices reflect what termination really costs, then even the apparently draconian price reductions resulting from the price controls aren't doing much more than scratching the surface.
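The waterbed argument can be captured in a stylized zero-profit condition (the notation is illustrative and is not taken from the Commission's modeling). If competition for subscribers dissipates operators' profits, then, per subscriber and per month, approximately

\[
(T - c_T)\,m \;+\; (r - c_r) \;\approx\; 0,
\qquad\text{so}\qquad
r \;\approx\; c_r - (T - c_T)\,m,
\]

where \(T\) is the termination charge, \(c_T\) the cost of terminating a call, \(m\) the incoming minutes per subscriber, \(r\) the monthly rental, and \(c_r\) its underlying cost. A regulated cut in \(T\) then raises the equilibrium rental \(r\) roughly one-for-one with the lost termination margin – which is why the price control may leave customers little better off overall.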

4. Methods of setting price controls

The price controls on termination charges are being set in different ways in the different countries around Europe. Some countries allow higher prices to companies with higher costs. This provides no incentive to improve efficiency and probably distorts competition as well. In contrast, the UK approach allows different prices only for those factors that lie entirely outside the control of the company. In practice, the main such factor is the kind of spectrum to which a company has access. On that basis the UK regulator calculated that the difference between the costs of mobile operators, depending on their access to spectrum, was about 13 %. Among European regulators the differences in the price controls are up to 46 % (excluding the new entrants, which either don't have price controls or have very different ones). So the way in which regulation is actually done in many European countries is not necessarily conducive to efficiency or competition, and may be vulnerable to various political influences.

5. Mobile Party Pays regulation

The US works on a Mobile Party Pays (MPP) mechanism for termination charges. (This is called Receiving Party Pays (RPP) elsewhere.) It overcomes the termination charge problem in the following way: if a mobile operator sets a high termination charge, that is something a subscriber will take into account when deciding whether to join that network.

The Competition Commission and the regulator in Britain both agreed that RPP would solve the termination charge problem and would avoid the need for price controls. Yet they did not propose it. Their explanation: it would be too disruptive for customers, who wouldn't like to pay to receive calls, and too difficult for the regulator to explain to the customer. They expressed a fear that customers might turn off their mobile phones. As a result, they decided it was easier to continue with the present price control mechanism, even though this meant ever more severe and more detailed calculations.

In the United States, the MPP/RPP approach means there are no intrusive price controls. Because the operator does not have the right to collect from another operator for terminating calls, it charges its own subscribers for this, but at much lower prices. The calling price is about half of European levels according to various calculations, and average call minutes are about two or three times European levels. Competition seems to be very effective. There is nowadays no concern about phones being turned off, mainly because the cost of receiving calls is so low that receiving calls is not a worry to subscribers.

There seem to be more cost-reflective rentals and handset prices in the US. Therefore the penetration rate is lower than in Europe, where subscriptions are effectively subsidized. But the penetration rate is rising fast in the US. And one country cannot get very far ahead of others when penetration rates are approaching 100 % or even beyond. My impression is that RPP works well in the US, there is no desire for any other system in this country, and that is the message we ought to take back to Europe and the rest of the world.


6. Conclusions

Finally, what are the lessons to draw from all this? There are always pressures for more regulation, and pressures not to relax regulation once it is in place. But there are examples where regulation can be distorting, or a disincentive to efficiency, or open to abuse. Price controls in particular seem an easy option, but once they are introduced they tend to stay. The moral is: be careful about introducing them in the first place. It is also worth looking for alternative policies that don't involve price controls. They may need to be a little more imaginative, or in some cases a little more radical. But I have given examples of how structural remedies can reduce the need for detailed regulation of conduct, and of how a different charging system (for mobile termination) can remove the monopoly problem and give a competitive solution instead of a regulatory one. In a number of respects the US may indicate the way forward for Europe. The other speakers on this panel will be able to correct me if I am wrong. But at least there is a lot we can learn from US policies and discussions.


7.3 Statement

J. Gregory Sidak
American Enterprise Institute

I will try to fill in for John Mayo and give some predictions about telecommunications legislation in the United States. I have little to say: the Telecommunications Act of 1996 took about five years to pass through Congress, and I doubt that we will see a Telecommunications Act of 2005, for largely political reasons that I will discuss in a moment.

It is an honor to be on a panel with Stephen Littlechild. His contribution to good regulation – the design of price caps in Britain – underscores an important aspect of what legislators should consider if they attempt to rewrite a statute like the Telecommunications Act of 1996. I would pose the issue this way: What defines the robustness of a new regulatory model? Price caps (in the United Kingdom and at the state and FCC levels in the United States) have been a very successful regulatory model when compared with the regulatory regime required for the unbundling provisions of the Telecommunications Act of 1996. The unbundling regulations have been intensely controversial and have been in court continuously since 1996. Those regulations now appear to have outlived their usefulness, because the competitive local exchange carriers (CLECs) either have died off or (in the case of the two biggest ones, AT&T and MCI) are being acquired by the biggest Bell companies, SBC and Verizon. In short, the unbundling rules epitomize a contentious and Byzantine regulatory framework that has become obsolete because of changes in the industry.

Let me say a word about regulatory theory versus regulatory implementation. I like Professor Littlechild's waterbed metaphor. It reminds me of Ramsey pricing, which is probably the most familiar concept to any person who has taken a course on regulation of industry in a law school or economics department. Frank Ramsey published his paper on optimal departures from marginal cost pricing in 1927. He was specifically addressing the problem of setting taxes on multiple products so as to raise a specific amount of revenue at the least harm to economic welfare. But, of course, Ramsey pricing was subsequently extended to the regulated pricing of multiproduct firms in telecommunications and other industries having large fixed costs. If a firm producing multiple products must be subjected to price regulation, you set prices above marginal cost in inverse relationship to the respective price elasticities of demand.
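In its textbook form (stated here for concreteness), the Ramsey rule sets each product's percentage markup in inverse proportion to its own-price elasticity of demand \( \varepsilon_i \):

\[
\frac{p_i - c_i}{p_i} \;=\; \frac{\lambda}{\varepsilon_i},
\]

where \( c_i \) is the marginal cost of product \( i \) and \( \lambda \) is a constant chosen so that the firm just covers its total costs, including fixed costs. The least price-sensitive products bear the largest markups.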


Consider the implications of the Ramsey rule for a regulator implementing a regulatory model like the unbundling regime of the Telecommunications Act of 1996. The regulator is setting regulated prices for unbundled network elements, including the leasing of unbundled loops, which are the least substitutable element in the network. Ramsey pricing would tell the regulator that loops should be priced above marginal cost in greater proportion than some more substitutable network element, like a switch. Regulators do not like that answer, however. To the contrary, one actually sees regulators adopt what David Sappington has called "reverse Ramsey pricing," which gives the least price-sensitive product the lowest markup above marginal cost. Regulators do not adopt reverse Ramsey pricing because they do not understand economics. They (or at least their staffs) all took the regulation course in law school or in their economics programs. Their motivation to deviate from this elegant theoretical model is more likely driven by a public choice explanation of what regulators are really trying to accomplish.

The theory of regulation associated with Felix Frankfurter and the other New Deal regulators was that agencies regulate in the public interest. By the 1970s, lawyers and economists at the University of Chicago had advanced the regulatory capture model: the regulator becomes captured by the firms that it is supposed to regulate. Neither model satisfactorily explains what has happened in the United States in telecommunications regulation since 1996. A casual observation of mine is that economic rigor has declined in the debate over telecommunications regulation since the mid-1990s. What has risen in its place is more explicitly political influence over the decision making of an agency like the FCC. We observed the culmination of that process a couple of years ago in an important FCC decision on unbundling. Five commissioners split, 3 to 2, yet they did not have an order to release to the public. They had not seen the order because it did not yet exist. They nonetheless voted for an outcome, which the agency reported in the form of a press release. At least one commissioner admitted in a separate statement, "I am not sure that I will agree with all the details of what I have voted on." How can one reconcile that kind of agency behavior with the premise that the purpose of having an independent regulatory body is to ensure that regulatory decisions are based on expertise and information rather than pure politics?

In response to this decline in the intellectual integrity of decision making at the FCC, there has been a kind of hydraulic effect akin to Professor Littlechild's waterbed. The U.S. Court of Appeals for the District of Columbia Circuit has direct appellate review of most of the FCC's orders. My conjecture is that, since the mid-1990s, as politics has displaced economic analysis at the FCC, the D.C. Circuit has responded by becoming more aggressive in the way it reviews the FCC's orders, reversing the agency not only with greater frequency, but also with greater forcefulness of economic reasoning. In American administrative law, the Chevron doctrine says that a reviewing court will defer to the decisions of the agency (here, the FCC) if the statute that the agency is applying is ambiguous and the agency's reading of the statute is a reasonable one. The agency's interpretation need not be the best interpretation, but merely a reasonable one. The court simply asks: "Was the agency's reading reasonable?" If so, end of story.

Before 1996, the Chevron doctrine was an impediment to the second-guessing of FCC regulators by the D.C. Circuit. My conjecture, however, is that Chevron's practical constraint on appellate review of FCC regulation has diminished since the agency began its implementation of the Telecommunications Act of 1996. If my conjecture is correct, it underscores the significance of an independent judiciary within a larger, more dynamic model of regulation. One reason that the D.C. Circuit might push the envelope on Chevron deference is that these judges are themselves experts on economic regulation. After all, they hear regulatory cases every week they are in session – cases on energy, on postal rates, on transportation, on telecommunications. Moreover, some of the judges on the D.C. Circuit are former law professors who taught courses on antitrust and regulation of industry. So they might think, with considerable justification: "Although other appellate courts might lack the expertise to second-guess the FCC on economic regulation, we do not – and in the face of politicized decision making by the agency, we will not defer."

Where does that hydraulic effect of the judiciary lead us? Perhaps to a deeper appreciation of the virtues of an antitrust model as an alternative to industry-specific regulation. It is important to remember that the restructuring of the U.S. telecommunications industry in the last 25 years started with an antitrust case, not a regulatory proceeding: the AT&T divestiture case brought by the Department of Justice in 1974. AT&T and the Department of Justice agreed to the divestiture in January 1982, and the breakup took effect in January 1984. The consent decree, known as the Modification of Final Judgment (MFJ), was administered by Judge Harold Greene of the federal district court in Washington until, twelve years later, the Telecommunications Act of 1996 superseded it.

We are now nine years into the process of implementing the Telecommunications Act of 1996. It is useful to ask: Was Judge Greene better or worse at implementing the MFJ than the FCC has been at implementing the Telecommunications Act of 1996? I am not sure that we have a more deregulated telecommunications marketplace, producing greater gains in consumer welfare, than we would have had if Congress had never enacted legislation in 1996. Fights that took the form of agency rulemaking by the FCC would have been resolved by a court instead – without any Chevron deference to consider. Compared with FCC commissioners, a court has a very different agenda – a different kind of objective function, as economists say. And maybe that difference in objective functions would have produced different regulatory outcomes in the telecommunications industry.


7.4 What Regulation for Tomorrow's Telecommunications Markets?

Justus Haucap
Ruhr-University of Bochum

Introduction

When considering how much and what kind of regulation will be needed for tomorrow's telecommunications markets, the question that economists do and policy makers should consider is the following: How can we ensure that dynamic markets are regulated efficiently? Put differently, which institutional framework1 is best suited to achieving economic efficiency in markets that are characterized by network structures and specific investments (and, accordingly, sunk costs) on the one hand and rapid technological change on the other? In this context, two important sub-questions are:

• How can we facilitate efficient innovation and investment with respect to communications infrastructure and the development of new services?
• How can we ensure that ex ante regulation is only applied when it is really beneficial, i.e. welfare enhancing?

Figure 1: Regulation for Tomorrow – Key Questions
• How can we ensure efficient investment in communications infrastructure and development of new services?
• How can we ensure that dynamic markets are regulated efficiently?
• And, relatedly: How can we ensure ex ante regulation is only applied where it is really beneficial (= welfare enhancing)?

1 The New Institutional Economics defines institutions as a system of rules including their enforcement mechanisms (see, e.g., Furubotn, Eirik and Richter, Rudolf (1997), Institutions and Economic Theory, University of Michigan Press: Ann Arbor).


Efficient Regulation of Dynamic Markets

In order to analyze these questions a comparative institutional approach is warranted. That is, we take for granted that no regulation will ever achieve first-best results; in reality, regulation will always be fraught with mistakes, as human beings and their actions are characterized by both opportunism and bounded rationality.2 Put differently, we know that regulation is characterized by principal-agent problems, where regulators pursue, at least partially, their own objectives, which are usually not equivalent to the public interest. Due to the regulators' expertise and the collective action problems that consumers and taxpayers face, actual outcomes of regulatory processes will often diverge from efficiency.3 Or, as Viscusi, Vernon, and Harrington (2000, p. 44) put it: "In theory, regulatory agencies serve to maximize the national interest subject to their legislative mandates (…) Such a characterization of regulatory objectives is, unfortunately, excessively naïve. There are a number of diverse factors that influence policy decisions, many of which have very little to do with these formal statements of purpose."4 And even if regulators were benevolent and always acted in the public interest, the problems of bounded rationality and/or asymmetric information would remain. This in turn implies that first-best results will only be achieved by chance, not in any systematic fashion.

In summary, the above implies that regulated markets will necessarily diverge from our textbooks' ideal world. In order to analyze the shortcomings of actual, real-world regulation in more detail it proves helpful to think about regulatory mistakes in terms of two types of error.5 A type I error occurs if a market or a firm is regulated even though it would be better (i.e., welfare enhancing) not to regulate it. A type II error means that a market or firm is left unregulated even though regulation would be beneficial. Given the incentives that regulatory authorities face, the positive theory of regulation, which is based on models of political economy, predicts that type I errors are far more likely to occur, which means that markets are usually over- rather than under-regulated. In fact, the available empirical evidence appears to support this hypothesis.6

The costs associated with these two types of error are also quite different. If a firm or market is erroneously regulated when it should either be left unregulated or be subject to more light-handed regulation, the consequences of such over-regulation may be rather dramatic. At worst, innovation and/or investment incentives are reduced by so much that efficient investments and/or innovations are effectively prevented. This implies that the entire rent from a market is lost, as nobody can consume a service that is not being produced. If, however, a type II error occurs and a market is only regulated later or in a softer fashion than would be efficient, the welfare loss amounts to the well-known Harberger triangle.7 While in the latter case the usual welfare loss from monopoly may result, this welfare loss is relatively small, especially in a comparative perspective. At least some consumers get to consume the product or service and receive some utility from that consumption. That is, some positive consumer and producer surplus remains even in monopoly situations. In contrast, if an innovation or investment is delayed or prevented altogether, then the entire market rent is lost, which is – from an efficiency perspective – the worst-case scenario. In plain and simple words, this means that a monopoly is better than having no market at all.8

This suggests that in dynamic markets with rapidly developing technologies and changing market structures the welfare loss associated with type II errors will be much smaller than the welfare loss associated with type I errors. This in turn implies that, when in doubt, regulators should leave markets unregulated and err on the side of "too little, too late" rather than "too much, too early" regulation.
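A textbook calculation illustrates how lopsided the two losses are. With linear demand \(P = a - bQ\) and constant marginal cost \(c\), total surplus under competition is \(W^{*} = (a-c)^2/2b\), while an unregulated monopoly still generates

\[
W^{M} \;=\; \underbrace{\frac{(a-c)^2}{8b}}_{\text{consumer surplus}} \;+\; \underbrace{\frac{(a-c)^2}{4b}}_{\text{producer surplus}} \;=\; \tfrac{3}{4}\,W^{*}.
\]

In this stylized case a type II error (tolerating the monopoly) destroys one quarter of the potential surplus – the Harberger triangle – whereas a type I error that prevents the investment altogether destroys all of it.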

2 The comparative-institutional approach traces back to Coase, Ronald (1960), "The Problem of Social Cost", Journal of Law and Economics 3, 1-44, as well as Demsetz, Harold (1969), "Information and Efficiency: Another Viewpoint", Journal of Law and Economics 12, 1-22. For a recent elaboration and application see Dixit, Avinash K. (1996), The Making of Economic Policy: A Transaction Cost Politics Perspective, MIT Press: Cambridge, MA.
3 See, e.g., Baldwin, Robert and Cave, Martin (1999), Understanding Regulation: Theory, Strategy, and Practice, Oxford University Press: Oxford. The path-breaking works in this area are Stigler, George J. (1971), "The Theory of Economic Regulation", Bell Journal of Economics 2, 3-21, Posner, Richard A. (1974), "Theories of Economic Regulation", Bell Journal of Economics 5, 335-358, and Peltzman, Sam (1976), "Toward a More General Theory of Regulation", Journal of Law and Economics 19, 211-240.
4 Viscusi, Kip, Vernon, John and Harrington, Joseph (2000), The Economics of Regulation and Antitrust, 3rd ed., MIT Press: Cambridge, MA.
5 Also see Haucap, Justus and Kruse, Jörn (2004), "Predatory Pricing in Liberalized Telecommunications Markets", pp. 43-68 in: C. von Hirschhausen, T. Beckers and K. Mitusch (eds.), Trends in Infrastructure Regulation and Financing, Edward Elgar: Cheltenham.
6 See Mueller, Dennis C. (2003), Public Choice III, Cambridge University Press: Cambridge, pp. 362-365.
7 See Baake, Pio, Kamecke, Ulrich and Wey, Christian (2005), "Efficient Regulation of Dynamic Telecommunications Markets and the New Regulatory Framework in Europe", forthcoming in: R. Dewenter and J. Haucap (eds.), Access Pricing: Theory and Practice, Elsevier: Amsterdam.
8 In this context, a study by Hausman (1997) is illuminating, as it demonstrates the welfare loss that can be caused by regulatory failure. As Hausman describes, the introduction of mobile telephony in the US was effectively delayed by about 10 years through protracted decisions by the FCC and slow licensing proceedings. According to Hausman, these delays resulted in a welfare loss of between US$19 and US$50 billion (basis: 1994). See Hausman, Jerry (1997), "Valuing the Effect of Regulation on New Services in Telecommunications", Brookings Papers on Economic Activity: Microeconomics, 1-38.


Figure 2: Regulation for Tomorrow

From an efficiency perspective it is therefore worrying that regulators tend to be biased towards over- rather than under-regulation, as political economy models and empirical evidence tell us. While economic theory demands that regulation be biased towards under-regulation in order to avoid the more costly type I errors, actual regulation is unfortunately biased towards over-regulation (i.e., type I errors).

Given this inherent bias towards over-regulation, it is sensible and appropriate that the European Commission demands that new and emerging markets be left unregulated. In its "Recommendation of 11 February 2003 on Relevant Product and Service Markets within the electronic communications sector susceptible to ex ante regulation" the Commission explicitly notes that "new and emerging markets, in which market power may be found to exist because of 'first-mover' advantages should not in principle be subject to ex-ante regulation."9 That is, first-mover advantages are to be protected for a while to provide incentives for firms to invest, to take risks, and to develop new services and content or to upgrade networks. If profits are not regulated away, firms have every incentive to bring new services to consumers and to generate new rents.

9 See Paragraph 15 of the "Commission Recommendation of 11 February 2003 on Relevant Product and Service Markets within the Electronic Communications Sector Susceptible to Ex Ante Regulation in Accordance with Directive 2002/21/EC of the European Parliament and of the Council on a Common Regulatory Framework for Electronic Communication Networks and Services", Commission document C(2003)497.


In the economic literature the appropriate innovation and investment incentives have also been discussed under the phrase "access holidays",10 which basically means that network providers get a "holiday" from mandatorily providing competitors with access to their networks. The idea is closely related to the literature on the economics of patents and innovation incentives. Since patents and, similarly, copyrights grant firms a temporary monopoly over some product, this prospect gives firms an incentive to invest in R&D.11 Even though the temporary monopoly leads to an allocative efficiency loss from a purely static perspective, such a policy can be efficient from a dynamic perspective as it spurs innovation. Similarly, access holidays can be seen as analogous to a patent, as they involve the right to use some newly built infrastructure exclusively for some period of time, thereby increasing firms' incentives to undertake risky and specific investments in new infrastructure.

The Regulatory Hold-up Problem

While we have argued that access holidays or, more generally, exemptions from ex ante regulation are desirable from an economic efficiency perspective, the question is whether regulators are willing to refrain from regulation. One specific issue associated with network industries such as telecommunications arises from the fact that major parts of the investments in network infrastructure are so-called specific investments, which result in sunk or irreversible costs. One regulatory problem associated with sunk costs is that the regulator's incentives change once a specific investment has been undertaken. While the regulator would like firms to invest and, therefore, may promise not to regulate new networks, the regulatory agency's incentives change quite dramatically once the network is in place. Knowing that the network provider faces significant barriers to exit, the regulator can put more stringent regulation in place and, for example, directly set lower tariffs or grant third parties access to the investor's network in order to benefit consumers and increase static efficiency. Once the network is in place, the regulator may find it attractive to regulate the network. This problem is well known as the regulatory hold-up problem.12

10 See Gans, Joshua S. and King, Stephen (2004), "Access Holidays and the Timing of Infrastructure Investment", The Economic Record 80, 89-100.
11 See Scotchmer, Suzanne (2005), Innovation and Incentives, MIT Press, as well as Reinganum, Jennifer (1989), "The Timing of Innovation: Research, Development, and Diffusion", pp. 849-908 in: R. Schmalensee and R. Willig (eds.), Handbook of Industrial Organization, Vol. 1, Elsevier: Amsterdam.
12 For the classical treatment see Goldberg, Victor P. (1976), "Regulation and Administered Contracts", Bell Journal of Economics 7, 426-448, and more recently Spiller, Pablo and Levy, Brian (1996), Regulation, Institutions, and Commitment: A Comparative Study of Telecommunications, Cambridge University Press: Cambridge.


As is well known from the literature, if the firm foresees that the regulator cannot resist the temptation to regulate once an investment has been undertaken, any promise by the regulator that new networks will not be regulated will not be credible. The regulator's time inconsistency or credibility problem will in turn lead firms to shy away from specific investments, which results in the well-known under-investment problem, a lack of innovation, and dynamic inefficiency more generally.
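The commitment problem can be reduced to a simple two-stage sketch (an illustration, not a full model). Let the firm sink an investment \(I\) and earn operating profit \(\pi\) if the regulator keeps its promise not to regulate, but only \(\pi^{\mathrm{reg}}\) if access or price regulation is imposed ex post, with

\[
\pi \;>\; I \;>\; \pi^{\mathrm{reg}}.
\]

Once \(I\) is sunk, a static-welfare-maximizing regulator prefers to regulate, so the firm anticipates \(\pi^{\mathrm{reg}} - I < 0\) and does not invest, even though investment would have been efficient (\(\pi - I > 0\)). Without a commitment device, the promise not to regulate is simply not credible.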

Figure 3: The Regulatory Hold-up Problem
• Once a specific investment is undertaken or an innovation is made, the regulator's incentives change
• Stricter regulation becomes much more attractive (think of what might have happened to Iridium)
• This tendency is amplified if consumer and producer surplus accrue in different countries (national consumers = voters, international firms)
• How can this be addressed?

To illustrate the regulatory hold-up problem, one may think of what would have happened to Iridium if its satellite-based global mobile telecommunications system had been more successful. I am almost certain that by now Iridium's satellite system would have been declared an essential facility to which competitors need access in order to compete with Iridium. Today we know that Iridium's business model was not successful, but this was far from clear ex ante, i.e., before the investment was undertaken. Hence, from an ex ante perspective it may have been efficient to undertake the investment. If, however, the prospect is to be regulated once one's business model proves successful, while one has to carry the losses privately if it turns out to be unsuccessful – so that losses are privatized while profits are socialized – dramatic under-investment can result.

The regulatory hold-up problem and the resulting under-investment are likely to be amplified if consumers are locally based, and thereby largely equivalent to the electorate, while firms are international. If consumer surplus enters into national welfare while producer surplus does not, the risk of an ex post regulatory hold-up may be even larger, as national regulators tend to focus on national welfare, at best, rather than global welfare. In Europe, for example, in many markets national regulatory authorities have to deal with international firms. If, however, national regulators are unlikely to take international firms' profits fully into account, the commitment problem becomes even more severe. Hence, the dichotomy between international firms and national consumers further amplifies the regulator's tendency to change the regulatory regime once an investment has been undertaken.

How to Overcome the Regulators' Dilemma: Strategic Delegation of Competencies

The question that now emerges is how this hold-up or commitment problem may be addressed. How can the regulator take measures that prevent itself from succumbing to the temptation to regulate a firm that has successfully invested in infrastructure? One possible way out of the regulator's dilemma is to strategically delegate some of its competencies to somebody else. If regulators know ex ante that they cannot resist regulating ex post, even though they would like to commit themselves not to do so (in order to facilitate the investment in the first place), they may bind themselves by delegating competencies to some other body. If the dichotomy between international firms and national consumers amplifies the hold-up problem, this can be mitigated by strategic delegation to some supranational body that takes the interests of international firms into account.13

Figure 4: Centralization of Regulation?
• Idea: move (even more) regulatory competencies to the European (supranational) level
• But: centralized regulation has major drawbacks of its own – who deregulates?
• Three examples: the forced introduction of call-by-call and carrier pre-selection for local calls in Germany; the EU price-squeeze decision against Deutsche Telekom; mobile termination (see Stephen Littlechild)

13 In fact, the dichotomy between producer surplus accruing internationally and consumer surplus being generated "at home" is one of the main reasons in favor of an international competition policy. See, e.g., Barros, Pedro and Cabral, Luís (1994), "Merger Policy in Open Economies", European Economic Review 38, 1041-1055, as well as Haucap, Justus, Müller, Florian and Wey, Christian (2005), "How to Reduce Conflicts over International Antitrust?", forthcoming in: S. Voigt, M. Albert and D. Schmidtchen (eds.), International Conflict Resolution, Conferences on New Political Economy 23.


Drawbacks of Centralization Of course, there are some major drawbacks from delegating competencies to a supranational body. First of all, the commitment problem is only partially resolved. As far as the dichotomy between international firms and national consumers is concerned the problem is alleviated as far as European firms are involved. However, the problem remains if American, Asian or other non-European firms are involved, while consumers are based within the EU. In addition, even with European firms, the Commission may be tempted to regulate successful innovations ex post due to the time inconsistency problem and the general hold-up problem that arises once irreversible investments have been undertaken. Secondly, one of the major drawbacks is that more centralization and harmonization prevent regulatory yardstick competition between different regulatory approaches. The benefits of mutual learning are easily dispensed, as there can be no mutual learning if the same regulations apply in all jurisdictions.14 Thirdly, and also very importantly, if regulatory competencies are allocated at a supranational level, it may be very difficult to move towards a more deregulated framework. This is especially of importance in telecommunications markets, where a broader deregulation, meaning the removal of Government intervention, is argued to the ultimate goal. At the moment, both national regulators and the European Commission have regulatory competencies which they like to apply in order to further increase their own relevance and significance. If two agencies are interested in further regulation the prospect for deregulation will be dim. This may be illustrated by three examples from Germany: Firstly, the European Commission has forced Germany to introduce both call-by-call (CBC) and carrier pre-selection (CPS) for local calls. Both mechanisms had been in place for national and international calls since January 1998 and also for dial-in into the Internet, but not for local voice calls. The German regulator as well as policy makers did not see this as a major concern, as the lack of CBC and CPS for local calls made infrastructure competition more attractive than it would have been with CBC and CPS for local calls. In addition, unbundled access to the local loop (local loop unbundling) has also been available in Germany as early as 1998. Hence, competing operators such as city carriers and regional operators, of which there are many in 14

In the area of competition policy, the benefits or yardstick competition have been analyzed in detail by Kerber, Wolfgang and Budzinski, Oliver (2003), “Towards a Differentiated Analysis of Competition of Competition Laws”, Journal of Competition Law 1, 411-448, as well as Kerber, Wolfgang and Budzinski, Oliver (2004), “Competition of Competition Laws: Mission Impossible?”, pp. 31-65 in: R.A. Epstein and M.S. Greve (eds.), Competition Laws in Conflict. Antitrust Jurisdiction in the Global Economy, American Enterprise Institute: Washington, D.C.

166

Justus Haucap

Germany by now, were able to sell bundles including both local access and local calls. In fact, in some cities such as Hamburg and Cologne these so-called city carriers have been extremely successful, reaching local market shares of up to 40 per cent.15 After the European Commission stepped in, Deutsche Telekom had to offer CBC for local calls (from April 2003 on) and also CPS for local calls (from July 2003 on). While this may be disadvantageous for Deutsche Telekom, it also hurts the alternative operators that had invested into local infrastructure. As CBC and CPS for local calls have become available, it has become much less attractive for consumers to switch with their entire access lines to a competing operator. Put differently, sticking with Deutsche Telekom for the provision of the access line has become more attractive for consumers, while switching to an alternative network-based operator has become less attractive.16 While the European Commission’s regulatory decision diminished the incentives for infrastructure-based competition and may have been flawed in its own, the most important aspect is that this decision prevented Germany from experimenting with a different regulatory approach. Moreover, as the decision has reduced incentives to invest in alternative infrastructure, it helps to manifest Deutsche Telekom’s monopoly in local infrastructure, thereby perpetuating the need for regulatory oversight. From a political economy perspective this is easy to explain, but at the same time regrettable from an efficiency perspective. Similarly, the European Commission’s price squeeze decision against Deutsche Telekom17 has been both flawed and unnecessary. As mentioned, Deutsche Telekom has to offer unbundled access to its local loop. The average access price, however, has been so close to the average retail price (sometimes even exceeding this price) that some competitors were arguing that Deutsche Telekom was practicing a predatory price squeeze. Whether or not this was the case, the German regulator RegTP did not see this as a concern as it followed Deutsche Telekom’s argument that consumers effectively purchase bundles which include both the local access line as well as telephone calls. Hence, renting out the access line for a price below costs is profitable for Deutsche Telekom as long as its consumers then make or even receive sufficiently many calls. In order to calculate the profitability of the offer one therefore has to consider the package of both access line and calls made, which Deutsche Telekom argued and RegTP agreed was the relevant market. In fact, many economists would agree that with complements it is not useful to analyze single item 15

The same logic of below-cost pricing for single items applies to mobile telephone sets, printers, video game consoles, and even some supermarket products. A supermarket may decide to sell chocolate bars below cost in order to attract consumers who then also purchase milk and other items. This has nothing to do with predation in the chocolate market. The European Commission did not even consider the complementarity of access lines and calls, which means that the analysis was not based on sound economic grounds. What is even more important here is that there is no economic reason whatsoever for the EU to step in and regulate only for the sake of harmonization. Again, the EU has effectively prevented the mutual learning and yardstick competition associated with regulatory diversity from emerging.

The third example involves mobile termination, where we also see a very strong tendency towards harmonization across Europe. Again the main driving force behind the move towards stronger regulation is the European Commission, again trying to prevent a diversity of regulatory systems from emerging. In addition, regulation almost always takes the form of ex ante price regulation instead of moving towards a system where receiving parties pay (RPP). The latter would arguably obviate the need for regulatory intervention,18 which makes RPP unattractive for regulators from a political economy perspective.

18 For details see Stephen Littlechild’s contribution to this conference as well as Littlechild, Stephen (2005), “Mobile Termination Charges: Calling Party Pays versus Receiving Party Pays”, forthcoming in Telecommunications Policy.

Overall, I am extremely skeptical about the net benefits of any further harmonization of telecommunications regulations in Europe. It is beneficial to have some common standards regarding cost-based interconnection and free market entry. The new regulatory framework, however, is extremely harmonized, involving central veto rights for the European Commission on market definitions and analyses and leaving very little scope for regulatory diversity. Moreover, the European Commission has used its veto rights early on to signal that deviations from what is regarded as appropriate regulation will not be tolerated. Hence, prospects for regulatory yardstick competition are dim, as is, consequently, the scope for mutual learning processes.

A Possible Way Forward

In order to avoid over-regulation and to benefit from yardstick competition, the regulatory framework in Europe should be changed so that more regulatory diversity is possible. To achieve this, the allocation of competencies between the European Commission and national regulatory authorities should be changed.

Regulation for Tomorrow – A possible way forward
• The new regulatory framework is extremely centralized and harmonized – almost no scope for regulatory diversity
• Introduction of stronger regulatory competition (as yardstick competition) between Member States
• Assign regulatory competencies to the EU only in areas where external effects between Member States are significant (e.g. international roaming)
• Assign a competency to the EU to deregulate only (but not to mandate new regulations)

Figure 5

First of all, the European Commission should only receive a full regulatory competency for telecommunications market regulation if there are international externalities, as may be the case with international roaming. This appears necessary because national regulators are not likely to intervene in this market, as it is foreign consumers who may suffer from high prices. In fact, as of 31 May 2005 the market for international roaming was the only one of the 18 pre-defined telecommunications markets that not a single one of the 25 national regulators had analyzed under the new regulatory framework. Hence, competencies for this market, including the imposition of remedies, should rest with a supranational authority.

Secondly, to avoid over-regulation and the regulatory hold-up problem as far as possible, the European Commission should be assigned a “one-way competency” to deregulate only. That is, the EU should be allowed to take measures and issue vetoes wherever a national regulator intends to intensify regulation, but the EU should not be allowed to step in whenever national regulators intend to deregulate. The “one-way competency” to deregulate would help to alleviate the regulatory hold-up problem by delegating a veto right against intensified regulation to a supranational level, namely the European Commission, which takes into account at least some of the international firms’ interests (namely those of the European firms, in this case). By limiting the European Commission’s competency to deregulatory measures, however, such a system of “checks and balances” will be much better suited to avoiding over-regulation than the current allocation of competencies, which fosters over-regulation, with potentially dramatic consequences for investment and innovation.
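The asymmetry of the proposal can be summarized in a minimal sketch; the function and the scalar “regulation level” are illustrative assumptions of this edition, not part of Haucap’s argument:

```python
def eu_may_veto(current_level: float, proposed_level: float) -> bool:
    """One-way competency (illustrative encoding): the EU may veto a
    national measure only when it would intensify regulation; moves
    toward deregulation pass without supranational review."""
    return proposed_level > current_level

# A national proposal to tighten regulation (0.8 -> 0.9) is vetoable,
# while a proposal to relax it (0.8 -> 0.5) is not.
assert eu_may_veto(0.8, 0.9) is True
assert eu_may_veto(0.8, 0.5) is False
```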


7.5 Statement

Robert C. Atkinson
Columbia Institute for Tele-Information (CITI)

The quick and easy answer to the question is “it all depends.” Unfortunately, the quick and easy answer isn’t very helpful, because determining what degree and kind of regulation will be needed will “all depend” on the specific circumstances of each market at a given time. And because each market will have different circumstances, and those circumstances will be changing constantly but unpredictably, it will be impossible to determine “how much” and “what kind” of regulation should apply without doing a lot of hard work.

It is important to remember that the principal purpose of economic regulation is to protect consumers from abuse by dominant suppliers of essential services. So, determining how much and what kind of regulation should be applied to which kind of service should be done from the perspective of consumers.

Consider the case of consumers in Manhattan. The island of Manhattan has a population of 1.54 million people in 23 square miles, for a density of 66,941 persons1 per square mile. Those people have an annual per capita income of $42,922,2 twice the national figure. It is the heart of the US financial services, media, advertising and other telecommunications-intensive businesses. Because of these circumstances, it is reasonable to expect that Manhattan is an attractive market for telecommunications services providers. It is clear that Manhattan has a plethora of competitive telecommunications services at low prices with generally high quality and availability.

1 http://www.epodunk.com/cgi-bin/popInfo.php?locIndex=1101
2 http://www.epodunk.com/cgi-bin/incomeOverview.php?locIndex=1101

But consider the case of consumers in another Manhattan: Manhattan, Nevada. It has a population of 1,841 in 1,800 square miles, for a density of 1.02 persons per square mile.3 Per capita income is only $20,881,4 about half of Manhattan, NY’s. It is an old mining town5 at the end of a canyon in the middle of nowhere.6

3 http://www.hometownlocator.com/ZCTA.cfm?ZIPCode=89022
4 http://mcdc2.missouri.edu/cgi-bin/broker?_PROGRAM=websas.dp3_2k.sas&_SERVICE=sasapp&zi=89022
5 “… Once a flourishing mining community of 30,000 people, Manhattan is now populated with vacation homes and just a sprinkle of year-round residents. The town of Manhattan sprang up, almost overnight, in 1905, after a ranch hand named Humphrey discovered gold during his lunch break … There have been a few other mining operations in recent years, and a small number of people make their home in Manhattan today. There is a post office and public library, as well as one or two bars open for business … the landscape still contains old mining artifacts scattered here and there. The surrounding countryside is attractive, with rough hillsides and forests of juniper and pinion trees. Manhattan and the surrounding area is a great destination for sightseers and history buffs.” http://www.mountainsage.org/Belmont.htm Photos of Manhattan, NV: http://www.ghosttowngallery.com/htme/manhattan.htm
6 According to the Manhattan, NV town librarian, the nearest grocery store is 25 miles in one direction and 50 in the other; the nearest Wal*Mart is 300 miles from Manhattan.


Because of these circumstances, it is reasonable to expect that Manhattan, NV is not an attractive market for telecommunications service providers. And this expectation seems to be borne out. A comparison of the telecommunications services available to consumers in these two Manhattans reflects differences as vast as their circumstances:

• Manhattan, NY’s largest telephone service provider is Verizon, the nation’s largest telecommunications company (market capitalization: $98.3 billion; 2004 revenues: $71.3 billion). Manhattan, NV’s telephone service is provided by Citizens Communications, a fairly large telephone holding company that specializes in serving rural areas (market cap: $4.32 billion; 2004 revenue: $2.2 billion).
• Each of Manhattan, NY’s 44 Zip Codes has between 14 and 23 Competitive Local Exchange Carriers (CLECs), while Manhattan, NV’s single Zip Code is not served by any CLEC.7
• Manhattan, NY has ubiquitous cellphone service from all five national carriers and numerous resellers. Manhattan, NV doesn’t have much, if any, cellphone service.8
• Manhattan, NV doesn’t have cable TV service and therefore doesn’t have cable modem service for high-speed Internet access;9 Manhattan, NY has ubiquitous cable TV service and cable modem service.10
• Each of Manhattan, NY’s 44 Zip Codes has between 8 and 18 broadband service providers.11 There is no terrestrial broadband service in Manhattan, NV’s single Zip Code: the telephone company does not offer DSL and has no plans to do so.12
• Manhattan, NV has zero public WiFi hotspots while Manhattan, NY has nearly 1,000.13

7 FCC Form 477 data at http://www.fcc.gov/Bureaus/Common_Carrier/Reports/FCC-State_Link/IAD/czip0604.pdf
8 Cellular Telephone and Internet Association (CTIA). Three cellphone companies reported serving Nye County, in which Manhattan, NV is located. However, Nye County constitutes one-sixth of the entire state of Nevada, or more than twice the size of the State of New Jersey. According to the Manhattan town librarian, the only place where “spotty” cellphone service is available in Manhattan itself is the library parking lot, because the main highway, about five miles down the canyon, is visible from there.
9 Telephone conversation with Nevada Cable Television Association.
10 Manhattan, NV residents can get satellite TV and over-the-air television (ABC, CBS and NBC TV – and sometimes Fox – channels relayed from Reno and Las Vegas are available). According to the librarian, a few Manhattan, NV residents have satellite data service and the library itself is considering satellite data service. But note that satellite broadband isn’t suitable for VoIP telephony because of the propagation delay inherent in satellite service.
11 FCC Form 477 data at http://www.fcc.gov/Bureaus/Common_Carrier/Reports/FCC-State_Link/IAD/hzip0604.pdf
12 Telephone conversation with company Customer Service representative.
13 CNET Hot Spot Zone, http://reviews.cnet.com/4520-6659_7-726628-1.html?tag=fs

Considering the vast differences between Manhattan, NY and Manhattan, NV, is it likely that a telecommunications regulatory system that is reasonably optimal for one could also be optimal for the other?

Consumers in Manhattan, NY have a wide range of competitive choices for their basic telephone service: traditional POTS and wireless from a number of companies, and VoIP over broadband is a realistic option for a significant percentage of the population. It is reasonable to expect that little or no retail regulation is needed in these circumstances. By contrast, consumers in Manhattan, NV have no practical choice with respect to telephone services: just one POTS supplier, no wireless and no VoIP, so it is reasonable to expect that some economic regulation will be needed in these circumstances. While other Manhattans – in Montana, Illinois and Kansas – fall somewhere between the extremes of New York and Nevada, it is likely that the regulatory system that is optimal for one Manhattan will never be optimal for any other.

In addition to the differences between the Manhattans at any given instant, each is also changing. There is an assumption that multiple networks providing POTS, wireless, broadband, television and VoIP over broadband will be available everywhere in the future because they are available somewhere today. While this may turn out to be true, it is at least equally likely that the telecommunications industry will be radically different in the future. For example, in the absence of government intervention, the consolidation process that is well underway in the telephone, cable TV and wireless industries could result in two infrastructures in many markets (including Manhattan, NY): one fiber-based “fat pipe” to every home and business for video and data services (with telephone being a VoIP data application) and one wireless system providing “thinner pipes” for mobile and nomadic services. In thin markets such as Manhattan, NV, a “thin” wireless broadband system may suffice for all applications, including video. But is it reasonable to expect that multiple broadband systems will be deployed and survive in communities, such as Manhattan, NV, where there is concern that even the first won’t be deployed?

A regulatory system that assumed a permanent “natural monopoly” was incompatible with the competitive industry that evolved in the latter part of the 20th century. Similarly, a regulatory system that assumes that the equilibrium state of the telecom industry is intense competition among multiple infrastructures will clearly be suboptimal – and perhaps totally ineffective – if the industry settles into a monopoly or duopoly structure.

Why should any Manhattan be condemned to suboptimal regulation, now or in the future? Wouldn’t it be better to have a system where the kind and degree of


regulation is dynamically and constantly adapting to the changing circumstances of each market? Isn’t it reasonable to expect that such a system of “circumstantial regulation” would produce results that are more optimal for each market?

Of course, it is easy to suggest that regulation should be optimized for and be responsive to the circumstances of each market. But is such a system really practical and feasible? How will it work? Won’t it be chaotic? Won’t there be less regulatory certainty? Won’t it be difficult?

CITI’s on-going “Remedies for Telecom Recovery” project has outlined in considerable detail how the regulatory system should be overhauled to encourage the recovery of the telecom sector after the recent historic “meltdown.”14 It also explains how “circumstantial regulation” would work. For purposes of this paper, it is sufficient to note that the current system doesn’t seem to be working very effectively, and one reason is that it is too uniform, too static, and too rigid. Perhaps it is simply time to try “circumstantial regulation” – that is, a flexible, adaptable, dynamic system – instead of tinkering with “one size fits all” regulation in the expectation that it can be made to work well. With “circumstantial regulation” the kind and degree of regulation will dynamically and constantly adapt to the changing circumstances of each market, so that there would be a greater chance that regulation would be more optimal for each market.

14 See: http://www.citi.columbia.edu/research/recovery2/CITI_RegulatoryUpdate04.pdf

There seems to be a growing consensus that the Communications Act of 1934 as amended by the Telecommunications Act of 1996 needs to be revised. The former Chairman of the FCC, members of Congress and industry leaders have, to varying degrees, called for substantial changes to the Communications Act. While there are many reasons to hope that Congress won’t try to rewrite the telecommunications law (not the least of which is the “gridlock” that may develop), this would be an opportunity to move away from “one size fits all” regulation to a system that responds to the specific circumstances of each market.

For all the rhetoric about loosening the grip of government on telecom, the Telecommunications Act of 1996 ended up centralizing all fundamental telecommunications policy in the Federal Communications Commission (FCC), effectively federalizing the 50 states with respect to local competition and preempting the judicially-supervised modified final judgment (MFJ) with respect to Bell entry into long distance. This centralization appeared to satisfy some industry operators’ and many investors’ desire for less risk and more reward by providing what turned out to be the illusion of greater “certainty” and “predictability.” However, the Telecom Act did not take the “circumstantial” approach of simply establishing broad policy goals – such as competition in all markets and less regulation – and then leaving it to the FCC to achieve them. Rather, the statute itself sought to micromanage the implementation. Unfortunately, the result has been a legal “gridlock” that has satisfied no one.

Consider the micromanagement inherent in:
• establishing numerous implementation deadlines for the FCC;
• specifying three pricing methodologies for ILEC-CLEC interconnection;
• establishing a detailed system for negotiating, mediating and arbitrating interconnection agreements; and
• devising a 14-point checklist to be satisfied before a Bell could offer long distance services.

There is nothing substantively wrong with these policies, except that they took away much of the freedom of the implementing agency – the FCC – to adjust policies later in light of unexpected or changed circumstances … such as the rapid development of the Internet or a monumental “bust” in investor confidence and the industry “meltdown.”

If the Act took flexibility from the FCC, it took even more from the States. With respect to local competition, it is useful to recognize that the Telecom Act was neither revolutionary nor innovative. Rather, the Act largely codified into national law and policy the results of many experiments conducted by State public utility commissions (PUCs) over the prior decade.15

15 Local competition (at least in the modern era) did not start with the Telecom Act. Rather, it started when the New York Public Service Commission, in mid-1985, issued a Certificate of Public Convenience and Necessity to Teleport Communications, proposing to provide local high-capacity private lines in New York City. By the early 1990s, many other PUCs had authorized “Competitive Access Providers” (CAPs) to provide unswitched local services. In so doing, the States had required “central office collocation,” later known as “collocation” after the FCC ratified the various PUC decisions, and some forms of loop unbundling to facilitate this initial phase of local competition. The pattern repeated for switched local services: in 1994 the NYPSC authorized the first competitive local exchange service in the country, and by the end of the following year – 1995 – fourteen “Competitive Local Exchange Carriers” (CLECs) had installed 70 competitive central office switches. Such issues as mutual compensation, now known as “reciprocal compensation,” number portability, and OSS interconnection were being addressed and had been at least partially resolved on a state-by-state basis.

This state-by-state experimentation – with its admittedly untidy look of “muddling through” – did not provide the “certainty” and “predictability” sought by some operators or investors. But ironically, and not appreciated by them at the time (and perhaps even today), “muddling through” was and is much less risky than a single federal policy, particularly one that gets “gridlocked” and “reformed” (or some would say deformed) in interminable due process.

One reason that “muddling through” by the States is less risky is that a Federal policy can never be optimal in all markets across this diverse nation. Policies that benefit the low-density rural states, for example, may disadvantage the densely populated states, and vice versa. “Muddling through” in the States also reduces risk by allowing for a continuous and low-risk iterative process of field experimentation, testing, and fine-tuning of business strategies and public policies before irrevocable, major investment bets are placed. This was how local competition developed before the Telecom Act upset the process.

Although the Act stopped the beneficial, risk-reducing state-by-state experimentation, it did not empower the FCC to undertake its own experiments. Instead, everything became a single high-risk roll of the federal dice. Now, every FCC decision – because it has such far-reaching application – literally becomes a “federal case” and leads not to finality but to litigation, with fundamental decisions being made not by an expert agency but by judges and their law clerks. This sort of gridlock cannot engender investor confidence. And it doesn’t provide “certainty” to business planners.

A summary of the legislative recommendations included in CITI’s “Remedies for Telecom Recovery” program is attached to this paper. For purposes of the subject at hand, it is sufficient to consider only a few of them:

Eliminate economic retail rate regulation everywhere, immediately.

Generally, there is sufficient actual and potential competition for every retail telecommunications service, including basic local telephone service, to justify deregulation. Basic telephone service consumers in most (but not all) geographic markets now have reasonable alternatives to the traditional local exchange carrier (ILEC) from wireline resellers, numerous wireless services providers and, increasingly, from VoIP provided over telco and cable broadband services. (While competitive alternatives from carriers using the UNE-Platform will fade beginning in mid-2004, consumers’ opportunity for having “IP Telephone” (or VoIP) service from cable TV companies as well as from independent IP Telephone service providers such as Vonage is increasing rapidly.) Therefore, it is difficult to imagine that ILECs could abuse their customers by raising prices or offering poorer quality service without suffering substantial competitive losses. If a market is reasonably competitive, there is no consumer protection justification for retail service regulation.

This principle worked well in the long distance market: once there was enough competition from MCI, Sprint and others that AT&T was determined to be “non-dominant,” the FCC eliminated retail price regulation of long distance services. Similarly, prices of wireless telephone services are not regulated, since no cellular carrier has been able to dominate that market.

This policy will be optimal for Manhattan, NY and probably most cities and suburbs in the United States that are served by unrelated wireless, POTS and cable TV operators. None will need to be regulated in their dealings with consumers. (Carrier-to-carrier relationships should be governed by negotiated interconnection agreements, which would be arbitrated under the auspices of State regulators if voluntary agreements can’t be reached.)


Reimpose economic regulation in markets that are subject to demonstrable abuse of consumers or similar harm to the public interest.

Of course, there will be a number of geographic markets, such as Manhattan, NV, where there is likely to be insufficient competition to protect consumers from abuse. However, deregulating on a market-by-market basis would require hundreds or thousands of proceedings, each of which would be an opportunity for gridlock that would bring the entire regulatory system to a grinding halt. Therefore, it would be better to “flash cut” retail rate deregulation in ALL markets, observe whether and where any abuse of consumers actually occurs, and quickly reimpose regulation when it does. There are plenty of competitors, consumer advocates and state regulatory staff who would bring any suspected instances of consumer abuse to state and federal regulators’ attention. Where consumer abuse is demonstrated, swift re-regulation, presumably by State regulators, who are best able to evaluate the local circumstances, would be appropriate and necessary.

While the re-regulation process might be used immediately in Manhattan, NV, it could just as easily be utilized in Manhattan, NY in the future if the New York market evolves into a duopoly or monopoly. In re-regulated markets, the question of which services are regulated, and to what degree, would depend on the circumstances of the particular market, particularly the degree of actual and potential competition among service providers and service platforms.

Reform Universal Service to enable retail rate deregulation and to get regulators out of the subsidy business.

Since retail rate regulation is one means for artificially keeping basic service rates below cost in some markets and for favored classes of consumers, abolition of retail rate regulation would mean that Universal Service objectives would have to be achieved by means other than implicit cross-subsidies. This would be consistent with the stated but thus-far-ignored Congressional mandate of eliminating such implicit subsidies. It is not likely that markets could be deregulated unless there is assurance that such deregulation will not undermine the social objective of affordable universal telephone service. Affordable telephone service for the 1,800 residents of Manhattan, NV shouldn’t be sacrificed merely to benefit the 1.5 million residents of Manhattan, NY. So, the issue of universal service reform must be addressed in conjunction with deregulation.

Universal service is a good idea: reliable, capable telecom service is essential for participation in modern life, and every citizen should therefore have reasonable access to essential telecom services.


It is an appropriate role for government to guarantee that access. However, universal service is a social subsidy program, not a telecom regulatory issue, and it should be treated as such. If low-income individuals (including those in Manhattan, NY) or all consumers in a very high-cost area (such as Manhattan, NV) need to be subsidized in order to afford the deregulated price of telephone service, the government should do so. But the telecom regulators shouldn’t be involved. Other government agencies are better suited for raising and disbursing telecom subsidies.

And ideally consumers, not companies, should be subsidized, by providing consumers who need it with enough additional buying power to afford unregulated rates. This would allow low-income individuals and residents of high-cost (usually rural) areas to participate in and benefit from a competitive market in the same manner as wealthier or more urban citizens. Buying power can be increased through a portable voucher that individuals use to buy services at market rates. To keep things as simple as possible and minimize the involvement of the telecom industry and telecom regulators in the subsidy process, telecom vouchers for low-income individuals could be issued automatically to individuals who already participate in the Department of Agriculture’s food stamp program. The low-income voucher could be adjusted based on the recipient’s zip code to bridge the difference between the unregulated retail price charged by the largest supplier in the market and some percentage of the national average price for unregulated service. To address individuals in high-cost areas such as Manhattan, NV, a “high cost” voucher could be mailed, upon request, to each home in the market not receiving food stamps. The size of the “high cost” voucher could be varied on a zip code or other geographic basis to equal the difference between the market’s largest service provider’s basic (unregulated) retail rate in that market and some affordability level.

If complete retail rate deregulation is “too radical” and would itself cause more decisional “gridlock,” the regulation of cable television rates might provide a less radical model. Cable rate regulation has been eliminated, except for “basic” cable, with remaining regulation focused on regulating “access” to the cable television system. Analogously, only basic “lifeline” telephone service would be rate-regulated.

Conclusion

Because the circumstances of every market are so different, it is impossible to have a single regulatory policy that will be optimal for all markets. Rather than condemn most markets to sub-optimal regulation (or deregulation), the better approach is to have a regulatory system that adapts dynamically to the circumstances of each market.


Then, where no regulation is needed (such as Manhattan, NY), there will be no regulation; where just a little regulation is needed (such as Manhattan, Kansas), there will be a little regulation; and, where substantial regulation is needed (such as Manhattan, NV), there will be substantial regulation.

Summary of Legislative Recommendations
Based on CITI’s “Remedies for Telecom Recovery” Program, October 2004

To minimize gridlock and to produce a law with lasting utility, a new telecom law should deal almost exclusively with two subjects:
• Principles, that most stakeholders can support, so that regulators (and courts) are clear about the statutory goals and objectives; and
• Process, so that final, sustainable decisions can be reached in a short period of time.

Conversely, any new statute should NOT deal with “substance” in the sense of embodying in law Congressional micromanagement of the telecom industry, particularly to resolve current industry disputes or to specify a particular regulatory policy. Any such embodiment is likely to be wrong or obsolete or both.

Principles:

A new statute should begin with a clear and concise statement of the fundamental goal of the law, perhaps modeled on the similar provisions of the current Communications Act. For example: “The purpose of this law is to establish and maintain an efficient, reliable and secure nationwide and worldwide telecommunications system that is capable, at a minimum, of providing all persons with access to basic telecommunications services. The Commission hereby established and State Commissions authorized by this law shall rely, wherever reasonably feasible, on competitive market forces to achieve this purpose and shall regulate telecommunications services and facilities only where and for so long as market forces are insufficient to achieve this purpose or are unable to prevent the abuse of consumers.”

A new telecom statute should then empower and require regulators to follow broad principles, such as:
• Competition is to be preferred in every market to protect consumers and encourage fair prices, innovation and efficiency.


• Network interconnection and the right of consumers to attach any devices to the network and to use telecommunications services without restriction are essential to a competitive market.
• Where competition is demonstrably insufficient to achieve the purposes of the law, regulation should be applied on a geographically granular basis to the minimum extent required to achieve the statute’s purpose or to protect consumers from pricing and service abuses.
• The Federal government has plenary authority over all telecommunications facilities and services. However, the Federal authority shall be delegated broadly by the FCC to State commissions when the varying circumstances of each locality or region require varying regulatory responses or policies.
• States may exercise authority, particularly traditional police powers, over telecommunications, telecommunications facilities and telecommunications services, provided that such exercise does not conflict with Federal law, policy or regulations. The FCC or federal courts shall preempt any conflicting State action.
• The FCC may conduct regulatory experiments of limited geographic scope and shall generally encourage States to experiment with regulatory policies by, inter alia, forbearing from applying Federal laws or regulations that interfere with the experiment.
• Neither Federal nor State regulators shall regulate the price, quality or other characteristics of retail telecommunications services (those predominantly utilized by corporate and individual consumers) in the absence of demonstrable consumer abuse.
• All carrier-to-carrier issues (including but not limited to such matters as collocation, access charges, reciprocal compensation, performance standards, and all other interconnection matters) shall be resolved exclusively by bilateral negotiation and, if the negotiation fails, by binding commercial arbitration of any unresolved matters.
• The Commission shall allocate and assign all radio frequency spectrum not controlled by the Federal government for government use in the manner it deems most efficient and equitable.
• Regulators shall be prohibited from requiring telecommunications service providers to be involved in collecting or contributing funds to support “universal service,” and regulators shall not require any implicit subsidies in any rate regulation.
• The Commission may, after due process, revoke blanket licenses for activities that constitute systemic untrustworthiness and may prohibit licensees from employing as managers persons who have a record of untrustworthiness in the telecom business.


Process

With respect to delegation of the plenary Federal authority to States:
• The delegation must include the directives and decisional standards needed to comply with Constitutional requirements and the new telecom law.
• In most cases, the FCC would hear initial appeals of decisions made by State regulators pursuant to the delegated authority.

With respect to experiments:
• A State may petition the Commission for authority to conduct a regulatory experiment of up to two years’ duration, including any necessary forbearance. Unless the Commission denies the petition within 60 days, the petition shall be deemed granted.
• The best evidence in proceedings before the FCC, State commissions or courts is the results of relevant State or Federal experiments.
• The Commission and State regulators shall forbear from applying any statutory provision or regulation on a market-by-market basis, or for all markets, or on a service-by-service basis, or for all services, if they determine that forbearance is likely to better achieve the statutory objectives than regulation. If a petition for forbearance is not acted upon within 180 days, the petition shall be deemed to be granted. Forbearance may be revoked or modified in a proceeding alleging consumer abuse by a service provider.
• All adjudicatory proceedings before the FCC shall be conducted by Administrative Law Judges, except where the Commission determines on a case-by-case basis that another process would be more efficient, fair and transparent.
• All appeals of the Commission’s decisions will be made to the Court of Appeals for the District of Columbia Circuit. State decisions administering the Federal statute are to be appealed to Federal District Court.

With respect to the determination of consumer abuse:
• The FCC will define “consumer abuse” and may issue standards to be applied in determining the existence of abuse.
• States have the initial responsibility for determining the existence of consumer abuse and for applying the least regulation required to eliminate any demonstrated abuse.
• The FCC would act if States refuse to consider petitions alleging consumer abuse.


• The FCC will hear appeals from State determinations regarding allegations of abuse, the decision to regulate or not to regulate as a result of the determination, and the appropriateness of any regulation imposed by the State. At the first successful appeal, the matter will be returned to the State for further action in light of the FCC’s decision. At the second or subsequent successful appeals on essentially the same case, the FCC may assume decision-making responsibility based on the State record and any new evidence it develops.

With respect to service provider negotiation and arbitration:
• Matters not resolved through bilateral negotiations shall be resolved by a proposed State Commission Order drafted by a commercial arbitrator.
• Parties may agree to any commercial arbitration procedure, but “baseball” arbitration (where the arbitrator may only select the entirety of one party’s best and final package of offers regarding all the unresolved issues) will be the default arbitration process (i.e., winner takes all).
• Parties can agree that an arbitration result will apply only to specified markets within a State, an entire State or any number of specified States, but a state-wide scope will be the default.
• The arbitration decision will be submitted to the affected State Commission for ratification; if a party challenges the arbitrator’s decision, the State must accord the arbitration result “substantial weight,” with the party challenging the arbitration decision having the burden of demonstrating that, overall, the arbitration decision is inconsistent with law or Federal policies or is likely to lead to significant harm to the public interest.
• Where the arbitration covers more than one State, an ad hoc panel composed of one State Commissioner selected by a majority of the State Commissioners from each affected State will consider the ratification, and the majority decision of the ad hoc panel will bind all affected States.
• If it does not ratify the arbitrator’s decision, the State Commission’s or ad hoc panel’s only recourse is to order another arbitration.
• An “opt-in” would be available for similarly situated carriers that choose to avoid negotiation.
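To make the default mechanism concrete, here is a minimal sketch of final-offer (“baseball”) arbitration; the function and the scoring interface are illustrative assumptions of this edition, not part of the CITI recommendations:

```python
from typing import Callable, Mapping

# A "package" bundles a party's best and final position on every unresolved
# issue, e.g. {"reciprocal_compensation": 0.7, "collocation_fee": 120.0}.
Package = Mapping[str, float]

def baseball_arbitration(package_a: Package,
                         package_b: Package,
                         score: Callable[[Package], float]) -> Package:
    """Final-offer ('baseball') arbitration: the arbitrator must adopt one
    party's complete package; issue-by-issue splitting is not allowed.
    `score` stands in for the arbitrator's overall assessment."""
    return package_a if score(package_a) >= score(package_b) else package_b

# Toy usage: the arbitrator scores packages by closeness to her own view.
arbitrator_view = {"reciprocal_compensation": 0.5, "collocation_fee": 100.0}
closeness = lambda p: -sum(abs(p[k] - arbitrator_view[k]) for k in arbitrator_view)

winner = baseball_arbitration(
    {"reciprocal_compensation": 0.7, "collocation_fee": 120.0},
    {"reciprocal_compensation": 0.4, "collocation_fee": 105.0},
    closeness,
)
```

The design intuition behind the winner-takes-all rule is that it pushes both parties toward moderate final offers, since an extreme package risks losing outright.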


With respect to Universal Service, the telecom statute should provide for a non-regulatory mechanism to support Universal Service:
• To support low-income individuals, every individual receiving a food stamp would also receive an additional telecom voucher from the agency issuing the food stamp. The dollar amount of the telecom voucher would be the difference between the unregulated retail rate for basic telephone service provided by the largest provider of service in the market (zip code?) and 115 % of the national average retail price for such service.
• All individuals in “high-cost areas” (for example, those where the unregulated price for basic service is more than twice the national average) who do not receive the low-income voucher would, upon application, receive a similar voucher.
• The telecom company providing the service selected by the consumer would redeem the telecom vouchers in the same manner and system used to redeem food stamps.
• Telecom vouchers should be funded from: a) the 3 % telephone excise tax (which shall not be increased by the telecom law); and b) if necessary, general revenues.
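The voucher arithmetic described above can be sketched as follows; the function names and the sample prices are hypothetical illustrations, not figures from the CITI program (which also leaves the high-cost affordability level open):

```python
def low_income_voucher(local_rate: float, national_avg: float) -> float:
    """Bridge the gap between the market's unregulated retail rate and
    115 % of the national average retail price, as proposed above."""
    return max(0.0, local_rate - 1.15 * national_avg)

def high_cost_voucher(local_rate: float, national_avg: float,
                      affordability_level: float) -> float:
    """High-cost-area voucher, available on application where the
    unregulated price exceeds twice the national average; the proposal
    leaves the affordability level open, so it is a parameter here."""
    if local_rate <= 2.0 * national_avg:  # eligibility trigger
        return 0.0
    return max(0.0, local_rate - affordability_level)

# Hypothetical monthly rates: national average $25, Manhattan, NV $60.
print(low_income_voucher(60.0, 25.0))       # -> 31.25
print(high_cost_voucher(60.0, 25.0, 30.0))  # -> 30.0
```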


7.6 Statement

Stefan Doeblin
Chairman, Network Economy Group

As a non-university member of this panel, I would like to present some provocative statements from the position of an entrepreneur. My intention is to support more efficient ways of doing telecommunications business successfully.

Do we need national regulations for telecommunications?
• Convergence and globalisation of ICT and media
• EU attempts to harmonize, but 25 national regulatory authorities in the EU
• 16 “Laender” authorities in Germany responsible for media regulations (constitutional law)
• Non-regulated IT industry is still growing on a global scale and will take over the telecoms industry

Figure 1

1. Do we need national regulations for telecommunications? (Fig. 1)

Telecommunications is a multi-billion dollar industry. Perhaps an even more important aspect of telecommunications is its infrastructure-based functionality for other industries in a global market. But regulation still remains at the national level. The media and content industry is becoming global but is regulated locally. Germany in particular regulates content through 16 Laender (states), and in Europe content is regulated by 25 member states. Germany has to deal with this “regulatory divide” forever, since it belongs to the unchangeable parts of its constitution established after World War II.


Why do we regulate the telecommunications industry?
• In Europe incumbents still dominate the market (80–90 % market share), but the cross-border consolidation process is difficult
• Anti-trust authorities versus regulatory authorities
• Regulation often drives investment of large telcos into smaller ones, which causes regulated management behaviour instead of competitive behaviour
• Standardization, quality, interoperability, compatibility, common business processes are needed globally – therefore we need large, cross-border telcos

Figure 2

2. Why do we regulate the telecommunications industry? (Fig. 2)

We have tried since the 1980s and 1990s to regulate the telecommunications market to improve competition. But in Europe, incumbents still cover 80–90 % of the market. We have no real regulation in the IT market. Yet IT is a very competitive market segment, including a few monopolies like Microsoft, and those are regulated by the anti-trust authorities. Why do we need special authorities for telecoms? If regulation is implemented to create competition, why don’t we use anti-trust authorities to separate incumbent carriers into different parts competing against each other?

Keeping incumbent carriers as they are, we need to invent rules for interconnection. A small carrier can use the connectivity of large carriers and grow. This means that part of the investment of an incumbent carrier is automatically made for small competitors, and its management is forced to be creative to avoid this and to promote investment only in non-regulated markets (such as IT, online services and the solution business, but not infrastructure and not content). If we keep near-monopolies, then we need complex regulation by civil servants, which is often far away from market conditions.


With the telco monopolies and their de-facto standards we are also lagging behind in standardization efforts, because the incumbents are now acting 100 % microeconomically. What we need is standardization! Quality, interoperability, compatibility, common business processes – these issues need to be solved on a global basis. New entities (hopefully funded and managed by private economic forces) should be created. The development of the necessary standards could be a task for infrastructure companies. Let us start in the US, Europe and Japan.

If regulation is still needed, do we regulate too much?
• Investment in telecommunications needs a stable and secure environment – there are conflicts of interest:
  – Large share of incumbent telcos and TV providers owned by the government
  – Political influence on regulatory authorities
• Broadcast, peer-to-peer content and IT applications are merging; TV is now on the internet and is becoming mobile (“Everybody becomes an interactive radio station” – Bertolt Brecht).
• Infrastructure will be key for all additional services – we need powerful infrastructure and access to it.

Figure 3

3. If regulation is still needed, do we have too much of it? (Fig. 3)

For investment reasons we need a stable and, above all, a predictable regulatory environment. But regulatory authorities depend on the political power of the currently elected government (influenced by the ideology of the ruling parties). A predictably bad regulation thus seems better than regulatory regimes that change after every election – especially if governments still own a large block of shares in the incumbent telcos or operate state-owned TV providers with mandatory end-user subscription payments. Countries such as Germany should therefore make proposals to end state ownership of Telekom shares and to transform the public TV system entirely into a private one, or at least to prohibit commercials on public TV.


If the government owned a large stake in Microsoft, we would have no anti-trust case in Europe and no open source movement for Linux in the government sector. So it is good that the government is not an owner of MS shares.

Why do we need content regulations, especially for TV? OK, there are some moral and ethical rules which a government (in Germany the Bundesländer) should control, but the rest could be reduced to the infrastructure question (how we receive which frequency). State-owned TV stations would be more efficient in a public-private partnership where the government requires slots for education, news and the arts and pays the TV station for them. The rest should be in competition with all others. Broadcast content, peer-to-peer content and IT applications are merging and should not be regulated, but they should cost something, paid for by users, companies or institutions. So I would like to quote Viviane Reding, Member of the European Commission responsible for Information Society and Media: “The Television without Frontiers Directive can no longer just be concerned with broadcasting. Television is now on the Internet; it is also going mobile. Admittedly, for the moment TV on the internet is small scale – but it will grow. We have to make sure it grows strongly and correctly. And for this we need the right, modern framework.”

What should be regulated?
• Digital rights, copyrights and software patents are global issues and there remain many open points (writers and artists have survived centuries without rights management; their future should be secured by sponsorships and payments)
• Rights of way and radio frequencies need national regulations
• Infrastructure investment needs exclusivity of use
• Separation of infrastructure from higher services would enable its faster roll-out
• Separation between voice and data is artificial. Universal services could be covered by the government, financed by income taxes and outsourced to operators.

Figure 4

4. What should be regulated? (Fig. 4)

We regulate streets and motorways, but not the vehicle market, except through anti-trust authorities and technical supervision. Media applications like TV, streaming and video on demand are certainly driving forces for broadband roll-out. Media is becoming more and more global, and with peer-to-peer applications the borders between content produced by third parties and content produced by ourselves become blurred. One could say that the old dream of Bert Brecht, that everyone will be an interactive radio station, comes true in the real broadband world.

Digital rights, copyrights and software patents are global issues and there remain many open points. These have to be solved to avoid content brokers and peer-to-peer brokers automatically becoming criminals. Just a few days ago, the two German IT associations (ITG and GI) recommended the complete abolition of software patents; this could be applied also to the content industry. Writers and artists have survived centuries without rights management; their future should be secured by sponsorships and payments from the market that commercially exploits them.

Splitting telecommunications into streets and vehicles means: rights of way and radio frequencies are subject to regulation by individual nations, which will hopefully standardize and harmonize the conditions and processes with each other. New infrastructure investments should have exclusive use (perhaps for a period) and should not automatically be shared with competitors who can only claim their inability to raise the sums necessary for building up infrastructure … Frequencies should be regulated and priced in order to assure efficient usage of the available spectrum. Interconnection and local access should be regulated as long as the market is dominated by the incumbents. But at higher layers (i.e. services) regulation should be more relaxed.

The separation between voice and data is artificial; therefore the discussion whether VoIP belongs to the data or the voice market is artificial as well. If you follow the same logic for the IT market: why shouldn’t Microsoft have to obey a universal service obligation and pay for it? After all, Windows is installed in nearly every household. If VoIP is the better technology, set it free – who cares about numbering? The internet regulates numbering in a non-governmental way. VoIP shows a non-national footprint. It is questionable why countries in Africa and Latin America prohibit VoIP only because the operators would lose 80 % of their sales (out of which 50 % must be paid back to countries with hard currencies). Universal services could be covered by the government, financed by income taxes and outsourced to operators.


Identifying the location of a caller could be solved by GPS rather than by a numbering plan – if it is necessary at all. Coming back to Europe: if the infrastructure business were separated from higher services, we could clearly regulate the former and globalize the latter more easily. National borders make no sense for the latter, and we would immediately have 25 former incumbents competing heavily with each other in Europe.

Appendix

List of Speakers and Chairmen

Prof. Jeff Anderson
Graf Goltz Professor and Director, BMW Center for German and European Studies
Edmund A. Walsh School of Foreign Service
Georgetown University, ICC-501
Washington, DC 20057, USA

Robert Atkinson
Director of Policy Research
Columbia Institute for Tele-Information, Columbia Business School
1A Uris Hall, 3022 Broadway
New York, NY 10027-6902, USA

Prof. Robert Calderbank
Department of Electrical Engineering
Princeton University
Olden Lane
Princeton, NJ 08544, USA

Gary A. Cohen
General Manager, Global Communications Sector
IBM
1133 Westchester Ave
White Plains, NY 10604, USA

Stefan Doeblin
Chairman, network-economy S.A.
Rue Berckmans 109
1060 Brüssel, BELGIUM

Prof. Dr. Michael Dowling
University of Regensburg
Chair for Innovation and Technology Management
93040 Regensburg

Jacques Dunogué
Alcatel HQ Paris
54, rue La Boétie
75008 Paris, FRANCE

Prof. Dr.-Ing. Jörg Eberspächer
Munich University of Technology
Institute of Telecommunication Networks
Arcisstr. 21
80290 München

Thomas Ganswindt
Member of the Managing Board, Siemens AG
Wittelsbacherplatz 2
80333 München

Prof. Dr. Justus Haucap
Ruhr-University of Bochum
Industrial Economics and Competition Policy
Universitätsstr. 150, GC3/62
44780 Bochum

Christine Heckart
Vice President Marketing, Juniper Networks
1194 North Mathilda Avenue
Sunnyvale, CA 94089, USA

Prof. Dr. Thomas Hess
Munich School of Management
Institute for Information Systems and New Media
Ludwigstr. 28
80539 München

Prof. Stephen C. Littlechild
Senior Research Associate, Judge Institute, University of Cambridge
White House, The Green, Tanworth-in-Arden
Solihull, W Midlands B94 5AL, GREAT BRITAIN

Prof. Dennis Lockhart
Georgetown University
MSFS, ICC 811, 37th and O Streets, N.W.
P.O. Box 571028
Washington, D.C. 20057, USA

Dr. Karl-Heinz Neumann
General Manager, WIK Wissenschaftliches Institut für Kommunikationsdienste GmbH
Rhöndorfer Str. 68
53604 Bad Honnef

Prof. Eli M. Noam
Director, Columbia Institute for Tele-Information (CITI)
Professor of Finance and Economics, Columbia University
3022 Broadway, Uris Hall, I-A
New York, N.Y. 10027, USA

Eckart Pech
President and CEO, Detecon, Inc.
10700 Parkridge Blvd., Suite 100
Reston, VA 20191, USA

Prof. Dr. Dres. h.c. Arnold Picot
Munich University
Institute for Information, Organisation and Management
Ludwigstr. 28
80539 München

J. Gregory Sidak
F.K. Weyerhaeuser Fellow in Law & Economics Emeritus
American Enterprise Institute
1150 17th Street, N.W.
Washington, D.C. 20036, USA

Prof. Dr. Rolf T. Wigand
University of Arkansas at Little Rock
Information Science Department
2801 South University Avenue
Little Rock, AR 72204-1099, USA