The Vulnerable Fortress
Bureaucratic Organization and Management in the Information Age
In the past decade or so, organizations in both the public and the private spheres have invested - rapidly and massively - in office information and communication technology, to the point of near market saturation. Yet as one study after another has demonstrated, computerization has had no measurable positive effect at all on organizational productivity. As consternation replaces euphoria, we are compelled to ask: Why did the technology not work the way it was supposed to? Does its failure have something to do with the nature of organization itself? The Vulnerable Fortress addresses the questions of why established modes of administration are threatened by information technology and what kinds of changes will take place in the future world of management. Rather than being a buttress for managerial authority in bureaucratic organizations, argue James R. Taylor and Elizabeth J. Van Every, the new communication technologies are both an opportunity and a trap, and in either case they require a radical rethinking of our ideas about management and how it works. Basing their reasoning on contemporary principles of communication theory, the authors provide us with a reimagining of the nature of organization. JAMES R. TAYLOR is chair, Département de Communication, Université de Montréal. He served as president of the Canadian Communication Association during 1992-93. ELIZABETH J. VAN EVERY is a sociologist and writer.
The Vulnerable Fortress
Bureaucratic Organization and Management in the Information Age
James R. Taylor and Elizabeth J. Van Every
with contributions from Hélène Akzam, Margot Hovey, Gavin Taylor
University of Toronto Press
Toronto Buffalo London
www.utppublishing.com © University of Toronto Press Incorporated 1993 Toronto Buffalo London Printed in Canada ISBN 0-8020-2948-5 (cloth) ISBN 0-8020-7773-0 (paper)
Printed on acid-free paper
Canadian Cataloguing in Publication Data
Taylor, James R., 1928-
The vulnerable fortress : bureaucratic organization and management in the information age
Includes index.
ISBN 0-8020-2948-5 (bound). - ISBN 0-8020-7773-0 (pbk.)
1. Bureaucracy. 2. Organizational change. 3. Office practice - Automation. I. Van Every, Elizabeth J. II. Title.
HF5548.2.T3 1993    658.4'02'02854    C93-093393-1
Contents
Acknowledgments
Preface

Part One: Organization in the Information Age
1 A world in flux
2 Coping with office automation: the conversation and the text
3 Beyond the machine metaphor
4 Organization as talk and text

Part Two: Management in the Information Age
5 The changing transactional environment
6 The evolution of software: the new text
7 Managing in the information society
8 The fall of the fortress?

Notes
References
Index
Acknowledgments
There are so many people who have helped with the production of this book that it is hard to know where to begin in giving them credit for their support. Probably the best starting place is the federal Department of Communications (now called Communications Canada, but still better known as DOC), since the idea to write a book on the information age was first stimulated by a dialogue with Richard Stursberg, then assistant deputy minister responsible for planning. The first version of the book was in fact an attempt to write what might have eventually become a government green paper. That project never came to fruition, but it left us with an unpublished manuscript. It was also the starting point, during a sabbatical in 1988 (which was supported by the Institute for Research on Public Policy, or IRPP, and its president, Rod Dobell), for another continuing dialogue with Steve Rosell, who was directing the IRPP's research program on governance. The Governing in the Information Society Roundtable (described in the book) grew out of this collaboration, and many of our ideas on management come from that experience. We have learned more about the public dimension of administration today from listening to many of Canada's most experienced, and savvy, public servants than we ever could have absorbed from the literature. Our debt to DOC has many facets. It began supporting our research at the Université de Montréal on the organizational meaning of technological change fully twenty years ago, when Richard Gwyn was still there. More recently, Taylor's stint as planning adviser to the deputy minister in the early 1980s became an opportunity to mount, in mid-decade, an ambitious program of field research in the implementation
of automation technologies in five government departments. This led to a productive association with Jacques Lyrette of the Canadian Workplace Automation Research Centre (CWARC) as his scientific adviser and in turn to an ongoing dialogue with Stephen Leahey, then an assistant vice-president at Bell Canada and subsequently a partner in a study that Taylor undertook in collaboration with CWARC. On the basis of this work and the IRPP project, and in conjunction with collaborators at the Université de Montréal, André Lafrance and Nicole Giroux, we were fortunate enough to obtain a Social Sciences and Humanities Research Council (SSHRC) strategic grant, which is still in progress. Part II of this book reports on work conducted under that grant. No account of the debts we owe would be complete without a tribute to the work of the graduate students with whom we have been privileged to work. The research of two of them, Hélène Akzam and Margot Hovey, is the basis of chapter 2 of this book, and chapter 1 was constructed with the help of two others, John Patterson and Gavin Taylor. But many others could be mentioned: Luc Asselin, Françoise Bélanger, Henri-Paul Bolap, François Cooren, Narcisse de Medeiros, Martine Fréchette, Jean Gauthier, Geoff Gurd, Manon Jourdennais, Sylvie Lavoie, Loren Lerner, Jo Katambwe, Daniel Robichaud, Bernadette St-Jean, Alain Saumier, Marie Tardif, Nadine Tremblay, Sandra Verrault, and Huiru Xue, to mention only a few of those who have maintained the sort of conversation which is the ground from which springs every tree of ideas. Among others in our field, both Lee Thayer and Brenda Dervin were particularly warm in their support of this project, and we appreciate their help. And finally a word for colleagues from whose thinking we have borrowed, no doubt even without knowing that we were: David Conrath, Stan Deetz, Ralph Ginsberg, Fred Jablin, Klaus Krippendorff, Ken Mackenzie, James March, Gareth Morgan, Linda Putnam, and Cynthia Stohl, and, even though we never met either of them in person, Karl Weick and Harold Innis (the latter deceased), the influence of both of whom is visible throughout the book. There are other colleagues we would like to thank, although we cannot name them - our anonymous readers, from whom we benefited greatly, and whose criticism of the original draft provided the basis for a complete revision of the text. To John Parry, who has so artfully edited our manuscript, we owe a special debt.
This book has been published with the help of a grant from the Social Science Federation of Canada, using funds provided by the Social Sciences and Humanities Research Council of Canada.
Preface
The decisive reason for the advance of bureaucratic organization has always been its purely technical superiority over any other form of organization. The fully developed bureaucratic mechanism compares with other organizations exactly as does the machine with the non-mechanical modes of production ... Once it is fully established, bureaucracy is among those social structures which are the hardest to destroy ... Where the bureaucratization of administration has been completely carried through, a form of power relations is established that is practically unshatterable ... Under normal conditions, the power position of a fully developed bureaucracy is always overtowering.
Max Weber, in H.H. Gerth and C. Wright Mills (1958), From Max Weber: Essays in Sociology, 214, 228, 232

The typical large organization, such as a large business or a government agency, twenty years hence will have no more than half the levels of management of its counterpart today, and no more than a third the number of 'managers.' In its structure, and in its management problems and concerns, it will bear little resemblance to the typical manufacturing company, circa 1950, which our textbooks still consider the norm. Instead, it is far more likely to resemble organizations that neither the practising manager nor the student of management and administration pays much attention to today: the hospital, the university, the symphony orchestra. For, like them, the business, and increasingly the government agency as well, will be knowledge-based, composed largely of specialists who direct and discipline their own performance through organized feedback from colleagues and customers. It will be an information-based organization.
Peter F. Drucker (1989), The New Realities: In Government and Politics, in Economics and Business, in Society and World View, 207
Introduction

We have spent some fifteen years trying in one way or another to come to grips, from the point of view of communication science, with what ought to be a relatively straightforward question, but it is not (Drucker to the contrary): How is the introduction of new office information processing and communication technologies changing the way in which bureaucratic organizations are structured? We are well aware that there are a multitude of other intriguing questions that we could ask about what appears to be a communications revolution in the workplace - questions concerning the quality of working life, productivity and competitiveness, privacy and individual freedom, international relations, and the stability of government, to mention only some. But to us, as communication theorists, these are all secondary to an underlying question, which must be answered first. What is the relationship between organizations and their communication/information technology, and, more particularly, how does organization mediate technology and technology affect organization? In other words, most of the supposed 'effects' of technological change, we would claim, are caused not so much by the technology as by the way we go about putting it to use. We can have better or worse conditions of work, have more or less freedom, be better or worse informed than before, have better or worse government. These decisions will be shaped not by the technology but by the kind of organizations that we construct and reconstruct over time to carry out our collective social and economic functions. We do not mean to imply, however, that the communication technology is neutral: like the Canadian economic historian Harold Innis, we consider it to have a 'bias' which permits the conquering of either space or time (and, in the process, favours the emergence of one kind of organization over another). But its effects are mediated, we maintain, by the kind of institutional arrangements that we construct for ourselves, which are likely to vary significantly among societies and over time.

Definition of terms

We should make our own theoretical biases clear from the start. Our understanding of concepts such as organization, modern bureaucracy,
authority, and social change is based on the processes of human transaction that occur in their formation. When we think of an organization, for example, we do not have in mind some kind of body, such as a machine, or an organism, or a brain, or an actor with a personality; we are conceiving of it instead as a universe of communication. An organization, as we visualize it, is nothing but a fabric of communication: a collection of people in a process of talking, writing, and transacting with each other (J.R. Taylor 1988). That, when you think about what can actually be observed and described, is the reality.1 Everything beyond those processes that we normally attribute to 'organization' is a product of our imagining. From this perception flows a simple proposition: changes in the technology of communication may conceivably alter the pattern of how people communicate with each other. And when this happens, you have in effect modified the structure, indeed the very nature, of the organization. By definition. Having said this, we should back off a bit. If it is true that, intrinsically, an organization has no material existence distinguishable from its pattern of communication (unlike other objects, which can be touched, tasted, smelled, seen, or heard, directly), it is equally true that it has a symbolic existence that is more than the sum of its communicational parts. That this is so is a tribute to the extraordinary power of words and language. By naming an object we create it, not as part of material existence, but in the realm of the communicational. By giving an organization a name, we entitle it to behave as if it existed, communicationally speaking: to enter into contracts, to engage help, to appear before Parliament, to issue press releases, to make yearly reports, to make money. We endow it with a personality: it is seen to be daring, or conservative, or recalcitrant, or oppressive. Even more to the point, we take it to be constructed like any other being that really does exist materially: we come to believe that it has divisions, branches, departments, that it has a head and feet, and that it has a nervous system.2 All these properties come to be inscribed in the form of laws and regulations and procedures and thus become not only a comprehensive description of the real communication universe (otherwise too diffuse to be grasped intuitively), but equally its constitution and its guiding principle. And these are real - even if only symbolic. The tension between two versions of organization - the one that says that organization is nothing more than a configuration of interaction, and the other, which says that it is an object of conceptualization to which we attribute structural properties and onto which we project communicational rights - is the theme that moulds our argument. We understand the state of that tension to be reflected in the present situation: practitioners and theorists alike are finding it increasingly difficult to square the actual practice of bureaucratic organization with the properties that we have imagined it to have. This cross-cutting of perspectives comes into play most notably in our conceptualization of bureaucratic organization, and 'rational' administration in general. From our point of view, bureaucracy is constituted by, and in, communication, and we must again decide whether our concern is with its typical patterns of interaction (the interactional perspective) or with its imaginary properties and their real communicational effects (the symbolic perspective). Bureaucracy has been, for the better part of the twentieth century, the primary instrument of social and political control, and the primary agent in maintenance of economic monopolies and oligopolies in our society. When it is affected by change, we are all liable to feel the consequences.

The Weberian 'ideal type' of a bureaucracy

Max Weber (1946) was among the first of a line of economic historians (Coase 1937; Chandler 1962, 1977, 1990; Williamson 1975, 1977; and Beniger 1986, among others) to understand that the emergence of the modern version of bureaucracy occurred in the nineteenth century in lockstep with the development of new communication technologies which provided for improved delivery of mail, the spread of the railway, introduction of telegraph service, and, toward the end of the century, telephones and typewriters. 'Among essentially technical factors, the specifically modern means of communication enter the picture as pacemakers of bureaucratization ... The degree to which the means of communication has been developed is a condition of decisive importance for the possibility of bureaucratic administration ...' (Gerth and Mills 1958, 213). The idea here is simple: as communication technologies evolve to facilitate new patterns of association, they open the door to different economic groupings, which then have an organizational effect because they alter the interactive flows and the way in which transactions are carried out. Weber and others went further, however, and their second step
transcended the idea of organization as a mere fabric of communications, or even as a utilitarian means of doing business. To capture what they perceived to be the essence of the control and rationality embodied in this new kind of organization, early and influential analysts of the workings of the modern bureaucratic organization began to employ the analogy of the machine. Like others of his time, including Fayol in France and Taylor in the United States, Weber adopted the conception and on the basis of it constructed what he called an 'ideal type' of the emerging new brand of administration system. 'The professional bureaucrat is chained to his activity by his entire material and ideal existence. In the great majority of cases, he is only a single cog in an ever-moving mechanism which prescribes to him an essentially fixed route of march. The official is entrusted with specialized tasks and normally the mechanism cannot be put in motion or arrested by him, but only from the very top. The individual bureaucrat is thus forged to the community of all the functionaries who are integrated into the mechanism. They have a common interest in seeing that the mechanism continues its functions and that the societally exercised authority carries on' (228-9). The introduction of a machine metaphor did more than supply us with a model; it effectively reified, or made material, the concept of organization. Machines are built along clear rational lines which we all understand. If an organization were henceforth to be denominated a version of machine, then the same rationality could be attributed to it, with all that implied in an industrial society where the concept of machine had so recently taken on a status of ultimate legitimacy. In Weber's logic, an organization's parts were programmed to perform tasks according to pre-established criteria and coordinated within a tightly coupled communication network, obeying strict rules of interconnection ('following channels,' as it has come to be known). The result, however unattractive he found it on moral grounds, was nevertheless, he believed, an instrument of governance more powerful and reliable than any previous pattern known. Once established communicationally, the bureaucratic form of administration was practically invulnerable to assault from outside or inside. In its own way, it was a fortress, protecting the agencies of power from the vicissitudes of their environment. Were this all that Weber had written about bureaucracy, we could safely consign him, along with his contemporaries, to the shelf reserved for historical curiosities. But there was a further side to his
thinking, which has no echo in the other writings on administration of that time but must be taken into account here. He had an understanding, based on broad historical research and his own deep indoctrination in jurisprudence, of the critical role of authority and legitimacy in the foundation of governance of all kinds, whether nations, states, or companies and public administrations. Weber perceived that the legitimacy and the authority in the bureaucratic type of organization derived precisely from its supposedly implacable 'rationality,' which could be attributed to its apparently flawless, machine-like performance. This was an institutional step: now the bureaucracy's written edicts would be based on legally supported procedures, carrying weight that could not easily be denied. By a sequence of subtle semantic transpositions, what had started out merely as a practical rearrangement of the modes of commerce, occasioned by the availability of new technologies, had taken on object status and, in turn, legal status. On such foundations, entire economic and political systems are sited.

Bureaucracy and communication

In retrospect, it is clear that Weber's early characterization of modern rational management practice got some things right. He saw the correlation between the growth of the communications infrastructure and the coming to prominence of bureaucratic practices of management. Whatever else it accomplished, bureaucracy was what 'made the trains run on time.' It is equally evident, in our view, that however compelling his machine analogy of bureaucracy and organized management may have been, that analogy, even allowing for his own proviso that the model he presented was an outright idealization, contained some important errors and left certain crucial aspects of organization unexplored. When, a few years later, social psychologists and sociologists were finally to venture into the field to observe bureaucracy in action, they discovered a communication universe of interaction that paid little more than lip-service to the machine model of neatly integrated functions and linkages. Here, it seems to us, is the root of the matter. Bureaucracy was certainly more efficient than the patronage-ridden (and locally grounded) system that it replaced, for the operation of an extended network. It had the further advantage of appearing fair - an important
selling point in a society which, because of better communications, was throwing together people of diverse origins (pulled out of formerly isolated communities) in interaction that invited conflict, in the absence of some overriding principle of equity in employment. Bureaucracy also became an opportunity for advancement for the more upwardly mobile members of a growing middle class. It brought with it a comparatively attractive life-style, with security of employment, professional prestige, and opportunities for participating in and directing the consolidation of power. It was not just its technical superiority, in other words, that turned bureaucracy into the dominant modality of management for the twentieth century. That so-called technical superiority was in fact something of a mirage. What Weber failed to observe, or was prevented from observing by his commitment to a machine metaphor, was that the bureaucratic organization had within itself the seeds of its own destruction. These seeds were communicational. The worm in the apple of administrative rationality had to do not with any supposed machine-like properties of rational administration, but with another theme present in his work, even though not characterized quite so explicitly: modern bureaucracy's own internal communication properties. While Weber and others associated the arrival of bureaucracy with improvements in communication technology, the alleged improvements were modest at best, by today's standards. A finer-grained analysis of Weber's writings reveals the extraordinary restrictions on communications that he assumed as necessary therein. Those restrictions (although one has to read between the lines to see this) were crucial to the model, because it was in the monopoly of knowledge, in an imperfect communication system, that all the authority and the power of the bureaucracy were located. We have used the expression 'seeds of destruction.' The fact is, whatever the restrictions they place on communication in order to create monopolies of knowledge, most modern bureaucracies are exceedingly poor systems of communication overall. The results, as the administration of an organization grows in size, are striking inefficiencies (quite unlike a machine). As the cost of hiring well-educated people to staff the ranks of management rises, the cumulative result is an ever-increasing communication price to be paid, which eventually becomes an unbearable burden. Furthermore, bureaucratic management of the kind that Weber
described has a self-perpetuating character. He noted the pattern that we still find of proceeding by 'files.' But, as it turns out, files are wonderful raw material, from the machine perspective, and can be generated and regenerated without limit, to provide as much work as is needed to justify an ever-growing staff (J.R. Taylor 1993). The vast improvements in communication technologies that we are now witnessing serve to make these inefficiencies stand out in bold outline. They reveal the inflated costs of bureaucracy and provide lower-cost alternatives. Bureaucratic authority based on patterns of restricted communication and the monopoly of knowledge is being undermined by technologies which instead enhance communication to the point of actually raising questions about the legitimacy of the established model of governing. In using the term 'bureaucracy,' we, like Weber, have a specific historical referent in mind, although it is not the same as his. We do not mean just any large and all-encompassing system of administration, that is, bureaucracy in the generic sense of the term. Instead, we refer to the specific forms and social practices that have become the norm of the twentieth century. We have no doubt that we shall continue to see administrative systems in the future that structure the collective work world in many ways that are just as rigid as those described by Weber, or even more so. But we do not think that those organizations represent the whole story; there are emerging forms of organization that no longer have the same institutional properties, from a social and political perspective, as those that we now know. And that is the point: such 'institutional properties' have, until recently, underlain our whole system of governance. If they change, so does the system. And we will no longer be the same people. So it behooves us to pay attention.

The problem of identifying change and effect

Our theme is institutional change, but the identification and analysis of change in social contexts are difficult tasks. In fact, perhaps the single most intractable problem in social science over the years has been how to deal with time.3 The methods of the exact sciences applied to time series (serial correlation, spectral analysis, and similar manipulations of data), for example, are not particularly helpful for the understanding of a wide range of social macro-phenomena, because the latter - such things as institutional change, the evolution of
culture, and diachronic patterns of organizational structure and process (the subject of this book) - may be essentially non-continuous and marked by disjunctions. People trained in a strict operationist and quantitative mode of analysis tend to look askance at some of the more intuitive, interpretive methods of the historian and see the latter's explanations as failing to meet adequate criteria of exactitude and reliability, from an operationist point of view. Yet the alternative is even less attractive: to pigeonhole questions of social evolution on the principle of the Drunkard's Search - namely, if the question is not amenable to study by an accepted method, drop the question. What is clear to us is that humanity has traversed a half-century of the most astonishing developments in communication technologies. It would be easy to quote statistics to support this claim (and some of those will turn up in the pages that follow), but we confess that we use a different yardstick. All we have to do is look back at some of our own life histories to convince ourselves of the immensity of the change that has occurred. One of the authors of this book, for example, entered undergraduate studies a year before the first general-purpose electronic computer was built and in operation (and three years before the discovery of transistors). The field in which he has now worked for twenty-five years, communication science, did not even exist as an academic discipline when he started university. Television, in which field he worked for ten years, was still several years off in Canada, and commercial air travel was an oddity for most of the population. If we were to go back only a bit earlier than that, to 1942 (just fifty years ago), he was living in a society (rural New Brunswick, Canada) that had no electricity, where telephones and radios were a rarity (and neither worked very well), where only a few people subscribed to daily newspapers, and where even road travel by car was impossible for fully half the year (and disagreeable the other half, because of the clouds of dust). We do not need statistics to tell us that something important has been going on; we seem to have made collectively a long trek over the last five decades and now to be poised to gaze out, like the wagoneers of another age poised on the western half of the Rockies, about to marvel at the opportunities and new spaces of the lush valleys of British Columbia (or, sobering thought, perhaps on some less attractive landscape). The statistics tell us little more than our own subjective intuition does. They might demonstrate with greater exactitude how much technology we have acquired, quantitatively
speaking, but that would add nothing essential to what we already know; the statistics would not inform us as to the social and political meaning of this extraordinary revolution. That is a qualitative question, and one that puzzles us all. Moreover, the transformation of organizations that we have witnessed over the same period was at least as radical as the technological upheavals. Both of the authors grew up during a time when education was financed and run by a local school committee, when roads in many parts of Canada were maintained by cooperative community action, when health care was much more catch-as-catch-can than it now is, and when the principal centres of government were not located in Ottawa or Washington but somewhere much closer to home. We have, collectively, watched organizations grow by leaps and bounds in the postwar period, side by side with the introduction of sophisticated communication and information-processing technologies. That growth continued until about 1975, when what had seemed like a positive correlation between communicational capacity and organizational size turned negative. In retrospect, it seems that, overnight, the growth of bureaucracy went from expansive to contractive, throughout North America, in both the public and the private spheres. This abrupt (by historical standards) turn-around went largely unnoticed at the time; more immediate events, such as recessions and elections, masked the longer-term secular pattern. Yet we think that it was a beginning of the end for the kinds of organization that we used to believe would last forever, and hence the beginning of the end of the political system of control that reposed on them. An understanding of how breaks of this scale can occur, and of the role of communication in explaining them, constitutes our subject matter. Our approach differs from others that might equally be offered precisely in its origins in communication theory and a communication perspective. Communication technology is unique because it is the infrastructure that simultaneously supports our actions instrumentally and serves as the means by which we give them meaning.4 When this kind of technology evolves, the effect of its evolution may produce anomalies. The reason is simple: the dynamic of change occurs differently for the two parts of our experience. Systems of communication, as interaction, normally change incrementally and continuously, even though their accumulation (as we shall see in chapter 6) may appear to produce discontinuities. Descriptions of those systems may also evolve
by small steps, but they may also become discontinuous at certain junctures and so evolve on a different principle, that of discrete leaps - or what Thomas Kuhn (1970) called 'paradigm shifts.' Both interaction, and our descriptions of it, may appear to display a similar pattern of evolution, but there is a difference. Each follows its own internal logic. Social science has sometimes been quite good at measuring interaction and terrible at dealing with description,5 and it has largely ignored the relationship between them. It is that relationship that concerns us. Our thesis can now be stated simply: the implementation of new communication and information technologies in bureaucratic organizations over the past generation has produced an anomaly. The reality of organizational process no longer coincides with our concepts of it. As a result of some of the stresses precipitated by the introduction of new technologies into bureaucratic-style organizations, the model of bureaucratic communication and organization on which we have been accustomed to rely no longer describes very well the communicational reality which it is supposed to represent. This disjuncture, we further claim, occurs at a critical phase in a historical evolution that has been under way for at least a century and probably much longer.6 If historical processes could be seen as having fault lines, we would say that we now stand on the edge of one of them. Or, to employ another metaphor from nature, since our whole idea of how democracy should work has fashioned itself, in our time, around the assumption of bureaucracy (the machine version), much in the way in which a river conforms to the rocky outcroppings that break its surface, a change of this magnitude represents the passage not just to a different way of doing business but to a different society and a different political system.

Summary of chapters

This book is a journey through the experiences, the perceptions, and the theory that have led us to make these claims. We have divided it into two parts. The first has to do with concepts of organization, how they intervene to affect both the design and the implementation of new technologies and why they may explain important (and unexpected) consequences that have followed from computerization. Part I begins with a chapter describing the flux which we perceive to be the central reality of our current organizational and working
life. As the tertiary sector has come to predominate in our economy, and vast transformations have been effected in the information infrastructure, new patterns and models of enterprise are evident. This raises the important question of whether we are developing the organizational and institutional capacity to deal with the changes, and if we are not, how we should begin. Over the last ten or fifteen years, communication technologies have been introduced into the workplace in the form of systems for computerizing many organizational procedures. Chapter 2 provides a close-up view of two experiences of 'automating' office communication systems, one carried out from the point of view of users of the technology, and the other, from the point of view of system designers. The office automation experience fused technology and the practices of organizational management. It also provided an opportunity to learn more about the discrepancy or 'tension' between the image of organization and that of the technology that automation seems to bring to the fore. Chapter 3 explores the logical implications revealed by computerization of buying into the metaphor of the organization as a rational machine - a metaphor that underlies much of traditional management theory. It also introduces some new approaches to management theory; they take into account the communicational aspects of organizational activity that have been highlighted by attempts at computer systemization. If not as machines, how do organizations work? So that we can begin to explain with greater success some of the changes occurring at the juncture of organization and technology, we develop in chapter 4 a set of principles that form the main outlines of a communication theory of organization. Rather than a rational machine, our model of organization is a 'discourse' of spoken and written communicational transactions: the organization is regenerated communicationally in a dialectic between 'conversation' and representation or 'text,' each having its own logic. Part II of the book is concerned with a related but distinct issue: how to manage in the new environment, given the different image, which we develop in the first part, of what an organization is. In chapter 5, we look once again at the larger picture, already outlined in chapter 1, but this time from the perspective of a manager in a conventionally structured organization. Pressures towards both globalization and fragmentation are being experienced by large-scale bureaucratic organizations, partly as a result of developments in
telecommunications and computing. We take stock of some of the structural changes described in chapter 1 in the light of new transactional patterns that are evolving. These transformations appear to be creating 'anomalies' between what is happening and how it is being administered that require a reconceptualization of organization and the management function as a whole. Computer software is a text. In the organization, where it is now one of the instruments by which transactional systems are textualized, it can become a text that is an instrument of centralized control, or, instead, one that is an expression of the many transactional systems that coexist. In chapters 6 and 7, we explore the implications of these alternatives for management and seek clues from new developments within the software itself for developing a post-bureaucratic administration. The inconsistencies in traditional models of organization that have increasingly been highlighted by developments in computerization and the extension of telecommunications in the workplace and the economy are forcing people to ask new questions and seek new metaphors. We suggest that the changes under way at the present time are communicational and that they are making many of what we think of as our most impregnable institutions vulnerable.
Part One

Organization in the Information Age
1
A world in flux
We are in the midst of a fundamental economic and social transformation whose extent and implications we only partially grasp ... This transformation and the more richly interconnected, complex and turbulent world, the vast increase in information availability, and the compression in both time and space that result, has been labelled 'the information society'. In this ... environment, older ways of organizing and governing, which are premised on a more restricted flow of information and more limited interconnections (including public and corporate bureaucracies, and even representative democracy and the nation-state) seem to be overwhelmed. S.A. Rosell et al. (1992) Governing in an Information Society, 3
Introduction
This is a book about organization, and our ideas of organization, and how, by transforming the organization, the realities of the so-called information age are also making our ideas of organization obsolete. The result is that our notions about how to administer the organizational processes that support the institutional structures of our society are no longer very realistic. We are in danger of losing the art of managing - in both the ordinary and the technical senses of the term: both of coping, and of administering. This introductory chapter first sketches an image of that elusive phenomenon, the 'information age' (or the 'information society' or the 'information era' or the 'information economy'), as it relates to the transformation of our systems of work and their organizational
correlates. It then outlines what we perceive to be the difficulties that entry into this new era has thrown up to management. The chapter is about the flux that seems to be the central reality of current organizational and working life in the final decade of the twentieth century, and how we are dealing with it. As we see it, a restructuring of the economy is gradually becoming visible. Much of the organizational change engendered by this restructuring is associated in our minds with the extraordinary transformation of the communication and information-processing infrastructure. Expansion, both geographical and functional, of that infrastructure is creating a new 'space' that has become an environment for the conduct of transactions that differ both in kind and in circumference from those of the past. The introduction of newer communication technologies is thus fuelling a shake-up of whole industrial sectors and a reshaping around new poles of development that are increasingly based on knowledge. In the process, it is also altering the scale and scope of organizational operations. We look in this chapter at a multifaceted pattern of effects - a proliferation of small business firms over the past two decades, accompanied by a decrease in the number of large corporations, but simultaneously an increasing trend to monopoly in world distribution networks and profit accumulation. Canada, for example, is becoming a nation of more, smaller organizations (atomization), often linked in multi-centred networks, while the concentration of ownership becomes greater through take-overs, buy-outs, and the formation of conglomerates (globalization). It sounds paradoxical, but it's not. It depends only on what part of the economy you are looking at. We also consider another sign of the times: as the automation of routine tasks increases, a new type of knowledge worker is coming to the fore, one who specializes in unravelling the complexities that computers themselves create - someone who can fill in the spaces left by corporate concentration. These are more than just straws in the wind. The information economy, about which we have heard so much, may finally be starting to reveal its shape. Against this backdrop of economic upheaval, with its connotations of growth and opportunity, there is another reality: we don't seem to be managing very well. Even though Canada and the United States are the most advanced economies in the world in the handling of technical information, the past decade, which is when we truly entered
into the 'information era', has been a time less of triumph than of anxiety, marked by growing loss of confidence in our large organizations, both public and private, which we used to think invulnerable to the vicissitudes of change. While we may have plunged into massive investment in communication-related equipment for office workplaces, the results are at best ambiguously positive and at worst picayune. There are success stories, but failures to adapt to the new realities are even easier to cite. Office productivity was supposed to rise as a result of the communication technology, but it fell. Users of the most sophisticated technology have not necessarily been the most successful. The majority of managers are probably confused and uncertain as to how to implement a technology that they neither understand nor can ignore. There is a restlessness afoot that is hardly consistent with an image of entry into Jerusalem, riding on the back of technology. A recent best-seller, Managing in Chaos, seems to have caught the mood of the time better than most. It is this tension between opportunity, and general consternation as to what to do about it, that was the motivation for writing our own book.

Technology and the organization of work

Let us begin by considering several images of what is happening in the workplace at the day-to-day level as computerization and telecommunicating become an operational given for enterprises of every size and shape. The first few scenes illustrate situations where computers, and telecommunications, are being used to perform the kinds of tasks that most of us think of as typical of what a company computer does - what we call the 'old' information environment.

The 'old' information environment

Imagine the following. You are in a suburb of Metropolitan Toronto. The landscape is dotted with ultramodern office buildings and apartments. The building (or rather the complex) that you are approaching is low-key and unpretentious, though not unimpressive in its underplayed way. A curious fact catches your attention: the only identification visible is a street number - no company logo, no name; just an address. The lobby is also large, modernistic - and impersonal. There are
the usual chairs, carpets, and plants, but no identifying labels. If you were to ask the receptionist the name of the company, you would be told - politely - that this information was not available. You could even apply for a job, but you would discover that the application form also had no identifying name or address, other than a postal box number for you to mail your application to. When you got past the lobby, with your guide (because you would have to have a guide), you would have to stop at not one but several checkpoints, each manned by a security guard, and each requiring the use of a special pass. At this point, you would realize that the complex is made up of several buildings - three, to be precise. The most important (though not the largest), and the raison d'être for the rest, is the computer centre, where none but the specialized staff is allowed to enter. The computer centre turns out to be a large space, surprisingly open and certainly not crowded. Its core contains not just the computers themselves, which are modest in size, but also rows and rows and rows of memory banks - machines with spinning tapes that are automatically activated in a seemingly random pattern. Around the main core are service rooms, for software development, for the vendors, for maintenance. You are inside the central computer facility of a major Canadian bank. On those tapes is stored information about the accounts of every bank customer in the entire country, updated instantly to register each transaction as it occurs. The reason for the security is self-evident: it is the memory of the banking enterprise - part of the memory of a nation, in fact. Computing, in conjunction with telecommunications, has become the keystone of many kinds of big business, of which banking is a prime example. This is the outward extension of traditional computing: automated accounting pushed to the limit, integrated into a coast-to-coast network, linking thousands of installations into a single complex system, all controlled from one point. Almost every major company and government department in the country could at least partially duplicate this operation (although few could match it in size or complexity). Its scope is increasingly international; its range of operations the entire globe.1 With 'electronic data interchange' (EDI), companies move towards formation of sector-wide alliances and new kinds of configuration. Nothing could better illustrate the heritage of the computer mainframe era and yet at the same time show the
potential power inherent in telecommunications. Consider another image. We are in a large room, at the centre of which there is an immense round table, and enough room to seat twenty to thirty people at a time. We are far from Camelot, though. Off to the corner is a podium at which a young woman presides each morning between eight and nine o'clock over the daily meeting of the top management of Canada Post. To her left is a set of screens on which are projected, in detail, the results of the previous day's operations: how much mail, of what kind, was moved, where. On the same board, the day's performance can be instantly compared with company targets, last year's figures, expected loads for the upcoming week - whatever seems necessary. The daily get-together is for trouble-shooting, as well as planning: each current difficulty emerging out of operations is gone into carefully, and corrective action is proposed. For the assembled managers, it is a daily baptism of fire: they are expected to know not only what is happening but how to deal with it, before it gets to the reporting stage. The wall to the other side of the podium can be made to slide back, and when it does it reveals what makes the whole system go: a computer-supported telecommunications control centre, manned twenty hours per day and linked to all major stations in the postal network. From this facility, every truck shipment, and every airplane flight carrying mail, can be monitored minute to minute if necessary. From the same centre, there are 'taps' to the weather service and the airlines. Giant maps show the state of the network at all times. It is from here that the information is assembled to feed the morning planning session. In spite of their vulnerability - the risks from fire, and computerized crime, and subtle forms of software sabotage - companies in all parts of the country now build their operations around computer-supported communications. Resource-based companies as well as those in the service sector link their operations electronically. In areas as diverse as seafood processing, hydro-electric power, oil exploration, and pulp and paper, companies lease trunk lines to subsidiary plants for the handling of voice and data, use dedicated data networks between the head office computer and operating plants, move from dial-up access to on-line for data, voice, facsimile, telex, and word processing; estimate markets; and handle all of their usual administrative operations (production figures, inventory, payroll, and ledger data) by computer.
What does it all mean? Essentially, transactions that used to be carried out by memo and letter, and over a counter, have had their locus shifted to an electronic medium. When you use an automated teller machine, nobody stands between you and the computer, even though it may be hundreds or thousands of miles away. The new pattern of transactions is not limited to those occurring within a single organization, or between an organization and its customers. John D.'s father and uncle run a furniture factory. As part of his university training, John spent two years in a research laboratory devoted to office automation. Two years later, after learning some of the finer points of database software, John returned to his family firm, where he took on the task of restructuring the production process of a (by then) rapidly growing operation. That was just the first step. The company is now electronically integrated into the network of one of its main suppliers and has branched out to open up operations in the United States. It is growing into a distributed network of transactional links which have increased the range of its customer network, at the cost of having to respect the protocols imposed on it by its larger partner. Real estate companies increasingly work the same way, basically as franchising operations, North American in scale. It would not have mattered which field we chose, or how large the operation; we would have found similar stories: a record distributor in Montreal, a fast food operator in London, a school department in Saskatoon, a small hospital in Wakefield, Quebec, a university network in the Maritimes, plumbers, druggists, airlines, the Post Office. There is a very large new economy here, constructed around quite conventional and traditional computing practices and technologies. That is part of the equation; there is another dimension - in fact, more than one. Let us first look at a new category of enterprise: the information producer.

The 'new' information enterprise

Georgine S. is an established illustrator and graphic artist whose work is commissioned by a variety of clients in Canada and the United States. She is versatile, equally at ease creating a company logo, illustrating clothes for a newspaper advertising layout, or doing film animation. Her work can be seen in everything from local
print media to network television. She runs a one-woman operation from her home studio. Her graceful, subtle work testifies to the finely honed skills of a professional artist. The effect of the technology on Ms S.'s work is not what you might expect. She does not claim that her computer makes her more efficient: it takes her as much time to produce a drawing on the screen as it would by hand. She is conscious of imperfections in both the software and hardware. For Ms S., the computer is another medium, with its own qualities and drawbacks. What attracts her is not that her drawings are now technically superior to those she was doing before, nor more perfectly executed. What they reveal, instead, is Ms S.'s exploration and comprehension of the artistic possibilities of the new medium. The facilities that she values are those that allow for a kind of graphic expression that is impossible (or extremely difficult) to achieve by hand, a potential that she is exploiting with remarkable skill. Laurent L. is one of Canada's experts in the technical side of art conservation. His writings on the science of conservation are well known internationally, perhaps even better in Europe than here. In this field, practitioners must keep abreast of technological developments: new equipment, new chemical processes, new practices. As a result, there is a ready market, in Europe particularly (since his publications appear in French), for the highly specialized texts that he produces. Laurent could live anywhere, but he has chosen Fredericton, New Brunswick, for its life-style. What gives him the liberty to select where he will reside is his desktop computer, on which he turns out his manuals. A French-language desktop publisher, with a European market, and working out of Fredericton? Times are surely changing! Benoît G. is a young man who spent over three years constructing a three-dimensional computer model of greater downtown Montreal. This database allows him to present a true-to-scale and very detailed colour drawing of the city from virtually any vantage point he chooses. The images, produced largely in single frames, are realistic. Benoît also made a thirty-second animated sequence in which the 'camera' swoops over, around, and through clearly recognizable buildings. To produce this real-time sequence (30 frames per second), he had to calculate, draw, and film each frame one at a time and then store it in the database. Yet in a couple of years what Benoît constructed so laboriously will be readily available on software. Television and film producers are also being freed from their reliance
on elaborately equipped studio facilities to fulfil their sound and image and editing requirements. Technical facilities now provide sophisticated software-supported systems, and these are independent of location, just as likely to be found in a Laurentian village north of Montreal as in Hollywood or on Madison Avenue. Gaétan L. is a dairy farmer outside Montreal. He uses a computerized system of accounting - extensively. He maintains a permanent record for every cow, including daily milk production, which he monitors by connecting his computer to the milking equipment. He has a database for field management, including production, fertilizer use, and cultivation practice. All his books, including records of purchases and sales, salaries, and insurance, are 'run off' the computer. The Campus Bar and Grill is not quite what its name suggests. First, it is not on the campus of a major university, but nearby. Second, it belongs not to the university but (a rarity) to a cooperative run by its employees. Third, its clientele consists not of college students but mostly of blue-collar workers. Whatever its other characteristics, it is successful: its gross revenue is typically well over two million dollars a year. About seven years ago, after considerable soul searching, and interminable discussion (apparently typical of self-managed firms), the board of coop members decided to computerize. The result has surprised them all: the computer is now used as much for the communication functions of the enterprise as for its administration. All their advertising, menus, and annual report are turned out on the company Macintosh. Planning has also become more systematic: customer surveys (analyzed on the computer) have been conducted and the rudiments of a permanent database developed. We have chosen our examples of the 'old' and 'new' information environments for a reason. They illustrate the polar extremes of the economy: large (and increasingly medium-sized and small) companies that exploit advances in communication technologies for processing standard business information, and smaller enterprises (often one-person operations) that have lately found in aspects of the new technology an independence that they previously lacked. A concentration of capital and economic power greater than any ever seen coexists with an amazing diversification of enterprise and personal initiative. However much business and the arts may be undergoing change, their activities antedate computers. For them, the computer is the pen and pencil of our time. A new category is appearing, however,
as a direct consequence of the computerization of the economy. To understand the emergence of this new sector, we need to comprehend something about the effects of largeness and computer complexity.

The 'curse of flexibility' and the opportunities that it creates

Some time ago, a Canadian minister of revenue is said to have admitted to a journalist that he had not filled out his own income tax form in years and had no intention of starting to do so. He blamed the Department of Finance, whose exemption provisions, he complained, had become increasingly hard to figure out. The same year, the American television networks took some delight in reporting the results of a US study which showed that, of the responses of specialists in the Internal Revenue Service to citizens' requests for clarification of tax laws, 80 per cent were in error! Apparently, tax laws baffle even the department's own experts!

We are witnessing the effects of what John Shore (1985) called the 'curse of flexibility.' He was talking about computer software, not tax forms. But pay careful attention: 'Because it's easy to make changes quickly and without considering all the ramifications, software complexity can grow quickly, leading to software that's hard to read, hard to understand, likely to contain more errors, and likely to require further modifications. These results are hard to avoid, and their effects can be crippling. In this light, the computer's flexibility looks less like a blessing and more like a curse' (174-5).2

Sound familiar? 'Hard to read, ... hard to understand, ... likely to contain more errors, ... likely to require further modifications ...'? It sounds like a description of the tax laws, in fact. This is no coincidence. All of the tax-collection process is now computerized. Checking returns is a full-time job. So is rewriting the tax laws 'to close the loopholes.' If it reminds you of the perils of writing software, it is because the tax laws are a form of software. Each new 'simplification' adds another layer of software.

This kind of growing complexity of established systems has a corollary. Roy K. typifies a new commercial sector. He graduated in finance from a Canadian university and initially took a job at an investment house. Eventually, it occurred to him that he preferred working for himself to being part of a large operation. Taking his courage in hand, he opened a modest office and hung out his shingle as a
'financial adviser.' His main product: a financial plan. Such a plan includes more than advice on how to cope with the income tax laws; it incorporates an analysis of the client's overall financial situation, provisions for retirement, and balance of investments. All of this is a useful service for an increasingly diverse group of clients who, without having anything approaching a fortune, have nevertheless invested in a house and have some insurance and something set aside for a rainy day. Given the way in which laws are written, it pays even the modest wage-earner to hire a financial adviser: the resulting tax savings alone will pay for the service and leave the client with a surplus, not to mention a better-organized plan for the future.

Roy's business is itself built around a computer - not a huge mainframe such as Revenue Canada uses, but a more modest system. Originally, he rented computer services: time on a mainframe that someone else owned. Now he has installed a personal computer, which provides him with the power that he needs and frees him from the hassle of renting. All that is required is easily available software, in the form of a spreadsheet, which records the details of the client's financial picture and allows for a simple analysis of his or her situation.
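By way of illustration, here is a minimal sketch, in Python rather than in a spreadsheet, of the kind of calculation Roy's software performs. The client figures and the two-bracket tax schedule are our own inventions, not any real tax law; the point is only that a few formulas over a client's financial picture suffice to surface a worthwhile saving.

    # Hypothetical sketch of a financial adviser's spreadsheet logic.
    # Client data and the two-bracket schedule are invented for
    # illustration; they do not reproduce any real tax law.
    client = {
        'salary': 45000.0,
        'retirement_contribution': 3000.0,  # assumed tax-deductible
        'insurance_premiums': 1200.0,
        'rainy_day_savings': 8000.0,
    }

    def tax_payable(taxable_income):
        """Toy schedule: 17 per cent up to $30,000, 29 per cent above."""
        lower = min(taxable_income, 30000.0) * 0.17
        upper = max(taxable_income - 30000.0, 0.0) * 0.29
        return lower + upper

    # The 'analysis': compare tax with and without the deduction.
    before = tax_payable(client['salary'])
    after = tax_payable(client['salary'] - client['retirement_contribution'])
    print('Tax saving from the deduction: $%.2f' % (before - after))

On these invented figures, the deduction is worth $870 - more than enough, as the text suggests, to pay the adviser's fee and leave a surplus.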
Having your own financial adviser used to be a luxury reserved for the rich. The complexity of the tax laws has now given rise to a full-fledged service industry, just like the fast-food trade (and, like the latter, it is increasingly dominated by the presence of the equivalent of McDonald's: store-front experts who can advise you how to fill out your tax form). This is a spin-off industry - the direct consequence of computerization.

Think of the process that we have illustrated as a cycle. A government writes a tax policy. It is a complicated document, full of conditions; looked at in the same way in which a computer programmer would examine one of his or her own productions, it is marked by little inconsistencies. The inconsistencies become, when successfully exploited, loopholes for the taxpayer.3 A loophole may mean a large saving. There now emerges a class of analysts whose speciality is not writing tax software (we use the term deliberately, to emphasize the link between tax laws and software) but analysing it to discover the inconsistencies therein. This kind of knowledge circulates, and, as the number of people taking advantage of loopholes grows, the tax department is obliged to rewrite the tax law, exactly in the way in which computer programmers are constantly reworking their products. But in tax-law drafting, as in computer programming, the elimination of one inconsistency merely leads to the creation of another, and the cycle recommences, except that each process of recycling raises complexity to a new level.

Shore (1985) estimated that a typical computer program, containing perhaps a million machine-language instructions, actually results in the execution of about seven billion instructions. If we were to divide the Empire State Building into seven billion parts, each part would look like a cube measuring about 2.5 inches across. But, of course, the component parts of a skyscraper are actually much larger than that. Computer programs are therefore, Shore thinks, 'mankind's most elaborate artifacts.'4 This situation has led to what he has called a 'software crisis' (209): 'Software usually takes longer to finish than was promised; and when it is finished, it's bigger, slower, and less capable than was promised. These deficiencies might be bearable if the resulting software were reliable, but it isn't. It tends to fail often, and efforts to fix it are just as costly and error-prone as its original development ... Of all the problems inherent in the software crisis, the most difficult, troublesome, and dangerous is our inability to write error-free software' (164). As he said, 'With software, it's easy to start out and hard to finish. Total success is difficult because the flexibility of software facilitates partial success at the expense of unmanaged complexity' (165). Tax laws are starting to offer competition in the race to complexity!

A decade ago, hardware made up nearly 80 per cent of the cost of a typical computer system, and software about 20 per cent; today those figures have been reversed, and software companies such as Lotus, Microsoft, and Ashton-Tate have become key players in the industry. Eighty per cent or more of the cost of a computer system is accounted for by software. Now software incorporates more and more of the 'intelligence' of the activities that it records (and often governs). The new 'interface consultant' who stands between the system and the public need know nothing about the basic operating system, as long as he or she understands the implications of the way in which the system represents the client's affairs (in what is known as 'application software').5 This is a world where one brand of software is built to counteract the effects of another.
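The analogy can be made concrete in code. The sketch below is our own invention - not Shore's, and not any real statute: a 'tax rule' that has been amended twice, software-style. Each patch closes one loophole and, in doing so, writes a new boundary condition into the law for the next analyst to probe.

    # Illustrative only: a 'tax law' that accretes patches the way
    # software does. Each amendment closes one loophole and, by
    # adding a condition, creates new boundary cases to exploit.
    def home_office_deduction(income, has_home_office, office_is_a_boat):
        amount = 0.0
        if has_home_office:
            amount = 0.10 * income
            # Amendment 1: cap the deduction (closed the 'mansion' loophole).
            amount = min(amount, 5000.0)
            # Amendment 2: boats declared as offices no longer qualify ...
            if office_is_a_boat:
                amount = 0.0
                # ... unless income is low - and the threshold written
                # into the amendment is itself the next loophole.
                if income < 20000.0:
                    amount = 0.10 * income
        return amount

Two amendments in, the function is already awkward to reason about; a real statute, with thousands of such interlocking clauses, behaves like the million-instruction programs Shore describes.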
Consider the case of Heidi M. She discovered what might best be called a 'niche.' Her work involves the boiling down, or simplification, of vast amounts of information - the bane of the bureaucrat's existence. As an illustration: a single department in the Canadian federal government calculated the number of pages of internal regulations that govern its operation - roughly twenty thousand! That is the number of pages of often-obscure text to which employees are expected to refer in carrying out their work. Ms M. uses a technique called 'information mapping' (developed by Britain's Royal Navy). What she sells is her ability to reduce large volumes of information to compact, more easily digestible form. For example, one of her recent contracts involved a 300-page manual on air search and rescue, intended to educate civilian pilots on the protocols and procedures to follow when they participate in this type of operation. Few people would have the time or the patience to digest such a quantity of information. Ms M. reduced this body of information to a condensed (25-page) version. 'Information mapping' integrates text and graphics within the bounds of a single page. In the past, this might have meant either expensive hand-produced graphics or contracting out to graphic artists, typesetters, and printers. Today Ms M. can employ the graphics and word-processing applications software provided by her personal computer. It has become a pivotal component in her operation and allows her to produce high-quality, precise documents rapidly. What happens to her products, which are contracted for by one branch of government that is trying to cope with the complexity generated by others? Overall, each of these interventions, of which hers is but one example, drives up the threshold of complexity with respect to which people are expected to function.

Or consider another area. Betty L. is a travel agent. Over the past decade or so, the travel agent has been transformed from an occasional to an indispensable service. The reason? The complexity of the airline/hotel/car-rental system, world-wide. The schedules of every airline in North America constitute a mammoth mainframe-based store of complicated information. The construction of an airline schedule is a highly complex software problem: it must take account of external factors (for example, airport availability), interlocking flights, crew scheduling (including collective agreements), competition, and so on. When all the airlines are combined, the result is a software nightmare.
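A hint of why: even the most elementary constraint of all - the same aircraft cannot fly two overlapping legs - forces the scheduling program to check every pair of flights. The toy fragment below is our own illustration, with invented aircraft and times; real systems must layer crew rules, airport slots, connections, and maintenance on top of it.

    # Toy fragment of schedule checking. A flight is a tuple of
    # (aircraft, departure_hour, arrival_hour). Real schedulers add
    # crew agreements, airport slots, connections, and maintenance.
    flights = [
        ('C-FABC', 8.0, 10.5),   # invented aircraft and times
        ('C-FABC', 10.0, 12.0),  # overlaps the first leg: infeasible
        ('C-FXYZ', 9.0, 11.0),
    ]

    def conflicts(schedule):
        found = []
        for i in range(len(schedule)):
            for j in range(i + 1, len(schedule)):
                a, b = schedule[i], schedule[j]
                same_aircraft = a[0] == b[0]
                overlap = a[1] < b[2] and b[1] < a[2]
                if same_aircraft and overlap:
                    found.append((a, b))
        return found

    print(conflicts(flights))   # reports the one infeasible pair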
Out of this complexity has grown a new industry, with several components. There is the independent packager. This is the agent who negotiates package deals with airlines, on one hand, and hotels, on the other. The packager has a specific type of client in mind: people looking for a holiday 'experience' or wanting to return to their homeland. There are the airlines themselves. They no longer merely sell seats but offer a wide range of special 'deals,' to the individual and to groups. What began with networked reservations has redefined the business: not just moving people and goods, but exploiting information.6 Knowledge about passenger movements gets translated into high-intensity marketing. There is the local agent, who must know how to read airline schedules, have a 'fix' on the availability of packages, and be able to translate all this material into the language of the customer (or perhaps the reverse: to translate the not-infrequently vague wishes of the client into the all-too-precise language of a computer program, which delivers a ticket, makes the reservations, and stores a permanent memory of the transaction). All these people depend for their livelihood on mastery of the essentials of the computerized communication systems that now form the backbone of airline operations. This system too is constantly having to come to grips with the effects of flexibility: frequent-flyer programs, for example, which produce complexity of nightmarish proportions, fare competition, and an unstable equilibrium between public regulation and private enterprise.

A system with feedback

What we have been describing is the sort of complex system that an economist would call a system with feedback. Such systems are characterized by relations of interdependence, of which tax collection and airline scheduling are nice examples. Taxpayers are constantly trying to discover ingenious ways to circumvent the effects of new laws, which they can do, staying well within the law, by taking advantage of the imperfections that are typical of any large and complicated written document, while the tax department cannot ignore the cumulated effects of such evasive manoeuvres. Airline passengers are always seeking cheaper flights, while airline planning is based on customers' behaviour. When the result is to increase the level of complexity at each revolution of the cycle, economists would call this a system with positive feedback. It reflects an economy and a society that are undergoing change.

In part, this kind of feedback is the result of complexity. It is, however, a complexity that feeds on itself. The greater the use of software, the greater the incentive to push its complexity to new levels.
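The dynamic is easy to simulate. In the toy model below - our own, with arbitrary parameters - each revolution of the cycle (find the flaws, patch them) adds more complexity than it removes, and the resulting compound growth is the signature of positive feedback.

    # Toy model of positive feedback in an exploit-and-patch cycle.
    # The coefficients are arbitrary; only the compounding matters.
    complexity = 100.0
    for cycle in range(1, 11):
        loopholes = 0.05 * complexity     # flaws found grow with complexity
        complexity += 2.0 * loopholes     # each patch adds more than it removes
        print('cycle %2d: complexity %6.1f' % (cycle, complexity))
    # The output climbs geometrically (10 per cent per cycle here):
    # the correction amplifies the very thing it corrects.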
Software development is a major growth industry. It has been for some time. But now its range is widening: not just to process data within established frames of reference, but also to provide a tool by means of which people can map the complexity of the software-dominated world they live in and navigate through it. Unfortunately, each 'navigation' adds even more to the complexity that it was meant to keep under control in the first place. Software, as Hofstadter (1979) has so elegantly described, is recursive in its effects: the object that it describes is ultimately itself. And recursive systems, as Chomsky (1957) showed, are capable of mapping infinity: a system of exchange without any logical limit.
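Chomsky's point can be displayed in a few lines. One self-referring rule - a sentence may contain another sentence - is enough to generate an unbounded set of distinct expressions from finite means. The grammar below is our own trivial stand-in, not Chomsky's.

    # A trivially recursive grammar: S -> 'x' | '(' S ')'
    # One self-referring rule generates infinitely many distinct
    # strings: finite means, unbounded output.
    def sentence(depth):
        if depth == 0:
            return 'x'
        return '(' + sentence(depth - 1) + ')'

    for d in range(4):
        print(sentence(d))    # x, (x), ((x)), (((x)))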
A symbiotic relationship of large and small

The consulting industry, of which Roy K., Heidi M., and Betty L. are examples, has expanded by leaps and bounds as people have discovered the benefits of commercially available special skills. The opportunity emerged in two ways. First, big systems grew very large and very complex (like an airline schedule, or the public accounts, or the regulatory mechanisms of a big company, or a police department's database), making it necessary to supplement the systems with secondary services. Second, the technology became available to support new kinds of enterprise.

As little as five or six years ago, a working professional who wanted access to computing power had two options: buy time on a mainframe or purchase a personal computer. Neither was a very satisfactory alternative. If you went the first route, you discovered problems of access (you had to queue up) and of reliability, and an absence of software written with you in mind. In Shore's view: 'Hardly anyone used to pay attention to user-interfaces. One reason is that, in the past, most computer users were experts, if not hackers. Expert users tend to view user-friendly interfaces with disdain rather than relief. For them, user-friendly interfaces are sometimes limiting and never essential' (1985: 99). If you went the personal-computer route, you had a 'dedicated' computer, but one with limited power and a narrow range of software. It was no match for the big mainframes.

Today, only a few short years later, the situation has altered, drastically, as a result of two factors. For one thing, for an investment of a very few thousand dollars, the working professional can acquire a machine that is fast and powerful (as much so as many mainframes of previous generations) and that possesses (with add-ons) an unlimited memory, for all practical purposes. For another, because of the software 'explosion' of the last decade, the professional now has available an extraordinary range of applications-oriented programs, covering all kinds of activities and capable of functioning 'on line.' The new machines have thus become easily accessible to the uninstructed user and are more than mere processors of digital data: in effect, they provide an iconic environment that offers as much to the artist and the engineer as to the data analyst and the accountant.

On the one hand, we see systems that, as they are pushed to new levels of complexity - spanning the globe - begin to manifest the deadly side-effects of the 'curse of flexibility,' namely, gaps in consistency; on the other hand, we see little enterprises that can take advantage of the gaps. We have globalization on one side and atomization on the other. Concentration and diversification: these are themes to which we return in chapter 5, when we look at how technology is affecting the pattern of organization in our society.

The coming of age of the tertiary sector

The economy of the so-called tertiary sector - the part of the economy producing services and administration, what we increasingly call 'information work' or 'knowledge work' - has been growing faster than agriculture, faster than the resource sector, faster than manufacturing. The service sector indexes not only the transition to a 'consumer society' but also the construction of a national system of education, health, and a whole range of other 'indispensable' public services. In 1951, in Canada, for example, less than half of the labour force was engaged in service-producing industries; by 1981, thirty years later, this figure had risen to two-thirds, and by the beginning of 1992 it was almost three-quarters. The reverse side of this is that in 1951 more than half of the labour force was in a goods-producing sector (agriculture, manufacturing, resource development, construction). Since then, however, the proportion of the labour force engaged in agriculture has fallen from 15.6 per cent to 3.3 per cent, and that in manufacturing, from 24.7 per cent to 15 per cent. While the proportion of those in construction has stayed constant (a little over 6 per cent), other goods-producing sectors (including the resource sector)
declined from 6.6 per cent to 4.1 per cent.7 This country has, in other words, gone in forty years from having more than half of the work-force in goods production to less than a third. These figures do not necessarily represent a drop in the absolute number of people producing goods (except in agriculture, where the number of workers went down from over 800,000 to under 500,000). Nor do they indicate a decrease in output: even in agriculture, the level of production was rising (output per individual worker in agriculture almost doubled during the 1960s alone). In manufacturing also, although fewer people were employed, production was on the rise. Yet the figures still tell us something important about the direction in which we have been heading.

Within the service sector itself, growth has been uneven. The proportion of people in 'distributive services' (transportation, communication, the wholesale and retail trades) was constant (a little over 23 per cent in both 1951 and 1981, and 25 per cent at the beginning of 1992). The proportion of those in consumer services, including accommodation, food, personal services, amusement, and recreation, rose slightly during the same period (from 7.3 per cent to 10.4 per cent). But the proportion of those in 'producer services' (business management, accounting, engineering, legal, management consulting, finance, insurance, and real estate) almost tripled (from 3.8 per cent to 11.9 per cent in 1992).8 According to Statistics Canada (1986): 'The increasing importance of the producer service industries is related to the rise of the "information economy." For these industries, the processing, analysis and dissemination of information form the basis of much of the service they provide.'9 The other part of the service economy that more than doubled in this forty-year period (from 12.4 per cent to 29.1 per cent) was 'non-commercial services' - education, health and welfare, and public administration. This sector too, as Statistics Canada observes, depends on information and employs some of the people described earlier in this chapter. The character of the workplace - the kinds of work that people do - has been slowly, but surely, transformed.

Impact on the work-force

Back in 1979, Adam Osborne in his book Running Wild predicted: 'Of all jobs in the industrial world today, perhaps half will be eliminated during the next twenty-five years' (vii; also 143-5). Stated so baldly, the
loss of half of the existing jobs sounded dramatic, if not disastrous. In the historical perspective of postwar Canada, however, we have seen just that. The increase of service-sector jobs from 47 per cent to 66 per cent in the thirty years leading up to 1981 (figures now more than a decade out of date) constitutes a proportional change in the character of work that is almost within the range of Osborne's prediction. The transformation of the work-force may now occur even faster than before, and with no guarantee that our society can create employment as fast as it can mothball it. Figures issued by Statistics Canada indicate that the period 1980-92 witnessed record job creation - a 16.6-per-cent increase in jobs available. Yet all the growth was in white-collar jobs, which rose by 33.4 per cent; the proportion of blue-collar jobs actually declined by 10 per cent!10

Even within white-collar employment, the division between clerical and knowledge-based jobs has been widening. The so-called automation of 'back-office' functions - such as accounting, transaction recording, bookkeeping, and payroll processing - has been conservative in spirit. Technology has been used to buttress the position of the existing organizational framework by making each of these functions more efficient, or 'productive,' to use a popular term. Its purpose has been to assure new levels of conformity to centrally determined norms, through the imposition of a far-reaching system of monitoring and control. To most of the people who work at automated white-collar jobs, computer-driven routines have become a normal (if not necessarily agreeable) part of their environment. They understand little of the mysteries of the computer-as-medium: it is merely an electronic substitute for a human supervisory process. Where the nature of their own tasks is concerned, it is more of the same: compared to paper records, the automated 'look-up' process is different (it means learning a new protocol, involving instructions typed into the terminal), but the result differs only in being faster, more up-to-date, more integrated, and more reliable. Whatever we asked of these people in the past, we simply ask more of it now.

But this sector, we have found, is the part of the white-collar work-force that is shrinking. The overall pattern of work in computer-oriented companies is undergoing a displacement, away from the more routine tasks of data entry and transcription necessitated by an earlier generation of computers and towards a more professional, or knowledge-based, orientation. This tendency can only accelerate as the next generation of even more powerful computers comes onto the market.
The 1980s saw record job creation, but the new positions were not in the big, computer-driven companies (the 'old' information environment, as we call it). They came from new firms and from self-employment.11 Between 1975 and 1986, total self-employment in Canada rose by 54.0 per cent. In 1986, 7.9 per cent of all Canadians were self-employed, as compared to 6.2 per cent in 1975.12 The work of the people described earlier in the chapter who are self-employed or who have opened their own companies depends on highly developed professional skills - what Birch (1987a; 1987b) called 'thoughtware.' They have transformed themselves from simple practitioners of a craft or a skill into entrepreneurs, who both do their own work and market it. They have developed the 'navigational skills' that combine knowing how to create and use extended interpersonal networks, sometimes on a global scale, and knowing how to translate access to sources of specialized knowledge into packageable products attractive enough to interest a client (whether a customer who needs a consultant or simply someone else in one's own organization or interest group whose support is to be solicited).13 It is a world of shifting loyalties and complex inter-organizational networks, where negotiation, persuasion, and deal-making come as easily as opening an umbrella during a rainstorm.

Now that we have looked at the changing employment picture, let us turn to some of the organizational dynamics that have accompanied the transition to an information economy.

A trend to 'downsizing'

As the sectoral figures have already intimated, important changes in growth, employment, and business size have taken place over the past decade or so. Enterprises are shrinking. By the mid-1980s, 95 per cent of all companies in Canada were small businesses with fewer than twenty employees. Furthermore, the proportion of small firms was rising (an increase of 4.8 per cent between 1979 and 1984) while that of large firms (the 'Financial Post 500') was shrinking (a decrease of 1.8 per cent during the same period).14 Small businesses were also the major source of jobs: during the period 1975-82 (Figure 1.1), companies with fewer than twenty employees accounted for the greatest increase in employment.15 These figures reflected a pattern common to other similar economies during the same period, most notably the United States, but also Japan16 (Figure 1.2).
Figure 1.1 Change in employment by business size (no. of employees), Canada, 1975-82
In the 1980s, Birch (1987a; 1987b) reported that 700,000 new companies were formed in the United States alone in 1985 (compared with 200,000 in 1965 and 90,000 in 1950). When this figure was combined with 400,000 new partnerships and 300,000 newly self-employed people, the total was 1.4 million enterprises created in a single year. By contrast, the 'Fortune 500' companies employed 2.2 million fewer people at the end of 1985 than in 1980. About 30 per cent of the largest companies in existence in 1970 had vanished by 1981. While there is always a notoriously high turnover of firms in any economy, Birch affirmed the permanence of the trend to smaller outfits: the newer companies are not only smaller; they remain so, subcontract more, and are more likely to specialize in 'thoughtware.' They are service organizations, but the service that they offer is increasingly professional and requires employee specialization, sometimes of a very high order. Before 1975, in the manufacturing sector, for example, a quite different pattern had been visible, with firms with more than 100 employees accounting for the largest percentage change in employment17 (Figure 1.3). Between 1976 and 1985, however, in the same sector, the pattern was totally reversed, with small businesses accounting for the growth18 (Figure 1.4). This trend to smaller enterprises, which we can trace back to about the mid-1970s, has accelerated visibly.19 By the late 1980s it constituted (Birch 1987a, b: 21) 'a major structural change in the way America does business.' It coincided in that country with the decline of jobs in agriculture (falling to only 2 per cent of employment),
Figure 1.2 Change in employment by business size (no. of employees), Japan, 1972-86
manufacturing (at that point less than 20 per cent, and falling), and construction (down to about 5 per cent); flattening out of employment trends in wholesaling and retailing (down to just under 25 per cent from about 27 per cent); and stability in transportation, communications, and other utilities. It also occurred at the same time as growth in finance, insurance, and real estate (to about 10 per cent from about 5 per cent a couple of decades earlier). 'Downsizing' also coincided with expansion in the service industries (from 15 per cent of the work-force in 1960 to almost 40 per cent in 1985).

It was also Birch who christened this trend the 'atomization' of the United States: 'Neglected amid much of the talk about our economy's changes ... is a simple and underlying statistical fact: the American economy is breaking into pieces. More and smaller businesses now do what fewer and larger ones did before. Our economy is "atomizing"' (1987a, b: 21). An analysis of data going back to 1970 by the Institute for the Future20 also predicted that the proportion of employment accounted for by the 'Fortune 500' companies would drop from 18.1 per cent in 1970 to 11.6 per cent by 1990 and level out at approximately 10.4 per cent by the year 2000. The institute points out that, while it is the big companies that make and sell goods, a large part of the economy is involved in creating and trading information and in servicing and supporting the flow of goods. Smaller firms, now the dominant model of enterprise, are footloose. They have been freed from locational dependence on an immobile resource base and are no longer constrained
Figure 1.3 Change in employment by business size, manufacturing, Canada, 1961-75
by reliance on transportation infrastructure. They follow people, and their choice of location depends on factors that used to be secondary: availability of communications, proximity to educational institutions that can form the cadres of the future, climate, and the general quality of life of a region.

A trend to globalization

Paradoxically, as the economy has been atomizing, the information age has also brought global centralization. Although small enterprises are proliferating, we are witnessing at the same time another sort of pattern, whereby large trading blocs are forming; corporations are larger and fewer than ever; distribution systems are also larger, with new forms of monopoly control; and the profits of enterprise are more concentrated in fewer hands.21 The role of conglomerates is critical here. Between 1976 and 1986 in Canada, for example, conglomerates sharply increased their share of total corporate assets, sales, and profits, controlling 684 businesses in 1986, as compared to 539 a decade earlier. Their share of business assets increased from 29.6 per cent to 35.1 per cent; of sales, from 21 per cent to 23.3 per cent; and of profits, from 24.2 per cent to 29.8 per cent.22

Reich (1991) has suggested that we should look at simultaneous globalizing and atomizing in a new light. Rather than supposing that
Figure 1.4 Change in employment by business size (no. of employees), manufacturing, Canada, 1976-85
large corporations are merely being replaced by small enterprises acting in concert, he thinks that the economy is now being shaped by what he calls 'weblike relationships': 'Here the core is no longer a "big" business, but neither is it a collection of smaller ones. Rather it is an enterprise web. Its centre provides strategic insight and binds the threads together. The resulting interconnections can be quite complex, stretching over many profit centres, business units, spinoffs, licensees, franchisees, suppliers and dealers, and then on to other strategic centres, which in turn are linked to still other groups' (95-6). When manufacturing becomes organized on an international scale, smaller companies can migrate abroad. They compete with larger firms less by gigantism than by forming alliances of small players that join together to behave 'as though they were big.' The key word becomes 'connectivity': in this kind of dispersed network organization, hierarchies are flatter, chains of command more fluid, and functional structures more volatile.

A communication infrastructure: from transportation to telecommunications

In addition to atomization and globalization, a third indicator of change is in communications and the changing balance between investment in telecommunications and that in transportation. As noted by Roach (1987: 1), 'high technology spending is the single largest
line item in capital spending budgets of corporate America.'23 The trend is not new. Earlier, Vallee (1982) had pointed out that the telecommunications industry 'accounts for over 10 per cent of all plant and equipment expenditures made by American corporations for the last twenty years.' He went on: 'The telecommunications industry is responsible for more than 20 per cent of all corporate debt, and takes in revenue twice as fast as the gross national product of the US. In 1978, the information technology areas employed 51 per cent of our workforce and earned 47 per cent of our GNP.'

While, in the years immediately following the Second World War, North American policy-makers were preoccupied with the development of both national transportation and communications systems, it was the transportation infrastructure that had priority. In Canada, between 1950 and 1960, we initiated the Trans-Canada Highway; in concert with the United States, we constructed the Distant Early Warning (DEW) line and the Alcan Highway, began to open up the north, and built the St Lawrence Seaway. We set up the whole network of airports that constitutes the foundation of today's aviation system, shaped the traffic systems of the major cities, renewed the infrastructure of our ports, and conquered, to all intents and purposes, the problems of physical transportation of people and goods presented by the vastness of the territory that we occupy.

It was not that we neglected the communications infrastructure - far from it. This is, after all, the time when we introduced long-distance dialing, built a national microwave transmission system, and developed cablevision. By the 1960s, Canada could claim the most extensive cable network in the world, had begun to build the first domestic satellite network, and upgraded telephone transmission once again. It was a question of degree, of relative effort and investment: moving goods and people still seemed more of a priority than moving messages - more real, more important, more profitable.

Then the pattern began subtly to change, and the emphasis to evolve. After 1970, while expenditures on the infrastructures of transportation and of communication followed a similar pattern, remaining about equal in absolute amounts, what we got for our money was no longer comparable. Consider what Canada has invested since 1975 in telecommunications and computing,24 as compared to transportation (Figure 1.5). These figures represent total investments in dollars. They do not take into account that what a dollar could actually buy between 1955 and
Figure 1.5 Comparative investments in transportation and communication, Canada, 1956-86
SOURCE: Department of Communications
1985 differed significantly by sector. Costs have climbed drastically in transportation, for example, with its high dependence on both materials and labour. Anything that is energy-dependent today is costly. In the communication sector, however, costs have come down equally dramatically. One rule of thumb estimates that the costs of communications, measured on a capacity basis, decline by about 11 per cent per year; those of computer logic, by about 25 per cent; and those of computer memory, by about 40 per cent. The computer capacity that one can now buy off the shelf for about $2,000 to $3,000, or less, would have cost in the order of $20-30 million in 1960. Between 1965 and 1985 alone, computer costs decreased by a factor of some several hundreds. We Canadians have been investing in communications over the past decades about the same amount as in transportation; yet, in terms of performance, the investment in communications has been greater by orders of magnitude.
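The arithmetic of compounding explains why. A short calculation - ours, using the rule-of-thumb rates and the round figures just cited - shows what twenty-five years of steady percentage declines do to a cost.

    # Compound decline: cost after n years = cost * (1 - rate) ** n
    rates = [('communications', 0.11),
             ('computer logic', 0.25),
             ('computer memory', 0.40)]
    for name, rate in rates:
        factor = (1.0 - rate) ** 25
        print('%s falls to %.6f of its starting cost' % (name, factor))

    # Cross-check the example in the text: $25 million falling to
    # $2,500 over 25 years is a factor of 10,000, an implied annual
    # decline of about 31 per cent - between the logic and memory rates.
    implied = 1.0 - (2500.0 / 25e6) ** (1.0 / 25)
    print('implied annual decline: %.1f%%' % (100.0 * implied))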
The communication infrastructure does not always stand out in our minds with the same clarity as transportation. Objects such as cars, trains, and airplanes represent bigger 'one-shot' investments, and are more physically present to us, than telephones and laptop computers. Politicians enjoy opening airports, cutting the ribbon on a new highway, launching a frigate. It's a 'photo-opportunity.' It is hard to turn communications into visual imagery. In this we are fooled by scale: we can spend a thousand one-dollar bills without paying particular attention to any one of them; a single thousand-dollar bill makes us sit up and take notice! We have been buying more - and more - and more - communication equipment and services. It happened gradually - but it happened!

Communications and the trend to polycentric organizations

Until the 1940s, Canada was a nation of small towns, small enterprises - and small government. Yet by the 1960s and 1970s, Canadians had all become accustomed to large and centralized organizations.25 In the world of computing, during those same years, when the technology revolved around a few stand-alone mainframes, weakly connected into the telecommunications system, the result was centralization of data processing and, to the extent that organizations are constructed on their accounting systems, centralization of organizations as well.

Organizations' computer centres have not disappeared; with their greater integration into telecommunications, their power has been magnified a hundred times. But a new phenomenon has been added: the proliferation of computing and the increasing accessibility of telecommunications have created a much more distributed kind of communication, where it is harder and harder to locate exactly the centre - of almost anything. In the financial world, for example, the key decisions seem to be taken within a diffuse network of speculators, all armed with their individual computers and their favourite computer programs.26 The stock market is international. Films are international products; publishing is going on everywhere. The world of research and ideas is fluid and moving. In business, the ownership of companies seems to have forgotten all about national boundaries: takeovers come from every direction. It's a new kind of polycentric, quick-moving universe, with linkages that break with all the patterns of the past.

For large-scale corporations, the greater reach and the complexity of operations made possible by telecommunications may produce further concentration and a loss of sensitivity to the individual situation. By definition, growth in the number of customers leads the company or the government department to plan increasingly on the basis of the mean. Customers appreciate a policy of delivering an
average level of service when it means reliability (a Big Mac is a Big Mac is a Big Mac, wherever you are in the world), but they also find it frustrating on occasion when it means that their particular case is not taken care of. This is where global restructuring around webs comes into play. Even larger companies must now mimic the behaviour of small entrepreneurial operations, through internal restructuring to encourage 'intrapreneurship' and through changing corporate culture.

For the small firm, computerization can be the beginning of a business opportunity. According to one report,27 small businesses are the pioneers of new approaches to satisfying customers' needs and serving the market niches that large businesses often fail to recognize or consider uneconomic or impractical to address. The sum of small businesses, as we saw earlier, is big business. But it is big business with a difference. The contrast with yesterday's top-down, vertically structured, multidivisional corporation could not be greater.

The information-based society may be here - but do we know what to do about it?

The outlines of a changing economy and workplace, shaped in large part by computerization and the new telecommunications environment, are becoming clearer. It is an economy where, as we have seen in Canada, organizations are becoming smaller and more entrepreneurial; job opportunities, where they exist, are most likely to be found in the knowledge-based service industries (and no longer in clerical work, much less agriculture and manufacturing). At the same time we are being shoved at breakneck speed into the global economy, with its mammoth constellations of new economic associations.

How are we coping with these changes? Do we have the right kind of organizations and institutional arrangements to take advantage of the trends? Can we seize the opportunities offered by the flexibility of complex systems, the increased connectivity, and the polycentric character of today's world? Do we know how to encourage the navigational and entrepreneurial skills of employees that are brought to the fore by the new technologies? In other words, are we managing?

To a depressing extent, we must admit that the answer is no. The signs are everywhere. For about twenty years, North America
has watched the productivity of its most developed economies, those of Canada and the United States, stagnate and, until very recently, actually decline. This is all the more remarkable given the previous spectacular gains of the post-Second World War period. Those two decades of 'flat productivity' coincide with the exploding demand for new communications and information-processing technology, an area in which North America leads the world. The technological revolution may not have reduced productivity, but it was supposed to have been the ultimate corrective - and it most assuredly has not been. There is even evidence, as we shall now see, that computerization has actually lessened the profitability of firms.

That tale begins back in the early 1980s, when, out of concern for the competitive position of the United States internationally, a presidential conference on productivity was established. It triggered a series of economic studies that are still going on. Some of the results of the early work have been reported by Paul Strassman (1985). Banking and insurance, for example, were among the heaviest investors in new technology during the 1970s, yet research showed that over that decade their capital productivity declined and their labour productivity was stagnant (Strassman 1985: 162-3). A study of wholesaling based on a sample of 138 companies found an inverse relation between the level of use of computers and profits, as measured by percentage return on assets (Cron and Sobol 1983, cited in Strassman 1985: 155-7).

Another participant in the conference, Stephen Roach of Morgan Stanley, had broken down global figures on productivity into separate categories for blue- and white-collar work. He discovered that blue-collar productivity had actually continued to rise during the 1970s and 1980s, but that white-collar productivity had not, and had even declined. He found these results particularly ironic in view of the fact that capital investment in technology in office professional work had risen from half that of blue-collar work in the 1960s to become the principal factor by the mid-1980s. Other analyses, including one conducted at MIT, soon confirmed his findings. The productivity of shop workers had increased (by 13 per cent between 1978 and 1986, according to the MIT study), while office productivity was declining (by 10 per cent). A study done for the Brookings Institution (Baily 1986) also found that 'the administrative bureaucracies of the economy absorbed a large share of total investment without making corresponding improvements in efficiency' (443-51). It was concluded by bodies such as the US Labour
Department that 'instead of saving labor, the sophisticated machines in many cases have been hampering their work' (Schneider 1987: D1, D6). Roach (1987: 1) concluded: 'US industry has rushed headlong into the high-tech era ... outlays on new technology now account for over one-third of corporate America's expenditures on capital equipment ... the US economy is still trapped in a quagmire of low productivity growth ... technology has quite simply not delivered its long-awaited productivity payback.'

While the evidence is more scattered in Canada, much the same pattern can be observed. As part of its field trials in office automation during the mid-1980s, Communications Canada - then the Department of Communications (DOC) - made detailed measurements of the productivity of one working group of professionals during the period when they went over to an electronic medium. At first, the results seemed positive, but as the experiment wore on, and individuals began to be more at ease with the new technology, the apparent productivity increases vanished, and group efficiency went back to its pre-automation levels, even though the users appeared to be both enthusiastic and skilled in the use of the systems (Engel and Townsend 1985). The larger set of DOC studies, based on six field sites, echoed a growing body of internationally based on-site observations showing that the effects of computerization were not what had been predicted. They ranged from insignificant to ambiguous (Robey 1981; Mohrman and Lawler 1984; Rogers and Picot 1985; Kraemer and King 1986; Child, Gunter, and Kieser 1987; Taylor and Katambwe 1988; George and King 1991).

Another DOC agency, the Canadian Workplace Automation Research Centre, conducted a cross-Canada study in the mid-1980s in collaboration with a wide sector of private and public enterprises (Kishchuk and Bernier 1986), only to conclude that 'even though managers say that they want to use technology to improve communications and access to information, ... very little is being done about it' (7). The senior managers who met to consider the report agreed that management was at a loss as to how to proceed; in the words of one participant: 'We're still in the kindergarten phase, when it comes to implementing the new technologies.' Nobody contradicted her.

According to Dumais, Kraut, and Koch (1988), the concept of the effect of computerization had become meaningless, so different were the patterns from one milieu to another. Or, in Long's (1987) words: 'the technology is having a much slower and limited impact on the
functioning of organizations than anticipated' (254). Our own work at the Universite de Montreal over the past five years, in sectors that vary from retailing to wholesaling to public and health administration to communication, has unearthed much the same result. Most managers have no clear idea of how to cope with communication and information-processing technologies and tend to slough the problem off onto a technical adviser, without making the strategic link to their own futures. This same conclusion emerges from a series of studies of Canadian enterprise by Stephen Leahey (1992), who discovers that the bridge between corporate strategy and technical development is, in most cases, non-existent. Even where the cross-over is being made, studies on which we are currently engaged suggest that the actual results, in operational performance, are at variance with management's intentions, and indeed with its public claims.

Many of the cited studies were concerned with automation projects, based primarily on computerization of office functions, the dominant technological package of the 1980s. Automation has since been superseded by a new phenomenon (the subject matter of chapter 5). Distributed computing power using microcomputers (personal computers and their successors, which are opening up the power of artificial intelligence to professionals in many fields) is fusing with digitally switched telecommunications using very powerful transmission media (fibre optics, satellite, and microwave). We follow the OECD in calling this synthesis 'TNS' (for Telecommunications Network-based Services). There is less field research to document its implementation processes. One preliminary study, using cross-national data, can be cited (Bar and Borrus 1989: 3): 'The intensive use of telecommunications networks can produce a broad range of benefits, from cost-savings and productivity improvements to better market monitoring and customer service ... Achieving these gains, however, requires an ability seamlessly to interweave telecommunications capabilities and business activities in pursuit of corporate strategies. Very few of the companies studied are anywhere near to implementing this network-strategy ideal.' Since this study and its successors include data on some of the most important companies in North America, its conclusion deserves serious attention. For every firm that makes the adjustment, several fail, it seems.

The problems confronting the public sector mirror, and even exaggerate, those of the private. Three years ago, the Institute for
Research on Public Policy invited a group of Canadian senior public servants to discuss the creation of a roundtable forum on governance in the information age. The intensity of the response surprised even the organizers. The opinions expressed confirmed that the management of government business finds it at least as hard as its private-sector homologue to adjust to the current environment. Subsequent meetings made it clear that the pressures of globalization and fragmentation are constricting possible action in public administration. Here as well, we see little understanding of technologies and their implications for management - and well-founded suspicion as to the authenticity of much that has been written about them.

Confronted with these perplexing findings, we may tend to attribute them to a short-term learning problem for managers. Most professional administrators, the reasoning goes, have never had to develop skills related to information-processing and communications technologies, certainly not in the context of their relationship to economics and corporate strategy. They are now attempting trial-and-error experimentation and learning, but eventually they will make the transition, and things will return to normal. Gains in productivity will finally begin to materialize. Perhaps this may take a generation, but when the currently enrolled students in business schools enter the work-force, things will fall into place.

It is a plausible line of argument, but not a convincing one. For one thing, the evidence is far from conclusive. After more than thirty years of experience with computing, and even more with telecommunications, the response of the North American managerial community is not reassuring (Harrison and Bluestone 1988). We have no doubt that the learning will eventually occur. We differ from these other analysts, however, in our belief that there can be no return to the status quo ante. We would suggest instead that, by a strange twist of logic, if the managers do learn, the fact of their learning will transform them into something other than what they are now. Furthermore, their learning will be correlated with the transformation of the organizational structures which they had previously taken for granted. Instead of returning to an institutional environment where managers managed, and technology assisted them in doing so, they will have been pitched into a strange new world, where their underpinnings will have been knocked out from under them. We will be living in a universe characterized by new kinds of organizational arrangements, whose eventual lineaments can only be dimly discerned on the basis of current experience.
It is a double bind: if managers fail to adapt, they will quickly drop out of the race; if they adapt, their having done so will have changed the world, and they will find themselves in a new environment. We are not embracing technological determinism, nor do we perceive technology as an extrinsic parameter, moulding society and the economy in its image. On the contrary, as we show in the next two chapters, designers of technology are commonly wide of the mark in their prescriptions for social organization. The vision of office automation appears, in retrospect, to have been based on a distorted picture of how organizations actually work.

Organization (as we see it, and as we develop the idea in chapter 4) incorporates a dialectical dynamic, in which human transactional processes are worked out within a context that is supplied by media of transmission of texts of many kinds. When there is a radical transformation of the available media (of the kind that we describe in chapters 5 and 6), through technological innovation and as people exploit the potential of a changing symbolic environment, the transactional processes themselves inevitably generate new patterns. Not only do the transactional dynamics evolve, but a new conceptual space may also be created, which leads us to think differently about what an organization is. In this world, where both transactional patterns and our conceptual models of them have been irrevocably altered, the dialectic at the heart of organization is itself transformed. Chapter 7 is given over to consideration of how to proceed when we neither completely understand what is happening nor can adequately predict future trends.

The victim of this historical evolution, as we see it, is rational 'scientific' administration (bureaucracy, in other words). We may not have had much occasion to love bureaucracy, from either the inside or the outside, but it has been the bedrock institution on which our economy and our society were constructed for most of the past century. When it goes, it will be replaced by other systems, which we may, or may not, like any better, and which may, or may not, protect those values that we have traditionally held dear. If Kenichi Ohmae (1990) and others are right, the nation-state may have become outmoded in the new global context. What will take its place? Chapter 8 is a plea for thinking seriously about our own future, when the battlements of the vulnerable fortresses of bureaucracy have begun to crumble.
2
Coping with office automation: the conversation and the text
Every human tool relies upon, and reifies, some underlying conception of the activity that it is designed to support. As a consequence, one way to view the artifact is as a test on the limits of the underlying conception ... An action's significance seems to lie as much in what it presupposes and implies about its situation, as in any explicit or observable behavior as such ... human behavior is defined by its ground.

Lucy Suchman (1987) Plans and Situated Actions: The Problem of Human-Machine Communication, 3 and 42

Office systems, and information systems in general, have been considered as technical systems. Attention has been paid to technological advancement and technical elegance and efficiency. Although lip service has been given to the 'human' aspects of computer-based systems, little real interest has been shown.

R.A. Hirschheim (1985) Office Automation: A Social and Organizational Perspective, 167

Introduction

At the end of the previous chapter, we presented evidence showing that implementation of new technologies is fraught with difficulties for the organizations and managers who embark upon it. In this chapter, by using a couple of case studies of office automation, we begin to explore the reasons why.

Despite the recent trend to smaller size, large and powerful
enterprises and ministries of government have dominated the social, economic, and political landscape of late-twentieth-century Western democracies. Since so many of us have hitched our individual wagons to the fate of those organizational behemoths, their responses to the new communication environment concern us all. Our prosperity depends, to an extent that is downright uncomfortable, on their successful adaptation to the forces that are reshaping the globe's communication networks.

Perhaps the adaptation should not be a problem. Large bureaucratic organizations have been the greediest consumers of office communication and information-processing technologies - tools for white-collar workers such as telephones, typewriters, copiers, printers, calculating machines, and dictating machines. Big companies and government departments have been the main source of telecommunications traffic and are far and away the biggest users of postal services. When significant advances began in computing and telecommunications in the 1950s, with office applications of computers, direct dialing, and, shortly, xerox copying, it was the great bureaucracies, both public and private, that could afford the new machines and benefited most from their capabilities. Computer technologies fuelled the growth of existing organizations. They even provided an incentive for the emergence of some new mastodons, companies such as Digital, Xerox, and Hewlett-Packard, which not only manufactured and sold the technologies but also applied them to great advantage in their own work environments. The concept of an 'other-directed' 'organization man' expressed the administrative ideology of this era of the dutiful white-collar worker. The 1960s were to witness an unparalleled growth of bureaucracy, especially in North America.

The technology's effect seemed nonetheless to be conservative: big, established players simply became bigger. When even more powerful technologies emerged in the 1970s, the assumption followed quite naturally that, once again, they would lead to greater burgeoning of large-scale organizations. Some of the people who ran the bureaucracies began to think of creating a totally interconnected network of information-processing and communication systems, linking all of their employees into one all-encompassing network, extending across the entire span, both geographical and functional, of the organization's operations, and providing for more centralized managerial control than ever before. The transformation that had overtaken the factory floor a half-century earlier now appeared
to be integrating into the back office. These were no longer merely useful, discrete machines, each designed for a specific function, but were part of a totally integrated communication/information-processing system: office automation. There was no shortage of companies eager to benefit from office automation's potential for realizing the same kind of productivity boost that mass-production techniques had once brought to the assembly line.1

That was only one perspective, of course; few paused to ask the people who worked in these organizations what they were experiencing as the information-processing 'revolution' unfolded. As we shall illustrate in this chapter, as far as employees of the organization were concerned, the 'productivity gain' meant implementation - within very large, complex, and already fully operational sets of work configurations - of a rather complicated new technology. The technology's promoters did not understand its impact on office work, despite the 'hype' surrounding its introduction. Many documents appeared in the early 1980s with formulae for automating a company or a government department; they sounded authoritative and well informed but were in fact based on speculation, not on experience, much less careful documentation and research.2 Perhaps it is not surprising, then, to discover that things did not work out quite the way they had been predicted (Taylor 1986a; Attewell 1992). The 'productivity gains' somehow never materialized, as we reported in the previous chapter, and bureaucracies were beginning to shrink. Company failures became common; enterprises that had once looked rock-solid faltered. Even though others flourished, what is important in all this from our perspective was the undermining of the assumption that technology would support a bureaucratic stranglehold on the organization. The reasons why one firm rose and another fell might, or might not, have to do with office automation - mostly not - although, as we shall see later, the much larger transformation overtaking communication technologies was very much a factor. The transformation prompted serious questioning of a lot of taken-for-granted ideas about how technology supports administration. (This book, obviously, is part of that questioning.)

We have elsewhere reviewed the literature on the organizational impact of office automation (Taylor and Katambwe 1988; Taylor 1993), and there are other excellent reviews available (Markus and Robey 1988; George and King 1991). Instead, we invite you in this chapter to undertake a rather different exercise - to try to grasp what happens,
qualitatively, when an existing organization has to adapt to a whole new technological complex, which it is supposed to master without interrupting its normal routine or altering its essential nature. We look at two ventures in automation, from two quite different perspectives - the user and the system developer. Both cases are based on field work conducted during the past two years. In presenting these cases we would not want to pretend that either is fully typical of what happens when an organization plunges into automation. On the contrary, there is good evidence (Kraemer and King 1986; Dumais, Kraut, and Koch 1988) that there is no one identifiable pattern of impact of the new technologies, since so much depends on local conditions that are difficult to summarize using a few, well-identified variables. We are trying instead to convey a feeling for the automation experience - an intuitive sense of why the relationship of technology to bureaucratic organizational practice has turned out to be so problematical. Our purpose in doing this is to lead into the following chapter, where we begin to look behind the superficial phenomena of automation in order to see the conceptual dilemma that it has created.

Automation in the context of police work

Our first case study describes the implementation of a new integrated computer and telecommunications system from the users' point of view. It is a system designed to support police operations in a medium-sized North American city. The case is presented in three parts: operations from a pre-implementation perspective (the 'old' system), the new system, and how police work was affected by the implementation.

The old system

The context here is what might be called the 'nerve centre' of a metropolitan police force. Like most such operations in North America, the system is built around a 9-1-1 emergency telephone answering service, designed to give citizens access to fire-fighting, police, and ambulance services 24 hours a day. The study on which our description is based is concerned only with the component intended to coordinate police operations.

The old, pre-automation system worked as follows: When a 9-1-1
call was received, the telephone operator wrote down essential data on a piece of paper, which was then handed manually to a police dispatcher, who in turn, if necessary, consulted his own files and radioed instructions to a police cruiser strategically located in a position to respond to the indicated site. The police car then took the appropriate action.

This communication system, which had been in effect since 1963 without substantial modification, was universally acknowledged to be technically flawed. The channels easily became overloaded; they were susceptible to access by persons other than police officers; maintenance and replacement were continuing problems; obtaining supplementary information from the dispatching centre was not always easy when the centre was busy; and information could be lost or distorted throughout the chain of information flow from receiving operator to police car.

The efficiency of the system depended greatly, in this latter respect, on the dispatchers. Under the old regime, a dispatcher was typically a service veteran with long experience on the beat. At the time of changeover, the average length of service in the force was over 21 years, of which more than four years had been spent as a dispatcher. While there were files of previous cases available to him (it was without exception a 'him'), a dispatcher typically relied on his own knowledge and memory to alert patrol cars to possible risks involving sites with a known history of previous crime and to inform them on particularities of a given situation (such as one-way streets and proximity of another car). The dispatcher enjoyed very considerable discretion in negotiating with the police on patrol over which calls got priority. If the dispatcher thought it was just a nuisance call, the person making it was likely to have to wait. Furthermore, when one dispatcher went off duty and another came on, the carry-over of a call from one shift to another was problematical.

Technical inefficiencies were obvious. The system's human characteristics, both negative and positive, were more complicated. While the radio network was subject to overload, the people in the cruisers felt that they had an overview of what was happening in their immediate area: where other cars were located, who was on what call, and how much support might be expected if and when it were needed. Similarly, while the dispatchers might enjoy questionable discretionary latitude, from the point of view of the in-calling public, and while too much of their work relied on fallible memory, their long experience reassured the people in the cars. They were, to use the
jargon of the trade, 'police minded.' They knew the culture from the inside, including how to bend the rules (and also when to 'draw the line' on cheating). They brought some necessary flexibility into the system, such as when to be tough and when not to: don't send people who are about to go off shift on a call that is not urgent; leave it to the incoming officers. Know when it is reasonable for people to take their lunch hour. Lean on someone who is 'goofing off.' The exercise of discretionary power is one of the ways in which authority is maintained. (It also opens the door, of course, to possible abuses of power.)

The new system

In December 1989, Metro Police implemented a new $38-million system of computer-supported integrated telecommunications (CSIT), similar to others already in use in jurisdictions such as Los Angeles, Atlanta, and Chicago. By August of 1990, all police cars on the force had been outfitted with terminals, linked to a central VAX mainframe (at a cost of slightly under $100,000 per car, if the central computer is counted in). The goal of CSIT was to systematize the receiving of emergency calls, and the associated dispatching operations, and thus to enhance the efficiency of the system, the security of police officers on duty, and the level of service offered to the public.3 The study we are about to report coincided with the implementation phase and took the form of on-site observations and unstructured interviews, conducted in the dispatching centre and in patrol cars, during the first months of 1990. The purpose of the study was to describe, qualitatively, how the transition was experienced by the people concerned.

CSIT includes four components: system diagnostics, digital radio communication, voice radio communication, and computer-assisted dispatching. The system provides its own internal 'diagnostics'; when a breakdown is detected, a backup system is activated until repairs have been effected. It also maintains statistics on system use. The digital communication component, in addition to providing for the exchange of messages between dispatcher and patrol cars, maintains a constantly updated status report on all vehicles, permits the latter to interrogate police data banks directly, and allows for communication between vehicles and from them to district offices.

Voice radio communication is now more reliable and has greater
capacity than the old system. The number of available channels has been increased from five to seven (not including four reserved for digital communication). With CSIT, all the networks are meant to be integrated, thus permitting an officer using portable equipment to communicate directly with someone in a car and with the dispatching centre. Officers using the system are automatically identified in the central office as soon as they activate their equipment and have only to push a button located in their car or on their walkie-talkie to send a message for help.

Computer-assisted dispatching (which we will call CADISPATCH) is the particular object of the study. It links incoming calls from the public, routed through the 9-1-1 emergency system, and the actual dispatching of police units. Emergency calls are displayed for dispatchers on a computer screen as a telephone number and the address of the caller (both details being automatically provided to the computer by the telephone company). The dispatcher then calls up from 9-1-1 the problem and situation (indicated by a code number) and can verify if other calls have been received bearing on the same situation (to avoid useless duplication in the assigning of resources) and confirm the address and the district in which it is found. Finally, a priority code is assigned to the call, depending on the information transmitted to the 9-1-1 receiving clerk.

The dispatcher, with three screens at his or her disposal, has available a list of other calls in progress for the district concerned, along with the status of patrol cars in the area. The computer further supplies data on demand, such as previous police visits to the indicated address, a history of crimes associated with it (if any), the presence of ambulances and/or fire trucks at the site, and other calls in progress in the same district. This information allows the dispatcher to alert one or more units to respond to the call, taking account of the priority assigned by 9-1-1, the latter being based on factors such as danger to life and property and the likelihood of suspects still remaining on the scene. The computer itself suggests the appropriate action to take, although the final responsibility is the dispatcher's. The dispatcher selects which of the proposed units to activate and, by touching a single command key, sends a message on its way to the chosen cruiser (or cruisers) through the digital link, furnishing full relevant data supplied by the computer, and this may be followed up by a voice call. The cars are alerted by both an audible and a visual warning signal; once the message has been received,
they confirm receipt by touching a single key (and they may also choose to communicate on the voice channel, to clarify a point). Officers in the cars can also interrogate the central computer to obtain further information (the presence of an 'H' on the screen indicates a history of previous activities associated with this address, for example; other information concerning suspect or automobile identification is available on request). If an officer has consulted the 'history' file, this is noted. As the car arrives on the scene, it signals its arrival to the central dispatcher. The dispatcher now has a record of how much time has elapsed for each stage of the call until it is completed. As a safety measure, if a response is delayed, a verification protocol comes into play, initiated by the computer. Officers on call may remain in constant contact, if necessary, with the central dispatcher and their district office, by either the data or the voice channel, and, in the latter case, from their walkie-talkie. Priorities, and suggested lines of action, can be updated at any time, to take account of new information. If the officer has been away from his or her vehicle for any length of time, he or she can call up on the screen messages received in the interval.

Every police officer is assigned to a district, which is his or her command post. All information received by patrol cars is available to district offices, which may, if they wish, follow step by step the progress of a call and intervene to suggest courses of action. District offices can also influence the initial dispatching of cruisers, depending on their knowledge of their area and of the cars that they have assigned to a given shift. One of the selling points of the CSIT system for Metro Police, in fact, was that it seemed to make possible the return of authority over dispatching to the district office, in line with corporate policy favouring increased decentralization of operational responsibility. In turn, each district has the duty to keep the central system informed on scheduling of units and any changes in their status. The transition to the CSIT system will presumably in the long run make redundant the tedious filling out of daily activity reporting sheets, since the information will already have been stored in the computer.

Implementation planning and operationalization

In planning the introduction of the system, it was recognized that the new system would call for redefinition of a dispatcher's responsibilities. To accommodate this eventuality, an analysis of a dispatcher's
job was conducted. According to this study, of the 47 activities carried out by a dispatcher, only five would remain unchanged following implementation of CSIT, given the capabilities of the new technology. Of the 42 to be modified, it was concluded that 25 would be totally automated, and 17 others would result in reduced responsibilities. On the basis of this analysis, a decision was taken to replace the old corps of dispatchers, all of them service officers of long experience in the force, with civilians - mostly young women without specialized qualifications, who would have to be trained on the job. The planners recognized that they might encounter resistance initially to CSIT from officers in the cars, based on fear of the unknown, of being subjected to minute-to-minute surveillance, of possible ridicule, of loss of employment for some, of an increased work load, of the physical inconvenience of having to live with bulky equipment in the car, of having to learn new work routines, or merely of increased social isolation. They further anticipated a negative reaction to the employment of civilians, to the extent that this might affect police security in action. They reckoned that the resistance would be most stubborn among long-service officers. To counter these expected effects, a considerable public relations campaign was launched, using videos and other literature, intended to convince officers of the benefits of the new system. Regular bulletins were issued to personnel as implementation proceeded. A training program was put into effect, organized partly by the equipment supplier, partly by the police force itself. Groups of fifteen patrol officers received four days of intensive indoctrination intended to familiarize them with the operation of the new technology. In addition, one of the system's communication channels was dedicated to providing them with help during the breaking-in period. The new civilian dispatchers were put through a 14-week training exercise under the supervision of five of the previous officers, chosen for this purpose, who themselves had received instruction in the operation of the new system. In addition, five other officers were assigned to give courses in police philosophy. Two other employees from the information department at head office remained available to provide specialized assistance during the training period. When this training period had been completed, the ten service 'consultants' stayed on in the dispatching centre for a further five months after the initial start-up of computerized telecommunications operations, to provide continuity (they were still in evidence when the study was conducted).
Dynamics of implementation

The project was delayed initially when the Police Brotherhood, which represents the 4,450 officers on the force (from constable to detective captain), balked at the proposal to replace policemen in the dispatching centre with civilians, who would belong to a different union, representing other municipal employees. Metro Police won this battle, but not before having to take the issue to a state labour tribunal for arbitration. The incident created bad feelings and left scars, which did little to smooth the transition to a new system. There was a further delay caused by a dispute with suppliers that had to be taken to court. As a result, implementation was held up until December 1989, two years after the first signing of contracts with suppliers.

The decision of the labour tribunal against the Police Brotherhood was a particularly bitter pill for the previous corps of dispatchers. While being named dispatcher did not, officially, constitute a promotion for an officer, it was seen to be so by the people concerned. They interpreted it as recognition of merit, after years on the beat, because of the informal authority it conferred and the sense of belonging to a level of command. As one of our informants put it: 'I loved that job, ... it meant we could give the benefit of our experience on patrol to the younger officers. It allowed me to put my ability to work, to tell them what to do. It gave us a chance to exercise our authority.'

The veteran dispatchers felt unhappy about the way in which they were treated by management: 'When they made the change, they knew we were frustrated, but they could have at least met us, shook our hands, thanked us. We were left feeling bitter over how it ended ... It's already tough enough to have to tell yourself you're being replaced by a software package.' 'Now I'm back at the bottom of the ladder, where I started, in a patrol car, it's a step backwards in my career, a big step backwards ... It was rewarding work, because we could make use of our experience.' 'After ten years away from the cars, you lose the taste for that kind of work.' Many remained convinced that they could have made the transition to the new system: 'I'm far from sure that the police couldn't have handled it ... The bosses seemed to think we weren't up to it ... We hoped it would be us, we had the know-how to do the job ... The civilians had no business being there, we could have done the job.' 'I don't have anything against the civilians as such ... But we were disappointed ... I dreamed of learning the system because I'm a
computer nut ... It's a big let-down. I'm back on the road; I was really pissed off.'

The initial implementation was limited to 100 cars (out of the total of 410). The state of apprehension among members of the force is perhaps indicated by the spate of rumours that followed: 'We heard about keys that got stuck.' 'People in other cruisers told us it was terrible ... It was supposed to take you just ten seconds to get information, and instead there was a 45-minute delay.' 'Once, during a real cold spell, when one of the cars hadn't been on the road for three days, it took four hours for the computer to warm up ... And they said the computer was supposed to be up to every test!'

Unfortunately, implementation got mixed up in people's minds with another recent innovation. Officers had been supplied with new walkie-talkies which proved unreliable and were constantly breaking down. The administration had been deliberate in taking account of the complaints from officers on the beat: 'The walkie-talkies had an effect on the credibility of the new dispatching system ... They promised us the moon. The department wanted to go too fast, the expectations were too high, we were let down.'

In some quarters, skepticism reigned: 'If the new system works the way they say it will, that will be great, except that that's not what seems to be happening ... I expect it will take two or three years before it's running the way it should ... If I judge on what they usually do, it's two years since they promised us we were going to have T-shirts, it's six months since they said we would have two belts, and so on ... We're always suspicious when they announce something, because we know you have to cut it in half ... It's because of that that we expect the same for the new system of dispatching ... Just more promises.' As another informant put it: 'In the Police Department, there's no communication between the lower echelons and the upper ... We're always the last to be informed about whatever is happening. Often, we learn about things that concern us directly through the media before we hear about them through official channels. It makes you mad, it's irritating to have to live with.'

In spite of these negative comments, the people in the cars equipped with terminals were more than positive in their assessment. 'With this system, we've moved from the twentieth to the twenty-first century.' 'It's a marvellous tool; nobody doesn't like it.' 'It's pure gold.' 'It's maybe the best present the city has ever made us.' As one district
officer put it: 'My patrolmen wouldn't be without this system. Three weeks ago they had the system down for some reason - I don't know what it was - and my people were bitching like hell because it had been taken away from them for a little while.' There were still some problems, as we shall see, but the system had passed the test, as far as most users were concerned.

The problems lay elsewhere. The real issue, from the perspective of some of the people in the cruisers, was the 'foreign' presence in their midst, in the form of civilian dispatchers. The core of the problem was that the latter were not 'police minded': 'As long as the new dispatchers don't think police, it won't work. Our life depends on them. I don't think they're well enough trained. For example, during a chase, the dispatcher was more nervous than we were, and what was worse, he didn't know the part of the city by heart, he had to depend on the map.' The greatest fear centred on a major emergency: 'We knew all the old dispatchers, and then, from one day to the next, we didn't know anyone. We were afraid, what will happen, especially if something major should occur.' 'The civilians - they don't have a feeling for the calls, they don't have the judgment - it's because they're not policemen ... You have to have seen things to visualize a call.'

The supervisors had a different perspective: 'They [the policemen] are part of the same group, they talk the same language, they know where they're going, they understand the consequences of their acts, they are familiar with the kinds of calls, they know exactly the percentage of alarms where there's no real break-in and the small percentage where there is. The police know all that because they've had to learn it through experience ... It's through operations that you learn, after years and years of practice, of patrol, on the beats, by being there. Cops know who to send because they know the geography of their district. They know when to send out a car for a call from another district, when it's really urgent. They know that kind of thing.'

Not all the problems had to do with the new personnel. The nature of the interaction itself had been altered: 'On the old radio system, you could recognize the tone of voice, and when the tone changed, you knew something was happening. You didn't lose any time while you waited for the car to tell you where it was, you already knew from having heard it on the radio. Now, you don't know where the others are, unless you look it up on the terminal, and that's a big
disadvantage.' 'To be police minded meant that you always knew where the cars were, like it was second nature.' 'With the voice system, if someone was in trouble, you knew it just by the intonation of the voice. Now, if everything's going to be keyed in, we won't know what's happening.' 'A 10-07 [call for help], we prefer to give it by radio, you feel safer, that's why we're skeptical about using the terminal for a 10-07.'

For the patrol officer alone in a cruiser, the insecurity was even greater: 'What worries me is that we won't be able to keep in touch any longer with what's going on in the sector, when now we can't hear it on the radio ... I won't know what's happening the way I did ... because I won't be looking at what the others are doing because I'm alone and I have to keep my eyes on the road ahead of me while I'm driving. At night, for ordinary calls, that can be dangerous when a car goes on a call but no one knows, while now, if you're close by, you go there ... You can't spend all your time on the terminal just to be sure what the others are doing.'

Quite apart from initial apprehension, difficulties with the system began to emerge with use. These tended to reinforce the patrol officers' sense of vulnerability in the cars and their suspicion that the new system was not working the way it should. The first of these could be traced back to a facility of the central data bank - a computer-generated map of city streets, on the basis of which dispatchers were supposed to locate the cars in a given district and place the calls as they came in from 9-1-1. This system did not work as well as it might have. Since it was based on 1975 data, not all streets were included (or at least adequately shown). The area of the city displayed on a terminal did not include neighbouring districts, and so the dispatcher had no way of knowing whether there was a car closer to the scene than the one that the system had identified. Especially at the beginning, the new dispatchers wasted time locating calls on the map. As a further complication, the pattern of cars available for a given district varied, depending on the shift. The system would recommend dispatch of a car when a better-placed vehicle in a nearby district could have been activated. This increased the comings and goings of those on patrol and generated inefficiencies. The people in the cars, who had no picture of what was happening in the dispatching centre, found this hard to understand.

The technical difficulties affected relations, already strained, between police officers and dispatchers. During one observation
period, the unit being studied received a call to proceed to a distant point. Because of a major throughway, the destination could be reached only by a roundabout route. The cruiser advised the centre of this difficulty but was instructed to cover the call anyway. The officers complied, but not without demurring: 'This would never have happened before, when we had police dispatchers, because now there is going to be nobody left in our sector. It's not worth it, just for an ordinary alarm. By the time we get there, even if there had been a break-in, the thieves will have long since gone. The old police dispatchers knew that kind of thing already ... You have to think these new ones don't.'

Most of the negative complaints about the civilian dispatchers could be traced back to the same source, the map. But the problem tended to be identified as a human, not technical, one: 'They've always got you on the go. They're too systematic in the way they do their job; they're not police minded.' 'Now, with the civilian dispatchers, you're on the move more, they don't pay attention to the districts.' 'They waste time looking up locations. They don't know the sector.' 'If the cruisers have to spend more time driving around, it's because the dispatchers don't know their city. When something doesn't work, and when we get fed up with the dispatchers, it's because they don't think police, all they think about is just passing on the call, period. Where we are, here, this is a big sector. They don't think about that. When we still had police dispatchers, they kept us in our sector, because they knew our sector.' 'With the new system, the patrol sectors don't exist anymore; they [the dispatchers] are playing yo-yo with you all the time.' 'The fact that the dispatchers don't know the city is very bad. The people before knew the city, the one-ways, they could also tell us, when we engaged in an operation, "You, you block off that street there, you the other one, OK?"'

The dispatchers, for their part, were aware that their problems with the technology were being personalized by their clients in the cruisers, and they were doing their best to correct the problem: 'If we only had up-to-date maps, it would help ... Several of us have gone out and bought little books of maps with the streets up-dated ... It was tiresome to have to listen to all the complaints ... It would be easier if we had maps that were up to date, because it's easier to find the streets, but they don't want to give them to us because the computer is supposed to be there for that.'

This was not the only technical problem with which the new
dispatchers had to cope. When a call came in, the computer suggested the car (or cars) that might most appropriately be assigned to it. The system proceeded somewhat mechanically, based on geographical proximity, without taking account of the relevant history for the call (which was stored in another database). As a result, dispatchers had to make assignments different from the computer's about half of the time; planners had predicted a rate closer to 5 per cent. The resulting delays, and occasional errors in assigning the right number of cars to a call, tended again to be blamed on the people in the dispatching centre.

There was yet another problem. The system was supposed to show automatically who was speaking on the network, without people having to identify themselves. In practice, this seldom occurred, so that time was lost (and confusion created) by having to break in to request the unit to identify itself. This might have been only a minor irritation, but it was further complicated by another feature of the system, which required the speaker to pause for a split second before speaking. Since speakers usually forgot to pause, dispatchers often missed the first part of the incoming message and had to ask for a repeat. This distraction may, again, have been trivial, but it added to the complexity of the dispatchers' task and made them look even more awkward. As one of them remarked: 'There's such a difference between theory and practice!'

The dispatchers, all young and two-thirds of them female (working in a male-dominated world, where more than 60 per cent of the men had ten years of service or more), were under no illusions about the feelings that the police in patrol cars harboured towards them. Quite naturally, they were concerned about how to be accepted by colleagues with whom they were in almost constant communication. The dispatchers were caught in a bind: on the one hand, they were prepared to do all they could to stay in the good graces of the patrol officers; on the other hand, they knew that they would be held to account for their decisions by their own superiors, and they were anxious to avoid blame for any mistakes. It was a dilemma reflected in the character of the communication linking the dispatching centre to the cars: 'The cruisers don't like that - to be asked what's happening to them. They think they're being spied on, so the only time we do it is when they're on meal breaks, because we get checked up on that.' 'We only follow up on that when they're taking a meal because if we don't we get jumped on by the supervisors here in the centre ...
They check everything then.' 'The time monitor function is more a service for the dispatcher than anything else. The people on patrol don't like being asked what's going on, why something is taking such and such an amount of time, so we don't ask them just so we won't get under their skin.' Observations in the patrol cars confirmed the dispatchers' reading of the situation: 'The time the system allots for each call is reasonable enough, so if you're on a routine call, and the dispatcher calls you, you don't know for sure whether she's calling you to find out if everything's okay, safety-wise, or to check up on you, to control you.'

For the newcomers, it seemed like a 'no-win' situation. According to one of the police supervisors in the centre: 'They're trying 110 per cent to be accepted, but they try too hard, and the result is that half the time they get on people's nerves.' Even worse, in the eyes of veterans functioning as consultants in a centre they once ran themselves: 'It used to be when the patrolmen got out of line and began to act smart, we gave it back to them. Now, the dispatchers want to be accepted, and they don't dare say anything ... They don't know how to give orders.' 'I told them the people in the cars would try things on for size, by taking extra-long lunches, for example, and for last-minute calls. It used to be they would try the same thing on us, but we knew their game. The people here in the centre don't use their authority, especially the young girls who don't know how to make sure they're listened to, there's the lack of experience too ... They're not sure of themselves.' 'The new dispatchers don't know police regulations the way they should, and the people in the cars take advantage of that.' 'Here in Dispatching we don't have the authority to give formal orders, but informally when you assign a call you have to make sure it's accepted. There have been abuses lately - people getting into bad habits. They used to try it out on us too; they'd do it once, but they wouldn't do it a second time!' 'It used to be we made the calls, period. The new people aren't in control. It's getting better though, because they've begun to realize they're being taken advantage of.'

This was not exactly how the patrol officers saw it. They found the new dispatchers even tougher than the old: 'When it comes to lunches and breaks, you have to look out for yourself, 'cause the dispatchers won't. The dispatchers and the bosses, they aren't thinking about us. You look out for yourself, all alone. If you were to take all the calls
one after the other, you would never be eating when you should. Before, it was the same I guess, but the old dispatchers were more smooth. They knew how it went better.'

The balance between formal and informal has many dimensions in communication at work. One illustration showed up in another spontaneous phenomenon, the sending of 'personal' messages to dispatchers, particularly during the long watches of the night shift. In the words of one dispatcher (female): 'When there's nothing much happening, especially at night, there are lots of patrolmen who send us messages on the computer terminal; it's fun, and it's a way to communicate with them. Afterwards, they're more cooperative.' These kinds of communication were strictly forbidden by supervisors, who interpreted them as flirting. Civilians interpreted this ban as rejection - a message that they are not supposed to be friends with the police on patrol: 'The supervisors don't want us sending personal messages of any kind, but we do anyway because it's a way of creating links between us and the people on patrol. Anyway, the sergeant doesn't get to see the messages we send each other, unless he requests them specifically ... Certainly, there have been liberties where personal messages were concerned, especially at night, but there's nothing else to do, so we exchange messages with them ... It's a way to lighten the atmosphere a bit.' 'If the dispatchers were cops, the sergeant wouldn't be so strict. Our bosses treat us differently than they do policemen. The sergeant is opposed to civilians. He is always watching us on the messages. The people in charge don't want any familiarity between dispatchers and the police, even though we work with them in cooperation, and not against them.'

The incident reveals the tension prevailing in some - though not all - teams of dispatchers: 'There are some teams where there's no problem, but in others the people in charge treat you as if you were not even there. They spend their time watching us, listening in on the calls, and riding us, but again it depends on the team ... Among ourselves, we have fun, but not where they're concerned. They don't have enough to do, looking after their own business; they have to be all the time telling us what to do, whether we ask them for their advice or not.'

Buried somewhere within most of the difficulties in communication between civilian dispatchers and police on patrol was an issue of who was to decide what under which circumstances. Both sides believed that 'where we always seem to get hung up is on procedures
or directives.' During one observation period, for example, the issue being worked out between dispatcher and patrol officer was whether the call should be answered by a unit manned by one officer or one with two, a driver and partner. At another moment, the debate involved a solo officer in a cruiser who insisted that he did not have to take this kind of call, only to be corrected by the dispatcher, who pointed out that during daylight hours he was indeed authorized to do so. Such differences of interpretation were not infrequent. According to supervisors and ex-dispatchers-turned-'consultants,' the dispatchers were to be boss, when it came to the point of issuing orders. What made the situation complicated, however, was that, officially, responsibility for making decisions was supposed to be the district's, where the command post of each officer was located. This ambiguity easily became translated into even more arguments.

During one observation in a patrol car, just as the shift was about to end, a last-minute call came in. The officers in the car protested vigorously that it was too late for them to take. Their superior officer came on to the line and, in a tone of voice that verged on arrogance, demanded to know of the dispatcher whether the call could not wait until the next shift began. The dispatcher replied that he was required by the rules to dispatch immediately, without exception, all calls coded from 'one' to 'four' (calls coded 'one' being most urgent; codes 'one,' 'two,' and 'three' all being nevertheless transmitted both digitally and on the voice channel, while 'four' calls were sent only via the computer terminal). The district officer then intervened to say that it was his understanding that the rule applied only to codes 'one' and 'two.' This decision went to the district officer, the dispatcher having deferred to him, but all parties were left feeling perplexed and a bit frustrated, particularly with respect to how to behave in the future in a similar situation. Later the district lieutenant, at a pre-shift briefing, confirmed that dispatchers were under orders to pass on calls up to, and including, 'four' until ten minutes from the end of the shift, and if they failed to do so they would be held solely responsible for their delinquency. This statement hardly ended the perplexity.
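Stated as a rule, the lieutenant's directive was simple enough. A minimal sketch, in modern Python (our illustration; the force's actual protocol was not, of course, written this way):

    # The dispatching rule as finally confirmed: calls coded 'one' to
    # 'four' must be passed on until ten minutes before the end of the
    # shift; codes one to three go out on both the digital and the voice
    # channels, while code four goes to the car's terminal only.

    def channels(code):
        return ("digital", "voice") if code <= 3 else ("digital",)

    def must_dispatch(code, minutes_to_shift_end):
        return 1 <= code <= 4 and minutes_to_shift_end > 10

    assert must_dispatch(4, 15)          # the contested last-minute call
    assert not must_dispatch(4, 5)       # within ten minutes of shift end
    assert channels(4) == ("digital",)   # terminal only, no voice

What the episode shows is that the perplexity was never about the content of such a rule, which anyone could state, but about who had the authority to invoke it.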
In the words of one dispatcher: 'In the city bureaucracy, it's everything or nothing. The people out on patrol don't like getting calls ten minutes before the end of their shift. This new procedure is going to cost the administration a fortune in overtime. The cops react very negatively to this kind of thing, but we don't have a choice: the supervisor sees exactly when we receive the calls, and which cars are available, and we have to dispatch them. What's going to happen is that the cars are all going to start becoming "unavailable" because they'll all be following up on something that they've seen on patrol.'

One instance illustrates how such policies work out in practice. In response to a last-minute call, a cruiser replied on the digital channel that he would not follow up on the call but would leave it for the next shift. The dispatcher replied by saying that the cruiser was the boss and could handle the call however he liked. The officer, in turn, said that the dispatcher was responsible. The dispatcher remarked to the researcher: 'If that guy wanted to be rid of the call, all he had to do was punch in that he was on another call that had come up on patrol, but he's not experienced enough to realize that.' As one officer said: 'It hasn't changed the productivity; if you put off a call before, you can still do the same thing now.'

The problem of authority was something new for the districts and arose directly out of implementation of the new system. Previously, even though formal authority resided in the district, the latter had no means of enforcement, since it did not have access to the necessary information to make decisions. Everybody understood, and accepted, this de facto state of affairs. As one patrol officer put it: 'Before, there was no problem, everyone knew who did what, it was clear how to proceed.' With CSIT, the only thing that seemed totally clear was the possibility of being punished for a mistake - especially since the presence of the computer made it so easy to go back and check how someone had done a job: 'If they want to find out something about an event, previously they had to go back and check the tapes, now all they have to know is the approximate hour and they'll find out.' 'It's easier to check up on you; that's how they can come down on you.'

Given the ambiguity, the unsatisfactory relations with the police in the cruisers, and the omnipresent threat of sanctions, dispatchers had to fall back on rigid application of the rules. Even here problems could arise. In the words of a district officer: 'We're not out to get the dispatchers. They are under orders to make the calls, but we are instructed not to authorize overtime.'

So where did the authority lie? Depending on whom you talked to, the answer could be quite different: 'Who's boss? Everybody seems to be involved. Officially, it's supposed to be defined a hundred per cent, it's supposed to be the district officer, but the priorities are
different from one situation to another.' 'We're supposed to ask permission from the dispatcher for everything, but the dispatchers say it's up to the patrolman to decide.' In the view of a former dispatcher, now a 'consultant': 'If the dispatcher-police relation isn't all that great, if there's confusion about procedures, it's because the dispatchers don't have the authority. It's the centre who should decide; they are trying to decentralize something that can't be decentralized.'

Out of this confusion, the winners seemed to be the police on patrol, because they were the ones who best understood how the system had once worked and (at least in their own view) how it should still be working. Because of their insecurity, the young dispatchers would do everything they could to conform to the style that was most acceptable to the people with whom they spent their day communicating. The officers to whom the researcher spoke seemed to recognize this: 'It's the dispatchers who will have to learn to think police, not us who have to think civilian.' 'It's getting better, now they're starting to talk our language.' 'It was tough at the start, it was really ugly, but the dispatchers have backed off some and they're starting to adapt.' 'They're learning to think police, they think less and less like civilians.'

So the new dispatchers were trying - under the pressure of the job and while absorbing little by little the philosophy of the former regime, now their principal advisers - to take on the mantle of the office, in conformity with the unspoken conventions of police culture. At the conclusion of the research period, their ability to do so still remained very much constrained by the strictly applied rules within which they had to function and the ambiguity as to where the real authority lay - in the districts, in the dispatching centre, in the field where the operations were - or in the computer! Whenever they could, with impunity, conform to the practices typical of the old system, for issues such as last-minute calls, they did so; otherwise, they would fall back on strict application of the rules, even at the risk of being considered too rigid and not sufficiently 'police minded.' It was, for the moment, an unstable equilibrium.

Conclusion

We are struck when we read this account of an actual experience in automation by how little it conforms to the image in the trade journals of happy office workers, increasing their productivity by the minute
through better communication technology. According to all reports, the new system of computer-linked communication was a vast improvement technically over the previous one. But it did not visibly increase productivity - its ostensible purpose - and it seriously disrupted the natural mechanisms of control built into the human interactions that it displaced. Those 'mechanisms of control' were part of an ongoing conversation, maintained daily through exchanges involving the old dispatchers and the police officers in the patrol cars. The new conversation, at the time when it was observed, had not yet stabilized, leaving issues of authority very much 'up in the air' and the system designers' distribution of authority 'out of sync' with what seemed to be emerging. Presumably, with time, the new conversation would stabilize, into a form that is difficult to predict, and at that point reorganization would have occurred. Whether the new organization would resemble what the people who decided to implement the system hoped for, however, is an open question.

Automation from the point of view of system designers

Our second case study also looks at automation, but from the very different perspective of designers and developers of administrative systems. Through documentary analysis of texts produced by system designers and through observation of the conversations that occurred throughout the final developmental stages leading up to implementation, we can trace the steps that go into a software production process over a two-year period, from initial plan to delivery of the product. The process was recorded by one of the participants, albeit a somewhat peripheral one - the person responsible for developing the documentation that normally accompanies a new package to assist the user. The study has thus both an 'inside' and an 'outside' perspective. The observations come from someone who played an indispensable role, had access to proprietary documents, and was privy to the daily exchanges of developers; however, that person's role was a related function, not software development as such.4

The context of the case study

The experience to be described took place within a software development firm (which we call Sofdel), while a new system was being worked through for a large corporate client. The client was a service-
oriented firm (we call it Clyco), with operations that are continental in scale and depend on a highly complex messaging network.5 The software system under development had as its goal to increase the efficiency of the corporate network infrastructure. The basic software idea had originally been developed for another company, but because this earlier project fell through, Sofdel was left with a product that it was convinced had great promise. It continued to work on the product, and its confidence was rewarded when Clyco proved receptive to its sales pitch and was ready to enter into an agreement to purchase the software, subject to guarantees of delivery following its specifications.

Within Sofdel, the two main departments concerned were software planning and software development. Software planners, in the Sofdel scheme of things, define new technologies up to the point where software developers take over. They are responsible for creating business opportunities; they are the technology dreamers. This is the group that wrote the initial project specification documents and conducted research into the customer's operations, on the basis of which it prepared the blueprints for the design of the software. Software designers are the people who do the actual product development. They typically hold at least one degree in computer science, and their average age is about 30. They are versed in traditional methods of systems analysis and design; until they have weathered a few projects, they tend to rely on school-based methods of work.

During the life of the project, a development committee, with representatives from Sofdel and from Clyco, met regularly. The client's representatives were responsible for planning the project within their own organization. They belonged to an operational methods group; their normal function is standardization of administrative operations - procedures - within one of three corporate 'families': Traffic, Transmission, and Network Maintenance. Each family is a functionally distinct line group, hierarchically structured into tiers, with a methods group in each. Standardization of operations aims to determine the most cost-effective way of doing things; on the basis of periodic economic analyses, the departments write corporate practices and procedures that specify how tasks are to be completed in the organization.

The development committee had no representatives from among those who would eventually use the software, although the methods people could claim that they were continually in contact with working members of the line group in the normal course of their work. They
talk to line people frequently, it appears, in order to collect information for their own planning.

The liaison committee had no formal structure, other than the one provided by the respective internal hierarchical arrangements of each of the partners in development. People were brought together on the basis of their specific mandates and expertise. Each person represented the part of the corporate structure with which he or she was associated and was accountable only to his or her own superiors. Within Sofdel, one individual was designated team leader, but he functioned in other respects as first among equals. Within Clyco, the methods department in Traffic was considered 'prime' for this project, but the exact meaning of this label was not totally clear, and relations among the three departments remained largely horizontal. All those concerned were middle-level managers, well below the rank of senior officers.

The core technology

The core software component on which the new system was based - the Generic Forms Tools (GFT) - was already in existence. It was aimed at providing for the electronic processing of forms within a network context. Routing and tracking functions would be built in. A product of Sofdel's research arm, it had been presented publicly to professional gatherings of computer programmers, who had given it a favourable reception, further convincing the developers that they were on the right track.

GFT provided the possibility for on-line configuration of forms and processes by the client company's personnel. It would allow the client to incorporate new procedures and methods into the system without returning to the traditional development cycle, with its non-incremental logic. The GFT idea was to move the customization of the software away from the software designers and into the hands of the user. The building block of the new system was a form whose profile would not only record essential data but also include a description of the rules and permissions (special routing privileges, specific protection for strategic fields of information, and so on) used to route the captured data to other users of the system. The profile would define not just the categories of data but equally the set of tasks to be performed on the data. Modifying the system would be the responsibility of system
administrators using powerful editors to change contents, layout, routing rules, and user permissions, as demands evolved with the growth and transformation of the company's operations. The setting up of form profiles would be a 'non-trivial task,' but since it would require an understanding of the operational processes and no programming expertise, this difficulty was not seen as an insuperable barrier to productive use. Nevertheless, the administrator charged with the task of modifying the system would have to have some knowledge of, and facility with, computers and systems theory.

Once a form had been generated operationally, it would follow a path consisting of nodes linked in various configurations, with each node corresponding to a function to be performed and a user group responsible for the function. Part of the description of a node included its privileges and permissions (who is authorized to create forms, to which functions users are permitted access, and which other systems are accessible to users). Individual users might, or might not, appear as nodes in more than one path. Routing procedures had to be flexible enough to allow for maximal automation where highly structured, well-defined process flows were concerned, for more flexibility where process flows required decisions, and for handling exceptions. This was the basic technology already available as the project began. (A schematic sketch of such a form profile follows below.)

Software development as the writing of a text

In general, the systems development process is characterized by three stages: systems analysis, systems design, and implementation. Systems analysis defines the scope of the project. Activities include information gathering and analysis. The resulting report will state the scope and objectives of the analysis; analyse the characteristics of the present administrative system that is to be automated, including constraints on it; report on the technological alternatives available and their feasibility in the present context; and estimate the resources required, both human and financial, for the project. The report of the systems design phase typically reviews goals and objectives, lays out the overall system model, evaluates organizational constraints, considers alternative design features, conducts a cost-benefit analysis, and makes recommendations.
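To make the form-profile idea concrete, here is a minimal sketch in modern Python (our illustration; all field, node, and permission names are hypothetical, since our sources do not record GFT's actual notation):

    # A schematic illustration (ours, not Sofdel's) of a GFT-style form
    # profile: the profile defines both the data fields a form captures
    # and the path of nodes - each a function plus a responsible user
    # group, with its own permissions - along which the data are routed.

    from dataclasses import dataclass, field

    @dataclass
    class Node:
        function: str                 # task to be performed at this node
        user_group: str               # group responsible for the function
        may_create: bool = False      # privilege: may this group create forms?
        protected_fields: set = field(default_factory=set)  # hidden here

    @dataclass
    class FormProfile:
        form_type: str
        fields: list                  # categories of data the form records
        path: list                    # ordered Nodes the form passes through

        def route(self, data):
            """Pass the form along its path, masking protected fields."""
            for node in self.path:
                visible = {k: v for k, v in data.items()
                           if k not in node.protected_fields}
                print(f"{node.user_group}: {node.function} -> {visible}")

    # Hypothetical example: a new-line order form.
    profile = FormProfile(
        form_type="new_line_order",
        fields=["customer", "line_id", "billing_code"],
        path=[Node("record order", "marketing", may_create=True),
              Node("update inventory", "transmission",
                   protected_fields={"billing_code"})])
    profile.route({"customer": "A. Client", "line_id": "L-100",
                   "billing_code": "B-7"})

Even in this toy form, the design intention is visible: changing the organization's procedures becomes a matter of editing such profiles, not of rewriting programs.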
In implementation, the real programming occurs in response to clients' exigencies, end-user documentation is written, users are trained, and the old system is actually converted to the new. Overall, the process translates an initial perception of how a particular organization works into a formalized representation of it which conforms to the strictures of computer programming and allows for its modelling in computer-recognizable terms. The organization is literally encoded into the language of the computer. The user's subsequent experience with the new computer-based system (which is why he or she must be trained) decodes an image of his or her own work, from the language of computing into actual operations - reading the computer text. The cycle is complete: from organization, to representation, to reorganization.

The initial phases of this project were the responsibility of the system planners and took the form of an operations plan, a technical proposal, a 'user needs requirements analysis,' and a project management plan. Analysis of these documents constituted the first part of the case study that we are describing.

The operations plan was written in 1986. It was produced by a member of the planning department, following an investigation of Clyco's administrative operations, through informal interviews, sampling of actual administrative forms and user manuals, and discussions with people in administration. This phase was funded by Clyco. The plan begins by situating the proposed project with respect to the strategic objectives of the client company. These include standardization of the practices and procedures of administrative operations, with the long-term goal of greater efficiency and reduced costs, through the 'phasing out' of clerical workers and the substitution of automated information capture and processing functions performed by computers. To this end, the operations plan analyses the operations of the network on which the company depends, using data collected during the field investigation on which the plan is based and concentrating on how vital administrative information could be made more easily accessible in computerized form to key departments.

The method employed in describing the operation of the network is 'flow charting,' a technique commonly employed in computer science. A flow chart, in its most elementary form, is a language for representing structural and behavioural aspects of a network by employing symbols for algorithms and decision-making branch points. A trivial version, showing the calculation of a grocery bill, is shown in Figure 2.1.
Figure 2.1 Sample flow chart for a grocery bill calculation
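The logic such a chart diagrams - a loop over items with a decision branch - can be suggested in a few lines of code. This rendering is hypothetical; the prices and the tax rate are invented for illustration:

```python
# A code rendering of the sort of trivial flow chart in Figure 2.1:
# read items, loop with a decision branch, output a total.
def grocery_bill(items, tax_rate=0.07):
    """items: list of (price, taxable) pairs; the rate is illustrative."""
    total = 0.0
    for price, taxable in items:       # loop: 'more items to ring up?'
        total += price                 # process box: 'add price to total'
        if taxable:                    # decision diamond: 'item taxable?'
            total += price * tax_rate  # process box: 'add tax'
    return round(total, 2)

print(grocery_bill([(2.50, False), (4.00, True)]))  # 6.78
```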
In simplified form, this is the type of diagram that was employed to map out the administration of Clyco. Eight functional categories (information processes) were identified, among them manual collection of information, manual 'download' of information to the network, electronic 'download' of information to the network, record-keeping of information with computer systems, record-keeping of network usage with computer systems, audit systems, and procedural systems. At this point, Clyco's administration has been translated into a representation which describes it as a set of information processes and decision-making rules. The operations plan frames the project. It has a technological dimension - a software tool; an economic dimension - increased
efficiency and reduced costs; and a strategic dimension - reorganization of operations through standardization in order to increase scale in the face of global competition. The plan concludes with a 'feature rollout,' or list of recommended system features, and the benefits to be expected from each (one feature, for example, is the mechanization and automated updating of administrative databases, resulting in the elimination of clerks performing manual input during network administration; a second is automated validation of data consistency among databases, resulting in synchronization of databases and reduction of errors). Finally, the report includes a timetable of activities to begin in 1987, leading up to implementation in 1989. Acceptance of the operations plan by senior management at Clyco was the signal to prepare a technical proposal. This document differs from the operations plan principally in its fineness of detail; it too reiterates the strategic advantages of the proposed system and the economic benefits of implementation. Its objective is to justify expenditures on the crucial phase of software development leading to implementation. The technical proposal develops in depth the analysis of network operations begun in the earlier document. Its analysis is based on a study of administrative practices written by various methods groups, supplemented by interviews with some key line informants. In an organization such as Clyco, there are 'practices' for almost every task making up every job. The technical proposal rests on this material; as we shall see, its authors had to assume that the practices described by the methods analysts were current and represented how the work was actually performed. The assumptions, as it turned out, were not always well founded. The interpretation of the data was based on structured analysis, a formal technique (cf. De Marco 1978) for composing data flow diagrams (DFDs) using a language with symbols for four elements: data flows (vectors or arrows), processes or functions (circles or boxes with rounded edges, known as 'bubbles'), files (straight lines), and data sources or sinks/destinations (square boxes). A function represents an operation that modifies incoming data; it is assumed that when people work with information they are performing functions. Functions that appear in one representation as a single process can be 'blown up,' or expanded, in another to show in greater detail the nature of the operations being performed. This is known as the level of 'system specification' (De Marco 1978). Using a DFD, the analyst
follows the path of transformations of information from the point where it enters the system to the point where it leaves. The diagram is meant to capture the flow of information and, in the process, to identify duplications, redundancies, irrelevant sources of information, and unnecessary manipulations. It is intended to be a model of how the system actually functions, not of how it ought to. Suppose, for example, that the operation in question is a response to a new client. The initial step, for the part of the operation concerned with networking, is to assign a new line to the customer. This sets off a number of subsequent operations. First, this information is recorded in a marketing database. Second, the transmission inventory must be updated. Third, the forms inventory must be updated. Finally, the form, now having been created, must be distributed to other departments (such as billing) that may be concerned (Figure 2.2).

Figure 2.2 Sample data flow diagram

All this gets captured in a DFD.
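As a rough sketch - the labels are ours, invented to match the example, not taken from the actual proposal - the four elements of structured analysis can be written down directly:

```python
# The new-client example as structured-analysis elements: flows
# (arrows), processes ('bubbles'), files, and sources/sinks.
processes = ["assign new line", "record in marketing DB",
             "update transmission inventory", "update forms inventory",
             "distribute form"]
files = ["marketing database", "transmission inventory", "forms inventory"]
sources_and_sinks = ["new client", "billing department"]

# Each data flow is (origin, datum, destination).
flows = [
    ("new client", "service request", "assign new line"),
    ("assign new line", "line record", "record in marketing DB"),
    ("record in marketing DB", "line record", "marketing database"),
    ("assign new line", "line record", "update transmission inventory"),
    ("update transmission inventory", "entry", "transmission inventory"),
    ("assign new line", "new form", "update forms inventory"),
    ("update forms inventory", "form entry", "forms inventory"),
    ("update forms inventory", "completed form", "distribute form"),
    ("distribute form", "completed form", "billing department"),
]

# The analyst 'follows the path of transformations' by walking the flows.
for origin, datum, dest in flows:
    print(f"{origin} --[{datum}]--> {dest}")
```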
Whether information flows correspond to the hierarchy of authority is not part of the analysis. All aspects of administration not concerned with the networking of information are ignored. The analysis highlighted what the authors took to be drawbacks in the current system. First, they noted that processes often involved large numbers of groups, with a resulting flood of paper communication (forms, memos, and so on). Second, the analysts discovered a plethora of different forms in use, often confined to a single administrative unit, the result of local initiative. Third, because of the number of people involved in handling operations, responsibility for the tracking of administrative forms was diluted, with little control over the length of time that it took to carry out a process and limited control by management over its own processes. Finally, delays in carrying out a process (typically 46 days for a form) made detection and correction of errors difficult, particularly if the solution required coordinating the efforts of several groups. Nobody, it appeared, had a global view of the overall process. These criticisms of the paper-based system in use helped justify implementation of new electronic forms. The authors' argument was backed up further by a fairly elaborate statistical analysis designed to show the savings resulting from transition to a cleaner set of processes. In the redesigned universe of work, it was assumed that each employee would have access to a workstation, interconnected electronically to the rest of the network. Here the authors took a radical step. With GFT software, a form not merely records categories of data but also incorporates a representation of all the steps of data flow which follow from the moment a request is initiated until it has been fully dealt with. Thus it should be logically possible to complete the full process from a single workstation. Each function, from ordering lines to billing clients to updating company databases, could be accomplished by a single keystroke, from a single computer. No more clutter of paper-based forms between functionally distinct groups; all the work formerly accomplished by a number of groups could be done by a single clerk. This was at least the ideal solution, although it was recognized that reaching it might not immediately follow installation of the new technology. Once this goal was achieved, however, in the view of the planners, the confused responsibilities and incoherent communication exchanges of the past would be eliminated, and control over the system would be returned where it belonged: to management. The new system, in turn, would support the strategic objectives of management in the strongly expansionist marketplace in which it perceived itself to be. Obviously, such massive reshaping of information flows would transform existing organizational structure. Just how radical,
management would have to decide; while the technology permitted a single-clerk operation, it could also allow less centralized solutions, including division of the administrative task among many people, if that were thought desirable. Certainly the proposed software permitted whatever level of standardization of administrative practices management wanted. Notably, this was the only reference in any of the documentation to the relationship between the software system and the actual administrative structure. Acceptance of the technical proposal by Clyco's management triggered preparation of another report, called 'user needs requirements.' This document, written in 1988, was prepared by the person designated by Clyco as having 'prime' responsibility for the project inside the client firm, the traffic methods manager, Jeff B., with the assistance of a colleague, Mary R. (not their real names), whose twenty years of experience in the department, in a variety of positions, gave her considerable direct knowledge of operations and a good sense of the feasibility of a given solution. By the time they set about drafting a report on the user's needs, both Jeff B. and Mary R. had heard many detailed presentations on the GFT software and had become thoroughly familiar with its characteristics and potential. This knowledge is plainly visible in the structure of their report, which is composed of two elements. First, the administrative processes are analysed using a version of structured analysis; second, the individual 'bubbles' making up the diagram, describing processes, are exploded as secondary diagrams representing features of the proposed administrative software. In business operations, creation and maintenance of forms constitute a central node directing a number of subsidiary operations (such as those we have already described), including assignment of lines, updating of inventories, opening of accounts, and interconnections to other departments, such as Traffic, Marketing, and Accounts. When the business node corresponding to maintenance of forms is 'exploded' during software development, the subsidiary functions take on a different character, creating users, profiles, nodes, and paths. In this way, the knowledge of administrative process possessed by the methods people in Traffic, including the priority of the operation, is translated into a set of functions that can be operationalized in programming code by the system designers. The result is designated Administration Forms software - GFT in a new guise.
While the report is described as an analysis of 'user needs requirements' and was written by methods people in the client organization, the predominant influence of the programmers is evident. The report is based not on systematic investigation of the users' actual needs but on knowledge already available to its two authors, framed in a language comprehensible to the system designers. Submission of the report led to the signing of a contract between Clyco and Sofdel and approval for the design and implementation of the new administration software. In 1989, the project was transferred from the planning department to a software development group in Sofdel. A first project management plan was issued in June 1989, to be followed by a further plan, with revised targets and deadlines, in 1990. By this time, planning documents were specifying 'project deliverables' (software components), the organization of the project, individual 'task assignments,' managerial techniques, and the technical processes needed to support the project. Individual developers could begin production of particular features of the administration software (of which the forms software was a principal element). We are now in the 'nitty-gritty' of actual development and implementation, even to the point of user training. A number of characteristics of software development are by now evident. First, the system was designed in relative isolation from the eventual users; any interaction between Sofdel and Clyco was mediated largely by the latter's methods people, who were already half converted to the systems philosophy. Second, analysis of the structure of work at Clyco was based on formal principles of computer design, framed in specialist language, not on empirically grounded research into what Suchman (1987) has called 'situated practices' (of the kind we described above in police work). The organization was mapped onto the software model, not the reverse. Third, administration was conceptualized in strictly functionalist terms, as changes operated on information flows. Issues of social identity and commitment such as we discovered in the previous case study have no part in such an analysis. Finally, there was a critical assumption of standardization: that written administrative practices represent how users work, that practices are stable across time, that job responsibilities covered by a given job description are homogeneous across regions and offices, that interdepartmental relations are consistent, that use of administrative practices is predictable across the organization, and, ultimately, that administrative practices can be standardized.
As the next phase of development progressed, some of these articles of faith were to prove shaky, at best; the young systems developers were in for some rude shocks.

Software development as conversation

The phase of development we are about to describe covers the period from December 1990 to July 1991. We shall be concerned with only one of the sub-groups assigned to software development, but an important one: the administration forms team. The GFT-based forms software was the point of reference for all the ancillary functions that would have to be programmed (such as database and network management) and constituted the visible interface between the system and its users. It was important to get it right, if the transition to a new electronic environment were to be successful. After all, the mass of records being maintained in paper format would have to be transferred to a standardized electronic format, accessible from all types of terminals. The integrity of the network depended on maintenance of these records in clear logical order and their accessibility when required by users. Construction of new administration forms would have to be a cooperative effort, involving both system developers and client representatives. It was up to the latter to specify the fields that would appear on a form on the basis of their knowledge of administrative processes. The system developers had to translate the resulting configuration, or profile, into computer algorithms. To this end, a working group was set up, composed of representatives from both sides - the client organization and the system developers. On the software side, the team was headed up by a level-D manager (Stewart D.), assisted by a second manager and six software designers, each responsible for a software module. Stewart D.'s task was to supervise integration of the software modules to produce a coherent whole. On the client side, each 'family' (Traffic, Transmission, and Network Maintenance) was represented by two methods specialists, a tier-3 manager assisted by a tier-4 manager. Each of these representatives was responsible to his or her own family, although it was understood that the Traffic representative, Jeff B., and his assistant, Mary R. (the same two who had written the 'user needs requirements' report), would have 'primary' responsibility. (See Figure 2.3.)

Figure 2.3 Administration of the software project team

Difficulties appeared almost immediately. They centred on what had before seemed a transparent concept, namely 'administrative
processes.' What appeared to a computer analyst to be an administrative process turned out to be a rather different entity when seen from the perspective of a manager. As long as the project was in the planning stage, these differences did not greatly matter; each side could safely lend the term the sense that it liked, without danger of contradiction, since the disparity was purely theoretical and without immediate practical consequences. It was when it came to agreeing on the content of a form about to be put into use that the gap in understanding took on significance. The 'crunch' came during operationalization of an otherwise abstract concept. Little by little, it became clear that the concept was not so context-free as the planners had supposed; in real-life operations, the meaning of the term seemed to vary from one situation to another, depending on local understandings and practices.6 Computer analysts saw administrative processes as equivalent to information-manipulation; during the development phase, it became evident that, for users, administrative processes reflected the social dynamics of the organization, and these could not be ignored if users' essential cooperation were to be assured. Users' representatives found it difficult to relate to how
the concept was inscribed in computer code. The designers began by conceiving of work patterns as homogeneous: a process is a process is a process. The users saw things differently. The concrete issues now confronting the system developers were no longer theoretical. There was a particular problem in establishing criteria for defining a user group (one of the GFT 'primitives'), deciding on its membership, specifying permissions for users within user groups, and determining relationships between user groups and the sequence of processing of Administration Form 101 from one group to another. Since all users (more than a thousand) had to be associated with groups (nodes, in the computer equivalent), it was necessary that the three 'families' agree on the conditions of group formation. Because it was essential that the definition of user groups mirror the internal structure of existing groups, the families had to agree on when the groups would interact. Privileges had to be assigned to each group to determine which forms each group could get access to and who could make changes to forms. Since each form followed a distribution path through the network of nodes, the sequence of distribution had to be agreed upon by the users' representatives. Because these specifications had not been taken into account during the designing of the system, the system's developers now had to rectify the omission by 'teasing out' the appropriate knowledge from the users' representatives. By the fall of 1990, the interaction involving software designers and users' representatives was well under way. The development team sought a way to standardize administrative practices, as a preliminary to constructing a form that would support them all, across the organization. This was a 'non-trivial' endeavour. There were, for instance, over seventy fields on Form 101 alone, and each had to be discussed with the users' representatives, who then had to return to their respective families for feedback. Once obtained, this feedback had to be rediscussed and negotiated within the committee in order that there be a common understanding. During the winter of 1990-91, the configuration of the forms software was already falling behind schedule. Before the administrative processes could be defined for an automated environment, users' representatives had to establish ways of working together as a group and of representing their respective communities. Part of the problem was the dynamics of the client group itself. No one at Clyco, it seemed, was eager to take on the responsibility
for 'profiling,' or 'configuring,' Administration Form 101 - designing the template, creating user groups (or nodes), and designing the distribution path. Warning signals were starting to be heard from the software development group, which sought to mobilize the energies of management within the client organization. Obviously, before they could configure forms, the client representatives would have to agree on what the administrative processes were, but since they had no history of working together, this was not easy. Their own peculiarities made the task even more difficult. Traffic, Transmission, and Network Maintenance were distinct in their structure and culture, and each had special needs. Traffic was responsible for monitoring network flow, to ensure that facilities were not overloaded, through rerouting if necessary. This group had had little exposure to computers. Transmission assigned services and facilities. Here there was an interesting split. Clyco had continent-wide operations and had traditionally been divided into a Northern and a Southern operation. In the Northern region, the task of assignment was distributed among clerks otherwise responsible for circuit design. In the Southern region, circuit design was a distinct role, associated with a single user group. Only a small part of this job was automated. The two regions differed also in their approach to conversion: in the Northern region, it was proposed to convert paper records to electronic form during light work loads; in the Southern region, it was proposed to hire people on contract to do the conversion. Network Maintenance was responsible for the physical network - a group staffed with engineers, thoroughly versed in the use of computers. All these differences blocked progress, because they implied variable use of information, depending on location. If jobs are specialized in one region and generalized in another (as in Transmission), then information may be used either for many specific tasks by several people or by one person doing several tasks. Permissions are different, as is routing. Construction of a distribution path implies a decision as to who sends the form to whom and what the receiver will do with it. Automation seemed to be changing relations between Traffic and Transmission: it raised issues of when forms could be sent from one department to another, under what conditions forms could be modified, when they could be distributed, and so on. Under normal circumstances, Traffic would issue Form 101, but
then cases were identified where Transmission would issue the form. Whose responsibility was it, and when? Other interdepartmental relations were up for grabs - for example, work orders: if Transmission decided, then the people in Network Maintenance were dependent on Transmission, something they found hard to accept, since they were 'a hard-tech bunch of guys' (the observer's characterization). Interdepartmental suspicion was becoming evident. There was a dispute, for example, as to who would enter the data currently held in written form into the new automated database: Traffic and Transmission each argued that the other should logically do the work. Network Maintenance was often conspicuous by its absence from discussions. By January 1991, it was clear that the three families lacked the experience or the will to discuss business processes together productively. In the background lurked another consideration: would implementation of the Administration software cause structural changes, such as the eventual merging of Traffic and Transmission? Nobody knew, but everyone did know that senior management was deeply involved in the project. Even within the same methods department there were tensions. In Traffic, Jeff B., on a promotion track, staunchly defended the Administration software to his management. Mary R. was keeping in touch with people in line positions, talking with them, getting their approval on features of the eventual software - written approval from line group committees. Mary R. always deferred to her superior in discussions, but her knowledge qualified her as a very special person, and her occasional irritation showed itself in subtle manifestations of discontent with the way in which things were going - especially when her boss was pleading for a 'team' approach. Patterns of communication within the liaison committee began to change. Initially, the software people preferred to work on the development of the forms software in relative isolation. As deadlines began to be missed, communication between the software people and the users' representatives became more intense. Conversations, both between individuals and in committee, became more frequent. Finally, a solution was agreed on. Mary R. was asked to configure the form, including the listing of user groups, assisted by Joanne P., the second representative from Transmission. When the profile had been completed, it would be returned to the users' representatives from the other families to collect their reactions. Later that year, it was decided that Network Maintenance would not be part of the initial
implementation, because of the cost of installing terminals. To all intents and purposes, it withdrew from the committee. From this point on, work went rather more smoothly. Yet there were still glitches. Totally new user groups seemed to emerge out of nowhere, performing functions that had not even been mentioned in earlier documents - groups such as Traffic Provisioning and the Order Control Bureau (OCB). OCB, it turned out, was the 'policeman of management reports': it tracked Form 101 through the processing cycle and ensured that the processing of each form conformed to predefined rules as to how long each stage of processing could take. Its inclusion in the conversation led to a new issue - who should be able to get access to the dates on a form. OCB argued that dates were its unique prerogative; Transmission claimed that it needed to be able to generate dates to determine how long it had to work on an order. It appeared that the meaning of a simple piece of information on a form is situation-specific, depending on the context of the work that a group does. The debate was confused even further when it appeared that management was considering phasing out OCB by transferring its functions to Network Maintenance. By late summer, enough progress had been realized to prepare for a late-autumn field trial. After seemingly innumerable modifications to the forms profile to accommodate the multiple specificities of a complex organization in practice, the designers had finally come to accept the idea that rational standardization of procedures across the organization was an unrealizable goal. The concept of nodes, on which the entire software was predicated, had evolved from standing for a precise working group in some location to standing for functions, such as 'initiator,' that might be performed one way in one location and quite differently in another. Job responsibilities had been redefined as collections of tasks, which could vary from situation to situation. An 'initiator' could be from any of the families, or divisions, of Clyco. The process could be activated for many different reasons. The members of a user group designated 'initiator' need not correspond to the members of currently existing groups. This had proved to be the only way in which the current administration could be mapped onto the software to the satisfaction of all.

Conclusion

The two cases that we have been describing are the two sides of the automation coin. In the first, we observed something of what happened
when a new system of computer-assisted communication was inserted into an established context of work, with a well-defined culture and conventions of authority. The organization had continued to function, in its way, through conversation whose rules and modes were understood by all. Automation arrived in that universe like an alien presence. Its interpreters - the new civilian dispatchers - were the target of the opprobrium, but the real cause of disturbance was the system itself. The system imposed on the people who shared the conversation a regime of interaction which to them seemed artificial and lacking in the kind of elementary understanding that comes from regular involvement in an activity that one has mastered. On theoretical grounds, having to do with what upper management thought was the proper distribution of authority, it imposed a mode of command which, by including the district offices, sowed confusion. In that case study, the reality of the working life of an organization is very present, while the spectre of computerization looms behind, in shadow - a kind of indistinct but menacing presence, whose logic remains not fully comprehensible to the people whose lives it is restructuring. In the second case study, it is the organizational conversation that has become blurred and indistinct. The computer is revealed as a textual representation of organization, whose discrete symbols and syntax outline a system of communication that is denuded of conversational properties and has become a network of linked functions. The purpose is to transform information, which is itself a product of a priori categorization designed to reduce the natural variety of human interaction, into terms corresponding to the organization's own capabilities. Links with the actual context of work are indirect, mediated by a corps of specialists committed to the discovery and implementation of norms and standards. The author of the case follows the progressive steps by which the designers reluctantly, but inevitably, were brought to admit the limits of their own methods in capturing the native understandings and local meanings which people at work develop just to do their job properly. Our intention in describing these two cases has not been to blame those involved for what occurred, or to belittle honest efforts - to increase productivity and to develop a sophisticated software program sensitive to administrative procedures. Both instances have illustrated the tension, or discrepancy, between the ideas of technology and of organization that office automation brings to the fore. It is a subject at which we look more deeply in the next chapter.
3
Beyond the machine metaphor
Images or metaphors only create partial ways of seeing. For in encouraging us to see and understand the world from one perspective they discourage us from seeing it from others.
Gareth Morgan (1986), Images of Organization, 34

First We Reshape Our Computers, Then Our Computers Reshape Us.
James Bailey (1992), Daedalus, Winter, 67-86
Introduction

This chapter, like the last, is about office automation. While in chapter 2 we focused on the 'practice' of computer-supported information work, we are now going to consider it from a theoretical perspective. There, we were preoccupied with the what? and how? of automation; here, with the why? We are intrigued with office automation less because of the organizational miracles that it promised - and frequently failed to deliver - than because of its not-so-obvious ideological commitments. Office automation was, and still remains, more than a package of new office communication and information-processing hardware and software; it also incorporates, as we illustrated in the previous chapter, an image of organization. The reasons for its incorporation of a particular image of organization have to do with the evolution of computing. When computers
first came into use in administrative contexts, they were limited to quite specific functions, such as accounting and payrolls. Later, as the systems became more flexible and powerful, they found new uses in on-line, real-time support of transactions such as banking and airline ticketing. New machines appeared for word processing and spreadsheeting during the 1970s. The success of these diverse ventures into implementation of computing machinery led designers to imagine a much more ambitious goal: the total integration of all administrative functions into one seamless network, with everyone linked to everyone else, and all to banks of central servers, through a dense mat of cabling called a local area network (or LAN). In their imagination, the designers conceived that the people who worked in management jobs had become 'knowledge workers' and that the work they did had become 'information processing.' Names for this supposed 'office of the future' had to be found: it was the 'Integrated Electronic Office' (IEO), or an 'Integrated Office System' (IOS), or, more commonly, Office Automation (OA). The use of the term 'automation' was already a clue to the designers' vision of an office - just a big information-processing machine, a factory floor in new guise. Behind the hyperbole, however, there had emerged an academic discipline - management information systems (or MIS) - linking specialists in the computerization of work, statistical analysis, and organizational design and analysis. It was a field that had frequently been accused of under-achievement; although its prescriptions seemed plausible on paper, it was unable for some time to point to many definite or concrete successes in real life. Office automation promised a breach point for MIS and a means to persuade managers to adopt the rationalist philosophy and practices that it had worked out in theory. We became personally involved in the implementational dynamics of office automation when the senior author of this book participated in a set of field studies in five departments of the Canadian federal government. Monitoring the results of implementation of the new systems, we were unprepared for what emerged, little by little, in our research on the systems' impact: while most users were mildly enthusiastic about the new machines, the effect on their working life was negligible, and sometimes literally unmeasurable. We found this result so shocking (our country was by then spending upwards of $3.5 billion a year on this equipment) that we set out to discover whether the 'non-findings' were merely an anomalous exception.
Our further inquiries, as we have noted in chapter 1, suggested that they were not.1 We discovered that researchers everywhere were uncovering similar outcomes: not that there were no organizational results from office automation, but that it was hard to fit them into any definite pattern. Where productivity had increased, it was usually attributable to non-technological factors fully as much as to the system itself. At the beginning, when the shortfall began to be evident, there was a temptation to lay the blame for the problems on the users: negative attitudes, managerial conservatism, lack of adequate training, and the like. With time, these explanations became less and less convincing. It was becoming clearer and clearer that the problem was not the technology, or the users, but the fact that the technology was a misfit.2 Designers were mistaken about what people do in offices (J.R. Taylor 1986a), and, as a result, they had manufactured a tool for which there was actually no use. It was the idea of organization that the technology incorporated that was wrong. The idea behind office automation was not as uncommon as all that; we could have found, only a few steps from our desks, a dozen or so texts on organization used in business schools at one time or another with similar views on organization. Perhaps, therefore, the failure of office automation to deliver was an opportunity to learn something about our own implicit views of how communication works in organizations and about the role of information-processing in the communication process. After all, if there has been one common characteristic of the literature on organizations over the years, it has been its neglect of the communicational dimension. That perception was the genesis of this chapter, which is therefore only peripherally about office automation. Our major preoccupation is, as we have explained, the why? of automation - the implicit theory of communication that OA incorporates and that has shaped more than one branch of the literature on organizations. First, we consider what possible theory of organization might explain the structure of computer programming. This is a way of exploring the contemporary notion of what a machine is. Second, we show what happens when such a theory is transposed to a human organization. Examples from recent literature suggest that it is highly implausible that concepts transferred from the field of machines will actually make any sense in ordinary human contexts of work. Third, we re-examine the premisses on which the bureaucratic model was based - its assumptions
about communication, in effect. Finally, we conclude with some remarks on the epistemological dilemma that confronts designers of communication systems who are attempting to model organizations.

The role of metaphor in organizational theorizing

Writing in management theory, Gareth Morgan (1986) has suggested that the only way in which an organization can be conceptualized is by the interposition of an image, rooted in a metaphor. Metaphor - understanding and experiencing one kind of thing in terms of another3 - is more than a device for embellishing discourse: 'Its use implies a way of thinking and a way of seeing that pervade how we understand our world generally' (Morgan 1986: 12). His work is a compendium of the better-known metaphorically based images that people have employed to think about organizations and of how those metaphors have evolved.4 Once a certain metaphor takes hold and comes to be widely accepted in a society, he observes, it thereafter tends to be very much taken for granted. Its metaphorical origin disappears from consciousness. Societal organization has been expressed, for example, in terms of a biological organism, with the equivalent of head, heart, and brain (and, of course, such humbler members as the hands). Since the early twentieth century, there has also been a marked preference for the metaphor of the machine, in its multiple forms (including the organism and the brain). But, as Morgan points out, the choice of an image by which to conceive of something that we can experience only indirectly is ultimately arbitrary, however much we chase the fact of our own imaging from our consciousness: nothing in nature tells us what an organization really is. Our choice is part of our interpretation, made to fit with our cognitive predilections, not part of the data. Furthermore, while metaphors reveal aspects of reality by highlighting connections that we might not otherwise perceive, they are, by definition, furnishing us with an incomplete picture. Metaphors do not so much describe, according to Morgan, as allow us to see the commonalities of pattern linking an object that we already know and understand with one that we comprehend less or, as in the case of organizations, are unable to apprehend directly in any way whatsoever. In organizational communication, researchers, like management theorists, have often employed the metaphor of a machine for conceptualizing the organization. Communication, in this perspective,
has been seen as a 'function' - an instrument permitting coordination and linking of its working parts (or, when applied to the organization, the people who work there). Lately, however, some scholars have become more and more uncomfortable with the strictures imposed by functionalist explanations and models of communication (J.R. Taylor and S. Gurd 1993) and have begun searching for alternative frames of understanding. By so doing, they have stopped asking 'What are the communication implications of machine-ness?' and have begun to ask instead, 'What new metaphor or image of organization would best fit with what we now know about how communication actually works?' Taking the metaphor of office automation at its word, we shall look first at some of the implications of linking computer rationality with the organization and practice of management. We follow this with references from some new approaches to organizational management which suggest that organizational properties are too complex and diffuse to be modelled by a rational machine. We conclude that, by operationalizing rational management theory, office automation, instead of strengthening management's control, has actually undermined some of the principles that permitted bureaucratic organizations to function.

What is a machine?

The theory of computation: the 'universal' machine

The ideas that made the new informational technologies possible coalesced between about 1935 and 1955. The year 1936 saw the publication of a paper by a brilliant but eccentric (and ultimately tragic) English mathematician named Alan Turing (Hodges 1983; Bolter 1984; Penrose 1989). From his work, and from that of others of his generation,5 emerged a conceptualization so daring and original that it initiated a new era. It marked the launching of the information age,6 although no one necessarily thought of it as such at the time. On the surface, Turing's goal had nothing to do with information, or communication, or office automation. It was unrelated to the nuts and bolts of practical 'number-crunching' and grew instead out of something more esoteric: the theory of computation. His initial aim was to solve a problem in a highly abstract branch of mathematics,
called number theory. The problem involves arriving at coherent general statements ('proofs') about a system - ordinary numbers - that is infinite in extent and therefore can never be inspected in its totality, not even in principle. The fact of infinity generates puzzles - for example, is there a largest prime number, and how would we recognize it if there were? One way of dealing with such problems is to develop a logical procedure capable of deciding, for any given particular example, whether a certain property holds; this procedure can then be applied over and over until we come to an answer for the general case. The kind of procedure involved is one that we would use spontaneously ourselves should someone ask us whether, to take an example, a certain number, let us say '13,' is a prime. We check to see if 13 is a product of any number smaller than 13 other than 1 (can it be divided by 2? by 3? by 5? by 7? etc.), and if we cannot find one, we say that the result is positive: '13' is in fact a prime. A procedure of this kind, involving repeated application of a simple test, is called recursive, and the method by which answers are recursively generated on the route to a solution is known as an algorithm. Within limits, important mathematical results can thus be established, in the context of an inductive logic.
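The test for '13' can be written out as a few lines of code - a sketch of the recursive procedure just described, not anything of Turing's:

```python
def is_prime(n: int) -> bool:
    """Repeatedly apply one simple test: is n divisible by any
    smaller number other than 1? (The procedure described above.)"""
    if n < 2:
        return False
    divisor = 2
    while divisor < n:        # recursive application of the same test
        if n % divisor == 0:  # n is a product of a smaller number
            return False
        divisor += 1
    return True               # no divisor found: n is a prime

print(is_prime(13))  # True
```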
What Turing added was to conceptualize the recursively applied procedure as the work of a machine, which he described in minute conceptual detail (although he may or may not have intended it to be actually built)7 and which has come to be known as a Turing machine. Turing's imaginary machine was in fact the first true (if sketchy, on the practical side) blueprint for a modern general-purpose digital computer, whose principle also involves breaking complex operations down into simple steps. It had all the minimally essential components: a memory made up of (addressable) storage locations, an input-output device for reading and transcribing data, and a logic device that could perform simple operations step by step on the data, following a prescribed procedure that would now be called a 'routine.' Furthermore, Turing described a machine that could be made to display different types of behaviour: it was programmable. In his conception, every imaginable real machine incorporates a simple basic program in its form of construction. A Universal Machine is a Turing machine of a special kind: it takes for its input the description of another (any other) simple machine and can then exhibit the behaviour of that simpler machine. It is 'universal' in that, like a computer, it can be programmed to perform the steps that any other machine can perform. All we need to do to transform a desktop PC from a calculating machine into a writing machine, as we are all now aware a half-century later, is to substitute one disc for another. The conceptual transposition accomplished since Turing's time is so complete that we give a name no longer to the machine but rather to its code, called software. We buy 'word processing' and 'spreadsheet' products such as Lotus 1-2-3, Excel, Word, WordPerfect, Cricket, and Ventura, not separate machines such as calculators, typewriters, and graphics kits. Our idea of a machine has undergone a subtle but very important change - a change that Turing introduced. It resulted in a new science, cybernetics, within whose boundaries the study of machines and the study of human organizations were to become treated as separate, but related, sub-fields. With Turing, machines stopped being purely physical artefacts, such as you can find in a museum, and became symbolically defined entities as well. The implications of Turing's conceptualization were not lost on the intellectual community of the time. People such as the biologist and cyberneticist R. Ashby saw that Turing's notion of a machine was so abstract, and so general, that it provided not just a design tool for the engineer to develop new computing machines but also a key to understanding phenomena that would never before have been classified with machines, including brains, people, economies, and administration - anything that exemplified a regular pattern of behaviour. A computer was on its way to becoming more than a useful machine: it was soon to be given the status of root metaphor for, among other things, human organization. While people had used the metaphor of the machine before, they had done so in the absence of a precise theory, which Turing's work had finally attained. It is one thing to compare an organization to a loosely defined concept, and another to compare it to a logic machine.
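How little machinery the conception requires can be suggested in code: a tape, a read-write head, and a table of simple steps. The toy below - our illustration, not Turing's notation - increments a binary number:

```python
# A toy Turing machine: a tape, a head, a state, and a rule table.
def run(tape, rules, state="start"):
    """rules maps (state, symbol) to (new state, new symbol, move)."""
    head = len(tape) - 1                  # begin at the rightmost cell
    while state != "halt":
        state, tape[head], move = rules[(state, tape[head])]
        head += {"L": -1, "R": 1, "N": 0}[move]
    return tape

# Binary increment: flip trailing 1s to 0, then turn the first 0 into 1.
rules = {
    ("start", "1"): ("start", "0", "L"),  # carry the 1 leftward
    ("start", "0"): ("halt",  "1", "N"),  # absorb the carry and stop
}
print(run(list("0011"), rules))  # ['0', '1', '0', '0'], i.e. 3 + 1 = 4
```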
In 1938, in his master's thesis at MIT, the mathematician Claude Shannon demonstrated that it is possible to construct networks composed of simple electric relays that are formally identical (isomorphic) to both logical (Boolean true/false) and arithmetic functions (provided that the latter use the binary zero/one system of numbers). At the heart of a Turing machine is a device, which can be called an automaton, that expresses the program and is thus the 'intelligence' of the machine. What Shannon showed was that for every abstract automaton there is an equivalent physically realizable network, and vice versa. From this discovery to actually constructing a general-purpose electronic computer was just one (conceptual) step - however difficult in practice. Shannon's work introduced an important principle. A computer, at the level of its hardware, is nothing more than a special kind of communication network. Signals are transmitted through it, and switched, or 'gated,' just as for any other network. What transforms a network into a computer is a question not just of how the network is constructed but also of how the result is interpreted. In most networks, sending messages is merely a means to an end - to allow people to communicate with each other. The design of the network is meant to further the efficient conveyance of message traffic, to provide the best service possible to the greatest number of customers. Computing reverses the logic of this whole process: individual messages have no utility in and of themselves; it is the configuration of the network that is the message. The hardware configuration reflects the software meaning, and vice versa. Each pattern of 'switches on/switches off' can be made to stand for a meaningful symbol. Shannon showed, as well, that those configurations, or network patterns, could be read as representing numbers, operations on numbers, such as adding and subtracting, or abstract logical operations, such as 'if ... then ...' and 'either ... or ...' Today's computers have exactly those components: they take in data, and they perform operations on the data according to the instructions of a program composed of logical operators.8 Turing had understood that a program and the data on which it operates can employ an identical language and an identical medium, but he had conceptualized that medium only as a 'tape.' Shannon showed that an electronic network could take the place of the 'tape,' and he thus made the link from pure mathematics to practical engineering a possibility. Since it was already known from Turing's work that an imaginary logic machine exists for every real machine, and since a network machine can be built to represent any logic machine, it follows that a computer is a machine that can take on the personality of any other machine. It is a machine in the abstract, not just another real machine. If an organization is also a network, a computer network can serve equally well as a model of organization.9 Communication is thus no longer to be thought of as just what goes on in an organization - a mere 'function'; instead, we have to ask whether an organization is anything but a system of communication.
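Shannon's isomorphism can be made vivid in a few lines. Treat a relay as a switch carrying a 0 or a 1: series wiring behaves as AND, parallel wiring as OR, a normally closed contact as NOT - and one column of binary addition falls out of pure logic. This is a schematic illustration of the principle, not Shannon's own circuits:

```python
# Relays as switches: series = AND, parallel = OR, normally closed = NOT.
def AND(a, b): return a & b       # two relays wired in series
def OR(a, b):  return a | b       # two relays wired in parallel
def NOT(a):    return 1 - a       # a normally closed contact
def XOR(a, b): return OR(AND(a, NOT(b)), AND(NOT(a), b))

def half_adder(a, b):
    """One column of binary arithmetic, built entirely from relay logic."""
    return XOR(a, b), AND(a, b)   # (sum bit, carry bit)

print(half_adder(1, 1))  # (0, 1): 1 + 1 = binary 10
```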
The actual building of a true general-purpose computer took years to accomplish; the engineering of a piece of hardware as complex as the ENIAC (completed in 1945), with more than 17,000 tubes, 70,000 resistors, 10,000 capacitors, and 6,000 hand switches, was in itself a considerable achievement (Goldstine 1972). But by the time Claude Shannon published his definitive mathematical treatment of the theory of communication in 1948, the information age had been born.10 The word 'information' had entered everyday language in a completely new guise - messages that could be generated, and comprehended, by a computer.

The computer as an organization

Let us now 'put teeth' into the machine metaphor in a slightly different way. If we examine the distinctive features of the modern computer (in essence, a Turing machine) in terms of what they could suggest about its organizational structure, there are three important properties that are relevant to properties of organization: (1) the computer is rational, (2) it is 'single stream,' and (3) it separates communication from meta-communication (i.e. it has an author). We shall look at each in turn.

The computer is rational
By 'rational,' we mean that the design of a computer (Turing machine) assumes a source of data, the product of forces extrinsic to the machine itself, on which the central processing unit of the computer 'operates,' in automaton fashion, to produce an 'output' - a state of affairs that can be read off from the outside. The computer is responsible for its output, but not its input. The equivalent assumption, for an organization, is that the latter exists in an environment, whose values can be ascertained by anyone who looks closely, on the basis of which the organization makes choices (decisions) that are in some sense optimal and that represent for people outside the organization its 'output.' For 'rationality' to be present, certain conditions must be met. (1) There must be an unambiguously identified source of information (in Turing's case, the 'tape'; more generally, for computers, the data). (2) The source of the data, from the computer's perspective, is unique (it is to be found in identifiable 'stores'; for Turing there was just one
tape), and its values will be specified unambiguously. (3) The procedures by which 'input' is transformed into 'output' are what are called 'algorithms'; i.e. we can think of them as explicit instructions that operate on given values in a domain in order to produce a single-valued output (what is known, in mathematics, as a 'mapping'). When we translate computer rationality into organizational language, we have an image of a decision-maker (individual or group) examining what is known about environmental circumstances - 'processing information' - and using explicit (potentially formalizable) logical procedures in order to determine what response would best further the interests of the organization. This image can be made more realistic by assuming that the environment is always characterized by a degree of uncertainty, that all knowledge is basically probabilistic, and therefore that the decision-making process is statistical ('best bet,' on the basis of incomplete knowledge), rather than deterministic. Some modern statistical techniques, such as the Bayesian, not only incorporate into their descriptions of the environment an assumption of uncertainty and imperfect knowledge but also take into account the subjective preferences of the decider; in this way, they attempt to capture with considerable verisimilitude some quite complex choice processes. Contemporary techniques in artificial intelligence go even further in the direction of realism by attempting to formalize rules of thumb, hunches, and inspired guesses through complex (but fully explicated) systems of logic, or reasoning. Management Information Systems exemplify the assumption of rational decision-making. None of this incorporation of provisions to deal with environmental uncertainty, or executive subjectivity, contradicts our assumptions. We assume a unique, specifiable environment, determined by forces outside the control of the computer (or the organization), on the basis of which the processing unit (decision-maker) operates to influence outcomes following a determinable procedure which is 'rational.' We take as given that what is 'optimal' is determinable by the application of mathematical procedures, provided that individual knowledge can be coded into explicit terms, susceptible to formal reasoning.
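The 'best bet' version of this rationality is easy to sketch as a mapping from probabilistic beliefs about the environment to a single-valued response. The states, actions, and numbers below are invented for illustration:

```python
# Rational choice as a mapping: beliefs about the environment in,
# one 'optimal' response out. All values here are hypothetical.
states = {"demand rises": 0.6, "demand falls": 0.4}   # 'best bet' beliefs
payoff = {
    ("expand", "demand rises"): 10, ("expand", "demand falls"): -5,
    ("hold",   "demand rises"):  2, ("hold",   "demand falls"):  1,
}

def decide(states, payoff):
    """Inspect every state-response link and follow the probabilistic
    calculus to its single-valued output."""
    actions = {a for a, _ in payoff}
    expected = {a: sum(p * payoff[(a, s)] for s, p in states.items())
                for a in actions}
    return max(expected, key=expected.get)

print(decide(states, payoff))  # 'expand' (expected value 4.0 against 1.6)
```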
The computer is 'single stream'

One of Von Neumann's principles (though now being superseded by the so-called 'fifth generation' of computers, which are introducing
parallel processing and neural networks) is that computer operations are sequential or serial in character - only one instruction is processed at a time. This property of standard computing is not always obvious from inspection of a computer program, since, like a human organization, a program is by nature hierarchical, being built up out of an embedded set of modular symbolic entities called 'routines' and 'subroutines.' The construction of a computer program recognizes that complex systems can be developed out of simpler elements by seeing the latter as interrelated functions in a multi-level universe wherein very simple operations fuse together to produce more complicated operations, which in turn make up even more complicated structures. This is the principle on which language works: words fit together functionally to make sentences, sentences to make paragraphs, paragraphs to make chapters, chapters to make a book, and books to make an oeuvre.11 Like language, these complicated logical structures are processed (or have been, in the first four of the computer's so-called 'generations') one character at a time, when the program is run on an actual computer. This (a hierarchy of embedded, graded functions) is also how we have tended to think of organization.12 Single-stream processing, however, takes the metaphor one step further. The organizational equivalent of single-stream processing would, strictly speaking, require every operation, though delegated initially to someone for specialized treatment, to end up on the desk of a single executive before it could be processed. Every transaction in every bank in Canada, for example, would have to be approved by the president before it could be finalized, every supermarket counter sale to be initialed by the CEO. While this stretches our sense of what is credible for real-life organizations, the close-to-light speed at which computers operate makes it feasible for them to sustain single-stream processing. The advantage of this, for maintaining unity of command in the running of a computer program (or organization), is evident. For our purposes, the organizational equivalent of single-stream processing is the assumption that organizations are animated by a common sense of purpose, that they are directed by a single executive head, and that all lower-level functions fit into a strict, logically coherent hierarchy of command.13 Only in single-stream systems does it make sense to speak of the organization's 'mission,' or 'goal,' as if the latter formed a unique, identifiable set.
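The caricature in the banking example can be rendered directly: however deeply work is delegated to subroutines, one serial thread processes one instruction at a time, and everything returns to the top. (Our illustration, with invented names:)

```python
# Single-stream processing as organization: every transaction, however
# delegated, ends up 'on the desk' of the one top-level routine.
def approve(transaction):                  # the 'chief executive' routine
    return validate(transaction) and record(transaction)

def validate(transaction):                 # a delegated subroutine
    return transaction["amount"] > 0

def record(transaction):                   # another delegated subroutine
    print("recorded:", transaction["amount"])
    return True

# One instruction at a time, in strict sequence - unity of command.
for t in [{"amount": 25}, {"amount": 110}]:
    approve(t)
```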
Computing discriminates between communication and meta-communication: it is an 'authored' system
The distinction between 'communication' and 'meta-communication' is roughly the same as that between doing and deciding what to do. In a world where tasks are well defined and the environment is repetitive in character, communication tends to get pared down to the essentials: simple exchanges of information, elementary gestures of coordination and control, to keep everything synchronized. In the universe of computing, the messages that make a computer program work are of this kind: moving data from one store to another, transferring solutions, calling in sub-routines, keeping operations in their proper order. None of this communication reflects back on itself: the computer does not have to develop its own program as it goes along, which would involve it not only in carrying out operations but simultaneously in passing judgment on how it did so. The actual construction of the program, which is where meta-communication begins, is extrinsic to the computer itself; all the computer knows about that kind of thoughtful reconstruction of its behaviour is what gets communicated to it in the form of coded instructions, fed into its stores ('initialized') at the beginning of a new run.14 From the perspective of the organization, this is equivalent to assuming that the executive at the top makes all the key decisions and everyone else is there to operationalize them.15 So much for our organizational model of how a computer works. Is there a theory of organization that corresponds to this model? The answer is yes. It coincides, unsurprisingly, with the traditional model of how to manage an organization (and, incidentally, with the information-science concept of a Management Information System).

The traditional rational model of organizational management and of office automation

Consider the following description (Scott 1981: 77-78), cited in Weick (1982: 375-6): In the rational system perspective, structural arrangements within organizations are conceived as tools deliberately designed for the efficient realization of ends ... Rationality resides in the structure itself, not in the individual participants - in rules that assure participants will behave in
The Vulnerable Fortress
84
ways calculated to achieve desired objectives, in control arrangements that evaluate performance and detect deviance, in reward systems that motivate participants to carry out prescribed tasks, and in the set of criteria by which participants are selected, replaced, or promoted ... We have noted the great emphasis placed in the rational system perspective on control - the determination of the behaviour of one subset of participants by the other. Decision making tends to be centralized, and most participants are excluded from discretion or from exercising control over their own behaviour. Most rational system theorists justify these arrangements as being in the service of rationality: control is the means of channeling and coordinating behaviour so as to achieve specified goals.16
That 'rationality resides in the structure itself, not in the individual participants' is presented as a statement about human organization. But it is also, as we have seen, the essence of computer organizational logic: that in the end there is a single output (which is the organization's), that the components of the program (organization) are extensions of a single (if multi-faceted) intelligence, and that communication is about execution of a central, single-focus program, guided by the 'goals' of the enterprise (informatic or otherwise), whose meaning is established by a relationship with an environment which can be thought of as a game (Von Neumann and Morgenstern 1944; 1964) or as a decision (Wald 1950). The states that the environment can take, and the possible responses to them, are assumed to be known, and the calculation of a response (taking a decision) is a matter of inspecting the set of possible links between environmental state and organizational response, following the dictates of a probabilistic calculus.

To the extent that we give credit to the rationalistic model, systems of technology masquerading under the banner of office automation can make a priori sense, in two rather different respects. First, at the level of metaphor, the computer analogy is a way to visualize the organizational experience: it provides a guide as to how organizations ought to behave in an ideal world. Straying from the ideal is what Scott (1981) calls 'deviance' - something to be attributed to poor 'motivation,' or the way in which participants were 'selected, replaced or promoted.' Systems such as this, as much as they could ever be truly rational, would have no conceptual problems of coherence, since they would be a reflection of a single intelligence, controlled by a single executive program within which all communication was instrumental in purpose.
Second, at the level of practice, the day-to-day role of the actual communication and computing machines in this universe, people included, should be unambiguous: monitoring the states of the environment, storing and transferring data, making simple decisions (or assisting people to make decisions) at predetermined choice points in the program, tracking behaviour and issuing commands, scheduling, updating, and informing - exactly the elements described in the second case study in chapter 2. Welcome to the ideal image of a pure bureaucracy - Max Weber incarnate!

Office automation and an image of organization

So far, we have been postulating, on the basis of abstract properties, a metaphorical relationship between a conception of computing and of office operations that makes the computer, considered as an organizational form, an analogue of actual human organizations. But is this a link that has actually been made? Does the literature on organizational management support our assumption about there being a pattern of similarity between computer and organization, particularly where office automation is concerned?

Computer promoters don't explicitly describe the organization. For the most part, it remains no more than a buried presupposition in the reports of organizational planners and consultants responsible for the promotion of the new technology. Katambwe (1987) examined the office-automation literature devoted to assessing and measuring users' needs. All the examples at which he looked concentrated on the functional role of document-handling and communication; none went beyond the assumptions of the rational model of organization, which assumes that everyone is contributing to an overall coherent, goal-driven, organizational purpose. Typically the office-automation literature, while the language may have varied from one study to another (with words such as 'task' and 'function' being defined more or less idiosyncratically by the author), conceptualized the work of the office as a throughput system, with the input consisting of assembling a preliminary database, the processing taking the form of preparing documents, and the output being seen as the transmitting and storing of the result. Organizational tasks are seen to be well defined: planning, consulting, data collection, disseminating, filing. The technology intervenes at the level of functions such as typing, printing, mailing, and photocopying - the media-related accompaniments to fulfilment of an organizational task.17

Similarly, while Katambwe found that adherence to a classical model of organizational rationality might be adjusted to make provision for assumptions of organizational 'adaptedness,' every case assumed a well-defined environment, a single-stream organizational purpose (everything contributes to a coherent set of larger, corporately defined objectives), and rational instrumentality of communicational (document-related or oral) behaviour. Katambwe concludes: 'In general, the needs analysis methodologies in office automation follow the functionalist paradigm. The two most common metaphors are those of the machine and the organism. Buttressed by these two powerful metaphors, the methodologies of needs analysis incorporate explicitly or implicitly assumptions of closed and open systems. Where a closed system is postulated, work in an office is seen to be determined by the missions and performance objectives set by management; where an open system is postulated, the work is seen to be the fruit of the adaptation of structures to the environment of the organization' (1987: 120, translated by the authors).

The approaches examined by Katambwe, though differing in emphasis, all assume the existence of organizational missions and goals 'independent of all individual mediation on the part of the actors themselves' (these larger purposes are the 'effective cause' of what people do in a day's work). These approaches also assume the 'existence of a relation between the operations (tasks and procedures) and the objectives of the organization' (20). Those who carry out the tasks or execute the procedures are not involved in the determination of goals and objectives. The methods of needs analysis studied also make an implicit assumption of organizational stability: they see the organization as cohesive, cooperative, and integrative; they make no provision for hostilities, conflicts, or less-than-ideal integration.

This image of organization, which is an unexamined presupposition pervading the office-automation literature, reads just like the pictured structure of a computer program, whose components ('routines' and 'sub-routines') also have no say in determining the overall objective of the program that they are instrumental in carrying out. Nor do workers decide how they will conduct their own duties; like the parts of a computer program, office workers in the automation literature remain forever cooperative, cohesive, and well integrated, as if they too were the product of a programmer-god who created them but who failed to breathe into them a life of their own.
There is only one problem with this sort of rationalist model for organization and office-automation technology: it would be hard, in today's environment, to find a sophisticated analyst of organizational processes who any longer believes without reservation in its validity! People in organizations do follow routines, but those routines do not provide an all-encompassing definition of what they are. People contribute to an organizational program of sorts, but they were not created by it nor are they forever constrained by it.

New approaches to the theory of organizational management

There are ways to imagine an organization other than simply as an extension of the three properties that define a computer (i.e., rational, single-stream, and separating communication from meta-communication). Recent literature on management and communication starts from the alternative premiss that rationality is always going to be partial and that human behaviour is too complex and the thought processes of people are too fuzzy to be easily modelled by a Turing machine.

Let us go back once again to the organizational equivalent of Turing's tape to remind ourselves of its characteristics. As Turing imagined his abstract machine, there was a tape of infinite length, divided into squares, on each of which was recorded a symbol (possibly a blank). The machine worked in three steps by (1) reading the value in a square, (2) consulting an instruction as to what to do for such a value, given the existing situation, and (3) writing down a new value on the tape. By analogy, we could imagine the managers of an organization deciding whether or not to open a branch operation in some location. (1) They 'read off' the appropriate values on the 'tape' (number of potential customers, availability of local labour, probable economic growth for the region, tax advantages offered by local governments, and so on). (2) They decide whether or not to go ahead, following their usual policy for this kind of situation. (3) Their decision, if positive, represents construction of a new facility in the locality, which we could conceptualize as a new value recorded on the tape (in the sense that the city where they locate is already changed by their having chosen it).

The Turing conception of a 'tape' was thus a highly simplified, but faithful, metaphor for an assumption that is at the very heart of classical theories of rationality - namely that nature is out there, solid, objective, positive, waiting to be read (always assuming that we have enough capacity, or 'squares,' in Turing's terms, to record and remember its values). Thus the organization, if it asked the right questions, would obtain an unambiguous answer on which to base its decision. Behind this lies a crucial presumption that there are 'right' questions to be asked - that there really is a 'tape,' if only one knows how to get access to it.
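The three-step cycle is easy to write out. The sketch below is our own rendering of the branch-opening analogy, not anything proposed by Turing or by the sources cited; it simplifies the abstraction further by omitting the movement of the read/write head, and the 'tape' values and instruction table are invented for illustration.

    # An illustrative sketch (ours) of the three-step Turing cycle:
    # (1) read the value in a square, (2) consult an instruction for
    # that value in the current state, (3) write a new value back.

    tape = {0: "blank", 1: "customers:many", 2: "labour:scarce"}

    # instruction table: (state, value read) -> (value to write, next state)
    table = {
        ("scanning", "customers:many"): ("branch:opened", "committed"),
        ("scanning", "labour:scarce"):  ("no-action", "scanning"),
    }

    state, position = "scanning", 1
    value = tape[position]                    # step 1: read the tape
    new_value, state = table[(state, value)]  # step 2: consult the instruction
    tape[position] = new_value                # step 3: write a new value
    print(tape, state)  # the 'environment' now records the decision taken

Everything in the sketch presumes, exactly as the classical theories do, that the tape is there to be read and that the table contains an instruction for whatever is found on it.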
This is the very assumption that has been attacked, however, by theorists who view the so-called 'environment' of a real organization very differently.

The limits of rationality

Weick and Daft (1983) employ the following analogy (borrowed from the physicist Wheeler). Imagine a game of twenty questions where group members decide not to limit themselves to one collectively chosen and predetermined word but rather to answer freely 'yes' or 'no' during the game, the only constraint being that their answers fit with all the previous replies and correspond to a word that they can decide on as they go along. In this game, when the one who is 'it' eventually does arrive at a correct answer, it is as much a product of how he or she formulated questions as it is of what was in the minds of the questioned at the start of the interaction. If a different line of questioning had been followed, another answer would have emerged. When the game began, even though the questioner may not have realized it, there was no exact answer to be discovered, yet the result of playing the game, from his or her point of view, is just the same as if there had been. An 'answer' is found.

The authors are working towards a notion of organizational interpretation of environment that is very much unlike the assumptions inscribed in the totally rational model: 'If an organization assumes that the external environment is objective, that events and processes are hard, measurable, and determinant, then it will play the traditional game to discover the appropriate or correct interpretation. The key for this organization is discovery through intelligence, rational analysis, vigilance, and accurate measurement. This organization will utilize linear thinking and logic, and will seek clear-cut solutions ... If an organization assumes that the external environment is subjective, an entirely different strategy applies. The key is to construct, coerce, or enact a reasonable answer that makes previous action sensible and suggests some next steps. The interpretation shapes the environment more than the environment shapes the interpretation' (1983: 78-9).
There is one difference: the player in Weick and Daft's game can eventually be let in on the secret. In real life, it is possible for organizations to go on forever playing the 'game' against nature, without ever knowing whether there really had been an 'answer' to begin with. The assumption of 'rationality,' then, according to Weick, can be taken only to be part of the belief system of the actor, not a discoverable feature of external reality, which remains forever inscrutable.18

In drawing this contrast between kinds of organizational interpretation, Weick and Daft are not assuming that one is likely to be more effective than the other in 'reading' the real environment. On the contrary, they see the external environment as simply a 'flowing, changing, equivocal chaos,' of which the organization has somehow to make sense. As people in organizations try to 'sort this chaos into items, events, and parts that are then connected, threaded into sequence, serially ordered, and related to one another,' they reveal their system of interpretation. It is more important to have a convincing representation of events than a 'true' one. 'Rational' approaches to interpretation are no more or less accurate than non-rational ones; adherence to a certain way of treating the evidence indicates the style of interpretation considered acceptable in that particular organization. Environments are far too complex, veiled, fragmentary, and confused for simple, accurate interpretation: 'The puzzles faced by organizational members resemble those puzzles found in puns, poems, dreams, abstractions, and foreign languages. In many cases, when a person examines a display, two or more causes could have produced the result, which means that the outcome is equivocal. Thus it is plausible to portray organizations as embedded in an environment of puns' (Weick and Daft 1983: 75).

If the organization is not to be immobilized, then 'managers literally must wade into the swarm of events that constitute and surround the organization and actively try to impose some order on them. Organization participants physically act on the environment, attending to some of it, ignoring most of it, and talking to other people to see what they are doing. Structure must be imposed on the apparent randomness in order for interpretation to occur. Interpretation is the process of translating the events, of developing models for understanding, of bringing out meaning, and of assembling conceptual schemes' (74).
Although 'the interpretation process is neither simple nor well understood' (Weick and Daft 1983: 74), we can isolate some of its characteristics. It is, for example, typically a posteriori, focusing on what has already occurred ('the prototypic case in which action precedes cognition') (75), and quasi-historical ('people who select interpretations for present enactments usually see in the present what they have seen before' (75)). It constructs environments rather than discovers them ('a biased search takes place for stimuli that could have produced that interpretation ... The environment then becomes an output of organizing processes, not an input to those processes' (76)),19 and it leads to conclusions that are reasonable rather than right: 'Reasonable explanations accommodate more data than they exclude, are sufficient in the eyes of more than one person, are no worse than other explanations that can be generated, and handle some new instances that were not used to generate the original interpretation. These soft criteria ... are adequate for practical purposes and that is enough ... [They] permit sufficient elasticity in organizations that diverse interpretations can be held without this diversity endangering coordinated action' (76).

How organizations structure themselves, and how they behave, are not constrained by the exigencies of the 'environment' nearly as tightly as theorists (particularly those in the 'contingency' school) used to think. The Darwinian model is inappropriately applied to organizational adaptation. A better analogy is a rich plot of land: one person sees its possibilities as a beautiful formal garden; another, a Japanese garden; a third, flourishing rows of vegetables. They are all right. Many interpretations can work; only in retrospect does success appear the result of shrewd calculation or successful adaptation. And organizations can fail, even though they are using tried-and-true formulas for success.

If, in arriving at an interpretation of what is going on in the world around us, we 'construct' an environment rather than 'discover' it, this is like saying that a Turing machine would proceed by seeking out those features of the tape transcription that correspond to its accustomed way of performing operations, rather than by searching out the correct operation for dealing with what is inscribed on the tape. At this point, the 'rationality' of a well-defined mathematical approach becomes self-justifying, a matter of style and not of substance - formalism for formalism's sake.
It could still be argued, even if we accepted Weick and Daft's logic, that although nature neither supports nor refutes the validity of the rationalist approach, at least this approach has the advantage of being clearly explicated and sharable, compared to more intuitive ways of proceeding, when we are confronted with the necessity of making an interpretation.

The multiple-stream organization

Now let us turn to the second postulate, single-stream purpose in both computer and organization. Nils Brunsson has observed that while textbooks written for managers leave the impression that the essence of good management is good thinking (solving problems, making rational choices between alternatives), 'descriptions of the way managers actually behave show that they spend very little time on problem-solving, decision-making or making choices, and when they do undertake any of these activities they tend to display considerable irrationality' (Brunsson 1985: 3). Brunsson, however, pursues another line of thought, leading to this conclusion: 'Rational decision-making is not after all the essence of good management. Successful management may have more to do with the ability to motivate people, to establish a good organizational climate, to create appropriate social networks, or to develop powerful organizational ideologies' (1985: 3).20 In most organizations, 'the main purpose and problem [of managers] are not to affect thinking and choice but to produce organized, co-ordinated actions' (Brunsson 1985: 4). In the world of computers, the production of 'organized, co-ordinated actions' depends uniquely on the design skills of the programmer; organizations have no such deus ex machina.

One of the best-known phenomena in the organizational literature is what is called 'displacement of goals.'21 However unified to start with, all organizations quickly develop multiple (and usually partly conflicting) purposes. The integration of an organization is the single most important task of management.22 It thus seems fair to make a distinction between two kinds of rationality: decision rationality (with respect to a supposed external environment) and action rationality (with respect to continued mobilization of the various interests within the organization).23 Getting people to act in concert means establishing a compatible cognitive frame: through their expectations (that their action is contributing to a common end), through their motivation (that it is the right action), and through their sense of organizational commitment (the assurance that others too are working towards the same goal).
For senior management to pursue a course that follows standard principles of decisional rationality would be to work against integration of actions; classical rationality means opening up as widely as possible the search for alternatives, evaluating negative as well as positive outcomes of each, and making an unambiguous selection of one. Managers concerned with finding (often narrow) grounds for consensus and collective action are bound to do exactly the opposite: keep the search for alternatives as limited as possible, emphasize the positive consequences of action (and play down the negative), and state eventual decisions sufficiently vaguely as to prevent key people from feeling that they have been 'painted into a corner.' Everything the skilful manager does is designed to mould expectations, sustain motivation, and build a network of commitments; whether what subsequently transpires looks 'rational,' in some abstract sense, is quite a different kind of concern, having to do with keeping up appearances.24

The kind of interpretation of the environment that is arrived at in an organization may sometimes have less to do with explaining the facts than with accommodating them to fit a view of the situation around which people can rally. Such organizational 'ideologies' (Brunsson's term) make it easier to agree on objectives, on action alternatives that are seen to be promising, and on outcomes that are probable. They can 'replace decisions altogether; many organizational actions are not preceded by any decision process; agreement and coordination are achieved without decision-making, because the actors entertain similar perceptions of situations and share a stock of common expectations and general values' (29). In this context, such things as 'goals,' 'organizational objectives,' and 'mission statements' have a fundamentally symbolic function; as Mintzberg and McHugh (1985) have pointed out, one may ascertain an organization's goals better by inspection of its performance ex post facto than by consultation of its explicit policy statements.

Communication and meta-communication

Much current thinking about organizations can be traced back to a book called Organizations (March and Simon 1958). The authors were the first to observe that no one in ordinary circumstances could come close to living up to the ideals of the mathematical theory of decision-making.
Our bases of knowledge are too sketchy, our understanding of trends too partial, and our preferences too fuzzy for us ever to approximate true rationality. Instead, they argued, people make decisions that 'satisfice' (seem good enough, all told). To describe this phenomenon, March and Simon used the term 'bounded rationality.' In organizations, people quickly become preoccupied by immediate secular goals. They tend to lose sight of the larger picture, and, since thinking takes place in group talk and contexts where people usually tend to reinforce each other's prejudices, they are thus 'boundedly' rational, at best. Their preoccupation is less with intellectual discovery than with 'uncertainty absorption' (making sense of the stimuli to which they are exposed), and this they do by fitting experience to the frames of reference that their previous experience has bequeathed them, rather than by any exercise in abstract problem-solving.

March subsequently carried this line of thought further (Cohen, March, and Olsen 1972; March and Olsen 1976). He and his collaborators based their reasoning on the assumption that 'decision-making' is not so much an individual as a collective process, conducted with as many political as instrumental goals in mind. They imagine organizations, and decision situations, characterized by three properties: problematic preferences (preferences discovered through action rather than informed by a coherent structure of thought), unclear technology (reliance on trial-and-error procedures, a residue of learning from past experiences, ad hoc problem-solving), and fluid participation (involvement in decision-making that varies for each participant, depending on availability of time, unclear boundaries, spontaneous networking, and similar factors of unpredictable variation).

In such 'organized anarchies,' the best way to conceptualize choice-making - in the absence of consistent, shared goals, and given that it is not always clear what the parameters surrounding the decisions are - is to conceive of it as a flow of 'people, problems, and solutions through organizational networks' (Padgett 1980: 583). According to rational principles, choice opportunities are instances when, first, decision alternatives are generated, second, consequences of each are examined in terms of objectives, and third, a decision is taken. March and his collaborators prefer to see choice-making as a convergence, more or less structured, of problems being constantly thrown up, solutions ('answers ... actively looking for a question'), participants who come and go, and choice opportunities ('occasions when an organization is expected to produce behaviour that can be called a decision'). This pattern is what the authors call a 'garbage can model of organizational choice.' While the label may shock, the description is clearly a better approximation of the fluid experience of collective decision-making in at least some kinds of organizations than the prescriptions of traditional decision theory.
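The original garbage can paper was itself accompanied by a computer simulation (Cohen, March, and Olsen 1972). What follows is our own drastically simplified sketch of the idea - the lists of problems, solutions, and participants are invented for illustration: the streams flow independently, and a 'decision' is whatever happens to converge at a choice opportunity.

    import random

    # A toy sketch (ours) of the garbage can model: independent streams of
    # problems, solutions, and participants converge, more or less at
    # random, on choice opportunities ('meetings').

    random.seed(1)
    problems = ["low morale", "budget overrun", "obsolete system"]
    solutions = ["reorganize", "buy computers", "hire consultants"]
    participants = ["director", "analyst", "union rep"]

    for meeting in range(3):  # three choice opportunities
        present = random.sample(participants, random.randint(1, 3))
        problem = random.choice(problems)    # a problem thrown up
        solution = random.choice(solutions)  # an answer looking for a question
        print(f"meeting {meeting}: {present} attach '{solution}' to '{problem}'")

Run twice with different seeds, the same organization 'decides' quite different things - which is precisely the point of the model.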
The parts of organizations are not linked in the same way as a computer's

March and Olsen (1976) observe that rational decision-making is usually assumed to be and is characterized as a four-stage cycle where (1) individual actions or participation in a choice situation give rise to (2) organizational actions ('choices' or 'outcomes'), which then lead to (3) environmental actions, or 'responses,' which are translated by people in the organization into (4) individuals' cognitions and preferences, or 'models of the world,' which then stimulate new choices and actions, thus completing the cycle. In effect, the process is being looked at as a computing 'flow diagram,' a picture of how the functions are connected to each other. Since this model is abstract enough to serve (with some terminological variants) as a frame for both organizational and computing processes, we can use it to contrast the nature of the links, exact in one case and inexact in the other (Figure 3.1).

There are four 'translation' points in the representation - from individual choices into organizational action, from organizational action into environmental response, from environmental response into individuals' perception of it, and from perception, or 'cognition,' into choice. The first three translation points occur within the organization and could be presumed to be susceptible to being automated. They (1) transform 'individual actions' or choices through some kind of summation into an 'organizational action' or a collective 'choice' or 'outcome'; (2) interpret data representing 'environmental actions' or 'responses' into 'cognitions' and 'preferences' or 'models of the world'; and (3) translate such cognitions and preferences into strategies for choosing action, leading to individual participation in a choice situation. (They correspond in fact to the objectives of an office-automation decision support system: collect and analyse data, develop strategy options, and propose 'best' solutions.)
[Figure 3.1 The complete cycle of choice. SOURCE: March and Olsen (1976: 13)]
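Read as a flow diagram in code, the cycle looks deceptively automatable. The sketch below is our own rendering of Figure 3.1, not March and Olsen's (the function names and decision rules are invented for illustration): each function is one of the translation points, and each is written here as if it were unambiguous - the very assumption that March and Olsen reject.

    # Our rendering of the complete cycle of choice as code. Each function
    # is a 'translation point'; in a real organization every one of them
    # is irreducibly ambiguous.

    def individual_choices(models):      # cognitions -> participation
        return ["participate" if m == "optimistic" else "abstain" for m in models]

    def organizational_action(choices):  # individual actions -> collective outcome
        return "expand" if choices.count("participate") > 1 else "wait"

    def environmental_response(action):  # outcome -> environmental 'answer'
        return "growth" if action == "expand" else "stasis"

    def update_models(response):         # response -> models of the world
        return ["optimistic" if response == "growth" else "cautious"] * 3

    models = ["optimistic", "optimistic", "cautious"]
    for cycle in range(2):               # two turns around the cycle
        choices = individual_choices(models)
        action = organizational_action(choices)
        response = environmental_response(action)
        models = update_models(response)
        print(cycle, action, response, models)

Each 'if' in the sketch stands in for what is, in practice, an equivocal judgment.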
These translation points might be 'automatable' if (1) preferences were truly unambiguous, (2) the collective decision-making process were also unambiguous, and (3) the environment provided unambiguous signals and easily interpretable data. As March and Olsen point out, however, none of these assumptions of ignorable ambiguity is tenable. Individual preferences can be capricious, collective decision-making chaotic, and the environment unreadable. The substitution of an unambiguous routine (by definition, a computer algorithm is unambiguous, because otherwise it will 'crash') for a procedure marked by higher levels of ambiguity, within a situation that is irreducibly ambiguous, will not produce 'better' adapted behaviour. Quite the contrary, it leads inevitably to reduction of available organizational variety, which, in an ambiguous and fluctuating world, is still the best defence against threats to survival.25

The organizational communication system is quite different from the computer's

Though perhaps oversimplified, this reconsideration of decision-making underlines the greatest weakness of employing a computer-as-machine metaphor of organization: its image of how communication is carried on. In a computer, all the communication links are tight: laconic, unambiguous, stripped to the bone, and limited to the bare essentials. In ordinary organizations peopled by human beings, the links between components vary all the way from tight to loose. Karl Weick (1982) has made the difference clear with his organizational concept of 'loose coupling.' Loose coupling between elements of an organization, he suggests, exists if the way in which one component affects another is sudden and unpredictable (rather than continuous and predictable), occasional (rather than constant), negligible (rather than significant), indirect (rather than direct), and eventual (rather than immediate).
What makes a loosely coupled organization work is not rationality: 'Organizations use rationality as a facade when they talk about goals, planning, intentions, and analysis, not because these practices necessarily work, but because people who supply resources believe that such practices work and indicate sound management ... Organizations are often heavily invested in the dramatization of their efficiency even though such displays often restrict actual efficiency. Elaborate public efforts to make rational decisions can often undermine the vigor and speed of subsequent action' (Weick 1985: 110).

Weick's image of an organization has little in common with the old-fashioned view of a bureaucracy. He rejects the idea of a monolithic, uniformly articulated structure, whose day-to-day operations are managed from the top down. On the contrary, he argues, management has little to do with the detail of field operations. Instead, its role is to decide on how to segment the organization, and it is the segmented units, grouped into clusters of tasks, that actually work out for themselves the operating structures and procedures that are characteristic of the way in which work gets done in any specific context. It is not so much the organization that management manages, in other words, as the process by which it is managed - a switch of perspective that leads Kuhn and Beam (1982) to call the segmenting function 'metamanagement.'

The segments that Weick has in mind are small - not divisions, or regions, or branches, but groups limited enough in scale to develop stability because the people in them can interact regularly and find common bases of understanding which in turn promote a sense of shared fate and allow for emergence of 'trust relationships.' Larger groupings, as their structuring becomes complex, make orderliness, predictability, and 'sensibleness' difficult to sustain. They tend to disintegrate into simpler forms, which are stabler. It is in these smaller sub-units that most people spend their time, however large an organization is. Not even top management is exempt from this rule: it too is a segment. Organizations, then, are a mixture of associational patterns (which Weick terms 'coupling') that are 'neither entirely loose nor entirely tight' (1985: 120). Connections become tight within the stable sub-units, loose between them. There is, as a result, much less total order in the organization than the rational bureaucratic model pretends.26
Because the connections in an organization vary in strength, communicational ambiguity is always present, delays and distortions are endemic, systems appear unresponsive to demands for action, and the causes of events are hard to evaluate. When people can only with difficulty link what is happening to them to a source of explanation that is evidently rational, they tend to fall back on what Weick calls 'superstitious' learning. Confronted with less order, and greater ambiguity, than the textbooks say there should be, managers respond by pretending to themselves that they have actually discerned a pattern. Even where events stubbornly refuse to shape themselves into conveniently recognizable forms, the managers can still get by by presuming that events conform to a logical principle. This tendency to make the best of things - to behave 'as if' - has a desirable outcome. By their expectations, they succeed in spanning some of the breaks that are characteristic of loosely coupled systems; through their confidence, they generate a pattern of interaction across segments that tightens the connections and reduces ambiguity sufficiently to make possible a modicum of order. By acting out ('enacting') a perception of order, they make order appear. The seeming coherence of what is potentially an otherwise disorderly universe is really an effect of the attribution of order on which managers base their intervention (and is also why people in it can still picture the organization as a homogeneous monolith).

Such loosely structured entities are not at all, Weick insists, flawed systems. Instead, in an environment characterized by constant change, they are both a social and a cognitive solution to the puzzle of how to act when people with limited information-processing capabilities find it impossible ever to be totally certain what the future will be like. In the face of uncertainty, they make learning possible. A truly rational organization, tightly coupled from top to bottom, would have no learning capacity - no ability to adapt to change.

Weick distinguishes so forcefully between tight and loose coupling because he understands that communication is something to be achieved, not something to be taken for granted. All communication generates at least some ambiguity, if only because ordinary language constantly employs ambiguous constructions and fuzzy expressions.27 To decipher the meaning of what someone says we are constantly supplementing the explicit content of the message by inferences as to the speaker's intention. And because intentions are never totally transparent, there will always be some residual uncertainty in even the most banal of communications.
Such uncertainty is likely to be at its minimum when people share similar ways of seeing things and interact frequently in well-defined situations; uncertainty increases, as we saw in the previous chapter, when frames of reference are quite different, the rules of interaction are unclear, contact is sporadic, and intentions are unclear. In large organizations, a choice is often involved; while people have a limited capacity to sustain interaction in the depth required to lower ambiguity ('It is difficult for people to maintain more than ten strong pairwise relationships,' according to Weick [1985: 117]),28 the exigencies of their task may force them to interact with many more than ten people. If people 'are in one place generating events,' then they are simultaneously 'absent from some other place' (Weick 1976: 10). Tight coupling in one part of the system can occur 'only if there is loose coupling in another part' (10). And if tight couplings in one place imply loose couplings elsewhere, then it is 'the pattern of couplings that produces the observed outcomes' (10).

Loosely coupled (or rather mixed) systems are not imperfect, or merely deviations from the ideal. Quite the contrary, as Ashby (1954), steeped in the cybernetic theory of Turing, Shannon, Wiener, and von Neumann, once undertook to show. Any organizational pattern where strong communication is universally present will produce a system without adaptive capacity, or ability to learn, or understanding of an environment that is in any way complex and changeable. What Ashby wrote is just as applicable to human organizations as it is to the human brain - his subject matter.29 Loose coupling means that whole organizations do not have to be mobilized for every little change in the environment, are more sensitive to the complexity of the environment ('know' their environments better), allow for localized adaptation, can develop and retain more mutations and novel solutions, have more diversity and can adapt to a wider range of changes in the environment, are able to deal with local problems at the local level, leave more room for self-determination and initiative, and are less expensive to run in that they require less investment in coordination (Weick 1976).

What is true for brains, and organizations, it turns out, is equally true for programming. In describing how software programs are built up progressively out of modules, Shore notes an important principle in software engineering. A module's interface should shield other modules from internal details that may change later. In other words, 'because detailed information about the operation of such modules is hidden behind their interfaces, they are called information-hiding modules. Their secrets are those aspects of the overall system that are subject to change' (Shore 1985: 214). So it turns out that even computer programs, like organizations, depend on mixing loose and tight coupling, once they get past a certain level of complexity. In the era of fifth-generation computers, von Neumann's organizational model no longer looks adequate as a basis even for computing machine design, much less for human administration.
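Information hiding is easy to demonstrate. The sketch below is ours, not Shore's (the class and its methods are invented for illustration): other parts of the program are coupled only to the module's interface, while the storage detail behind it - the module's 'secret' - can change without mobilizing the rest of the system.

    # Our sketch of an information-hiding module: callers depend only on
    # the interface (add, find); the storage detail is the 'secret' and
    # can change later without disturbing anything outside the module.

    class CustomerRecords:
        """Interface: add() and find(). Everything else is hidden."""

        def __init__(self):
            self._store = {}  # the secret: a dict today, perhaps a database tomorrow

        def add(self, name, account):
            self._store[name] = account

        def find(self, name):
            return self._store.get(name, "unknown")

    records = CustomerRecords()
    records.add("Tremblay", "chequing account")
    print(records.find("Tremblay"))  # callers never touch _store directly

The coupling is loose exactly where it needs to be: across the interface, not inside the module.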
In summary, the ideas about organization that we have been citing challenge the conventional wisdom characterized by office automation. The authors in question do not form part of some particular 'school,' and we have deliberately glossed over the many differences in emphasis that characterize their writings, since it is not part of our purpose to provide analysis in depth of the new theory on organization. On some general points there is, however, something approaching a consensus in this literature.

(1) Because environments are too complex ('veiled,' in Weick's term) to be simply reduced to elementary-category data of the kind that computers use, people respond to pattern in their environment: they interpret events and do not simply record them.

(2) People are 'boundedly' rational, to employ March and Simon's expression. They see things, and take decisions, from within a limited frame of understanding which cannot be dissociated from what they are doing ('locally rational'). They constantly use their intuition, and experience counts for more than logic. They read their environment in conformity with the needs of the activities in which they are engaged. Thus 'information' is created, not just received, while actions justify decisions more than decisions justify actions.

(3) Organizations are complex mosaics of domains of specialized work, often loosely interrelated functionally and communicationally, dictated not by a single but by multiple logics. It follows that single-stream rationality is not a given, but an elusive achievement which requires a complex social process of influence, alliance-building, and negotiation, much more than it does cold rationality.

(4) Much of senior management is concerned less with direct supervision, or detailed coordination of lower-level functions, than with agenda-setting and politics. As a result, task definition is much looser and vertical communication usually more attenuated in most large organizations than we used to imagine.
There is no single, nicely structured 'chain of command.' There is, instead, a ladder of statuses - not at all the same thing. None of this fits very well with the machine metaphor.

Office automation and the monopoly of knowledge

One central feature of the classical model of management tends to escape our attention, because it is often not stated in so many words: its considerable restrictions on the freedom to communicate. It is worth looking once again at Weber's (Gerth and Mills 1958: 196-7) text on bureaucracy. First, all the communication that he described was formalistic, paper-mediated (based on written documents, 'the files,' in his words), governed by strict legalistic principles, occasional and never spontaneous. It was, to use a current expression, very 'narrow channel.' Second, the authority to give commands was 'strictly delimited by rules' and the right to intervene in the affairs of subordinates was also constrained ('hierarchical subordination does not mean that the "higher" authority is simply authorized to take over the business of the "lower" ... Indeed, the opposite is the rule.') Third, and above all, limited access to information was the keystone of the bureaucratic principle: 'Every bureaucracy seeks to increase the superiority of the professionally informed by keeping their knowledge and intentions secret. Bureaucratic administration always tends to be an administration of "secret sessions": in so far as it can, it hides its knowledge and action from criticism' (233).

Rational administration thus incorporated a very elementary principle of information monopoly within the context of a limited medium of communication and a highly constrained network of interaction. Information monopoly and organizational power were, in Weber's view, strongly correlated. Office automation offered exactly the opposite prospect: enlarged access to information, a broad-band medium of communication, and an open network.30

Before we dismiss the secretiveness of professional administration as pathological, or 'dysfunctional' (Merton's term), let us briefly consider why it may be an intrinsic property of large, hierarchical organizations. It has been an established principle of modern organization, because its basis (and justification) is the division of labour. An ordinary organization chart is based on just such a differentiated task structure.
Of course, what has been differentiated, as Lawrence and Lorsch (1967) have pointed out, eventually must be reintegrated. Each branch on the organization chart stands not just for a specialized function but also for a culture, often associated with long professional training and sometimes geographically linked to a region, and a budgetary unit in competition with other similar divisions in an intense political exercise. Furthermore, in very large organizations, the association between branches and divisions is often an arbitrary effect of history, without any particular logical foundation, in the instrumental sense. Communication between culturally distinct and well-established communities with institutional roots, and further marked by definite political interests with a history of mutual suspicion, is - it should surprise no one - a matter requiring both first-rate negotiating skills and considerable sensitivity to the interpretive gulfs that tend to separate people who have different professional backgrounds and organizational experiences.

Information, in this context, is not a neutral quantity; it is a strategic resource, possession of which gives its holder an immediate advantage. It is not something to be disseminated indiscriminately. There are monopolies of knowledge of many kinds in an organization, in many places and at many levels. As a result, there are also many kinds of power bases, or, to use Michel Crozier's evocative phrase, 'zones of uncertainty,' supporting local influence bases. Thus a great deal of organizational communication is brokered. The promise of office automation was to eliminate the brokers. But, Weick is arguing, it is the brokering of communication among the various communities making up the organization that serves to keep ambiguity under control and to prevent the confusion and conflict that follow from unbrokered communication. In other words, the automation of the office risked undermining the very principle that permits large organizations to go on functioning effectively.

Conclusion: The office automation template

We have been looking in this chapter into the theoretical link between two intellectual universes, computer science and organizational theory. It is a juxtaposition of domains that poses a chicken-and-egg question (which is the real organization, and which is the representation of it?). The management literature has traditionally treated organization as though it were a variety of machine; computer scientists treat computers as a variety of organization.
And if we have, historically, thought of organization as a kind of machine (Mintzberg 1979; Clegg and Dunkerley 1980; Morgan 1986), was it because we designed machines in the image of organizations, or organizations to look like machines? The question has both a sociological and an epistemological dimension. Sociologically speaking, it is clear that, as a society, we are often fascinated by technology and tend to accept its claims uncritically. From the epistemological point of view, however, both technology and organization are merely constructs, the product of social processes, and the issue becomes not a temporally constrained one of what effectively causes what, but rather a conceptual one of what defines what.

A computer is itself a form of organization. The people who developed computer-based technology were also most versed in its organizational properties. They may have tended to project what they knew about the kind of organization with which they were very familiar - the computer - onto another about which they knew less - the office. Thus they in effect translated the computer model into an image of the kind of organization that would most effectively justify the technology. However plausible this step may have seemed at the time, it was, we have argued, not the right one.

The template that guided the technology developers in their efforts (Child, Gunter, and Kieser 1987) was almost certainly a version of Morgan's machine metaphor. While working within the developmental context that led to office automation, they looked at organization and saw a mirror image of their own technology. They risked being caught in a loop. The lens through which the organization was viewed reflected technology's own properties, but the latter were only presumed to be a true representation of the object being viewed. There was no outside point of reference, because the literature on administration is itself unhelpful on the subject to somebody whose concern is technology. The underlying epistemological dynamics remained invisible.

As Morgan has emphasized, the idea that an organization is 'like a machine' is hardly new. It has been around for the better part of a century, perhaps even longer if we were to take the writings of Adam Smith into account. Yet there has also been evidence, dating back to the famous Hawthorne studies and to the sociological investigations of Merton and his students and of Whyte and many others, to show that the rational model does not describe particularly well what actually goes on in contexts of work.
Nonetheless, the gap between the abstract idea of organization and concrete experience long continued without provoking a sense of crisis. Why then does it trouble us now? The answer, it seems to us, is that computerization alters the strength of the coupling (to use Weick's term) between theory and practice. Theory, in other words, is now being operationalized in the form of actual office machinery.

For a long time, the gap between what was taught in the academy and practised in the workplace caused no serious problems. Often, the people who ran the organizations had learned to manage without the benefit of formal education in a business school, while the MBAs, once out of school, became absorbed in their new environment and put theory behind them. The gulf between rhetoric and reality, in a world dominated by 'action rationality,' was easy to cope with as long as the occasion for the one - report-giving and speech-making at industry conferences, for example - was clearly separated from the experience of the other - solving day-to-day problems in a real-life enterprise. Clearly, theory affected practice, and systems having their origin in theory came into use, but the impact was mitigated by intervening individual mediations. Like a loose-fitting shoe, the rational model might produce blisters from time to time, but it didn't pinch.

Office automation made it pinch. Computerized systems of management communication make concrete the theory of organization on which they are based - they give theory a tangible effect. What had been before a kind of casual heuristic, meant to guide the behaviour of managers, more or less, now was being turned into an exactly described routine, with no degrees of freedom. Mistakes in the theory of organization would no longer be compensated for by the intuition-guided interpretive interventions of practical managers. If the theory were bad, its imperfections would start to be very visible. Something had to give: either theory or practice. Office automation assumed that it was practice that would have to change; it has now occurred to some that perhaps it is also the theory that needs retooling (Suchman 1987).

In science, disconfirmations of theory are the result of experiment;
in organizations, the measurement of failures has traditionally been more problematical. However, even here, times change. Computerization is the driving force. When management information systems encounter the kind of difficulty that they have had and office automation goes awry, we have to begin to wonder if the theory behind them may not be at fault. The old rational, bureaucratic 'paradigm' now seems a little frayed around the edges.
4
Organization as talk and text
For it is the tension between the two poles which gives rise to the production of discourse as a work, the dialectic of speaking and writing, and all the other features of the text which enrich the notion of distanciation ... What we wish to understand is not the fleeting event, but rather the meaning which endures.
Paul Ricoeur (1981), Hermeneutics and the Human Sciences, 134
Introduction

The sometime clumsiness of forced automation (and its uneven record of success) inadvertently called our attention to the deficiencies of the 'rational' theory of organization, especially in its technical manifestation as a 'software-driven' system. How can we then reach beyond the machine metaphor of organization to see the latter as a communication system - but a system that is more than a bare-bones network for the transmission of information? How can we achieve a theoretical model that is closer to the image of Weick, March, Kuhn, Brunsson, and others who take a more realistic, and down-to-earth, view of how organizations actually work?

We already have some clues for developing an answer to this question in the two previous chapters. An adequate theory of communication, in our view, is one that can explain the two case-study situations that we described in chapter 2. There we saw a naturally occurring culture of people at work, situated in a context familiar to them and guided by the intuitive processes of spontaneous human intelligence.
It was pitted against an artefactual construction - a product of software engineering - in the form of a network, made up of nodes and links, and processing information, supported by the skills of artificial intelligence. In the first case, we conceptualized the native culture of the police force as a conversation. Its interactive processes took the form of dialogue, in which information did indeed get passed on, but always in a setting that was overtly, and profoundly, social in texture and intention. In the second example, we characterized the software simulacrum of organization as a text. It was a text in the way that it was expressed in reports to management; the constructive principle underlying the idea of network took the form of a grammar, with a repertory of elementary symbols (nodes, links) and a syntax for their combination into more complex structures (networks). The dialectic that we described in each case worked (as it must) bi-directionally. The police culture was adapting to a technical system to which it had become joined in processes of daily interaction; the software designers were constrained, little by little, to take cognizance of the peculiarities of the de facto heterogeneous organization that they were attempting to model.

In chapter 3, we noted how computerization in the form of office machinery seems to be altering the strength of the coupling between theory and practice. As an organizational theory, the rational model served so long as its logic was not applied overly literally. The conversation, we could say, continued unimpeded. The introduction of office-automation technology changed this dynamic. Its goal was to textualize much of the organizational conversation in software as a rational system, with the mixed success that we have seen. The dynamic of the unstable accommodation between organizational conversation and representational text is the dominant feature of administration in the late twentieth century, in our view, and is defining the challenge now facing management. Any theory of communication that claims to make sense of the organizational and managerial world in which we find ourselves must therefore incorporate that dynamic. There is, in our view, no such theory today.

This chapter presents the outlines of such a theory. It makes the assumption that the conversation/text dialectic is not just a passing curiosity of the new information age but an intrinsic component of the communication paradigm. The translation of conversation into text, and text into conversation, is an ongoing interaction - indeed, the process by means of which organization is generated.
In small, face-to-face societies, it occurs so unobtrusively that we remain largely unaware that it is happening. It's a very small step from office conversation to its textualization, for example ('So I said to him ..., and then he said to me ..., so then I really let him have it, and you should have seen his face! I guess he's not used to hearing home truths.'). In large, complex organizations, management makes the text on the assumption that the workers should restrict themselves to the conversation - and, furthermore, a conversation with rules already set by the managerial text (an idea that took root with F.W. Taylor at the beginning of the century and has not been abandoned since). The casual flip-flop from conversation to text and back characteristic of the micro-world of chit-chat gets institutionalized in the macro-universe of large organization, with consequences that we began to analyse in the last chapter and shall explore further in this.

It seems to be a property of relatively unchanging organizations and societies that the relationship between conversation and text stabilizes. Whatever its relative efficiency as a system for transmitting information, this is what made the old system of police dispatching described in chapter 2 work. Against such a backdrop, implementation of new technologies would inevitably be destabilizing, no matter what their nature. The destabilization is magnified many times over when communication technologies are in question, precisely because they directly attack the conversation/text equilibrium, triggering a dynamic search for a new basis of reconciliation. The conversational patterns of daily life at work are disrupted, but so are the frames of understanding. When this occurs, the restructuring of organization is no longer merely the deliberate strategy of some enlightened decision-maker bent on reform (in the interests of productivity, typically), nor is it the 'learning' of resisting users. As it is repeated in one milieu after another, it becomes transformed into part of a historical movement sweeping us all along in its path - something that Schelling (1978) calls a 'macro-behavioral effect of micro-motives' and Boudon (1977) calls a 'perverse effect.' It is no longer just a question of rechannelling the conversation to make it conform to a text invented by engineers, bought by administrators, and implemented by users, as in office automation; it is an issue of rethinking the text, the conversation, and the relationship that links them in a dialectical tension.
An alternative metaphor: conversation and text

The predominant model for organizational communication has for a long period been based on the idea of a network similar to the one described in our second case study in chapter 2. That model has a virtue: it reminds us that communication cannot occur unless there are physical channels capable of conveying the messages that people generate in interaction, from one location to another, even in such a simple context as face-to-face conversation. We create another emphasis, however, if we set the physical correlate of communication aside for a moment and recall instead that communication, no matter how it is conveyed materially, is made up of talk and of show: words, gestures, displays, sounds, and pictures. The sum of all the ways that we use to get an idea across from one person to another and coordinate our actions is called our discourse. Discourse, in turn, has two modalities, conversation and text. Conversation is immediate (in both senses: imperative in its claim on attention, and non-mediated), instantaneous, and involves identifiable participants' co-presence in interaction. It is also ephemeral - no sooner is it over than it is gone. Real-life managers in organizations, the research tells us, spend most of their time in conversation (Weinshall 1979). Text is more tangible and therefore operates on a different set of principles. It must be consigned to a manufactured medium of some kind; some sort of delay is normally implied between composition of the text and its deciphering; no physical co-presence is required; the product is durable across time and allows for a potentially unlimited number of readings by people who may or may not even know the author (while the latter may, indeed, not even be identified); and there are technological skills involved in its production. More than that, it is itself a technology, a skill to be mastered and requiring tools for its production. Because it is a technology, it is susceptible to the elaboration of methods in a search for perfectibility, and it lends itself to specialization and the acquiring of expertise by an elite trained in the procedures of its production. At the beginning of this century, management was hardly more than a handful of practitioners' precepts; it has matured into a massive academic field and, with MIS, into an explicit technology. There is now an impressive organizational 'text.' With only a slight effort of imagination, we can broaden the term
'text' to include not only verbal transcriptions but also any kind of symbolic expression requiring a physical medium and permitting of permanent storage, including films, buildings, clothes, and even office or household appliances (Barthes 1967; Baudrillard 1968; 1970). Following Morgan's logic, we readily admit that the discourse model of organization is based on a metaphor. The term 'conversation' is being used here not only in the literal sense, to mean just the ebb and flow of spoken interaction, but also in the metaphorical sense, as an image, or even a simile. Similarly for text. Metaphorically speaking, we can claim that an organization is a conversation, or, alternatively, that an organization is a text, or, even better, that an organization is both a conversation and a text, in dynamic interaction. It is a discourse. This is therefore a conceptual chapter. In it we are going to consider, in turn, organization as a conversation or transactional universe of discourse, and organization as a text, with its underlying frames of understanding. In setting out the theoretical principles of a communicational approach to organization, we draw on concepts from speech act theory, semiotics, sociology, and ethnomethodology. Nonetheless, the synthesis is our own.1

Organization as conversation

It is of course obvious, and non-controversial (Smircich 1983), that an organization has conversations (in the sense that conversations occur within an organization). It is less obvious, however, that an organization is a conversation. The following eight premisses lay out the conceptual basis of such an assertion.

Premiss 1 • Every conversation is composed of transactions.
If we mean to use conversation as a metaphor for organization, it is obvious that we have to decide what its units are (otherwise we shall not be able to deduce what the units of organization are). Empirical studies of conversation are helpful in this respect: many observers have noted that conversations, in general, can be broken up into clusters of verbal productions, which have been called variously 'adjacency pairs' (Schegloff and Sacks 1973), 'exchanges' (Wells, MacLure, and Montgomery 1981), and 'functional exchanges'
(Edmondson 1981a; 1981b). In practice, people talk not so much in sentences as in clusters of sentences - 'Did you manage to play today?' 'Yeah, I had a game this morning, how about you?' 'No, I was tied up. I couldn't make it. Maybe tomorrow.' A more powerful conceptualization, which has also been cited in the literature on conversation (Levinson 1983; McLaughlin 1984), is provided by the theory of speech acts (Austin 1962; Searle 1969).2 Austin has shown that language is not just a vehicle for the transmission of thought; it is also action - not just individual action (or behaviour, as the psychologists of his time were wont to think of it) but collective action, jointly entered upon, requiring interlocking commitments. Buying a car, making a promise, asking for someone to pass the salt, getting married are all accomplished through speech acts. A speech act, or, as we prefer to think of it, a transaction, is not only a transfer of information from one person to another. It is also the means by which the intention behind a communication is made clear (which Austin called the illocutionary force of an act of communication) and a behavioural or attitudinal effect produced (termed by Austin its perlocutionary effect). It takes partners to accomplish these things. To make a speech act, sometimes a single sentence will suffice; sometimes a more prolonged exchange is needed. The test is not the superficial pattern of the exchange in its verbal realization, but what has been achieved. There is a transaction involved - not just two joined actions, one of speaking, one of listening, but a joint accomplishment, one which, in the absence of a partnership, is impossible. It is important to emphasize here the change of perspective from network theory. In the latter, a node stands for a process, or function, which is the responsibility of an agent charged with its operation. When the agent has accomplished his or her task of transforming the input into output, he or she has no further responsibility: the network takes over. The link from one agent to another is just that - a link that carries a signal, but also a link that was chosen by someone else, and for which someone else remains responsible. To an engineer designing a network, this may seem a natural state of affairs, since the presumption is that the design of the network (and its subsequent administration) is separable from the activities of the people caught up in it. In a transactional theory of communication, we may not assume any such deus ex machina. The links, fully as much as the functions, are the responsibility of the people in interaction. But it is
in the logical nature of a link that it be a joint responsibility, since if either of the partners to the exchange is delinquent in his or her duty as a communicator, there will be no link. To give, there must be a taker; to take, there must be a giver. Communication conceived as linking is literally give and take. And 'give' and 'take' are not the same as 'send' and 'receive.' The difference between an externally organized and a self-organized system is captured in this distinction. Network theory assumes no inherent power of self-management on the part of the people in the conversation. We, in contrast, find it difficult to imagine any human organization so deprived of the capacity to structure its own interaction that it must rely on an outside agency to do so. Certainly, in the case studies of chapter 2, processes of self-organization were very evident, both among the police and among the members of the team developing the software. In Austin's view, achievement of a communicational transaction presupposes mastery of a set of conventions on how to use language, something that we might want to call communicational competence. Searle (1969) has emphasized even more strongly the role of constitutive rules in making it possible for people to attach conventional meaning to different kinds of statement and gesture. The constructing of links in communication is not just a simple matter of laying the line and transmitting the signal; dialogue is involved, and dialogue takes skill. Fundamental to this theory of communication is the perception that communication is meaningful only in a context of people doing things and that, furthermore, communication is itself a 'doing.' Think back to the police case of chapter 2. The state of tension that reigned between the new teams of dispatchers and the cops on patrol centred throughout not on the information per se but on the transactional difficulties that both parties to the exchange were encountering. The inaccuracies in the information justified, more often than not, the questioning of the correctness of the transaction. The officers on patrol conceived of a right way and a wrong way to accomplish the (speech) act of dispatching. The dispatcher was not merely a passive link in an information chain, by which the data and the instructions of the computer could be forwarded to the cruisers. He or she was an active agent, transactionally linked to his or her clients, within a system where the rules were dictated (or ought to be, in the eyes of the veterans) by a set of conventions understood by all (if they were
'police minded,' that is). The dispatchers were not so much contesting the police view of how such speech acts of dispatching should be performed as trying, rather desperately, to master the intricacies of the game. The intention that came through to the police on the other end of the line was all too often not this, however, but a sense that the people with whom they were in interaction were concerned mainly with how to keep out of trouble, if necessary by an excessively rigid adherence to the rules laid down by the police superiors. The punctuation of a conversation is thus not only a way of breaking up speech; it is also a principle by which action, and (it follows, if you accept our proposed definition of organization as a species of discourse) organization, are broken down into parts. To the police, the sum total of the transactions in which they were engaged was the organization. While the ebb and flow of interaction may be continuous, this is not how it is perceived; instead, people bracket the parts of the running talk that they generate in such a way as to make it conform to an expected transactional structure. It is the transactional structure that underlies the organizational principle and provides the frame for understanding the meaning of interaction. (We return to the notion of frame later in the chapter, when discussing the organization as text.) It is not that everyone frames the interaction in the same way (as both case studies of chapter 2 clearly demonstrate) but that they all do use frames to discern the organizational structure in the flux of interaction.3 If we add to this a notion (Watzlawick, Beavin, and Jackson 1967) that transactions are the means by which asymmetric interpersonal relations of control over the flow of communication are installed through discourse, we have an image of organization as a natural hierarchy with a built-in, de facto authority system. The relevance of a transactional mode of analysis to the modelling of management systems has not gone completely unnoticed in computer programming circles. Flores (Flores 1982; Winograd and Flores 1986) has taken over Austin's concept of a speech act and transformed it into a device by which managers elicit, record, and follow up on commitments from each other in computerized networks. It is an interesting development, but one that still assumes, on closer inspection, that the network is extrinsically shaped. By doing so, it misses the main point that we are trying to make - namely, that an organization is nothing but a set of transactions, realized through interaction, and that a network is not a separable object, different from communication, but its necessary physical infrastructure. To make
the assumption that the network should drive the communication, not the converse, is indeed to put the cart before the horse. Given the difficulties that the police force ran into during implementation of the new dispatching system (chapter 2), it is not surprising, from the transactional view of communication, that these problems became translated into ambiguity as to where authority lay. There was an unspoken assumption, based on previous experience, that the dispatchers should be in a 'one-up' position of authority. But the new personnel lacked the credibility and the skills to 'pull off' the relation of authority. The designers of the new system presumed, for reasons having to do with corporate policy, that the district offices would fill the role of authority figures. But the latter found themselves too far out of the information flow to satisfy the expectation. The result sometimes appeared dangerously close to a vacuum of authority, to no one's satisfaction. The police culture, like that of the army from which it borrowed in the beginning, expects authority to be exercised and obeyed, preferably without question, at least on the surface. When authority is made equivocal, the result is consternation, as much from below as from above. The old dispatchers were plainly scornful of the inability of the new ones to impose their authority through the act of dispatching. They had known how to achieve transactional stability. The patrol officers were hardly less skeptical. As a structure, the transactions making up a conversation could still be modelled as a network, but note the shift of emphasis. The nodes would now stand for transactions, involving complementary commitments, and no longer for individuals, or machines; the links would represent connections between transactions, not simply message transfers. The mode has shifted from static to dynamic. Transactions, unlike people, have, by definition, a time attached to them, i.e., when they occur. An organization is now a process, not an object. It is a pattern of transactions - an issue to which we return below.
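The contrast with the engineer's network can be made concrete in a few lines of code. The sketch below is purely our own illustration - the names and structures are invented for the purpose and do not come from any of the systems described in chapter 2 - but it shows the shift of emphasis: the nodes are transactions, each of which exists only insofar as it binds a giver and a taker in a joint accomplishment, and the organization is nothing over and above the time-ordered pattern that they form.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Transaction:
    """A node of the network: a joint accomplishment, not a message transfer.

    A transaction cannot be constructed without both partners;
    'send' without 'take' is no transaction at all.
    """
    giver: str            # e.g., 'dispatcher'
    taker: str            # e.g., 'patrol officer'
    act: str              # the speech act accomplished, e.g., 'dispatch'
    time: float           # transactions, unlike people, occur at a time
    follows: List["Transaction"] = field(default_factory=list)  # links join transactions, not persons

def organization(transactions: List[Transaction]) -> List[Transaction]:
    """On this view an organization is a process, not an object:
    the unfolding, time-ordered pattern of its transactions."""
    return sorted(transactions, key=lambda t: t.time)

# Two linked transactions realize a small fragment of 'organization.'
call = Transaction("complainant", "dispatcher", "report incident", 0.0)
dispatch = Transaction("dispatcher", "patrol officer", "dispatch", 1.0, follows=[call])
for t in organization([call, dispatch]):
    print(f"t={t.time}: {t.giver} -> {t.taker}: {t.act}")
```

Delete either partner from a transaction in this sketch and nothing remains to link; that, and not any particular notation, is the point of the exercise.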
Premiss 2 • Conversation is the means by which people construct and maintain social identities.

One of the most original early contributions to a communicational theory of organization was George Herbert Mead's (1934) conception of the origin of the self. The concept of 'self,' he suggested, is reflexive.
It supposes that one can be simultaneously a subject and an object (as in 'I shave myself,' where 'I' and 'me' are the same person). He reasoned that communication, like shaving, was fundamentally reflexive, from the individual's perspective. His idea was that the gestures that we produce, including spoken language, are available for inspection both by the person producing them and, because it is a transaction, by the person whom they target, and thus furnish a common set of symbolic expressions to which a collective meaning can be attached. As in shaving, there are both proprioceptive and exteroceptive feedbacks available to make sense of what is happening. The individual's self results from this mirroring effect, in which 'the significant other's' reactions come to frame the initiator's self-image by reflecting it back, thus giving meaning to the internal experience with which it is correlated. In Mead's theory, subject and object are born in the same transactional process, since one's own subjective experience is inextricably linked to the objectification process by which one becomes a personality to others. What we would add to this idea, from a transactional perspective, in hindsight and following Austin, is that the identity is invariably linked to a 'doing.' One can only with great difficulty farm without becoming a farmer, or practise medicine without becoming a doctor, or go on relief without becoming a welfare recipient. When it is a communicational 'doing' that is in question, since there are two people (at least) linked transactionally, then two identities are simultaneously affirmed, and so is a relationship between them. Such arrangements lend themselves easily to negotiation - and to renegotiation. Identities, in this sense, depend on the goodwill of the other (as the new dispatchers discovered), but they can evolve over time. This concept of interdependence has been explored in depth by Erving Goffman (1959), who built on Mead's notion of selfhood by superimposing a dramatistic orientation. The 'presenting' of oneself has, with Goffman, become a kind of mixed-motive game, loaded with reward for the successful player and fraught with peril for the incompetent or disadvantaged. The individual is no longer just a reactive organism but an engaged actor in a social scene, committed to the ongoing exposition of a persona, acting out (it is hoped) a consistent line over time, and making a claim to a certain social status. Under normal circumstances, the individual can depend on a social convention that says, somewhat along the line of Grice's (1975)
cooperative principle, that we have a duty to support others' claims to identity, as long as they support ours. Unlike Grice, however, Goffman makes no bones about the strategic implications of interaction, where everyone is engaged in a subtle exercise in reading not just the information given by others, in the form of explicit statements, but also the information given off, or 'leaked,' perhaps non-verbally, perhaps by other clues. This 'leaked' material allows us to unmask pretension and discover fraud - to the everlasting embarrassment of the person who has lost face. The dispatchers described in chapter 2 illustrate some of the difficulties of constructing a social identity when circumstances are working against that goal. Nothing that they could do was likely to counter, in the short term, the sense that had been created during the lead-up to implementation that they had no legitimate place in the police world. Their behaviour was scrutinized for signs of weakness. To be young in a setting of veterans, and a female in a male world, merely served to complicate even further the problem of achieving an organizational identity. The dispatchers' only advantage was that, having not much 'face' to begin with, they had little to lose. They could be embarrassed, but not humiliated. From this new perspective an important insight emerges. Conversation is not only composed of transactions; it is made up of transactions whose outcome has an intrinsic value. The successful completion of a sequence of well-executed transactions may not only have desirable instrumental uses but may be satisfying in and of itself. Any organization that fails to provide for the sustenance of the identity of the people who work for it is issuing an invitation to anomie - and ultimately to self-destruction.

Premiss 3 • The relationships linking people in communication form a semantically determined system. The structure of an organization is a fabrication of language.
Ours is a discourse theory of organization. The most central of all the discursive productions of people is language itself, which provides categories that can be occupied by individuals, so that it is not infrequently the category - tinker, tailor, soldier, sailor - that is seen to speak, not the individual who fills it. To occupy a recognized
category is to be legitimated, since one speaks no longer only in one's own name, as a frail individual, but instead in the name of the category (like a judge, or a chairperson, or a manager, or an accountant, or a doctor, or a traffic cop, or a sports commentator). The study of categories is a province of semantics. All meaning has two sides, one experiential, one conceptual. Experiential meaning is personal and essentially non-communicable. It may be what we feel the first time we visit Jasper, a childhood memory of Christmas in the country, the taste that we get when we munch a fall apple. Conceptual meaning is what we accept when we start to use language. It is the linguistic filter not only through which we perceive experience but by means of which we structure our social experience. We do not just converse, we enter into conversation, and when we do, our selfhood emerges as a reflection of the properties of speech, fully as much as the properties of interaction. Just as new clothes that we wear after a while no longer seem alien, language also soon becomes the 'natural' way to cover our experiential nakedness. Conversation is a culture; we are its host - the basis that it requires to thrive and propagate itself, to evolve. Kondo (1990) has reported on her experience as an American of Japanese origin conducting anthropological research in Japan. Looking Japanese on the outside but feeling American on the inside, she faced an initial identity crisis, only to discover that in Japan identity itself is seen differently than in the United States. In Japanese, for example, there is no single word that corresponds to the English pronoun 'I.' One identifies oneself, always, with respect to somebody else presumed a priori to have a particular status in one's life: sister, superior, friend. The language provides a number of options, all interpretable as 'I,' depending on whom one is talking to, and all allowing one to refer to oneself as the subject figuring in an assertion that one is making. To say 'I' is already to have committed oneself to an interpretation of the situation. The self-reference is also, simultaneously, an other-reference. Kondo found that she no longer had a single identity, but several, depending on the context. She sees in this usage an instance of a larger principle governing life in Japan: selfhood is realized as a derivative of relationships. One does not start by being a self and then enter into relationships; one starts by being in a relationship which already sets the parameters for selfhood. One becomes oneself through interaction, within a field of semantically derived relationships. There is no invariant 'self' transcending
all kinds of situations. One's personality is a function of social circumstance and commitments. Kondo's discovery should perhaps remind us of the dilemma facing the software developers in chapter 2 who found that functions cannot be nailed down across the departments of an organization. An 'originator' is an originator, it became clear, if, in his or her particular context, the system of commitments in which he or she is located says that he or she can be one. Perhaps our world is not so different from that of the Japanese as we think. A.J. Greimas (Greimas 1987; Schleifer 1987) has made a distinction between actors ('acteurs') and agents ('actants'). An 'actor' is what we normally think of as a person with his or her peculiarities and unique non-replicable identity (you and I with our names, histories, and physical appearances); an 'agent' is an organizational (i.e., semantic) category whose occupation by some individual is necessary to make the whole system go. The definition of an 'agent' is not a matter of individual choice, any more than the decision as to what a noun is, or a verb, when we are speaking a language. An 'agent' is a function within an overarching semantic system that gives an organization its shape. When we take on an organizational role, we can infuse it with our personality and give it a style all our own, but we are not at liberty to redesign the semantic net within which it fits. When we occupy an office in an organization, we do not just fill a physical space, we fill a semantic space. The concept of a 'dispatcher' has no meaning outside a universe where there are things to be dispatched and people to dispatch them to, each with his or her own identity. The notion of 'categories' used above does not, however, capture the full richness of Greimas's idea of agency. To be an agent (and we are all agents, if we belong to any organization) is to be caught in a web of mutual obligations, from which we draw our identity but which are also charged with the potential for social conflict. The categories of a social system are reciprocally defined; they have no meaning in isolation. An agent is responsible for doing something (and it is this 'something' that makes it a semantic category: a cyclist is a cyclist because he or she cycles, as opposed to driving a car). An 'agent' is also an agent because he or she acts for someone: there is an agent/beneficiary relationship involved. A computer programmer develops software solutions for a corporate client; a police officer fights crime and maintains the peace, for us citizens. The 'doing' is
always set within a matrix of social relationships. But there are not only agents and beneficiaries, there are also opponents and victims. An organization is a system of relationships, but it is not a neutral configuration. There are alliances and rivalries, friendships and enmities, benefactors and exploiters. To the employees, if management is not seen to be benevolent, it will be perceived as malevolent. And, by and large, vice versa. As the system developers in chapter 2 found out, interdepartmental relations are also a minefield of incipient jealousy. Astute managers, from Chester Barnard to Konosuke Matsushita (Pascale and Athos 1981), have always known that organization is ineluctably moral in character. So did group psychologists such as Newcomb (1953) and Heider (1958). To the semanticist, there is a virtual organization already present in the structures of language. Its categories serve to mould the values of the people who actualize it, without their being aware that they have stepped into a situation that was already defined for them. Like a pinch-hitter approaching the plate in the ninth inning of a tight baseball game, such people find that language has already constructed the roles that they can play, and how to play them. It does not, of course, guarantee the outcome. From this follows an important logical consequence.

Premiss 4 • All communication has a reflexive character. This is the dialectical relation linking conversation and text.
Let us return for a moment to Austin's notion of a speech act. 'Doing' a speech act, for Austin, included the following conditions. One must be (1) the right person (in the right circumstances) in order (2) to be even eligible to execute the conventional procedure associated with a given speech act, which must then (3) be carried through correctly for the act to succeed. Goffman shows the obverse of this construction when he demonstrates that people regularly engage in an exercise to become the right person by, if they can, executing the conventional procedure, all the while claiming the associated identity. Carried to its limit, this is the accomplishment of the 'con artist,' pure and simple, but we probably all have a little bit of con artist in us. Here there is a neat loop. Only the right person can do the act, but he or she affirms that he or she is the right person by doing it correctly.
Which is the chicken, which is the egg? The answer is 'neither' or 'both': the doing of a speech act is reflexive. The 'doing' and the 'doer' (not to mention the circumstances of doing) are mutually defining: fishing, fisher (and indeed fish) all are linked in a semantic, as well as a practical, associative pattern. This link is known in linguistics as the 'functional' basis of meaning (Chomsky 1965). All meaning, in symbolic systems, is relative: there is no absolute North Star. Communication is framed by the circumstances (including personages) which it simultaneously creates. Conversation defines the people who take part in it and the circumstances in which it occurs (and vice versa). Goffman also recognized that in interaction there is a front stage and a back stage. The 'front' we present to others is conditioned by how we think people expect us to behave and what we are aiming to gain from the exchange. What is occurring backstage is another thing altogether: more informal, even irreverent, and more in line with our 'true' feelings. It is like a Japanese Noh play: we follow the action with conviction even where the mechanics would be plainly visible, if we chose to focus on them. Nothing could better illustrate the dichotomy that we instinctively take for granted between private experience and performance. It is through this translation that we turn the universe of subjective experience into an objective reality. The objectification of experience is the necessary pre-condition for the existence of an organizational text. This reflexivity of communication is, in other words, a consequence of the interplay between text and conversation that we have postulated as fundamental. As for the identities of people, as agents using language, we can explain them in one of two ways, depending on whether we take our inspiration from Burke and Goffman, as characters in a play (a dramaturgical theory), or from Propp, Levi-Strauss, and Greimas, as personages in a story (a narrative theory). Whichever explanation we choose, we are recognizing that a person's acts are not just isolated doings, meaningful in and of themselves. They are part of a continuing unfolding of events which collectively contribute to our understanding of a reality that transcends the purely phenomenal because it is not interactional, but textual, even though it is realized through interaction. They are incidents in a dramatic/narrative process, and it is the structure of the process that is meaningful. We cannot tell directly from a person's actions what they mean, because identity is relational, and the relations in communication are
the linking of the transactions to form a network. A network of transactions is a story, or the plot of a play, or - and here the step is critical - an organization. Organization unfolds as process, but its meaning appears only from our reading of its processes - the translation of its conversation into its text. The text - our understanding of the organizational 'story' - in turn informs the interaction and structures its transactions. The implicit text of organization defines persons and circumstances and how to act them out, but it is only from participation in the transactions themselves that one can infer their immanent structure. (The newly named dispatchers of chapter 2 not only had to master, as much as they could, the old story of organization but had also to persuade their opposite numbers to accept a new 'scenario,' with a different plot or story line.) Whether we take an organization to be a conversation or a text is a matter of perspective; an organization is both. It is in this sense that organization is a communicationally constructed entity. It does not antedate communication; it is, from a conversational perspective, the structure of communication, or what Haley (1976: 101) calls its 'patterned, redundant ways of behaving' and hence the source of communication's 'status and precedence and inequality' and of its 'hierarchy.' When you have understood the structure, you have decoded the text. Because it is reflexive, it is constantly in evolution as people come and go, and the transactional arrangements adapt to new constraints. It takes time for organization to coalesce, and even then it is never static. Seldom does the shifting structure of an organization bear much resemblance to its official depiction (Mackenzie 1978; 1986). The study of police dispatching reported in chapter 2, for example, was undertaken shortly after the initial implementation; if we were to return later, presumably a new organizational equilibrium would have been attained, within which the organizational identity of the new dispatchers would be settled, for better or for worse, but even then only for the time being.

Premiss 5 • Information is a property as much of the transactional situation as of the message.
Our treatment of the concept of information is rather different from its interpretation in network theory. The latter takes 'information' to
be a property of the message. The information content of the message is the extent to which it contributes to the receiver's knowledge about the state of the world that the message describes. We see it as all this, but also more. Information is also a transaction, occurring within a communication situation that involves a giver and a taker and a relationship that links both in a system of mutual obligations. 'Inform' is, after all, a transitive verb, with subject (the informer), object (the informed), and indirect object (the information). The informed is so because he or she has been affected by the exchange (Austin's perlocutionary effect). The information in an exchange is thus a secondary property, structurally speaking, of the event; it is already embedded, syntactically, in a subject-object matrix, which is, at another level, a transaction in its semantic reading. The act of informing (which is necessarily a speech act, since it can occur only within the speech situation) is part and parcel of the context of 'doing.' Information has value because it is a guide to intelligent action. The anxiety of the police officers described in chapter 2, deprived of their habitual source of reliable information (as they saw it), was occasioned by their suspicion about the reliability of the new source. Like any other human transaction, information has a value. Earlier, we argued that transactions are intrinsically value-adding, because they create identities. Here we are asserting that they are also extrinsically value-adding, since they occur within a context and create value. The giver actually gives, the taker takes. Value is transmitted. In the chapter that follows, we return to this concept, which is not only communicational in its implications, but also economic. If we are to understand properly the internal dynamic of organization, in today's information context, we must grasp the transactional meaning of information. In conventional organizations with salaried employees, information remains a virtual transaction; it becomes, when the compensatory rules change, increasingly real. Transactional theory emphasizes that in the understanding of a communication, the features of context - what the situation is, who the people are, what has happened before, what the probable intentions of the people involved are - all get fed into a process of interpretation. Messages do not 'carry' information; they indicate, partly by the way in which they are structured, partly as a function of the context, what is informative in the unfolding situation. Information has to be constructed on the basis of the message, using an interpretive frame of explanation which supposes prior knowledge of the
situation and some understanding of its dynamic (Barwise and Perry 1983). In the making of information, the informer and the informed often put their collective heads together to assess the value of the data that they have at their disposal. They negotiate. Yet the value of the information is also a function of the reliability and trustworthiness of the source, so that beyond the joint interpretation there is an individual act of judgment, in which the properties of the transactional situation also figure. Both partners to the exchange compute the value of what has transpired and act accordingly in the future. The most important information in the dispatching situation described in chapter 2 was the proximity of a car to the location of the incoming call. But no amount of computer analysis could circumvent the reality that 'proximity,' in the context of police work, was subject to interpretation. The dispatchers also soon discovered that they could not convey the 'objective' information on which they based their decisions without simultaneously passing on information about themselves (not 'police minded' enough). Further, the information that got 'transmitted' was an amalgam of the two, not a separable quantity, abstractable from the speech situation. The system developers of the other case study in chapter 2 made their own discoveries of the 'situatedness' of the information on which they were basing their design decisions. But they were disappointed, time after time, as they came to realize that their informants were themselves embedded in a transactional system, which the developers had to understand at least partially if they were to evaluate 'information' properly. This perspective borrows from a school known as 'ethnomethodology' the assumption that communicative behaviour is 'indexical' - a passing source of documentary evidence which provides clues as to what is going on and how to interpret it, but no more (Garfinkel 1967; Leiter 1980; Handel 1982; Heritage 1984). Making sense of experience requires active inference on the part of the participants to the conversational process: hermeneutics is a science of the everyday. The meaning of any act can be divined only by reference to the circumstances in which it occurs, one's knowledge about the intentions of its participants, and the probable outcome of the sequence in which it figures. One of the most pervasive illusions in the literature on computerization is the notion that information is something to be stored and forwarded, as though it were just another packaged good. It is not. Information is a property of a situation, not to be understood other
than in the context of one's own and someone else's purposeful activity. Beyond its utilitarian value, there is a transaction, and since transactions are the fabric of organization, it is out of information 'flows' that organization is created.

Premiss 6 • Actors who take part in conversation can be either micro-actors or macro-actors. Either way, they still follow the same conversational rules.
Most empirical study of conversation has been preoccupied with the face-to-face situation and with limited numbers of participants. The issue of organization has not been considered particularly salient. If we mean to use the conversational metaphor for the analysis of complex organization, then we have to take account of the fact that in an organization the actors are often corporate actors: presidents and vice-presidents, sergeants and lieutenants, software developers and users' representatives. How does a corporate actor come to have an intention (and hence the ability to exert illocutionary force)? In Austin's theory, the illocutionary force of an act is the communication of an intention. Unless the initiator of a speech act can be presumed to have an intention, and to be thus capable of understanding a commitment, there can be no illocutionary force, no speech act, and no transaction (and, hence, no communication and no organization). Only when corporate actors come to have an intention are they entitled to enter the transactional arena. According to Greimas's distinction between actor and agent discussed in premiss 3, it is agents, not actors, who are transactionally committed. Reality is microtransactional when we enter conversation as actors speaking for ourselves; reality is macrotransactional when we represent others who are collective actors. The question, rephrased, is thus: how to represent others? An answer is given by Callon and Latour (1981: 279). They refer to a principle which they call 'translation': 'By translation we understand all the negotiations, intrigues, calculations, acts of persuasion and violence, thanks to which an actor or force takes, or causes to be conferred on itself, authority to speak or act on behalf of another actor or force.' In their theory, too, intentionality is central: 'Whenever an actor speaks of "us," s/he is translating other actors into a single will, of which s/he becomes spirit and spokesman' (279).
The effect of translation is to create a source of intentions, associated with an 'actor' but with one proviso. The intentions expressed by a speech act are now taken to be the manifestation of a collective will and to stand for a collective commitment, which every individual who is a party to the agreement is expected to respect. The actor is no longer an individual, but an organization. We now have a macro-actor. Macro-actors behave in the same way as micro-actors; they too engage in conversation and must also conform to its conventions. The difference is not in how they speak, but in whom they speak for. 'Speaking for other people' is a source of power in every type of society, from the most simple to the most complex. What makes possible the emergence of large, lasting centres of power, Callon and Latour argue, is the fabrication of material artefacts. We already knew from Goffman how important display was in the manufacture of a successful persona; when the 'displays' become badges and uniforms, vehicles and buildings, as in the case of the police, the macro-actor has a visible identity that can transcend the bounds of time and space. The actor can exert a will over great numbers of people by engaging them in a conversation that continually reinforces established power by a sustained process of renewed commitment, from generation to generation, and from nation to nation.

Premiss 7 • Conversation has both a micro- and a macro-process dimension.
From premiss 6, it follows that one can read transactions, as the material of organizations, in one of two ways. They are both micro-events, involving individuals in the realization of a unique, never-to-be-repeated conversational sequence, and macro-events, exemplars of a pattern familiar to numerous members of the organization and repeated over and over, in form if not in specific detail. In the first instance, we understand 'conversation' in the usual way, as face-to-face (or equivalent) interaction, occurring in a definite time and place. Under the second reading, we take the notion of conversation metaphorically to stand for the culturally sanctioned pattern of realizing an exchange through symbolically mediated interaction. The macro-conversation renders the micro-conversation 'organizational' and value-laden. The study of conversation - abstracted from its organizational
moorings, as commonly practised in communication research - is context-free and preoccupied with the syntax of interaction - its procedural dynamics. This is not our perspective. Our concern is both semantic and pragmatic. To be a cop, you have to learn how to be a cop. How skilful you are in carrying out the role is not, however, limited by the macro-rules; macro-categories have to be realized in micro-events, and here the mastery of the general rules of conversation is also relevant. There would not be cops unless there were robbers, not to mention ordinary civilians. The micro-processes of interaction involving police officers and their opposite numbers (incidents) constitute and maintain the macro-process of law enforcement. This macro-process gives meaning to the transactional incidents in which individual actors are involved, but the accumulation of incidents, and how they get played out, determine reflexively what law enforcement is like, for a given society. Interaction is fluid and frequently partial, incomplete, full of stops and starts, and elliptical. It works in part because the participants can fill in, by inference, its meaning from their understanding of its macro-transactional significance. Macro-transactions are few in kind and repeated: they are the glue of an organization and provide its coherence. A single transaction is both micro and macro, depending on our point of view.4 Micro-acts are concerned with solving immediate problems, specific to a particular situation; macro-transactions contribute to the sustaining of organization and the solution of collective problems. In ordinary circumstances, the underlying transactional structure is something that people take for granted.5 The phenomenology of daily existence is one of dealing with small crises - what Flores calls 'breakdowns' (Flores 1982). While we are focusing on this and that and the other thing, the larger picture stays in the background, invisible because we take it as presupposed. To turn our attention to the system of transactions that situates us organizationally may even be a symptom of 'maladjustment.' It is the kind of thing that may lead a family organization to the therapist's couch (Labov and Fanshel 1977) and a larger organization to a mediator in a labour dispute. It is to engage in meta-communication (Watzlawick, Beavin, and Jackson 1967). It is the state of affairs that holds frequently during implementation of radical change in an organization, of the kind described in chapter 2.
Premiss 8 • Conversation, as organization, is a multiplex phenomenon. The organization that is created through conversation is a mixture of tight and loose coupling.

The organizational conversation, according to Karl Weick, is a much more local, small-scale phenomenon than we usually think (Weick 1976; 1985). The case studies cited in chapter 2 suggest the same pattern. There has been considerable attention lately to the notion of the 'culture' of an organization. Our own image is that of numerous 'cultures,' each manifested in an ongoing conversation, rooted in a semantic system of identities, more or less interrelated. In this overlay of conversations, one stands out. It is what might be called the 'master conversation,' concerned directly with the organization as a whole. What counts as the 'organization as a whole' is also a matter of semantic definition, not just of transactional dynamics. The organization is a macro-actor which has been accorded legal status, which in turn allows it to transact financial affairs, including remuneration of its employees. The transactional dimension enters through the conversation which determines how the macro-actor behaves. In principle, anyone could be involved in such a conversation, but in practice exclusivity reigns, and the only people made privy to the content of the master conversation are the masters themselves - the senior managers (since if there were no exclusivity, there could be no masters!). What distinguishes the 'master conversation' from all others is the responsibility that it has appropriated to itself to write the text for the organization as a whole. It is thus the ultimate organizational authority. It is this characteristic above all that distinguishes the 'master conversation'; in other respects, it behaves like a subculture, and it is also, for the most part, loosely coupled with the organization that it 'runs.' The word 'authority' has as its root the Latin word for author, 'auctor.' A text is the work of a person (or a group of people); conversation, which it governs, is a collective achievement, made up of all the transactions to which the text refers (and more). The existence of a text is a sine qua non for the existence of all government, including administration (although in societies without formal media, the 'text' may be reproduced orally). The organization as text, then, is the official organization. It specifies the organizational agents and their duties, it describes the activities and their expected outcome,
but it does so in a literary, not a transactional, mode. In the 'master conversation' (and indeed all managerial conversations), the subject matter being discussed is likely to be the state of other conversations, namely those of the lower echelons. That was how the dispatching conversation, in the police example, came to be redefined in the first place, in the form of a project, with goals and means. It is what the system designers thought of themselves as doing: reshaping the organizational conversation to make it more 'efficient.' The tension between conversation and text which we described in the last chapter begins here. A conversational image of organization: a multiplexed transactional universe of discourse When we try to picture an organization in our mind, on the basis of these premisses, we imagine something like a mille-feuille pastry whose layers are patterns of conversation, sustained over time, involving mostly the same people, with recurrent preoccupations. Each layer is organically linked through a sustained structure of discourse to create a culture and an organizational structure of transactions that is, to use Weick's term, 'tightly coupled.' Each conversation has overlaps with others, similar to those linking dispatchers to police officers in cruisers, or dispatchers and police officers to their superiors. Some interconnections are also tightly coupled; some are not. Some links may become very loosely coupled. However loose the coupling becomes, there is nevertheless a common substratum of discourse to which all members of the organization can relate (even if it is only a pay cheque). Thus we can visualize the organization as a single conversation, which is multiplexed, with simultaneous strands of interaction. When we think of the organization in this way, we may pose the question: to what extent is this single conversation, which is the organization, coherent? Does it 'hang together,' in any meaningful way, or is it just a pulling together of conversational fragments, with historical connections, but no real internal logic? And if the latter, might it not be better served by being structured in a new way - by disbanding the old organization, for example, and starting new ones? It appears that many organizations in the changing environment that we have described are beginning to ask this question. It is the question that we take up in the following chapters.
Organization as text

Let us now consider the concept of the organization as a text. First of all, what is a text? A text, obviously, is a written form of discourse, produced and consigned to a medium. Unlike conversation, which can be accomplished only through interaction, a text is a unilateral production of an individual or individuals, although it too, like conversation, is mediated by language. A text is thus not an experience per se (although the reading of it may be) but a recording and an interpretation of it. The issue for us, however, is not so much how the ordinary experience of face-to-face conversation gets transmuted into text, and in so doing makes it possible for the conversation to have a meaning, as it is to understand how an organizational conversation is turned into an organizational text and, in the process, comes to support management and managerial authority. It is the organizational meaning that we are in pursuit of: organization conceived of as a text. The dominant view of text production in linguistics and artificial intelligence until about 1970 was of a generative process which, beginning with a repertory of signs (such as a vocabulary, or lexicon, of ordinary words) and a set of rules for combining them ('syntax' or 'grammar'), would produce a set of 'strings' (or sentences, for ordinary spoken language). We encountered such a system in the second case study in chapter 2. Under this sort of reading of text, the signs that figure as elements in the string can be thought of as referring to real things, and meaning is born in the correspondence between the structure of the textual production and the situation that it describes. This sort of reading takes the syntactic structure of the sentence to correspond to the relations between objects in the world that is being characterized. In the example in chapter 2, the 'boxes' in the data-flow diagram were supposed to stand for functions, or processes, handled by people and/or machines, and the network that turned them into a meaningful configuration, to the links between functions (and, by implication, between people). There had to be a lexicon where each sign was given a reference; the resulting theory of meaning is called a 'look-up' system (Katz and Fodor 1963). Here, surely, in chapter 2, the text was assumed to be the organization.
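A toy rendering may help fix the idea. The fragment below is our own illustration - the vocabulary, the single rule, and the referents are all invented for the purpose - of a generative system in miniature: a repertory of signs, a syntax for combining them into strings, and a 'look-up' table, in the spirit of Katz and Fodor, that gives each sign its reference in an outside world of functions and links.

```python
import itertools

# A repertory of elementary signs, sorted into categories (a lexicon) ...
lexicon = {
    "NP": ["the dispatcher", "the patrol officer"],
    "V": ["calls", "informs"],
}

# ... and a single combinatory rule (a syntax): S -> NP V NP.
def generate():
    """Produce the set of 'strings' licensed by the grammar."""
    for subject, verb, obj in itertools.product(
        lexicon["NP"], lexicon["V"], lexicon["NP"]
    ):
        yield f"{subject} {verb} {obj}"

# A 'look-up' theory of meaning: each sign is assigned a referent in the
# 'real' world - here, the boxes and links of a data-flow diagram.
reference = {
    "the dispatcher": "function box 12",
    "the patrol officer": "function box 7",
    "calls": "link type A",
    "informs": "link type B",
}

for string in generate():
    print(string)
print(reference["the dispatcher"])  # meaning, on this account, is looked up
```

On this account the structure of the string simply mirrors the structure of the situation it describes - which is precisely the assumption that the following paragraphs call into question.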
With time, this view of text, and what it is, has come to seem less persuasive; its point of take-off is generally understood to be the publication of Noam Chomsky's (1957) text on syntactic structures. Further reflection has reinterpreted the signs making up the nodes of a sentence of natural language. Clearly, they are not just arbitrarily selected, neutral markers (like boxes) to be assigned a meaning by giving them an outside point of reference in the 'real' world. Rather, they are part of a semantic system in which signs relate to each other, through a matrix of contrasts and similarities. The hypothesis of one-to-one correspondence between structures of language and the states of the world that they 'describe' cannot be sustained. The link between the complex creations of language and the outside world is not quite that simple, and this is why we use the term 'dialectic' to refer to the cross-over from one modality to another. This line of reasoning led to another idea. Sentences are not best thought of as simple concatenations (sequences formed by placing in a linear order) of discrete signs, according to a recursively applied set of combinatory rules, or 'grammar.' Rather, they are conceptual structures with their own integrity. Sentences convey complex ideas that are not reducible to the sum of the meanings of the words of which they are composed. They portray, in embryo, a social structure, linking functions through processes (J.R. Taylor 1993). This change in our view of how language works, and texts are constructed, was in turn accompanied by a dawning realization among philosophers, linguists, workers in artificial intelligence, and psychologists that the old Gestalt psychology idea was right. The notion of a situation ought to be a logical primitive, since we perceive it holistically, in a single insight, rather than building it up synthetically, detail by detail (Barwise and Perry 1983; Barwise 1989; Devlin 1991). Furthermore, the same principle applies to linguistic structures that transcend the sentence. People can grasp sequences of events, or stories, as integral unities and are not limited to treating them as merely composites of discrete events (Rumelhart 1977; Mandler and Johnson 1977). We seem to be endowed with the capacity to perceive structures beneath the surface, on the basis of the mostly incomplete information that comes to us through transient experience. We seem to have what might be called 'archetypal' intuitions. Again, these intuitions are the starting point for a model of organization. The making of a text does indeed involve synthetic construction, using the symbolic material at hand, such as goes into a sentence, but
the result is a meaning gestalt - a pattern, not just a collection of points. There is a translation involved, moreover, and the symbolic production, however well crafted, mirrors only imperfectly the underlying idea, which remains not directly communicable. It has to be filled in by the reader. And this is where art comes in. To quote Minsky (1985: 247): 'How does the writer's craft evoke ... lifelike characters? It's ridiculous to think that people could be well portrayed in [a] few words. It takes great skill to create ... illusions - to activate unknown processes in unknown readers' minds and to shape them to one's purposes. Indeed, in doing so, a writer can make things clearer than reality. For although words are merely catalysts for starting mental processes, so, too, are real things: we can't sense what they really are, only what they remind us of.' The dialectic that we perceive to typify all communication is here made salient. We can 'sense' the meaning of conversation (the organizational 'real thing,' to use Minsky's words) only to the extent that it 'reminds us' of something. The craft of making text is to know how to evoke the right 'reminders,' which in turn are the store of shared past experiences that we use to interpret present reality. The imagery is as real as the world that it mirrors for us. A key concept in this emerging contemporary theory of text is that of frame.

Frames

The notion that we group things together and recognize them as a pattern is part of the received wisdom of the traditional gestalt psychology and phenomenology of perception and cognition. The concept of frame, referring to the structuring of communicative sequences, was, however, first sketched out by Bateson in an article written in 1954 and published in 1955 (reprinted in Bateson 1972). The idea was subsequently picked up and elaborated on by Goffman (1974) and Minsky (1975). The concept has continued to intrigue researchers in a variety of fields, including public policy (Schon 1979), organizational analysis (Bartunek 1988), and family therapy (Keeney 1990; Chenail 1991). Because of Minsky's article, and the work that it inspired, it is also part of the lore of artificial intelligence. Bateson was influenced by a distinction that Bertrand Russell drew between a set and the members that compose it. Russell had argued for a theory of 'logical types,' claiming that confusing a set and the
elements of which it is made up creates the conditions for paradox (Who shaves the barber who shaves all the people - without exception - who do not shave themselves?). As Bateson saw it, a set functions like the frame of a picture: it separates clearly what is in (to which we pay attention, and in which we can look for meaning) from what is out (mere background, or context). We frame communicational interaction, as noted in premiss 1, by 'bracketing,' or isolating, those parts of behaviour that contribute to the transaction and those that are merely ambient noise.6 Since the informational content of a transaction can concern the transaction itself ('What I'm saying to you right now is not an apology, it's merely an explanation'), Bateson proposed to keep the logical levels unambiguous by introducing a notion of meta-communication (i.e. communication about communication). Since management is always about the organizational conversation, for example, it is to this extent intrinsically meta-communicational. From these insights, Bateson was led to speculate on the nature of learning. He identified three kinds of learning. At the level of zero learning, we respond to events in the environment mechanically, following routines dictated by a well-established frame. This is equivalent to data-processing: keyboarding data from a personnel application into a computer, for example. Learning I is where, within a well-understood situation, we nevertheless respond intelligently to what is happening and adapt to events as they unfold. We learn from experience, in the usual sense of that expression. Learning II (which he also calls 'deutero-learning') has a more radical expectation - that we not only adapt to the changes in the content of the frame - the picture, in effect - but that we step back a pace and reframe, discovering not just new meaning but new kinds of meanings as we combine the elements of our experience into other and novel groupings. This last distinction of Bateson's is one to which we shall return in chapter 7. Minsky's ideas on frames are somewhat more operationally oriented, as befits a labourer in the vineyard of artificial intelligence. Borrowing from the linguistic school of generative semantics of the time, he observes that the structure of a sentence provides a frame. We can interpret a sentence such as 'Jacques drove from Montreal to Toronto on the Trans-Canada' because it has a buried semantic substructure, whose nodes correspond to semantically primitive concepts
such as 'actor,' 'origin,' 'destination,' 'trajectory,' and 'vehicle.' As Minsky (1985: 245) sees it, 'A frame is a sort of skeleton ... with many blanks or dots to be filled.' We already have the skeleton in our head, through our familiarity with language, and the reading of a given sentence allows us to 'fill in the blanks' to grasp a present reality - to be informed, in other words. Any blanks not specified in the actual text we automatically fill in by making default assumptions based on our knowledge of the typical case. For Minsky, the frames are the structures activated in order to make sense of a perceptual experience. If Minsky tends towards formalization, Goffman moves in the opposite direction. His ideas on frames and framing are saturated by a sociological preoccupation with the structuring of small social groupings. From a starting point of the analysis of strips of activity involving human interaction, Goffman delves into another idea of Bateson's - that the same sequence takes on a very different meaning depending on whether we frame it as 'serious' or 'playful.' This leads Goffman to postulate 'primary' frameworks which, in our system, would be equivalent to transactions. How the transaction is interpreted, however, depends on how it is 'keyed'7 (as the real thing, or as play, for example). Goffman observes that the keying can itself be rekeyed to produce what he calls 'lamination' - frames embedded within frames. The concept of frames is useful, but its exact definition tends to remain elusive. It is not the kind of term, everyone agrees, to which we can easily give an exact operational definition. It is an obviously holistic way of perceiving pattern without paying attention to all the details. It is perhaps best understood through the use of examples. A courtroom, for instance, is a frame: we recognize it as a place where a certain genre of communicational sequence is played out repeatedly, always obeying the same procedural rules, even though the matter under discussion changes from case to case. It is a frame that we associate with a physical setting, distinctive costumes (though less so than once was the case), invariable rituals ('All rise,' 'Do you swear to tell the truth, the whole truth, and nothing but the truth?' 'Objection, your Honour,' 'The court will now adjourn,' 'Has the jury arrived at a verdict?'), and a long tradition, culminating in recognition of a privileged profession, with a very distinct sense of its own identity. In their own ways, classrooms, restaurants, automobile garages, family picnics, customs inspections, and even brothels all partake of the character of a frame; as soon as the situation is identified, we understand
where we are and pretty much how people are supposed to behave. Frames may be embedded; there are frames within frames. The courtroom has many sub-frames: 'approach the bench,' 'address the jury,' 'cross-examine the witness,' 'consult a colleague,' and so on. The courtroom, in turn, is only one of several frames that make up the practice-of-law frame, or family of frames, depending on the relevant point of view. Management is a family of frames. Because of embeddedness, as Goffman (1974: 8) notes: 'Any event can be described in terms of a focus that includes a wide swath or a narrow one and - as a related but not identical matter - in terms of a focus that is close-up or distant.' These characteristics of focus he describes as 'span' and 'level.' We will return to the issue of focus (span and level) in chapter 7. Frames also have a recursive property. We are not usually even aware that we are framing, because our attention is centred on the events being framed, not on what frames them. But there are times when we are forced to become aware of our frames (when Bateson's 'deutero-learning' becomes possible). When a frame passes from the background to the foreground, to become an object of conscious attention, it ceases to be a frame. The new frame that results is a meta-frame, or frame whose content is understanding frames. This is often identified as an academic frame, although it need not be, since all that is required to activate a meta-frame is to step back a pace and see one's own actions in context. Meta-framing is the exception, not the rule, because we cannot ask what our frames are when we are caught up in the action, since we are usually the last to know. Most of our lives are structured by frames of understanding and behaviour, not all of them communicational. These frames simplify our existence by demarcating the periods of a day from the moment we take a shower to when we read ourselves to sleep at night. Practices are always framed; frames are part of what gives them their constancy, since the discursive content may vary from moment to moment. Administration, as a practice, is also a set of frames. Frames, in that they circumscribe agreed-upon correct practices by giving them a context, are more than a system of etiquette or a guide to personal behaviour. They are rooted in a community and a structure of discourse bounded by historically derived values. Frames provide a constant setting for an otherwise variable focus of attention and interaction. They are not operational: they do not prescribe
behaviour, but identify the situation in which it occurs (Barwise and Perry 1983). They do not deal with process, as such. A concept of practice that is consistent with the notion of frames, but that does specify behaviour, is that of 'script' (Schank and Abelson 1977).

Scripts and frames

A script, as Schank and Abelson (1977: 41) define it, is a 'structure that describes appropriate sequences of events in a particular context.' Since Roger Schank is one of the best-known figures in the field of artificial intelligence and expert systems, his use of the term 'structure' can be taken to read programmable structure, or algorithm. It specifies the 'sequence of events' as the behavioural equivalent of computer routines of the kind proposed in his and Abelson's book. Their term 'particular context' is what we refer to as a frame (as their examples illustrate: a restaurant, a bus ride, a football game, a birthday party). Scripts are prescriptions for action. Learning to be a lawyer, a doctor, a garage mechanic, or a bureaucrat is to have mastered the scripts that distinguish these careers from each other. At the same time, scripts are mental schemata; it is easy to see the fascination of scripts for people in expert systems, because they can potentially be translated into programs which capture some of the properties of an event but are also components in knowledge systems. Most scripts, in real life, are not exactly programmable (any more than communication itself). Real-life scripts are open-ended: like improvisational theatre, they establish some routines but leave room for innovation. Possession of a certain amount of specific, detailed knowledge about a situation to start with reduces the load of information-processing. One does not want to be left 'wondering about frequently experienced events' (Schank and Abelson 1977: 41). Because frames are linked to scripts, it follows that information is often conveyed as much by context (or what Barwise and Perry 1983 call 'situation') as by content. The unspoken script knowledge about the event as it transpires that comes from recognition of the frame is equivalent to the default conditions in a typical computer program, as Minsky argued.
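Schank and Abelson did not, of course, present their scripts in a modern programming language, and nothing in what follows is drawn from their book. It is a minimal sketch of our own, in Python, with invented slot and event names, showing how a Minsky-style frame (a skeleton with blanks filled by default assumptions) and a Schank-and-Abelson-style script (a routine sequence of events, open to improvisation) might be rendered as data structures:

```python
# A minimal sketch, not from Minsky or Schank and Abelson: all slot and
# event names below are invented for illustration.

TRIP_FRAME_DEFAULTS = {          # Minsky's 'skeleton' with blanks to fill
    "actor": None,
    "origin": None,
    "destination": None,
    "trajectory": "highway",     # default assumption: the typical case
    "vehicle": "car",            # assumed unless the text says otherwise
}

def instantiate_frame(defaults, **observed):
    """Fill the frame's blanks from the text; unfilled slots keep defaults."""
    frame = dict(defaults)
    frame.update(observed)
    return frame

# 'Jacques drove from Montreal to Toronto on the Trans-Canada'
trip = instantiate_frame(
    TRIP_FRAME_DEFAULTS,
    actor="Jacques",
    origin="Montreal",
    destination="Toronto",
    trajectory="Trans-Canada",
)
print(trip["vehicle"])  # -> 'car': supplied by default, never stated

# A script: an appropriate sequence of events in a particular context.
RESTAURANT_SCRIPT = ["enter", "be seated", "order", "eat", "pay", "leave"]

def run_script(script, improvise=None):
    """Real-life scripts are open-ended: routines, plus room to innovate."""
    for event in script:
        if improvise and event in improvise:
            yield improvise[event]   # the improvised departure
        else:
            yield event              # the routine, default path

print(list(run_script(RESTAURANT_SCRIPT, improvise={"pay": "split the bill"})))
```

The point of the sketch is the one made in the text: the 'vehicle' slot is filled by default, never by the sentence itself, just as most of what we 'know' about a restaurant visit is supplied by the script rather than by observation.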
Metaphor and frames

Where do frames come from? Minsky's answer would be that the semantic structures of a language are themselves a source of frames: they supply a set of templates by means of which we can locate the events in our world. This is a powerful concept, with an honourable tradition dating back to Benjamin Whorf's hypothesis that different languages lead their speakers not only to converse differently but actually to see the world differently (Whorf 1956). It is a tradition that has recently been given a new direction and thrust by George Lakoff, who claims that language works on the basis of metaphor in order to make us always see one thing in terms of another (Lakoff and Johnson 1980; Lakoff 1987). While these are ideas to which we generally subscribe, the restriction to linguistic traits as the primary source of metaphor seems to us too limiting. For one thing, it is evident that in the understanding of a new situation our first instinct is to see it as the playing out of a situation with which we are already familiar. We have recourse not just to linguistic, but also to experiential, frames (although we freely admit the influence of language in having 'framed the frame,' as it were). 'Oho,' says the crafty veteran, 'I've been in this situation before.' For another thing, as Gareth Morgan has reminded us, there are powerful images to which we subscribe, typical of a certain era, that we use to frame our current experience - images such as machine, or organism, or nervous system. These too are a source of primary frames and take on particular significance in the context of organizational studies and management.

The making of an organizational text

When people make a text, they are starting from a frame. Writing and quilting are part of the same family of human activities. The notion of text is itself already a frame. Nowhere is this axiom more true than for the making of organizational texts; if there were no frame, we would not know where to start, so complex is the object that we are considering. The frames are themselves rooted in a metaphor: the organization becomes a comprehensible phenomenon because we are able to see in it the shape of something else, which we find easier to describe to ourselves. Once we have the frame, we have, as Minsky put it, a 'skeleton,' or template, with blanks to be filled in. Since texts are not only descriptive, but prescriptive, the 'blanks' lend themselves to scripting: job descriptions and procedures, for example. The text itself has to be written down, using the apparatus of language - a vocabulary and a syntax. Skill at turning the frame into
a text, using language and graphics (both of which, because of their cultural density, interpose their own frames of understanding in the process), is, as Minsky said, a matter of artistry. And whether the text at which we finally arrive will be taken to be a faithful representation of the other organizational reality, the conversation, is quite a different question.

Conversation and text: the dialectic

In all communication, we claim, conversation and text stand in a dialectical relation: each informs the other. The difference between the micro-world of casual conversation and the structured universe of large organizations is the immediacy of the coupling from one to another. The degree of immediacy is in turn an effect of the institutionalization of vertical communication; errors of interpretation that are susceptible to instantaneous correction in small societies receive more attenuated feedback in modern, complex organizations. Because the texts of an organization carry authority, they, and their verbal interpretations by people in positions of command, are the instrument by which the organizational conversation is governed. Because the conversations of the organization are its basic reality, they are in turn the criteria by which one can judge the legitimacy of the organizational text. Yet, in the normal course of events, there may be an appreciable slippage between the two. Every organization is governed at best imperfectly, if for no other reason than the unpredictability of natural events. No author of an organization, however foresighted, can look very far into the future. Confronted with unexpected variety, the agents in the field have to innovate, and, as they do, they necessarily transcend the bounds of the given text. They learn, in Bateson's formulation, and as they do they render the original text obsolete. Furthermore, before it is effectual, authority, as set out in the text, has to be authenticated through interaction, as again chapter 2 well illustrated. Governing teams, however, often carry on in power even when the legitimacy of their text could reasonably be called into question. They count on the inertia that is built into transactional systems, the secrecy of their own managerial conversation, and the credibility of the abstract organizational identity, as an icon that commands the respect and loyalty of members independent of how it is represented, to keep them safely in office, however shaky their performance.
As Weick and others cited in chapter 3 pointed out, the conversations of the administrators and the administered are at best loosely coupled, and the reign of text is merely approximative. The implementation of new communicational technology most notably transforms the coupling of levels, both in substituting a new kind of text for the old, and in its impact on the actual conversation. The goal of office automation was a rational one: to reduce slippage between text and conversation by specifying routines in such detail, and monitoring the subsequent behaviour so closely, that the conversation would have to conform to the text. The case studies of chapter 2 furnish an example of how this can work out in practice. Such tight coupling between text and conversation does not necessarily improve the performance of the organization, as the police case clearly showed. The algorithm that decided which car to dispatch where turned out to lack the elementary intelligence of even inexperienced human beings in an unfamiliar new context; it sent patrol cars off on useless missions, while the overtime bill skyrocketed. The supplementary effect was to invalidate the authority structure that the managerial text had prescribed. This was surely not one of the predicted benefits of automation. Similarly, for the second case study, the meticulously scripted project of the software planners failed, time after time, the elementary test of good sense, when it was translated into operations. These are, however, isolated cases. What can we say about office automation, on more theoretical grounds, using the concepts that we developed in the first part of this chapter?

Re-evaluating office automation

The first evident fact of the office-automation process is that it too is based on a metaphor and establishes a set of frames, which in turn are used to justify its scripts. Historically, the metaphor of a communication network came into general favour in the 1940s and 1950s as the result of work conducted at MIT, in part by a student of Lewin's (Bavelas 1948; 1950), in part by a mixed group of psychologists and systems engineers (Luce, Macy, Christie, and Hay 1953), who were influenced by Claude Shannon's then recently published monograph on message flow and the measurement of information (Shannon 1948). This early work generated an extraordinary flourish of research activity, and by 1965,
when Guetzkow wrote his review of progress towards a science of organizational communication (Guetzkow 1965), the network metaphor had become accepted doctrine, later even to be elevated to the status of a 'paradigm' by Rogers and Kincaid (1981). To recapitulate, network theory conceptualizes the organization as a set of nodes, each standing for a potential source or destination of information (either human or machine), linked together by a configuration of channels for the transmission of information from one node to another. Each node is visualized as a receiver of input data which it processes into informational output. The resulting pattern is seen to be the structure of the organization. To our way of thinking, this 'text' describing the organization is also a characterization of the conversation of the organization. But consider some of the differences from our own 'text' (and our theory of the organizational conversation). (1) The 'transactions' have become 'transmissions.' Since they are no longer conceived of as acts mediated by speech, the intentionality of the source is taken to be irrelevant, and any effect other than reception of a message is ignored. This being so, the people in communication are not thereby committed morally (by mutual obligations), and control over behaviour is no longer an intrinsic property of communication - occurring within transactions as a dimension complementing that of information exchange - but rather something to be imposed externally. On the principle that internal controls are more powerful than external, precisely because of their moral force (recall the police case study), the net effect is diminution of authority. (2) The change of perspective from transaction to transmission has wished out of existence the self-organizing properties of the working groups, negating any opportunity for collective learning and hence ways in which the organization can adapt to changing circumstances. A typical initiative by management is to instigate a version of 'quality circles' in order to encourage individuals to contribute their original ideas on how to improve corporate performance. These campaigns are usually thought of as a means to improve 'communication with employees.' They also usually fail, because they presuppose that it is the individual nodes that learn; learning, as Bateson argued (Bateson 1972: 298), is a property of the transaction. (3) Another casualty of the transition from a transactional to a network theory of organization is the creation of organizational
identities. Again the assumption is made that the identities are determined from outside and coincide with nodes in the network. In our theory, identities are relational in character and have to be sustained through interaction. The software development described in chapter 2 came to grief above all on the issue of identity, which it found impossible to resolve other than by inventing an identity called an 'originator.' Can you imagine coming home and telling the family you have just been hired on as an 'originator'? The very idea boggles the mind. (4) In network theory, nodes are information processors; in transactional theory, nodes are transactions. Networks of transactions are in effect stories, with characters, circumstance, motives, and a progression from state to state that is comprehensible to our naturally endowed narrative imagination. Such organizational stories make it possible to understand events as they occur and to orient ourselves to them. They underlie the capacity to learn. Under a network interpretation of organization, the only meaning is extrinsic, part of the frame of reference of the designer, and no part of the experience of the people in the net. (5) Network theory lacks the property of reflexivity and hence of auto-regeneration. (6) Network theory takes information to be non-indexical. What this means in practice is that people in networks transmit data, not information. In the absence of a capacity to evaluate data in context, there can be no real intelligence, no capacity to take account of particular circumstances. When we put all these differences together, the net effect is to create out of office automation a strait-jacket that forces people into a totally artificial way of communicating, without elasticity, without the capacity to learn or take account of circumstances, without meaning, either for the individual or the organization itself - bureaucracy in the purest sense. It is a regime that senior managers can impose only by force majeure and the ever-present threat of punishment for any deviation from the norm. It is tolerated, above all, because of fear of loss of employment. It has to be imposed from the top down, and, like autocratic systems everywhere, it generates a subterranean organization of resistance. The transactions that are inherent to spontaneous processes of organization through conversation don't go away; they just go sour.
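The contrast between the two 'texts' can be made concrete in code. The following is our illustration only - no such formalization appears in the network literature, and all class and field names are invented. In the network model, a node merely receives and processes data; in the transactional model, the basic unit records speaker, hearer, commitment, and frame, so that obligation and context are intrinsic to the exchange:

```python
from dataclasses import dataclass

# Network model: a node is an information processor. Intention, mutual
# obligation, and context play no part; reception is the only effect.
class Node:
    def __init__(self, name):
        self.name = name
        self.inbox = []

    def receive(self, data):
        self.inbox.append(data)                  # data in ...

    def process(self):
        return [d.upper() for d in self.inbox]   # ... data out

# Transactional model: the unit of analysis is the transaction itself,
# a speech act that commits two parties and counts only within a frame.
@dataclass
class Transaction:
    speaker: str
    hearer: str
    act: str          # e.g. a request, a promise, an explanation
    commitment: str   # the mutual obligation the act creates
    frame: str        # the situation in which the act is interpreted

dispatch = Transaction(
    speaker="dispatcher",
    hearer="patrol car 7",
    act="request: attend a disturbance",
    commitment="the car is now answerable for attending",
    frame="police dispatch",
)
```

Nothing in the Node class obliges anyone to do anything; that, in miniature, is the difference between transmitting data and entering into a transaction.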
Why was automation such a popular idea?

Technology has to be packaged. It is seldom presented for sale simply as naked machinery: it must be given a functional meaning, for some particular community of users, before it becomes marketable. The motivation behind the push for office automation in the early 1980s was to introduce computer-based tools as substitutes for previously non-electronic office machinery (such as typewriters, calculators, copiers, and mail transmission). The size of administration in the modern economy created an enormous market to be tapped, and great profits to be made, if managers could be persuaded to buy the new machines. But the packagers needed a convincing metaphor, and this is where 'automation' came into the picture. The vendors were promising, in effect, that the sorts of great gains in productivity that followed from introduction of assembly-line technology on the factory floor could be duplicated in the office. By calling the products 'automation,' the vendors and buyers were committing themselves to a vision of what administration is and how it works. Office automation, in other words, was an attempt to transpose a rationality or logic of the shop floor to the domain of management. It was like trying to equate the 'processing' of information, and the 'making of decisions,' with the processing of iron ore, and hamburger meat, and fermented grapes, and the manufacture of cheese and Hondas. Unfortunately for the fate of a good many office-automation adventures, whatever it is that managers do when they 'process information' and 'make decisions,' it does not much resemble standardized industrial processes. Information does not go in, nor decisions come out, in the neatly mechanical manner that we have come to expect of manufacturing. The automation advertising portrayed individuals happily engaged in 'massaging' data and communicating electronically. Yet the examples of recent management literature discussed in chapter 3 convey a quite different picture: administrators spending most of their time talking, usually face to face, resolving problems as they occur by discussion, getting people 'on board,' calming, persuading, motivating, inspiring, correcting, reasoning, pleading, chastising, brokering, and the hundred other things that people do when they get involved in the social dynamics of an office. No 'data.' No 'processing.' This other view of management, to use Brunsson's (1985) terminology, is 'action rationality' at work, not 'decision rationality,' and the relevance of technology to action rationality is far from transparent, we discovered.
The office-automation image was centred on the accomplishment of practical transactions, with a measurable utilitarian outcome; what it left out of account were the interactional dynamics that accompany the conduct of human affairs and make transactions possible in the first place. Its appeal to senior managers was subtle. It offered a technical solution to a real problem: declining productivity in North America. Technical answers seem somehow safe, practical, especially when the alternative is not clear. There is probably another reason for the appeal of office automation. Ideologically, it encouraged an image of the working part of the organization as an object to be manipulated and run. It encouraged the feeling of separation between conversation and text and the idea that, with a better text, the people at the top could finally control their organization and make it run the way that they wanted. To people of an authoritarian stamp (note that root 'author' again), this was an attractive possibility. Unfortunately, uncritical reliance on a machine model of managing blocked, while it was dominant, investigation of a basic question: 'What is an organization, and how does its structure reflect the communicational imperative?' It offered false security. By restricting the circle of questioning, it inhibited people from asking whether the events of the past few years - the kind of trends reported in chapter 1 - might not be pushing us in the direction of new organizational forms, not merely reinforcement of the old. This is the question that we take up in the chapters that follow.

A postscript on automation

There was one positive benefit from the office-automation 'movement.' Whatever its limitations, it was a model of communication processes. It made people look not at the way in which tasks were distributed but at how they got carried out sequentially, usually involving more than one department. By doing so, it forced into our consciousness the fact of communication. The processes of communication have received almost no attention in the business schools of North America in our century, and there is no body of knowledge extant on how to analyse the effectiveness of those processes. The natural conversational processes have a tremendous built-in internal resilience, but there is no guarantee of their efficacy, on external criteria. The software designers made us look more closely at how the processes of work were being conducted, and why productivity had become so
problematical, in the face of confused responsibilities, duplication, and tangled patterns of exchange (for an excellent analysis, see K.D. Mackenzie 1978). The office-automation 'solution' may not be the right one, but at least its identification of the problem makes it clearer where we should be pointing our inquiries. We shall take this issue further in chapter 7.
Part Two
Management in the Information Age
5
The changing transactional environment
Space and time, and also their space-time product, fall into their places as mere mental frame-works of our own constitution.
Gauss, cited by Harold A. Innis (1951), The Bias of Communication, 92

I have attempted to trace the implications of the media of communication for the character of knowledge and to suggest that a monopoly or oligopoly of knowledge is built up to the point that equilibrium is disturbed.
Harold A. Innis (1951), The Bias of Communication, 3-4

Introduction
Until recently, bureaucratic management has been assisted in its growth by communication technologies which enabled administrators in both public and private spheres to achieve vertical integration of activities on an unprecedented scale. It appears, however, that developments in telecommunications and computing have now outdistanced these communicational lines of control and integration and may even be working against them. The technologies are not only accelerating the pace of the evolution in the economy and the workplace that we sketched in chapter 1 but are affecting its shape. On the one hand, developments in telecommunications are increasing transactional interdependence and connectedness, on a global scale; on the other hand, the way in which computer technology augments individual knowledge and skills is leading towards organizational fragmentation and atomization.
A century ago

Let us look back to the origins of the bureaucratic form of organizational management in North America in the late nineteenth century. There is now a considerable literature dealing with that very critical historical period which witnessed the genesis of the modern form of organization. Authors such as Chandler (1977), Williamson (1977), and Beniger (1986) have explored, from a mainly economic perspective, the changing nature of organization during this era, when significant new technologies had emerged. The modal form of contemporary administration came into being at that time as a result of at least two historical developments. First, technologically supported manufacturing processes (the industrial revolution) rose to dominance, and, second, the technology of communication and transportation improved (the communication revolution). The first development made possible economies of scale in manufacturing through introduction of standardized production routines. The second opened the door to extended networks for distribution of a range of goods and services to a large and heterogeneous population, as opposed to the local, single-product distribution systems typical of earlier industrial societies. Both developments took time: the technology had been evolving a half-century before the organizational consequences became fully visible. An administrative system was required to make extended distributional networks functional. It had to be able to assure uniform accounting and billing in order to track the expanding set of transactions, install procedures for ordering and delivering supplies to maintain a far-flung network of outlets, and develop procedures for the hiring, training, and control of personnel to maintain both the production and distribution of goods or services. Management emphasized maintenance of a standard level of product (either a good or, increasingly, a service), in conformity with the dictates of the market, a product that was advertised well enough to attract a large clientele in the face of competition from other suppliers. The model of administration that best conformed to these exigencies has been called variously rational, functional, and bureaucratic. It encouraged conformity to certain crucial norms of standardized accounting, billing, ordering, customer relations, personnel policy, and so on. This normalization was achieved, by and large, and the administration worked, even though, in its early years, the organization
almost certainly remained much more local in character, and culturally diverse, than the bureaucratic model made provision for (in its predilection for 'scientific' principles over 'cultural'). Despite the lack of official recognition (much less approval), this sub rosa persistence of local organization was in fact, if we credit the arguments of Weick and others in chapter 3, a source of strength, not a flaw.1 The economist, as well as the communication theorist, can conceptualize the modern firm as a historically situated aggregation of activities of production and distribution, linked together to form patterns of transactions in order to capture an economic advantage. Transactions, organizationally speaking, have no single manifestation: they could, for example, be market transactions. Before about 1890, in fact, they were by and large market transactions.2 The 'bureaucratic' set of transactions making up the body of the modern enterprise came to be conceived as components of a unified, centralized organization only as an indirect result of the combined effects of the discovery of new sources of energy, application of scientific knowledge to industrial technology, and improvements in communication. Hierarchical, or vertically integrated, methods for organizing economic activity proved to have an economic advantage over the prevailing market mode. They led to a form of organization that replaced the transactions of the market with transactions involving incentives (salary), supervisory control and measurement (accounting), and planning (staff functions). The prototypical case was the nineteenth-century railway. The motive behind the growth of enterprise was economic (the requirement for capital investment), but telegraphy made possible a system of station agents all working as employees of a single firm, and made it economically advantageous. While such new managerial 'instruments' may have reduced individuals' autonomy in business transactions, they also kept costs down and supported greater productivity (and hence more affluence, on average) than did the market mode. They thus represented a historic shift of transactions out of intermediate goods markets, both for manufacture and for distribution, and into the firm, involving 'the substitution of internal incentive, control, and resource allocation devices for functions that are traditionally associated with capital markets. Transactions that could have been, and at one time were, executed across markets were thus relocated within the firm, where they were executed under common ownership - with all of the incentive and control advantages (and limitations) attendant thereon' (Williamson 1977: 10).
The principle of coordination involved is economic: 'aggregation occurs largely on the basis of the relative costs of alternative methods of managing the information flows required to co-ordinate economic activity' (Osberg 1989: 5). According to this argument, the creation of establishments in the early twentieth century, with congeries of workers reporting to employers, amalgamated into departments, and then into organizations, made economic sense. Firms subject to the rule of the marketplace, and open competition, according to Osberg, react to a combination of factors: 'Boundary conditions of firms are set by the relative costs of market and non-market organization ... Firms will expand or contract (by amalgamation or subcontracting) as a result of the trends in the costs of information processing' (5). Those years witnessed a process of substitution of one kind of transactional procedure for another. This is not to claim that the switch-over to vertical integration was total. Complete rationality would have brought the whole system to a halt. What was achieved was a shift in transactional patterns rather than a complete turnaround. Despite increased integration of activities, according to Williamson, 'no firm is fully integrated' and 'market pressures are still at work' - people move freely, for example, from one firm to another in search of a better situation. As well, 'internal organization and market organization coexist in active juxtaposition,' and managers 'are subject to the humbling limits of bounded rationality' (1977: 19-20). There was also a textual dimension to the transactional shift. Whatever its economical basis, the integrated organizational system began to consolidate in another fashion, this time in the form of the symbols of hierarchy. It was given a name: in the private sector, 'corporation' (and the law was altered to permit corporations to act as persons, in the transaction of business); in the public sector, 'public administration.' Words, as we have seen, are able to create objects in our minds. The very act of naming reified the abstract idea of bureaucratic administration. It took on an appearance of rationality which was no longer altogether constrained by the original considerations of minimizing transactional costs in an open, competitive system. It began to seem not only right, but inevitable - just as had been the case earlier for monarchy and church. Bureaucracy not only made economic sense; it also fitted our ideas in Canada of 'peace, order and good government.'
One hundred years later

A century later, as we have seen in chapter 3, the hierarchical model of administration, based on a rational metaphor of the machine, is becoming somewhat tattered. The competitive advantage enjoyed by large business corporations organized on the basis of its precepts has been eroding, and government administrators face general lack of confidence in their ability to manage public goods. Development of the vertically integrated model of bureaucratic organization followed a revolution in communication technology (transportation, telegraphy, and telephony) begun about 1830. After a time lag, these innovations became tools that permitted administrators to manage and control the integration of economic transactions on a much larger scale than ever before. They made modern bureaucracy possible. Since 1960, communication technology has once again been undergoing a vast sea change - one that dwarfs even the discoveries of the past century. Advances in computing and telecommunications have introduced capacities and pressures whose implications we are only beginning to comprehend. As in the nineteenth century, there is a time lag between such developments and their impact on patterns of organizational transactions. Before we discuss the potential outlines of the present 'fit' between organization and technology, however, a chronological description of recent technological innovations may be helpful.

Four phases in telecommunications and computing

Communication technology could be said to have gone through four phases over the past thirty or so years. During the first phase, computing and telecommunications were distinct from each other. Computing was used mainly for accounting functions such as billing and payroll - after-the-fact, or 'back office,' functions. Other than for improved telephone service, electronic communication played only a minor role, since conventional paper transmission was involved in both the input and output of data (not to mention other, more standard, formal office communications, such as memos, which were also paper-based). In the second phase, computing went 'on-line,' in 'real time,' to
provide the memory back-up for functions such as airline reservations and banking transactions, while simultaneously carrying out the accounting functions that accompanied them. The electronic networks that supported on-line computing existed, by and large, within the boundaries of a single firm or organization (although it might be geographically dispersed) or of a closely associated cluster of organizations, using special (and expensive) communication facilities. This was the period of the dedicated Local Area Networks (LANs), although the latter tended, progressively, to evolve towards regional networks. Office automation was an outgrowth of this era, a company-specific technological package. Its realizations are visible everywhere: in banking, airlines, retailing, manufacturing, and the resource industries. The third phase has had two dimensions. One is a result of the introduction of personal computing, a product of microprocessing in an era of very large-scale integration (VLSI). Personal computing, with its 'spreadsheets' and 'desktop publishing' facilities, has had exactly the opposite effect of its predecessors. It has led to 'democratization' of computer use, to proliferation of incompatible, weakly integrated, but powerful computing systems, accessible to ordinary users and, for all practical purposes, outside the control of the computing professionals or of a firm's office automation system. Telecommunications have been the other side of this third phase. Historically, introduction of personal computing technologies coincided with a similarly dramatic, but less immediately felt, improvement in telecommunications technology. In 1975, the same year in which the first personal computer was marketed, digital switching was introduced into telecommunications, permitting integration of data traffic with ordinary telephone commerce. That development, in combination with the increasing role of satellite and fibre optic transmission, and with increases in capacity and reliability, has made telecommunications a driving force in today's technology, particularly when seen from the perspective of corporations, including government. Telecommunications, like the personal computing technologies, fell outside the control of the old computing establishment, but for exactly the opposite reason. While personal computers are a technology too small to be comfortably managed centrally within the bounds of a single organization, telecommunications are mostly too large.3 Unlike office automation, in which technological implementation was largely company-specific, telecommunications spill over the borders
of any single firm. Most companies, with the possible exception of a few mammoth enterprises large enough to construct their own facilities, rent the service. And even here there has been a change. Since the new nets can be configured to meet specific clients' needs, 'renting a service,' for the larger companies, means more than installing sets and making calls; an organization can now rent whole networks - or rather a portion of the capacity of a network dedicated to the client during the rental period. Since these rented networks use the existing networks without being structurally identical to them, they have been termed virtual networks.4 We are now on the brink of full integration of the standard computing and telecommunications technologies. It is reasonable to begin to consider them as a single integral influence on, and tool for, the conduct of both business and public affairs. This is the fourth phase, which will occur when telecommunications have become as transparent to the PC user as voice telephone communication is now (a prospect being promoted under the title 'integrated services digital network,' or ISDN). Already, dense networks of electronic messaging, linking computers through telecommunications, are part of the working infrastructure of all who presume to call themselves 'knowledge workers.'5 Even to talk of telecommunications as if they were something different from computing is misleading. What used to be three separate technologies - computers, office machinery, and telecommunications (including surface mail and telephony) - are rapidly merging into one. Office automation, for example, is now one component among several in the overall mix. We have almost moved from an era in which computing depended on a centralized facility linked into a limited, configured network within the bounds of the organization to an era of widely distributed computing power and memory, linked together by a universal, non-specialized network, independent of organization, which offers configurable networking as an option. The telecommunications networks can be configured to fit the profile of a user, but they are not inherently limited, technically, to such patterns. We can detect the effect of new communication technology on organization by examining how it affects transactional patterns. At the end of the nineteenth century, we have argued, transactions, enabled by technological change, were transformed from a marketplace to a more hierarchical, bureaucratic mode, a pattern that was consolidated over several decades. In the 1990s, the new mix of telecommunications
and computers is pressing, in two directions, on large organizations, both public and private, that are seeking to maintain their 'competitive edge.' Along one dimension, telecommunication network services are allowing a new, global organizational interconnectedness of transactions. At the same time, the technology that augments individual communicative capacities is splitting traditional transactional patterns into fragments and atomizing organizations.

TNS and pressures towards globalization

Let us look first at the globalizing influences on transactions stemming from new telecommunications services and at several ways in which the organizational consequences of these influences have been conceptualized. Telecommunication Network-based Services (TNS) combine information production, manipulation, and/or distribution with the use of telecommunications and software facilities. Although they are, like office automation (OA), a packaging of functions, or services, rather than simply a technological infrastructure, their telecommunications component, much less present in the OA designs, combines profitability and global access. They are not, in other words, merely office automation writ large. TNS include such facilities as on-line computing services, remote access to databases (both public and proprietary), interchange of electronic data for the conduct of inter-company transactions, flexible client-specific networks on demand for businesses operating in dispersed locations, fully integrated voice, data, and graphics capability, mobile communications and electronic messaging, expert systems and 'black box' money management, and distributed processing and computer-assisted design (CAD) for multi-site design teams. They have come into use in varied contexts and modes, to cut down on inventory (through 'just-in-time' delivery), to manage a production cycle, to support service delivery, to integrate network outlets, to communicate with customers and maintain better intelligence on their needs and preferences, to link with suppliers (ordering and billing), to accelerate the transfer of funds (and, by doing so, deliver new kinds of service), to coordinate distributed world-wide operations, and to feed into planning systems. New 'geodesically' structured networks are available in which 'any one point can get in touch with another, directly or through routes that may have nothing to do with the [political]
jurisdictions that shaped and contained the hierarchical networks of the past.'6 The development of new services on this scale should affect the practice of management. It is less than obvious, however, how to conceptualize such implications. In the section that follows, we discuss two recent approaches to understanding the globalizing implications of TNS, again from an economist's perspective - one using the concept of the information intensity of the value chain, and the other a resource-based approach.

Conceptualizing the organizational consequences of telecommunication technologies

TNS are based on an interweaving of information processing and telecommunications. But it seems clear that, beyond the fusion of technologies, it is the effect of expanded communications that is transformative. Unlike computing systems supported by a local area network, such as office automation, the new constellation of technologies transcends the frontier of the organization and hence opens up the boundary question. Some observers are now arguing that established organizational configurations are being made obsolete, creating the need for completely new alliances of functional centres of activity. Those who had been integrated vertically within a company or government department now find themselves migrating into an inter-organizational environment. The dynamic potential of the newer communication technologies, these people are suggesting, implies not only a realignment of the elements making up whole industrial sectors, but a rethinking of the means of coordination necessary to integrate economic activities. This transformation, it can be argued, is of the same kind, and of the same magnitude, as that of the latter nineteenth century, when the model of organization typical of corporate entities today began to take shape. The arrival of TNS, with their virtual network capacity, pushes economic players toward increased interconnectedness - new alliances. In 'shadow,' or 'virtual,' organizations,7 the components fit together logically, even though the resulting configuration may not coincide with any single, identifiable corporate identity. The availability of such technology challenges established ideas of organizational design. It raises questions about the appropriateness of the way in which we train people to fit into work structures and how we
manage them in consequence. It can also force the regulatory bodies that set the ground rules for the conduct of enterprise to re-examine their current philosophy and practice. We must not turn our backs on what is happening in the hope that it will pass. Because of the increased interconnectedness that telecommunications inspire, structures are being forced to change. When we talk about these technologies, we are thinking in terms not of a management information system in a new guise, but of another dimension of infrastructure, one which incorporates customers, branch and regional offices, and the operational centre into what appears to be a new type of configurational pattern.

Reshaping the value system: understanding sectoral dynamics

In the traditional economics of industrial organization, market structure provided the context for a firm's behaviour, and the combination of the two - structure and behaviour - decided economic performance. A description of market structure has usually included (1) the number and distribution of buyers and sellers, (2) how they relate to each other, (3) barriers to entry of new players, and (4) the threat of substitute products or services. Firms' profits arise from monopoly power (or oligopoly power), where one or a few sellers can restrict output below the competitive level. Firms may be more or less vertically integrated, with more or less diversified product lines. This pattern fits well with the bureaucratic model of management: each firm operated as a distinct unit, within well-defined boundaries. Though fundamental, this kind of model has proven less useful in analysing the kinds of changes engendered by advances in telecommunications. It tends to assume that firms are homogeneous and operating in mainly equilibrial conditions, basing their decisions on perfect information. But in the universe that we have been describing, this is far too restrictive a set of premisses. It is precisely not a system in equilibrium. The traditional model can tell us little about how the market got to where it is now, or about how technological change might alter the settled arrangement. There is no place in it, for example, for the kind of virtual or 'shadow organization' that characterizes the current situation and presages restructuring. Aware of new pressures on transactional patterns, theorists have been proposing alternative ways of thinking about organizations, particularly for corporate bureaucracies. Porter (1980; 1985) has
introduced the notion of a value chain into the classical model of market structure. He conceives of a company as a set of technologically and economically distinct value activities (whose 'value' is determined by what buyers are willing to pay), linked together logically in a structured pattern, so that the way in which one activity is performed affects the cost or effectiveness of other activities. Value-adding activities are all the kinds of operations that contribute to the company's profits. The profitability of a business is determined by the difference between the cost of carrying out the value activities and their market value. Competitive advantage can be gained by reducing the cost of the activities or by enhancing their value (for example, by differentiating products in order to command a premium price). Companies may, of course, engage in diverse kinds of activity, each reflecting a separate market and corresponding to a distinct strategic business unit (SBU). There is thus a separate value chain for each of these value streams. The number of value chains is an index of the firm's scope.
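Porter's arithmetic can be made explicit. What follows is a minimal sketch of our own, with invented activities and figures (and a deliberately crude attribution of buyer value to individual activities); it is not an example taken from Porter:

```python
# A sketch of value-chain arithmetic. Activities and figures are invented.
value_chain = [
    # (activity,            cost, value to the buyer)
    ("inbound logistics",     40,  55),
    ("operations",           120, 160),
    ("outbound logistics",    30,  40),
    ("marketing and sales",   50,  75),
    ("after-sales service",   25,  45),
]

total_cost = sum(cost for _, cost, _ in value_chain)
total_value = sum(value for _, _, value in value_chain)

# Profitability is the difference between what buyers are willing to pay
# for the value activities and the cost of carrying them out.
margin = total_value - total_cost
print(f"cost {total_cost}, value {total_value}, margin {margin}")
```

Reducing an activity's cost, or enhancing its value (differentiation commanding a premium price), widens the margin; that, in this accounting, is all that 'competitive advantage' means.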
The concept of value-adding activities, the grouping of which goes to make up a given economic enterprise, provides an analytical tool to begin to consider how to reconfigure such 'groupings,' or sets of transactions, into new kinds of logical structures, which may or may not coincide with established organizational units. First, however, a brief digression to explore the term 'value-added' may prove helpful.

Information as value-adding

The notion of value-adding is an offspring of the information age. While we tend to talk about the information economy as a distinct sector of activity, in parallel with goods and services, this interpretation is in fact only partially justified. There is, of course, something that could be called an 'information economy' (sale of databases, computing services, broadcasting, and so on). Nevertheless, information and telecommunications more commonly figure as an important component of value enhancement in a wide range of goods and services. This has eroded a number of traditional distinctions. Thus, a hamburger is food, but a fast-food outlet is also a service and, at least as significant for companies such as McDonald's, a distributed information system. While restaurants were perhaps always simultaneously product, service, and information system, the communication part of the equation was so secondary as to be almost invisible. It was a question of local managerial intuition and interpersonal communication: such things as looking after the paper work, ordering supplies, and handling personnel. Similarly, while an airline is a transportation system, it is also a service and, as American Airlines' Sabre reservation system demonstrates, above all a communication network.8 By making communication into a tool for the administration of a new kind of customer service, with a global reach, telecommunication services have been reshaping our ideas about restaurants, food, airlines, air travel and, more important, international patterns of goods and service delivery and hence about how to structure an enterprise. The growing salience of the communicational function has changed industrial competitiveness. The customer may now be influenced more by perception of the quality of the communication service being offered (universal availability, for example) than by the actual good and service itself. For Porter, communication technology can add value with respect to both primary and support activities of a firm. The affected 'primary' activities (i.e. those involved in the physical creation, marketing, delivery, and servicing of the core product or service) would consist of (Porter and Millar 1985: 151) automated warehousing (inbound logistics), flexible manufacturing (operations), automated order processing (outbound logistics), telemarketing and remote terminals for salespersons (marketing and sales), and remote servicing of equipment, computer scheduling, and routing of repair trucks (service). Technology would add value in 'support' (i.e. back-office) activities in the form of, say, on-line procurement of parts, computer-aided design, electronic market research, and automated personnel scheduling and planning models. The effect of TNS on discrete work activities with definable output - centres of production, for example, or back-office functions - can be easily seen. Their capacity to make changes such as creation of new and more effective linkages is much more subtle, however, and harder to evaluate. For one thing, because the way in which one activity is performed may affect the cost or effectiveness of another, linkages create trade-offs that have to be resolved. It is not necessarily clear whether a linkage will contribute to competitive advantage overall, for example, if the greater cost of production caused by use of superior materials results in lower after-sales service costs. Similarly, since linkages imply coordination of activities, careful
management of them (through practices such as just-in-time delivery) is crucial. Unfortunately, the management of these kinds of linkages is what firms have traditionally found most difficult to achieve. (This issue of how to conceptualize the linking of value centres to form chains is one to which we return in chapter 7; it is the essence of a 'hypermanagement' strategy.) However difficult it may be to ascertain the effect of TNS on the efficiency of intra-firm linkages, that complexity pales before the question of their effect on inter-firm activity linkages. As Porter notes, the value chain of a single firm is embedded in a larger stream of activities, which he calls a 'value system.' (Since value systems transcend individual organizational boundaries, they are the basis for what we have been calling 'shadow,' or 'virtual,' organizations.) More than one buyer/seller relationship may be involved in the chain, or system, stretching from primary resource to ultimate consumer. The linking of activities creates inter-firm interdependencies and poses a question of strategic management in an era of information technology: how to create an interdependence that contributes to overall efficiency without becoming the prisoner of a relationship whose benefits may have to be re-evaluated in the near future, when technological change and a shifting cost picture have made yet another arrangement more attractive? We live in an era when, as Porter and Millar (1985: 151) put it: 'Information technology is permeating the value chain at every point, transforming the way value activities are performed and the nature of the linkages among them.' (This is again the issue that will preoccupy us in the next chapters.) The permeation is evident even in functions that Porter would have included as 'support activities.' There are signs, for example, that a software crisis is building in North America. Custom-built systems for large-scale enterprises such as banks make tremendous demands on highly qualified specialists. Few people can build reliable systems composed of some 2.5 million lines of software code (this is the kind of development described in chapter 1). In other words, the back-office accounting function has become a scarce resource and may migrate out of the intra-firm value chain into the larger value system. And there are, of course, implications for the practice of 'out-sourcing' (Malone 1987; Malone, Yates, and Benjamin 1987). Some organizations are closing up their own computer departments, which may employ several hundred people, and out-sourcing their entire
data-processing operation to specialized companies such as EDS, which will build and manage their operation for them. This pattern appears to be increasingly common. One Arthur Andersen affiliate, for example, expected to generate $20 billion in out-sourcing fees by 1995. A large Canadian computer services company estimated that Revenue Canada could cut its data-processing costs by up to 40 per cent by out-sourcing (although it is unlikely to do so, given the confidentiality of tax data). Clearly operations that were once vertically integrated are tending to fly apart, and the apparently growing trend to out-sourcing is merely one index of this new reality.
A 'resource-based' approach
Some further insight into these trends has been provided by Williamson (1975; 1977), who has developed, building on Chandler's (1962; 1977) work, a theory of transaction costs. It posits a hierarchical arrangement of activities, within the bounds of a single administration, as an alternative to the open market. The alternatives are more or less viable, depending on how the costs of transactions balance off against the scale of operations that can be incorporated under a single administrative head. His theory would argue, for example, that companies resort to out-sourcing when the economies of scale arising from in-house organization of the computing function no longer match the prevailing market price. Such reasoning is a reminder that the present situation is extremely dynamic. The entire value system is being reconfigured, with a number of trends at work, some in the direction of hierarchy as a linking mechanism, others in the direction of the market (Reynolds 1989). Eric Clemons (1989a; 1989b) has taken Williamson's theoretical line further, within a 'resource-based' view. He thinks of firms as resource bundles under common managerial control. He sees economic structure as the distribution of resources to activities and the interactions among these activities (roughly equivalent to Porter's 'value activities' and 'links'). The structure of the organization that emerges, following Williamson and Clemons's logic, is subject to the trade-off between production economies and transaction costs. Interactions may be vertical - the flow of goods and services along a single value chain - or horizontal - providing more efficient access to the fixed resources required in a given activity. Both may occur as either intra- or interorganizational linkages.
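The logic of this trade-off can be made concrete. The sketch below is our own stylized rendering of the Williamson/Clemons reasoning, not a calculation drawn from their work; the function names and all the figures are invented for illustration. A firm keeps an activity (here, a computing function) in-house only while spreading its fixed administrative overhead across its own volume of transactions beats the market price plus the cost of negotiating and policing an external contract.

# A stylized rendering of the trade-off between production economies
# and transaction costs (our illustration; all figures are invented).

def in_house_cost(volume, fixed_cost, unit_cost):
    """Average cost per transaction of organizing an activity internally:
    fixed administrative overhead is amortized over the volume handled."""
    return fixed_cost / volume + unit_cost

def market_cost(market_price, transaction_cost):
    """Cost per transaction of buying the activity on the open market:
    the prevailing price plus the cost of arranging and policing the deal."""
    return market_price + transaction_cost

for volume in (100, 1_000, 10_000):
    internal = in_house_cost(volume, fixed_cost=50_000, unit_cost=4.0)
    external = market_cost(market_price=8.0, transaction_cost=1.5)
    decision = "keep in-house" if internal < external else "out-source"
    print(f"volume={volume:>6}: internal={internal:8.2f}  market={external:5.2f}  -> {decision}")

Read this way, out-sourcing of the computing function is simply what happens when a firm's internal volume no longer amortizes the fixed overhead below the prevailing market price - and a shift in either the technology or the market can tip the balance.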
Resources thus become the drivers of structural change. Industrial restructuring is the result of changes in the value of resources. TNS communication technology will enter the resource picture partly through the way in which it alters the value of a resource. Either the technology figures as a new resource itself (such as a computing facility), or it affects the production/distribution economics of another resource (through rationalization of purchasing and distribution, let us say). TNS may also be a mechanism for implementing strategies that become salient as a result of changing values of other, non-technology-related resources. An example cited by Clemons is Merrill Lynch's cash management account (CMA), enabling business customers to handle their banking transactions from their office computer. Either way, economic restructuring may occur. Firms may change their scale of operations (expanding or contracting within a particular market). They may also change their scope (expanding into, or withdrawing from, different markets and industries).9 Finally, firms may change their vertical integration (expanding into, or withdrawing from, activities linked within a single value chain). Each of these possibilities can operate towards greater or lesser integration. As we have mentioned earlier in this chapter, restructuring may be virtual, in that many of the rearrangements occur without changes in ownership. As in the expression 'virtual networks,' with the concept of virtual organization we have in truth entered the information age. These are known variously as 'constellations,' 'value-adding partnerships,' or 'virtual companies.' A diverse pattern of proprietorship is compatible with a single control system (through franchising or cooperative arrangements, typically). In Clemons's theory, the form of resource integration that emerges (ownership, out-sourcing, or cooperation) depends on both the initial resource position and the way in which the technology is used. Obviously, any change in one firm's structure is usually accompanied by related changes in others, especially in today's competitive environment. Clemons concludes by arguing that the resource position of a firm may be more relevant to its long-term survival and profitability than being at the 'leading edge' in information technology. Where a single firm enjoys a clear competitive advantage, through controlling a key resource, TNS may enhance that position. Where firms are at a resource disadvantage, however, they will tend towards out-sourcing
or cooperative development, preferring the former if the player with which they enter into an agreement is not a direct competitor and the application is not critical to the core business. As he says: 'It may become increasingly difficult to ignore inter-relationships between markets and between business units within firms. As competitors continue to exploit potential economies in resource integration, particularly between industries, a much more holistic approach towards planning and strategy may be required' (Clemons and Row 1989: 22).
The vanishing status quo
The part of the puzzle that resists explanation, for many managers, is the precise value of the new technologies and how to evaluate their contribution. If there is one finding that seems to stand out from our reading of the literature, it is that, as the citation from Clemons above indicates, TNS cannot offer permanent competitive advantage to an organization. The advantage of the technology comes from leveraging - turning a natural advantage, the result of non-technological factors, into an enlarged market share by using the technology to sharpen a competitive edge or amplify the effect of an innovation. Clemons has been particularly insistent in arguing this point: 'Instead, the greatest competitive impact of information technology can be achieved by exploiting the strategic potential of the firm's portfolio of business units, in ways all too frequently overlooked by management in its strategic planning and by MIS executives in their search for technology opportunities' (Clemons 1989a: 1). His argument is thus that competitive advantage is conferred on a firm by having access to some resource that can be profitably exploited and that the competition finds difficult to match. Innovations in communication and information technology, considered as a resource, are hard to protect and easy to imitate: patents and copyright are a fragile defence, trade secrets are porous, and most 'state-of-the-art' technology is readily available on the open market. Applications of technology available to a firm are usually equally accessible to competitors. Any temporary advantage flowing from being the first to adopt a technological innovation eventually erodes, and the innovator may even end up paying a 'learning curve' penalty for having been the first in the field. This does not mean that companies can safely ignore the pressure to adapt to TNS, if for no other reason than what Clemons (1989b) has
called the 'problem of the vanishing status quo.' The alternative to investing in TNS is unlikely to be continuation of existing market conditions. Other companies will pick up the innovation, and, when they do, previously established profit margins will be altered, customer behaviour will change, and the size of the market itself may be different. Market share may drop. In combination with other factors, inability to foresee, or slowness to respond to, the new market reality can leave companies in very serious difficulties, particularly if their position was already fragile. For these players, TNS are not just a potential source of competitive advantage but a strategic necessity - the price of remaining competitive. Managers thus confront choices. A well-founded venture in TNS is a way of building scale into operations quickly, can turn a 'cost centre' into a 'profit centre,' and may open the way to wider scope for the company. This is leveraging, which McFarlan (1984: 98) describes as follows: 'With great speed, the sharp reduction in the cost of information systems (IS) technology (i.e. computers, remote devices, and telecommunications) has allowed computer systems to move from applications for back-office systems to those offering significant competitive advantage. Particularly outstanding are systems that link customer and supplier.' There is no guarantee, however, that a given investment in the technology will succeed and, even if it does, that it will turn out to be anything more in the long run than a strategic necessity - a condition for staying in the game, but no permanent contributor to profits. There is no magic formula for success in computerization. There are, furthermore, complications: 'Though such links [i.e. from customer to supplier] offer an opportunity for a competitive edge, they also bring a risk of strategic vulnerability. In the case of the aerospace manufacturer, operating procedures have shown much improvement, but this has been at the cost of vastly greater dependence, since it is now much harder for the manufacturer to change suppliers' (McFarlan 1984: 98-9). In the past, technology tended to be identified with a semi-permanent infrastructure, meant to support ongoing core activities of production and distribution. In today's environment, information and communication technology figures in a fast-moving game of strategy. Rivals jockey for position, often to gain short-term advantage, from which they can branch out into new directions, always aware that the opposition is trying to out-guess them, by finding an innovative
use of the technology in areas of which no one had yet thought: 'The next decade is more likely to be one of "economic war" than "economic peace" ... We are entering an era of discontinuities; overcapacity in industry, increasing global competition, rising expectations both of the quality of products and services as well as expectations as to one's standard of living ... the management systems and ways of doing business in such a period of continuing economic change will be different from those in the past ... "Business as usual" will not be adequate in the years ahead' (Morton 1986: 12). The one thing that no one can afford to do is to stand still.
Economic implications of micro-computing: atomization and fragmentation
All the technologies associated with the transformation in telecommunications are related to growth in the scale of operations - integrated administration covering a greater-than-ever territory. Their challenge is how to get from mid-sized operations to large, and from large to global, while retaining integrity of purpose and without losing control. The computing systems needed to maintain at least a semblance of centralization are larger than ever. When bureaucracy finds it difficult to cope in this environment, it is because its logic is primarily segmental - division of responsibility and departmentalization - not spontaneously integrative. Differentiation, in a bureaucracy, comes easier than coordination; protecting one's turf is easier than acting together to produce a concerted response. Yet, as we noted earlier, TNS are merely one aspect of technology as we enter the mid-1990s. Side by side with TNS, a second phenomenon has surfaced over the past decade. This is a take-over by 'augmentation' technologies (whose origins we describe in the next chapter) - namely, micro-computers, or personal computers. Personal computers, as we illustrated in chapter 1, have created a new type of knowledge worker. In offices, they have liberated white-collar professionals, giving them a potent instrument to carry out on their own complex operations that used to require the services of whole departments. For the small organization and the self-employed as well, accounting, statistical, and other numerical computations, publishing (from A to Z), modelling and design in architecture and engineering, and film and television production capabilities are all available, and possibilities continue to expand with each successive generation of new computers and new software.
The same people who have mastered the use of applications software to carry out their professional tasks have also discovered a second powerful support system - electronic networking. Telecommunications make possible a range of powerful networks for very large organizations, but digitalization has also opened the gates to individual electronic messaging, database consultation on demand, library searches, fax, and mobile telephony. These new tools, in combination with micro-computing, have 'augmented' the value of any member of the work-force who already possesses expertise, in whatever field. Let us see what economists once again have to tell us about this development.
The decision to participate
The underpinnings of modern administration are economic; in Max Weber's words: 'The official receives the regular pecuniary compensation of a normally fixed salary and the old age security provided by a pension' (Gerth and Mills 1958: 203). Williamson (1975, 1977) would see a transaction implied in this citation. In classical economic theory of the firm (Barnard 1938; Simon 1947; Simon, Smithburg, and Thompson 1950; March and Simon 1958) it is a transaction linked to 'the decision to participate': 'The decision to participate lies at the core of the theory of what Barnard (1938) and Simon (1947) have called "organizational equilibrium": the conditions of survival of an organization' (March and Simon 1958: 83). Organizational equilibrium is the result of the balancing of two factors - inducements offered by the organization (e.g. salary and pension) and contributions which an employee brings to it. 'Augmentation' technologies such as those we have just described - i.e. those that encourage individuals to become expert in analysing and developing information - affect the contributions part of the equation and hence the equilibrium. They increase the value of the worker's services. As noted above, a wider range of knowledge can be generated, and it can be disseminated rapidly. People armed with a personal computer can perform statistical analyses of considerable complexity; can turn themselves into editors of professional-looking documents; can get access to information from a wide variety of sources without regard to local constraints; can 'network' in several modes simultaneously, from spoken to written to iconic; and can build databases
and plan campaigns, the complexity of which depends only on the power of their own imagination. The inducements that bureaucracies could once offer to recruit staff are no longer adequate, now that an alternative has emerged: to sell, as one's own master, the same service for which one was once paid considerably less in salary. This change attacks the equilibrium that is postulated to be the 'condition of survival,' in March and Simon's words, of the organization, since it renders obsolete the previous system of 'inducements' (for which, read 'salary'). Senior management is forced to look at its administration in a new light. It is no longer leading an obedient army of disciplined 'organization men and women,' easily affordable in a monopolistic economy even when they turn in sub-par performance. Rather, it is directing very expensive sources of knowledge and expertise that must be skilfully tended. These are people who have turned themselves into authors, or, as Robert Reich calls them, symbolic analysts.10 They are sources of information, to be treated as a renewable resource. The superior-subordinate relationship is being transmuted into an information transaction. The organization's traditional authority relationship has therefore been altered, and the cause for this transformation is inscribed in the nature of the information transaction itself, as we are about to see.
The perils of information as transaction
Economists Osberg, Wolff, and Baumol (1989: 48) have noted a peculiarity of the information transaction, when compared with transactions that have to do with goods and services: 'In essence, in the market for information the buyer of information asks the seller of information a question (or questions) and pays the seller for the answer. The answer, however, may not be a "true" answer. Only if the buyer already knows the answer can the buyer know for certain whether he/she is being supplied with a true answer. In that case, however, the buyer would not have needed to purchase information. Hence there is an unavoidable problem of asymmetry in the market for information - a problem which is not inherently present in markets for goods and personal services.' This asymmetry is intrinsic to the definition of information. Information, according to Shannon (1948), is the amount of uncertainty that a given message can resolve in the mind of its receiver.
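Shannon's measure can be stated compactly. What follows is the standard textbook formulation (ours, not a formula reproduced from the sources cited here): the information carried by a message x received with probability p(x), and the average uncertainty of a source X, are

\[
I(x) = -\log_2 p(x), \qquad H(X) = -\sum_{x} p(x)\,\log_2 p(x).
\]

A message the receiver already expects with certainty (p(x) = 1) carries zero information - which is precisely the asymmetry noted above: the buyer who could verify the seller's answer would, by the same token, be buying nothing.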
As Osberg, Wolff, and Baumol (1989) point out, if the client for an informational exchange already has knowledge, there is no remaining motivation to 'pay' for it, since there is no uncertainty, and hence, by deduction, no information value. Information thus has, logically, an irreducible element of novelty. When we place this fact in an organizational context, a curious conclusion follows. For a superior to accept without qualification the authority of a subordinate (the issue is literally one of author-ship) in the making of judgments, or to treat the latter as a source of real knowledge, would be to admit that the information is being taken on faith. This is tantamount to conceding (for at least the purposes of the present transaction) the subordinate's superior intelligence (in the sense of access to key data and ability to analyse them) and would thus contradict the underlying principle of hierarchy which presided over the exchange to begin with! A slightly different way of seeing why hierarchy fails as a control system in these circumstances is to consider a conventional treatment in management theory of the coordination of work. Mintzberg (1979), to take one example, cites five mechanisms of coordination: mutual adjustment, direct supervision, standardization of work, standardization of outputs, and standardization of skills. Mutual adjustment alone is compatible with the description by Osberg et al. of the information exchange. One cannot, for example, 'supervise' the development of information, because that would assume that the supervisor already knew the outcome to be produced, and, as Osberg et al. suggest, in that case there would be no information, since supervision assumes prior knowledge. Similarly, all forms of standardization have as their goal to eliminate uncertainty in the output and hence, by definition, evacuate the information content. Only mutual adjustment is left, and mutual adjustment is non-hierarchical. The moment we conceptualize managerial work as based on information, the hierarchy crumbles, along with the system of authority that it both manifests and sustains. The bureaucratic form of management supposed implicitly that there was really only one author - the chief executive. This single-author (or single-stream) fiction allowed generations of organizational theorists to conceptualize an organization as a machine. It was a fiction that could be maintained (as we noted in chapter 3) as long as the system was loosely coupled, communication was restrained to well-grooved channels, and competition was weak. As knowledge workers become more and more expensive, it is
their augmentation skills that have become valuable (since, as Osberg et al. indicate, it is only then that they contribute to productivity). Entrepreneurship enters to exploit this renewable resource, and when this happens the symbolic analysts either become marketable or have to be managed within flat hierarchies, in loose networks, using mutual adjustment as the coordinating mechanism of 'control.' In either case, the effect is anti-bureaucratic. The metamorphosis of computing into a distributed, very powerful 'authoring tool,' we are saying, erodes the logical basis of bureaucracy. The production of information can never be organized on an assembly-line principle. Repetitive 'information' ceases to inform and soon has no value. The scale effects in an information economy are all on the side of the storage and transmission of data. The personal rewards to knowledgeable individuals consequent on the development of marketable knowledge, in a global economy, have skyrocketed. The authority which was once the undivided property of the organization is being fragmented. As this occurs, we are witnessing as well the commoditization of information. In the previous era of rational bureaucratic administration, information work (management) was calculated as a necessary cost in the production and distribution accounts covering the main goods and services bought and sold by the organization. It was not itself computed as a resource - that is, as a product or a service. It was certainly not something to be 'out-sourced.' The fragmentation of the information economy starts to become evident when information work paid for by salary is no longer competitive with information work that can be bought and sold. We have come full circle from management as substitute for the market, as outlined at the beginning of this chapter, to management as service that can itself be bought and sold. The trend that has characterized this century now seems to be coming to its end. The passage from reliance on a market economy to dependence on the 'visible hand' of hierarchical administration (Chandler 1977) is reversing, even though advances in communication technology were at work in both cases. The new networking and personal computing technologies incorporate a hierarchy-destroying, or fragmenting, property. They are a Pandora's box.11
'Stagnant' and 'progressive' sectors
We have been describing two trends in this chapter. In one, the trend towards globalization, TNS technologies are powerful instruments of
integration. Because they permit economies of scale in the management of far-flung enterprises, they also permit new patterns of organizational interconnection and interdependence. Individuals with intellectual skills have also discovered in software a range of empowering tools that support a second trend, towards fragmentation of organizational communication. These tools seem to open the door to new kinds of enterprise, stimulate the generation of start-up companies, encourage out-sourcing and consultancy as a substitute for permanent employment, and push us towards emphasis on ideas as a principal resource and commercialization of knowledge (including management itself) in a very mobile social environment.12 Here, once again, concepts developed by economists can help us understand these seemingly contradictory tendencies. A framework articulated by Baumol (1967) suggests that the changes that we have been describing may be two dimensions of an enveloping transformation of organizational administration. According to Osberg, Wolff, and Baumol (1989), a shift, such as we have witnessed in North America, from (1) goods production to (2) provision of services to (3) 'information-processing' transforms the conception of management work. The core of their idea is to draw a distinction between 'progressive' and 'stagnant' sectors of the economy. At first glance, the terms seem to contradict their usual meaning. By 'progressive,' they refer to an economic sector where labour inputs become relatively less a contributing factor to the cost of production (and hence where productivity is increasing), as a result of better technology. 'Stagnant' describes a sector where a non-decreasing fraction of the labour force is needed. A stagnant sector is thus labour-intensive (and shows little or no gains in productivity over time): 'The term "stagnant" may be thought to be pejorative, but it really refers to those human activities which cannot be replaced' (Osberg, Wolff, and Baumol 1989: 11). In banks, for example, machines may substitute for human tellers, but loan officers perform a task requiring non-automatable personal judgment and the ability to assess risks using qualitative information.13 The production and distribution of film have become more efficient because of improved technology, but program decisions still need as much application of human intelligence as ever. Research is by definition a 'stagnant' sector, following this logic.14 So is management. In goods production (the progressive sector), North American productivity has risen continually since the beginning of the century,
reflecting substitution of capital for labour in every area from farming to manufacturing. The personal-service sector has shown the opposite effect: it is predominantly stagnant, especially in the crucial areas of education and health, where the role of the worker has, if anything, become more central than ever. Costs keep rising. Management is a part of the service sector classed as 'information work.' It too can be divided into 'progressive' and 'stagnant' sectors. In manipulation of data, a progressive sector, productivity has increased (automatic data-capture and -processing require the intervention of fewer clerical workers, as the technology takes over). In knowledge development, however - a stagnant sector, involving 'the generation and dissemination of new conceptual categories, relationships or hypotheses (i.e. the production of "knowledge")' (Osberg, Wolff, and Baumol 1989: 12) - human creativity is as essential as ever. Indeed, the developments reported in chapter 1 would suggest that recent trends in technology are emphasizing even more strongly the contribution of human intelligence. Technology augments the knowledge worker (thus making his or her contribution yet more indispensable), rather than replacing him/her. In other words, TNS are providing a means to realize the gains of the progressive sector: they lead to outward extension, beyond the range of usual organizational frontiers, of the assembly-line model of production. They result, when effectively applied, in a smaller work-force. By providing economies of scale and of scope, they are a motor of corporate reorganization, with an interesting effect. Because control can be effectively realized across company lines, they seem to encourage new patterns of interdependence, a sort of vertical disintegration. They favour what we have called 'virtual organizations,' 'constellations,' or 'value-adding partnerships.' Personal computing tools are encouraging growth of the 'stagnant' part of the work-force - people with professional training and a grasp of domains of activity that require intellectual skills of a high order. These are people whom it is costly to educate, who are in demand precisely because they are a relatively scarce commodity, and whose value, in an economy of global proportions, is steadily climbing. Given the competition for their services, they can command the highest fees. The result is an increasingly bi-modal distribution of work, as the examples cited in chapter 1 demonstrated (all the growth is in the white-collar sector). This is having a carrot-and-stick effect on the
bureaucratic form of organization. The 'carrot' is the range of economies to be realized from TNS; the 'stick' is the phenomenal growth in the costs of highly qualified administrators. Over time, the greater economies realized in the 'progressive' sector and the continued labour-intensiveness of the 'stagnant' sector must, we claim, combine to alter the proportions of the inputs to the total cost equation. Given the salary disparities separating the 'stagnant' from the 'progressive' sectors, the effect is to make management ever more 'stagnant.' Administration finds itself 'in a fix,' increasingly dependent on a sector whose relative costs continue to climb with every passing year. Even where the administration remains constant in total number of employees, its salary rolls continue to expand. Senior management is taking up a greater part of the pie. Furthermore, the process is, if anything, accelerating. What is this doing to bureaucracy? According to Osberg: 'Most information sector workers are actually engaged in the "command, coordination and control" functions of advanced economic systems' (1989: 7). Look at what happened a century ago. In the case of the early development of a modern bureaucratic organization such as the railway, the ability to 'command, coordinate and control' the network of employees within a single firm increased profits and promoted consolidation of railway empires. But administrators at that time were comparatively cheap to hire! Now their cost is exorbitantly high. It follows that we should evaluate the impact of technological developments in communication in terms of the economies of scale of the operations that they permit. If the proportion of high-cost employees goes up, profits can still increase, provided that the range of transactions that can be effectively managed is also on the rise - fast enough to keep up with the inflating, or 'stagnant,' part of the equation. Alternatively, if the span of administrative control does not increase, the rising costs resulting from growing dominance by the stagnant sector will depress productivity. The rising costs of administration would be justified if the governors could better 'command, coordinate and control.' Conversely, they would become less tolerable when unaccompanied by greater managerial effectiveness.
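The arithmetic behind this claim is worth making explicit. The simulation below is our own minimal sketch of Baumol's 'cost disease,' with invented parameters rather than figures from Osberg, Wolff, and Baumol: productivity grows only in the progressive sector, wages in both sectors rise in step with it (since the sectors compete for the same workers), and the stagnant sector's share of total cost climbs even though its head-count never changes.

# A minimal sketch of Baumol's 'cost disease' (our illustration;
# the parameters are invented). Productivity grows only in the
# progressive sector; wages everywhere track that growth.

progressive_workers = 100.0    # head-count needed falls as technology improves
stagnant_workers = 50.0        # head-count fixed: the work resists automation
base_wage = 1.0
productivity_growth = 0.03     # 3 per cent a year, progressive sector only

for year in range(0, 41, 10):
    growth = (1 + productivity_growth) ** year
    wage = base_wage * growth
    progressive_cost = (progressive_workers / growth) * wage   # stays constant
    stagnant_cost = stagnant_workers * wage                    # climbs steadily
    share = stagnant_cost / (progressive_cost + stagnant_cost)
    print(f"year {year:2}: stagnant share of total cost = {share:.0%}")

On these (arbitrary) numbers, the stagnant sector's share of total cost rises from a third to well over half in forty years, with no change in what it produces or in how many people it employs. Management, on the reading developed here, sits squarely in that stagnant column.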
The size of the network of control sets the main parameter of management's task (the dimensions of the distribution system to be administered). The logic of this pattern is inexorable: to survive, organizations must extend their operations geographically to a global scale (or at least achieve economies of scope within a more limited market through diversification of product lines), while achieving administrative economies by excising stagnant parts (perhaps through contracting out essential services). This is a model at variance with the established practice of bureaucracy; it is a change in logic that marks the end of an era.
Managing in the information economy
The effect of information content on hierarchy
Information, as we have seen, constitutes much of a product's value in today's environment - what Porter calls the 'information intensity of the value chain.' Innovations in TNS differ from older, more conventional technologies. They are not only a production tool and distributional support system for making and marketing goods and services but also the infrastructure of management itself - the coordination system of the enterprise. They are part of the company's operational plant, and thus a means by which to increase the efficiency of transactions and operations, like any other machine. But they also operate on the company, transforming the organizational structure. Implementation of a new electronic network system can alter the pattern of administrative processes, as well as the nature of the business or businesses in which the firm engages. Technology of this sort is not just something to take decisions about; it alters the way in which decisions get taken. TNS change not only the range of products and services that can be offered, and the efficiency with which they can be delivered, but the character of the organization itself. Thus their integration is different in kind from the integration of other, non-communicational technologies. By reshaping the way in which the company makes decisions, they can transform the rules and the processes that support the corporate culture. They yield their benefits only when they are accompanied by reorganization - not a prospect that all firms eagerly anticipate. This situation raises several new questions. What happens in enterprises whose raison d'etre is production and marketing of knowledge work (including management itself), when the problem is how to 'command and control' work, the content of which may be 'commanding, coordinating and controlling'? How are scale economies to
be obtained for stagnant information work, in a globalizing system? What configuration of command, coordination, and control produces appropriate scale results for the kind of knowledge work requiring application of intelligence, innovation, and tact? Large management consulting firms are an interesting case, since their prime product is a service - knowledge. Many of them operate internally on a very different principle from the old-fashioned, standard bureaucracy. They emphasize recruitment of only the very brightest of graduates, they add a period of in-house training which competes with graduate school in depth and intensity, and they adopt a policy of 'up or out' which both offers extraordinary financial rewards and places tremendous pressure on the individual. Their promotion processes make a mockery of the old ways: it is possible to rise to vice-president almost overnight and retire in comfort, still young, at an age when bureaucrats of an earlier generation would still be politicking for a lateral transfer in an endless arabesque of musical chairs. The new experts take it for granted not only that they begin as equal partners of the company presidents who employ them as consultants but that, in the end, they could step in and run the company better than the people who engage them. If they fail to convince the client that they are his or her equal, they will never gain entry. These systems of enterprise reflect very little of the characteristics of a given region or culture. They are indifferent to spatial boundaries and are anti-hierarchical in spirit and in practice. They are not so much hostile to local control as tuned into it. Their flows of information tend to be increasingly transparent, out of necessity, since all that keeps them coordinated is the rapid availability of information. This very transparency favours the imploding of the management function: the people who would previously have occupied a 'rung' in the management hierarchy behave more like the general-purpose agents found in international relations before bureaucracy took over. Bureaucracy reached its pinnacle when manufacturing was the dominant industrial sector. Its transfer to the service sector was problematical (particularly since, for a long time, the latter was typified by small, often 'Mom-and-Pop' enterprises). Its applicability to the information sector is even more dubious.
The management conundrum
The main point of this chapter can now be summed up in a nutshell. We are claiming that (1) large, multi-divisional organizations along
the traditional model are no longer a 'best solution' to an economic problem - they now appear as poor, uncompetitive systems for minimizing transactional costs - and that (2) computerization of their communications has not reinstated their competitive advantage. On the contrary, it poses a threat. What made bureaucratic administration powerful was not its efficiency - common sense tells us that that is an exaggeration - but the scale of operations that it permitted and its consequent historical monopoly/oligopoly position in a market or, for government, a state, which that scale made possible. As long as the monopoly was not threatened, the relative inefficiency of the bureaucracy did not particularly matter; in the present international context of open trading arrangements, it matters a great deal. Even government now finds itself in a competitive environment. The depth and scope of the changes present a challenge to the managerial class and to representational frames of understanding that so far fail to capture what is occurring at the transactional level. The bureaucratic mode of administration may have provided effective (if heavy-handed) control in a previous generation; it now blocks change. Management risks being caught in an epistemological time warp. Today's challenge is one of imagination: to find ways to rethink our descriptions of organization and by so doing to give ourselves a basis of action and, even more important, a basis for a stable society. Managers have treated office automation as a tool that they can use for the control of operations carried out by employees; its logic not only fits with bureaucracy's, it intensifies the bureaucratic ideal of rationalization. The analysis of this chapter leads us to believe that the 'fit' may not be quite so neat as anticipated. Automation may be an appropriate managerial technique for progressive sectors of the information universe (although, as illustrated in chapter 2, the jury is still out on that claim), but it is clearly a non-solution for the stagnant sector. But, as we have shown, it is the stagnant sector that has become important in the new economy (Reich 1991). And the stagnant sector includes management itself. The question is posed for the first time: How does management take itself as object? How does management manage itself? If the organizational conversation is taking a radical new form, what is the text that goes along with it? Not bureaucratically, if the logic explored in this chapter is any guide. In the next chapter, we return to take up this question of a new text, when the bureaucratic model has become outmoded.
The long-term historical drift reconsidered
We began this chapter by looking at the long-term historical transformation of organization that began in the nineteenth century and is evidently still going on, fed as before by technological innovations in communications. For most of the chapter, we have been considering the current situation as it appears to those actually caught up in it - managers and theorists for whom the restructuring of organizational form appears as an immediate strategic challenge, in a world that they consider to be competitive in ways different from the past. These events, so much a part of the corporate and governmental landscape of this decade, are also episodes in a longer-term drift, which we associate with the transformation of our communicational infrastructure, through technological innovation. To the communication scholar, the challenge is not merely to grasp the short-term dynamics (although that too is important) but equally to begin to discern, at least in general outline, the emerging historical pattern. Harold Innis hypothesized that the emergence of nations and empires has been closely tied to the evolution of the means to communicate (Innis 1950; 1951). Communication (and, by extension, organization), he suggested, have both a space and a time dimension. Communication technologies always exhibit a bias. One bias favours the conquering of space - sending messages over greater and greater distances, the province of telecommunications ('tele' is the Greek root meaning 'far'). The other bias favours the mastery of time - the fact that a medium of communication such as paper can be stored for long periods allows an organization to maintain its control from one generation to the next - the hold of tradition. Both biases relate to the extensions of control that media allow; in allowing them, media give the text a wider dominion over the conversation. In Innis's view, the technological state-of-the-communication-art not only limits the size and reach of an organization, over time and across space, but encourages the emergence of specific organizational forms. If we credit Innis's theory, the remarkable centralization of bureaucracy in our century would tell us that it has been the spatial dimension that was most affected by things such as telegraphy, improved mail service, and telephony. A century ago, government was modest in size, and disjointed, and so, typically, was private enterprise. Our communicational space was defined (and measured) by
the railway and telegraph technology of the time. But communicational space is not the same as geographical space, even though it was defined by geographical measure until quite recently. Being surrounded by a vast tract of land or sea is no advantage if we have no means to communicate within it. In the past few years, transportation and communication have been coming unhooked. The new technologies are changing our communicational reach, our control over space and time (the places we can visit, the people to whom we can talk, in one place, in one time). The growth of big government and the multinational corporations came about in lock-step with the improvement of our means to communicate over distance, in conformity with the spatial bias of the new media. And the process goes on. In business, the important networks of distribution, including banking, manufacturing, and 'high tech,' are now international in scale, no matter where head offices are located. Simultaneously, software has opened new possibilities of linkages for the individual, enlarging the range of individual activities through the augmentation capacity of computer software, connected to electronic messaging. The temporal dimension has been much less affected. Record-keeping in contemporary bureaucracy (at least until recently) did not evolve rapidly even while the organization was mushrooming. There was some evolution in the technologies of organizational record-keeping, including such tools as typewriters, filing cabinets, and copying machines, but they were less dramatic than the transformations affecting telecommunications. The new technologies permit permanent storage of documents, in unimaginable quantities, but the emphasis remains on now, not yesterday.15 Librarians, once the custodians of the wisdom of the past, are now redefining themselves as 'information scientists,' a professional class integrated into the electronic network, like everyone else. These technologies actually discourage a time bias. Tradition has little importance in worlds such as the fashion and apparel trade, described by Lardner (1988: 63) as 'a labyrinthine challenge of delivering designs, specifications and materials to factories, monitoring quality, bringing finished goods into warehouses and shipping them out to retailers and doing all this on a coordinated schedule when a company imports clothes from all over the world.' What counts is the extension of the company's range of operations to include countries on more than one continent.
It is a history-less society. 'Time-binding' is out. 'How do we solve this problem right now?' is in. The kind of people described by Lardner who 'have a certain turn of mind and enjoy logistical problems - the more complex the more fun' (63) are as little like the priests of an Innisian time-bound epoch as they could possibly be. Nor do they resemble the traditional bureaucrat: they are impatient, self-reliant, quick to take decisions, innovative, and flexible. They are not desk-bound like their predecessors; they are mobile and accustomed to rubbing shoulders with diverse cultures and personalities. They are like chameleons in their capacity to reflect the surface style of a new environment. However well read, however skilled in analysis, and however expressive their reasoning, these are not 'paper people.' They are a self-conscious new elite - extraordinarily well paid and accustomed to pressure, no longer bureaucrats, but 'hyperbureaucrats' (a concept explored in chapter 7). They do their work face to face and by telephone. They use electronic means to communicate because paper is too slow. The products of the personal electronic work tools that have replaced pen and paper are as ephemeral as the electronic signals that form their base. We have available to us immense electronic storage facilities, but they operate to support an expansionary economy, not to assure continuation of the old, established patterns, unchanged, over time. They are instruments to coordinate a far-flung system of operations. To continue Innis's logic, we could thus argue that the debt of rational administrative bureaucracy to inherited traditions would also have evolved slowly, while its orientation to space changed rapidly. The culture, style, and structure of bureaucracy would be altered less than its geographical and functional span of authority. On the evidence, this is a reasonable hypothesis. The Canadian public service, as we noted in chapter 1, is now much larger than it was before the Second World War; like the private sector, it has become a multi-divisional enterprise, but its clerical modes and general way of behaving have evolved only marginally. Now that is all changing: the spatial preoccupation sweeps aside concern for temporal continuity. Of course, because of spatial expansion, people's relationship to their institutions, and the relationship of region to region (not to speak of country to country), have undergone a great revision. The level of national and regional integration (in state accounts, education, health, transportation, and communication) that we now take for granted would have appeared truly exceptional as little as a half-century ago.
Innis's hypothesis asserts that, because communication is how we span space and time (or perhaps, following Einstein, we should think of it as the space-time continuum), it fixes the parameters of the economy and of society. It is the instrument by which control is exerted over the territory that we occupy and over the future. It affects the way in which people relate to each other, and, since some people become more expert than others in gaining access to the stored knowledge of a society, it shapes the distribution of power, status, and wealth. By changing the media of communication, the computerization of communication can create an environment for new kinds of social control and new forms of organization. Presumably systems that came into being as a result of one kind of economic advantage can disappear as the result of another. Are the new technologies pushing us even further into bureaucratic organization, or in an entirely new direction? Although office automation would suggest the former, evidence in this chapter leads us to believe the latter.
Extensions and intensions: the importance of text
There is one respect in which Innis's concept of space needs to be explored further: the inner space of thought. In logic, a distinction is made between the extension of a term and its intension. The extension of a word is what it refers to in some possible world. In logic, however, the extension of a term is merely part of its semantic meaning. The pragmatic implications are not considered - there is no relationship to a field of action. Our perspective is different: to us a text is not a neutral entity but a moment in a social system of communication. Telecommunications, in the theoretical frame that we developed in the previous chapter, serve to extend the range of an organizational text, to make it sovereign over a greater region, and thus to create the basis for large organizations (or 'empires,' to use Innis's expression). In communication theory, because of its commitment to pragmatics, the consequence of meaning is seen to be organizational authority - the power to rule the conversation in extenso, the very basis of organization. The intension of a text in logic is the territory that it occupies in semantic space. There, too, a system of nodes and links mirrors, in its own way, the patterns that people such as Porter and Clemons describe, even though the realm is that of ideas. Turing's revolution
opened up that inner space for exploration. It is no accident that cognitive science has come out of nowhere to become, in a generation, a dominant field in psychology, linguistics, communication, and information science. It is not just the immense memory stores that computers make available, dwarfing the storage capabilities of paper, that matter. As we try to capture the nuances of human thinking, we have been stimulated to consider deeply the organization of the elements of knowledge. We have become self-conscious about our own structures of thought in a way that marks off the present era from all that preceded it. Out of that effort, new principles are emerging. Again, they have potential pragmatic implications. The chapters that follow represent a modest excursion into one of the new areas, in the search for a text of organization that corresponds to current realities better than the old bureaucratic one.
6
The evolution of software: the new text
For many years I have been developing a research program at the Stanford Research Institute aimed at augmenting the human intellect. By intellect, I mean the human competence to make, send, exchange and apply to decision-making the commodity called knowledge as applied toward giving human individuals and organizations more effectiveness at formulating and pursuing their goals.
Douglas C. Engelbart (1970) 'Intellectual Implications of Multi-access Computer Networks,' 2
We firmly believe that knowledge workers are motivated to grow in knowledge and skill and that provisions in system design should support this. This support translates into a rich set of augmenting provisions aimed at providing speed and flexibility for skilled workers in organizing and pursuing their core knowledge work - in which 'authorship' is a primary activity.
Douglas C. Engelbart (1984) 'Authorship Provisions in "Augment",' 465
Introduction
In chapter 2, we developed the germ of an idea: that organization is endlessly regenerated through a dialectic involving, as its poles of tension, a conversation and a text. We began by considering practical, down-to-earth illustrations. The first case was a community of users accustoming itself through spontaneous day-to-day interaction, realized as conversation, to a system of technology shaped by the logic of today's evolving software, which it incorporated into a
communication network. The second case, by way of contrast, was a community of developers coming to grips, through conversation with clients, with the formally unexplicated, but stubborn, logic of the workaday world of doing business. In chapter 3, we inquired into some of the historical origins of our inherited organizational text, which stands at the intersection of traditional management theory and the modern theory of computation, both situated within the Western scientific tradition. Against this backdrop, we recalled a few selected observations of management in practice which contradict the scientistic view, and we argued that successfully implemented office automation may have tightened the coupling between actual conversation and the received text, and, by doing so, brought to our attention their fundamental incompatibility. In chapter 4, we presented conversation as a set of transactions negotiated through talk and gesture, and text as an interpretation of the conversation, expressed in the formal and informal structures of language. That interpretation, in turn, reflects the patterning of an underlying semantic framework of understanding whose application brackets the flow of interaction and gives it a meaning - identifies its transactions, in other words. Both conversation and text are modalities of communication. Each is driven by its own logic, and while the logics intersect through the mechanism of transaction, in other respects they are quite different, even incompatible. Each can evolve independently, through patterned ongoing interaction and through thinking and writing and reasoned debate, respectively. In chapter 5, we considered how technological transformations of communication and information media may have altered the conduct of economic transactions and thereby oriented the conversation into new patterns no longer conforming to the bureaucratic template - producing what Kuhn (1970) might have called anomalies (an idea to which we shall return in chapter 8). Presumably, if for whatever reason the organizational conversation is transformed (as we believe it has been), the text must eventually follow suit, if we are to continue to make sense of our world. Finding new ways to frame the conversation and to see its transactional patterns differently thus becomes imperative. This chapter examines the notion that the technology may not only be stimulating new patterns of transaction but, as we delve further into its logic, allowing us to think about organization more imaginatively - rewriting it. The conversation may have been evolving, but so has the text.
Software logic: the new dimension
Software as medium of communication
The computer is a medium of communication because of software. Software is, after all, a manifestation of language; 'programmers,' according to John Shore (1985: 187-8), '... are writers.'1 The difference is that, when most language is transcribed, it is put down on something like paper. When software is 'written down,' its support system is a computer.2 A computer is to software what paper is to natural language. Programming, the basis of software, is a language as well, but a 'formal,' or 'artificial,' one. Software is itself also a medium with a double meaning. Because it is literature (in the large sense of that term, which includes all forms of creative expression), it becomes a manifestation of the ideas in our heads. Software creators can be compared to artists and producers. Yet, because software is also the code of a machine, or an office, or the knowledge system of a professional, it is also a potent means of controlling the world around us, and a key to industrial enterprise. This is the domain of the engineer. The power of the software medium, as a projection of human thought and communication, comes from the fact that it is a code. Because it is a code, we can read it as a set of instructions called a program (somewhat like the instructions for concocting a special dish). At another level, however, it is a description - more specifically, a description of a process. It can describe a procedure for calculating numbers (adding, subtracting, multiplying, and so on), in which case the 'computer' really is a computer. It can describe the operation of a machine, with the computer becoming a tool of 'machine intelligence,' the domain of robotics. It can be a description of a program of studies, in which case it becomes 'courseware.' As we saw in chapter 2, it can describe the steps by which decisions are (in principle) taken and by which a business is (or is thought to be) conducted, in which case we speak of 'management information systems.' It can imitate the reasoning processes by which someone like a doctor arrives at a diagnosis of an illness, in which case we would think of the computer as an 'expert system,' or a 'decision support system' (DSS). Finally, of course, it can describe the operation of the computer itself, in which case it becomes the 'operating system' of the computer (as contrasted with the various 'applications' of the computer).
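The dual character just described - instructions to a machine that are simultaneously a description of a process - is visible in even the smallest program. The fragment below is our own illustration; the procedure and its thresholds are invented:

# Read one way, this is a set of instructions a computer executes;
# read another way, it is a description of an office procedure.
# (Our illustration; the approval rule and amounts are invented.)

def approve_purchase(amount):
    """Describe - and, once installed, enforce - who authorizes a purchase."""
    if amount <= 500:
        return "approved by requester"
    if amount <= 5_000:
        return "approved by section head"
    return "escalated to senior management"

for amount in (300, 2_000, 20_000):
    print(f"${amount:,}: {approve_purchase(amount)}")

Once installed, the description does not merely record the procedure; it enforces it.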
The sum of all these ways in which the programming capacity of software can be used makes up the field of artificial intelligence. The degree to which any of its multiple potentials is exploited determines our image of computerization. Historically, these potentials have been realized unevenly, with disproportionate institutional effects (often reflected in the size of a given establishment in relation to others within a firm or department). The bias of the technology, to use Innis's term, affects its institutional identifications. Much writing aims to divert or entertain or inform or educate. Computer software writing, in contrast, has usually aimed to control, directly or indirectly, the objects that it describes. This is what makes it 'organizational.'

The computer as instrument of control

The computer as medium, in the sense just outlined, is an instrument for thinking and acting that was available until fairly recently to only a few individuals. They were people who had undergone enough indoctrination into the mysteries of the computer's languages to qualify as members of a new, and powerful, elite of programmers which, in mastering a new language, acquired the capacity to produce descriptions - texts - which are of great economic value. As those descriptions were both descriptions and (potentially) prescriptions of processes, they were also a means to control processes. Ashby (1956), one of the early cyberneticians, was under no illusions on this count. The computer, to him, was a logic machine, based on a mathematical framework (with, more than coincidentally, a material equivalent), that allows a variety of processes to be 'ordered, related and understood.' As such, it is also the means to a 'correspondingly increased power of control' (1956: 2):

Cybernetics stands to the real machine - electronic, mechanical, neural, or economic - much as geometry stands to a real object in our terrestrial space. There was a time when 'geometry' meant such relationships as could be demonstrated on three-dimensional objects or in two-dimensional diagrams. The forms provided by the earth - animal, vegetable, and mineral - were large in number and richer in properties than could be provided by elementary geometry. In those days a form which was suggested by geometry but which could not be demonstrated in ordinary space was suspect or unacceptable. Ordinary space dominated geometry.
Today the position is quite different. Geometry exists in its own right, and by its own strength. It can now treat accurately and coherently a range of forms and spaces that far exceeds anything that terrestrial space can provide. Geometry now acts as a framework on which all terrestrial forms can find their natural place, with the relation between the various forms readily appreciable. With this increased understanding goes a correspondingly increased power of control. Cybernetics is similar in its relation to the actual machine. It takes as its subject-matter the domain of 'all possible machines,' and is only secondarily interested if informed that some of them have not yet been made, either by Man or by Nature. What cybernetics offers is the framework on which all individual machines may be ordered, related and understood.
The 'power of control' is an economic lever of no mean consequence; the success of the computer industry, like that of the printing industry in an earlier epoch, flowed directly from it. Ashby, however, was writing about the programming power of a computer, its capacity to model a variety of processes by the construction of a program which is homomorphic to them - a capacity that can capture their patterning in its syntax of instructions (the software) to the operating machine (the hardware). By making possible the representation of transactional patterns of immense complexity, computers introduced a new kind of control capacity. But Turing's machine can also be employed as an instrument of representation: it has memory.

Software databases: their economic and social importance

A database, like any other computer software production, is a representation, a coded description. Representations have traditionally been classed by communication scholars according to the degree of iconic fidelity with which they represent the object for which they stand. At one extreme, pictures can represent because they physically resemble the original. At the other extreme, numbers stand for a single abstract characteristic of the object, the link between representation and object represented being arbitrary - i.e., fixed by an established code (of which a dictionary is an example). Literature falls in between: it employs abstract symbols, or words, to stimulate listeners' imagery - pictures in words. A traditional database is a representational device that records
abstract symbolic representations of objects (it is thus low on the scale of iconicity). Typically, it includes (1) a marker (such as a name) that indicates the object to be represented, (2) another set of markers (names of variables such as age, sex, occupation, income, and educational level) that indicate properties of the object, and (3) values (numbers, or other constants), which are mapped onto the variables, a process called in logic the instantiation of the variables.3 A database could be a list of customers, a compilation of the results of a survey, a record of criminal behaviour, a mailing list for subscribers (actual or projected), a census, or whatever. In each case, the individual appears in the database as a name, a set of categories, a list of variables, and a set of values (or constants, in the form of numbers). Databases of this kind have both economic and sociological implications. They are a motor for entrepreneurial expansion. The growth of a network of outlets for transactions with a population of clients is supported by the development of the accounting system that allows for keeping track of exchanges as they occur. Planning through analysis of statistical patterns of transactions is also enhanced. The result is a push towards enlargement of the distribution system that supports an enterprise. The root of the sociological significance of databases is different. It is a power to define and control social identities. Every human society has a database equivalent. Even the simplest society keeps count of people by category (husband/wife, father/mother, warrior/priest) and category variables (parent of so-and-so, related to such-and-such a family, responsible for one kind of function or another). The social identity of an individual is determined by the sum of the relationships described by the various databases, in this very large sense, within which his or her name figures. The substitution of written databases (as with the Domesday Book) for oral ones marked the beginning of the end of the feudal mode of government and the eventual passage to the modern state. The feudal system was locally or regionally based; the modern system, nationally. The continuing development of databases in our society is thus a matter of sociological and political, as well as economic, significance.4 A database in this sense is an instrument of governance. It is no accident that the emergence of bureaucracy coincided with censuses and the growth of statistics. This is where IBM got its start, analysing census data using the famous Hollerith cards.
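The marker-variable-value structure described earlier is, in programming terms, the most elementary of data structures. A minimal sketch, in Python, with invented data (ours, for illustration only):

# (1) a marker naming the object; (2) markers naming its variables;
# (3) values instantiating those variables. All data are invented.
record = {
    "name": "J. Smith",          # (1) the marker for the object
    "age": 42,                   # (2) + (3): 'age' instantiated as 42
    "sex": "F",
    "occupation": "dispatcher",
    "income": 31000,
    "educational_level": "college",
}

# A database is then a collection of such instantiations, over which
# statistical patterns of transactions can be analysed for planning.
customers = [record]
average_income = sum(r["income"] for r in customers) / len(customers)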
The technology of communication helps determine what kind of databases a society can employ (and by those representations establishes the common patterns and extent of its social control). In societies with few communication media (other than words and gestures), databases are confined to people's heads, and their capacity is limited by the constraints of human memory (even though the latter tends to become extraordinarily developed in such circumstances). Control is exercised within the bounds of conversation. When the cumbersome media of stone and clay began to be used, they served, it seems, for little else than to record a limited number of vital commercial transactions. Later, as media became more accessible and portable, they were used to record and circulate a much wider range of material, from stories to philosophy. When paper was introduced in the Middle Ages, new kinds of databases (now called books) were added and eventually proved marketable; they were economical to mass-produce and transport over sufficient distances to reach a sizeable public.5 Newspapers were last century's addition to the repertory, with film, radio, and television as today's equivalents.

Databases, software, and administration

The software that is used for most administrative purposes is depressingly banal, closer to accounting programs written in languages such as COBOL than to artificial intelligence, the glamour child of computing. Administrative software has to do with keeping books: transactions in, transactions out, accounts receivable, accounts payable, payroll, inventory, and pricing. The development of databases of this kind is an effect of the push to find more effective systems of communication. The databases support a communication system that is structured like a wheel: the 'intelligence' is located at the hub, and the spokes connect with a network of 'dumb' terminals, each having a function of input/output of data. The vital information is collected at the centre; in a globalizing economy, the database supports a rapidly expanding network of operations, without loss of control. The companies which, through cleverness or luck, discovered this were able, within a short generation, to dominate their market and drive their competitors to the wall. Examples were American Airlines (with its Sabre system for airline reservations), McKesson's (with its Economost system to support its wholesale operations), and Otis Elevator (with its Otisline system for customer service).
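The wheel-shaped architecture just described can be sketched schematically. The classes and figures below are invented for illustration; no actual company's system is being reproduced.

# The 'wheel': intelligence at the hub, 'dumb' terminals at the rim.
class Hub:
    def __init__(self):
        self.ledger = []  # the central database: every transaction

    def post(self, terminal_id: str, amount: float) -> str:
        self.ledger.append((terminal_id, amount))
        return "OK"       # the terminal learns nothing but the reply

class DumbTerminal:
    def __init__(self, terminal_id: str, hub: Hub):
        self.terminal_id, self.hub = terminal_id, hub

    def send(self, amount: float) -> str:
        # input/output only: no local storage, no local 'intelligence'
        return self.hub.post(self.terminal_id, amount)

hub = Hub()
for tid in ("outlet-01", "outlet-02"):
    DumbTerminal(tid, hub).send(99.50)
print(len(hub.ledger))  # 2: all vital information accumulates at the centre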
Like a book (as in 'the company's books'), a database is a tool of administration; also like a book, it is a marketable commodity.6 Databases not only support enterprise; they generate it. We all use the airline database, directly or indirectly, but nobody more than the travel agents who do our booking for us and thereby define a sector of the enterprise (chapter 1). Conceptually, this kind of linked computer-telecommunications system not only does not constitute a break with bureaucracy; it is its apogee. From 'dumb' terminals to obedient and well-programmed employees is a short step, in the logic of administration. If this had been the only implication of the computer phenomenon, our story would have ended before now. But it is not. What we are about to consider can be classified neither as centralized database management nor as artificial intelligence (in the narrow sense, as the simulation of, and eventual substitution for, human intelligence - say, in the form of an expert system). Rather, the computer has become a medium of communication and a support for a range of transactions quite different from those just described. Over the past decade or so, the idea of who should be able to use databases, how, and for what purposes has broadened. This has generated a pressure towards atomization. It is also here that we look for a new way to frame the transactions that go to make up organization.

Beginnings: the California counter-culture (the SRI research)7

Our chronicle begins in the early 1960s, when a small group of researchers and software engineers at the Stanford Research Institute clustered around a visionary inventor, Doug Engelbart, who in turn drew inspiration from another computer pioneer, Vannevar Bush (cf. Rheingold 1985; Greif 1988). Engelbart's philosophy was in direct opposition to the mainstream computing idea, which tended to see people as an adjunct to the computer - necessary to write its programs, input its data, and read its output, but otherwise extraneous (and potentially replaceable). Like HAL in 2001: A Space Odyssey, the computer appeared to be a new form of human being in embryo. Engelbart set out to reverse this relationship - to turn the computer into a 'tool' to be used by the individual. He coined the term 'augmentation': the computer should be designed so that it became an immediate, and very visible, aid for the thinking human being, something to augment our capabilities, not something to replace us!
In retrospect, his image seems to have had much in common with that of the artist to his or her medium. The writer watches the words fill up the page and gets 'feedback' on his or her own thoughts; he or she may not even know whether the ideas were clear (or what he or she had in mind, really), until they come out on the page.8 The page becomes a mirror for thought, but not a simple reflective mirror. There is an interaction between thought and its expression that is fundamental to the creative experience. The same thing happens with the painter in relation to the canvas, the sculptor to the stone, the musician to the piano, the weaver to the loom, the architect to the blueprint, even the farmer to the field. Indeed, in one form or another, the experience of seeing one's idea realized in concrete form, before one's very eyes, is a thread that unites the experience of everyone in the world. A medium is a material base on which we can transcribe our thoughts for storing or communicating; its properties affect what is recorded and hence come subtly to mould the act of thinking itself.9 Could a computer become a medium in this sense? Engelbart thought so. First, he imagined the thinking person facing a screen on which his or her ideas would unfold in real time (that is, as they were transcribed). Real-time computing is something that we take for granted, but it was not the prevailing norm in the early 1960s. Immediate visual feedback, Engelbart knew, would be crucial. Then he asked himself how the image on the screen differs from an ordinary piece of paper. After all, you can write your ideas down on paper, you can draw on it, you can doodle and make pictures, and you can do calculations. What can a computer do that paper cannot, and what can it do better? For one thing, it could be a much bigger piece of paper (with present-day computer memories, measured in megabytes, practically unlimited in size). But a big piece of paper is of no use unless one can manipulate it with ease, changing both its orientation and one's own place on it. Standard typewriter keyboards, which computers have borrowed, have no capacity to move the paper around; the typist does that manually. To solve this first problem, the SRI group developed a new gizmo, the mouse. With this addition, it would eventually be no harder to navigate on the (computer screen) sheet of paper than it used to be in the pre-computer world. (Actually, Engelbart found the keyboard so archaic that he developed not one, but two, mouse-like input devices, one for moving around the screen, the other for inputting information. Only the first idea has caught on.)
There was another problem that had to be resolved. Most computer screens are quite small - smaller than a sheet of ordinary stationery. So Engelbart conceived of the screen as a window, which displayed only one small part - a corner, if you like - of the immense space behind. The space is really the computer's memory, only he visualized it as a large two-dimensional space, as if the computer stores had been arranged to form a sheet of paper, one store, called a 'pixel,' for each atom making up the paper. Again the SRI group was borrowing from the creative arts: we know that film is projected so that one image appears on the screen at a time and that the film and television camera 'windows in' on a scene, of which it shows us only part at a given moment. Now he had to solve the problem of how to move the 'paper' around so that different parts of it could be easily viewed through the window of the screen. Thus was born the concept of 'scrolling,' and again the mouse could be used, not just to move from point to point on the screen, but also to move different things into viewing position behind the screen. Once the group had gone this far, other possibilities opened up - things that were hard to do with paper, but easy with a computer. For example, what if one wanted to move something from one place to another? A cinch. Mark it with the mouse, cut or copy it, paste it in a new place. All electronically. Similarly, people often want to have several documents available to them simultaneously. What we now call 'multiple windows' was developed. In the same way, people don't just write: they draw, they paint, they make graphs, they manipulate moving imagery. Nowhere does the significance of the mouse become clearer: a warm, analogic tool, whereby you can trace the contours of your thoughts visually, in a cold digital world. The technology that supports this is called 'bit-mapping,' by means of which free-flowing designs, similar to how we write on paper, became possible.10 All of this technology was on display by the mid- to late 1960s.
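The window-onto-memory idea admits of a compact illustration. The sketch below is ours, not SRI's code; the document and window sizes are arbitrary.

# Engelbart's screen as a 'window' onto a much larger sheet of
# 'paper' held in memory; 'scrolling' just moves the window's origin.
document = [f"line {n}: ..." for n in range(1000)]  # the big 'paper'

def view(doc, top: int, height: int = 24):
    """Return the slice of the document visible through the window."""
    return doc[top:top + height]

window_top = 0
print(view(document, window_top)[0])  # 'line 0: ...'
window_top += 10                      # scroll down ten lines
print(view(document, window_top)[0])  # 'line 10: ...'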
But it was an idea ahead of its time: not a computer as a standalone creature, substituting for us, or controlling and manipulating us, but an extension of ourselves, a medium for us to think and communicate with. The grants ran out. SRI decided to drop the experiment; there followed an abortive effort at marketing the NLS system (which included a field trial at Bell Canada in Montreal),11 and then obscurity. Or so it seemed. Unfortunately for the SRI group, the idea was not yet viable - the hardware was too feeble to support the software. It was all very well to demonstrate a concept inside the protected environment of SRI, with a dedicated computer, but in the real world, there was still not sufficient electronic power to bring the group's ideas to fruition. The invention of the computer chip changed all that.

The chip and the democratization of computing

Late in the 1960s, the idea of incorporating a miniature computer into a terminal was taking shape. One of the interested companies, Datapoint, contracted out to Texas Instruments and Intel, two giants in the hardware business, to develop an entire computer on a single electronic circuit - what we would now call a 'chip.' Eventually, Intel succeeded, and it named its product, small enough to fit on your fingertip, the 8008. The 8008 was the ancestor of all today's microprocessors (a microprocessor is the business end of a computer, its 'central processing unit,' or 'brain'). Datapoint having found another solution in the meantime, Intel looked elsewhere to sell its new product. In January 1975, the first 'personal computer' was marketed as an assemble-it-yourself kit in the magazine Popular Electronics. The demand far exceeded expectations. Within a year, new companies such as Apple, Atari, Commodore, and Radio Shack were springing up to fill it, presenting their wares not as business machines but as 'toys' for the home market. Adam Osborne (1979) has observed that, although the microcomputer business went from zero in 1975 to, by his (conservative) estimate, about $35 billion ten years later ($2 billion in Canada alone), not one of the established computer manufacturers (of which there were about thirty in 1974, when Intel marketed its new microprocessor) got in on the market at the beginning. 'Why,' Osborne asks rhetorically, '... was this entire industry left to reckless entrepreneurs, lucky amateurs, and newcomers to computer manufacture?' His answer: 'this new market was too bizarre to fit any predictions made by established means' (33). In other words, the product didn't fit with people's idea of what a computer was! Computers for the masses? How could there be a market there? For the workplace, the shift of perspective seemed even more outrageous. What could these toys have to do with business?
A different idea of organization would have to become acceptable before these new machines could have any relevance to organized work. The big, established manufacturers sold machines to support centralized accounting and transaction-control; the new makers of computers were selling instead to an 'organization' of independent users. The 'serious' use of the new computers started with schools across North America, which instituted classes in a misguided campaign to promote computer literacy, and parents, who invested in a home computer in the same way that they might have purchased an encyclopaedia set for their children. Office applications began with individuals who bought a computer, not infrequently on the sly, to carry out their work or business from home. The new computers entered the office environment as much through clandestine infiltration as because of deliberate corporate strategy.

Development of the IBM PC

Eventually, however, the skepticism of the mid-1970s gave way to growing realization of the advantages that microprocessing offered. The market potential was all too evident. IBM may have been slow out of the starting blocks, but it soon caught up. Something of the unconventionality of the new technology, from the point of view of the computing establishment, can be guessed at from the fact that IBM chose to short-circuit its usual development process to develop a personal computer. Rather than give over the project to one of its imposingly resourced laboratories, the company elected instead to set up a small task force outside, isolated from the usual IBM environment. By 1980, the first IBM PC was being readied for the market. Although its technical superiority was not necessarily evident to insiders, its success was practically guaranteed by a particular circumstance: 'Big Blue,' as IBM was affectionately called within the computing community, had over the years built up a privileged relationship with the specialist buyers inside major corporations and government departments (many themselves IBM-trained), who had come to appreciate the reliability of its products and services. In a short time, sales of IBM PCs rocketed to the top, largely outstripping, among corporate buyers, the many similar upstart products already on the market. Buyers of computing equipment in large companies viewed the arrival of the PC with some ambivalence, however.
The PC's flexibility impressed everyone, and office designs were quickly rewritten to include PC nodes side by side with already available dumb terminals. But computer departments were alarmed to discover that, because the new technology was cheap and required no network support, thousands of machines were being bought on discretionary budgets by people in their organizations with no computer credentials, often hidden under innocuous categories usually reserved for conventional office equipment. For people trained to be concerned about 'integration,' the spectre of 'technical incompatibility' loomed; behind the spectre was a quite different reality - erosion of the department's control over the company's high-tech budget. There was no stopping the trend, though. IBM had made a critical choice in favour of 'open architecture': its principal disc-operating system (or DOS), developed by Microsoft (hence MS-DOS), could be used by third-party companies developing applications software. The avalanche of new software products thus stimulated soon vindicated IBM's decision. These new systems were in no way simple adaptations of older mainframe programs, cut down to size for smaller-capacity microcomputers; they represented a wealth of innovative new kinds of application of the computer, offering capabilities that working professionals were quick to appreciate. The first truly popular applications were 'spreadsheets' for manipulation of financial data and management of a company's books. Following close behind was 'word processing,' the predecessor of 'desktop publishing' (and a threat to the specialized word processors). Both developments empowered users. Although users had not become programmers, the availability of a real choice of software had liberated them from the tyranny of the experts. A marketplace for innovative software did more than merely make existing programs more widely available; it stimulated a range of new products. Computing was beginning to move beyond data entry and printout reading to an era of user versatility. Open architecture had a negative aspect, however, from IBM's perspective: competition from a new quarter, the so-called clones, which advertised machines (often lower priced) that were 'IBM-compatible.' From the customer's point of view, though, the clones were just more evidence of a new era in computing capacity. The new computers were impressive machines, and the new applications software was an immediate success. Still, the interface
between machine and user was not yet very comfortable - too many 'function keys' to be learned, little graphic capability, and so on.

Development of the Macintosh

When Engelbart's SRI team disbanded, its members migrated elsewhere in California's Silicon Valley. Some turned up at one of the truly innovative US research centres, Xerox PARC (the Palo Alto Research Center). At PARC, they found support for designing a new computer-based workstation for professional workers which came to be known as the 'Star' system. It had all the 'bells and whistles': a mouse, windows, and the lot. And it was networkable. It was also very expensive.12 But the idea was still making its way, this time to Apple. Like Xerox's Star, Apple's original attempt at the new concept, called a 'Lisa,' came out too rich for the buyers. But the co-founder of Apple, Steve Jobs, persisted, and in 1984 the Macintosh, at popular prices, was introduced to the market.13 Every manufacturer of personal computers, including the largest, IBM, has now moved to integrate the SRI software into its products; many of the new, more powerful 32-bit machines incorporate the Microsoft 'Windows' technology, which is a derivative of the SRI work. Engelbart has finally been vindicated. The Macintosh differed from its predecessors less in the kinds of applications it could run, or in its technical specifications, than in its so-called 'interface.' By incorporating Engelbart's augmentation theory of computing, the interface made working with a logic machine less like mastering an abstract, logic-based computer language (however simplified) and more like using other media, such as paper. The Macintosh 'visualized' computing (through its use of icons) and in doing so made it accessible to non-specialists. If you could drive a car, you could use a Macintosh - that was the boast. Development of augmenting technologies, far from having run its course, seems, if anything, to be accelerating. Beginning in the late 1980s, corporate alliances between computer makers and software developers proliferated, extending the artistic and production capabilities of the individual user through innovations in the computer medium such as the digitalization of sound and film, electronic publishing, graphics, and film editing. The immense space for recording provided by current computer memories puts a great store of available material at the fingertips of the user; this makes highly sophisticated
project management, issue analysis, financial modelling, and accounting increasingly accessible to users, extending the knowledge systems of individual professionals. Groupware (or computer-supported cooperative work - CSCW) is coming into the development cycle - tools by which knowledge workers can collectively share information and produce texts on-line, in an environment that makes accessible to groups and individuals the kind of augmenting feedback Engelbart had in mind.14 The examples cited in chapter 1 were just a few of the ways now available for clever people to do interesting new things.

Control re-examined in the light of the PC/Mac phenomenon

We began this chapter with a discussion of how the writing of a computer text can be translated into an instrument of control, by setting the text of the organization. The image of an organization that fitted with the conventional mainframe, pre-PC/Mac computer establishment postulated a unique source of intelligence and its associated database for each organization, located at the centre of a dispersed network of terminals, each with a workstation to which a user, or group of users, was attached. The procedures of work for everyone on the network would be determined by the logic of the mainframe algorithm, while the data entered from the terminals would be collected and analysed at the centre as well - Taylorism in a new guise. The specificity, and exactitude, of the computer reduced the discretionary circumference of action of the people on the network in principle (and more and more in practice) to practically zero. The resulting system is one of total control, since it has become possible to monitor the behaviour of all employees in such minute detail that nothing is left in obscurity for the overseers. These are not fantasy systems; they are to be found everywhere in contemporary North American society - in fast food restaurants, airline reservations systems (any of several companies), banking and insurance (similarly, many enterprises), telecommunications (in Canada, Bell), courier and mail services (Canada Post, along with a number of private firms), public utilities (the hydro utilities), merchandising, and so on. In the words of Garson (1988: 10): 'a combination of twentieth-century technology and nineteenth-century scientific management is turning the Office of the Future into the factory of the past.' No particular human skills are required in such
environments, since, as the people she interviewed attested, the computer runs everything. Other commentators (Shaiken 1984; Zuboff 1988; Mosco 1989; Poster 1990; see also Salvaggio 1989; Franklin 1990: particularly chapter 3) have observed the resemblance of these totalitarian systems of control over labour to the Panopticon of Jeremy Bentham. The nineteenth-century utilitarian described the perfect model for a prison - a system of surveillance, imposing a central authority in a constant, total manner through the absolute transparency of everyone's behaviour.15 To employ a concept used in the previous chapter, this is what is happening in the 'progressive' sector of the information economy; it is Weber's 'iron cage' of bureaucracy carried to its logical conclusion: a prison-house of work. This is one image of an organization: office automation. In it the information that is generated by 'smart machines' is assumed to be the instrument by which centralized control is maintained over a distributed network of operations. It is the controlling capacity of the person at the top of the hierarchy that has been 'augmented.' This is an image of how technology works to support the power of the manager. Against this image stands another, which we have been developing in this chapter - 'creative people with splendid new tools, doing interesting new things.' The centralizing formula of automation ceases to make any sense at all in the context of Engelbart's vision of the computer as a quasi-artistic medium. The computer augments the performance of highly intelligent and creative knowledge workers, each equipped with a machine whose computing power rivals that of yesterday's mainframe and whose range of available applications is greater than ever. Moreover, each individual's machine is linked to a telecommunications network of enormous capacity and flexibility, making available unlimited potential for setting up interconnections with others of similar disposition and talent, to create disciplinary and cross-disciplinary associational arrangements, with no geographical constraints at all. Here the ability to produce texts using the medium of the computer has become highly distributed. Imposition of draconian central control is unimaginable. It is counter to the most elementary principles of logic. We are in the domain of the 'stagnant' sector of the information economy - with empowerment rampant! These are images of two extremes. Both represent tendencies in
contemporary society, and each exploits the potentials inherent in the technology of computing and telecommunications in its own way and for its own ends. They are, however, points on a continuum; the technology provides a gamut of alternative possibilities. Similarly, the distinction between 'progressive' and 'stagnant' sectors of work is a convenient conceptual dichotomy, rather than an absolute division. Most situations, such as those in chapter 2, have both (programmable) regularity and (problematic) unpredictability. Organizations are both centralized (and rule-governed) and local (custom-governed) in their logics, both rational and irrational. The issue then is the relative effect of the technology on this delicate balance.

'Informating' versus 'automating'

It is an issue that has been addressed by Zuboff (1988). In her studies of how the implementation of information-based technologies affects the conduct of work, she contrasts what she calls 'action-centred' and 'intellective' skills. In action-centred skills, 'the crucial know-how that distinguishes skilful from mediocre performances eludes formal codification. People learn by experience - imitating and attempting.' Action-centred skills, she claims, are part of an oral culture, 'limited to the time frame of events and the presence of actors in the context where those events can occur.' They are passed on from experienced worker to inexperienced, from generation to generation, through conversation - exactly the kind of thing described in the police case study in chapter 2. 'Orality,' she says (175-7), 'relies upon situational and operational frames of reference that remain close to real human activity.' A culture based on action-centred skills is characterized by 'close empathic identification with what is known'; it is 'oriented toward the present tense'; it 'tends to unite people in groups'; and it is always 'externalized and public.' It can be 'highly-charged, emotional, and potentially conflictual. When communication must be face-to-face, interpersonal attractions and antagonisms are kept high.' It relies on seeing, touching, and hearing, among both manual and clerical workers and even among 'managers who must engage with their subordinates in hallways and corridors in order to feel they "know" what is going on.' Individuals do not so much think about what they know as act it out in their daily lives: they are 'immersed in the present-tense dynamics of utterance and action.'
Introducing computers, she says, textualizes work. The computer, as we argued at the beginning of this chapter, codifies things, renders what was part of an intuitive understanding explicit, turns it into script - programmable script. Expert systems are an illustration of this transposition, but they are only part of the picture. From manufacturing to accounting to writing and publishing a book, work is made visible in a new way: 'once-mute material processes [are] now translated and displayed as data.' Zuboff calls this the informating capacity of the technology: its power to produce a new medium of electronic text 'through which organizational events, objects, transactions, functions, activities, and know-how could be enacted or observed.' Know-how is rendered transparent, and knowledge is detached from the context of action in which it was developed. Even informal discourse is absorbed by textualization: cooperative work becomes 'computer-supported' cooperative work (CSCW), and natural group processes are turned into 'groupware' (Ishii and Miyake 1991). Electronic text exists independent of space and time; it can contribute to centralization (automation), but equally as well to decentralization (the distributed hypothesis). Zuboff notes: 'The contents of the electronic text can infuse an entire organization, instead of being bundled in discrete objects, like books or pieces of paper.' Since data are generated automatically by the computer, the electronic text does not have an author in the conventional sense and, because of this, may seem 'more definitive and less vulnerable to criticism than a written document whose human authorship is clear.' It destroys the sense of meaning inherent in action-centred skills and undermines the oral culture. The computer becomes a medium, as Engelbart thought it would, standing between the individual and his or her work, by textualizing the latter. The result is a distancing from experience and a 'thinning,' as Zuboff calls it, of meaning. While distancing may thus seem a deprivation, it can also be experienced as empowerment, at the point when 'intellective skill' begins to replace action-centred skill (Zuboff 1988: 178-81). 'Knowledge is freed from the temporal and physical constraints of action; it can be appropriated and carried beyond the moment.' The further effect of this is to encourage a more comprehensive view of work, extending well beyond single activities, as they used to be understood, to open up a perception of how events are strung together to compose systems - 'a view not only of one piece of equipment but also of the process in an entire production module.' As Zuboff puts it (178-81): 'A
new playfulness becomes possible. Events and relationships among events can be illuminated and combined in new ways.' The technology's ability to informate 'can free the human being for a more comprehensive, explicit, systemic, and abstract knowledge of his or her work.' Intellective mastery of their work allows people 'to become interpreters of the text and so add value to its contents.' The world of work that Zuboff is analysing is not one of designers, engineers, film-makers, artists, and writers - those already-indoctrinated workers with intellectual credentials in hand - but ordinary folk, skilled in their own field of activity, though rooted in a traditional context of work. The cases that she cites (confirmed by our own and others' research16) testify to a change of perspective, which has resulted in the learning of a new way to approach work, in which textualization is the new factor. It is a matter of immediate relevance to the themes of this book, because, in the traditional view of management, since the time of Frederick Taylor, the employees were seen to be embedded in an action-centred matrix of work - a conversation - and it was the administrators who composed the text. Indeed, the management/worker distinction relies on the separation of the conversation from the text: to be a manager was to be custodian, and enforcer, of the text. Informating technologies undermine the distinction, because they make every worker potentially an expert in reading a text and, hence, every worker a manager. Management is turned into a recursive, and reflexive, domain - that which distinguishes managing managers from managing employees.

The fork in the trail

Whether employees embark on the path from action-centred to intellective skills, or whether they are part of automation, is not just a function of the technology; technology enables but does not determine outcomes. Informating, to use Zuboff's term, as opposed to automating, requires an organizational climate that at least permits, and at best encourages, experimentation and learning. Many efforts in automation take it as given that it is important to prevent local initiative, to exact perfect conformity to the rules, and to discourage actively the growth of intellective skills (Garson 1988; Jourdennais 1992), on the principle that it is 'both possible and preferable to design human beings "out of the loop," despite the potential costs of suboptimization, inflexibility, and lost opportunities for adding value
through insight and innovation' (Zuboff 1988: 183). For managers, holding on to the text-making and text-reading capabilities of informating retains authority, and power, but it means rejecting the unique resource-making potential of the new technologies. The true Luddites in the information age may turn out not to have been the recalcitrant workers who resisted change - 'technopeasants' - but the authoritarian owners and managers who were afraid to cede power in order to reach toward a greater goal, who eschewed long-term development for short-term 'productivity.' This is to neglect an enormous potential resource, which somebody will develop. If bureaucracy is a fortress, run by a baronial class of managers, it increasingly appears to be a vulnerable one.

Managing managing

The theme that is common to Zuboff's conceptualization and ours is the role that computerization ('informating') plays as an instrumentality to textualize systems of action, including transactional systems. Her emphasis is on micro-level work activities, in a context of organized processes; she is not otherwise concerned with the theory of administration. We, in contrast, have been preoccupied with the more macro perspective, centred on the textualization of the transactional system as a whole (the 'organizational conversation'). Automation is an example of how an a priori idea of what organization is can be elaborated in computer code and become not just an image of organization, as Morgan calls it, but a powerful instrument of control. It thus becomes not just a theory of organization but an enacted ideology, with very real human consequences and economic implications that we believe to be insufficiently explored. The issue is usually taken to be one of immediate productivity, through reductions in personnel and in the calibre of training required of them. It is less often seen as a missed opportunity to begin to exploit, for purposes of international competitiveness, our best remaining renewable resource, the human ability to innovate and learn (Reich 1991). The image of an informated organization that follows from Zuboff's analysis is, by contrast with bureaucracy, of a layered system of texts, not of a simple text-conversation separation (to produce a management/employee dichotomy). In this universe of work, technology can augment human abilities by providing a medium which allows us to read back subjective understandings as objective data.
It facilitates a different, more analytical, kind of comprehension and implies a new model of management. The conversation is no longer governed by reference to a text; rather, the governance of texts takes place through the creation of a text which is their synthesis. Work processes will be textualized, through the augmenting influence of a new medium, and the intuitively guided practices of management will themselves become an object of textualization. This runs against the grain. Studies of managers at work, beginning with Mintzberg's (1973) groundbreaking research, have consistently pointed out the role of the informal, the importance of unstructured talk, and the reliance on intuition that characterizes the work of executives. Theory be damned; as Zuboff puts it (1988: 178): 'Senior executives, because of the authority they enjoy, have been able to preserve the orality of their culture, perhaps more successfully than any other group. This serves to maintain the conditions that support their authority, as it protects the opacity of their know-how.'17 People long thought of the computer as an instrument of centralized control, making feasible the continuing rule of the bureaucratic text, and not as a tool of augmentation. The insulation of management from the dictatorship of the computer (and the limited success of MIS) were thus quite comprehensible, since it makes little sense for the controllers to be controlled by the very instruments that they themselves employ to mystify their own hegemony. But the image of informating which we take from Zuboff is one not of centralized control but of the discovery of an intellectual power that, while it may impoverish experience on one level, can also add new value and a new potentiality for self-realization on another. It would seem odd indeed were the members of the executive class to deprive themselves of the very instruments by which their own servants are being progressively strengthened. To do so is to remain mired in a backwater of automation and eventually to discover that the management technology of the nineteenth century is no longer relevant to the twenty-first. The computer revolution will not have been completed until management rediscovers itself in the mirror of an augmenting medium. In an automating approach to computerization of the workplace, management is achieved through a centralized system of control of production processes. This is to see computerization through the lens of the traditional bureaucratic model of management, based on the metaphor of the rational machine.
It is, to use Panko's (1984) typology,18 a type-1 model of organizational control. A type-2, or informating, model is a much more diffuse and reflexive image of management, one that capitalizes on the augmenting capacity of the software to retextualize itself. Although we have no clear metaphor for this model as yet, in the next chapter we shall go to the new software systems themselves for clues about the possible outlines of a type-2 world.
7
Managing in the information society
We should keep in mind that information acts on us in two ways. We use it to create more things such as new technology and more complex relationships such as the international banking structure ... Then we use information to tell us more about that which we have created.
D.N. Michael (1984) 'Too Much of a Good Thing?' 348

We do not know how to meet these challenges: We shall have to learn how to do so. And we shall have to learn how to learn to do so: this is part of the uncertainty.
D.N. Michael (1984) 'Too Much of a Good Thing?' 348
Introduction

In the last chapter we contrasted two versions of the software 'text' of the organization. In one, the single organizational text (office automation being an example), by transforming workplace activities into data, helps maintain centralized control over a distributed network of transactions. In the other version, software texts illuminate events and the relationships between them in new ways and also foster a distributed interpretation of the organization. This contrast is a matter of more than managerial 'alternatives.' We are in the midst of an organizational transformation which is calling traditional management practices into question. As we saw in chapter 5, classical practices and interpretations of management have
been increasingly under fire as the competitive advantage of large bureaucracies in managing both private and public goods in a globalizing environment has slipped. At the same time, the fastest-growing part of the economy in North America is the 'stagnant' sector, where value lies in skilful manipulation of information - interpretation and communication - rather than in production of physical goods. Interpretations of informational 'value' are supplanting 'productivity' as the standard by which success is measured. In chapter 6 we explored innovations in computer software. Software has been 'authored' by management to 'write' the organization as a rational automaton, breaking down office processes, including all but top-level decision-making, into programmable steps. Yet software authorship involves an increasing number of individuals, who can now write their own organizational versions and develop intellectual mastery of their working environment. It seems clear that a management that insists on distancing itself from these changes in organizational authority is no longer managing (or coping). In this chapter we pursue the concept of an 'informating,' or type-2, management, to see how some of the capabilities of software 'hypertext' might contribute to management's own augmentation. These capabilities suggest the possibility of a new philosophy of management and a new metaphor for the organization. This approach considers organization not as a machine but as polytextual, and management's function is to write it in a new frame. To take this approach is truly to adopt the stance of the learning organization. Software provides our first clue, which is the concept of hypertextuality.

Hypertextuality

What is 'hyper'?

One of the most interesting recent changes in computer technology is the evolution of what have been called hypermedia. Let us start by looking into the origins of the idea of a hypermedium. The concept of hyperspace had been well established in mathematics long before anyone began to imagine new software based on it. In graph theory, for example, it is common practice to distinguish between 'graphs' and 'hypergraphs' (Berge 1970). Hypertext depends on the hypergraph model.1
In its elementary form, a graph is defined by two primitive terms, vertices (or peaks) and edges (or crests). We can substitute the terms 'node' and 'link,' the terminology most frequently used for hypertext. Visually, then, a graph is just a number of nodes, or points, with links connecting them, like the games for children where you have to connect up dots to find a hidden picture. Graphs have many uses. The most obvious is to describe a network, such as a telephone system or a road map. An organization chart is an example of a graph: the nodes are interpreted as positions in the organization, and the links, the reporting and authority lines. A hypergraph is a family of simple graphs in which each simple graph is constructed on the basis of the identical overall set of nodes. Imagine that we had interviewed all the members of an organization and asked each to draw his or her idea of the organization's structure. An 'overlay' of all the resulting images, superimposed one on the other, would be a hypergraph. Selecting one of the charts as the official one would then be equivalent to going from a hypergraphic to an ordinary graphic representation. An organization chart is a graph (not a hypergraph) precisely because it is the accepted, 'official' version. This unidimensionality is what makes it often seem so artificial (but also so convenient). To conform to the language of graph theory, a written 'text' would also be a set of nodes (the recipes in a cookbook, for example) and a set of possible links (their page order, perhaps). This is the idea exploited by the makers of hypertext. A text (let us say a book) stops being a simple text and is transformed into a hypertext when we imagine it in all its possible variations of differently ordered pages, making up a family of virtual books, each with its own meaning. The difference between hypertext and hypermedia is that the latter's nodes may take many forms: not just text, but cards, pictures, sounds, images, even computer programs. Hypermedia have enlarged the range of information that can be stored in a database, and manipulated; they can handle not just numerical but also graphic information, even film and television.
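The graph/hypergraph distinction, and the overlay idea behind it, can be rendered in a few lines of code. The nodes and links below are invented, and no graph library is assumed; this is our illustration only.

# A graph: one set of nodes, one set of links. A hypergraph, in the
# sense used here: a family of graphs over the identical node set.
nodes = {"president", "finance", "sales", "plant"}

# The 'official' chart: a single graph (one set of links).
official = {("president", "finance"), ("president", "sales"),
            ("president", "plant")}

# Two members' private maps of how things 'really' work.
as_seen_by_sales = {("sales", "plant"), ("sales", "president")}
as_seen_by_plant = {("plant", "finance"), ("plant", "sales")}

# The hypergraph: the whole family, superimposed on the same nodes.
hypergraph = [official, as_seen_by_sales, as_seen_by_plant]

# Selecting the official version collapses the hypergraph to a graph.
chart = hypergraph[0]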
There are many versions of hypertext currently under development. We shall not enter into the complexities of the technology (since all we are looking for is its value as a metaphor). Let us consider a simple variant that was the first to be marketed for a general public, HyperCard.2 In HyperCard, information is organized into 'stacks' of 'cards,' each of which stores a certain category of information. There is no limit in principle to the kind of information that can be entered on a card (the software conforms, in other words, to the definition of a hypermedium). It could be text, or pictures, or other symbolic material, such as recordings of sound and pictures. Furthermore, the system provides simple tools that permit the user to decide how the card is to be formatted: what kinds of information fields it will contain and how they are related to each other. Once stacks have been created, the technology then allows for flexible ways to link them. For example, a word or other item on a card can be turned into a so-called 'button,' which, when activated, searches through the stack for other instances of itself, tracing as it does a path between cards - 'linking nodes.' Buttons become buttons because they have a script associated with them that determines how the network of interconnections is to unfold - a script that, as the user becomes familiar with the programming language called HyperTalk (inspired in a general way by another Xerox PARC product called Smalltalk), he or she can write himself or herself. Tracing out paths between cards is called 'navigating,' and it is here that the user's creativity is supposed to attain its highest level, as he or she moves from cut-and-paste to actual programming. Since the navigating paths themselves are remembered by the software, they too can become matter for a stack of cards, and, with enough ingenuity, the user can come to design networks of networks - recursivity in practice. The software can thus feed back to the user his or her own learning path (within limits).3 In contrast with older database systems, hypertext software allows for a constructive and evolutionary approach to the creation of its database. Its aim is to leave room for variation and exploration within an organizational framework constructed by the individual. The production of a text eliminates variety; hypertext aims to retain it. Traditional computing systems in support of transactional systems such as banking and airline reservations concentrate on standardizing data-recording and access protocols to ensure adherence to a uniform set of procedures across the entire network of users. The structure of the text is fixed. Hypertext, in contrast, supports an evolutionary process of building a database by its potential for discovering links and patterns through incremental searches ('browsing'). By incorporating recursivity, it is able to represent not only other texts but itself as a text. This is an instrument conceived for people who 'author' things. In this sense, it is part of the desktop family of software tools.
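The stack-card-button-path machinery can be mimicked in miniature. What follows is a metaphor for HyperCard, not a reimplementation of it; the cards and their contents are invented.

# A toy 'stack' of 'cards.'
stack = [
    {"title": "budget", "text": "see plant report"},
    {"title": "plant report", "text": "costs up; see budget"},
    {"title": "sales memo", "text": "targets for Q3"},
]

def button(stack, word: str):
    """A 'button': find every card in the stack where the word occurs."""
    return [i for i, card in enumerate(stack) if word in card["text"]]

# 'Navigating' traces a path between cards; the path itself can be
# recorded and later become material for a new stack (recursivity).
path = [stack[i]["title"] for i in button(stack, "budget")]
print(path)  # ['plant report']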
The latest software to use HyperCard as its point of departure is QuickTime, which is intended to make the computer an instrument for editing films and television programs - an auteur, as they say in the film world. The physical world, of course, is not 'hyper.' We can explain why by an example. When several 'peaks' in hypergraph theory are linked together by 'crests,' they make up 'ranges' (as in mountain ranges). For a mountain range to be transformed from a physical reality into a 'hypergraphic' interpretation, we would have to imagine that each time we closed our eyes and opened them again, the same complex of mountains would be found, but joined together in a different pattern. There are no hypergraphic structures in the physical world. Hypergraph theory thus deals with conceptual objects, multiply defined, as in the planning of a network, before the network is actually constructed. The case is the same for a theory of text and hypertext. Someone reorganizing a government department or restructuring a company is involved in hypertextual thinking. Once the process is over, it is usual to go back to an ordinary graphic or textual representation. Ordinary machines are not 'hyper'; software objects may be. This is the metaphor for which we were searching. Let us now consider its application to an informating kind of organization.

A hypertext model of organization

Organizations, we have insisted, are not concrete objects (although they may occupy concrete buildings). They are 'abstract' objects, composed of the sum of the transactional relationships of the people who make them up. They exist for us through the testimony of the people who have to do with them, and in no other way. Abstract objects may be quite as real as concrete objects, but they present problems of apprehension, especially when, as with organizations, the only possible way to understand the configuration of relationships (which is the real organization) is by entering into a relationship yourself. You thereby, of course, guarantee that your perception will remain partial - 'objective' observer or no! Abstract objects may be real, but they have no existence outside an interactional context which they define and within which they are constantly regenerated. Organizations, as described in chapters 3 and 4, exist for us as permanent constructions because of their artefactual manifestations, or 'text.' They are not only the names that people give them, the rituals that they
Let us now consider its application to an informating kind of organization.

A hypertext model of organization

Organizations, we have insisted, are not concrete objects (although they may occupy concrete buildings). They are 'abstract' objects, composed of the sum of the transactional relationships of the people who make them up. They exist for us through the testimony of the people who have to do with them, and in no other way. Abstract objects may be quite as real as concrete objects, but they present problems of apprehension, especially when, as with organizations, the only possible way to understand the configuration of relationships (which is the real organization) is by entering into a relationship yourself. You thereby, of course, guarantee that your perception will remain partial - 'objective' observer or no! Abstract objects may be real, but they have no existence outside an interactional context which they define and within which they are constantly regenerated.

Organizations, as described in chapters 3 and 4, exist for us as permanent constructions because of their artefactual manifestations or 'text.' They are not only the names that people give them, the rituals that they inspire, the buildings whose names they carry, and the contractual obligations that they undertake, but the 'frame' within which all this is carried out. Mountains are for eternity; organizations disappear without a trace the instant they no longer occur in the discourse of the people who invent them, record them, contest them, analyse them, and live in them. The conversation goes on; it is the organization that has to be manufactured, and endowed with an existence that reassures us that it is real.

No wonder organizational theorists and practitioners alike have tended to fall back on metaphors of solid objects whose relational properties are easy to understand. Unfortunately, the result is to suggest ideas about human organization that might seem plausible to those versed in the universe of tangibility but that make little sense for social units - images such as clearly identifiable borders, single-minded rationality, and uniformity and consistency of purpose - all the ideas that we criticized in chapter 3. Once constructed, machines (including computer hardware) stay much the same until they fall apart; organizations are in constant evolution. Deciding on the 'boundary' of a human organization is always arbitrary, goals are forever being reformulated to fit into today's discourse, and the purposes of the organization vary depending on whom you talked to last.

It is this mutability that makes hypertext an interesting metaphor. A hypertext model would visualize organization not as a fixed, unyielding structure but as a set of alternative possible transactional arrangements. Organization then would be seen as the structure emerging out of the process of communication, just as communication is the process by which the structure of organization progressively manifests itself. To the communication theorist, to talk about the organization is to miss the point: all organizations are filtered through the perceptive systems of the people who experience them and for whom they provide a backdrop of reality. Any single perception of the organization, depending on the criteria in question, is as valid as any other; to the scientist there can be no 'official version' and no other way to discover the real truth about the organization than through perceptions. And there are as many perceptions (however frequently they converge) as there are people. We therefore ask of a model of organization that it be able to incorporate these properties of being open-ended, multi-faceted, and yet visibly structured, indeed hierarchical.
The principle that guides hypertext is that there is no single best way to structure a complex field of information (or set of operations). The new software systems also manifest a structural openness different from that of the past - hypertext is built to be constantly in evolution, always in a state of becoming. And yet it is a structured system. This is how we also see a type-2 organization: structured, but in constrained evolution.

Using this as our point of departure, we could then go on to conceive of work groups, and the people in them, as a pattern or network of transactions whose definition is context-bound. The structure of this type of network depends on how the nodes are associated, and we make the assumption that such associational patterns are variable. There is thus both a 'static' structure (a description of the potential of each node, or employee, beginning with a name and a title) and a 'dynamic' structure (the organization in action, the result of the sum of the ways, in interaction, in which people have realized a version of that potential).4

A hypertext can be used to represent a conventional bureaucratic hierarchy, but it is not limited to such a possibility. Within a static structure (which can be represented in, say, an organizational chart), the members of the organization are reduced to titles, possibly with functional descriptions attached, typically grouped into divisions. These titles, from a hypertext perspective, are nothing more than icons, standing for 'buttons' which senior-level executives consider they have a right to push and open at will. The 'associative network' implied by such a text is equivalent to a system of graded statuses.

Hypertext is, however, capable of more. It accommodates the fact that every executive (indeed every employee) develops his or her own map of association. He or she does so on the basis of available knowledge of the nodes of the organization (in large organizations, inevitably partial) and patterns of 'accessing' nodes which develop like a system of successive approximations - reflecting that individual's responsibilities and powers. While the nodes may begin as people, they evolve to become transactions. Although organization charts tend to be treated by executives as the official measure of organizational structure, they are in fact at best a loose approximation of the real thing. The 'real thing' is not a single but a multiple representational image, composed of the overlay of maps of the value centres of the organization - an idea very close to that of people such as Porter, discussed in chapter 5.
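The distinction between the single official chart and the overlay of individual maps can be made concrete. The sketch below is our own illustration - the titles and links are invented - in the same toy Python style as before:

```python
# Invented example: one static chart, many personal maps of association.

static_chart = {                      # titles as icons, grouped by division
    'CEO': ['VP Operations', 'VP Finance'],
    'VP Operations': ['Plant Manager'],
    'VP Finance': ['Controller'],
}

personal_maps = {                     # each member's partial, lived network
    'Controller':    {('Controller', 'Plant Manager'), ('Controller', 'CEO')},
    'Plant Manager': {('Plant Manager', 'Controller')},
}

def overlay(maps):
    """The 'real thing': the multiple image got by overlaying every map."""
    links = set()
    for m in maps.values():
        links |= m
    return links

print(sorted(overlay(personal_maps)))   # links the chart never records
```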
In hypertext, nodes do not require any single, fixed definition. They can figure dynamically in a variety of associative networks, depending on the nature of the task and the needs of the moment. Their definition is contextualized. They can evolve over time. They can themselves individually represent a network. Their function varies with the context evoked. Each of those associative networks, examined in isolation, is hierarchical, so that this new metaphor of organization does not assume the absence of hierarchy. All it assumes is that hierarchy is not 'single-stream' but fluid and multi-functional, reflecting the many coincident purposes and the disjointed agenda of a complex organization - the kind of thing described in chapter 2.

In naturalistic organizational contexts, organizational groupings are not so much fixed as contextually dependent, without clearly demarcated frontiers. Individuals participate in many of them simultaneously. The hypertext model accommodates this kind of contextual dependence by defining a 'group' as an associative pattern of nodes and a 'node' as a value-adding transaction whose definition depends on its place in a set of networks, each of which is a hierarchically structured system of associative links. This is to see the organization as a virtual, not a fixed physical, network (Mackenzie 1978). Furthermore, each node contains a map of the organizational universe; one node is distinguished from another by its communicational density and centrality. In principle, in a hypertext model, an organization can be run from anywhere. Organizational structures are not, in this view, monolithic. Instead they are simultaneously multiple and constantly evolving, as new associative patterns develop.5

Hypermanagement

How might this concept work out in practical terms for the senior manager? It suggests, first, that day-to-day administration consists of 'navigating' a network of organizational nodes. This means not simply sending messages in some fixed pattern of task responsibility, as the classical theory of organization (and office automation) supposes, but rather creating the conditions by which the potential that a node represents can be realized. Calling a subordinate in for a meeting has a 'hypertext' equivalent in activating an icon, or pushing a 'button.' To bring people together in a meeting is to create an associative network of greater or less duration, depending on follow-up.
Second, the definition of an organization depends on how the people within 'call it up.' It may be evoked in new ways, and simultaneously by many people. The skilled manager in an informating kind of world is the one who knows how to activate an organization well. The skilled strategist is the one who can conceive alternative patterns of linking. Managing, in the sense of navigating (and map-producing), is an idea that has hardly been explored. Cohen, March, and Olsen (1972), with their 'garbage can' model of decision-making (chapter 3), capture something of the phenomenon, but not all. Their model recognizes the interactional character of upper-management processes, with all that that implies in negotiation skills and tactical opportunism. But it tends to underplay the cognitive challenge of organizational programming within a hypertext universe, where there is no 'absolute' point of reference - no 'North Star,' as we noted in chapter 4 - and where representations are a matter to be negotiated.

We should make one thing clear. We are not arguing that an organization should be run on a hypertext model, but rather that the hypertext model captures more accurately the reality of complex organizations in which it is assumed that autonomous action is possible and desirable. If we want to understand the pattern of evolution of tomorrow's world, we are better to begin from an image or metaphor that is close to the reality rather than from one that is not.

Rethinking bureaucracy within the context of the hypertext metaphor

In chapter 6 we considered two possible images for modelling an organization - 'automating' and 'informating.' Both are based on a machine metaphor, with one difference or qualification. We have been thinking of the organization as a software machine; a software machine model, in contrast with a more conventional physical artefact, can also be conceptualized as a text. We have thus made a transition from thinking of the organization as a machine (a traditional approach) to conceptualizing it as a text (a more current view). Further, we have argued that texts may be of two kinds, a text (in the usual sense of the word) and a hypertext, and we have been particularly concerned in this chapter to describe the latter - something that does not exist in the natural world of ordinary physical objects but is easy to conceptualize programmatically.
The advantage of making the transition from machine to text as a basis for modelling lies in the text's infinite flexibility. We can ask the hypertext model to reproduce, for example, the conditions of the Weberian model of a modern bureaucracy. In this case, there would be two conditions. (1) Nodes would have to have a fixed definition, specified if possible in detail and in advance. (2) Links would have to follow a fixed configuration, according to a principle of strict hierarchy. Given these constraints, we would indeed have returned to a Weberian administration. Texts, like machines, also have parts that are functionally connected. Obviously, though, a hypertext that conforms to these restrictions is no longer 'hyper'; it is a simple, old-fashioned text - a single, not multiple, reality; a unifunctional, not programmable, machine.
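Stated as code, the two conditions amount to stripping the multiplicity out of the hypertext. The check below is our own toy formulation - it simplifies 'strict hierarchy' to 'no node answers to more than one superior':

```python
# Invented illustration: a 'Weberian' structure is a hypertext frozen solid.

def is_weberian(nodes, links):
    """nodes: {name: list of definitions}; links: set of (superior, subordinate)."""
    one_definition = all(len(defs) == 1 for defs in nodes.values())  # condition 1
    subordinates = [sub for (_, sub) in links]
    single_stream = len(subordinates) == len(set(subordinates))      # condition 2
    return one_definition and single_stream

bureau = {'Deputy Minister': ['head of mission'],
          'Registrar': ['keeper of records']}
print(is_weberian(bureau, {('Deputy Minister', 'Registrar')}))   # True

# Give a node a second, contextual definition and it is 'hyper' again:
bureau['Registrar'].append('informal adviser')
print(is_weberian(bureau, {('Deputy Minister', 'Registrar')}))   # False
```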
That Max Weber conceived of bureaucracy in a way compatible with the metaphor of a text is not surprising, since he was a jurist. It is precisely its 'seamless' textual coherence that has always made 'rational' administration appear legitimate. Like all texts, bureaucracy has been taken to be rational, 'single-stream,' and 'authored' (chapter 3). Just as we do for other texts, we accord the organization the status of a consummated oeuvre; since it can be thought of as the work of an author, we even attribute goals to it - those that must be assumed to lie behind statements such as one that begins: 'The Government of Canada today announced that ...'

Bureaucracy is the system that results when members of a highly differentiated (and not necessarily de facto coherent) universe of work nevertheless agree to restrict their communication in such a way as to conform to the constraints imposed by a text. They have implicitly agreed to regard the text as legitimate, including the fiction that it is the work of, and exemplifies the intentions of, a unitary actor, incarnated in the titular head. The organization, as we conceive it, is a 'multiplexed' conversation (cf chapter 4). Text is merely a projection of that multidimensional reality onto a unidimensional medium. The success of rational administration, historically speaking, was the triumph of a particular text. It held people's imaginations for a long time.

A hypertext represents an organizational image of structure with which we are less familiar. Since it is no longer clear whether it has one author or many, one logic or several, one or diverse definitions of reality, it resists definition. It is a new kind of fiction - one not yet widely disseminated, or internalized by the working members of society. It perhaps comes closer, however, to today's emerging organizational reality. If we are questioning the old single-text model now, it is because the new technology is making us do so. By its spatial bias, which includes dimensions of both distance, or extension (the transformation in telecommunications), and depth, or intension (the transformation of augmenting tools for knowledge work), the technology is rendering inoperable the restrictions on the generation and sharing of information on which bureaucracy depended. But what kinds of restrictions, other than those of single-stream rationality, would make a system of interlinked transactions function is not yet clear. As things now stand, the hypertext metaphor - and others suggested by the evolving technology of computing, such as parallel processing - becomes not so much a description, to be enfolded in a new ideology of management, as a tool for thinking about organization in a globalized and fragmented world. We are not suggesting that these newer forms will become clear overnight (any more, indeed, than was true for the gradual emergence of modern bureaucracy itself).

Rethinking the processes of authorship

The need for a new philosophy of management

An alternative to the bureaucratic form of organization is by no means self-evident. The 'alternative' is not a new version of an administrative system or a new ideology - a substitute for bureaucracy, with the same plot but a new set of players. It is rather a philosophy of management whose goal is to open the door to learning and ongoing adaptation. As we see the issue, it is not a recipe that is needed, but tools with which management can continue to re-evaluate its own performance and objectives, in a context where change is coming so fast that no one can predict the ultimate outcome of the managerial revolution. It is time for management to realize that it, too, is part of what must be managed, and that its inherited frames of understanding and action were indeed 'frames' - frames that may now need to be reassessed and brought into line with today's realities.

Max Weber's 'ideal type' of scientific administration (or prototype, in more current terminology) was as follows (Gerth and Mills 1958: 196):
• There would be fixed jurisdictional areas, governed by rules and regulations.
• The regular activities would be distributed in a fixed way as duties.
• The authority to give commands would itself be distributed in a stable way and strictly delimited by rules concerning its exercise.
• The fulfilment of duties would be methodical and performed only by qualified people.

While Weber's text may be at least seventy years old, even a superficial analysis of the case study of software development in chapter 2 would demonstrate that his concept has survived intact, provided, of course, that we make allowance for the evolution of jargon ('functions' for 'regular activities' and 'network' for 'distributed in a stable way'). The bureaucratic ideal may have migrated into a new form, software, and found a new context, office automation. But it is still saleable to at least some of the people who run companies and government organizations (Garson 1988).

We are prepared to admit the bureaucratic solution for the so-called progressive sectors of information work (chapter 5) - discretionless data-processing - because by definition progressive work consists precisely of the tasks where a machine can be substituted for human personnel. Bureaucracy, however, is an inappropriate mechanism for the 'stagnant' sector of information work, at exactly the moment that the latter becomes, as a side-effect of automation, an increasingly salient part of the equation. Since the stagnant sector includes management itself (chapter 5), the issue for management is no longer how to manage its 'employees' but how to manage itself. The existence of this reflexive loop is the death knell for a scientific, rational theory of administration, precisely because such a theory always presumed a clear distinction between management as subject (the chooser, planner, analyser, decider, commander) and the organization that it ran as object. That is the whole point of scientism. The moment that we take management to be its own object, the assumption of 'fixed jurisdictional areas' governed by 'rules and regulations' turns paradoxical, because we would be in the strange domain of rules that write themselves and domains that define themselves - a condition that we have known since the time of Epimenides to be rife with antinomies.

Our purpose throughout has not been to take an advocacy role - to substitute, for example, hypermanagement for bureaucracy. Instead, we have been trying to detect in the trends an underlying
logic, pointing to the emergence of new models of how to manage, largely unexplicated until now, but gradually taking visible form in proposals for innovative modes of administration and planning. It is events, not we, that are making bureaucracy out of date. The concept of 'hypertext' seems to us an important clue to what is happening, but nothing more. What is crucial is that managers brought up in the rationalist school of running things need now to be able to visualize an alternative, which they can themselves evaluate in the light of their particular situation and which opens up new horizons of managing. Not a new formula, but a habit of self-criticism. It is in this context that we offer an alternative way of proceeding.

The alternative: a transactional frame

As the reader will recall, in chapter 4 we suggested the idea of 'frames' as fundamental to the making of texts and the writing of scripts, and hence as a key to interpretation and understanding and a guide to intelligent action. We use frames to bracket things - put borders around them - categorizing them as parts of our workaday experience. If we agree that organization is communication (and in the end nothing but communication), then the issue of framing is how we bracket communication.

In the network theory of communication (also described in chapter 4), organizational interaction is framed using two kinds of template. The first is an 'input-output frame,' where a person or machine reads in data, processes it to satisfy a given function, and emits a message that bears information (Figure 7.1). The second template describes the transmission of information from one processor to another (Figure 7.2). As individuals, the people in the organization conform to the first frame; as nodes in a network, to the second. One frame is behaviourist in orientation, perceiving people as isolatable units of action; the other is functionalist, seeing communication as a configuration of relays in a flow of processed information. It is this combination of templates that makes the bureaucratic model appear viable: the performance of tasks is individualized by turning it into distributed functions, governed by rules, and performed by qualified people (or by machines that either do the work themselves or dictate the actions of otherwise unqualified human beings). Coordination is simply the conjoining of tasks to make processes. The management hierarchy - the control system - is arrived at by embedding lower-level in higher-level networks, where networks themselves become objects to be configured (data in, decisions out). This is exactly the pattern described in the case studies in chapter 2.
Figure 7.1 The input/output view of a communication node

The reframing that we proposed in chapter 4 would substitute a unit of transaction, replacing that of transmission and incorporating the concept of processing. It is described formally by J.R. Taylor (1993) in Figure 7.3. We have bracketed the flow of information differently from the conventional network diagram, to make it a joint accomplishment - a transaction of the kind described earlier. In our view, the transformation is the minimal unit of communication; anything smaller (including the network unit alluded to above) does not count as communication. It is merely a fragment of communication - the shard of information-processing. In our new frame, all transformations (a term that we prefer to functions, or processes) are situated in the context of a complementary relationship, involving giving and taking accomplished by partners in communication.

There were many reasons cited in chapter 4 for this reframing of the organizational conversation: 'processors' have identities, and identities are mutually negotiated; information is dependent on context and mutually determined; interaction is reflexively self-organizing; and so on. The point is that by carving up the organizational conversation in a different way - bracketing it differently - we are seeing it through another lens, one through which the precepts of rational bureaucracy (its 'scripts') no longer make sense. The functions of a transaction, for example, can no longer be unilaterally imposed from the outside, if only because the interactional function (or what Halliday [1970] calls its 'interpersonal' function and Austin its 'illocutionary force') is co-determined by the participants and freshly discovered through each succeeding exchange. Similarly, the network is not a structure of dead channels - physical circuits for the transmission of signals - but living tissue - the stuff from which organization is created.
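The contrast between the two bracketings can be put in schematic code. This sketch is our own - the names transmit, transact, and Partner are invented - and it claims no more than to show where the unit of analysis falls in each frame:

```python
# Transmission frame: nodes are relays; the unit is the message in transit.
def transmit(data, relays):
    for relay in relays:          # a configuration of relays, each one a
        data = relay(data)        # self-contained input-process-output unit
    return data

# Transaction frame: the unit is a joint accomplishment between partners;
# what the exchange means is co-determined, not imposed from outside.
class Partner:
    def __init__(self, name):
        self.name = name
    def respond(self, offer):
        return f"{self.name}'s uptake of '{offer}'"

def transact(giver, taker, offer):
    uptake = taker.respond(offer)    # the taker helps fix what the offer is
    return (giver.name, taker.name, offer, uptake)  # outcome belongs to both

print(transact(Partner('manager'), Partner('analyst'), 'request for estimates'))
```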
Figure 7.2 The transmission view of a communication relationship
Let us explore this change of perspective a bit further by considering the transactional frame in conjunction with some of the traditional bureaucratic frames.

Bureaucratic frames and scripts in a new perspective

Management is, in essence, a set of communicational practices, a socially constructed reality (Berger and Luckmann 1966), and its meaning resides in the sum of those practices. Practices provide stability. They furnish frames that simultaneously create fields for the conduct of interaction - a behavioural parameter - and make it comprehensible - a semantic parameter. It is the frames that guide the construction of 'scripts' - guidelines for conduct in action. The practices of rational administration, both public and private, have been framed largely by the bureaucratic model. Within this comprehensive framework (a 'family of frames'), the communicational situations of administration themselves break down into sub-frames.

The 'supervisory' frame, for example, brackets those interactions involving someone 'reporting' to another, who is identified as having superior authority. The accompanying managerial script has the person in charge being given responsibility for organizing the work of his or her subordinate and for assuring that the objectives of work are met. The employee's script is assumed to be complementary: everything undertaken by him or her should be verified with the superior. Any deviation from this rule is equivalent to insubordination. Within these broad guidelines, there is room for a variety of styles of management, all the way from detailed micro-management to a more casual, laissez-faire mode of working together. But the supervisory frame, and the general lines of the script that accompanies it, are invariant.

A transactional model of organization radically revises the concept of 'supervision' (overseeing, literally). One is 'visioning' not the performance of individuals, as such, but the conduct of transactions.
Figure 7.3 Communication as a transaction
How people behave, how they interpret things, obviously comes into it. But the focus is elsewhere: on the way in which the critical exchanges, out of which a dynamic view of organization is built, are realized, and on what value they generate. If the focus has shifted, so has the conception of how supervision is achieved. In a transactional view, the exercise of authority is something to be accomplished, not something to take for granted merely because one is 'in charge.' One is dealing not just with individuals, but with processes. The hierarchy is a structure of conversations, not a ladder of statuses and authority.

We have explored the issue of 'worldview' elsewhere (J.R. Taylor 1983; 1993). In simulations of enterprises by computer, it is common to distinguish between basic points of departure, centred either on the 'particles' on which operations are performed or on the 'operations,' or processing nodes, which deal with the particles. In writing a computer simulation it is necessary to decide which of the two 'worldviews' to adopt; in either case, the way in which the organization is imaged is different - focused either on the customer (or the equivalent) or on the functions that serve him or her. By its framing logic, its 'worldview,' bureaucracy is operations-, not particle-oriented. An operations-oriented style of management commonly loses sight of transactional dynamics, since, as the second case study of chapter 2 showed, transactional trajectories do not respect 'jurisdictional' boundaries.
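In simulation terms, the two worldviews differ in little more than which loop comes first, yet that choice fixes what the model treats as the unit of interest. A toy sketch of the two orientations (ours, with invented job and step names):

```python
# Invented illustration: the same office, simulated under two worldviews.

jobs = ['file-001', 'file-002']           # the 'particles' (customers, cases)
steps = ['receive', 'assess', 'decide']   # the 'operations' (processing nodes)

def particle_oriented():
    for job in jobs:                       # follow each particle's trajectory
        for step in steps:                 # across whatever operations it meets
            print(f'{job}: {step}')

def operations_oriented():
    for step in steps:                     # station yourself at each function
        for job in jobs:                   # and watch the particles queue past
            print(f'{step}: handles {job}')
```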
'Rational' administration is notoriously unresponsive to the problems of snarled processes and confused jurisdictions. Transactional theory treats this issue quite differently, since it is inherently oriented to processual dynamics rather than to task mechanics.

The supervisory frame is itself set within the 'hierarchical' frame, which specifies that, between levels of authority other than immediately supervisory, the norm is to 'go through channels.' Direct interaction may be initiated only by the superior level, other than in very special circumstances, as with clear dereliction of duty by one's immediate superior (and even here there may be an ambiguity in favour of strict respect for hierarchy). The hierarchical frame equally brackets interactions between the horizontal components of the bureaucracy, such as departments. This frame presupposes that the responsibilities of administration are grouped around functions with clearly defined frontiers, both socially and conceptually, and that, within a given division, vertical integration of functions is the norm. Between divisions, the rule is 'hands off!' This concept of hierarchy, however, actually inhibits dealing with the transactional patterns that cross jurisdictional frontiers and turns decision-making for broad issues into an exercise in 'turf protection.' Again, the software development case (chapter 2) illustrates the phenomenon very well. In contrast, in our image (the hypertext model of organizational conversation), the hierarchy is one of transactions, not of people. The difference between them is that in a hierarchy of transactions issues of authority are not raised; what counts is optimization of value.

A 'client/bureaucrat' frame brackets exchanges involving the bureaucrat and the public. In the traditional bureaucratic frame, this transaction was supposed to be circumscribed by strictly regulated conditions and routines. The service provided to the public had to be governed by norms, establishment of which was a responsibility, and a prerogative, of the administrator; it was not even conceptualized as a service. The accompanying script had a member of the public making application for attention - petitioning, at the limit - to which the bureaucrat courteously, but firmly, responded, within the guidelines for a given category of request. All requests falling outside the rules had to be (politely, in principle, but certainly) rejected. The rules were supreme.6

In our model, all transactions are assessed, instead, on their 'value.' The transaction involving the organization and its public is crucial here, because it helps shape both functions and identity. The concept of 'customer orientation' comes into play, and recognition of client
relationships within the organization as well as without. The reward system of a transactions-driven administration focuses not on individual performance but on the outcomes of transactions. We accept that the pattern should be the same whether the service is performed within the organization or is 'out-sourced.' We are not envisioning, thus, a return to a pure market system; but management confronts alternatives - between doing something inside and farming it out - that have to be evaluated on merit. In this respect, our perspective resembles that of contemporary economist-organizational analysts such as Porter and Clemons, cited in chapter 5, who emphasize the economic imperative behind organizational structure. They posit organization as a collection of value centres, interconnected to form a value chain, or a value system. They thus leave open such questions as whether to manufacture a part oneself or to buy it, and whether an ancillary service such as computing should be done in-house or out-sourced. Both questions, involving comparative cost and quality, make the notion of an organizational boundary appear an arbitrary effect of history. As also explored in chapter 5, if the distinction between 'in' and 'out' is negotiable, then the mechanisms of control central to bureaucracy are now very much open to question. The people who are 'in' may realize that they could just as well be 'out' and demand to be treated as though they already were.

Yet another bureaucratic frame brackets exchanges involving senior management and its outside points of reference, such as shareholders, board of directors, cabinets, the press, and the public. This 'answering' frame clearly demarcates the organization, as an independent unity, from those parts of society entitled to hold it responsible for its corporate actions. This frame is also based on a presupposition: that the officers of the organization have access to a privileged set of information sources, which they uniquely control, and management of which defines the relationship with their presumed 'masters.' The script that accompanies this frame predicates judicious control over what information is released, in a context of formal respect for the authority of those who have the final word on the organization's policy. We have no difficulty with this frame, because it is inherently transactional. In fact, it confirms what we already knew: that administrators never thought of themselves as conforming to the rules of bureaucracy - only their employees.
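The make-or-buy evaluation at the heart of this value-centre view reduces to a comparison of cost and quality, however those are scored. The figures and the weighting below are invented for illustration only:

```python
# Invented illustration of evaluating one value centre 'on merit'.

def keep_in_house(in_house, out_sourced, quality_worth=500_000):
    """Compare options on cost and quality; quality_worth prices one full
    point of quality, and is itself a management judgment, not a given."""
    def merit(option):
        return option['quality'] * quality_worth - option['cost']
    return merit(in_house) >= merit(out_sourced)

computing_service = {
    'in_house':    {'cost': 120_000, 'quality': 0.9},
    'out_sourced': {'cost':  80_000, 'quality': 0.8},
}
print(keep_in_house(**computing_service))   # True: the quality premium pays
```

Nothing in the comparison itself refers to whether the service sits inside or outside the boundary - which is precisely what makes that boundary look like an arbitrary effect of history.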
The strategic dimension

Let us pause for a moment to take stock. In our conceptualization, organization is born in the translation from conversation into text, and subsequent events are driven by the opposite translation of text into conversation. The conversation, in and of itself, is merely experience: it has no meaning. Its meaning comes from its textualization. The 'reality' of organization lies neither in the conversation nor in the text, but in their dialectical interaction.

The text has a buried structure, or 'frame.' Organizational frames allow for the bracketing of the ongoing conversation that gives it meaning. Such frames begin with a metaphor: we see the organization in terms of something else. The parallel case is well known in science, where a frame is called a 'paradigm' (Kuhn 1970) (we return to this idea in chapter 8). The solar system looks different depending on whether we see Earth as its centre or the Sun; the atom, depending on whether we conceive of it as matter or energy; light, depending on whether we see it as particle or wave. There is no absolutely certain way to describe the organization; how we perceive the organization in the conversation depends on how the conversation is framed. This in turn determines how we write our texts and what scripts we follow. We are proposing and exploring a reframing: away from a 'rational' bureaucratic concept of administration, grounded in a network theory of communication, to a hypertext model of administration, grounded in a transactional theory of communication.

The traditional bureaucratic view is a heritage of nineteenth-century positivism, which held that social systems, such as complex organizations, are reducible to a material explanation; hence the machine metaphor. A machine, in its physical manifestation, is composed of functions and the links between them. By extension, we have come to think of people as functions and of their interconnection as mechanical - just signal transmissions, carrying information. This view, we have argued, has been outmoded by the information revolution of the twentieth century. After Turing, it was discovered that for every materially realizable machine there is a text, called 'software.' But there are texts that describe machines that have no physical realization - the subject of chapter 6. There are ways, it follows, to conceptualize the organizational conversation other than as a physical object, or machine. This is the domain of hypermanagement. The existence of this universe of ideas opens the door to a kind of
non-static administrative thought which allows for evaluation of alternative strategies of development. In a globalizing and atomizing world, this new way of imagining the organization will become important.

The deficiencies of a bureaucratic structure of command stand out sharply in profile when re-examined from a transactional perspective. First, the detailed control of individual behaviour is an expensive business, requiring an elaborate structure of supervision, which has led to the towering - and very costly - superstructures of North American administrations, so strongly criticized in recent years, especially as they compare with their much slimmer Japanese counterparts. Control over individuals is not simple, since the writing of rules and regulations is merely a first step, and what costs the most is ensuring conformity - through conversation - to the rules. When supervision is delegated, the probability of dilution is greatly increased, as middle managers substitute their own interpretations of the intentions behind the rules. Second, a hierarchical command structure makes the relationship of supervisor to supervised the most salient transaction. We sometimes forget that authority is a relationship: in real-life organizations it must be negotiated transactionally, as a speech act. It is no accident that managers spend four-fifths of their time in talk (Weinshall 1979). In a bureaucracy, people quickly come to understand that it is this transaction, involving negotiation of authority, that most determines their future in the system and that they should be paying primary attention to it. The transactions for which they are functionally responsible, such as dealing with the public, are of secondary importance.

The organizational hyperplanner conceptualizes the organization as being composed of transactionally realized, value-creating centres. The task of administration is to find ways of optimizing value-producing potential. There are two ways to do this: by enhancing the value-creating potential of the transactional nodes and by arranging the interconnections of the nodes in such a way as to produce a more effective process. The first of these, technologically interpreted, is an issue of augmentation; the second, a question of global planning.
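The two levers - augment the nodes, rearrange their interconnections - can be given a schematic form. The scoring rule and the numbers below are our own invention, meant only to show that the same nodes yield different value under different configurations:

```python
# Invented illustration: same nodes, different value under different wiring.

potential = {'intake': 3, 'analysis': 5, 'service': 4}   # per-node potential

def chain_value(chain):
    """Toy rule: a chain realizes its weakest node's potential at every step."""
    return min(potential[node] for node in chain) * len(chain)

one_long_chain = [['intake', 'analysis', 'service']]
two_processes = [['intake', 'service'], ['analysis']]

print(sum(chain_value(c) for c in one_long_chain))   # 9
print(sum(chain_value(c) for c in two_processes))    # 11: rewiring added value

potential['intake'] += 2                    # augmentation: enhance one node
print(sum(chain_value(c) for c in one_long_chain))   # 12
```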
Managing supposes implementation of a conversation involving the planner/manager and the people involved in the transactions. This is not a command structure, but a negotiation process. The power of the manager resides less in his or her authority (other than that which can be legitimately earned through interaction) than in the resources that he or she commands and the incentives at his or her disposal.

Behind this conceptualization lies another: that the planner/manager is concerned with virtual as much as with real organization. From knowledge of existing and potential systems of transaction, and from analysis of the paths by which they might be connected, organizations are constructed, conceptually, before they are assembled, in fact.

The constraints of atomization and globalization

The hypermanagement model proposed in this chapter is intended to promote creative methods of putting together the parts of an organization and managing them. It is not an invitation to conform to a new 'authored' text. The contemporary manager needs to be able to vary his or her frames of understanding, depending on circumstance. The relevant circumstances are globalization and atomization.

Erving Goffman has pointed out that frames may differ along two dimensions: 'level' and 'span.' When a film director or television producer is creating a scene, for example, he or she has a choice to make in positioning the camera, or cameras. The camera can be placed 'close to' or 'far from' the object of attention, and when the physical placement of the camera is selected, a second order of choice is involved: deciding on the 'focal length of the lens' to be used. Depending on these options, a very different view of the scene can be obtained. By moving the camera, the object can be viewed from in front or from the side, from below or above, in the distance or near. The choice of focal length of the lens allows for more or less of the scene to be shown: to focus in close-up on one person or thing in the situation, or to widen the focus in order to see the object of attention in relation to its context of other people and decor. Such directorial artifices subtly shape our perception of unfolding events, our attention having been oriented by the strategy of focusing.

The manager confronts a similar potential. The film or television director is building a scene that unfolds within a physically circumscribed situation involving the interaction of a limited number of actors. The manager, however, must develop an imaginative capacity that allows him or her to switch from the micro-settings of the organization to the macro-developments of an economy in flux, while seeing both from two different perspectives: the social and the technological.
Figure 7.4 The manager's organizational perspective
Figure 7.4 suggests one way of visualizing this challenge. The 'level' of focus can vary from outwards to inwards, depending on whether the orientation is external or internal; this dimension is strategic. The 'span' of focus can vary from social to technological; this dimension is socio-technical.

The forces of globalization, and the transformation of transactional patterns thus implied (chapter 5), have placed new emphasis on the external environment. Previously, firms competed as units for position in a relatively stable market, with known players. Now, corporate alliances shift in a gavotte of floating partnerships and virtual organizations, all in search of at least a temporary comparative advantage, in markets that keep changing their boundaries, pushed by entrepreneurial aggression and political manipulation of the rules of competition.7 Strategically, the manager must decide which transactional units make up his or her organization, what their value is, how
they relate to the larger value system in which his or her organization is embedded, and whether to integrate vertically, by controlling the chain of transactions (perhaps by out-sourcing some value centres), or to integrate horizontally, by joining a value-enhancing partnership with other centres of value creation. The situation is further complicated by the 'vanishing status quo,' where rapid technological change keeps altering the values that distinguish a given transactional system. This is the reality confronting established enterprises in the private sector,8 but similar effects can be detected in the public sector.9

Atomization gives the internal orientation a new importance, because here again the character of transactions is undergoing a revolution (chapters 5 and 6). The organization is no longer a fixed structure of functions; it is a configuration of value centres in rapid evolution, as the micro-technology of computing is integrated by professional knowledge workers to produce an extraordinary augmentation of skills. This universe requires a new managerial philosophy, one which places its incentives on the products of transactions, not on their mechanics.

The move to a transactional way of thinking sacrifices clear boundaries, both internal and external. The blurring of boundaries is one of the primary facts of the information society (Cleveland 1990; Michael 1992). Because of globalization, the current reshaping of the value system involves vertical disintegration, rather than vertical integration. The conduct of business increasingly resembles the operation of what Reich terms a 'global web.' Because of atomization, there is a merging of formerly distinct functions and products to form new hybrids.

The strategic orientation, inwards and outwards, is in turn set within a socio-cultural context where the manager must begin to understand technology not as an infrastructure, or a fixed parameter, but as an enacted environment (Weick 1979). Technology is thus both a creation of social forces and a reflection of organizational ideologies, while being, simultaneously, an extrinsic material factor necessitating continuing adaptation. The result is what Weick called 'sociocultural evolution,' though he might just as well have designated it 'sociotechnical' evolution.

Bureaucratic ideology has traditionally encouraged a fixed image of the organization, within the frames that we have outlined. It posits a relatively static view of the environment of transactions, in which the organization figures as a player in competition with a few others
of similar structure and dimension. It has equally encouraged seeing technology as a function to be delegated to the appropriate department, responsible for the network and information-processing infrastructure of the organization but not otherwise implicated in its strategic decisions.

The task of the manager is different in our vision. First, we conceive of the organization not as an army of workers to be run from a central command headquarters, engaged in economic warfare with hostile forces, but as a set of centres of activity that develop value. It follows that the strategic task of senior management is not to direct operations but rather to design and coordinate combinations of enterprise which include both in-house and out-sourced components, on a chessboard with multiple players. Second, the organization, in a hypertext world, is made up of loosely coupled entrepreneurial cells, capable of augmenting the variety of products and services that the organization can offer to an increasingly demanding public. It follows that the manager's task is less to regulate, set standards, and evaluate performance according to fixed norms than to create the back-up structures of coordination and service that will foster a strong operational base. Finally, we reject the network theory of communication and see communication instead as the glue of the organization. The contemporary manager must therefore develop a socio-technical orientation, within which to evaluate not just the technical innovations that continue to proliferate, but also changing human assumptions about communication.

The learning organization

You will recall from chapter 4 Bateson's distinction between 'Learning I' and 'Learning II.' Learning I, in the context of management, would mean staying within the old bureaucratic frame of understanding and trying to adapt its structures and practices to the realities of a very much changed world. It is problem-oriented, but within limited parameters of search for solutions. While it is prima facie a reasonable approach, as long as it works, it has a risk. When existing frames no longer unambiguously identify unfolding situations, things that could previously have been taken for granted - the 'default' conditions - come forward into conscious attention.
Background ('frame') becomes foreground (a focus of conscious attention). This increases both the information to be processed (a function of message content) and the ambiguity of interpretation of events. This is equivalent to a state of 'information overload,' for our frames and scripts reduce information load. As in computer programming, where the more powerful the basic system, the less the application software has to accomplish, so with managers when frames are unambiguous. Bureaucracy could reduce ambiguity and variety for the manager. Once functions had been identified, and procedures put in place, management could get on with other preoccupations, secure in the knowledge that the 'system' would take care of performance.

Learning II therefore involves fundamental reframing. It cannot be done from the outside, by some 'objective' observer, because the situation has far from stabilized, and the patterns of the future are still to be discovered. This type of learning must come from the inside, from the managers themselves. For it to occur, there has to be a new kind of partnership, uniting in the learning exercise both practitioners and theorists, in a climate of mutual openness - a conversational exchange without hierarchy. It is an exercise that can occur only over time, out of the regular field of daily responsibilities, by practising administrators open to new ideas and prepared to examine, through case studies, their own experience.

A similar suggestion has been advanced by Karl Weick (1979). In his view, the environment confronting the manager is always open to more than one interpretation: events are fundamentally equivocal, their meaning never quite transparent. In 'relatively' quiet and predictable circumstances, however, administrators develop systems that can become quite elaborate and can be translated into detailed rules which people in the organization are supposed to follow, to the letter. In terms of our theory, we would say that the text has then become dominant, and the conversation subservient to it. The premiss of automation is that rationalization along this model is possible. Such a solution to the managerial dilemma works, in Weick's view, only so long as the environment will tolerate it. When events become increasingly equivocal, less and less in conformity with established routines, and more and more difficult to read, the instinct to respond rationally, by slightly readjusting the existing system, is doomed to failure. At this point, the organization's adaptedness to the previous conditions actually becomes an impediment to successful readaptation to the new.

In Bateson's terms, such circumstances push us from Learning I to
Learning II. What is needed to confront equivocality is augmentation of the variety of ways of seeing things - framing them in different ways - out of which understanding can begin to emerge. To achieve this, the organization has to turn away from strict adherence to its formulas and rules and admit the existence of what Weick calls 'a sprawling, equivocal process [that] contains many independent elements that have few internal constraints' (1979: 190). Out of this conversation (our term) learning occurs: 'Equivocal processes are rather untidy, they often work at cross purposes, and they often have the appearance of being wasteful and inefficient. Those seeming inefficiencies testify that the process is working, not that it has malfunctioned. The process is working because it has registered discontinuities, and it preserves them for further sense-making' (192). We return to this idea of Weick's in the final chapter.

Conclusion

In their synthesis of the results of a seven-country OECD-BRIE (Berkeley Roundtable on the International Economy) program of case studies on information networks and business strategies, Bar and Borrus (1989) describe a 'telecommunications learning cycle.' The first instinct of a management confronted with the potential of the new information technologies, they point out, is to automate the existing organization.10 Automation, especially vis-a-vis productivity, has an obvious appeal: it speeds up processing and cuts costs. Productivity is a ratio of output (the value of what is produced) to input (the costs of production) (Baily 1986; J.R. Taylor 1993). On the most elementary of logics, reducing staff and shortening production time is a way of trimming costs, and hence of increasing productivity. The shrinking of bureaucracy beginning in the mid-1970s that we have described in chapter 1 - and that Harrison and Bluestone (1988) have termed the 'Great U-Turn' - indexed the trend to a 'leaner and meaner' approach to management. The challenge was seen to be how to cut positions, eliminate operations, simplify tasks.

Automation, though, is not the ultimate answer. It may not even be the right first response, certainly not for all kinds of enterprise: 'During the automation stage, companies tend to deploy separate networks and applications to automate discrete tasks; the architecture and configuration of these networks closely [follow] the company's existing organization, or that of the task being automated. All the
case studies illustrate these characteristics ... Over time, however, the firm's organization, the kinds of work it does and the ways in which it operates will need to evolve and adapt to changes in its environment ... Corporate networks inherited from the automation of a previous organization are not well suited to the new requirements and may therefore limit the firm's ability to re-organize' (Bar and Borrus 1989: 18).

The previous two chapters should make clear why automation is not the final answer. It is a management theory designed for the 'progressive' sector and takes no account of the augmenting, or informating, capacities of the new technologies. It is irrelevant to the 'stagnant' sector. Even in terms only of the contribution of the technologies to productivity (itself a notion derived from the progressive sector), the flaw in management's reasoning can be seen: automation reduces costs, but it does not enhance value. Lowering costs is not the only means to increase productivity; it is often a palliative designed to ward off the evil day when a firm, or a government department, finally has to face the fact that it can no longer deal with today's environment. A more creative solution would keep costs down but would also open up new frontiers of development.

The new technologies constitute a mine of potential value-creating resources. To exploit their potential, management will have to pass beyond its initial tendency to try to assimilate these new kinds of tools to its old ideas of production. It will have to transcend automation as its privileged strategy. This will mean breaking with some well-ingrained habits of thought. The issue is no longer how to control people within ever more tightly constrained strait-jackets of routine. It has become how to organize, creatively and coherently, the efforts of a highly creative and independent-minded work-force. People who make the vital choices may not know how - or spontaneously try - to do this. Donald Norman (1992: 89) has recently remarked: 'Computer systems intended to aid people, especially groups of people, must be built to fit the needs of people. And there is no way that a system can work well with people, especially collaborative groups, without a deep fundamental understanding of people and groups. This is usually not the sort of skill taught in computer science departments.'

Norman was addressing technologists but could just as well have referred to managers trained in one of the excellent schools of business located all over North America. The contemporary MBA typically arrives bearing quite extraordinary qualifications, in many areas from
accounting to marketing to organizational design and strategy. Yet there are also significant gaps in that education. For one thing, most business schools have lagged behind in offering programs in communication, and so graduates often see relations with employees as a problem of personnel and individual psychology. This approach misses the interactive dimension illustrated in chapter 2 and trivializes the supervisory role. One cannot intelligently plan a system of transactions before learning, first, to think of them as being transactions. For another, until recently, few management schools have seen the technology of communication as a significant area of study and professional preparation. As a result, the leaders of many organizations failed to understand the profound transformations now under way (chapters 5 and 6). It is the interaction between the technology and its human context that needs to be grasped - and this is seldom even considered. Finally, and perhaps most serious, the modern administrator has been given little sense of historical perspective. The acquisition of highly technical skills can actually inhibit understanding of what is happening.

An interesting approach that could be much more widely practised is that of simulation and 'scenario-building' - thinking seriously about alternative futures. Such exercises could stretch the imaginative capacities of the future executive and encourage the kind of flexibility needed to examine more than one possible future, and more than one response. This is the kind of thinking that is essential to hypermanagement. As Bar and Borrus envisage their 'learning cycle,' the only way to learn is by trial and error, by experimentation. We hope that this chapter may have suggested some of the lines along which such learning might proceed.
8
The fall of the fortress?
The ideas of instability, of fluctuation diffuse into the social sciences. We know that societies are immensely complex systems involving a potentially enormous number of bifurcations exemplified by the variety of cultures that have evolved in the relatively short span of human history. We know that such systems are highly sensitive to fluctuations. This leads both to hope and a threat: hope, since even small fluctuations may grow and change the overall structure. As a result, individual activity is not doomed to insignificance. On the other hand, this is also a threat, since in our universe the security of stable, permanent rules seems gone forever.

Ilya Prigogine and Isabelle Stengers (1984) Order out of Chaos, 313-14

The modern study of chaos began with the creeping realization in the 1960s that quite simple mathematical equations ('causes') could model systems every bit as violent as a waterfall. Tiny differences in input could quickly become overwhelming differences in output - a phenomenon given the name 'sensitive dependence on initial conditions.'

James Gleick (1987) Chaos: Making a New Science, 8

Introduction

Social scientists face the same problem as everyone else in identifying and describing the relationship between processual change and structural effect, in the context of broad secular trends. It is very difficult to determine when or how change has taken place when you are actually living through what may later turn out to have been
a transition from one era to another. Which variables should we be looking at? When does the sum of individual experiences of change add up to a collective structural transformation? This indeterminacy lies at the root of our present dilemma as a society - how to react to such all-encompassing phenomena as the globalization of the world economy and the informatization of our society.

Our inability to predict and understand what is happening was pinpointed many years ago by Norbert Wiener (1948). Wiener was responding to pressure from friends to apply his undoubted mathematical skills to analysis of compelling social problems, in what he called 'the present age of confusion.' Here is his response (1948: 24-5):

Much as I sympathize with their sense of the urgency of the situation ... I can share neither their feeling that this field has the first claim on my attention, nor their hopefulness that sufficient progress can be registered in this direction to have an appreciable therapeutic effect in the present diseases of society. To begin with, the main quantities affecting society are not only statistical, but the runs of statistics on which they are based are excessively short. There is no great use in lumping under one head the economics of steel industry before and after the introduction of the Bessemer process, nor in comparing the statistics of rubber production before and after the burgeoning of the automobile industry and the cultivation of Hevea in Malaya. Neither is there any important point in running statistics of the incidence of venereal disease in a single table which covers both the period before and after the introduction of salvarsan, unless for the specific purpose of studying the effectiveness of this drug. For a good statistic of society, we need long runs under essentially constant conditions ... The advantage of long runs of statistics under widely varying conditions is specious and spurious. Thus the human sciences are very poor testing-grounds for a new mathematical technique: as poor as the statistical mechanics of a gas would be to a being of the order of size of a molecule, to whom the fluctuations which we ignore from a larger standpoint would be precisely the matters of greatest interest ... I may remark parenthetically that the modern apparatus of the theory of small samples, once it goes beyond the determination of its own specially defined parameters and becomes a method of positive statistical inference in new cases, does not inspire me with any confidence unless it is applied by a statistician by whom the main elements of the dynamics of the situation are either explicitly known or implicitly felt.
The limitation to which Wiener is referring would be recognized by any specialist in statistics. It is the case where the values of a system parameter change so rapidly that the phase space of the system being studied is modified in a step-wise fashion. The scientist-analyst who normally makes judgments using trends perceived while looking over previous data, on an elementary principle of linear extrapolation, can no longer validly do so. Exact predictive activities in statistical reasoning suppose stable system parameters.

The problem is not limited to scientists. The practical person-of-affairs, administrator and politician, may similarly find that intuitions honed to a fine point over years of experience in one universe of experience lead to unfounded conclusions and misguided decisions in another. The recourse suggested by Wiener - namely, reliance on additional explicit knowledge or an implicit feeling of the 'dynamics of the situation' - supposes more than usual perspicacity, as well as the courage to break with exclusive reliance on habitual (and reassuring) 'hard-data' supports to judgment. It also supposes recourse to theory that cannot be empirically generated, or 'grounded,' to employ Glaser and Strauss's (1967) term. The grounding in one phase would lead, on the most elementary of logical principles, to false presuppositions in another (the phase space being just what we presuppose - i.e., the 'frame'). We are impelled therefore towards a strategy that pushes insight - 'Verstehen,' to use Weber's name for it - out onto centre stage. In such a context, creative use of theory takes on new importance. New criteria for scientific adequacy and validity come into play. The notion of scientific evidence itself takes on new meaning.1

We believe that theorists and practitioners of organization have had the rug pulled out from under them, just as Wiener described. Fusion of the modern technologies of computing and communication into a single, seamless system for storage, manipulation, and transmission of information is at least as disruptive as 'introduction of the Bessemer process,' 'cultivation of Hevea,' and 'discovery of a cure for syphilis.' This is especially so when the administrator's dependence on these new technologies in large corporate and governmental agglomerations becomes daily more salient. It looks like one of those situations where once again, with our statistical crutches taken away from us (as Wiener argued), we must fall back on our knowledge of the 'dynamics of the situation.' For the authors of this book, this exigency translates into seeing organization as communication.
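Wiener's objection lends itself to a toy demonstration (ours, with invented numbers; nothing here is from Wiener): fit a single trend across a run of data that straddles a step-wise shift in system parameters, and the extrapolation is confidently wrong.

    # A toy regime shift: a linear trend fitted across the break
    # extrapolates badly - Wiener's 'short runs' point in miniature.
    import numpy as np

    rng = np.random.default_rng(0)
    t = np.arange(40)
    # Regime 1 (t < 20) and regime 2 (t >= 20) obey different dynamics.
    y = np.where(t < 20, 1.0 + 0.05 * t, 4.0 - 0.10 * (t - 20))
    y = y + rng.normal(0.0, 0.05, t.size)

    slope, intercept = np.polyfit(t, y, 1)          # naive pooled fit
    print("pooled-trend forecast for t = 50:", intercept + slope * 50)
    print("actual regime-2 value at t = 50:", 4.0 - 0.10 * (50 - 20))

The pooled fit projects a rising series (a forecast near 4.7), while the regime that is actually in force yields 1.0: the 'long run' is worse than useless once the parameters have shifted.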
The unique character of communication

A communication system possesses a double nature. It is at one and the same time a medium for realization of our exchanges (conversation) and the means by which we give those exchanges meaning (text). Messages are units of meaning - information, based on a system of knowledge - but they are also the matter of transactions, based on systems of social exchange. At both levels, communication is subject to evolution, but the logic of change is different - in the latter case, real (configurations of messages, composing actual transactions), and in the former case, symbolic (what the configurations mean). Measurement of the real part and the symbolic part calls on quite different disciplines: process analysis (with or without benefit of statistics) and structural analysis of meaning, Foucault's (1976) 'archaeology of knowledge,' respectively.

What does this conceptual dichotomy mean for understanding the evolution of bureaucratic organization, after one has factored in the variable of technological transformation? We believe that when communication technology evolves, it triggers a different dynamic of change for the two parts of communicational experience. On the one hand, our systems of communication are built up through an iterative process of interaction, and they seem usually to change incrementally and continuously, with no particular point in time identifiable as a 'break.' This is because, as Wiener said, we 'molecules' caught up in the daily round of experience cannot detect the larger patterns. Even when there are discontinuities, historically speaking, they are unlikely to be immediately evident to the people involved. Nevertheless, if we credit Wiener, there are such discontinuous breaks. The problem is to recognize them.

On the other hand, our descriptions of these systems function on the principle of the discrete. Our manner of carving up the matter of experience into categories appears to fall under the heading of discontinuous (Leach 1964). It is easy to see this tendency at work in the popular press. We say, for example, that many countries have either been 'abandoning communism' or 'adopting capitalism,' as though the difference between communism and capitalism were absolute, and as if the change would put more food on the shelves and eliminate bureaucracy in a single day. For most citizens it does not seem this way, if they judge only by their daily round. The discontinuity was in the text, 'communism versus capitalism.'
So both conditions can occur. The text may change, while the transactional system that it describes lags behind (the typical problem of a revolution). But equally, the transactional system may change, while our interpretive account of it - our text - may not have budged. It is the latter case with which we are concerned here.

A system of presuppositional images of reality is essential to making sense of the world - it frames events to make them comprehensible. But when the world is changing quickly - to the point of abrupt shifts of formerly stable parameters - it blinds us to what is happening. The explicit part of language - what we focus on in conversation - is remarkably well suited for dealing with the ordinary flux of experience. The buried, presuppositional part of language - its frames of embedded metaphors - is not so flexible. There is always, of course, a tension between the manner in which we do things and the way in which we assign them meaning - between our actual ongoing transactional patterns and the metaphors that we accept as their symbolic equivalent. Yet we tolerate the tension between the real and the symbolic as long as the inconsistencies have not become so great as to render the basic description completely inadequate. It is when too many anomalies exist that symbolic change may finally occur. When it does, it is because the metaphor that we once used has lost its power once and for all. We believe that our society is at such a place.

Wiener referred to step-functional kinds of change. In chapter 1, we saw evidence of such a change of pattern with respect to the variable of the modal size of organization. When we divide the period from the beginning of the communication revolution, roughly 1960, to the present into two parts, and we examine the behaviour of the variable of typical size of enterprise, the discontinuity becomes clear. Furthermore, the most recent evidence continues to support this hypothesis. The shrinking of organization in North America continues unabated. This is a real change, often accompanied by personal anguish for those affected (Bennett 1990). While size is the variable that we cite at this point, our intuition tells us that the shift is much more fundamental, affecting even more socially important traits of organization, such as exercise of authority, stability, and legitimacy. How can we conceptualize step-functional changes of this kind?

Technological development as a catastrophe

Change has tended to be treated in scientific theory (in classical, pre-
quantum physics, for example) as smooth alteration of behaviour or, in other words, as continuous. Yet empirically we know that change may also occur as a break, something that is discontinuous. These are 'sudden changes caused by smooth alterations in the situation' (Poston and Stewart 1978: 1). Mathematically speaking, such breaks are called catastrophes. Catastrophes typically occur where some small change in one variable causes large variations in a dependent variable. The modelling of change as catastrophe is associated with the French mathematician René Thom (1980). By 'catastrophe,' Thom means not a disaster in the ordinary sense of the word but rather a discontinuity, a 'rupture,' a clear break, a displacement or abrupt change of system parameters (note the connection to the Wiener citation). His use of the word has none of the negative connotations of catastrophe in normal use.

Why might the catastrophe metaphor offer particular insight into our present problem? The basic notion underlying catastrophe theory is quite simple (although the theory quickly becomes complex). Imagine a situation where an agent of some kind is trying to maximize (or minimize) the values of a particular variable, taking account of a second variable that is environmental in character. For example, we might be thinking of an administrator who is trying to formulate policy that maximizes public support (Isnard and Zeeman 1976). Suppose that public opinion is divided into two major tendencies, 'hawk' versus 'dove.' If, at some point in time, by a cumulative shift in personal positions, the popular majority evolves from one tendency to the other - let us say, away from the hawk and towards the dove, when policy has previously been predicated on the hawk majority - the policy-maker now confronts a change of 'catastrophic' proportions. Previously, changes of policy could be incremental, as minor shifts in the consensus occurred; now they become catastrophic. An abrupt reversal of position from hawk to dove may be necessary if the administrator is to continue to adhere to the philosophy of the majority. Here the main conditions for a catastrophe are met. The independent variable, public opinion, was undergoing a smooth alteration from one configuration to another, while the behaviour of the dependent variable, policy, exhibits a clear break once some threshold value for the independent variable is reached.

This is the same situation that we have described in terms of bureaucracy. Office automation has been the quintessence of rational
administration - bureaucracy carried to its limit.2 We could call it the hawk version of how to manage. An alternate version, explored in chapter 7, which we called 'hypermanagement,' is incompatible with a bureaucratic logic of control. We could think of it as the dove position. Our argument was that, because of the incremental development of augmenting technologies and their gradual implementation in a widening circle of activities, the dominant, hawk position is showing its limitations and circumstances have created a catastrophe.

Bureaucratic models of administration, we argued in chapter 5, were a solution to the question of how best to organize, given the considerable changes in production and communication technologies of a century ago. Over the years, the technologies evolved smoothly, and bureaucratic organization followed suit, adapting its structures and procedures in response to environmental change. But the progressive change of technology has now reached a point where such a policy of local adaption to a stably evolving equilibrium is no longer possible.

The catastrophe that we are postulating is a transformation that will take some time to unfold. It may be clearly visible as a sharp transition only in retrospect, as a kind of telescopic effect, at a time when something that took a half-century to happen can be seen as a single event, historically speaking. This will not make it any less a catastrophe, nor is it unique in this respect. We know, for example, that fairly minor shifts in mean annual rainfall can subsequently have catastrophic effects in certain climates. In the 1930s, North America lived through a dustbowl; more recently, the sub-Saharan regions of Africa have been passing through a similar catastrophe. Small changes of ocean temperatures can have catastrophic effects on species of fish and wipe out whole populations. Quite modest climatic changes, or imperceptible (to us) increases in atmospheric acidity, can alter the distribution of kinds of trees over vast areas.

So we are talking about the kind of catastrophe whose effects are visible (at least initially) only by considering cumulative effects on a widely distributed population over time, the members of which were either well adapted to the previous environment and are distinctly less so to the new, or vice versa. Such catastrophes are most visible at a distance or over the long haul, since the individuals most immediately affected may be unable to perceive the overall pattern. The environmental shift may seem only one more of many disturbances to which the individual is obliged to react.
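The hawk/dove example admits a minimal numerical form. What follows is our own sketch, in the spirit of Isnard and Zeeman but not taken from them (the parameter values and the hawk/dove labels are invented): policy x settles at a local minimum of the cusp potential V(x) = x^4/4 + a*x^2/2 + b*x, and as the opinion variable b drifts smoothly past a threshold, the occupied minimum vanishes and x jumps.

    # Cusp catastrophe, numerically: equilibria satisfy V'(x) = x^3 + a*x + b = 0,
    # and minima have V''(x) = 3*x^2 + a > 0. Sweep b smoothly; watch x jump.
    import numpy as np

    a = -1.0   # 'splitting factor': opinion is polarized, so two minima exist
    x = 1.0    # policy starts in the 'hawk' minimum

    for b in np.linspace(-0.3, 0.5, 9):             # smooth drift of opinion
        minima = [r.real for r in np.roots([1.0, 0.0, a, b])
                  if abs(r.imag) < 1e-9 and 3.0 * r.real**2 + a > 0]
        x = min(minima, key=lambda m: abs(m - x))   # stay in the nearest minimum
        print(f"b = {b:+.2f}   policy x = {x:+.3f}")
    # Between b = +0.30 and b = +0.40 the occupied minimum disappears and
    # x flips abruptly to the other branch: a catastrophe in Thom's sense.

The point of the sketch is exactly the one made in the text: the control variable changes smoothly throughout, yet the behaviour variable exhibits one discontinuous break at a threshold.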
We are claiming that large-scale bureaucratic organization now confronts such a catastrophe.

Computing as a paradigm shift

So far we have been talking about the 'real' part of the communication equation - that which concerns patterns of transaction. If our organizations are facing a catastrophe, then it seems obvious that our ideas should also begin to undergo some quite fundamental shift, if we are not to risk being prisoners of an out-of-date ideology. So the question at which we now look is how ideas undergo radical reshaping. The most common theory is that of paradigm shift.

The notion of change as a 'paradigm shift,' as we have already mentioned, was developed by Thomas Kuhn (1970). The prevailing image of scientific research had been of slow and progressive accumulation of experimental results, producing step by step a continuously growing body of knowledge about natural phenomena. Kuhn's reading of the history of science led him not so much to question this tortoise-like theory of normal science as to add a twist. He perceived occasional sharp disjunctions that could hardly be explained as part of a continuous series. These clear breaks, or transitions, corresponded to conceptual breakthroughs, such as non-Euclidean geometries, the general theory of relativity, quantum mechanics, and the genetics of DNA.

Kuhn never defines his concept of paradigm exactly (nor, perhaps, could anyone). He refers to it by using expressions such as 'intertwined theoretical and methodological belief that permits selection, evaluation, and criticism,' and 'accepted model or pattern' (1970: 17). Scientists deal in general with evidence that is ambiguous until a frame of understanding is supplied that allows them to make sense of the data by seeing them as an index of a theoretical entity. The frame of understanding, or paradigmatic set, is not to be confused with an exact theory, complete with hypotheses. It is closer to an image, or a gestalt, which allows us to conceive of a pattern behind the observable phenomena and thus to give meaning to observations. It is a frame. It may have its origin in a metaphor, such as Bohr's conceptualization of the structure of an atom as a kind of planetary system, complete with a sun (composed of protons and neutrons) and orbiting planets (the electrons). Even though physics has long since abandoned this image, it was the image's existence that made its eventual rejection possible.
Without such strong image structures, it is doubtful, in Kuhn's view, that science could exist at all, even though the images per se can be neither proved nor even exactly defined. We need not restrict use of the term 'paradigm' to a context of scientific discovery and intervention. Kuhn himself speculated whether what he described as part of the history of science was not, in fact, a general property of perception. What we see is invariably guided by what we expect to see, and that depends to a great extent on the images that we hold of reality.

Central to Kuhn's theory of scientific revolutions is the notion of 'anomalies.' Anomalies occur when discoveries of an empirical kind cannot be fitted with the existing paradigm. When this happens, the first temptation is to reject the data, not the theory. The idea that one's preferred image of reality is no longer confirmed by the facts is likely to be a source more of irritation than of joy. As Kuhn points out, normal science is mostly 'problem-driven,' 'puzzle-solving.' Scientists do not so much aim for revolutionary discoveries as they are forced to them. Indeed, a paradigm can 'even insulate the community from those socially important problems that are not reducible to the puzzle form, because they cannot be stated in terms of the conceptual and instrumental tools the paradigm supplies' (Kuhn 1970: 37). Given the choice, scientists will delay changing their paradigm. Original scientific discovery is the result of the dawning awareness of anomalies that cannot be 'explained away' under the old scheme of things.3

Kuhn's idea captures very well indeed the process of the discovery of computing machinery described in chapter 3. The mathematics discussed in papers by Church, Gödel, Post, and others in the early 1930s, and the ideas about computing put forward in papers by Shannon, Turing, and von Neumann from 1936 to 1946, exemplify exactly the kind of volte-face - this time with respect to the machine - that Kuhn was talking about. The computer was not just a new machine; it was a paradigm out of which the whole field of artificial intelligence was to emerge and which has changed irrevocably our understanding of the processes of thought.

In a broader sense, Weber's 'ideal type' of bureaucracy was an attempt at getting people to accept a paradigm of administration and 'rational' organization. The machine model was not just some arbitrary choice by Weber and a couple of other speculative theorists of the time. What made it paradigmatic was the tremendous power of
the machine image, at the height of the industrial age, to mould conceptions. There is a kind of Zeitgeist that makes some images more acceptable than others, an effect well documented by Kuhn for science and eloquently educed by Foucault for more general social phenomena. The machine image had this and other characteristics of a good paradigm. It provided a plausible basis for beginning structural analysis of organizations - by functional units, by principles of coordination, by instruments of feedback and control, by design of the 'communication system,' and so on. It supplied, in other words, not just a theory of organization, but a methodology, which permitted practitioners to get on with what might be called, to follow Kuhn's example, 'normal administration.' This task, as anyone who has studied management up close knows, is concerned mostly with 'puzzle-solving.'

The machine image now has, like paradigms in science, a historically based hold on practitioners. It has a constituency, and again, using Kuhn's rationale, it is resistant to change and reluctant to deal with anomalies. Nevertheless, as we showed in chapters 3 and 6, a growing community of scholars is demonstrating the existence of numerous anomalies, things that the machine model, in either its pristine or its modified, computer-based form, cannot explain.

A sociological fact noted by Kuhn is that the emergence of a new paradigm in a field creates a kind of generation gap, which continues until adherents of the previous paradigm die off, so great is the difference of perspective marking off those who have mastered the world-view implied by the new paradigm from their predecessors. It appears to us that this is where we who study organizations now stand in certain important respects. In computing science, the generation gap is behind us, having been absorbed by waves of specialists totally indoctrinated in the principles that seemed so revolutionary when they were first introduced. What was once a paradigm shift is now orthodoxy; computer science has been 'normalized,' in Kuhn's terms. But the shock wave has only just hit organizational science. The attempt to implement office automation was one of the factors that brought it to a head. There is now a new conceptual gap revealed, this time in the dominant image of organization. The machine metaphor was once the paradigm of organization. It will shortly be no more.

We believe that we are at the edge, historically, of what we would term a fault line. The effect of communication technologies is, we
claim, catastrophic. The transformations that they are provoking, in the ways in which we both work in organizations and think about them, are discontinuous rather than continuous. In other words, the tension between the ways in which our practices are changing and the ways in which we symbolize them is close to becoming unbearable. Although we cannot predict its final outline, a paradigm shift is occurring. It is not something that we can force; we can only expand our horizons to help it through its birth pangs.

One principle can, however, be enunciated. What is happening is communicational. Organizations are communication systems. If we are to understand them, we shall have to look first and foremost at their communicational properties - not, we may add, something that organizational science has ever done very well. Communication theory is itself undergoing radical change. The models that we build of organizational structure and process should increasingly be informed by that theory.

Understanding the effects of change

We began this chapter by raising the issue of how to comprehend phenomena of change. The communication theory that we espouse envisions the organizational dynamic as an interplay between two modalities of communication, conversation and text. By conceptualizing computerization as text-making, and images of organization as the underlying frame that structures the scripts of management, whether computerized or not, we laid the groundwork for a kind of explanation of how change occurs. To conclude this chapter, we examine two kinds of scenario that are consistent with this way of seeing an organization in the process of transformation.

The first of these is inspired by the work of Karl Weick, in particular his book The Social Psychology of Organizing (1979). We quoted him briefly at the end of the previous chapter; now let us look further into his ideas. Weick chose the word 'organizing' (rather than 'organization') because of his perception that people are forever in the business of getting organized; they never actually achieve the goal as such. As he imagines an organizational process unfolding, there are indeed environmental events taking place ('ecological change,' 130) to which we react. They take on meaning to us only when we interpret them as relating to our preoccupations and making sense in terms of
established categories of meaning ('enactment'). As he sees it (and as we saw in chapter 3), the environment is far too equivocal in the messages that it supplies us with to be ever immediately knowable. We are always having to 'read' it, and in our reading our subjective systems of knowledge interpose themselves between raw experience ('variation') and eventual understanding ('selection' and 'retention'). Weick's approach differs from stimulus-response (S-R) psychology by assuming that the 'S' part of the equation is not a clear given, but must literally be enacted in order to make it relevant to the 'R' part. Solutions need the right kind of problems.

In Weick's scheme of things, the understandings of environmental events become the skeleton that supports, and shapes, the organizational systems. To the extent that the environment smiles on patterns of organizational behaviour that seem to work, they are reinforced and become turned into routines, accompanied by strict codes of behaviour. These are 'assembly rules' - recipes (1979: 113) - and they dictate how the cycles of behaviour making up organizational work processes are supposed to unfold. Cycles are composed of what we call the 'transactions' of communication and Weick refers to as 'double interacts' (89, 114).

There is an inherent reflexivity in this model of organizing. The processes (sets of 'double interacts,' or transactions) are the means by which the equivocal raw data of experience are turned into information, a transformation that 'is variously described as separating figures from a ground or labelling streams of experience' (134). As the environment is interpreted through action, it informs, and gives meaning to, the action. The organization is caught in a loop: its actions structure its understandings, and its understandings structure its actions. If the understandings are wrong, the actions will be too, and vice versa. When that happens, the problem is to 'get out of the loop.'

When we translate Weick's terminology into the one developed in this book, it is clear that his 'assembly rules' are precisely what we mean by 'text,' and his 'cycles' are what we call 'conversation.' Weick's theory of organizing claims that as the messages coming from the environment become more equivocal, the rules become less useful. When that happens, the cycles begin to stretch out and become less governable. The 'loop' has started to dissolve. This is the only way in which the organization can adapt; nothing could be worse than sticking stubbornly to an outmoded set of rules that no longer provides an effective guide to a transformed environment. To use the language of chapter 6, type-1 organizations do not work in type-2 worlds.
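Weick states these relations only qualitatively; as a reading aid, here is a toy rendering of our own (all numbers, names, and thresholds invented): equivocal input selects few assembly rules, and fewer rules mean more cycles of double interacts before the input is made sensible.

    # Toy version of Weick's organizing loop: rules are selected by the
    # equivocality of input; where no rule applies, open conversation
    # (more 'double interacts') must absorb the equivocality instead.
    def organize(equivocality, rulebook, max_cycles=100):
        cycles = 0
        while equivocality > 0.1 and cycles < max_cycles:
            applicable = [r for r in rulebook if r["tolerance"] >= equivocality]
            if applicable:
                equivocality *= 0.5   # routine: rules assemble a tight process
            else:
                equivocality *= 0.9   # novel: slow sense-making, many cycles
            cycles += 1
        return cycles

    rules = [{"name": "standard-order", "tolerance": 0.3}]
    print(organize(0.2, rules))   # routine input: organized in few cycles
    print(organize(0.9, rules))   # equivocal input: the cycles stretch out

Run it and the asymmetry appears at once: the routine input is absorbed in a single rule-governed cycle, while the equivocal input takes more than a dozen cycles of relatively unconstrained interaction - the 'stretching out' of conversation that the theory predicts.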
How do we apply Weick's theory? First, it is obvious (and it was the subject matter of chapter 5) that, under the impulsion of the new communication technologies, the environment that we now confront offers extraordinary levels of equivocality. The status quo, to cite Clemens, has vanished. 'Ecological change' (Weick's term for the environment) has now attained unprecedented levels of variability (unpredictability caused by economic and financial uncertainty, political upheavals, and social turbulence). The inevitable result is the emergence of a kind of organization characterized by 'a sprawling, equivocal process [that] contains many independent elements that have few internal constraints' (Weick 1979: 190). This is not a bureaucracy, and management is no longer just 'rational,' rule-based administration. It is hypermanagement. There is no alternative in an environment typified by very high levels of uncertainty. Only through relatively unrestrained conversation can the variety be restored. Automation works in the opposite way, to make the rules ever more determinant. It works where the environment is enactable; it is a formula for disaster where enactment is problematical. The technologies of augmentation, in contrast, offer a possibility for learning and adaption, even while they make centralized control difficult. The problem that faces us, as a society, is that we have never developed a managerial philosophy and technology appropriate to dealing with people who learn and who are creative. We know how to run hierarchies, but much less how to manage horizontal systems. Until we understand that, our economy will falter, and social problems will multiply.

The other scenario of organizational change, in the context of a theory of communication, is that of Harold Innis (1951). In Innis's interpretation of history, the discovery of a new medium affects the character of knowledge. The system of writing that is stimulated by the new medium is a technology that must be mastered, the accomplishing of which creates a new elite, which in turn translates into a monopoly, or oligopoly, of knowledge: 'Inventions in communication compel realignments in the monopoly or the oligopoly of knowledge' (4). In every society, there is an equilibrium linking the oral tradition (or what Innis calls the 'vernacular') and the specialized languages of the elites, which are in turn the necessary support of the governing class. As the complexity of the new language multiplies, the distance separating the specialist from the common sort of hu-
manity becomes greater and greater. When this happens - when knowledge has divided the polity into intellectual 'haves' and 'have-nots' - the conditions for revolution are created. The oral tradition (the 'conversation,' in our terminology) 'implies freshness and elasticity,' while concentration on learning, says Innis, is followed by rigidities. In the end, he claims, it is the peripheral regions, the people outside the magic circle, who complete the revolution, because they have retained more variety.

Our reading of the history of the past half-century or so (since 1936, and Turing's discovery, in effect) is that the new medium for the production and distribution of computerized texts has produced not one, but two, elites (cf. Bardini, Horvath, and Lyman, n.d.). These two elites have quite different ideas about the nature and concentration of knowledge and, hence, about the eventual structure of organization. Whichever vision prevails, however, it is also evident that, either way, we are exacerbating the distinction between educated and uneducated classes in our society. Unless we are prepared to begin to deal with the exclusionary effects of the new technologies, by widening the 'empowerment potentials' of these technologies to include all citizens, we may well indeed be creating the conditions for a revolution. The stability of North American society used to be linked to its literacy. The new literacy is upon us, and we seem not yet to have realized that it is an issue of more than productivity.

The perspectives of Weick and Innis are in certain ways complementary; one is focused on the short term, the other has a historical bent. In the near future, globalization will be driving ecological change and impelling organizations away from routine into perpetual innovation, as they try to develop the variety that they need to survive. It is atomization that is altering the balance between text and conversation from the inside, making rules easy to apply for the progressive sector, and difficult for the stagnant. The result is a very significant transformation of organizational structures and processes, which will accelerate before the end of the century. In the longer run, the prospects are for even more fundamental change, as the transformation of the basis of knowledge starts to turn the world upside down.

The fall of the fortress?

Medieval Europe was dotted with impregnable fortresses, governed by princes, bishops, dukes, earls, counts, and barons. Gunpowder,
the printing press, new techniques of manufacture, and improved transportation brought the end of feudalism and heralded the arrival of the modern nation-state. Like the strongholds of late medieval Europe, today's castles of bureaucracy are under attack simultaneously from the outside and from the inside, from globalization and from atomization. Today's 'gunpowder' may come packaged in electronic code, but its impact is no less devastating, even though today's 'fortresses' are constructed out of paper. Like the citizens of an earlier epoch, we are witnesses to the gradual dismantling of walled worlds of governance which have simultaneously protected us from the 'marauders' and imprisoned us in a hierarchical system of power and privilege. History informs us that the people inside those medieval domains, surrounded by their unassailable ramparts, had already begun to change into a community of people very different from the model of high chivalry - a new nation in embryo, preoccupied by industry, by manufacture, by science, and by trading. It is happening again.

The collapse of those medieval fortresses not only signalled the end of feudalism and the arrival of the nation-state. It also meant eventual attempts to suppress, especially in countries such as France and Britain, autonomous regional cultures, such as those in Provence and Aquitaine and Brittany, in Wales and Scotland. That was the bad news. The good news was that it also brought about the downfall of the petty tyrannies that those fortresses stood for and an enlargement of opportunity for at least some of the folk who peopled them. Unfortunately, even the good news had a 'down' side: the conditions had been created for democracy, as we know it, but also for even greater tyrannies than before.

The advent of the modern form of democracy, like the growth of the complex industrialized economies with which it is associated, was signalled by the spread of the ability to make and distribute written text, following on the invention of the printing press by Johannes Gutenberg in the fifteenth century (Eisenstein 1979). If we believe Harold Innis (1950; 1951), the wealth of Canadian forests, and the stimulus provided by their availability - in combination with the steam press - supplied the energy for parliamentary democracy in the nineteenth century. The capacity to make and disseminate text has again undergone an expansion, of equal importance, proportionately, to Gutenberg's innovation. Its effects will also eventually be
felt, although the outcome is probably as little predictable now as it was then. The potential for both greater democracy and greater tyranny exists.

Centuries after Gutenberg and the end of feudalism, the emergence of industrial giants in North America and the urbanization accompanying it were a wrenching break with an essentially rural, small-town heritage. Once again, the new order meant the destruction of a culture and a way of life. Again also, it meant new opportunities - and new risks. It seems that we today are once again facing change of a comparable order, bringing to an end an era overshadowed by a set of predominantly bureaucratic institutions. Those bureaucracies were also, we are likely to remember, the bulwark of modern democracies. When they go, they leave a void. What lies beyond, in a globalized and atomized society, is not easy to foresee. But we think that it is time that we began to try.
Notes
Preface
1 Empirical literature, based on field studies of management behaviour, confirms that managers spend most of their time in conversation; cf. Weinshall (1979) for several reports.
2 See Gareth Morgan (1986).
3 For a more extended discussion, see Carey (1981).
4 This has been referred to by Silverstone, Hirsch, and Morley (1990: 1) as the 'double articulation' of communication technology.
5 With honourable exceptions, such as Michel Foucault.
6 Cf. Weber (1958) and Jacoby (1973).

Chapter 1
1 As an example, financial centres are electronically linked by computer, screen, and fax. The Chicago Board of Trade and the London International Futures Exchange trade each other's financial contracts, the Chicago Board of Trade is open late into the evening to mesh with Asian trading hours, the London exchange has a similar mutual trading operation with the Sydney Futures Exchange, and so on. Brokerage firms now plan to follow the markets around the world. All of this is a direct consequence of the new communications environment. So international has the investment business become, creating disembodied marketplaces such as the foreign exchange market and NASDAQ, that concern has been expressed as to whether any national government is any longer in a position to control it.
2 Perrow (1984), when discussing high-risk systems, uses the term
'interactive complexity' to describe the way in which some of the special characteristics of these systems, beyond their toxic, explosive, or genetic dangers, make accidents in them inevitable, even normal. This has to do with the ways in which failures can interact and the way in which the system is tied together.
3 For example, a few years ago, the Globe and Mail (23 August 1988: B-8) reported: 'Last week's announcement that new federal tax rules governing retirement savings will be delayed for another year has been greeted with sighs of relief from employers. The income tax changes, which are designed to provide all Canadians with equal opportunity to shelter their pension contributions, were supposed to be introduced for the 1989 tax year. Last Friday, however, Finance Minister Michael Wilson announced that the tax changes, including plans to increase limits for allowable registered retirement savings plan contributions, would be delayed until 1990. In a statement, Mr Wilson said that his department needed additional time to revise those parts of the 330-page draft legislation that had met with criticism. Both the Canadian Institute of Chartered Accountants and the Canadian Bar Association have labelled the draft bill the most complicated legislation they have ever seen.'
4 To the point where computer programming courses now include training in how to evaluate the adequacy of a computer program. This is becoming a new science, as programming leaves the 'flying-by-the-seat-of-your-pants' era to entertain new levels of complexity. Unfortunately, there is a well-known principle, associated with the name of Kurt Gödel, that the internal consistency of such elaborate constructions can never be determined with absolute certainty.
5 Note that this trend to intelligent interfacing is no longer limited to human consulting enterprises. Operating systems such as OS/2, developed by Microsoft, for example, include what is called a 'virtual device interface' (VDI), which means that software developers may in future ignore the details of the hardware on which their products will operate. The interface 'understands' the working system of the machine; what is presented to the software maker masks variations in machine design among manufacturers. The layer between the basic machine and the eventual user gets thicker; access becomes less onerous.
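The layering described in note 5 is, in later terms, a hardware-abstraction interface. A generic sketch of the idea, ours only (it does not reproduce OS/2's actual VDI, and all names are invented):

    # The application writes to one stable device contract; per-vendor
    # drivers absorb the differences in machine design underneath it.
    from abc import ABC, abstractmethod

    class DeviceInterface(ABC):
        @abstractmethod
        def draw_line(self, x1, y1, x2, y2): ...

    class VendorADriver(DeviceInterface):
        def draw_line(self, x1, y1, x2, y2):
            print(f"vendor-A raster op: ({x1},{y1}) -> ({x2},{y2})")

    class VendorBDriver(DeviceInterface):
        def draw_line(self, x1, y1, x2, y2):
            print(f"vendor-B display list: ({x1},{y1}) -> ({x2},{y2})")

    def application(device: DeviceInterface):
        device.draw_line(0, 0, 10, 10)   # never sees which machine is underneath

    application(VendorADriver())
    application(VendorBDriver())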
6 American Airlines' travel reservation system Sabre is the best known. Some fifteen years after its introduction, it had risen to an estimated market value of $1.5 billion, which was calculated as more than half of the total capitalization of American Airlines' parent company, AMR, estimated at $2.9 billion. In other words, the reservation system had outstripped in value the core airline business. The marketing of information about travel, for this company, has turned out to be bigger business than transport itself!
7 Leacy (1983), series 08-85; Statistics Canada, Labour Force Annual Averages, 1975-1983 (Cat. 71-529) and The Labour Force (Cat. 71-001, monthly).
8 Statistics Canada, Employment, Earnings and Hours (Cat. 72-002, monthly).
9 Statistics Canada (1986).
10 Statistics Canada, Cat. 71-001; see also Drucker (1986: 775-6) re the 'second major change in the world economy which is the uncoupling of manufacturing production from manufacturing employment ... in the last 12 years total employment in the United States grew faster than at any time in the peacetime history of any country ... that is, by a full one-third. The entire growth, however, was in non-manufacturing, and especially in non-blue-collar jobs.' See also 'The New Economy: The Rising Value of Brain Power' (Globe and Mail, 1 Sept. 1992), which states that highly knowledge-intensive industries such as health, education, and business services in Canada have created 303,734 net new jobs over the past seven years - or 89.5 per cent of all new employment. They account for 26 per cent of the total in Canada, up from 24 per cent in 1984. 'The Global Economy: Who Gets Hurt' (Business Week, 10 Aug. 1992) notes that AT&T in the United States has cut 21,000 low-skilled jobs since 1984, leaving 32,000, but expanded white-collar employment by 8,000, to 18,000: 'We're losing low-skilled jobs overseas or to technology ... we need more engineers, designers and high skilled technicians, but fewer people who make circuit boards.'
11 A study conducted by Birch for the Canadian Department of Regional Industrial Expansion showed that 55 per cent of net employment gain between 1974 and 1982 came from companies with less than 20 employees. Not all of these jobs are the kind we have been describing: in 1988, for example, while all job creation was in the white-collar occupations (blue-collar jobs again declined), most was concentrated in the low-paying clerical and sales sectors. How successful Canadians are in penetrating the new markets for thoughtware determines the percentage of high-scale, white-collar jobs created (Globe and Mail, July 1988).
12 Statistics Canada (1988), Enterprising Canadians: The Self-Employed in Canada, Gary L. Cohen (Cat. 71-536).
13 See, as an example of navigational skills, Lawrence Wright, 'The Man
from Texarkana,' New York Times Magazine (28 June 1992), which describes the circumstances of Ross Perot's founding of his computer services company, EDS: 'This was very much the early dawn of service industries ... It was an idea for a computer-services company that would provide everything from the specialized software a customer might need for his business to the actual employees to run the machinery.'
14 Woods Gordon (1985).
15 Canadian Federation of Independent Business (1983).
16 Japan Statistical Yearbook (1989), Table 4.3, pp. 121-2.
17 Leacy (1983), series R 795-811.
18 Statistics Canada, Manufacturing Industries of Canada: National and Provincial Areas (Cat. 72-004, annual).
19 In 1992, for example, ICI, which on some measures is Britain's biggest firm and the world's fourth-largest chemicals company, announced that it was going to 'dismember itself.' 'ICI grew up like a conglomerate, when conglomerates were in fashion. It bought companies that often had little in common beyond the paternal touch of its Millbank headquarters ... It was only when shareholders started asking the really awkward questions - about performance - that the ... strategy was shown to be pitifully thin' (Economist, 27 June 1992: 18).
20 Institute for the Future (1988).
21 Kolko (1988) argued that a restructuring of the world economy is occurring as companies, driven by falling profits in intense international competition, compensate by speculative activities such as foreign exchange trading and takeover raiding rather than investment in new plant and equipment. The concentration of wealth in the hands of a few conglomerates thus indexes the failure of large organizations to attain acceptable levels of productivity and to remain competitive in the terms of yesterday's economy.
22 W. Krause and J. Lothian, 'Measurement of Canada's Level of Corporate Concentration,' Canadian Economic Observer, Statistics Canada 11-010, 2, no. 1 (3.14-3.31) (Jan. 1989).
23 Roach (1986: 9).
24 Communications Canada (DOC) (1988).
25 Although a trend to urban growth has been visible since the nineteenth century in Canada, in the 1920s Canada's urban and rural populations were still almost equal. By 1986, however, the 19 million people living in urban areas accounted for 77 per cent of the total Canadian population (Brian Biggs and Ray Bollman, 'Urbanization in Canada,' Canadian Social Trends, Statistics Canada 11-008E, no. 21, Summer 1991: 23-7).
Forty years have seen the expansion of government: for example, in 1940, there were 49,656 federal employees; in 1960, 195,630; and by 1980, 338,114. See Leacy (1983), series Y211-59; Statistics Canada, Federal Government Employment (Cat. 72-004, quarterly).
26 Financial centres, markets, and financial firms move around more easily than people realize (Economist, 27 June 1992: 21-6): 'Almost all these centres are insecure about their future. Two of the business's favourite buzz words, globalization and securitisation, have made financial services more mobile than ever before.'
27 Woods Gordon (1985).

Chapter 2
1 Cf. Giuliano (1982: 148): 'The office is the primary locus of information work, which is coming to dominate the US economy. A shift from paperwork to electronics can improve productivity, service to customers and job satisfaction.'
2 An example is the Diebold Information Technology Scan, 1983, produced by the Diebold Group, which focuses on five-year projections for annual rates of change in unit costs resulting from emerging technologies and the changing economics for existing technologies, providing 'many opportunities for MIS departments to extend the scope and increase the value of their information services' (Abstract, 2). Strong Central Management of Office Automation Will Boost Productivity (Comptroller General of the United States, 1982) suggests that to reap the benefits of office automation, 'which has the potential to improve the productivity of Federal managers, professionals and clerical workers, without wasting resources, agencies should establish strong, central management of office automation' (p. 1).
3 The system also provided for other functions, including consultation of outside databanks and compiling of operational statistics.
4 Because of the proprietary character of some of the information (this is a commercial product), we have gone to some pains to disguise the identity of the organization and the precise characteristics of the system. No technical information even approximating the actual system will be found here. While this inevitably sacrifices some verisimilitude, we are convinced that the essential facts of the case, as they relate to the thesis of this book, stand out clearly enough.
5 Both names are fictitious.
6 This dependence of meaning on context is known in ethnomethodology as 'indexicality'; cf. Handel (1982) and Leiter (1980).
Chapter 3
1 As mentioned in chapter 2, we have discussed these findings in greater detail elsewhere (J.R. Taylor and Katambwe 1988; J.R. Taylor 1993).
2 Cf. Markus (1984). Although not going so far as to term the technology a misfit, Markus reminds us that 'the impacts of systems are not caused by the system technology or by the people and organizations that use them but by the interaction between specific system design features and the related features of the system's organizational setting' (ix).
3 Cf. Lakoff and Johnson (1980: 5).
4 Bolter (1984) has, like Morgan, also pointed out the influence of metaphors in determining how we conceptualize the world around us. The introduction of the clock into medieval Europe provided a metaphor that nourished the reflections of the scientific and philosophical community for centuries. McLuhan (1962) argued that the printing press was a metaphor of equal power in determining how we imagine our world.
5 Among others, von Neumann, Gödel, Church, Post, and Kleene.
6 We use the term 'information age' here to denote a stage in the development of computer technology, not to enter into debates about 'revolution,' post-industrialism, and critiques of capitalism.
7 During the Second World War, Turing was part of a group that did construct a machine incorporating much of this basic design, as part of a top-secret project to break the German 'Enigma' cipher.
8 Computers in this sense are reflexive: they can take their own operating code as data and can thus refer to themselves, as Kurt Gödel had earlier foreseen (cf. Hofstadter 1979).
9 Hofstadter (1979), in his witty treatment of artificial intelligence, includes a fable that has an anteater conversing intelligently with an anthill, the point being that while the individual ants are without meaning (other than as fodder), the pattern of their social comings and goings makes the anthill an organization whose meaning can be understood. This is one of Hofstadter's many elegant illustrations of the importance of taking logical level into consideration in the understanding of human communication. This distinction is employed to great effect by a group that grew up around Gregory Bateson (the 'Palo Alto school'), which has shown the importance for organizational researchers of keeping the pattern of relationships in mind as much as, or more than, their manifest content when the meaning of an observed body of messages is in question.
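Note 8's reflexivity - a program that takes its own code as data and so refers to itself - can be seen in miniature in the standard 'quine' pattern (our illustration, not the authors'; the two program lines print themselves):

    s = 's = %r\nprint(s %% s)'
    print(s % s)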
10 Although Beniger (1986) and Schement and Mokros (1990), among others, would argue against linking the beginnings of the information age with computer technology, for our purposes of dealing with communication and organization it is the watershed date.
11 Cf. Smith and Weiss (1988: 816-19): 'In most conventional paper documents - such as journal articles, specifications, or novels - physical structure and logical structure are closely related. Physically, the document is a long linear sequence of words that has been divided into lines and pages for convenience. Logically, the document is also linear: Sentences combine to form paragraphs, paragraphs to form sections, etc. ... If the document has a hierarchical logical structure, as do many expository documents such as journal articles, that hierarchy is presented linearly: The abstract or overview of the whole comes first, followed by an introduction, the first section, the second section, etc., until the conclusion ... Such documents strongly encourage readers to read them linearly, from beginning to end following the same sequence.'
12 Some of the flavour of this image of organization is captured in extracts from an early textbook on Canadian public administration. Dawson (1963), describing a government department, imagined an 'apex' at which stands the minister. Under him a deputy minister supervises the work of a group of officials, each of whom in turn supervises the work of other groups of officials, and so on, until 'the most humble member of the department is affected.' In this way, 'instructions and orders flow from the supreme head down through subordinates until they culminate in action at the appropriate level.' While a more sophisticated theorist such as Max Weber in Germany realized that what was 'flowing' through the levels of the hierarchy was not so much 'instructions and orders' as authority, legitimated by government policy, he too insisted on the structuredness of the hierarchical relations making up the body of an administration. Computer programming provides a powerful metaphor for this kind of naive theory of hierarchical relations.
13 The classic text on the subject is Weber (1949). Mackenzie (1978) has also written extensively on the subject of hierarchy and its measurement in contemporary organizations, which he terms 'buroids,' to contrast them with bureaucracy. He finds few organizations that in fact have anything like the degree of hierarchy posited in Weber's 'ideal-type.'
14 Anatol Holt has written extensively about the manner in which current discussions of information depend on leaving the programmer-computer relationship out of the account.
15 Every computer program includes choice points where 'decisions' must
be made. These internal decision points are not to be confused with the design decisions made by the computer programmer in constructing the program. Decisions in the program have zero degrees of freedom: they involve the comparison of some actual state of affairs with a pre-established criterion. The only 'decision' involved is in recognizing the states of the environment, the criterion having been set externally. The computer programmer is of course in a quite different situation, in that he or she is confronted with a problem of design that implies the use of imagination. Computer programming shares with literature more than the learning of a language; both involve an element of artistry. In organizations, many people are constantly taking decisions, but classical theory sees most of them as similar in kind to the internal choice points of a computer program, with the 'fun' decisions, which involve imagination, being reserved to senior executives. In this sense, classical theory sees an organization as an emanation of a single persona. Since this is a flattering image for the chief executive, the popularity of this view in senior business circles is not surprising. See also Katz and Kahn (1966) for a three-level image of the division of responsibility in an organization.
16 The meaning of 'control' in this citation is different from the idea of control used in cybernetics.
17 Kling (1980) distinguishes between 'systems rationalists' and 'segmented institutionalists.' 'Systems rationalists' fit the office-automation mould, and 'segmented institutionalists' do not (in that they take account of non-instrumental, non-task-related behaviour and of ugly facts such as the exercise of power). For a more recent discussion, see Hirschheim (1986). In his view, taxonomies based on office activities or tasks have not 'provided a great deal of insight into the inner workings of offices. The office categorizations, although interesting, have been too simplistic to be of much value.'
18 Current theory in physics raises quite similar questions as to the 'objective' nature of the 'real world' of material phenomena. Even the perception of physical phenomena cannot be abstracted from the point of view of the observer. The Wheeler example was, as we noted, borrowed from the physical sciences. We have now reached the strange position, historically, where those who work with essentially symbolic realities, which one would think shade easily into the realm of the imaginary, now give more credence to the 'hardness' of the facts of external reality than do those who deal with the material world.
19 Cf. Feldman and March (1981).
20 See also Keen (1981: 24): 'In general, decision processes are remarkably simple; what has worked in the past is most likely to be repeated. Under pressure, decision makers discard information, avoid bringing in expertise and exploring new alternatives; they simplify a problem to the point where it becomes manageable.'
21 Cf. March and Simon (1958).
22 Cf. Lawrence and Lorsch (1967), a contemporary restatement of a point made by Barnard (1938), a classic of the organizational literature.
23 Alternative terms might be 'cognitive rationality' versus 'pragmatic rationality.'
24 Japanese management style, if we are to credit some of the published reports, places less emphasis on maintenance of a superficial appearance of rationality and seems more open to assumptions of collective decision-making processes, intuition, and a level of ambiguity in communication that would seem excessive to us. Cf. Pascale and Athos (1981).
25 In Cohen, March, and Olsen's (1972: 37) words: 'It is clear that the garbage can process does not do a particularly good job of resolving problems. But it does enable choices to be made and problems sometimes to be resolved even when the organization is plagued with goal ambiguity and conflict, with poorly understood problems that wander in and out of the system, with a variable environment, and with decision makers who may have other things on their minds. This is no mean achievement.'
26 Cf. Van Maanen and Barley (1985: 31-2): 'The notion that organizations have cultures is an attractive heuristic proposition ... The phrase "organizational culture" suggests that organizations bear unitary and unique cultures. Such a stance, however, is difficult to justify empirically. Moreover, culture's utility as a heuristic concept may be lost when the organizational level of analysis is employed. Work organizations are indeed marked by social practices that can be said to be "cultural," but these practices may not span the organization as a whole. In this sense, culture is itself organized within work settings.'
27 See once again, for example, Handel (1982) and Leiter (1980).
28 These limits have been called the 'valency' of a relationship (Friedman 1972).
29 We have discussed this link in J.R. Taylor (1986b).
30 Not coincidentally, one of the features that system designers had to build into their products, at the behest of corporate customers, was a set
of artificial restrictions on access to some parts of the system (through the use of passwords, for example). Left to themselves, as for many electronic mail networks, users encounter communicational 'pollution,' in the form of quantities of junk mail, the by-product of an open-channel philosophy. Some of the hierarchy-destroying consequences of this openness have been discussed in earlier work (J.R. Taylor 1982; 1986b).

Chapter 4
1 A fuller presentation of these ideas can be found in J.R. Taylor (1993).
2 Goldkuhl and Lyytinen (1982) also see language as a mediating force. Unlike speech act theory, however, their language action or rule reconstruction, in which office action is conceived of in terms of activity and of finding ways to create a situation of undistorted communication, is related to Habermas's critical theory rather than our own (cited in Hirschheim 1985: 70-4).
3 The differential framing of conversation by its collaborators is called 'punctuation' by Watzlawick, Beavin, and Jackson (1967).
4 The task of qualitative research of the kind reported in chapter 2 is to discover the macro-pattern in the micro-transactions. Accumulation of reported and observed incidents can lead to conclusions of a more general order. This method is termed 'naturalistic' (Lincoln and Guba 1985) and is produced in the form of case studies.
5 The macro-actor's success depends on this preoccupation with the immediate concerns of today and willingness to adopt a 'taken-for-grantedness' attitude towards the macro-transaction structure that lies behind the micro-events of everyday life (Deetz 1991). The resulting 'bounded rationality' (March and Simon 1958) makes organizations normally quite stable, by masking what might otherwise blossom into destructive conflicts of interest.
6 Cf. also Weick and Daft (1983: 76): 'Interpretations are forms of punctuation. Interpretations consist of discrete labels that are superimposed on continuous flows of people and experience. All interpretations are arbitrary in the sense that they exclude portions of the flow. Interpretations introduce spacings within continuous flows that make induction possible. As the spacings are introduced at different positions, different inductions become more and less plausible.'
7 His concept of keying conveys somewhat the same idea as the expression 'take,' as in 'My take on this situation is not the same as yours.'
Chapter 5
1 Elementary cybernetic principles (Ashby 1954) suggest that locally semi-autonomous organization is a prerequisite to adaptation to a complex environment (complexity having been precisely the result of the economies of scale in distribution). A major advantage of the bureaucratic model, therefore (though never admitted), was the relatively skimpy communication that it permitted, even while it ostensibly emphasized good 'communications.' It was the 'skimpiness' that made possible a monopoly on information and supported managerial authority. Bureaucracy was well adapted to its circumstances, even if the reasons were not exactly the official ones (and even if its self-image was in important respects inaccurate as a description of the prevailing reality).
2 Until 1840 or so, all economic activities were in effect conducted by family firms, which employed agents for their extended network transactions. Corporations, in the modern sense, simply did not exist.
3 In most companies and government departments, there used to be, typically, both a computing and a telecommunications service department. Now the tendency is to treat them as a single service. The technological philosophy involved, however, is still rather different.
4 With increased network capacity and digital switching, the hierarchical pattern of networks of the past is being superseded. In the old system, switches were organized in cascades, from local to regional to national, with each level in succession collecting calls from its satellites and retransmitting them. In the 'geodesic' pattern of networking (Huber 1987), because of the much greater power of the digital switches and the intelligence built into them, messages track their way through the network, in no preordained pattern, depending on the channels available (a toy routing sketch follows these notes). Such networks are more easily configurable.
5 The telecommunications networks have been called 'electronic highways' by writers such as James Martin (1981): 'Building telecommunications channels is expensive, as was the building of highways. Eventually, it will have more impact on society than building highways has had ... The facilities we describe will become essential to society's infrastructure, essential to the way society is governed, essential to its response to the energy crisis, essential to productivity and hence to the wealth-generating processes. Communications media will be the cornerstone of the culture of our time' (4).
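The contrast in note 4 can be made concrete. In the sketch below, a message finds its own least-cost path through whatever channels are currently available, instead of climbing a fixed local-regional-national cascade; when a channel fails, the message simply tracks another way. The topology and costs are invented, and the search is ordinary Dijkstra shortest-path, offered only as an analogy to 'geodesic' networking, not as a model of any actual carrier's switches.

import heapq

def route(links, src, dst):
    # links: {node: [(neighbour, cost), ...]} - the channels available right now
    queue, seen = [(0, src, [src])], set()
    while queue:
        cost, node, path = heapq.heappop(queue)
        if node == dst:
            return cost, path
        if node in seen:
            continue
        seen.add(node)
        for nxt, c in links.get(node, []):
            if nxt not in seen:
                heapq.heappush(queue, (cost + c, nxt, path + [nxt]))
    return None  # no route through the channels currently up

links = {'A': [('B', 1), ('C', 4)], 'B': [('D', 2)], 'C': [('D', 1)], 'D': []}
print(route(links, 'A', 'D'))  # (3, ['A', 'B', 'D'])
links['B'] = []                # channel B-D goes down ...
print(route(links, 'A', 'D'))  # ... and the message reroutes: (5, ['A', 'C', 'D'])

Reconfiguring such a network is a matter of editing the table of available channels; no preordained hierarchy of switches has to be rebuilt.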
6 See MacLean and Associates (1992: 20), discussing 'the freeing of communications from its traditional tethers, driven by technologies which permit more efficient use of the radio frequency spectrum, improve network intelligence, and expand the capabilities of terminal devices at the same time as they miniaturize their size.'
7 Cf. Economist (18 July 1992: 65) for a recent example. The announcement by Toshiba, IBM, and Siemens that they will collaborate in developing advanced memory chips is the latest of the recent joint ventures that are reshaping the semiconductor industry, as they did the auto industry a few years ago. These companies no longer see any sense in trying to compete on the components that are common to all their products. Instead, they will differentiate their products by other means.
8 There was, no doubt, a sense in which all three dimensions were always implied in any activity; TNS have transformed their relative importance. Traditional statistics had no separate category for the informational/communicational component until recently. Today, this previously invisible part of the equation has in many instances become the primary component.
9 Cf. Porter and Millar (1985: 153): 'Broad scope can allow the company to exploit interrelationships between the value chains serving different industry segments, geographical areas, or related industries. For example, two business units may share one sales force to sell their products, or the units may coordinate the procurement of common components. Competing nationally or globally with a coordinated strategy can yield a competitive advantage over local or domestic rivals. By employing a broad vertical scope, a company can exploit the potential benefits of performing more activities internally rather than use outside suppliers.'
10 Cf. Reich (1991: 177): '[Symbolic-analytic services] include all the problem-solving, problem-identifying, and brokering activities ... Like routine production services ... symbolic-analytic services can be traded worldwide and thus must compete with foreign providers even in the American market, but they do not enter world commerce as standardized things. Traded instead are the manipulations of symbols - data, words, oral and visual representations.'
11 As we have already pointed out, consulting firms such as McKinsey and Arthur Andersen sell to companies expertise on how to plan their own communication systems!
12 Although software is creating the opportunity for new kinds of enterprise and employment, it is also displacing workers in the less skilled or automatable occupations.
13 Authors such as Garson (1988) have made the point that computer expert systems are also increasingly being used to standardize and control decisions formerly taken individually by professionals such as social workers, bank officers, and stockbrokers.
14 The term 'stagnant' has a counter-intuitive connotation, since in Baumol's reasoning it is the 'stagnant' sector that is growing (in the sense of providing employment) while the 'progressive' sector is in decline, to the extent that the term measures employment opportunities. A sector is progressive, in his sense, if it shows productivity gains, and stagnant otherwise.
15 Smith (1980: 324) suggests that 'a major break with the past is clearly at hand ... an important shift in the way we treat information, the way we collect and store it, the way we classify, censor and circulate it. People will regard the process known as education in a quite different light in a society in which human memory will be needed for different purposes than in the past; we shall think of librarians, journalists, editors, and publishers as different creatures from those of today.'
Chapter 6
1 In Canadian copyright law, for example, software is considered a form of creative writing.
2 It is in this sense a bit like music: although it can be transcribed, it is really meant to be played. The difference is that music is not a representation and cannot be used as an instrument of control.
3 Present-day databases may also include text, images, and other material, up to and including program code. In this latter case, the database becomes not just a representation but also an explanation of the object represented, where such an object is a dynamic system, or machine ('explanations' being one kind of representation).
4 Cf. Beniger (1986) for a discussion in depth of the role of information in expanding systems of control, as well as Poster (1990) on databases.
5 Some people might want to argue that the invention of double-entry bookkeeping ranks in importance with the introduction of the printing press. Both exemplify how the representational function evolves.
6 As this was being written, a new venture was starting in California, designed to open an electronic market for information. Called 'AMIX,' it allows buyers to browse through descriptions of offerings from information suppliers. The market operates on the principle that the buyer advertises a question and sellers compete to meet his or her needs. The trend is towards 'unbundling' information; rather than sell a complete market report, the supplier extracts segments of information from it to fit clients' particular expectations.
7 The change of perspective which underlies the section that follows is associated with the entry of a new product, the Apple Macintosh, onto the market in 1984-5. The Macintosh was not the only machine of its kind (there were PCs before the Mac), but its philosophy was somewhat at variance with that of the other vendors. Bill Atkinson, who along with Steve Jobs had helped develop the product, has commented: 'The Macintosh dream really centers around putting personal computing power in individuals' hands' (Kaehler 1988: xiii). Even allowing for the 'hype,' the contrast is clear: top-down design versus bottom-up choice. The older computing companies grew out of a tradition of working for government (computing the census), science (lunar mechanics), the military (ballistics), and accounting (for large companies); their buyers were corporate specialists in information science. Apple started as a manufacturer of intellectual toys for a diffuse consumer market, and its machines reflected the company's preoccupation with the accessibility of its technology to untutored users. In the previous office-automation tradition, when technology proved difficult to learn, the tendency was to fall back on talk about inadequate training or problems of motivation. Mass-market-oriented companies reason differently: if something is hard to learn, it won't sell. While the usual pressure of consumer demand is mitigated in large organizations by the fact that the people who actually buy the new equipment are usually not its end users, but themselves computer specialists, Apple was gambling that users' preferences would eventually make themselves felt: it targeted the college market, for example. Unlike the previous generation of computers, the entry of the new family of products into administrative workplaces coincided with the take-up of technology by people who, compared to clerical help, were used to exercising discretion in the organization of their own work and accustomed to controlling their own budgets.
8 The phenomenological idea underlying hermeneutics, for example, is that ideas become real only in their expression.
9 Heim (1987).
10 In the interests of simplicity, we are attributing somewhat more to the Stanford group than it alone achieved; the developments that we describe drew on innovation by the entire computing community. This does not detract from the contribution of Engelbart's group, which was unique in its philosophy of 'augmentation.'
11 Leduc (1979). See also J.R. Taylor (1986b).
12 We are deliberately telescoping a good deal of history here. Xerox had already been involved in an earlier project to develop a personal computer, the Alto. For reasons having to do with the internal dynamics of corporate strategy, the project was aborted (for a fascinating account of the unsuccessful venture, see Smith and Alexander 1988). There were many projects afoot in Silicon Valley at the same time; if we have concentrated on the Macintosh, it is because of its eventual impact on professional users.
13 Jobs did not achieve this feat without opposition from within his own company. Initially, Macintosh sales lagged behind those of the older, well-established Apple II, and many within Apple contested hotly the priority that Jobs was giving to a product that many felt had not yet earned its spurs. By 1985, the internal conflict had reached the point where Jobs felt it better to leave, and he founded another company, which he called, appropriately, 'NeXT Inc.'
14 See, for example, Schrage (1990), who outlines the new technological possibilities for transforming the process of meeting and collaboration.
15 See also Beniger (1986: particularly 390-425).
16 'Survey and interview results indicate worker enthusiasm for computerization (while being concerned about potential job loss and health effects). Workers identify computerization with access to more and better information in order to better serve the public. They expect to be able to accomplish this by using the computer as a "tool" while simultaneously reducing the tedium of their job and increasing their independence and variety ... We see technical systems which require workers to perform work traditionally attributed to the management function: analysing, interpreting, and decision-making.' British Columbia Federation of Labour (1986).
17 Garson (1988) cites Ray Kroc on this point: 'We have a computer in Oak Brook that is designed to make real estate surveys. But those printouts are of no use to me. After we find a promising location, I drive around it in the car, go into the corner saloon, the neighborhood supermarket. I mingle with the people and observe their comings and goings. That tells me what I need to know about how a McDonald's store would do there.' While the rest of the enterprise works to a rigid, computer-driven plan, leaving no room for individual flair, Garson observes (37) one exception to this rule: 'It's interesting and understandable that Ray Kroc refused to work that way. The real estate computer may be as reliable as the fry vat probe. But as head of the company Kroc didn't have to surrender to it. He'd let the computer juggle all the demographic variables, but in the end Ray Kroc would decide, intuitively, where to put the next store.'
18 Panko's typology of office work has two basic archetypes. Type-1 departments handle the firm's routine information-processing, such as accounting, payroll, and billing. Procedures abound in these departments, and the automation of procedures is central to improved performance. Type-2 departments, in contrast, handle the firm's non-routine information-processing. Procedures are comparatively few, and support of procedures is not central to improved performance. Cf. Panko (1984).
Chapter 7
1 Hypertext is just one more step in a long-standing progression that includes relational databases and object-oriented programming. The emphasis that it receives in this text might well appear distorted to someone doing software research; it is here because we see in it a convenient metaphor, as well as a powerful technology.
2 An initiative of the Apple company.
3 Cf. Nielsen (1990: 297-310). Through hypertext, it is suggested, an individual's interaction history can be used to increase the user's sense of context in the information space and lower the risk of disorientation (a toy sketch appears below, after note 7).
4 The distinction drawn in chapter 4 between node as source or destination of a message and node as transaction is relevant here. This latter conception of an organization is quite consistent with Porter's value chain and Clemons's resource-based organization. It also fits well with Weick's view.
5 Mackenzie (1978).
6 Firms such as McDonald's like using slogans such as 'We do it all for you,' but of course the reverse is true - we do it all for them. It is we who go to the counter, they who tell us how to place our order, we who seat ourselves at the table, even we who clear the table. This is an increasingly common pattern in some of the service industries, where the system dictates to the client how he or she is to behave in order to obtain a response. As we bend to the system, we confirm the bureaucratic principle.
7 Fresh approaches to the global marketplace are being suggested by 'management's new gurus,' such as Hammer, Nadler, Ohmae, Peters, Senge, and Stalk (Business Week 1992b). Terms such as 'the learning organization' and 'organizational architecture' are among their 'buzzwords,' it is reported.
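The interaction-history idea in note 3 can be sketched in a few lines. The code below is our own toy, not Nielsen's design nor any product's interface: it simply records the trail of nodes a reader visits, so that the system can display recent context and support backtracking, the two anti-disorientation devices the note mentions. The node names are invented.

class HypertextSession:
    def __init__(self, start):
        self.trail = [start]  # complete interaction history, oldest first

    def follow_link(self, node):
        self.trail.append(node)

    def back(self):
        # Backtrack one step; the recorded trail is the reader's safety line.
        if len(self.trail) > 1:
            self.trail.pop()
        return self.trail[-1]

    def context(self, depth=3):
        # Show the last few nodes visited, situating the reader in the space.
        return ' > '.join(self.trail[-depth:])

s = HypertextSession('Home')
s.follow_link('Value chain')
s.follow_link('Resource-based view')
print(s.context())  # Home > Value chain > Resource-based view
print(s.back())     # Value chain - one step back along the trail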
8 A number of case studies conducted by Stephen Leahey (1992) and published by the Canadian Workplace Automation Research Centre in Montreal bear on this issue.
9 Roundtable on Governance in the Information Society, Institute for Research on Public Policy, 1992, Ottawa, unpublished case studies.
10 A similar observation can be found in Strassman (1985).
Chapter 8
1 In this respect, the physical sciences are in advance of the social. Ironically, physics started bringing theory back into the centre of its preoccupations in the mid-1930s, just when the social sciences were heading into their empirical trough and becoming almost paranoid in their suspicion of theory.
2 Frederick Philip Grove, a Canadian novelist, wrote a remarkably perceptive description of the ultimate in automation in The Master of the Mill (1944). The technology of computing had not even been discovered, but the system that he described comes close to some of today's automated environments.
3 Catastrophe theorists note a similar tendency. Given a catastrophe, the decision-maker has two choices: (1) change policy to the point where support is maximum (Maxwell's rule) or (2) change policy in the direction that locally increases support (the delay rule). The delay rule has a sociological basis: lack of information, having to make quick decisions based on intuition, sociological pressures to conform, inertia, and past history (Isnard and Zeeman 1976). The logic is similar to Kuhn's.
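The two rules in note 3 can be illustrated numerically. In the sketch below, which is our own construction and not Isnard and Zeeman's, 'support' is the negative of the standard cusp-catastrophe potential V(x) = x^4/4 + a*x^2/2 + b*x. Maxwell's rule jumps to whichever policy x enjoys maximum support overall (the global minimum of V); the delay rule moves only in the direction that locally increases support, and so stays in a declining basin until that basin disappears. The coefficients, step size, and grid are arbitrary illustrative choices.

def V(x, a, b):
    return x**4 / 4 + a * x**2 / 2 + b * x  # cusp potential; support = -V

def maxwell(a, b, grid):
    # Maxwell's rule: adopt the policy with maximum support anywhere.
    return min(grid, key=lambda x: V(x, a, b))

def delay(x, a, b, step=0.01):
    # Delay rule: adjust policy only in the locally improving direction,
    # stopping at the nearest local minimum of V.
    while True:
        here = V(x, a, b)
        if V(x - step, a, b) < here:
            x -= step
        elif V(x + step, a, b) < here:
            x += step
        else:
            return x

grid = [i / 100 for i in range(-300, 301)]
a, b = -2.0, 0.3  # two policy basins, the left one clearly better supported
print(round(maxwell(a, b, grid), 2))  # ~ -1.48: jumps to the better basin
print(round(delay(1.2, a, b), 2))     # ~ 1.33: stays in the nearby basin

A decision-maker following the delay rule, starting near x = 1.2, remains with the locally defensible policy even though greater support exists elsewhere - the inertia the note attributes to lack of information and pressures to conform.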
References
Akscyn, R.M., McCracken, D.L., and Yoder, E.A. (1988). 'KMS: A Distributed Hypermedia System for Managing Knowledge in Organizations.' Communications of the ACM 31 no. 7 (July) 820-35.
Akzam, H. (1991). 'Implantation technologique et adaptation organisationnelle: periode d'experimentation.' Master's thesis, Universite de Montreal.
Ashby, R. (1954). Design for a Brain. London: Chapman & Hall.
- (1956). An Introduction to Cybernetics. London: Chapman & Hall.
Attewell, P. (1992). Information Technology and the Productivity Paradox. Forthcoming, report on productivity from the National Research Council, Washington, DC.
Austin, J. (1962). How to Do Things with Words. Oxford: Clarendon Press.
Bailey, J. (1992). 'First We Reshape Our Computers, Then Our Computers Reshape Us: The Broader Intellectual Impact of Parallelism.' Daedalus (Winter) 67-86.
Baily, N. (1986). 'What Has Happened to Productivity Growth?' Science (24 Oct.) 443-51.
Bar, F., and Borrus, M. (1989). Information Networks and Competitive Advantages: The Issues for Government Policy and Corporate Strategy. Paris: OECD-BRIE Telecommunications User Group Project, 1-47.
Bardini, T., Horvath, A., and Lyman, P. (n.d.). The Social Construction of the Microcomputer User. Working paper, Annenberg School of Communication, University of Southern California, Los Angeles.
Barnard, C.I. (1938). The Functions of the Executive. Cambridge, Mass.: Harvard University Press.
Barthes, R. (1967). Elements of Semiology. Trans. A. Lavers and C. Smith. London: Jonathan Cape.
Bartunek, J.M. (1988). 'The Dynamics of Personal and Organizational Reframing.' In J.M. Bartunek, R.E. Quinn, and K.S. Cameron, eds., Paradox and Transformation: Toward a Theory of Change in Organization and Management, Cambridge, Mass.: Ballinger, 137-68.
Barwise, J. (1989). Situation in Logic. Palo Alto, Calif.: Center for the Study of Language and Information, Stanford University.
Barwise, J., and Perry, J. (1983). Situations and Attitudes. Cambridge, Mass.: Bradford (MIT Press).
Bateson, G. (1964). 'The Logical Categories of Learning and Communication.' Position paper, Conference on World Views, Wenner-Gren Foundation, 1968.
- (1972). Steps to an Ecology of Mind. New York: Ballantine.
Baudrillard, J. (1968). Le systeme des objets. Paris: Gallimard.
- (1970). La societe de consommation: ses mythes, ses structures. Paris: Gallimard.
Baumol, W.J. (1967). 'Macroeconomics of Unbalanced Growth: The Anatomy of Urban Crisis.' American Economic Review 57 (June) 415-26.
Bavelas, A. (1948). 'A Mathematical Tool for Group Structures.' Applied Anthropology 7, 16-30.
- (1950). 'Communication Patterns in Task-Oriented Groups.' Journal of the Acoustical Society of America 22 no. 6, 725-30.
Beniger, J.R. (1986). The Control Revolution: Technological and Economic Origins of the Information Society. Cambridge, Mass.: Harvard University Press.
Bennett, A. (1990). The Death of the Organization Man. New York: William Morrow and Co.
Berge, C. (1970). Graphes et hypergraphes. Paris: Dunod.
Berger, P.L., and Luckmann, T. (1966). The Social Construction of Reality. New York: Doubleday.
Biggs, B., and Bollman, R. (1991). 'Urbanization in Canada.' Statscan 11-008E no. 21 (Summer), 23-7.
Birch, D. (1987a). 'The Automation of America.' Inc. (March) 21-2.
- (1987b). Job Creation in America: How Our Smallest Companies Put the Most People to Work. New York: Free Press.
Bolter, J.D. (1984). Turing's Man. Chapel Hill: University of North Carolina Press.
Boudon, R. (1977). Effets pervers et ordre social. Paris: Presses universitaires de France.
Boyett, J.H., and Conn, H.P. (1988). Maximum Performance Management. Glenbridge Publishing.
British Columbia Federation of Labour (1986). Summary: Case Studies on New Technologies in Five British Columbia Workplaces. Report, Vancouver.
Brunsson, N. (1985). The Irrational Organization: Irrationality as a Basis for Organizational Action and Change. New York: John Wiley.
Business Week (1992a). 'The Global Economy: Who Gets Hurt?' 10 Aug. 48-53.
- (1992b). 'Management's New Gurus.' 31 Aug. 44-52.
Callon, M., and Latour, B. (1981). 'Unscrewing the Big Leviathan: How Actors Macrostructure Reality and How Sociologists Help Them to Do So.' In K. Knorr-Cetina and A.V. Cicourel, eds., Advances in Social Theory and Methodology: Toward an Integration of Micro- and Macro-Sociologies, Boston: Routledge & Kegan Paul, 277-303.
Canadian Federation of Independent Business (1983). A Full Employment Future: Submission to the Royal Commission on the Economic Union and Development Prospects for Canada.
Carey, J.W. (1981). 'Culture, Geography, and Communications: The Work of Harold Innis in an American Context.' In W.H. Melody, L. Salter, and P. Heyer, eds., Culture, Communication and Dependency, Norwood, NJ: Ablex, 73-91.
Carney, T. (1988). 'The Effects on the End User of the Production and Distribution System for Microcomputers.' Paper presented to the Canadian Communication Association, Windsor, Ont.
Chandler, A.D., Jr. (1962). Strategy and Structure: Chapters in the History of the Industrial Enterprise. Cambridge, Mass.: MIT Press.
- (1977). The Visible Hand: The Managerial Revolution in American Business. Cambridge, Mass.: Harvard University Press.
- (1990). Scale and Scope. Cambridge, Mass.: Belknap (Harvard University Press).
Chenail, R.J. (1991). Medical Discourse and Systemic Frames of Comprehension. Norwood, NJ: Ablex.
Child, J., Gunter, H.D., and Keiser, A. (1987). 'Technological Innovation and Organizational Conservatism.' In J.M. Pennings and A. Buitendam, eds., New Technology as Organizational Innovation, London: Ballinger, 87-115.
Chomsky, N. (1957). Syntactic Structures. The Hague: Mouton.
- (1965). Aspects of the Theory of Syntax. Cambridge, Mass.: MIT Press.
Clegg, S., and Dunkerley, D. (1980). Organization, Class and Control. London: Routledge & Kegan Paul.
Clemons, E.K. (1989a). Corporate Strategies for Information Technology: A Resource-Based Approach. Wharton School of Business, University of Pennsylvania, Philadelphia.
- (1989b). Strategic Strategies for Information Technology: A Resource-Based Approach. Wharton School of Business, University of Pennsylvania, Philadelphia.
Clemons, E.K., and Row, M.C. (1989). Information Technology and Economic Reorganization. Wharton School of Business, University of Pennsylvania, Philadelphia.
Cleveland, H. (1992). '"Safe for Diversity": The Challenge of Governing in an Information Society.' In S.A. Rosell et al., Governing in an Information Society, Montreal: Institute for Research on Public Policy, 111-19.
Coase, R.H. (1937). 'The Nature of the Firm.' Economica 4, 386-405.
Cohen, M.D., March, J.G., and Olsen, J.P. (1972). 'A Garbage Can Model of Organizational Choice.' Administrative Science Quarterly 17 no. 1 (March) 1-25.
Cron, W.L., and Sobol, M.G. (1983). 'The Relationship between Computerization and Performance: A Strategy for Maximizing the Economic Benefits of Computerization.' Information and Management 6, 171-81.
Crozier, M. (1963). Le phenomene bureaucratique. Paris: Editions du Seuil.
Crozier, M., and Friedberg, E. (1977). L'acteur et le systeme. Paris: Editions du Seuil.
Dawson, M. (1963). The Government of Canada. 4th ed., Toronto: University of Toronto Press.
Deetz, S. (1991). Democracy in an Age of Corporate Colonization: Developments in Communication and the Politics of Everyday Life. New York: State University of New York Press.
De Marco, T. (1978). Structured Analysis and System Specification. New York: Yourdon.
Devlin, K. (1991). Logic and Information. Cambridge: Cambridge University Press.
Drucker, P.F. (1986). 'The Changed World Economy.' Foreign Affairs (Spring) 786-91.
- (1989). The New Realities: In Government and Politics, in Economics and Business, in Society and World View. New York: Harper and Row.
Dumais, S., Kraut, R., and Koch, S. (1988). 'Computers' Impact on Productivity and Work Life.' Presentation to Conference on Office Information Systems. SIGOIS Bulletin 9 nos. 2 and 3 (April and July) 88-95.
Economist (1992). 'Financial Centres.' 27 June, 4-5.
Edmondson, W.J. (1981a). 'Illocutionary Verbs, Illocutionary Acts, and Conversational Behaviour.' In H. Eikmeyer and H. Reiser, eds., Words, Worlds and Contexts, Berlin and New York: Walter de Gruyter.
- (1981b). Spoken Discourse: A Model for Analysis. London: Longman.
Eisenstein, E.L. (1979). The Printing Press as an Agent of Change: Communications and Cultural Transformations in Early-Modern Europe. 2 vols. Cambridge: Cambridge University Press.
Engel, G.R., and Townsend, M. (1985). Final Report on the Impact Assessment on the Office Communications Systems Program Field Trials in Customs & Excise, prepared for the Department of Communications, Ottawa. Toronto: Engel & Townsend.
Engelbart, D.C. (1970). 'Intellectual Implications of Multi-Access Computer Networks.' Paper presented at the Interdisciplinary Conference on Multi-Access Computer Networks, Austin, Tex.
- (1984). 'Authorship Provisions in "Augment".' Digest of Papers, Compcon 84, 465-72.
Fayol, H. (1925). Administration industrielle et generale. Paris: Dunod.
Feldman, M., and March, J. (1981). 'Information in Organizations as Signal and Symbol.' Administrative Science Quarterly 26, 171-86.
Flores, G. (1982). Management and Communication in the Office of the Future. University of California at Berkeley.
Foucault, M. (1976). The Archaeology of Knowledge. New York: Harper & Row.
Franklin, U. (1990). The Real World of Technology. CBC Massey Lectures. Toronto: CBC Enterprises.
Friedman, Y. (1972). Societe-x-Environnement. Brussels: Paul Mignot.
Garfinkel, H. (1967). Studies in Ethnomethodology. Englewood Cliffs, NJ: Prentice-Hall.
Garson, B. (1988). The Electronic Sweatshop: How Computers Are Transforming the Office of the Future into the Factory of the Past. New York: Penguin.
George, J.F., and King, J.L. (1991). 'Examining the Computing and Centralization Debate.' Communications of the ACM 34 no. 7 (July) 62-72.
Gerth, H.H., and Mills, C.W. (1958). From Max Weber: Essays in Sociology. New York: Galaxy.
Giuliano, V.E. (1982). 'The Mechanization of Office Work.' Scientific American 247 no. 3, 148-65.
Glaser, B.G., and Strauss, A.L. (1967). The Discovery of Grounded Theory: Strategies for Qualitative Research. Chicago: Aldine.
Gleick, J. (1987). Chaos: Making a New Science. New York: Viking.
Globe and Mail (1988). July.
- (1988). 23 Aug.
- (1992). 'The New Economy: The Rising Value of Brain Power.' 1 Sept.
Goffman, E. (1959). The Presentation of Self in Everyday Life. New York: Doubleday.
- (1974). Frame Analysis. New York: Harper & Row.
Goldkuhl, G., and Lyytinen, K. (1982). 'A Language Action View on Information Systems.' Proceedings of the Third International Conference on Information Systems, Ann Arbor, Mich.
Goldstine, H.H. (1972). The Computer from Pascal to von Neumann. Princeton, NJ: Princeton University Press.
Goodman, D. (1987). The Complete HyperCard Handbook. New York: Bantam.
Greif, I., ed. (1988). Computer-Supported Cooperative Work: A Book of Readings. San Mateo, Calif.: Morgan Kaufman Publishers.
Greimas, A.J. (1987). On Meaning: Selected Writings in Semiotic Theory. Minneapolis: University of Minnesota Press.
Grice, H.P. (1975). 'Logic and Conversation.' In P. Cole and J.L. Morgan, eds., Syntax and Semantics vol. 3: Speech Acts, New York: Academic Press, 41-58.
Guetzkow, H. (1965). 'Communications in Organizations.' In J.G. March, ed., Handbook of Organizations, Chicago: Rand-McNally, 534-73.
Haley, J. (1976). Problem-Solving Therapy. New York: Harper & Row.
Halliday, M.A.K. (1970). 'Language Structure and Language Function.' In J. Lyons, ed., New Horizons in Linguistics, Harmondsworth: Penguin, 140-65.
Handel, W. (1982). Ethnomethodology: How People Make Sense. Englewood Cliffs, NJ: Prentice-Hall.
Harrison, B., and Bluestone, B. (1988). The Great U-Turn: Corporate Restructuring and the Polarizing of America. New York: Basic Books.
Heider, F. (1958). The Psychology of Interpersonal Relations. New York: Wiley.
Heim, M. (1987). Electric Language: A Philosophical Study of Word Processing. New Haven, Conn.: Yale University Press.
Henderson, J.C., and Venkatraman, N. (1989). Strategic Alignment: A Framework for Strategic Information Technology Management. CISR WP No. 190, Sloan WP No. 3039-89-MS, 90s WP No. 89-076. Center for Information Systems Research, Sloan School of Management, Massachusetts Institute of Technology.
Heritage, J. (1984). Garfinkel and Ethnomethodology. Cambridge: Polity Press.
Hirschheim, R.A. (1985). Office Automation: A Social and Organizational Perspective. Chichester: John Wiley.
- (1986). 'Understanding the Office: A Socio-Analytic Perspective.' ACM Transactions on Office Information Systems 4 no. 4 (Oct.) 331-44.
Hodges, A. (1983). Alan Turing: The Enigma of Intelligence. London: Unwin Paperbacks.
Hofstadter, D.R. (1979). Godel, Escher, Bach: An Eternal Golden Braid. New York: Vintage.
Holt, A.W. (1985). 'Coordination Technology and Petri Nets.' In G. Rozenberg, ed., Advances in Petri Nets, Berlin: Springer-Verlag, 278-96.
- (1988). 'Diplans: A New Language for the Study and Implementation of Coordination.' ACM Transactions on Office Automation Systems 6 no. 2, 109-25.
Hovey, M. (1992). 'Conversation in the Software Development Process.' Master's thesis, Concordia University, Montreal.
Huber, P. (1987). The Geodesic Network. US Justice Department, Washington, DC.
Innis, H.A. (1930). The Fur Trade in Canada. New Haven, Conn.: Yale University Press.
- (1950). Empire and Communications. London: Oxford University Press.
- (1951). The Bias of Communication. Toronto: University of Toronto Press.
- (1956). Essays in Canadian Economic History. Toronto: University of Toronto Press.
- (1980). The Idea File of Harold Adams Innis. Toronto: University of Toronto Press.
Institute for the Future (1988). 1988 Ten-Year Forecast. Menlo Park, Calif.
Ishii, H., and Miyake, N. (1991). 'Toward an Open Shared Workspace: Computer and Video Fusion Approach of TeamWorkStation.' Communications of the ACM 34 no. 12 (Dec.) 37-50.
Isnard, C.A., and Zeeman, E.C. (1976). 'Some Models from Catastrophe Theory in the Social Sciences.' In L. Collins, ed., The Use of Models in the Social Sciences, London: Tavistock Publications, 44-100.
Jacoby, H. (1973). The Bureaucratization of the World. Translation of Die Bürokratisierung der Welt: Ein Beitrag zur Problemgeschichte. Berkeley and Los Angeles: University of California Press.
Japan Statistical Yearbook (1989).
Jourdennais, M. (1992). 'Changement organisationnel et reconstruction de marges de manoeuvre a la Societe Canadienne de Postes.' Master's thesis, Universite de Montreal.
Kaehler, C. (1988). HyperCard Power Techniques and Scripts. Reading, Mass.: Addison-Wesley.
Katambwe, J.M. (1987). 'L'analyse des besoins en bureautique et en telematique: une etude critique.' Master's thesis, Universite de Montreal.
Katz, D., and Kahn, R.L. (1966). The Social Psychology of Organizations. New York: John Wiley.
Katz, J.J., and Fodor, J.A. (1963). 'The Structure of a Semantic Theory.' In J.J. Katz and J.A. Fodor, eds., The Structure of Language: Readings in the Philosophy of Language, Englewood Cliffs, NJ: Prentice-Hall, 479-518.
Keen, P.G.W. (1981). 'Information Systems and Organizational Change.' Communications of the ACM 24 no. 1 (Jan.) 24-33.
Keeney, B.P. (1990). 'Recursive Frame Analysis: A Method for Organizing Therapeutic Discourse.' Therapia Familiare 33, 25-39.
Kishchuk, N., and Bernier, M. (1986). 'Presentation to Organizations Participating in a Study on Productivity and the Management of New Technology.' Canadian Workplace Automation Research Centre, Montreal.
Kling, R. (1980). 'Social Analyses of Computing: Theoretical Perspectives in Recent Empirical Research.' Computing Surveys 12 no. 1 (March) 61-110.
Kolko, J. (1988). Restructuring the World Economy. Toronto: Random House of Canada.
Kondo, D.K. (1990). Crafting Selves: Power, Gender and Discourses of Identity in a Japanese Workplace. Chicago: University of Chicago Press.
Kraemer, K.L., and King, J.L. (1986). 'Computing and Public Organizations.' Public Administration Review 46, 488-96.
Kuhn, A., and Beam, R.D. (1982). The Logic of Organizations. San Francisco, Calif.: Jossey-Bass.
Kuhn, T. (1970). The Structure of Scientific Revolutions. 2nd ed. Chicago: University of Chicago Press.
Labov, W., and Fanshel, D. (1977). Therapeutic Discourse. New York: Academic.
Lakoff, G. (1987). Women, Fire and Dangerous Things. Chicago: University of Chicago Press.
Lakoff, G., and Johnson, M. (1980). Metaphors We Live By. Chicago: University of Chicago Press.
Lardner, J. (1988). 'The Sweater Trade (Annals of Business).' New Yorker (11 Jan.) 39-73 and (18 Jan.) 57-73.
Lawrence, P.W., and Lorsch, J.W. (1967). Organization and Environment. Cambridge, Mass.: Harvard University Press.
Leach, E. (1964). 'Anthropological Aspects of Language: Animal Categories and Verbal Abuse.' In E.H. Lenneberg, ed., New Directions in the Study of Language, Cambridge, Mass.: MIT Press, 26-63.
Leacy, F.H., ed. (1983). Historical Statistics of Canada. 2nd ed. Ottawa.
Leahey, S.G. (1992). Telecommunications Network-Based Services and Organizational Strategy. Report, Canadian Workplace Automation Research Centre, Montreal.
Leduc, N. (1979). 'La communication mediatisee par ordinateur: une nouvelle definition du dialogue groupal?' Master's thesis, Universite de Montreal.
Leiter, K. (1980). A Primer on Ethnomethodology. New York: Oxford University Press.
Levinson, S.C. (1983). Pragmatics. Cambridge: Cambridge University Press.
Lincoln, Y.S., and Guba, E.G. (1985). Naturalistic Inquiry. Beverly Hills, Calif.: Sage.
Long, R.J. (1987). New Office Information Technology: Human and Managerial Implications. London: Croom Helm.
Luce, R.D., Macy, J., Jr., Christie, L.S., and Hay, D.H. (1953). Information Flow in Task-Oriented Groups. MIT Research Laboratory of Electronics, Technical Report No. 264.
MacCormac, E.R. (1985). A Cognitive Theory of Metaphor. Cambridge, Mass.: MIT Press.
McFarlan, F.W. (1984). 'Information Technology Changes the Way You Compete.' Harvard Business Review (May-June) 98-103.
Mackenzie, K.D. (1978). Organizational Structures. Arlington Heights, Ill.: AHM Publishing.
- (1986). Organizational Design: The Organizational Audit and Analysis Technology. Norwood, NJ: Ablex.
McLaughlin, M.L. (1984). Conversation: How Talk Is Organized. Beverly Hills, Calif.: Sage.
MacLean & Associates (1991). 'Communications, Culture and the Public Interest: Policy Perspectives on the Networks of the Future?' Discussion paper, Ottawa.
McLuhan, M. (1962). The Gutenberg Galaxy. Toronto: University of Toronto Press.
- (1964). Understanding Media: The Extensions of Man. New York: McGraw-Hill.
Malone, T.W. (1987). 'Modeling Coordination in Organizations and Markets.' Management Science 33 no. 10 (Oct.) 1317-32.
Malone, T.W., Yates, J., and Benjamin, R.I. (1987). 'Electronic Markets and Electronic Hierarchies.' Communications of the ACM 30 no. 6 (June) 484-97.
Mandler, J.M., and Johnson, N.S. (1977). 'Remembrance of Things Parsed: Story Structure and Recall.' Cognitive Psychology 9, 111-51.
March, J.G. (1978). 'Bounded Rationality, Ambiguity, and the Engineering of Choice.' Bell Journal of Economics 9, 587-608.
March, J.G., and Olsen, J.P. (1976). Ambiguity and Choice in Organizations. Bergen, Norway: Universitetsforlaget.
March, J.G., and Simon, H.A. (1958). Organizations. New York: John Wiley.
March, J.G., and Weissinger-Baylon, R. (1986). Ambiguity and Command: Organizational Perspectives on Military Decision Making. Marshfield, Mass.: Pitman Publishing.
Markus, M.L., and Robey, D. (1988). 'Information Technology and Organizational Change: Causal Structure in Theory and Research.' Management Science 35 no. 5, 583-98.
Markus, M.L. (1984). Systems in Organizations: Bugs and Features. Boston: Pitman.
Martin, J. (1977). Future Developments in Telecommunications. 2nd ed. Englewood Cliffs, NJ: Prentice-Hall.
- (1981). Telematic Society: A Challenge for Tomorrow (previously The Wired Society). Englewood Cliffs, NJ: Prentice-Hall.
Mead, G.H. (1934). Mind, Self and Society. Chicago: University of Chicago Press.
Merton, R.K. (1940). 'Bureaucratic Structure and Personality.' Social Forces 18, 560-8.
- (1949). Social Theory and Social Structure. Chicago: Free Press.
Michael, D.N. (1984). 'Too Much of a Good Thing? Dilemmas of an Information Society.' Technological Forecasting and Social Change 25, 347-54.
- (1992). 'Governing by Learning in an Information Society.' In S.A. Rosell et al., Governing in an Information Society, Montreal: Institute for Research on Public Policy, 121-33.
Minsky, M. (1975). 'A Framework for Representing Knowledge.' In P. Winston, ed., The Psychology of Computer Vision, New York: McGraw-Hill, 211-77.
- (1985). The Society of Mind. New York: Simon & Schuster.
Mintzberg, H. (1973). The Nature of Managerial Work. New York: Harper & Row.
- (1979). The Structuring of Organizations. Englewood Cliffs, NJ: Prentice-Hall.
Mintzberg, H., and McHugh, A. (1985). 'Strategy Formation in an Adhocracy.' Administrative Science Quarterly 30, 160-97.
Mohrman, A.M., Jr., and Lawler, E.E., III (1984). 'A Review of Theory and Research.' In F.W. McFarlan, ed., The Information Systems Research Challenge, Boston, Mass.: Harvard Business School Press, 135-64.
Morgan, G. (1986). Images of Organization. Beverly Hills, Calif.: Sage.
Morton, M.S.S. (1986). Strategic Formulation Methodologies (1845-86). Sloan School of Management, MIT.
Mosco, V. (1989). The Pay-per Society: Computers & Communication in the Information Age. Toronto: Garamond.
Mulgan, G.J. (1991). Communication and Control: Networks and the New Economies of Communication. Cambridge: Polity Press.
Newcomb, T.M. (1953). 'An Approach to the Study of Communicative Acts.' Psychological Review 60, 393-404.
Nielsen, J. (1990). 'The Art of Navigating through Hypertext.' Communications of the ACM 33 no. 3, 297-310.
Norman, D.A. (1992). Turn Signals Are the Facial Expressions of Automobiles. Reading, Mass.: Addison-Wesley.
Ohmae, K. (1990). The Borderless World: Power and Strategy in the Interlinked Economy. New York and Toronto: Harper-Collins Canada Ltd. (by arrangement with McKinsey & Co.).
Ortony, A. (1979). Metaphor and Thought. Cambridge: Cambridge University Press.
Osberg, L. (1989). 'Paying Information Workers.' In L. Osberg, E.N. Wolff, and W.J. Baumol, eds., The Information Economy: The Implications of Unbalanced Growth, Halifax, NS: Institute for Research on Public Policy, 47-86.
Osberg, L., Wolff, E.N., and Baumol, W.J. (1989). The Information Economy: The Implications of Unbalanced Growth. Halifax, NS: Institute for Research on Public Policy.
Osborne, A. (1979). Running Wild: The Next Industrial Revolution. Berkeley, Calif.: McGraw-Hill.
Padgett, J.F. (1980). 'Managing Garbage Can Hierarchies.' Administrative Science Quarterly 25, 583-604.
Panko, R.R. (1984). 'Offices: Analyzing Needs in Individual Offices.' Transactions on Office Information Systems 2 no. 3, 226-34.
Pascale, R.T., and Athos, A.G. (1981). The Art of Japanese Management. New York: Warner.
Peirce, C.S. (1940, 1955). The Philosophy of Peirce: Selected Writings. London and New York: Routledge & Kegan Paul and Dover.
Penrose, R. (1989). The Emperor's New Mind: Concerning Computers, Minds, and the Laws of Physics. London: Oxford University Press.
Perrow, C. (1984). Normal Accidents: Living with High Risk Technologies. New York: Basic Books.
Pool, I. de S. (1977). The Social Impact of the Telephone. Cambridge, Mass.: MIT Press.
Porter, M.E. (1980). Competitive Strategy. New York: Free Press.
- (1985). Competitive Advantage. New York: Free Press.
Porter, M.E., and Millar, V.E. (1985). 'How Information Gives You Competitive Advantage.' Harvard Business Review (July-Aug.) 149-60.
Poster, M. (1990). The Mode of Information: Poststructuralism and Social Context. Chicago: University of Chicago Press.
Poston, T., and Stewart, I. (1978). Catastrophe Theory and Its Applications. London: Pitman.
Prigogine, I., and Stengers, I. (1984). Order Out of Chaos: Man's New Dialogue with Nature. New York: Bantam Books.
Putnam, L., and Stohl, C. (1988). 'Breaking Out of the Experimental Paradigm.' Presentation to the International Communications Association Meetings, New Orleans, 30 May-2 June.
Reich, R.B. (1983). The Next American Frontier. New York: Times Books.
- (1991). The Work of Nations. New York: Alfred A. Knopf.
Reynolds, R.J. (1989). 'The Evolution of Electronic Markets.' Master's thesis, Sloan School of Management, Massachusetts Institute of Technology.
Rheingold, H. (1985). Tools for Thought: The People and Ideas behind the Next Computer Revolution. New York: Simon & Schuster.
Ricoeur, P. (1981). Hermeneutics and the Human Sciences. Cambridge: Cambridge University Press.
Roach, S.S. (1986). 'High Technology and Productivity Performance: Challenges for American Industry.' Presentation before the Canadian Workplace Automation Research Centre, Montreal.
- (1987). America's Technology Dilemma: A Profile of the Information Economy. New York: Morgan Stanley.
Robey, D. (1981). 'Computer Information Systems and Organization Structure.' Communications of the ACM 24 no. 10 (Oct.) 679-87.
Rogers, E.M., and Kincaid, L. (1981). Communication Networks: A New Paradigm for Research. New York: Free Press.
Rogers, E.M., and Picot, A. (1985). 'The Impact of New Communication Technologies.' In E.M. Rogers and F. Balle, eds., The Media Revolution in America and in Western Europe, Norwood, NJ: Ablex, 108-33.
Rosell, S.A., et al. (1992). Governing in an Information Society. Montreal: Institute for Research on Public Policy.
Rumelhart, D.E. (1977). 'Understanding and Summarizing Brief Stories.' In D. LaBerge and S.J. Samuels, eds., Basic Processes in Reading: Perception and Comprehension, Hillsdale, NJ: Lawrence Erlbaum, 265-304.
Salvaggio, J.L., ed. (1989). The Information Society: Economic, Social, and Structural Issues. Hillsdale, NJ: Lawrence Erlbaum.
Schank, R., and Abelson, R. (1977). Scripts, Plans, Goals and Understanding: An Inquiry into Human Knowledge Structures. Hillsdale, NJ: Lawrence Erlbaum.
Schegloff, E.A., and Sacks, H. (1973). 'Opening up Closings.' Semiotica 8, 289-327.
Schelling, T. (1978). Micromotives and Macrobehavior. Toronto: George J. McLeod; New York: W.W. Norton.
Schement, J.R., and Mokros, H.B. (1990). 'The Social and Historical Construction of the Idea of Information as a Thing.' Unpublished paper.
Schleifer, R. (1987). A.J. Greimas and the Nature of Meaning: Linguistics, Semiotics and Discourse Theory. Lincoln: University of Nebraska Press.
Schneider, K. (1987). 'Services Hurt by Technology: Productivity Is Declining.' New York Times, 29 June.
Schon, D.A. (1979). 'Generative Metaphor: A Perspective on Problem-Setting in Social Policy.' In A. Ortony, ed., Metaphor and Thought, Cambridge: Cambridge University Press, 255-83.
Schrage, M. (1990). Shared Minds: The New Technologies of Collaboration. New York: Random House.
Scott, W.R. (1981). Organizations: Rational, Natural and Open Systems. Englewood Cliffs, NJ: Prentice-Hall.
Searle, J. (1969). Speech Acts: An Essay in the Philosophy of Language. Cambridge: Cambridge University Press.
Shaiken, H. (1984). Work Transformed: Automation and Labor in the Computer Age. New York: Holt, Rinehart & Winston.
Shannon, C.E. (1948). 'The Mathematical Theory of Communication.' Bell System Technical Journal 27 no. 10 (Oct.) 379-423, 623-56.
Shannon, C.E., and Weaver, W. (1949). The Mathematical Theory of Communication. Urbana, Ill.: University of Illinois Press.
Shore, J. (1985). The Sachertorte Algorithm. New York: Penguin.
Silverstone, R., Hirsch, E., and Morley, D. (1990). 'Information and Communication Technologies and the Moral Economy of the Household.' Presentation to the International Communication Association, Dublin.
Simon, H.A. (1947). Administrative Behavior. 2nd ed. published in 1960. New York: Macmillan.
- (1957). Models of Man. New York: John Wiley & Sons.
Simon, H.A., Smithburg, D.W., and Thompson, V.A. (1950). Public Administration. New York: Knopf.
Smircich, L. (1983). 'Concepts of Culture and Organizational Analysis.' Administrative Science Quarterly 28 no. 3, 339-58.
Smith, A. (1980). Goodbye Gutenberg: The Newspaper Revolution of the 1980s. New York: Oxford.
Smith, D.K., and Alexander, R.C. (1988). Fumbling the Future: How Xerox Invented, Then Ignored, the First Personal Computer. New York: William Morrow.
Smith, J.B., and Weiss, S.F. (1988). 'An Overview of Hypertext.' Communications of the ACM 31 no. 7, 816-19.
Statistics Canada (1986). The Changing Industrial Mix of Employment, adapted from Canada's Industries: Growth in Jobs over Three Decades. Cat. 89-507, Feb. 1986, W. Garnett Picot, Social and Economic Studies Division.
Strassman, P.A. (1985). Information Payoff: The Transformation of Work in the Electronic Age. New York: Free Press.
Suchman, L. (1987). Plans and Situated Action: The Problem of Human-Machine Communication. Cambridge and New York: Cambridge University Press.
Taylor, F.W. (1911). Principles of Scientific Management. New York: Harper.
Taylor, J.R. (1982). 'Office Communications: Reshaping Our Society?' Computer Communications 5 no. 4 (Aug.) 176-80.
- (1986a). The Computerization Crisis: End of a Dream or Threshold of Opportunity? Montreal: Institute for Research on Public Policy, 59-85.
- (1986b). 'New Communication Technologies and the Emergence of Distributed Organizations: Looking Beyond 1984.' In L. Thayer, ed., Organizations