SOCIOLOGICAL THEORY IN THE DIGITAL AGE
What is the role of sociological theory in the information age? What kinds of theories are best suited to analyzing the social uses of digital technologies, and to using digital technologies in new ways to study the social? This book contributes to several ongoing conversations on how the social sciences can best adapt to contemporary information technologies and information societies. Focusing on practical or ‘usable theory,’ it surveys the challenges and opportunities of conducting social science in the information age, as well as the theoretical solutions that sociologists have developed and applied over the last two decades. With specific attention to three theoretical approaches in digital social research—critical theory, forensic theory and Bourdieusian theory—the author provides an overview of the history and main tenets of each, surveys its use in sociological research, and evaluates its successes and limitations. Taking a long-term view of theoretical development in evaluating schools of thought and considering their productivity in analyzing and using contemporary digital communication technologies, this book thus treats theory as a tool for empirical research and the development of theory as inseparable from research practice. As such, it will appeal to scholars of sociology and social theory with interests in research methods, the development of theory and digital technologies.

Gabe Ignatow is Professor and Graduate Director in the Department of Sociology at the University of North Texas, USA. His research interests are in the areas of sociological theory, digital research methods, philosophy of social science and cognitive social science.
SOCIOLOGICAL THEORY IN THE DIGITAL AGE
Gabe Ignatow
First published 2020 by Routledge
2 Park Square, Milton Park, Abingdon, Oxon OX14 4RN

and by Routledge
52 Vanderbilt Avenue, New York, NY 10017

Routledge is an imprint of the Taylor & Francis Group, an informa business

© 2020 Gabe Ignatow

The right of Gabe Ignatow to be identified as author of this work has been asserted by them in accordance with sections 77 and 78 of the Copyright, Designs and Patents Act 1988.

All rights reserved. No part of this book may be reprinted or reproduced or utilised in any form or by any electronic, mechanical, or other means, now known or hereafter invented, including photocopying and recording, or in any information storage or retrieval system, without permission in writing from the publishers.

Trademark notice: Product or corporate names may be trademarks or registered trademarks, and are used only for identification and explanation without intent to infringe.

British Library Cataloguing in Publication Data
A catalogue record for this book is available from the British Library

Library of Congress Cataloging-in-Publication Data
Names: Ignatow, Gabe, author.
Title: Sociological theory in the digital age / Gabe Ignatow.
Description: 1 Edition. | New York : Routledge, 2020. | Includes bibliographical references and index.
Identifiers: LCCN 2019054404 (print) | LCCN 2019054405 (ebook) | ISBN 9780367263461 (hardback) | ISBN 9780367263478 (paperback) | ISBN 9780429292804 (ebook)
Subjects: LCSH: Social sciences--Research. | Information technology.
Classification: LCC H62 .I36 2020 (print) | LCC H62 (ebook) | DDC 300.1--dc23
LC record available at https://lccn.loc.gov/2019054404
LC ebook record available at https://lccn.loc.gov/2019054405

ISBN: 978-0-367-26346-1 (hbk)
ISBN: 978-0-367-26347-8 (pbk)
ISBN: 978-0-429-29280-4 (ebk)

Typeset in Bembo by Taylor & Francis Books
CONTENTS
Preface
Acknowledgments

PART I Introduction
1 A crisis in theory?
2 The sociology of the digital
3 Computational sociology and big data analysis

PART II Theoretical innovations
4 Critical Information Theory
5 Forensic Social Science
6 Bourdieusian theory

PART III Conclusions
7 A more continuous sociology

References
Index
PREFACE
Sociology is still finding its footing in the information age. What is the use of academic sociology at a time when social data is generated, circulated and analyzed constantly throughout society? What is sociology’s ‘unique selling point’ when search, social media and marketing companies are able to collect and analyze consumer data on a scale and at a speed academic researchers cannot hope to match? Should sociologists learn programming languages and develop social media expertise so as to engage with the latest information technologies, or should they maintain a critical distance from these technologies to position themselves to critique them without conflicts of interest or otherwise compromised objectivity?

Since the 1990s these sorts of questions have sparked thoughtful discussion throughout the social sciences; within sociology, such discussions have mostly concerned data and methods: how to use internet-based social data, including so-called ‘big data,’ in sociological research projects; how to study digital inequality in its various forms; how to train graduate students to collect, organize and analyze large data sets from online sources; and how to incorporate social media and emerging digital methods into undergraduate curricula. Since the early 2000s, important and wide-ranging discussions of the relations between the social sciences and information technology have been published in special issues of journals such as Big Data & Society, Sociological Methods & Research, The Sociological Review, Sociology and other outlets.

In one of the most impactful discussions about how the social sciences can best adapt to the challenges of the information age, the British sociologists Savage and Burrows suggested in a 2007 article in Sociology that while in the years between about 1950 and 1990 sociologists ‘could claim a series of distinctive methodological tools that allowed them to claim clear points of access to social relations’ (p. 886), by the early twenty-first century such jurisdictional claims were no longer justified. Today corporations and public sector organizations routinely produce, collect and analyze their own social data and data from their clients and competitors. Where 50 years ago academic social
scientists could be seen as occupying the apex of national social research fields, today they occupy a mostly marginal position within a global commercial research infrastructure. This is an infrastructure characterized by Thrift (2005) as ‘knowing capitalism,’ which he described in terms of a stage of development at which capitalism considers its own practices on a continuous basis. This continuous self-analysis in the service of innovation, uncertainty reduction and profit works through a proliferation of circuits of information that are embedded in numerous information technology platforms.

Business and academe interact with greater frequency and intensity, and symmetries between the two increase: business has become more academic as academe has become more business oriented. It is no longer possible, if it ever was, to think of academia as an epistemologically privileged sphere. Similarly, it is no longer possible to write off business as though it were the haunt of the epistemologically challenged; business has become more ‘intelligent’ in a number of ways. (Thrift 2005: 22–23)

One element in knowing capitalism is academic sociology’s capitalistic other, what Burrows and Gane (2006) refer to as ‘commercial sociology.’ Commercial sociology involves the collection and analysis of social data for commercial purposes, and it often makes use of research methodologies developed by sociologists (survey methods, focus groups, network analysis, content analysis) as well as theoretical concepts taken from academic sociology. Burrows and Gane give the example of researchers from the geodemographics industry referring to ‘habitus,’ ‘ideal types,’ ‘Weltanschauung’ and ‘globalization.’ Though it borrows much from academe, commercial sociology is an enterprise in which academic sociologists are almost entirely peripheral. Built from mailing lists, consumer surveys, websites and other sources, commercial sociology’s data sets often dwarf anything an academic sociologist could hope to collect. And because these sorts of commercial data sets are privately held, academics generally cannot access them (see Lazer et al. 2009: 721). Yet within commercial sociology, big data of this kind does not require special efforts to collect because it is a by-product of routine business operations.

Thus if corporations and public sector organizations have more and higher-quality data than do sociologists, perform demographic and behavioral research using methods that are at least as sophisticated as those employed by academic researchers, and are less restricted by ethical standards and regulations than are university researchers, what is left of sociology? On what basis can sociologists claim any special jurisdiction in the collection or analysis of social data? Gane (2011) treats this question as part of a ‘crisis of measurement’ within sociology, by which he means a crisis in the value of quantitative sociological methodologies that are based on the production of data through measurement. Quantitative sociology within academe is threatened by marginalization and redundancy (Savage & Burrows 2007) as the academy ‘loses whatever sovereignty it previously held over quantitative research techniques’ (Gane 2011: 152–153). Emblematic of this shift is the
nascent field of data science, a field that welcomes researchers with high-level computational and research design skills and that competes with sociology and other academic fields for talent and resources (Alvarez 2016).

Within the crisis of empirical sociology as a whole, sociological theory is faced with its own closely related mini-crisis in the form of a lack of conceptual renewal, innovation and excitement since the generation of Beck, Bauman, Giddens and Bourdieu. Gane (2011) and before him Turner (2006) have each identified a contemporary generation gap in sociological theory. Gane draws on Outhwaite (2009) and Collins (1998: xix) to suggest that the lifespan of an intellectual generational cohort is approximately 33–35 years, at which point the creative work of a generation of theorists will have started to fade and be displaced by a new generation of thinkers. Abbott’s (2006) vision of generational change is similar. His claim is that

25 years is about the length of time it takes a single group of individuals to make up some new ideas, seize the soap boxes, train a generation or two of students, and finally settle into late career exhaustion. Their students may keep things going, but their students’ students tend to be fairly mechanical appliers of the original insights. The really creative people don’t make their careers by hitching themselves to other people’s wagons. (2006: 61; qtd. in Gane 2011: 165)

If Gane, Outhwaite, Collins and Abbott are correct, the early 2000s should have been the point at which Beck, Bauman, Giddens and Bourdieu were replaced by a new generation of sociological theorists. As we will see in Chapter 1, this has not happened. Why is the generation of Bourdieu, Beck, Bauman and Giddens still dominant within sociology? What has happened to the next generation of theoretical talent in the discipline? This generational crisis encompasses star theorists and the rank and file as well; writing in 2006, Turner observed that ‘it would be hard to name any young grand theorists in sociology today … most have three to five decades in the field’ (p. 27).

One explanation that has been advanced for the contemporary crisis in sociological theory relates the crisis to features of knowing capitalism (Thrift 2005). Fuller’s (2009) account of the rise of quantitative evaluation of academic ‘outputs’ shows how national university audit measures have disincentivized risky theoretical work in favor of derivative empirical research programs. It is the latter sorts of research programs that allow individual researchers to advance their careers by demonstrating consistent productivity and ‘impact’ to university administrators.

While sociological methodology has taken steps to navigate the opportunities and challenges of the information age—of knowing capitalism, commercial sociology, big data and the like—the subfield of sociological theory, perhaps partly as a consequence of its generational crisis, has fallen behind. Contemporary theorists have not explained theory’s importance within knowing capitalism, commercial sociology or data science, and as a result, sociological theory has become less central
to public sociology. ‘Whereas in previous generations it was social theorists—often sociologists—such as Habermas, Giddens, Foucault and Beck who commanded public as well as academic debate, it is now social scientists of data … who are at the fore’ (Halford & Savage 2017: 4). Today ‘data books’ that deploy large-scale assemblages of data from diverse sources are the voice of public sociology rather than books of grand theory. Such data books include Putnam’s (2000) Bowling Alone on social capital; Piketty’s Capital in the Twenty-First Century (2014); and Wilkinson and Pickett’s The Spirit Level (2011) on the consequences of social inequality. These trade press books use powerful theoretical frameworks to draw data from diverse sources, such as US Census data, the General Social Survey and the World Incomes Database, into arguments that address intellectual or policy puzzles. The theories they employ include communitarian approaches to social interaction and social networks (Putnam), social psychological theories of shame and stigma (Wilkinson and Pickett) and economic theory (Piketty). None of these authors introduce new theoretical ideas, and they do not lay out their theoretical arguments deductively. Rather, they return to their theoretical points repeatedly as a kind of refrain and support these points with extensive data visualizations.

In contrast to popular data books like those of Putnam, Piketty, and Wilkinson and Pickett, contemporary theory monographs and textbooks tend to be retrospectively oriented. They cover Enlightenment thought and the emergence of sociology in the nineteenth and early twentieth centuries (e.g. Turner, Beeghley & Powers 2011), the mid-century syntheses of Talcott Parsons and Robert Merton, and the challenges posed by postmodernism from the 1970s through the 1990s (Seidman 1994). Recent theory monographs cover philosophical issues as they relate to internal disciplinary disputes over methodology (Reed 2011) and theoretical strategies (Rueschemeyer 2009) and tactics (Rojas 2017). Many theory monographs and textbooks include late chapters on theories of globalization, but they almost never consider the technological or institutional contexts in which sociological theory itself operates (Camic and Gross 1998: 468) or sociological theory’s potential value for understanding and working with information technologies (Mohr 2015, 2016).

Thus while sociological theory is taught at undergraduate and graduate levels in nearly every sociology department in the world, and several highly regarded social science journals publish primarily theory papers, theoretically oriented sociologists have not systematically considered the place of sociological theory in the information age. What is the role of theory in data-driven research using large data sets (Kitchin 2014) and in the emerging field of data science? Is theory relevant to the commercial sociology enterprise? What knowledge claims are theoretical sociologists justified in making in a world of knowing capitalism when social data and scientific knowledge are available instantaneously to anyone with internet access? Absent answers to these questions, sociological theory risks becoming intellectual history—‘what Durkheim-Weber-Bourdieu and so on have said’ (Swedberg 2012: 2)—if it has not already.
A premise of the present book is that Savage and Burrows’ arguments about the challenges posed by knowing capitalism and commercial sociology to empirical sociology were relevant to sociological theory in 2007 and remain relevant today
(Burrows and Savage 2014; Gane 2011; Mohr 2015, 2016). From the Victorian era through the late twentieth century, sociological theory was predicated on the notion that social data was scarce and difficult to acquire and that sociologists’ special education and training afforded them privileged access to ways of understanding and analyzing social reality that were unavailable to the uninitiated. Since the 1990s knowing capitalism, commercial sociology and data science have fundamentally altered this picture, and it is not yet clear how sociological theory can best position itself in relation to modern research infrastructures—either by operating within them or at critical distances from them.

A productive big-picture conversation about the role of sociological theory in the digital age has not yet taken place, but if it is to take place it must consider the purposes sociological theory has served within and beyond the discipline of sociology. Since Comte, sociological theory has been a tool for understanding specific patterns of historical change. It has helped its readers to connect the social issues they experience in their everyday lives to analyses of large-scale social patterns. In Mills’ often-cited passage,

the facts of contemporary history are also facts about the success and the failure of individual men and women. When a society is industrialized, a peasant becomes a worker; a feudal lord is liquidated or becomes a businessman … Neither the life of an individual nor the history of a society can be understood without understanding both. (2000 [1959]: 3)

Sociological theory provides people with tools for understanding both and for navigating ideological discussions of topics ranging from war to social problems such as poverty and inequality (Turner & Turner 1990: 135).

A second purpose of sociological theory has been to serve as a lingua franca that allows empirical research from disparate areas to communicate and cumulate. Theory is used as a guide within empirical research practice, and the results of empirical research projects are brought to bear on theoretical discussion. Theoretical frameworks and concepts connect sociological research on disparate topics using unrelated methodologies to each other and to research from other disciplines, and these frameworks and concepts have become more central to academic sociology since the 1970s as the more applied branches of sociology have lost ground due to competition from more specialized academic fields and from nonacademic research organizations. Beginning in the 1970s, research units run by academic sociologists in the United States were forced to compete with Washington-based consulting agencies that had strong bureaucratic connections and the ‘ability to shift interests as the political priorities of funding agencies changed’ (Turner & Turner 1990: 145). Around the same time a series of federal grant program decisions led to changes in the composition of sociology departments. Prior to this period departments had operated as ‘residual social sciences’ that took responsibility for topics and programs
that were too small or incoherent to sustain themselves as independent disciplines. Such topics and programs often included gerontology, criminal justice, social work, and race and gender studies. Ultimately government grants and regulations that gave employment preferences to students with specialized degrees allowed these and other applied research areas to develop into self-sustaining independent disciplines. As more applied and specialized research areas splintered off from sociology, the home discipline shifted from emphasizing sociological practice (sociologists performing needs assessments, program evaluations and the like) to identifying sociology as a general social science (see the discussion of Talcott Parsons in Chapter 7) dedicated to basic research, and as a critical social science seeking to promote progressive social change (Turner & Turner 1990: 179–187). Theory in its various forms is centrally important to both of these contemporary sociological orientations, and so holds a relatively secure position within contemporary academic sociology even as its functions within digital-age social research, and academic sociology’s jurisdiction within contemporary society (Savage & Burrows 2007), appear less secure.

This book is about how sociological theory has adapted itself to the digital revolution, which I will treat as comprising three interrelated trends that have special relevance for the social sciences: knowing capitalism (Thrift 2005), commercial sociology (Burrows & Gane 2006) and big data (Kitchin 2014; Mohr 2015, 2016; Mützel 2015). I think sociological theory is an incredibly innovative field that has adapted to the digital age in several important ways. Its capacity for innovation has allowed sociological theory to continue to play centrally important roles in basic social science research as well as in the public and applied sides of sociology. While sociological theory may not be in a state of crisis per se, it may still be productive to start a conversation about the future of sociology based on a critical and thorough evaluation of the various approaches to adapting sociological theory to contemporary information societies that have emerged since the 1990s. My hope is that this book will be received as it is intended: as a modest contribution, and an open invitation, to such a conversation.
ACKNOWLEDGMENTS
This book could not have been written without the help and support of innumerable friends and colleagues. Conversations in the meeting rooms and hallways of various conferences, in the Q&A sessions after presentations, in Sycamore Hall at the University of North Texas and via email and Twitter have all made this book what it is. To be sure, not everyone on this list will agree with every point made in this book, but I appreciate every one of the contributions made by David Arditi, Donna Barnes, Emőke-Ágnes Horvát, Kevin Lewis, Omar Lizardo, Richard Machalek, Kevin McCaffree, Doug Porpora, Matt Rafalow, Craig Rawlings, Isaac Reed, Laura Robinson, Jeremy Schulz, Katie Sobering, Stephen Turner, Neil Jordan and Alice Salt at Routledge, and the participants at talks at the American Sociological Association, Eastern Sociological Society and Facebook Foo Camp. The book is dedicated to the memory of John Mohr. Anyone who ever met him, even if for just a moment, knows how special a person he was and the debt of gratitude we all owe him for the warmth and attention he gave so freely.
PART I
Introduction
1 A CRISIS IN THEORY?
Introduction

For academic sociologists, their home discipline’s traditions of theoretical speculation and theoretically guided empirical analysis are gifts that keep on giving. Every year hundreds of thousands of students worldwide enroll in required and elective undergraduate and graduate sociological theory courses. These courses cover the theoretical luminaries of the nineteenth and twentieth centuries, from Marx, Weber, Mead and DuBois to Giddens and Bourdieu, while the topics covered range from theories of social integration, bureaucratic organization and class conflict to feminist theory, postmodernism, and theories of globalization and transnationalism. Hundreds of theory textbooks have been published, many in multiple editions. There are numerous successful dedicated sociological theory journals and many more generalist social science and sociology journals that regularly publish theory papers. National and international sociological associations have sections and working groups dedicated to theoretical development and critique, including the American Sociological Association’s theory section, the British Sociological Association’s theory working group and the International Sociological Association’s Research Committee 16.

Then there is social theory, sometimes referred to pejoratively as ‘grand theory,’ the mostly continental European tradition of philosophical and historical analysis of social trends and conflicts. As sociologists embraced new quantitatively oriented research methods and statistical analysis techniques in the 1950s and 1960s, sociological theory and social theory began to go their separate ways:

advances in empirical research practice during the twentieth century contributed to a differentiation between two senses of theory. On the one hand, some sociological theory was very close to empirical research projects; it consisted largely of relatively formalized structures of hypotheses and
confirmations. On the other hand, much of the classical tradition of sociological theory tried to offer large-scale perspectives on social life in general. (Calhoun, Gerteis, Moody, Pfaff & Virk 2012: 17)

The postwar rejection of social theory can be seen in a 1956 textbook on sociological theory by Borgatta and Meyer in which the authors suggested that ‘the dismissal of early theorists is too quick among the students in training today,’ and that there was among students of sociology in the 1950s ‘a tendency to think that ideas begin with their last statement, or with the testing of an experimental hypothesis’ (viii). In the postwar period sociologists developed research tools such as social survey analysis, experiments and content analysis, and they found new ways to use theoretical concepts to direct empirical research that made use of the new techniques (see Mohr, Wagner-Pacifici & Breiger 2015). The most prominent example of the adaptation of sociological theory to quantitative research methodologies was Merton’s concept of theories of the ‘middle range,’ which were theories of circumscribed social phenomena from which testable hypotheses could be derived. Middle-range theory was successful, particularly in the United States, but its victory was only ever partial.

The European tradition of social theory/grand theory, most but by no means all of it critical Marxist in orientation, continued to develop in response to the rapid social, political and cultural changes of the postwar period. Social theory’s development occurred almost entirely independently of the development of middle-range theorizing and empirical research. Critical Marxist social theory regained some of its lost stature in the 1960s and 1970s as part of a widespread rejection of the excesses and failures of scientific positivism in social science and in society generally. In 1966 Theodor Adorno had already argued against the possibility of social science objectivity. In 1971 Gouldner, a former student of Merton, had attempted to align sociological theory with the ‘structure of feeling’ (Williams 1977) of the 1960s counterculture. Gouldner approved of the counterculture’s rejection of scientism and ‘inclination to not disprove or argue with the old theories, but to ridicule or avoid them’ (Gouldner 1971: 7).

Twenty years later Seidman took the antiscience positions developed by Adorno and Gouldner one step further in announcing that the time had come for social theory to replace sociological theory once and for all. Because by the 1990s sociological theory had ‘lost most of its social and intellectual importance … disengaged from the conflicts and public debates that have nourished it in the past [and] has turned inward and is largely self-referential’ (Seidman 1991: 131), its revitalization required a renunciation of scientism and the replacement of ‘hopes for a great transformation of society by more modest aspirations of a relentless defense of immediate, local pleasures and struggles for justice’ (p. 131). In Seidman’s version of sociological postmodernism, sociologists would replace empirical research projects designed to test general theory with ‘broad social narratives’ developed by social theorists (see also Lemert 2015).
The result of the postwar divorce of social theory—in its critical Marxist, postmodern and other forms—from sociological theory (see Mouzelis 2003: 3) is that where sociological theory is today closely tied to empirical social research practice, including quantitative research practice, social theory almost always rejects
sociological theory’s ‘pretensions to scientism’ (Seidman 1991). There are large areas of overlap between social and sociological theory to be sure, and social theory readers and textbooks often include work by sociologists such as Robert Merton and Pierre Bourdieu despite these sociologists’ self-identification with sociological theory but explicitly not social theory (referred to at times by Bourdieu condescendingly as ‘theoretical theory’). But this inclusiveness is arguably a result of the exigencies of undergraduate pedagogy and textbook and reader production more than of shared fundamental metatheoretical or philosophical positions or commitments. As we will see in Parts II and III, the challenges and opportunities associated with the digital revolution have laid bare the metatheoretical divide between sociological theory and social theory, making it difficult for sociologists to avoid taking the side of one or the other approach.

Sociological theory provides the conceptual foundations for the leading contemporary sociological research programs and the sociological communities that support each of these programs. These include the social networks research community led by Burt, Uzzi and others that traces to Coleman and Granovetter’s ideas on social capital (Coleman 1994; Granovetter 2017); work on institutional theory in Meyer’s tradition (Meyer, Krücken & Drori 2009); work by Latour and others based on actor-network theory (Latour 1987); research on identity and boundaries by Zuckerman, Brubaker and their colleagues (Brubaker 2004; Zuckerman 1999); and research on cultural geography and globalization in the tradition of Urry, Lash and Castells (Lash & Urry 1993).1 This list is of course neither exhaustive nor definitive.

Amidst all this activity in sociological theory textbook production and pedagogy, social theory, and empirical research programs informed by sociological theory, there have been regular expressions of concern about sociological theory’s contemporary relevance and vitality. In his monograph Sociological Theory: What Went Wrong?, first published in 1995, Mouzelis (2003) attempted to use elements of postmodern thought to rescue sociological theory from what he saw as its state of stagnation. Then in 2006 Turner identified a generation gap in sociological theorizing, a consequence of which was that innovation in sociological theory effectively ended in the 1990s. In a 2015 American Sociological Association newsletter article Szelenyi suggested that sociological theory has been on a ‘downward slope since the 1980s,’ while in European sociological circles the lack of conceptual renewal, innovation and excitement since the generation of Beck, Bauman, Giddens and Bourdieu has been a topic of debate for well over a decade (Outhwaite 2009). So what happened to the next generation of theoretical talent in the discipline?

Perhaps the best explanation that has been advanced for this stagnation in sociological theory relates the current situation to features of contemporary capitalism and the reorganization of universities and national higher education systems along business lines. Fuller’s (2009) account of the rise of quantitative evaluation of academic ‘outputs’ shows how national university audit measures have disincentivized risky theoretical work in favor of derivative empirical research programs. It is the latter sorts of research programs that allow individual researchers to advance their careers
by demonstrating consistent productivity and ‘impact’ to university administrators (Subramaniam, Perrucci & Whitlock 2014). One predictable negative consequence of this national and global ‘audit culture’ (Fuller 2009) is a lack of daring theoretical innovation. In this climate in which consistent research ‘output’ is rewarded with grants, tenure and promotions, the relevance of innovative sociological theories to the study of social phenomena influenced by information communications technologies and to emerging digital research methods is not at all clear.

It may be the case that the crisis in sociological theory today is due not only to too little theoretical innovation, but also and relatedly to the existence of too many competing theoretical paradigms. Perhaps sociology never recovered from the ‘theoretical and methodological disarray of the 1970s’ (Abrams, Deem, Finch & Rock 2018: 3), has lost whatever ability it may once have had to move on from unproductive or demonstrably flawed theoretical paradigms, and is as a direct result poorly positioned to compete against other social science disciplines for attention and resources when disciplines such as psychology and economics encompass fewer incommensurable theoretical paradigms and can therefore more readily communicate their value to university administrators and the public. Accordingly, across sociology there have been calls for both fewer theories and less theory. For instance, the ethnographers Besbris and Khan (2017) have argued that sociology would be better off if it were ‘a descriptively rich discipline, where theoretical frameworks are considerably less numerous and therefore more powerful’ (p. 147).

Sociological theory certainly appears to be both more prevalent within sociology and less relevant in the public sphere than it was a generation ago (Halford & Savage 2017). But is it in a state of crisis? Or a temporary state of stagnation? It may be the case that changes within academe have made it increasingly challenging to identify the value of specific sociological theories and theoretical paradigms, and of sociological theory generally.
The permanent crisis

The main contours of contemporary sociology emerged in the years after the Second World War. In comparison to sociology in the late nineteenth and early twentieth centuries, after the war sociology experienced influxes of both students and government research funds, particularly in the United States. As government funding supported the quantification of the social sciences, the shift toward sociology as ‘Big Science’ split the discipline. On one side, sociology emerged as a science, more precisely a ‘basic social science’ (Owens 2010: 72–73), along with anthropology, economics and social psychology and in contrast to more applied social science fields such as education and management research. Scientific sociology has at times sought to distance itself from the rest of the field, for example in the founding of the Society for Social Research as a more scientific alternative to the American Sociological Society (Turner & Turner 1990: 148), and more recently in the founding of the journal Sociological Science. In sociology as a basic social science (a ‘general science’ in Turner and Turner’s (1990) terms), theory is
used as a guide to empirical research and empirical research is brought to bear on theoretical discussion. Theory serves as a conduit that allows sociology’s disparate topics and methodologies to communicate with each other and with research from other disciplines, and it has become more visible in the discipline as the institutionalization of applied social science fields such as criminology, education research, management research, public administration and public health has left sociology with claims to only a small number of residual empirical research areas.

The main divisions within modern sociology between sociology as basic social science and critical public sociology took shape during the immediate postwar period. In the United States these divisions crystallized in the Columbia University sociology department, which emerged during this period as a major center of world sociology. Three successful forms of sociology emerged at Columbia (see Chapter 7), and to the present day proponents of each of these three often perceive crises in those subfields in which one of the other forms is preeminent or on the rise.

The first form, sociological empiricism, was represented by Paul Lazarsfeld. Lazarsfeld was a methodological innovator in research on unemployment, public opinion, mass media and communications, and in the fields of applied sociology and political sociology. He developed new applications of statistical survey analysis, panel methods, latent structure analysis and contextual analysis (Jeřábek 2001). With Robert Merton he invented the focus group interview, and is today considered one of the founders of mathematical sociology. As the founder of Columbia University’s Bureau of Applied Social Research he was also remarkably successful at securing corporate and government contracts for his sociological work (Clark (1996) refers somewhat disparagingly to Lazarsfeld’s ‘clientelism’). Lazarsfeldian sociology thus often resembled ‘sociological practice’ in Turner and Turner’s (1990) terms.

The second form of sociology is critical public sociology. A central justification for the existence of sociology has always been its popular appeal. In the mid-twentieth century the Columbia sociologist C. Wright Mills was a pioneer as a public sociologist whose books sold well to a general audience. Mills was a harsh critic of Lazarsfeld, although his criticisms arguably missed the mark (Clark 1996). Turner and Turner (1990) refer to Mills’ version of public sociology as ‘critical social science.’

The third form was sociology as ‘basic social science,’ in Parsons’ terminology, or ‘general social science’ according to Turner and Turner (1990). At Columbia, sociology as basic social science was advanced by Robert Merton and his well-known concept of middle-range theory. Contemporary examples of empirical research based on middle-range theorizing include network theory, social exchange theory and rational choice theory. This kind of sociological research is characterized by a relatively balanced connection between methodology and sociological theory.

Social theory might be considered a fourth form of sociology, but at this point it cannot be counted as a success within postwar sociology. There are hardly any grand theorists left in sociology (Turner 2006); Parsons was the last major social theorist in American sociology, and Bauman, Beck, Giddens and Bourdieu the last generation of major European social theorists.
Bourdieu, however, as we will discuss in Chapter 6, took pains to distance his highly theoretical approach to sociology from social theory per se.
Sociological empiricism, public sociology, and sociology as basic social science have coexisted effectively enough, if with inevitable frictions and tensions, for decades. The three versions of sociology arguably complement each other institutionally by building up multiple audiences for sociological courses, books and journals, and by generating funding from multiple sources. But the sharp intellectual differences across the approaches produce a permanent sense that the discipline is afflicted by multiple crises. The intellectual differences and contradictions between the approaches have also contributed to an unusual volume of institutionally self-reflective commentary.

In their section newsletters, conference addresses, published debates and tweets, empirical sociologists working in something like the Lazarsfeld tradition regularly lament that their discipline is methodologically underpowered. For instance, Szelenyi noted in 2015 that

[b]y the turn of the century the crisis had arrived. Social sciences, starting with economics, followed by political sciences underwent fundamental changes; neoclassical economics, rational choice theory and experimental research design appeared to be victorious and sociologists are still searching for ways to respond.

Advocates for public sociology lament what they perceive to be the narrow technical focus, careerism and inward-facing orientation of empirical sociology (Agger 2013). And sociologists working in the Mertonian tradition of sociology as basic social science lament an overabundance of low-quality, low-impact empirical research, and the cultural myopia and moralizing of public sociology (e.g. Turner 2017).

There is a long history of Millsian criticism of sociology from sociologists themselves and from external critics. For instance in a review of the 2012 British Sociological Association conference, the journalist Aditya Chakrabortty of the Guardian expressed dissatisfaction with the lack of sociological attention given to major public events such as the global financial crisis. He contrasted what he perceived to be sociology’s current inward orientation with the figures of Karl Marx, C. Wright Mills and Pierre Bourdieu, all public intellectuals who were politically engaged and ‘unafraid to stray outside their disciplines’ (Chakrabortty 2012; cf. Toby 2019). Chakrabortty compared Marx, Mills and Bourdieu with the picture of

today’s teacher in a modern degree-factory, forever churning out publications for their discipline’s top-rated journals. Not much scope there to try out a speculative research project that might not fly … Nor is there much encouragement to engage with public life. Because that’s what’s really missing. (Chakrabortty 2012; qtd. in Gane & Back 2012: 403–404)

Where Chakrabortty and other sympathetic external critics are asking sociologists to improve how they communicate the results of their research to the public, internal critics have gone much further. Like Mills before him, in The Coming Crisis of Western Sociology (1971), Gouldner argued that sociology must turn away from
producing objective truths. Gouldner argued that positivist ideology was based on false premises and used as a tool by a ruling elite, and that critical subjective thought is therefore more important than scientific objectivity for sociology. Sociologists, on this view, should engage in public sociology not as a secondary process of communicating the results of their research to the public; rather, public engagement should be a primary goal of research.

As noted above, in the 1980s and 1990s postmodernist sociologists took Gouldner’s internal critique of sociology a step toward its logical conclusion. Seidman claims that theoretical and methodological disputes in the late twentieth century served as little more than legitimating rhetorics for ongoing research programs. Foundational disputes over the logic of the social sciences, theoretical paradigms emphasizing conflict versus order, agency versus structure, structure versus culture, and macro versus meso and micro levels of analysis, all amounted to ‘discursive clamoring’ in local skirmishes, a ‘virtual babble’ of theoretical vocabularies that has failed to produce a ‘centered, evolving vital theoretical tradition’ (Seidman 1991: 133). Faced with ‘metatheoretical proliferation’ but little conceptual order or progress, the thing for sociologists to do was to engage in public sociology without any further appeals to ‘abstract or formal reason’ (Seidman 1991: 134).

Another form of sociological self-critique concerns theoretical dis-integration. From a Mertonian perspective, the most fundamental crisis in sociology today is one of theoretical disarray and research overproduction. Sociology does not have a theoretical core but is rather a multiple-paradigm field in which research is conducted within highly specialized subfields. There is little communication across subfields of the discipline because there is no agreed-upon theoretical language that would allow research to cumulate. Efforts in the 1990s to establish a postmodern sociology failed, and feelings about the state of sociology as a discipline were generally mixed (Calhoun 1995). In postmodernism’s wake new efforts were undertaken to transcend sociology’s various theoretical dualisms of structure versus agency, culture versus structure, and micro versus macro. As we will see in Chapter 6, perhaps the most influential integrative approach to sociology was developed by Pierre Bourdieu, who was sharply critical of sociological movements associated with postmodernism and Cultural Studies.
Sociological overflow?

Internal and external critiques of sociology from Lazarsfeldian, Millsian and Mertonian perspectives might all be missing the big sociological picture. The sociology crisis that matters most may be that there is simply too much of it; sociology may be a victim of its own success not because sociology has been more successful than other academic fields, but simply because the entire global system of science and higher education has expanded extraordinarily quickly (Frank & Meyer 2007; Meyer, Ramirez, Frank & Schofer 2007).
In an analysis of citation patterns across 251 fields (over one billion citations among 57 million papers over 54 years) covered by the Web of Science data set, Chu and Evans (2018) have traced some of the consequences of the sheer size of the scientific community for scientific innovation. They find that as scientific fields grow in terms of numbers of researchers and publication outlets, papers effectively crowd each other out, stifling innovation and theoretical progress. A large number of papers published within one field ‘does not lead to rapid turnover of central ideas but rather the ossification of canon.’

A straightforward view of scientific progress would seem to imply that more research is always better: more papers published implies a greater rate of scientific progress; the more researchers working in a field, the more ground that can be covered; and with more papers, the probability at least one of them contains an important innovation increases. This appears not to be the case.

Researchers in sociology and other fields are evaluated and rewarded based on productivity, measured in terms of publication of a large number of articles within a set period of time. At university and national levels, research quantity is measured and national and international rankings determined by total numbers of publications, patents, degrees awarded, scientists and expenditures. When assessed in addition to quantity, quality is evaluated primarily with citation counts of individuals, teams and journals.

Chu and Evans argue that when the quantity of papers grows very large, the sheer size of the body of knowledge in the field can limit the potential impact of each new publication. If the publication rate of novel papers is too fast, no new paper can rise into canon through localized processes of diffusion. They show that when a field has many new publications in a year, ‘the list of most-cited papers will change little year to year, new papers will be more likely to cite the most-cited papers rather than less-cited papers, the probability a new paper eventually becomes canon will be small, and localized diffusion and preferential attachment will not explain the rise of a new paper into the ranks of the most-cited.’

This finding suggests troubling implications for contemporary sociology. If too many papers are published in short order, new ideas cannot be carefully considered against old, and processes of cumulative advantage cannot work to select valuable innovations. The result is a status quo bias in sociology and other fields of knowledge production.
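The dynamic Chu and Evans describe, in which rapid publication combines with preferential attachment to already highly cited work, can be illustrated with a toy simulation. The Python sketch below is not their model; it is a minimal, hypothetical illustration (all parameter names and values are assumptions) that lets a reader compare year-to-year turnover in the most-cited list for a small field and a much larger one.

import random
from collections import Counter

def simulate_field(new_papers_per_year=200, years=30, refs_per_paper=5,
                   pref_weight=0.9, seed=42):
    """Toy citation model: each new paper cites earlier papers, mostly in
    proportion to citations already received (preferential attachment),
    occasionally uniformly at random. Returns the yearly top-10 lists."""
    random.seed(seed)
    papers = []        # ids of all published papers
    cited_pool = []    # one entry per citation received, for O(1) weighted sampling
    citations = Counter()
    yearly_top10 = []
    for _ in range(years):
        for _ in range(new_papers_per_year):
            pid = len(papers)
            if papers:
                for _ in range(refs_per_paper):
                    if cited_pool and random.random() < pref_weight:
                        target = random.choice(cited_pool)   # proportional to citation count
                    else:
                        target = random.choice(papers)       # uniform over all earlier papers
                    citations[target] += 1
                    cited_pool.append(target)
            papers.append(pid)
        yearly_top10.append([p for p, _ in citations.most_common(10)])
    return yearly_top10

def mean_turnover(top_lists):
    """Average share of the top-10 list replaced from one year to the next."""
    diffs = [len(set(curr) - set(prev)) / 10
             for prev, curr in zip(top_lists, top_lists[1:])]
    return sum(diffs) / len(diffs)

if __name__ == "__main__":
    for rate in (50, 2000):   # a small field versus a very large one
        tops = simulate_field(new_papers_per_year=rate)
        print(f"papers/year = {rate}: mean yearly top-10 turnover = {mean_turnover(tops):.2f}")

Under parameter settings like these one would expect the larger field's most-cited list to change less from year to year, echoing the ossification-of-canon pattern the authors describe; readers can vary the assumed rates and weights to probe how robust that pattern is.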
Theory production in audit cultures

To attempt to understand the forms and functions of sociological theory in the digital age, I follow Chu and Evans (2018), Holmwood (2009) and Savage and Burrows (2007) in situating sociological theory within sociology, and sociology within national higher education systems that model themselves on for-profit corporations more than ever before (Bailey 2011; Palfreyman & Tapper 2014; Troschitz 2017), a development that Mills detected in his writings on the bureaucratic ethos. This has been accompanied by the emergence of new mechanisms for governing or auditing research (see Burrows 2012) that encourage routinized forms of writing and research that can be easily assessed and quantified. As audit cultures for
classifying, quantifying and ranking ‘research outputs’ have come to dominate life within the contemporary academy, the result is ‘institutionalized cowardice’ (Fuller 2009: 86) rather than novel or disruptive research.

Gane and Back (2012) remind us that the current pressure on sociologists not only to publish, but to publish the right type of work in the right places, is not exactly new. In the 1950s C. Wright Mills’ teacher, friend and co-author Hans Gerth ‘was desperate for publications’ as these, increasingly, were ‘the only coin of the American academic realm’ (Jacoby 2000: 155). But these pressures pose a special kind of danger to sociology which, as an ‘exporter discipline’ providing concepts and research tools to neighboring, generally more applied social science disciplines, may be in danger of losing its own distinct identity (Holmwood 2010). As an exporter discipline it may be more important now than ever before that sociologists produce theoretical and methodological innovations that are valued by members of research communities beyond the institutional boundaries of academic sociology.

To conclude this section and this chapter, this book emphatically is not a crisis book, but is rather a metatheoretical exercise. There is in sociological theory today a need for some housekeeping: theory is increasingly central to the discipline, yet it has become increasingly difficult to identify the most important innovations in sociological theory. Thus this seems like the right time to ask some pointed questions about the value of sociological theory now and into the future, and to critically evaluate the answers that have emerged over the past two decades to the question of its value for digital-age research. Before we review the most innovative responses by sociological theorists to the digital revolution, we turn to how this revolution has transformed the field of sociology and the social sciences more generally. Understanding how the practice of empirical sociological research has adapted to the digital age provides needed context to allow for an evaluation of the various ways sociological theory has adapted to the same.
Note

1 My thanks to an anonymous journal reviewer for pointing out how contemporary sociological research communities are organized around more-or-less mid-range sociological theories.
2 THE SOCIOLOGY OF THE DIGITAL
Introduction

To understand the value of theory for sociological research in the information age we need to distinguish between two of the main sociological responses to the challenges and opportunities presented by the internet, social media, big data and computational research methods. These are, first, the sociology of the digital, which refers to the use of established sociological theories and methods to study social phenomena influenced by information communication technologies. Second is digital and computational sociology, which both refer to the use of computational technologies to develop new sociological research methodologies or to substantially revise established methodologies. Digital and computational sociological methods include various forms of data mining, simulations, online interviews, online surveys and online ethnographic methods.

The purpose of this chapter is to survey how sociological theoretical concepts are being used in the sociology of the digital. In Chapter 3 I turn to digital and computational sociology and the sociological analysis of big data. Both the sociology of the digital and digital/computational sociology are large and fast-moving research areas, so the most that can be accomplished here is to take a snapshot from which a few provisional conclusions can be drawn. As we will see, some of the theories that feature in empirical research in these areas include social network theory, theories of gender, critical theory, Bourdieusian theory and globalization theory.

Some of the earliest contributions to theoretical understanding of the social impact of information communication technologies did not emerge from sociology but rather from the economics and management fields. The American economist Fritz Machlup’s 1962 study The Production and Distribution of Knowledge in the United States introduced the concept of the ‘knowledge industry’ and argued that as early as the 1950s a large proportion of the Gross National Product of the United States was based in knowledge-intensive sectors such as education, research and development, mass media,
information technologies, and information services. Around the time of Machlup’s study, the management consultant and writer Peter Drucker was beginning to argue that modern societies were transitioning from economies based mainly on material goods to ones based mostly on knowledge (e.g. Drucker 1968). Around the same time the sociologist Alain Touraine published The Post-Industrial Society (1971), an analysis of class and cultural conflicts in the ‘programmed society’ rooted in Marxian historical materialism.

At the Columbia University sociology department there was in the postwar decades extensive discussion and empirical research on social and cultural aspects of the transition to post-industrialization. This research, generally not Marxian in orientation (Clark 2005), had perhaps its greatest impact through Daniel Bell’s 1973 classic The Coming of Post-Industrial Society. Bell discussed how the post-industrial information society had evolved out of industrial capitalism as a form of capitalist society that was based economically on services rather than industrial production. In post-industrial societies ‘a majority of the labor force is no longer engaged in agriculture or manufacturing but in services, which are defined, residually, as trade, finance, transport, health, recreation, research, education, and government’ (Bell 1976: 15). Bell described the social patterns associated with computer technology entering homes and offices in economically and technologically advanced nations. With post-industrial society came a new class of ‘symbolic analysts’ (Bell 1976; Reich 2010) who were able to implement and take advantage of these new information technologies in many settings: within capitalist firms, in government, and at all levels of the education system.

Post-industrial capitalism evolved in dramatic new ways in the late 1990s as companies’ investments in information technology began to contribute to productivity increases on a grand scale (Jorgenson 2001). By the 1990s the use of digital information technologies was no longer limited to an occupational caste of knowledge workers or to the high technology sector; rather, information technology had infiltrated virtually all industries and government sectors. One of the most prominent theorists of this developmental phase of capitalism was the sociologist Manuel Castells, who argued that information technology had led to a new ‘network logic’ of social organization, the culmination of a historical trend in which major functions and processes of advanced nations are increasingly organized around networks. Networks ‘constitute the new social morphology of our societies, and the diffusion of networking logic substantially modifies the operation and outcomes in processes of production, experience, power, and culture’ (Castells 2000: 500).

The information revolution has increased demand for knowledge workers and led to a brain drain of educated workers from developing countries to developed countries where they can make far higher salaries. Migration also allows these workers to stay up to date in terms of research and technological development. Sociologists and geographers have shown how the geographies of metropolitan areas in advanced industrialized nations have been reconfigured by skilled migration.
In global cities such as New York, London and Hong Kong there is a geography of centrality and marginality in which skilled international knowledge workers cluster in gentrified neighborhoods in city centers, raising prices in central areas and pushing working-class residents out to peripheral areas (Sassen 2001).
Digital inequality

Sociological research on the social consequences of digital information technologies took off in the 1990s as analysts recognized that the transition to knowing capitalism was associated not only with accelerating economic growth based on increases in worker productivity but also with increasing socioeconomic inequality. ‘Digital divides’ were seen to exist within and between nations based on new patterns of inequality in access to and use of knowledge. The information technology revolution proved to be skill-biased, rewarding those whose education and cognitive skills best positioned them to take advantage of technological change. Knowing capitalism has led to a sharp decline in the demand for less skilled workers in advanced industrialized nations and a concomitant weakening of unions and collective bargaining. By virtually all measures, income inequality in industrialized nations has risen sharply since the end of the era of industrial capitalism in the 1970s, resulting in a hollowing out of the middle classes of both advanced industrialized and developing nations. Rather than working in unionized factories, middle-class workers are increasingly resigned to service-sector work that offers few benefits or opportunities for advancement.

Sociologists have paid close attention to the myriad social consequences of the transition to post-industrial society. In 1979 Gouldner discussed the ‘new class’ of knowledge workers that had emerged in late capitalism, theorizing that this class was a product of the de-sacralization of authority, the secularization of educational systems, the opening of educational systems in terms of social class, the decline of the extended patriarchal family system and rise of the nuclear family, and new markets for knowledge workers’ skills. Empirical research and theorizing on the new class continued into the 1980s (e.g. Brint 1984; Szelenyi & Martin 1988). In the 1990s Esping-Andersen (1999) analyzed the political economy of the welfare state in relation to the shift to post-industrial society, while the political scientist Ronald Inglehart has continued the tradition of new class attitude research, based on a series of multinational surveys (Inglehart 2018). Bourdieu analyzed social changes in France associated with post-industrialization, globalization and Americanization, including changes in elites’ lifestyles, for example the French adoption of ‘California’ leisure activities such as windsurfing (Lane 2000). Bourdieu’s analysis of culturally exclusive bourgeois consumption patterns has been challenged and extended by sociological analyses of cultural omnivorousness as a characteristic of globalized post-industrial societies (Peterson & Kern 1996; Lizardo & Skiles 2009, 2012, 2015).

As socioeconomic inequality steadily increased from the 1970s onward, sociologists turned their attention to inequalities caused or reinforced by digital technologies. In the United States, sociological research on digital inequality initially focused on unequal access to broadband internet service. Then in the early 2000s, as internet access expanded, sociological attention turned to differential use of internet technologies, including consumption and production of internet content. A generation of sociologists have adapted concepts developed by Bourdieu, including especially cultural capital and habitus, to analyze the social distributions of different forms of internet use and digital skills such as blogging (see Chapter 6).
For instance, Hargittai has used Bourdieusian and other sociological theoretical concepts in a series of explorations of status (Hargittai 2002, 2010) and gender differences (Hargittai & Shafer 2006) in internet use. Robinson has developed the concept of ‘digital habitus’ as a tool for analysing class differences in perceptions of the utility of internet use (Robinson 2009; Robinson & Schulz 2013), while Schradie (2011, 2012) has focused on class differences in digital production, specifically in blogging.
Social networks and relationships

Manuel Castells’ social theoretical analysis of social networks in the information age (Castells 2000) has given way to numerous empirical sociological research programs on digitally mediated social networks. These research programs draw variously on Bourdieu’s and Putnam’s alternative theorizations of social capital (Julien 2015; Siisiainen 2003), and often analyse data from Facebook, Twitter and other social media platforms. Far too many sociological studies use social network theory to analyse data from social media platforms and other sources to survey them all here. A few of the more theoretically innovative projects in this area are Murthy’s (2012) development of a sociological theorization of Twitter that draws on Goffman, and Hofer and Aubert’s (2013) analysis of following behavior and perceived bridging and bonding social capital on Twitter. Other studies have paid attention to practices of impression management, micro-celebrity and personal branding (Hargittai & Litt 2011; Jackson & Lilleker 2011; Marwick & Boyd 2011) and to participatory democracy and political mobilization (Grant, Moon & Busby Grant 2010; Larsson & Moe 2012; Segerberg & Bennett 2011; Tufekci & Wilson 2012).
Politics

There are many political dimensions of the transition to post-industrial information societies. In the 1990s Strange (1996) focused on the potential of global financial flows to overwhelm the ability of many states to manage their own economies, while Castells (2000) has focused on a transition in advanced capitalist countries from ‘party politics’ to ‘informational politics’ in which traditional political parties’ monopoly over political organization and citizens’ political identities has dissolved. Many countries have harnessed information technology in e-government strategies that attempt to make government services more accessible to citizens, or at least to citizens who have internet access, and many national governments make strategic investments in education systems and in public institutions that make internet access and other information services available to the public at low or no cost (Castells & Himanen 2002).

At the level of national electoral politics, knowing capitalism has been accompanied by rising political polarization (DiMaggio, Evans & Bryson 1996). Without question, the internet has driven dramatic changes in news media industries, including incentivization of partisan outrage as advertising revenue has replaced subscription revenue (Sobieraj & Berry 2011). The internet has also empowered previously marginal groups, allowing such groups to influence public discourse on
immigration (Bail 2012; Ignatow & Williams 2011) and other topics. Zhuravskaya, Petrova and Enikolopov (2019) provide a useful review of empirical studies of the effects of the internet in general, and social media in particular, on voting, protests, attitudes toward government, polarization, xenophobia, politicians’ behavior, false news, and strategies used by autocratic regimes to censor the internet and use social media for surveillance and propaganda.
The sociology of algorithms and search

The sociology of algorithms is a relatively new area of empirical research. Software algorithms, including algorithms for information search and algorithms used in sales, service, social media and marketing platforms, are immensely important for all types of social interactions and outcomes. They influence media content consumption and relationship formation. But because algorithms are proprietary information owned by corporations and inaccessible to users and researchers alike, they can be challenging to investigate. Kitchin (2017) suggests that there are three main obstacles to investigating algorithms. The first is gaining access to their formulation. The second is that they are heterogeneous and embedded in wider systems. And the third is that their effects unfold contextually and contingently and are therefore difficult to trace. Despite these obstacles, recent empirical sociological studies demonstrate that while algorithms in commercial software cannot be directly observed by researchers, they are not a ‘black box’ either, as there are now tools available for analysing their informational and social dynamics.

Gillespie (2017, 2018) advocates for analysis of social media platforms themselves rather than studying social dynamics occurring on social media platforms without regard for algorithms, content moderation, and other technologies and policies that shape online interactions. Social media platforms ‘don’t just guide, distort, and facilitate social activity—they also delete some of it. They don’t just link users together; they also suspend them. They don’t just circulate our images and posts, they also algorithmically promote some over others’ (Gillespie 2015: 1). Gillespie illustrates the power of the interplay between algorithmic systems that grant visibility and certify meaning and the actors trying to be seen by them with an empirical analysis of attempts to redefine the name of US Senator Rick Santorum, a tactical intervention that topped Google’s search results for nearly a decade and then mysteriously dropped during the 2012 Republican nominations (Gillespie 2017).

Bishop (2018) has analysed how YouTube’s algorithm for video visibility favors middle-class vloggers whose content aligns with advertisers’ demands and needs. She finds that the YouTube algorithm privileges and rewards feminized content and contributes to gender polarization via its promotion of hegemonic, feminized videos created by beauty vloggers with significant social and cultural capital. In a 2019 study Bishop analyses beauty vloggers’ strategic management of algorithmic visibility based on her ethnography of both beauty vloggers and industry stakeholders. Her ethnography focused on ‘algorithmic gossip,’ which she defines as collaborative processes used to formulate and sustain algorithmic expertise. Algorithmic gossip is productive as it informs and supports practices such as uploading frequently and producing feminized beauty content that performs effectively on YouTube.
Influenced by Foucault’s writings on panopticism, Bucher (2012) analyses EdgeRank, the algorithm that structures the flow of information and communication on Facebook’s news feed, to explore how Facebook algorithms determine news item visibility. She argues that Facebook’s regime of visibility imposes a perceived ‘threat of invisibility’ on the part of participatory subjects. Bucher (2017) also explores the situations and spaces where people and algorithms meet in ordinary life by examining people’s personal stories about the Facebook algorithm through their tweets and interviews with ordinary users. She develops the notion of an ‘algorithmic imaginary’: ways of thinking about what algorithms are and how they function that play a generative role in shaping the Facebook algorithm itself.

Rieder (2012) has provided an historical analysis of Google’s PageRank algorithm within ‘a larger genealogy of ideas, concepts, theories, and methods that developed, from the 1930s onwards, around the fields of sociometry, citation analysis, social exchange theory, and hypertext navigation.’ Rieder advocates a multilayered approach that combines different types of methodological and conceptual resources. Rieder, Matamoros-Fernández and Coromina (2018) propose to study algorithms’ social power by investigating broader forms of agency involved in their creation. They examine YouTube’s search results ranking over time in the context of seven sociocultural issues. Through a combination of rank visualizations, computational change metrics and qualitative analysis, they study search ranking as the distributed accomplishment of ‘ranking cultures,’ identifying three forms of ordering over time: stable, ‘newsy’ and mixed rank morphologies.

Mager (2012) draws on the tradition of the social construction of technology and 17 qualitative expert interviews to theorize how search engines and their revenue models are negotiated and stabilized in a network of actors and interests including website providers and users. She shows how corporate search engines and their capitalist ideology are solidified in a sociopolitical context characterized by a techno-euphoric climate of innovation and a politics of privatization. Mager conceptualizes ‘algorithmic ideology’ as a valuable tool to understand and critique corporate search engines in the context of wider sociopolitical developments. Drawing on critical theory (Chapter 4) she shows how capitalist value-systems manifest in search technology, how they spread through algorithmic logics, and how they are stabilized in society.
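To give a concrete sense of the ‘computational change metrics’ that studies of ranking cultures such as Rieder, Matamoros-Fernández and Coromina’s rely on, the sketch below computes Kendall’s tau between the ranked results returned on two consecutive days. This is an illustrative metric only, not necessarily the one their study employs, and the result lists are invented; values near 1 indicate a stable rank morphology, while values near 0 suggest a ‘newsy’ one.

    # Minimal sketch of a rank-change metric for search results collected over time.
    # The result lists are invented for illustration; a real study would collect
    # daily rankings from the platform being investigated.
    from scipy.stats import kendalltau

    day1 = ["video_a", "video_b", "video_c", "video_d", "video_e"]
    day2 = ["video_a", "video_c", "video_b", "video_e", "video_d"]

    # Compare only items that appear on both days, using their rank positions.
    shared = [v for v in day1 if v in day2]
    ranks_day1 = [day1.index(v) for v in shared]
    ranks_day2 = [day2.index(v) for v in shared]

    tau, p_value = kendalltau(ranks_day1, ranks_day2)
    print(f"Kendall's tau between the two days' rankings: {tau:.2f}")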
Conclusions

What is the role of theory in the sociology of the digital? While I cannot claim to have presented a comprehensive review of sociological research on the social uses of contemporary information communication technologies in this chapter, I do think there are trends in the use of theory in sociological analyses of digital inequalities and digitally mediated social networks and politics as well as in the sociology of algorithms. In these areas there has been, first, a shift from early grand theorizing (most prominently from Castells but also from Bell and others) to the
use of middle-range theorizing and empirical analysis. Second, middle-range studies in all these areas use eclectic theoretical resources, including theories of class, gender, social networks and globalization as well as social constructionism and critical theory. While probably necessary given the variety of topics analysed in these literatures, the shift from a few grand theories to numerous middle-range ones has perhaps come at the expense of knowledge cumulation.
3 COMPUTATIONAL SOCIOLOGY AND BIG DATA ANALYSIS
Introduction

In the early 2000s big data advocates considered the possibility that the information revolution might make sociological theory irrelevant; instead of designing studies that involved testing established theories using carefully sampled data, perhaps researchers could simply analyse large data sets inductively, allowing correlations and trends to reveal themselves (Anderson 2008). This scenario was never realized, in part because any data set short of a perfect collection of factors leaves open the possibility of unidentified confounding variables. Because spurious relationships are almost always found in large data sets (McFarland & McFarland 2015), some reasoning about relationships and theoretical interpretation is a practical necessity in big data research. For these and other reasons there is near consensus among sociologists that theory remains critically important for empirical research. Theory has always provided sociology with a shared language through which researchers working in disparate subfields using disparate methodologies could communicate their findings to each other so as to allow for knowledge cumulation. And as we will see, contemporary computational research methodologies and the advent of big data may make theory more valuable than ever.

The focus of this chapter is recent discussions within sociology on the role of theory in digital sociology, in computational sociology, and in sociological analysis of big data. I use the term ‘digital sociology’ to refer to the use of digital technologies within sociological methodology. Digital sociological methods include online surveys (Stern, Bilgen & Dillman 2014), virtual ethnography (Hine 2000), internet search methods (Ignatow & Williams 2011) and text mining methods such as topic models and opinion mining (Ignatow & Mihalcea 2016, 2017). Computational sociology, as a subtype of digital sociology, represents a more radical departure from ‘mainstream sociology,’ if such a thing can be said to exist (Calhoun & van Antwerpen 2007). Computational sociology is a branch of computational social
science that involves quantitative modeling of new kinds of digital trace data as well as computer simulations of social phenomena (Lazer et al. 2009). Computational social science is performed on a large scale at companies like Facebook, Google and Yahoo, and generally on a smaller scale in academic settings. Computational sociology almost always involves analysis of so-called big data, which is generally defined as data sets characterized by heterogeneity, noise and sheer size (Lim 2015). Big data also refers to the use of inductive statistical methods to analyse data sets having these characteristics. Methods for big data analysis emerged from the natural sciences and have been applied in e-commerce, the video game industry, and by now virtually all areas of business activity, as well as in the social sciences.

In business and the social sciences big data often includes ‘trace data,’ which are data sets made up of digital traces of individual and social behavior, such as actions taken by individuals on social media and e-commerce platforms. Big data is often textual, containing news stories and users’ comments, conversations and product reviews on social media, streaming video, news and e-commerce platforms. While many sources of textual data are in the public domain, some require access through a university subscription. For example, sources of news data include the websites of local and regional news outlets as well as private databases such as EBSCO, Factiva and LexisNexis. These sites provide access to tens of thousands of global news sources including blogs, television and radio transcripts, and traditional print news. Social media platforms are a major source of trace data. Many platforms provide their own Application Programming Interfaces (APIs) for programmatic access to their data; for instance, Twitter APIs allow access to a small set of random tweets every day and to larger keyword-based collections of tweets. Blogs can also be accessed through APIs; for instance, the Google Blogger platform offers programmatic access to blogs and bloggers’ profiles, and includes a rich set of fields covering location, gender, age, industry, and favorite books and movies.

Big data from social media sites, news sites, digital archives and other sources has become entrenched in the formal institutions that support social science research. Public funding agencies and private foundations have taken up the banner of big data, and dozens of universities have promoted programs or tracks in either big data or data science (Halavais 2015).

A number of sociologists have made important programmatic statements about how sociology might position itself with regard to big data. Neuhaus and Webmoor (2012) discuss data mining in the context of analysis of social media platforms, and introduce the phrase ‘massified research’ to denote the changes of scale that accompany research on social media platform data. Massified research poses challenges with respect to established ethical protocols, operating as it does in a grey area of undefined conduct. Kitchin (2014) argues that there is a need for critical sociological reflection on the epistemological implications of the big data revolution and for sociological big data analysis that is ‘situated, reflexive and contextually nuanced’ (p. 1) rather than analysis that uncritically accepts hyperbolic claims made on behalf of big data. He also suggests that it is necessary to consider social and ethical implications such an epistemological position may
entail. Other sociologists have likewise expressed concern that a naive positivist concept of data as objective and transparent information reflecting social reality will be adopted by researchers working with big data (Halavais 2015). Tinati, Halford, Carr and Pope (2014) suggest that the promise of big data for sociology has not been realized because sociologists have restricted themselves methodologically. They outline a set of methodological principles for approaching these data that stand in contrast to previous research, and introduce a new tool for harvesting and analysing Twitter built on these principles, working their argument through an analysis of Twitter data linked to political protest over UK university fees. Others have suggested that big data will entail a positive epistemological change among social scientists if they embrace values of research transparency and ‘kludginess,’ as is the case in computer science and engineering (Karpf 2012). And as was noted in Chapter 1, as a consequence of big data’s potential to contribute to the proliferation of social research conducted by non-sociologists, a number of sociologists have expressed concern about sociology’s declining academic leadership in the big data field (Burrows & Savage 2014). DiMaggio (2015), however, foresees a peaceful convergence between sociological and non-sociological epistemic cultures involved in social research with big data.

Clearly there is a spectrum of sociological views on big data, and by extension on the prospects for computational and digital sociology, from cautious to enthusiastic. The question of the implications of big data for sociology and other social sciences is so pressing that the journal Big Data & Society was started in 2014 to provide a forum in which researchers could work out how to revise their theories, methods and research strategies to best take advantage of new large social data sources. The journals Information, Communication & Society and New Media & Society cover similar ground. In this chapter I will survey position papers and discussions published in these journals and elsewhere and consider possible implications of these for this book’s overarching question of the value of sociological theory for digital age sociology.

The structure of this chapter is based on Mohr’s (2015, 2016) outline of three possible functions of sociological theory in relation to big data analysis. Mohr suggests that the big data revolution has the potential to produce accessible information richer than any other type of data sociologists have ever had to work with, in part because, unlike interview or survey data, big data is often trace data produced, unprompted, within the flow of daily life. Such data is often geo- and time-stamped and contains relational signatures and audio and visual content. Mohr’s three possible functions of sociological theory in relation to big data include theory’s function as a navigational tool, as a tool for hermeneutic analysis, and as a tool for reflexive and critical analysis of research methods themselves.
Theory as a navigational tool

Mohr’s argument for the use of theory as a navigational tool for big data analysis resembles arguments made for Forensic Social Science by McFarland, Lewis and Goldberg that are reviewed in Chapter 5. The basic idea for both is that a measurement
paradigm predicated on data scarcity came to the fore in sociology in the decades following the Second World War, and that this paradigm is fundamentally incompatible with big data analysis. After the war, survey methods emerged as the preeminent means of collecting data for sociological research:

Many of the relevant technologies, organizational forms, and intellectual developments of survey techniques had already been pioneered and refined in the decades before the war, but WWII was an important catalyst. Scholars from multiple disciplines were brought together in large, well resourced, practical project teams. For example, the U.S. War Department surveyed over a half million active duty American soldiers about their experiences in combat, unit morale, racial prejudice, and many more topics … Such efforts led to an accelerated articulation between the theory and the method of survey analysis, as well as an elevation in the legitimacy of the work.
(Mohr 2015)

Lazarsfeld (see Chapters 1 and 7) and other sociologists developed these methods and promoted their institutionalization in American and global sociology (Converse 1987; Mohr & Rawlings 2010, 2015; Platt 1996). Applied within survey analysis, the data scarcity measurement paradigm implied scientific investigation that begins from a statistically controlled sample of respondents answering survey items designed to measure subjective and objective characteristics of the respondents. Item responses are statistically extrapolated to indicate how such characteristics are distributed, interrelated and perhaps causally connected within the larger population from which the respondents were drawn. A major advantage of this measurement paradigm was that it allowed for efficient leveraging of scarce information to learn about a large population that could not feasibly be directly measured. Sociological subfields including medical sociology and research on social stratification, social mobility and public opinion continue to operate within the data scarcity paradigm. And areas of sociological specialization including probability theory, sampling theory, survey design and scale analysis have all developed in response to problems associated with this paradigm. Within this paradigm the central empirical sociological project of survey analysis continues to be updated, for instance by Vaisey (2009), whose ‘dual-process’ model seeks to distinguish between two levels of thought processes in survey answers, and by Goldberg (2011), whose Relational Class Analysis (RCA) is a method for analysing the relations among survey item responses rather than the responses themselves.

Mohr, Wagner-Pacifici and Breiger (2015) discuss how social science content analysis is also based on the data scarcity measurement paradigm. As was the case for survey analysis, the Second World War was critical for the development of content analysis methods. During the war the United States Experimental Division for the Study of Wartime Communications team led by Harold Lasswell developed a set of formal procedures for systematically extracting essential information from a collection of documents. The document collections yielded by these procedures
could be used to map the distribution of information across the larger textual space from which the sample had been drawn. Lasswell’s team used human coders to read foreign newspapers to gather war intelligence using these procedures, and after the war Lasswell worked to help refine and institutionalize the wartime methodologies into a research program of content analysis (Lasswell & Leites 1949; Lasswell, Lerner & Pool 1952) that was subsequently extended by Krippendorff (2013) and many others. The underlying measurement framework defining both survey analysis and content analysis is the careful and calculated leveraging of scarce information. This paradigm is proving itself inadequate for organizing sociological research based on big data. Some problems of applying quantitative practices predicated on data scarcity to the world of big data are simply practical: sample sizes quickly become so large that traditional statistical measures of significance are rendered useless because practically all relationships become significant (McFarland & McFarland 2015). There are also systematic distortions of the social world that are embedded in big data formats (Adams & Brückner 2015; Diesner 2015; Lewis 2015; Shaw 2015).
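The point about significance testing can be made concrete with a toy simulation (an illustration only, not a reanalysis of any of the studies cited above): with a million observations, a correlation far too weak to be substantively interesting is nonetheless ‘significant’ by conventional standards.

    # Toy simulation: at big-data sample sizes, even trivially small correlations
    # pass conventional significance tests.
    import numpy as np
    from scipy.stats import pearsonr

    rng = np.random.default_rng(42)
    n = 1_000_000
    x = rng.normal(size=n)
    y = 0.005 * x + rng.normal(size=n)   # an almost-zero 'effect'

    r, p = pearsonr(x, y)
    print(f"r = {r:.4f}, p = {p:.3g}")   # r is tiny, yet p falls far below 0.05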
Theory and hermeneutics

A second function served by sociological theory in relation to big data is to guide analysis of complex processes of human interpretation. Such interpretive processes are traditionally performed by researchers in the humanities trained to recognize subtleties of expression and phrasing and nuances of meaning. Mohr, Wagner-Pacifici and Breiger (2015) refer to theory’s role in computational hermeneutics, which they define as the use of algorithmic and computational text mining tools to analyze large collections of texts in their ‘full hermeneutic complexity and nuance’ (p. 3). In contrast to survey data, big data can ‘provide us with articulated access to complex levels and systems of meanings … big data can allow us to strategically examine different types and forms of meanings, from simple sentiments to complex thoughts, from immediate reactions to deliberative reflections’ (Mohr 2016). Such hermeneutic examination requires theories of culture, language, signs and symbols.

Computational hermeneutics à la Mohr, Wagner-Pacifici and Breiger (2015) is predicated on the availability of text mining methodologies. Text mining refers to the range of computational tools available for analysing large sets of textual data, although quantitative textual analysis predates computer technology, and for that matter, sociology. As early as the late seventeenth century Europeans used text analysis in inquisitorial church studies of newspapers. Text analysis was performed in Sweden in the eighteenth century when the Swedish state church analysed the symbology and ideological content of popular hymns that appeared to challenge church orthodoxy (Krippendorff 2013: 10–11). Systematic quantitative analysis of newspapers was performed in the late 1800s and early 1900s by researchers such as Speed (1893), who showed that New York newspapers had decreased their coverage of literary, scientific and religious matters in favor of sports, gossip and scandals.
The field of text analysis expanded rapidly in the twentieth century as researchers in the social sciences and humanities developed new techniques for analysing texts, including methods that relied on subjective human interpretation as well as, and often in combination with, formal statistical methods. In the late twentieth century social researchers and statisticians developed formal content analysis methods that brought rigor and precision to human interpretations of texts. Social scientists continue to develop a wide variety of computer-assisted text analysis tools, including thematic analysis, metaphor analysis and narrative analysis. Whereas content analysis and text analysis methodologies developed slowly, often in isolation from one another, these fields are today being upended by innovations from the field of Artificial Intelligence (AI) related to machine learning and natural language processing (NLP). Over the past few years text mining has started to catch on in academic fields as diverse as anthropology, communications, economics, education, political science, psychology and sociology (see Ignatow & Mihalcea 2016, 2017).

In contemporary sociological research text mining involves the collection and analysis of textual data for the purpose of making inferences about the communities, groups and individuals who are the source of the data. Text mining is a relatively new interdisciplinary field that incorporates tools developed by computational linguists and statisticians. It includes several main elements: tools for acquiring digital text data; data cleaning and preparation tools; and analysis methods such as topic models and opinion mining.

Topic models involve automated procedures for coding collections of texts in terms of meaningful categories that represent the main topics being discussed in the texts. Topic models assume that meanings are relational, and that the meanings associated with a topic of conversation can be understood as a set of word clusters. Topic models treat texts as what linguists call a ‘bag of words,’ capturing word co-occurrences regardless of syntax, narrative or location within a text. Topics can be thought of as clusters of words that tend to come up in a discussion, and therefore to co-occur more frequently than they otherwise would, whenever the topic is being discussed. Each topic is a distribution over all observed words in the texts such that words that are strongly associated with the text’s dominant topics have a higher chance of being included within the text’s bag of words. Based on these distributions, authorship is conceptualized as an author repeatedly picking a topic and then a word and placing them in the bag until the document is complete. The objective of topic modeling is to find the parameters of the statistical process that generated the final text or text collection.

Topic models are frequently used in the field of digital humanities. A surge of interest in topic models among humanities scholars began in 2010 with widely circulated blog posts by Matthew L. Jockers on topic modeling and Cameron Blevins on a late eighteenth-century diary. Then at an Institutes for Advanced Topics in the Digital Humanities conference in Los Angeles in 2010 several advocates of topic models, including David Mimno, David Blei and David Smith, introduced the method to many humanities scholars for the first time.
Since that conference, humanities scholars have used topic models in studies of themes in nineteenth-century literature (Jockers & Mimno 2013), the history of literary scholarship (Goldstone & Underwood 2012) and many other historical and literary topics.
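For readers who have not encountered the procedure in practice, the following is a minimal sketch of fitting a topic model with scikit-learn’s LatentDirichletAllocation; the four ‘documents’ are invented for illustration, and published studies work with far larger corpora and much more careful preprocessing.

    # Minimal topic-modeling sketch using scikit-learn (toy documents only).
    from sklearn.feature_extraction.text import CountVectorizer
    from sklearn.decomposition import LatentDirichletAllocation

    docs = [
        "the gallery opened a new exhibition of modern painting",
        "critics praised the museum's painting and sculpture collection",
        "the team won the championship game in overtime",
        "fans celebrated the team's victory in the final game",
    ]

    # Bag-of-words representation: word counts, ignoring syntax and word order.
    vectorizer = CountVectorizer(stop_words="english")
    counts = vectorizer.fit_transform(docs)

    # Fit a two-topic model; each topic is a distribution over the vocabulary.
    lda = LatentDirichletAllocation(n_components=2, random_state=0)
    doc_topics = lda.fit_transform(counts)

    # Print the highest-weighted words for each topic.
    vocab = vectorizer.get_feature_names_out()
    for k, word_weights in enumerate(lda.components_):
        top_words = [vocab[i] for i in word_weights.argsort()[-5:][::-1]]
        print(f"topic {k}: {', '.join(top_words)}")

With a corpus this small the output is of course trivial, but the same steps scale to the newspaper and archival corpora discussed below.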
Mützel (2015) used topic modeling in empirical research projects on the field of breast cancer therapeutics and the gastronomic field of Berlin. In Mützel’s studies, results from the procedure allowed her to describe and trace developments over long periods of time. Her results also allowed her to zoom in on particular moments in time to conduct qualitative analysis of her texts.

Sociologists have also used topic models for analysis of historical data from newspaper and scholarly archives. For example, DiMaggio, Nag and Blei (2013) used topic models to investigate controversies over federal funding of the arts in the United States during the 1980s–1990s. They coded almost 8,000 newspaper articles selected from five newspapers in order to analyse ‘frames,’ defined as sets of ‘discursive cues’ that suggested a ‘particular interpretation of a person, event, organization, practice, condition, or situation’ (p. 593). DiMaggio, Nag and Blei found that different media frames were promoted by different institutional actors as a way to try to influence the course of public discourse and political debate. Of the 12 topics identified in their analysis, several clearly reflect politicized frames, such as the ‘1990s culture wars’ and National Endowment for the Arts grant controversies.

Levy and Franklin (2014) used topic models to examine political contention in the US trucking industry. Their data were online archives of public comments submitted during agency rulemakings, which they mined from the online portal regulations.gov. They used topic models to identify latent themes in a series of regulatory debates about electronic monitoring, finding that different types of commenters use different interpretive frames. Comments submitted by individuals were more likely to frame the electronic monitoring debate in terms of broader logistical problems plaguing the industry, such as long wait times at shippers’ terminals. Organizational stakeholders were more likely to frame their comments in terms of technological standards and language relating to cost/benefit analysis.

A second text mining tool is opinion mining, also known as sentiment analysis or subjectivity analysis, which involves extracting attitudinal information such as sentiments, opinions, moods and emotions from texts. An important kind of information that is conveyed in many types of written and spoken discourse is the mental or emotional state of the writer or speaker or some other entity referenced in the discourse. News articles, for example, often report emotional responses to a story in addition to the facts. Editorials, reviews, weblogs and political speeches convey the opinions, beliefs or intentions of the writer or speaker. Opinion mining is defined as the task of identifying such private states in language, and is typically divided into two main subtasks. The first is subjectivity analysis, which identifies if a text contains an opinion, and correspondingly labels the text as either subjective or objective. The second, sentiment analysis, classifies an opinion as either positive, negative or neutral. The subjectivity and sentiment of texts can be evaluated at several levels, including at the document level and, to a limited degree, the level of individual sentences. Opinion mining research has benefited from the growing number of product reviews available online, on sites such as Amazon.com or epinions.com, which can be used to build very large sentiment-annotated data sets.
Such reviews are usually available in many languages, thus enabling the construction of sentiment analysis
tools in languages other than English. The methods available for opinion mining include rule-based systems, relying on manually or semi-automatically constructed lexicons, and machine learning classifiers trained on opinion-annotated corpora.

Stoltz and Taylor (2019) have recently introduced a third text mining tool for computational hermeneutics, Concept Mover’s Distance (CMD). This method measures a text’s engagement with a focal concept using distributional representations of the meaning of words. It involves measurement of the minimum distance the words in a document need to ‘travel’ to arrive at the position of a ‘pseudo document’ consisting of only words denoting the focal concept. Stoltz and Taylor’s approach models the prototypical structure of concepts, and can be used even when terms denoting concepts are absent from a document collection. Stoltz and Taylor use a wide variety of historical and social science cases to demonstrate the utility of their approach, including Jaynes’ (2000 [1976]) hypotheses about the presence of ‘consciousness’ in the Iliad, Odyssey and the King James Version of the Bible; the relationship between engagement with ‘death’ and the number of deaths in Shakespeare’s plays, as compared to 200 concepts that are fundamental in 87 Indo-European languages; and Lakoff’s theory (2010) regarding the competing cultural models of morality in US politics by comparing engagement with the compound concepts ‘strict father’ and ‘nurturing parent’ in State of the Union Addresses.

Kozlowski, Taddy and Evans (2019) have recently introduced word embeddings as a fourth text mining tool for computational hermeneutics. They discuss how the modeling of word embeddings, in which semantic relations between words are represented as relationships between vectors in a high-dimensional space, can be used for cultural and historical analysis. They demonstrate the utility of word embeddings with an analysis of changes in the social meaning of social class in several million books published over 100 years.
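As a schematic illustration of the kind of analysis Kozlowski, Taddy and Evans describe, the sketch below builds a ‘class’ dimension from differences between affluence- and poverty-related word vectors and projects other words onto it. This is a toy version of their approach rather than their actual pipeline, and the small pretrained GloVe model loaded through gensim’s downloader, like the choice of word pairs, is an assumption of convenience, not the corpus or terms used in their study.

    # Schematic sketch: projecting words onto a 'class' dimension built from
    # word embeddings. Toy illustration only; the model and word choices are
    # assumptions made for this example.
    import numpy as np
    import gensim.downloader as api

    vectors = api.load("glove-wiki-gigaword-50")   # small pretrained embeddings

    # Define a 'class' dimension as the average difference between paired terms.
    pairs = [("rich", "poor"), ("affluent", "impoverished"), ("luxury", "cheap")]
    class_dim = np.mean([vectors[a] - vectors[b] for a, b in pairs], axis=0)

    def class_projection(word):
        # Cosine similarity between a word vector and the class dimension.
        v = vectors[word]
        return float(np.dot(v, class_dim) /
                     (np.linalg.norm(v) * np.linalg.norm(class_dim)))

    for word in ["opera", "golf", "basketball", "boxing"]:
        print(f"{word:>10s}: {class_projection(word):+.3f}")

Positive projections indicate words that sit closer to the affluent pole of the dimension, negative projections words closer to the poverty pole; tracing how such projections shift across corpora from different periods is the historical move that Kozlowski, Taddy and Evans make.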
Critical analysis of research methods

A third function of sociological theory in relation to big data is as a tool for critical analysis of research methods themselves. Even the largest data sets and most sophisticated methods, after all, cannot critically analyse themselves. Theoretical concepts developed in the sociology of knowledge and sociology of science empower sociologists to engage in reflexive analysis of the tools and research infrastructure that allow empirical sociological research to be performed. Social research methods are intrinsic features of contemporary capitalism (Savage & Burrows 2007; Thrift 2005). Sociological interest in the ‘politics of method’ (Steinmetz 2005) thus involves sociologists renewing their interest in methodological innovation while also reflecting critically and sociologically on the sources and social consequences of these innovations.

Savage (2012) demonstrates such critical and sociological reflection on sociological methodology in exploring the tensions between sociological research methodologies (the interview and the survey) centered on people talking, and post-representational theories that focus on the interplay of social forces of which centered subjects are often unaware. Savage’s solution to this stand-off between
method and theory involves analysis of the changing jurisdictions commanded by different methodological repertoires. He suggests that the social survey and the qualitative interview are part of the same generation of research methods that came to prominence during the mid-twentieth century but are now being eclipsed (Mohr 2015, 2016), and advocates a renewal of the critical project of sociology by challenging current practices in the collection, use and deployment of social data.
PART II
Theoretical innovations
4 CRITICAL INFORMATION THEORY
Introduction

What is the value of sociological theory in the information age? What can sociological theory contribute to our understanding of how information technologies effect, and are influenced by, social change at every level, from interpersonal communication and relationships to changes in organizations and the world economy? And how can sociological theory contribute, if at all, to scholarship that employs emerging digital research methodologies? Our first set of answers to these questions comes from the field of Critical Information Theory (CIT).

CIT, as a branch of social theory, is less oriented toward empirical research practice than are the approaches to sociological theory covered in Chapters 5 and 6. While prior to the Second World War there was not a sharp division between sociological theory and social theory, in the postwar period sociological theory and social theory differentiated as sociologists embraced new quantitatively oriented research methodologies (Chapter 1) and reoriented the use of theory in their discipline so as to more effectively guide empirical research using the new methods. There is no question that, in sharp contrast to mainstream contemporary sociological theory, CIT is not oriented toward directing social research, and certainly not toward directing research that uses quantitative empirical research tools such as surveys, experiments, content analysis methods, or computational social science tools such as data mining tools or simulations.

Like social theory generally, from the classical Critical Theory of the Frankfurt School to postmodern social theory (Seidman 1991), the orientation of CIT is toward ethical social change, or praxis. It is ‘a lever of possible practice’ with the goal of a reasonable society, an association of free people based on a sustainable utilization of technical means. It starts from the judgment that human life is livable or can and should be made livable and that in a given society there are specific
possibilities for improving human life and specific ways and means for realizing these possibilities … It strives for a condition without exploitation and oppression and for the emancipation of humans from enslaving relationships. (Fuchs 2008: 6)
Historical and philosophical foundations

Critical Theory is one of the most influential social theoretical schools of thought. It provides intellectual foundations for scholarship across the social sciences, communication studies and media studies. As there are hundreds of books and articles that survey Critical Theory (e.g. Agger 2013), here I provide only a brief overview of Critical Theory’s historical and philosophical foundations. Critical Theory was an attempt to reconstruct the relationship between theory and praxis following the defeat of socialist revolutions and the rise of fascism in Europe in the 1930s. Critical theorists attempted to explain the rapid rise of fascist propaganda and mass media including newspapers, radio, television and popular film, and at the same time to construct a practically oriented theory of society.

Critical Theory is a branch of Western Marxism, a theoretical movement initiated by the Hungarian Marxist Georg Lukács, who had adapted Marx’s analysis of authoritarian elites in industrial capitalism to what he saw as a new phase of capitalism. His focus on the failure of socialist revolutions to occur as Marx had prophesied led him to conclude that class consciousness was not an automatic response to material economic conditions, and he attempted to explain the delay in the socialist revolution, which Marx assumed to have been imminent in the late nineteenth century, in terms of a class consciousness that did not derive directly or automatically from business failure or unemployment. As capitalism advanced into a global, technologically advanced mode in the twentieth century, workers began to accept it as an inevitable social and economic system. Lukács pointed out how capitalists fostered precisely this image in order to divert workers from the possibility of socialist revolution.

A second precursor to Frankfurt School Critical Theory was the work of the literary theorist Walter Benjamin. Although Benjamin was associated with the Frankfurt School, he was never formally a member; nonetheless his analysis of fascism and his literary approach to social analysis are recognized to have been influential on the Frankfurt School, and particularly on Theodor Adorno (Berry 2015: 33; cf. Agger 2013: 91, 127; Roberts 1982: 172–174). Benjamin argued that fascism ‘introduced aesthetics into political life by providing the masses with a chance to “express themselves”’ (Berry 2015: 33) as a substitute for opportunities to change property relations in their favor. Benjamin further contributed to many of the themes that would feature in Critical Theory, including theories of capitalism, state bureaucracy, science, technology, positivism, and the susceptibility of the public to ideology.

Influenced by Lukács and Benjamin, the theorists of the Frankfurt School, including Theodor Adorno, Max Horkheimer, Herbert Marcuse, and Adorno and Horkheimer’s student Jürgen Habermas, reconstructed Marx’s theories in order to explain how in the twentieth century the ‘culture industries’ engaged in ideological
and cultural manipulation of mass publics, and how scientific positivism had become a modern mythology (Agger 2013: 78). Under feudalism and industrial capitalism, it was primarily religious elites who had promoted cultural and ideological conformity in the broader society, but in the twentieth century the entertainment industry came to compete with older forms of cultural and ideological authority. The entertainment industry produced mass popular culture including blockbuster films, network television, commercial radio and print news, all of which contained ideological content supportive of capitalism and nationalism. In The Dialectic of Enlightenment (1972) and other publications, Horkheimer and Adorno followed Lukács, and before him Marx, in arguing that mass popular culture served as an ephemeral narcotic that diverted people’s attention from pressing personal and social problems. First Marcuse in The Aesthetic Dimension (1979) and then Adorno in Aesthetic Theory (1984) explained the critical functions of art and culture as spheres of experience and expression in which critical insights can be gained, and lamented how entertainment products such as movies, television, trade press books and radio had absorbed noncommodified art and culture into a cycle of commodification and ideological hegemony. Horkheimer and Adorno argued that Western society had a penchant for viewing the world as an object to be plundered. This penchant for domination extended to the domination of nature, of women, of minorities, and of non-Western societies. Horkheimer and Adorno conceived of scientific positivism as a modern form of social domination, of arrogance and hubris in the name of social progress. In Negative Dialectics (2003), Adorno critiqued the implicit Western theory of knowledge, which he termed ‘identity theory,’ that presumed that concepts in language can perfectly describe the external world. Identity theory is not only a theory of knowledge but also a social theory and philosophy of history in which the subject (the person) can completely master the object (nature and other people) by manipulating it socially and technologically or capturing it with scientific concepts. Adorno argued that identity theory provided the philosophical basis for all manner of social and ecological atrocities. According to Christian Fuchs, a leading contemporary critical theorist of information technology, there are several elements of the classical Frankfurt School Critical Theory of Adorno and Horkheimer that are especially important for modern CIT (Fuchs 2008: 6–7). These include, first, their deconstruction of the consumerist and nationalist ideologies of consumer capitalism. Following Adorno (1966), CIT also seeks to deconstruct identity theory (the idea that scientific language can completely and unambiguously capture reality). Fuchs asserts that CIT can deconstruct Western conceits about the power of scientific language by creating ‘a linguistic and theoretical universe that is complex and dialectical’ (Fuchs 2008: 6). A second element of classical Critical Theory that is prominent within CIT is an openness to idealistic and utopian thought via the development of dialectical critique focused less on the exigencies of social realities than on the ‘possibilities of existence’ so as to develop new categories of thought that ‘question the world that is and that which existing society has done to humans’ (Fuchs 2008: 6). 
Third, CIT is motivated by an explicit ethical orientation that challenges all forms of exploitation and oppression. A fourth element is a Marxian
ontology that Fuchs refers to as ‘dynamic materialism.’ This ontology is materialistic and sociological ‘in the sense that it addresses phenomena and problems not in terms of absolute ideas and predetermined societal development, but in terms of resource distribution and social struggles’ (Fuchs 2009: 70). The fifth element is an epistemology of ‘dialectical realism’ in which academic work is understood to be capable of allowing humans to understand and partly transform the material world by identifying contradictions within social, economic, and political arrangements. Sixth is an axiology, ‘negating the negative,’ that is closely related to CIT’s epistemology of dialectical realism. ‘Negating the negative’ refers to the potential for critical academic work to deconstruct destructive ideologies and replace them with positive alternative concepts at the ‘interface between theory and political practice’ (2009: 71). Finally, Fuchs advocates for humanistic methodologies rather than ‘positive’ (computational, quantitative) methodologies for achieving the academic and social potential of CIT.

In the twentieth century the Critical Theory of Lukács, Benjamin and the Frankfurt School had a major influence on the academic fields of media studies and journalism. It also had an impact on sociology beginning in the 1960s–1970s. Influenced by Critical Theory, some social researchers fully adopted Critical Theory’s social ontology, with its focus on ideological manipulation committed by authoritarian elites, while in reception studies that emphasize audiences’ agency, reflexivity and interpretive creativity this authoritarian ontology can serve as a foil (e.g. Liebes & Katz 1986). In any event, whatever successes it has had as a theoretical tool for the analysis of one-to-many broadcast media such as commercial radio, newspapers, blockbuster films and network television, we must now ask whether the social ontology, epistemology and exclusively humanistic research methods of Critical Theory are suited to analyzing contemporary information technologies and the industries that produce them.
Can Critical Theory cope with the information revolution?

While Critical Theory began in Europe in the 1930s as an attempt to explain the rise of fascist propaganda and mass media and to reorient social theory from ivory tower academic speculation to concrete social praxis, CIT emerged in the 1990s as an attempt to explain, and to whatever degree possible influence, the public reception of internet-based information technologies. Early critical information theorists such as Andrew Feenberg developed cultural critiques of information technology that grew into the subfield of Science and Technology Studies known as Critical Technology Studies (Feenberg 2017). In contemplating the information revolution of the past two to three decades, critical theorists have generated an impressive volume of critique and commentary.

Perhaps due to sociological theory’s postwar separation from social theory, sociological theorists have not paid close attention to, and certainly not critically evaluated, Critical Theory’s responses to the information revolution. After several decades of development of CIT (Feenberg began publishing in the early 1970s after all), this seems like an apt time to ask where things stand. What have been CIT’s most important contributions? What can sociological theory learn from CIT? Does CIT provide viable answers to the question of the value of theory in the information age?
Critical Information theorists

As noted by Fuchs (2008), there are a number of elements that run across the Critical Theory of Lukács, Benjamin, the Frankfurt School and contemporary CIT. There are also differences across the social theorists who self-identify as working within the Critical Theory tradition, and there have been shifts in the overall direction of CIT over time, for the most part in the direction of closer adherence to orthodox Frankfurt School and even Marxian theoretical positions. Thus this section is organized by theorist, and also, approximately, chronologically.
Andrew Feenberg

Andrew Feenberg is recognized as a pioneering figure in CIT. In the 1960s he was active in the New Left and studied philosophy under Marcuse at the University of California, San Diego. He has published widely on the philosophy of Marcuse, Heidegger, Habermas, Marx, Lukács and the Japanese philosopher Nishida, although his primary contribution to the study of technology is his argument for the democratic transformation of technology. He is interested in reconciling technology and freedom and optimistic about the possibility of the democratization of the technological order via public participation in technological policy deliberations. His optimism derives from the Duhem-Quine principle in the philosophy of science, a principle that, contra widely held assumptions that it is science and technology that shape society and not the other way around, holds that there is an inevitable lack of logically compelling reasons to prefer one scientific theory over another. Applied in the realm of technology, the principle implies that technical choices are inevitably ‘underdetermined,’ and that the ‘final decision between alternatives ultimately depends on the “fit” between them and the interests and beliefs of the various social groups that influence the design process’ (Feenberg 1995: 4).

Influenced by the defeat of communism (Feenberg 2002: vi) and interested in overcoming long-standing theoretical dualisms, Feenberg’s argument for the underdeterminedness of technological innovations stands in stark contrast to the arguments advanced by several more recent critical theorists of technology. Feenberg is heterodox in his development and application of Marxist and Critical Theory (1995: x; 2002: 61–62) and interested in overcoming the opposition to which the Frankfurt School contributed between those who are in favor of or opposed to information technology. For Feenberg technology is ‘neither a savior nor an inflexible iron cage’ but rather a new kind of cultural framework that is subject to transformation from within. He thus seeks to reconcile what he and other critical theorists view as the conflicting claims of reason and culture. Feenberg is relatively optimistic about the democratic potential of information technologies.

The theoretical foundations for Feenberg’s arguments for underdeterminedness and the potential for technological democratization are developed in The Critical Theory of Technology (1991) (re-published as Transforming Technology: A
Critical Theory Revisited [2002]), Alternative Modernity: The Technical Turn in Philosophy and Social Theory (1995) and Questioning Technology (2012). These foundations include a concept of dialectical technological rationality he terms ‘instrumentalization theory.’ Instrumentalization theory combines the social critique of technology from Marx and the Frankfurt School with results from empirical case studies from Science and Technology Studies. Combining social theory of technology with empirical studies led him to propose a branch of ‘social informatics’ that would constitute a new academic transdiscipline. Feenberg has applied instrumentalization theory in studies of online education, the Minitel, the internet and digital games (e.g. Feenberg 2012). Writing in relatively accessible prose, he does not target the technology industries themselves or societal trends putatively caused by technological advances (‘informationalism,’ ‘network society,’ ‘digital divides’ and so on) but rather public ideologies of technological determinism and technological value neutrality. From these studies he concludes that critical analysis of technology is not only possible but necessary so that technologies can be designed and implemented more appropriately to serve humane purposes within democratic societies.
Scott Lash

One of the most influential social theorists of his generation, Scott Lash is not known only or even primarily for his work on social theory and information technology. But he has made several recognized contributions in these areas, mostly in his 2002 book Critique of Information, which was not so much a critical theoretical analysis of technology as a metatheoretical analysis of the implications of contemporary information technologies for Critical Theory itself. For Lash, because critical social science ‘grew up in the age of Ideologiecritique’ (2002: 1), its value since the 1980s, in a post-ideological age in which symbolic power is more informational than ideological, is uncertain. There is for Lash an immediacy to information that has little in common with the traditional objects of Critical Theory—systems of belief such as Christianity and the Enlightenment.

In keeping with postmodern and anti-foundationalist social theory, Lash argues that there has been an erosion of ‘transcendental groundings’ for knowledge, inquiry, morality and politics. The search for ‘transcendentals’ in all their diverse forms has been discredited by the dynamics of informational modernity, so there is as a consequence no ‘space’ for transcendental critique in a global society inundated by information produced in real time as events unfold. Writing a decade before the advent of Twitter, Lash followed Benjamin in recognizing that critique had finally devolved into commentary (also Latour 2004) and that there is today no longer any transcendental, objective or privileged position from which critique or social analysis can be undertaken. If, following Thrift (2005), contemporary society is intrinsically informational, and the analyst is inescapably part of the society, ‘so too must the analyst and the analysis be informational, as tied up with and characterized by the nature of information as every other social entity or phenomenon,’ and therefore ‘Information critique must be critique without transcendentals’ (p. 9). The Critical Theory text becomes
just another object, just another cultural object, consumed less reflectively than in the past, written … under conditions of time and budget constraint much more than in the past. Information critique itself is branded, another object of intellectual property, machinically mediated. (Lash 2002: 10) Without recourse to transcendentals, what is left of Critical Theory is far from clear (cf. Lash 2006; Taylor 2006).
Christian Fuchs

Christian Fuchs is perhaps the world’s leading critical theorist of information technology and the author of a series of social theory monographs and textbooks published between 2008 and 2018 including Internet and Society: Social Theory in the Information Age, Foundations of Critical Media and Information Studies and Culture and Economy in the Age of Social Media. The position Fuchs has staked out in these volumes is a throwback to Frankfurt School Critical Theory and to Marx. He views contemporary social media through an ‘Ur-Marxist, 19th-century lens’ (Raetzsch 2016: 2). For Fuchs, Lash (2002) is wrong to claim that contemporary critical theory must do without transcendentals; CIT is potentially emancipatory because it makes egalitarian social cooperation possible as an alternative to global capitalist competition in a new ‘transcendental space for participatory democracy’ (Fuchs 2008: 7). What is transcendental in CIT is ‘cooperation,’ which, as a ‘societal force,’ is an immanent sort of transcendental.

In Foundations of Critical Media and Information Studies (2011), Fuchs developed theoretical foundations for the analysis of media, information and information technology in contemporary information societies, and introduced theoretical and empirical tools (mostly theoretical ones) for the critical study of media and information. Fuchs discussed the role classical Critical Theory can play in analyzing the information society and the information economy, as well as in analyzing the role of the media and the information economy in economic development, the ‘new imperialism,’ and economic crises. Fuchs critically discusses transformations of the internet and argues that alternative media can serve as critical media in volumes including Internet and Surveillance (2013), Digital Labour and Karl Marx (2014) and Culture and Economy in the Age of Social Media (2015), among many others.
David M. Berry In Critical Theory and the Digital (2015) David Berry argues for forms of social theoretical analysis of information technology that look beneath the interface (p. 63) and beyond screens to understand code and algorithms. Like Lash, Berry offers a critique of social theory itself in the digital age, in this case in the form of a series of reminders to the community of critical theorists, who unfortunately have ‘relatively poor conceptual
and theoretical understandings of the digital’ (theoryculturesociety.org/interview-with-david-berry-on-digital-power-and-critical-theory/). This, combined with a ‘medial’ approach that tends to over-emphasize the ‘screenic’ in accounts of media and digital forms, has prevented critical theorists from taking into account both the commodity level of the interface and the underlying machinery of the algorithm instantiated in code. For empirical support for his critique, Berry relies on a wide variety of journalistic and technical accounts. For Berry these ‘nominally descriptive accounts of software’ (theoryculturesociety.org/interview-with-david-berry-on-digital-power-and-critical-theory/) can provide useful information about the scale, structure and implementation of software systems. Berry’s reading of computational ontology is heavily influenced by Alexander Galloway’s work on the topic (Galloway 2004, 2013). Like Galloway, Berry characterizes object-oriented philosophy as ‘reifying’ and regards ‘the computational’ as fundamentally ‘object-oriented.’
Evaluation While there are differences across theorists working in the area of CIT, there is also a shared theoretical framework that is realist in the sense of de-emphasizing culture, ideology or identity. CIT seeks to identify the essential socioeconomic and organizational processes that are determinative of macro trends in the social adoption of information technologies. With the exception of Feenberg’s position of technological underdeterminism, the approach here is deterministic in so far as it de-emphasizes technology users’ agency and capacities for reflexivity. And it is essentialist in that underlying changes in information technologies and their societal adoption are seen in terms of essential tensions and contradictions and are amenable to analysis by an observer familiar with nineteenth- and early-twentieth-century European social theory. The essentialist element in CIT gives rise to a near anthropomorphization of complex, contingent sociotechnical processes in terms of what are claimed to be their fundamental internal logics. Beyond its realist ontology, essentialism and determinism, one can ask, following the software developer and writer Dominic Fox (review31.co.uk/article/view/274/blue-screens), whether any one social theorist or group of theorists armed with journalistic accounts of the happenings within information technology industries can possibly understand contemporary information technology well enough to analyze it critically. It may be effectively impossible for anyone working in a university humanities department, given academic isolation, the slow pace of academic research (Gane 2006) and the incentive structures in place within modern research universities (see Chapter 1), to understand contemporary information technologies or the industries that produce these technologies well enough to develop adequately informed critique. In addition, there are elements within contemporary social theory generally and CIT in particular that limit the kind of interdisciplinary collaboration that could potentially generate critical insights into contemporary technologies.
An ‘occult, metaphysical pseudo-science’? In a 1979 philosophy journal article on Critical Theory and positivism, Ray dismissed the Frankfurt School as ‘occult, metaphysical pseudo-science’ (p. 150; see Adorno & Adey 1976; Keuth 2005, 2015; Strubenhoff 2018). Since that time Critical Theory has developed without much critical reflection on its historical origins in esotericist fin-de-siècle intellectual movements, despite historical scholarship that provides some support for Ray’s characterization (Josephson-Storm 2017; Lebovic 2013; Stauth & Turner 1992). In this section I proceed in reverse order of Ray’s quotation, beginning with the accusation that Critical Theory is pseudoscience before turning to the metaphysics accusation and finally to the occult origins of the Frankfurt School—and by extension of CIT. Whether Critical Theory is fairly characterized as pseudoscience is debatable, with the debate turning first on one’s view of the scientific merits of Marxian economic theory, and second on whether Critical Theory was more influenced by classical Marxism or by less ‘scientific’ sources such as Nietzsche’s Lebensphilosophie (Stauth & Turner 1992) and the writings of the fin-de-siècle occultist Ludwig Klages (Josephson-Storm 2017; Lebovic 2013). In any event, Critical Theory’s antipositivism translates into a practical disinterest in statistical or computational research methods. Because of its exclusive reliance on interpretive historical research methodologies and overall limited support for empirical research (see Feenberg 2012), there have been almost no attempts to integrate modern research methods with classical Critical Theory. The second accusation concerns the question of metaphysics in Critical Theory. Discussing Berry’s critical analysis of object-oriented programming, Fox notes that anthropomorphism emerges ‘when we begin to treat code as Code, an Orwellian construct that can so easily become a metanarrative to replace or combine with earlier metanarratives like “Capitalism”, “Modernity”, or “Progress”’ (review31.co.uk/article/view/274/blue-screens). This shadow theater of reified abstractions recalls Mills’ ridiculing of Parsons as a detached theorist lost in a self-created world of circularly defined concepts. For Mills, Parsons’ structural functionalism was drunk on syntax, blind to semantics. Its practitioners do not truly understand that when we define a word we are merely inviting others to use it as we would like it to be used; that the purpose of definition is to focus argument upon fact, and that the proper result of good definition is to transform argument over terms into disagreements about fact, and thus open arguments to further inquiry. (Mills 2000 [1959]: 34) For instance, Fuchs never provides a working definition of what he means by as fundamental a concept as ‘data’ (Raetzsch 2016: 9) despite the outsized role played by the concept in his theoretical system. Fox claims that for Berry the anthropomorphism problem is not due to his lack of technical knowledge,
but that, as a social theorist, he is fundamentally allergic to the inhuman characteristics of the computational. These must always be reduced – led back – to the domain of recognisably human concerns and praxis. There is never any doubt that, in the confrontation of ‘Critical Theory’ with ‘the digital’, it is ‘Critical Theory’ that will win out in the end. (review31.co.uk/article/view/274/blue-screens) CIT’s authoritarian emphasis and metaphysical anthropomorphizations of large-scale sociotechnical phenomena combine to produce a theoretical blindness to audiences’ capacities for active, agentic, reflexive social media participation and to their capacities for resistance, such as in the case of lawsuits brought by citizens challenging social media companies’ privacy policies, or in the various open source software movements and collectives. In a review of Fuchs’ Internet and Society, Raetzsch notes that by associating power primarily with questions of ownership, Fuchs deliberately avoids considering the dimensions of user agency, individual creativity or rational judgment in theorizing social media (Raetzsch 2016: 5). The third question concerns Critical Theory’s associations with European esotericism. The Old Testament origins of Marxist prophecy have been recognized for decades (Parsons 1964), as have Jewish mystical and messianic influences on Walter Benjamin (Rabinbach 1985) and the influence of the ‘submerged tradition of cultural critique’ based on Nietzschean Lebensphilosophie on the ‘highly regarded historical philosophy of the Dialectic of Enlightenment’ (Stauth & Turner 1992: 51). Correspondingly, critics have noted the ‘common failing of the return to Marxian literature—its essentially religious cast, operating through the ritual incantation of texts rather than through argument and evidence’ (Garnham 2016: 2). While the influence of German Lebensphilosophie on Critical Theory was explored by Axel Honneth in the 1980s (Honneth 1983; also Miller 1978; Kellner 1985; Turner 1996), and the interwar period of revolutionary thinking referred to variously as the ‘Weimar-complex’ or ‘Weimar syndrome’ (Lebovic 2013: 1) is recognized to have influenced social thought for generations (Greenberg 2016), the particular influence of the occultist Ludwig Klages on the Frankfurt School is less widely recognized. In 1992 Stauth and Turner argued that the theorists of the Frankfurt School were closer to the tradition of Nietzsche and Lebensphilosophie in their cultural critique than to Marxism, and that the work of Ludwig Klages was the key connector between Nietzsche and the Frankfurt School. The legacy of Lebensphilosophie was recognized by Adorno, although Adorno wished to negate this stream within philosophy as merely a form of jargon in the case of Martin Heidegger … and as a mistake in the case of Kierkegaard … The existentialist tradition of Jean-Paul Sartre was also attacked and dismissed by Herbert Marcuse … The consequence of this broad critique by the critical theorists was to treat all forms of Lebensphilosophie … as an inadequate reflection upon contemporary society, because its subjectivism and individualism prevented an adequate and systematic evaluation of the objective
circumstances which in capitalism precluded a viable and progressive critique of contemporary traditions. (Stauth & Turner 1992: 47) Lebovic’s 2013 history of Lebensphilosophie and Josephson-Storm’s 2017 history of the occult milieu of fin-de-siècle continental Europe provide extensive support and elaboration for Stauth and Turner’s argument for the influence of Lebensphilosophie on Critical Theory and by extension much of modern and postmodern social thought. For both Lebovic and Josephson-Storm, Walter Benjamin emerges as a key connector between Klages and Critical Theory. Benjamin’s commitment to Klages is examined in biographical studies of both Benjamin (Roberts 1982) and Klages (Lebovic 2013). As Stauth and Turner, Lebovic, and Josephson-Storm all implicate Ludwig Klages as a pivotal figure in Benjamin’s thought and in the development of the Frankfurt School, it is worth briefly summarizing Klages’ version of Lebensphilosophie. Ludwig Klages (1872–1956) was a well-known public philosopher, mystic and handwriting analyst. His interests spanned four main areas: characterology, the analysis of expression, graphology and philosophy. The central theme in his work is that human reality is an area of conflict between the soul (Seele) and the mind, in which consciousness is finally triumphant over the soul. The soul is ‘identical with life itself, and the rhythmic structure of the cosmos as a whole. In more contemporary terminology, the evolution of the human race is part of a larger conflict between consciousness or rationality and the life-world of the cosmos’ (Stauth & Turner 1992: 49). In his first major philosophical work, Mensch und Erde (Man and Earth), published in 1913, Klages interrogated the celebration of ‘progress,’ which he saw as the central ideology of his age (Klages, Beck & Ehmcke 1930). He argued that the rhetoric of progress is ‘just the public face of the domination of nature and the exploitation of indigenous peoples. The heart of progress is the lust for power’ (Josephson-Storm 2017: 216). Klages claimed that it was Christianity that gave birth to the ruthless, expansive impulse to enslave non-Christian races. Thus, for Klages, science and capitalism, along with imperialism and slavery, were the fulfillment of Christianity. Klages’ main interest was cognition. In Vom kosmogonischen Eros (1926) he developed a symbolist theory of human understanding based on pictures rather than concepts. He related ‘an alien, distant pre-world which cannot be entered again to the mother-world of the things that have been and that resurrect the spirit of those who are already dead’ (Stauth & Turner 1992: 50). Pictures, as reflections of the world corresponding with the soul of human beings, provide the possible basis for a reconciliation between humanity and nature. In Der Geist als Widersacher der Seele (1972) he described reality as free-floating in time and space without any fixed reference point; with this notion of reality he developed an anti-ontological argument against the Greek philosophical notion of being, in which reality was conceptualized as a relational system of fixed things. Klages’ picture theory of cognition is recognized to have had a strong influence on both Benjamin and Adorno (Wiggershaus 1986: 226; cited in Stauth & Turner 1992: 52). Other major interests of Klages that prefigured themes in twentieth-century anthropology and social thought include matriarchy and myth (Stauth & Turner 1992:
50) and ecstasy and mystical orgy (Josephson-Storm 2017: 224). His oeuvre makes sense in the context of the ‘generation of 1914’ or ‘lost generation’ that embraced various forms of romantic anti-capitalism in reaction to rapid industrialization and urbanization and the experience of the First World War (Rabinbach 1985). Klages was one prominent member of this generation, whose thinkers and writers included Buber, Eliot, Faulkner, Fitzgerald, Hammett, Hemingway, Huxley, Joyce, Kafka, Stein, Tolkien, Wolfe and Woolf (see Wohl 1979). Although Adorno dismissed Klages as a nature mystic and sought to distance Critical Theory from Klages’ esotericism and Lebensphilosophie, he also cited him approvingly. In a footnote to the Dialectic of Enlightenment, Horkheimer and Adorno suggested that Nietzsche, Gauguin, George, and Klages recognized the mindless stupidity which is the result of progress. But they drew the wrong conclusion. They did not denounce the wrong as it is but transfigured the wrong as it was. The rejection of mechanism became an embellishment of industrial mass culture, which cannot do without the noble gesture. Against their will, artists reworked the lost image of the unity of body and mind for the advertising industry. (Horkheimer, Adorno & Noeri 2002: 194) Klages was rejected by several thinkers associated with Critical Theory. In 1954 Lukács identified Lebensphilosophie as ‘the dominant ideology of the whole imperial period in Germany.’ He went on to identify the significance of this ideology as the antiparliamentary and irrational ‘belligerent preparation for the impending barbaric reaction of the Nazi regime’ (Lukács 1962: 403; qtd. in Lebovic 2013: 4). Klages’ greatest significance for Lukács ‘lies in the fact that never before had reason been challenged so openly and radically’ (Lukács 1962: 403; qtd. in Lebovic 2013: 4). The Nazis’ co-optation of Lebensphilosophie began in the 1930s. A mere decade after it was considered fashionable, life philosophy was coopted by the Nazis … In 1935 Thomas Mann attacked Lebensphilosophie as the core of ‘fascist’ rhetoric and named Ludwig Klages as a representative of this philosophy and a prefascist thinker himself. A well-known conservative and mystic, Klages was also seen by his opponents as an early proponent of national socialism, or as Mann put it, a ‘criminal philosopher,’ a ‘pan-Germanist,’ an ‘irrationalist,’ a ‘Tarzan philosopher,’ a ‘cultural pessimist … the voice of the world’s downfall’ (Mann 1978: 195). From then on, Lebensphilosophie—and Klages as a leading Lebensphilosopher—would be identified with Nazism, racism, and anti-Semitism. The earlier positive reception of Lebensphilosophie among radicals on the left was ignored and suppressed. (Lebovic 2013: 4) Josephson-Storm (2017) points out that Klages’ anti-Semitism, which he never disavowed, if anything became more pronounced as the Nazis rose to power. As a consequence
[c]onservative thinkers are unlikely to have been sympathetic to his pacifism, his embrace of sexual fluidity, his feminism, or his anti-Christianity, and progressives were probably thrown by his traditionalism and his attacks on progress and science; but it is probably mainly Klages’s explicit anti-Semitism that prevented his canonization. (Josephson-Storm 2017: 225) Klages expressed something akin to anti-Semitism in his early writing but tended to describe Judaism as an attitude rather than in racial terms. In this way his anti-Semitism resembled the ‘anti-Judaism of a Nietzsche or a Voltaire, where the real target is often Christianity’ (Josephson-Storm 2017: 225). Over time the figure of the Jew, and the American Jew in particular, came to personify, for Klages, all that was wrong with modernity, and Klages did not tone down his language as the Nazis rose to power. If anything, he did the opposite (Lebovic 2013). Yet Klages was not a fascist. He ‘had nothing to do with the fascist movement as such in Germany as he had migrated to Switzerland during the First World War in protest against such developments’ (Stauth & Turner 1992: 59). There is nothing in the apocalyptic, Jewish messianic (Rabinbach 1985), esotericist (Josephson-Storm 2017) or anti-Semitic (Stauth & Turner 1992) origins of Marxism or Frankfurt School Critical Theory that necessarily precludes these traditions from providing suitable theoretical foundations for analysis of contemporary information technologies. But at the same time, these origins are more than mere history: they are visible in contemporary CIT’s blanket rejections of science and capitalism and in its intellectual reliance on anthropomorphistic metaphysical concepts. At a minimum, it would seem incumbent on CIT advocates to work through the possible relevance of these historical origins for the viability and contemporary applicability of their preferred theoretical orientation.
Orientation toward Bourdieu Critical theorists are fond of accusing empirical sociologists of raw careerism, and of holding a naive faith in positivism and the search for laws of society akin to the laws of physics (e.g. Agger 2013: 144–148; Berry 2015: 54–55). The accusations echo Parsons’ claim that what the sociologists of the Chicago School had produced was mere ‘dust bowl empiricism’ grounded in theoretical naiveté. But as we will see in Chapter 6, one of Bourdieu’s cardinal contributions was to develop viable epistemological and ontological positions to allow empirical sociologists to overcome the subjective/objective dichotomy, and with it the dichotomy between positive and hermeneutic methods. And philosophy of social science schools of thought such as Critical Realism (Bhaskar 2013, 2014; Elder-Vass 2012; Kaidesoja 2013) have developed similar epistemological and ontological positions for decades. Critical theorists’ accusations of naive positivism are thus both passé and ill-informed. Invective against sociologists’ ostensibly naive positivism does raise the question of critical theorists’ level of familiarity with sociology and philosophy of
social science generally, and with Bourdieu specifically. Bourdieu was far and away the most highly cited sociologist of his generation, and at least numerically has had a greater impact than any social theorist of his or any subsequent generation. He was also a harsh critic of French movements inspired by Critical Theory, of social theory generally (‘theoretical theory’ in his terms; see Wacquant 1989: 50), and of practitioners of Cultural Studies, whom he referred to disparagingly as ‘massmediologists’ (Lane 2000: 39–40). Yet despite the prominence of Bourdieu’s reflexive sociology and his negative views of movements inspired by Critical Theory, the response to Bourdieu by Critical Information theorists has been muted. Feenberg cites him in passing and does not engage with his ideas. Berry (2015) never cites him. Fuchs is a more interesting case. In a 2003 article he selectively appropriates ‘certain aspects’ of Bourdieu’s theories in support of his own project of developing a ‘theory of social self-organization.’ But Fuchs engages with Bourdieu purely qua social theorist, never mentioning his empirical work or his distaste for social theory undisciplined by empirical research. Following the 2003 article Fuchs appears never to have engaged with Bourdieu again, and certainly not to have responded to Bourdieu’s twin rejections of ‘theoretical theory’ and orthodox Marxian theory, both of which have obvious relevance to Fuchs’ work. Despite Bourdieu’s prominence and antipathy toward strands of Critical Theory, critical theorists’ engagement with Bourdieu has been limited, and where they have engaged with him they have done so in a highly selective manner. This is a lost opportunity in light of the extraordinary output of research that has employed Bourdieu’s theoretical concepts (capital, field, habitus, symbolic power, symbolic violence, hysteresis) to analyze digital inequalities and other social patterns shaped by contemporary information communication technologies, research that would presumably be of great interest to advocates of CIT (Chapter 6).
Intellectual isolation [W]ithout an understanding of the structure of code, many features of our contemporary situation are impossible to explain or understand. We must, therefore, seek to open up the black box of digital technologies to critique. (Berry 2015: 26)
Critical Information theorists have not engaged deeply with contemporary philosophy of social science, the sociology of Pierre Bourdieu or the sociological literature on digital inequalities that employs Bourdieu’s conceptual vocabulary. This disengagement from research traditions of obvious relevance to their own concerns appears to be part of a broader pattern of intellectual isolation. For instance, Critical Information theorists have not engaged with other major intellectual trends such as the movements against essentialism and foundationalism identified with postmodernism. They have also been conspicuously silent on the central sociological topics of race, gender and sexuality.
Critical Information theorists’ disinterest in empirical research methods, including computational methods, has surely played a part in their intellectual isolation. In so far as Lash is correct in asserting that a viable critique of information ‘will have to come from inside the information order itself’ (2002: vii), Critical Information theorists, lacking computational skills and, for the most part, collaborations with researchers who have complementary computational skills, are unlikely to produce information critique that is technically sound. As outsiders viewing the computational sphere from afar, Critical Information theorists must rely on second-hand accounts produced by journalists and foundations. This has contributed to a pattern of misstatements regarding technical aspects of programming and social media. In his review of Berry’s Critical Theory and the Digital, the software developer Fox noted that Berry made fundamental errors regarding object-oriented programming. Fox points out that, contra Berry, object-oriented programming is but one of several available programming paradigms, and by no means a definitive model of what computation really entails (a point illustrated in the brief sketch at the end of this section). He questions Berry and Galloway’s claim that an object-oriented ontology is inherently ‘reifying’: Of course it is, from the point of view of a Critical Theory which regards more or less all putative ‘things’ as reified relations, created in the image of the commodity form. But object-oriented ontology is a rival theory, with rival claims of its own. Amongst these would be the claim that ‘objects are what they are’: that when we take things for things, we are not mistaking them for things – that is, reifying them – but apprehending a thing-like quality that they intrinsically possess. Berry is not obliged to accept object-oriented ontology as true, but he is also not entitled simply to assume without argument that it is false. (review31.co.uk/article/view/274/blue-screens) Berry proposes that Critical Information theorists must go beyond ‘screenic accounts’ of social media and other contemporary information technologies, but it is far from evident how they might accomplish this with no first-hand understanding of what is happening on the other side of the screen—i.e. without gaining technical skills or communicating with researchers and practitioners who do have such skills. Thus there would appear to be some validity to Fox’s conclusion that in Critical Theory and the Digital the ‘proposed encounter between Critical Theory and its digital “others” is reduced to a monologue in which the former tells the latter what is wrong with them’ (review31.co.uk/article/view/274/blue-screens).
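To make Fox’s point concrete, the following is a minimal sketch in Python, my own illustration rather than anything drawn from Berry, Galloway or Fox, of the same computation, a simple word-frequency count, written first in an object-oriented style and then in a functional style; the class and function names are invented for the example, and nothing about the computation itself requires objects.

from collections import Counter
from functools import reduce


# Object-oriented style: word counts held as mutable state inside an object.
class WordCounter:
    def __init__(self):
        self.counts = {}

    def add(self, text):
        for word in text.lower().split():
            self.counts[word] = self.counts.get(word, 0) + 1


# Functional style: the same computation as a fold over immutable inputs,
# with no user-defined objects and no mutable state.
def count_words(texts):
    return reduce(lambda acc, t: acc + Counter(t.lower().split()), texts, Counter())


texts = ["the medium is the message", "the message is the medium"]

oo = WordCounter()
for t in texts:
    oo.add(t)

print(oo.counts)
print(dict(count_words(texts)))

The point of the contrast is not that either style is superior, but that treating ‘the computational’ as fundamentally object-oriented mistakes one programming idiom for the nature of computation itself.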
Conclusions CIT encourages critical reflection on contemporary information technologies, the industries that produce them, and their social and political consequences. As a catalyst for public reflection on information technology there is much to be said for CIT. But is CIT generative of reliable knowledge, or, rather, of commentary? I think the answer is the latter: due to the problems and limitations discussed in this chapter, CIT can be fairly characterized as a nineteenth-century solution to the
intellectual challenges of the twenty-first century. Because it reproduces the ‘two cultures’ dichotomy that is widely recognized as an impediment to social science innovation, CIT is poorly positioned to respond to the challenges posed by new information communication technologies, whose workings are extraordinarily opaque to non-technologists, by new computational research methodologies, and by new approaches to sociological theory and the philosophy of social science. Contemporary economic, political, technological and social conditions demand newer and sharper tools; these conditions challenge sociologists to move past theoretical and methodological orthodoxies handed down from the occult milieu of the Weimar Republic.
5 FORENSIC SOCIAL SCIENCE
Introduction We need to develop a new, bigger sociological imagination that allows us to incorporate big social data rather than reinventing the wheel. That requires careful mining of our methodological and theoretical history, along with a reexamination of the ways in which we collect and use our data. (Halavais 2015: 583) What is the value of sociological theory in the information age? Where Critical Information Theory’s answer to this question takes the form of social philosophical reflection on the role of information in society—reflection that engages only selectively with empirical social science research—a second set of answers has emerged from within empirical sociology. Following McFarland, Lewis and Goldberg (2016), I refer to this second approach as Forensic Social Science (FSS). As a centrally important element of the classical scientific method, theory features in all the major scientific approaches to sociology. The theory that is used within such approaches is sometimes categorized as ‘empirical theory’ in contrast to ‘grand’ or ‘theoretical theory.’ Compared to these latter forms of sociological theory, empirical theory is less abstract; it involves the development and testing of meso theories and theoretical models of narrowly circumscribed objects of study rather than theories of society, social class, capitalism, culture or other broad categories of social phenomena. The meso theories that feature in empirical theory are theories of the ‘middle range’ (Merton 1949) that operate between the abstractions of grand theory on the one side and empirical research practice on the other. Drawing on empirically supported substantive theories and models, meso theories are common in sociology and psychology where researchers develop, test and refine relatively narrow theories related to specific social and psychological phenomena, such as theories of cognitive biases or of gender
discrimination in hiring. Closely related to meso theories, theoretical models are simplified, often schematic representations of complex social phenomena. They are models of cause–effect relations between small numbers of empirical phenomena that are used in almost all empirical social research, particularly in research that is done in a positivist mode of inquiry. Knowing capitalism, commercial sociology and big data present challenges for established ways of employing meso theories and theoretical models in the service of drawing inferences from social data. A major challenge involves fundamental changes in the practice of acquiring and sampling data. Big data and contemporary computational tools allow for access to data about basic social behaviors that have always been practiced but were previously rarely documented. Such data are often unprompted trace data, ‘digital footprints’ (Golder & Macy 2014) of actual social behaviors rather than information expressed by a particular individual in an artificial and narrowly delimited relation with a researcher, such as in the context of a social survey or ethnographic interview. Information in the form of digital traces of everyday activities does not require a research-driven hypothesis to be collected because it is a by-product of daily life activities such as digital device use or social media activity. This has major consequences for how, and whether, data is sampled. In contexts in which the complete set of events in question, such as all transactions between buyers and sellers on an electronic marketplace or participants on a social media platform, is documented, very large data sets and new computational tools may reduce or even completely eliminate the need to draw representative samples (see Mohr 2015, 2016; Savage 2012). Recognizing the fundamental nature of the challenges posed by big data to scientific sociology, researchers have begun to systematically consider the consequences of the comprehensive and unprompted nature of big data for theory, sampling and hypothesis testing. One recent attempt to systematize the implications of big data for empirical sociology is McFarland, Lewis and Goldberg’s (2016) proposal of what they term FSS. FSS is strikingly different from mainstream twentieth-century social science in that it involves using theories as sources of preliminary predictions in analysis of large data sets. Its goal is not primarily the development of grand theories; rather theories as such are less important than the skill of theorizing (Swedberg 2012), a skill that develops and is applied in attempts to analyze data for the purpose of answering focused questions about specific social phenomena (see Chapter 7). Like Merton’s theories of the middle range, FSS represents a middle ground between grand theory and atheoretical empiricism. McFarland, Lewis and Goldberg (2016) discuss how in the twenty-first century knowing capitalism, commercial sociology and big data have created a kind of ‘trading zone’ between industry and academic social science. They claim that in the short term, competition from data-driven social science, including from commercial sociology, will reduce demand for theory within academic research communities. But eventually the market for sociological theory will strengthen as researchers discover the limits of atheoretical data mining and turn to theory for guidance (2016: 32). McFarland, Lewis and Goldberg’s claim for FSS is that it provides a research paradigm for contemporary
sociology where big data is ascendant and the value of grand theory is unclear (Gane 2011; Lizardo 2014; Turner 2006). Thus FSS claims to provide for digital age social science what Mertonian middle range theory provided for empirical sociology in the twentieth century: a template for performing social science as regular science that uses theory as a means to allow research findings from diverse areas to communicate and cumulate. McFarland, Lewis and Goldberg contrast FSS with mainstream twentieth-century social science. Over the course of the twentieth century a ‘“paradigm” of surveys/methodological individualism’ (2016: 18) emerged in academic sociology. This paradigm, based in the United States where it became nearly hegemonic, was supported institutionally by the National Defense Education Act of 1958 and other Cold War programs that massively increased federal funding to colleges and universities. This funding benefited STEM fields the most but also supported positivist social science in the form of experimental psychology and quantitative survey analysis in sociology. In the post-war period, aside from the anti-disciplinary movement of the 1970s (Menand 2010), there was a clear trend in the social sciences toward quantification and appropriation of statistical methods from STEM fields. As computer technology advanced, statistical modeling became more sophisticated and prominent, and the social sciences began to converge on research paradigms that integrated statistical models. As a result of this convergence, sociology moved away from ethnographic community studies and adopted a methodological individualistic perspective … This perspective was evident in the nature of surveys asking for individual responses and viewpoints. It was also evident in statistical procedures that relied upon assumptions of independent observations … With the advent of ordinary least squares (OLS) regression, social science journals became replete with regression tables, OLS equations, path models, and a variable-centric viewpoint … With survey research and the accompanying statistical models came assumptions about social actors, their interrelation (or lack thereof), and even conceptions of time (as panels) … These assumptions were never believed to be an accurate conception of social life, nor a warrant for developing methodological individualism. Rather, they were acknowledged after the fact as necessary byproducts of performing regression-based research using survey data. (McFarland, Lewis, & Goldberg 2016: 14) As the paradigm of surveys and methodological individualism became central to mainstream sociology, it effectively institutionalized scientific deductivism and hypothesis testing. Deductivism entailed an approach to sociological theory in which theory precedes data collection and the goal of empirical research is to find statistical support for hypotheses derived from a preconceived theory or set of theories. In contrast to the twentieth-century package of scientific deductivism and middle-range theorizing, FSS is based on two related non-deductive inferential logics: abduction and induction. Using the former logic, researchers derive or develop provisional theories and use results from empirical analyses to adjust their theories.
A cycle of theorizing and hypothesis testing continues until a match is found between theory and empirical results. With the latter logic, researchers begin with data and build theory from the ground up rather than presupposing a theory and collecting and analyzing data that will support or refute it.
Digital abduction Abduction seems to be having its moment in the sun as researchers across the social sciences are advocating for it as an apposite logic for working with data with complex structures (Park 2016). Abduction is a category of inferential logic that is also known, approximately, as ‘inference to the best explanation’ (Lipton 2003). One weakness of deductive inference applied within the scientific method of testing hypotheses derived from theory is that it does not provide guidance about how theories, whether grand theories, middle-range theories or theoretical models, are discovered in the first place (Hoffmann 1999). Abduction, however, does account for theoretical innovation. It differs from other forms of inferential logic in that it involves an inference in which the conclusion is a hypothesis which can then be tested with a new or modified research design. The term was originally defined by the philosopher Charles Sanders Peirce, who claimed that for science to progress it was necessary for scientists to adopt hypotheses ‘as being suggested by the facts’ (Peirce 1901: 202–203). Abduction involves a creative process of producing new hypotheses and theories based on surprising research evidence. It ‘seeks no algorithm, but is a heuristic for luckily finding new things and creating insights’ (Bauer, Bicquelet & Suerdem 2014). In abduction there is a continual, creative juxtaposition between theories and empirical data. Abductive logic does not replace deduction and induction but bridges them iteratively. Where deductive reasoning begins with a theory and proceeds through an analysis which either supports the theory or falsifies it, and induction begins with a collection of data and proceeds by examining patterns in the data to develop an inference that supports a theory, abduction is the form of reasoning through which we perceive the phenomenon as related to other observations either in the sense that there is a cause and effect hidden from view, in the sense that the phenomenon is seen as similar to other phenomena already experienced and explained in other situations, or in the sense of creating new general descriptions. (Timmermans & Tavory 2012: 171) Abduction is the most conjectural inferential logic used in social science because it seeks a situational fit between theories and observed facts. It is a forensic form of logic in that it resembles the reasoning of detectives who interpret clues that permit a course of events to be reconstructed, or of doctors who make inferences about the presence or absence of illness based on patients’ symptoms. In big data research, abductive analysis does not outsource sociological analysis to computational tools but rather engages these tools in a process of critical interrogation. It
thus places more demand on theory as the analytical focus shifts from proving or disproving preconceived hypotheses to ‘figuring out how to structure a mountain of data into meaningful categories of knowledge’ (Goldberg 2015: 3). Abduction demands attention by sociologists interested in how sociological theory can contribute to digital social science because it features in cutting-edge thinking about sociological theory and sociological methodology in several areas at once. First, theorists such as Swedberg argue that abduction is centrally important for theorizing within empirical research projects. Second, abduction is compatible at a deep level with Bayesian inference (Gelman 2017), which is the dominant contemporary approach to statistical inference in most quantitatively oriented research fields. Third, several sociologists have taken note of the sales success of sociological ‘data books’ that make extensive use of abductive reasoning (Halford & Savage 2017). And, fourth, sociological ethnographers seeking to demonstrate the value of ethnographic methods for theory development have called for more explicit use of abductive inference (Timmermans & Tavory 2012; Tavory & Timmermans 2014), as have sociologists who use text mining methodologies (Bauer, Bicquelet & Suerdem 2014; Ignatow & Mihalcea 2016, 2017; Nelson 2017).
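As a rough illustration of how ‘inference to the best explanation’ can be operationalized in computational research, the following is a minimal Python sketch, not drawn from any of the studies cited above, in which several candidate explanatory models are scored against observed data, the best-fitting candidate is provisionally adopted, and a poor overall fit is treated as the surprise that sends the analyst back to theorizing. The simulated variables, candidate models and error threshold are all invented for the example.

import numpy as np

rng = np.random.default_rng(0)

# Simulated observations: daily hours of social media use for 200 people.
age = rng.uniform(18, 70, size=200)
income = rng.normal(50, 15, size=200)
observed = 6 - 0.06 * age + rng.normal(0, 0.8, size=200)  # in truth, age-driven

# Candidate 'explanations', each a simple predictive model of the outcome.
candidates = {
    "age explains use": lambda: np.polyval(np.polyfit(age, observed, 1), age),
    "income explains use": lambda: np.polyval(np.polyfit(income, observed, 1), income),
    "use is constant": lambda: np.full_like(observed, observed.mean()),
}

# Abductive step: provisionally adopt the explanation that best accounts
# for the observations (here, lowest mean squared error).
errors = {name: float(np.mean((observed - predict()) ** 2))
          for name, predict in candidates.items()}
best = min(errors, key=errors.get)
print("Best available explanation:", best)
print("Fit of each candidate:", errors)

# If even the best candidate leaves large residuals, that surprise is the
# cue to return to theorizing and propose a new explanation, restarting
# the iterative cycle described above.
if errors[best] > 1.0:
    print("Residual surprise remains; revise the theory and repeat.")

The sketch compresses into a few lines what in practice is an extended, often collaborative cycle of theorizing, analysis and revision.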
Static theories and iterative theorizing Arguments for abductive inference have emerged in the last decade from several sociological subfields. Swedberg (2012) has drawn on Peirce to argue that sociologists need training in theorizing rather than in theory as such. When theory is taught as a static body of knowledge, ‘what Durkheim-Weber-Bourdieu and so on have said’ (p. 2), students do not learn how to use theory in their own empirical work. According to Swedberg, theory operates in actual empirical research practice dynamically, iteratively and in close unison with observation. Theorizing draws on intuitive ways of thinking as much as on formal logic, beginning with abductive inference and flowing from there to deductive logic. Only once an initial hypothesis has been formed through abductive inference can deduction help develop the hypothesis by providing a plausible generalization or causal chain.
Bayesian inference A critical impediment to greater collaboration between social and computer scientists is that where the former employ primarily deductive logic to advance explanation, the latter employ primarily inductive logic to advance prediction (Wallach 2018). Sociological abduction can potentially bridge this gap, but abduction requires a complementary approach for statistical—and not only logical—inference. Bayesian inference, which is widely used in machine learning and deep learning, in the modeling of neural networks, and in agent-based modeling, is such an approach. There are important similarities between abductive inference and Bayesian statistics, which is an approach to statistical inference that is today widely used in almost all quantitatively oriented research fields. Sociology is an outlier here, as despite sociology’s early embrace of
social statistics (Hubbard 2015: 25), with few exceptions (Western 1999, 2001), Bayesian statistics are marginal within contemporary sociological research and instruction. Bayesian statistics is named for the Reverend Thomas Bayes (1701–1761); its core is the idea that scientists should use new evidence to update their confidence in causal theories. Gelman (2017) writes that, just as in abduction, where explanatory considerations make some hypotheses more credible and others less so, and in contrast to the hypothetico-deductive model of science, Bayesian statistics is all about generating and revising models, hypotheses and data analyses in response to surprising findings (see Gelman et al. 2013: ch. 6). Bayesian inference is based on the idea that confidence in a theory after new evidence is evaluated is a function of prior confidence times the evidential weight of the new data. The amount a theory is updated by new evidence is a function of both prior confidence in the theory and the weight of new evidence. Prior confidence in a theory or causal hypothesis derived from it is important because, if there is a large amount of theoretical and empirical knowledge suggesting that a theory is valid, only very strong new empirical evidence can further increase confidence in the theory. Bayesian empirical updating can go in the direction of either confirmation or disconfirmation, but it never reaches 100 percent or 0 percent confidence in a theory. This epistemologically probabilistic approach can serve as the inferential underpinning for both probabilistic and determinate claims (Howson & Urbach 2006). Although Bayesian confirmation theory makes no reference at all to the concept of explanation, several authors have recently argued that abduction is compatible with and complementary to Bayesianism. Lipton (2003: ch. 7) claims that Bayesians should be ‘explanationists’ (his name for advocates of abduction) and proposes, in essence, that Bayesians ought to determine their prior probabilities and, if applicable, likelihoods on the basis of explanatory considerations. But beyond prima facie similarities between Bayesian and abductive inference (Gelman 2017) and Lipton’s very general argument, precisely how Bayesian statistics can be used to evaluate theoretical explanations is a topic of ongoing debate. Okasha (2000), McGrew (2003) and others have suggested that explanatory considerations may serve as a heuristic to determine, even if only roughly, priors and likelihoods in cases in which no other information is available. Psillos (2000) proposed that abduction may assist in selecting plausible candidates for testing, where the actual testing then follows Bayesian lines.
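As an illustration of the updating logic described above, the following is a minimal Python sketch of Bayesian updating across two rival hypotheses; the prior and likelihood values are invented for the example.

# Posterior confidence in each rival hypothesis is proportional to prior
# confidence times the likelihood of the new evidence under that hypothesis.
def bayes_update(priors, likelihoods):
    unnormalized = {h: priors[h] * likelihoods[h] for h in priors}
    total = sum(unnormalized.values())
    return {h: value / total for h, value in unnormalized.items()}


# Prior confidence in two rival theories, shaped by existing knowledge.
priors = {"theory_A": 0.7, "theory_B": 0.3}

# How probable the newly observed evidence would be under each theory.
likelihoods = {"theory_A": 0.2, "theory_B": 0.6}

posteriors = bayes_update(priors, likelihoods)
print(posteriors)  # confidence shifts toward theory_B but reaches neither 0 nor 1

Because the posterior is a product of prior confidence and evidential weight, strong new evidence shifts confidence toward the better-supported hypothesis without ever driving either hypothesis to certainty or to zero.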
Data books In the digital age, what kind of sociology succeeds beyond the academic marketplace? Halford and Savage (2017) suggest that academic sociologists can benefit from being aware of the forms of sociological research that are able to have an impact in the public sphere. Where in the late twentieth century ‘theory books’ by Bauman, Beck, Bourdieu, Giddens and other luminaries were the public face of sociology, today ‘data books’ are more popular and influential. Popular data books are not atheoretical but rather use theory to guide data analysis much as a conductor guides a symphony orchestra by providing direction and organization.
Halford and Savage claim that a ‘symphonic approach’ to sociological data analysis can be seen in popular data books such as Putnam’s 2000 Bowling Alone on social capital; Piketty’s Capital in the Twenty-First Century (2014); and Wilkinson and Pickett’s The Spirit Level (2011) on the consequences of social inequality. These trade press books use powerful theoretical arguments to draw data from diverse sources, such as US Census data, the General Social Survey, and the World Incomes Database, into an argument that addresses an intellectual or policy puzzle. The theories these books employ include communitarian approaches to the value of social interaction (Putnam), economic theory (Piketty), and social psychology theories of shame and stigma (Wilkinson and Pickett). None of these sociologists introduce new theoretical ideas nor do they lay out their theoretical arguments deductively. Rather, they return to their theoretical points repeatedly as a kind of refrain and support them with extensive data visualizations. The analysis in popular data books such as these ‘demands abductive reasoning’ because only abductive reasoning ‘focuses on the unfolding interplay between data, method and theory and with regard to their co-constitution’ (Halford & Savage 2017: 11–12). In this mode of investigation, concepts, theories and methodological expertise are used to direct the process of knowledge discovery; data are used in turn to direct further investigations as well as processes of interpretation and theoretical development (see Kitchin 2014: 6).
Ethnography There is in sociology a long-standing question of the relationship between theory and ethnographic research methods. Ethnography never fit comfortably within the twentieth century’s dominant sociological research paradigm of scientific deductivism and methodological individualism. And indeed ethnographic research is associated with attempts to challenge the dominance of hypothesis-testing in social science methodology, such as Glaser and Strauss’s formulation of Grounded Theory in the 1960s. Because ethnographic data resembles big data in its richness and complexity, it is perhaps not surprising that sociological ethnographers have embraced abductive logic much as big data researchers have done. For instance, the ethnographers Timmermans and Tavory (2012; Tavory & Timmermans 2014) argue that while grounded theory based on inductive inferential logic has become a leading approach to theoretical innovation in qualitative sociology, few novel theories have emerged from the combination of grounded theory and qualitative research methods. They suggest that because induction does not, either logically or in practice, lead to theoretical insights, abduction rather than induction should be the guiding principle of theoretical development in ethnography. They make an additional claim that abductive analysis arises from researchers’ social and intellectual positions rather than from a mysterious mental discovery process, and that abductive discovery can be further aided by careful analysis of ethnographic data. Timmermans and Tavory outline several formal methodological steps that they claim can enrich abductive data analysis. A first step is revisiting the same observation again and again. Because perception is always ‘saturated’ (Marion 2002), as we
attend to phenomena over time we revisit our experience, and as we revisit it we re-experience it in different ways. The second proposed step is defamiliarization of the known world. Following Berger (1963), who discussed how one of the ways sociology enriches our understanding of social life is by estranging the familiar, Timmermans and Tavory suggest that writing ethnographic field notes is a technique of defamiliarization because we think differently in textual than we do in atextual activities. Writing ‘problematizes and crystallizes things that we would gloss over in atextual accounts’ (p. 177), allowing us to make detailed comparisons and recognize logical fallacies. Writing allows researchers to establish a distance from the taken for granted and apply the third methodological step, alternative casing, to observations. Alternative casing involves applying multiple theoretical lenses as a study progresses (Ragin & Becker 1992). Timmermans and Tavory recommend that sociologists force themselves to take a relatively small data sample and work through it in detail in light of their theoretical knowledge, trying to find as many ways to understand the data as possible. They conclude that their three methodological processes require of researchers a ‘gestalt switch in which the theoretical background is foregrounded as a way to set up empirical puzzles’ (p. 177). Using Timmermans and Tavory’s three proposed methodological processes allows ethnography to speak to theory without ethnographers presenting themselves as either theoretical atheists or monotheists. Instead they should be ‘informed theoretical agnostics’ (p. 169) who seek to explain surprises and puzzles encountered in their field work. Timmermans and Tavory discuss their own ethnographic research projects as examples of abductive theory construction in the service of solving an empirical puzzle. Timmermans’ (1999) study of resuscitative efforts in two emergency departments is a prime example. One empirical puzzle in his study was variation in the extent to which emergency staff were committed to saving a life during resuscitative efforts. While the official explanation for such variation was that in emergency medicine lifesaving efforts depended on the presence of vital signs, this explanation belied the reality of observed resuscitative efforts. In some situations the staff did not seem to want to regain a patient’s pulse, while for other patients they would continue resuscitating even when signs of death had obviously set in. Initial coding of the empirical data revealed differences in intent to bring the patient back to life along a variety of patient, staff and organizational characteristics; it was only when revisiting the data while reading a 1967 study of dying in hospitals by Sudnow that Timmermans was able to make sense of the variation within the emergency department. Sudnow had argued that dying in hospitals rests on a patient’s position in an institution-specific moral evaluation based on a calculation of how patient characteristics added up to a presumed social worth. It was Timmermans’ abductive move of applying this new theoretical understanding to his data that allowed him to solve the sociological puzzle of variation in resuscitative efforts within the hospital emergency departments he had studied.
Text mining Social researchers who work with large text data sets and text mining methodologies often advocate for abduction. As with ethnographic data, the sheer complexity of textual data poses a challenge to purely deductive hypothesis testing. Bauer, Bicquelet and Suerdem (2014) argue that with abduction, text analysis need not face a dilemma between ‘the Scylla of deduction on the one hand, and Charybdis of induction on the other’; they suggest abductive logic as a middle way out of this forced choice. Abduction is compatible with a number of text mining tools and is used in the early stages of many text mining research designs that include hypothesis tests. One example is Ruiz’s text analysis of transcriptions of discussions with Spanish manual workers. In these transcripts the workers are seen as criticizing the chauvinism and submissiveness of Moorish immigrants from Morocco. Ruiz describes his use of abduction but also of inductive and deductive logics of inference in his 2009 survey of sociological text analysis methods. More recently Nelson (2017) has proposed a three-step methodological framework for text mining, ‘computational grounded theory,’ based on a combination of abductive and inductive logic. Computational grounded theory combines expert knowledge and interpretive skills with contemporary computer technology. The first step in her proposed methodology, pattern detection, involves inductive computational exploration of text, using techniques such as unsupervised machine learning and word scores to help researchers recognize novel patterns in their data. The second step, pattern refinement, involves interpretive engagement with the data through qualitative deep reading. A third step, pattern confirmation, assesses the inductively identified patterns using further computational and natural language processing techniques. Nelson argues that this sequence of methodological steps is efficient, rigorous and reproducible for analyzing data such as transcribed speeches, interviews, open-ended survey data and ethnographic field notes (see Nelson 2019; Nelson, Burk, Knudsen & McCall 2018).
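As an illustration of the pattern-detection step, the following is a minimal Python sketch, not Nelson’s own code, that uses unsupervised topic modeling (via scikit-learn) to surface candidate patterns in a toy corpus; the documents and parameter settings are invented for the example, and the refinement and confirmation steps are left to the analyst’s interpretive and computational judgment.

from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

# A toy corpus standing in for interview transcripts or field notes.
documents = [
    "workers discussed wages and overtime at the factory",
    "the union meeting focused on wages and job security",
    "families shared recipes and holiday traditions at the gathering",
    "the festival celebrated local food and neighborhood traditions",
]

# Step 1, pattern detection: induce latent topics without prior coding.
vectorizer = CountVectorizer(stop_words="english")
doc_term_matrix = vectorizer.fit_transform(documents)

lda = LatentDirichletAllocation(n_components=2, random_state=0)
lda.fit(doc_term_matrix)

terms = vectorizer.get_feature_names_out()
for i, topic in enumerate(lda.components_):
    top_terms = [terms[j] for j in topic.argsort()[-4:][::-1]]
    print("Candidate pattern", i, ":", top_terms)

# Steps 2 and 3, pattern refinement and confirmation, would follow: close
# reading of the documents associated with each candidate pattern, then
# further computational checks of the refined interpretation.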
Research transparency There are solid reasons why abductive theorizing is finding new advocates in methodological fields as varied as ethnography and text mining. Iterative theorizing and data analysis may be essential for understanding relatively unstructured data or data with complex structures, as it is extraordinarily difficult to recognize patterns in such data with a single-shot high-stakes hypothesis test. As Timmermans, Tavory, Swedberg and others have noted, complex empirical data is less amenable to testing established theories than it is to theorizing, a skill developed through repeated close engagement with both theory and data that is often performed in collaborative research settings. Besides being suitable for analyzing unstructured data and, in principle, compatible with Bayesian statistical inference and machine learning, abduction has the added advantage of transparently representing how social science is often conducted. This is currently a somewhat less pressing concern in sociology than it is in research psychology and the life sciences. In these latter fields there are widespread
concerns about the validity and reproducibility of published research. Sociology, however, is not immune to several ‘diseases’ that threaten science (Gerber & Malhotra 2008), including inter alia significosis, an inordinate focus on statistically significant results; neophilia, an excessive appreciation for novelty; theorrhea, a mania for new theory; arigorium, a deficiency of rigor in theoretical and empirical work; and finally, disjunctivitis, a proclivity to produce large quantities of redundant, trivial, and incoherent works. (Antonakis 2017: 5) At the root of many of these ills is what is known as publication bias. Social scientists have been concerned about publication bias since the 1960s (e.g. Bakan 1966), and these concerns have not diminished over time. Publication bias occurs when the outcome of a research study influences whether or not it is published. Driven by a combination of ultra-competitive publish-or-perish research environments and journal editors’ unwillingness to publish studies with results that are not significant at p