
Changing Face of Information: Support Services for Scientific Research

Current Topics in Library and Information Practice

David J. Brown

Changing Face of Information: Support Services for Scientific Research

ISBN 978-3-11-064553-8
e-ISBN (PDF) 978-3-11-065077-8
e-ISBN (EPUB) 978-3-11-065150-8
ISSN 2191-2742
Library of Congress Control Number: 2019955252

Bibliographic information published by the Deutsche Nationalbibliothek
The Deutsche Nationalbibliothek lists this publication in the Deutsche Nationalbibliografie; detailed bibliographic data are available on the Internet at http://dnb.dnb.de.

© 2020 Walter de Gruyter GmbH, Berlin/Boston
Typesetting: Integra Software Services Pvt. Ltd.
Printing and binding: CPI books GmbH, Leck
www.degruyter.com

FOREWORD

This book addresses the extent to which publishing in, and for, the sciences is facing a paradigm shift brought about by a “perfect storm” of diverse and powerful external trends. These often-global trends impact on a tight, specialised and introspective research system which, in terms of the way it communicates, has witnessed little change over recent decades. The premise of this book is that these external drivers are leading towards greater “democratisation” of the scientific publishing process, created by more openness, interactivity, sharing and reach within an Internet-enabled and more science-aware society.

But there are barriers which need to be overcome before scientific publishing (or STEM – science, technology, engineering and mathematics – publishing) can be brought fully into tune with emerging and powerful external social, technological, administrative and commercial contexts. This analysis describes the challenges facing the industry and offers suggestions as to how they may be overcome. Examples of the problems include how to ensure quality in a world enveloped by ‘white noise’ and digital data. Another is how to cope with the ‘serials crisis’ facing STEM publishing, where supply factors are not aligned with demand. There are complaints about the business models which have been adopted in this sector. A further problem is how to make STEM more engaged with satisfying the information needs of a broader and expanding community of unaffiliated knowledge workers (UKWs) – a community which could benefit from easy access to STEM research outputs. STEM publishing also needs to adapt to new online communication mechanisms which render traditional publishing methods slow and outdated.

These challenges are exacerbated by the negative effects the Internet is allegedly having on individuals, democracy and society, as suggested by influential writers such as Keen (2015), Carr (2010) and Greenfield (2014) amongst others. Though the Internet offers scope for greater involvement within a wider community, there are weaknesses at a macro level in a system which enables fraudulence, divisiveness and anti-social activity to operate alongside the Internet’s openness, scale and freedom.

Addressing these issues involves a strategic rather than operational approach. It raises questions about the function, efficiency, effectiveness and viability of the system of communication within the scientific research process, and how the external environment will force changes to what has hitherto been a cautious, conservative, stable and, in many instances, highly profitable industry sector.

It presupposes a significant cultural and technological change in the world of research. STEM publishing is also part of the same transformative trends which are affecting other sectors of modern society. The recommendation is that more effort and resources be given to developing long-term strategic scenarios to help current and future stakeholders come to terms with what the future for science and research holds.

Contents

FOREWORD
OVERVIEW
ACKNOWLEDGEMENTS
1 INTRODUCTION
  REASONS FOR INVESTIGATING THIS TOPIC
  SCOPE FOR THIS REPORT
  AIMS
  OBJECTIVES
2 LITERATURE ANALYSIS
  STEM Described in Informal Literature
  Assessment of Commentaries
  The Information Society
  The Information Economy
  The Networked Society
  Global Information Trends
  Environmental Developments
  A “Perfect Storm”
3 ENVIRONMENTAL AGENTS FOR CHANGE
  Chaos Theory
  TECHNOLOGICAL TRENDS
  SOCIOLOGICAL CHANGES
  DEMOGRAPHY
  CULTURE
  LEGAL ISSUES
  SOCIAL MEDIA
  COMMERCIAL ISSUES
  PUBLISHING AND STEM DEVELOPMENTS
  RESEARCHER BEHAVIOUR
  WORK PROCESSES
  RESEARCH DEVELOPMENT
  SCIENCE POLICY ISSUES
  Summary
4 STEM DYSFUNCTIONALITY
  STEM PUBLISHING SECTOR
  BUSINESS MODELS
  A DYSFUNCTIONAL STEM
  THE SCIENTIFIC JOURNAL
  FUTURE STEM COMMUNICATION TRENDS
5 UNAFFILIATED KNOWLEDGE WORKERS
  OVERVIEW OF UKWs
  KNOWLEDGE WORKERS
  PROFESSIONALS
  SMALL AND MEDIUM ENTERPRISES – SMEs
  CITIZEN SCIENTISTS
  ACADEMICS
  THE GENERAL PUBLIC
  Summary
6 UK LEARNED SOCIETIES
  UK Learned Societies
  Journals and Professional Societies
  Example of Learned Society Publishing Policy
  EDP Study on Learned Societies and Open Access
  Collaboration Among Learned Societies
  Learned Society Robustness
  Libraries Within Learned Societies
  A Distinctive Community
  Future Strategy for Learned Societies
  Summary
7 ASSESSMENT
  ROLE OF DEMOCRACY
  OVERVIEW OF RESEARCH RESULTS
  ESTIMATE OF LATENT DEMAND
  IMPLICATIONS FOR KNOWLEDGE WORKERS
  IMPACT ON STAKEHOLDERS – SWOT ANALYSIS
8 CONCLUSIONS
  A Different Approach
  What Practical Initiatives are Required?
  An Action Programme
9 APPENDICES
  BIBLIOGRAPHY
  ACRONYMS
  DEFINITIONS
INDEX

OVERVIEW

There are several interconnected global issues facing STEM publishing; the following will disentangle these and explore their relevance for the future health and viability of science communication during the next five to ten years.

Democracy

Many futurologists and evolutionists claim that the information society is at a crossroads. Projections of a steady, easy and gradual transition from a structured, traditional print-based system into the new open, interactive and digital information economy have been questioned. Digitisation, the Internet and the World Wide Web have changed the technical infrastructure and business conditions more significantly than at any other time since the development of the printing press and the arrival of the first scientific journals, in the fifteenth and seventeenth centuries respectively.

These new features lead, arguably, to an increase in democracy within traditional science and research communities. They facilitate more interaction, support increased interactivity and offer greater opportunities to innovate in a rapidly evolving digital world. Optimists foresee dramatic improvements in the way scientific research is conducted and disseminated during the next decade.

However, there is another school which argues against such positivity. The negative impact of digital developments on industrial and merchandising sectors during the past decade has been dramatic, the fallout being the demise of many leading organisations and industries, and a workforce forced to reskill. The consequences have led to a more divisive society in terms of education, wealth and political leanings (Keen 2015).

This dichotomy exists within STEM (science, technology, engineering and mathematics) publishing. Increased democracy, or wider participation, is confronted by protective barriers within STEM which hold on to past traditions to ensure quality standards are upheld. Dynamic external changes face a largely cautious, conservative industry sector. How this confrontation between openness/democracy and holding on to the past is resolved will determine the future size, structure and influence of the STEM publishing sector. Crucial to this is finding the optimal business system to ensure that the needs and requirements of the various target markets and stakeholders are effectively met.

Business Models

A barrier to an open science society is the business model inherited from a bygone age. Reliance on it has resulted in claims that STEM has become “dysfunctional”. Traditional models have focused on publishers selling subscriptions or licences for research journals and books to libraries. This has been profitable for many STEM publishers and, during recent decades of heavy public investment in science and engineering, has become the accepted model for disseminating research results. But as new technology impacts on information services, and austerity hits library budgets, the toll-based business model has come under scrutiny.

In business terms, the emphasis now is on “market niches”, “platforms” and “networks” rather than brands. Brands, which were the Holy Grail for journals, are no longer such valuable assets. Instead, the importance of establishing relationships between producer and consumer has become paramount. This relationship has been identified as “prosumption” (Tapscott and Williams 2006) – an active collaboration between creators and users, rather than just presenting something to a static, unresponsive (or supply-driven) market. Speed and abundance of information also help to distinguish the Old (print-based) from the New (digital-based) information environments.

The day of the research article is by no means over. It still has a role, but in future it will have links to other supporting media forms and will not remain an isolated entity without ties to the primary evidence which supports the article’s content. Nor will it remain the dominant communication device for disseminating research outputs; other, more appropriate information services are becoming available.

During the past decade Open Access (OA), enabling access which is free at the point of usage, has become a business model which challenges the reliance on traditional subscription and licence arrangements. OA is gaining traction within funding agencies and libraries, though it remains a small share of current STEM activity. There are nevertheless many public initiatives to force a switch, including a consortium of European funding and research agencies agreeing in September 2018 to “Plan S”. Plan S puts pressure on existing publishing business models by capping article fees, ending embargoes and withdrawing support for “hybrid” OA journals.

In parallel with these new developments there are “pirates” in the private sector who offer access to published articles for free or at low prices. This includes Sci-Hub, an article delivery service currently run by a researcher in hiding to escape the legal sanctions so far successfully imposed on the service by leading publishers, including Elsevier and the American Chemical Society (ACS). Despite its legal frailty, Sci-Hub generates considerable ground-level support from researchers and librarians.

ResearchGate is another of several new services which have been created by researchers operating at the research coalface and which undermine the increasingly challenged subscription business model in the name of developing more appropriate, timely and cost-effective information services. This operation, too, has been the subject of a lawsuit lodged by both Elsevier and ACS in 2018; the complaint concerns the massive infringement of peer-reviewed published journal articles to which ResearchGate provides easy access.

There is, therefore, a body of business opinion which argues that the current state of STEM financing faces severe challenges as it adapts to the new research environment. At an equally generic level, the journalist George Monbiot, writing in The Guardian (September 13, 2018), suggested that “As the system has begun to creak, government agencies have at last summoned the courage to do what they should have done decades ago, and demand the democratisation of knowledge” (Monbiot 2018).

Broader Reach for Scientific Publications

Under ideal conditions, an enlightened democracy can create opportunities for latent talent within society to come to the fore – opportunities which are restricted by traditional structures and procedures. A currently disenfranchised social group, with expertise outside the core of the science endeavour, could possibly enhance the scientific research process. It could become a new market for scientific research results. Equally tantalising is the prospect of greater collaborative partnership in the scientific endeavour itself between researchers and the hitherto disenfranchised community or communities.

Such a “new digital market” for scientific information is currently largely unexploited. Its participation in the scientific endeavour is prevented by the restrictive commercial business model described above. Nevertheless, if a wider market for scientific communication could be opened up, the benefits to science, society and the economy generally would be considerable.

Stability and trust within scientific institutions have relied on established and accepted commercial arrangements. These include the condition that access to formally published scientific research output is only given to those institutions which agree to a tight licensing arrangement with the publisher. This precludes those institutions from providing access to a wider market beyond the well-defined local community.

Such disenfranchised sectors include academic-trained knowledge workers who operate outside academia, in the private, business or government sectors.

They include a variety of sci/tech-based professionals, SMEs (small and medium [research] enterprises) and the growing community of citizen or amateur scientists. They all face financial barriers in trying to get access to results and data which could be of use to them in their specific research interests and activities. Several of these institutions – such as the established professions – are themselves in a state of turmoil, facing disruptive forces similar to those facing STEM (Susskind 2015).

For the purposes of this analysis, these information-disenfranchised sectors are described as “unaffiliated knowledge workers” or UKWs. Out of a global population of over seven billion individuals, one estimate puts knowledge workers at 500 million people worldwide (Microsoft 2010). This compares with a hard core of approximately seven million “enfranchised” researchers globally (Unesco 2015) and fourteen million authors who have published in the market leader’s journals (The Guardian 2017).

Demography

There are also changes occurring in a nation’s demography. A young generation of Net Gens is arising, fully in tune with digital technology and the Internet from an early, formative age. Their patience with a print-based culture wears thin once they have experienced the attractions of modern technology and its ability to increase speed of access, online collaboration and free interactivity through new media. They have grown up able to use these features in a way mysterious to the older generation of silver surfers. This young generation of school leavers and students will become the future researchers, with a mindset more in tune with adapting to the power of new information technology in support of research activities, and no longer accepting of the traditional way scientific research has been packaged and disseminated. They include a subset referred to as early career researchers (ECRs) (Nicholas 2018).

In addition, the higher education system continues to create a “fire hose” of young graduates from academia with advanced educational attainments, who migrate into knowledge work generally in the private and public sectors. They have been educated to become science-aware. In the UK about 20% of graduates remain in academia; the rest, moving away from the information-rich higher education institutions, confront financial barriers to accessing STEM even though the published information may prove useful to them in their new careers outside academia.

There are also geographic differences.

Significant changes are occurring in Asian communities, which are not so much playing catch-up with the more mature research countries in the west as leading the way in the adoption of social media and other platforms for gathering research data. China, in particular, has arisen as a powerhouse to challenge western countries in its support for research in applied science and in its publishing support services.

Research Change

Greater collaborative research is being undertaken, both nationally and globally. “Big Science” and “Big Data” have become features of the new digital research process, eclipsing the singleton approach with its so-called “publish or perish” mentality.

At the individual researcher level, the neuroscience community claims that the change in the way research information is gathered leaves its mark on synapses and neuron activity within the brain. Instead of in-depth reading, researchers now skim brief commentaries using gateway services and search engines such as Google to satisfy their primary information needs, placing a future demand on information artefacts which are short, succinct nuggets or metadata rather than lengthy refereed articles or books. As one commentator has admitted, Google is making him and his colleagues “stupid” (Carr 2008), on the basis that a focus on metadata services leaves no quiet contemplative space within which to indulge in creative and serendipitous thought. To counter this, new formats providing audiences with different forms of scientific content become a derivative of the new information order. They are being created by pioneering researchers at the cutting edge of communication and science, rather than by the traditional stakeholders, the STEM publishers.

Also, there is a growing emphasis on sharing and collaboration within the research process. Notably among the younger generation of researchers, the aim is to find meaningful opportunities to participate, share and collaborate. In combining these three functions they find new ways of organising that are more transparent, cheaper and less top-down than hitherto. This facilitates new ways of creating and disseminating STEM.

Technology

These cultural trends are created, and reinforced, by technologies which provide ever faster means of communication, both at the hardware level and through access to more powerful networking facilities on the web and the Internet.

In combination, such technological advances enhance collaborative and multidisciplinary research. They also support multi-format research outputs, many having their roots in the Internet and social media. As a result of the Internet, more individuals than ever before have access to information which can improve education, enhance innovation and reduce the differential opportunities which bedevil global societies. The web matters because it allows more people to share ideas and information with more people in more ways.

However, many pundits are unsure whether an Internet-based society will leave individuals feeling more or less in control of their lives. Whilst openness, interactivity and collaboration are enhanced, the Internet also becomes a source of stalking, erosion of privacy, terrorism and information overload. It could destroy what is valuable in society – culture, education, cohesiveness and expertise.

The future role of scientific communication through the new social media remains uncertain. In a world of social media, the published “version of record” of scientific research is no longer the primary means of alerting the community to the latest developments in its area of interest. Instead, there is a proliferation of preprints and data exchanges among and between colleagues, peers and even competitors. The attractiveness of early online disclosure of findings and discussion with colleagues is an important facet of the new research process. It requires appropriate carriers to convey research outputs – new carriers which marginalise the published article in terms of speed, impact and relevance. In this context, informal exchange becomes a useful adjunct to the formal research process. The extent to which such informal systems could eventually eliminate the established primary journal article as a forum for information exchange, and particularly for inter-peer communication, is uncertain; but erosion of the primacy of the learned journal article remains a pertinent issue to be addressed during the forthcoming decade.

Of topical interest is the rise of blockchain technologies in service industries, which could also leave their mark on scientific communication. Blockchain’s decentralised technology, in enabling self-regulation of information and data, provides platforms which allow participants to submit their findings in whatever format, without the need for refereeing or formal certification. It also incorporates transactional systems – cryptocurrencies such as Bitcoin – which bypass current payment and banking procedures and create opportunities for alternative business models to be developed.

Also, automation, incorporating advanced computer technologies, is set to transform the research process, bringing new ways of disseminating output from the research effort. As commented by Eric Schmidt, executive chairman of Google, “There is a race between technology and people” (World Economic Forum, Davos, Switzerland, February 2014) – a race which will determine the research agenda and its moral codes.

Placing controls so that artificial intelligence, machine learning and cognitive computing operate within guidelines which ensure that the longer-term welfare needs of society (and not just immediate commercial requirements) take precedence is a new challenge which research administrators will need to face. The concern is that programmers are being unleashed without controls on their activities, and that this could damage society’s crucial mores, values and institutions. The rise of cyber-attacks, online bullying and defamation, and other undesirable aspects of social media, stems from the difference between communication taking place in the formal, structured way of traditional STEM and the uncontrolled, anything-goes nature of the new Twitter-world of testiness.

STEM (Science, Technology, Engineering and Mathematics) Publishing Industry

The barriers imposed by scientific publishers to protect their intellectual property and commercial operations are being scrutinised at a time when hardcore print-based scientific publishing is threatened by the emergence of the open access and ‘free’ (or freemium) information world. Overall, STEM journals currently constitute a US $9.9 billion industry sector, accustomed in the past to double-digit annual profit margins (Simba 2017). Elsevier/RELX’s 40% profit margins are a double-edged sword for the company – on one hand demonstrating its corporate efficiency to investors, on the other attracting criticism over the morality of diverting scarce research resources from a publicly funded scientific effort into the pockets of corporations and individual shareholders in the commercial sector.

Novel ways of meeting a broader spectrum of researchers’ needs are being developed to give more effective targeting of information – text, data, software, multimedia – to those who need it. At the same time, pollution within the traditional system exists in so-called “predatory” or “illegitimate” journals, each containing low-quality, poorly reported and sometimes fraudulent research results. Adding a greater social media mix without acceptable systems for monitoring or quality control would exacerbate the growing amount of digital noise.

Future Threats

New, large, multinational organisations have emerged and begun to control the way individuals access information. They are a challenge to democratisation.

Google, Apple, Facebook and Amazon (GAFA) – each of these behemoths has an agenda which involves maintaining control over its customers, supported by its substantial financial power. They have become the information market in their respective sectors, with the monopolistic power that this brings. They have the power to mould what information will be disseminated and how. Facebook is currently (2018) being challenged over the use it makes of freely submitted private information for purposes not agreed to by the individuals concerned. Democracy could be a casualty as GAFA’s commercial and strategic policies increase their power as advertising agents, at the expense of offering an equitable approach to quality independent editorial content (Foer 2017) within scientific research. The GAFA powerbase – their collection of data about users’ online footprints in all arenas – can only be limited through more government intervention, worldwide, to enforce greater transparency and openness (Keen 2015).

The GAFAs in turn face challenges from a few of the larger STEM publishers – Elsevier, Nature (Digital Science), Springer, Wiley, Sage – which have refocused part of their operations to develop analytic software in support of the research process, rather than just focusing on the publication of research articles and texts. They implement new software solutions to aid researchers. Whilst these new informatic areas are being explored by the few big commercial and society publishers, the main body of publishers and librarians remains committed to keeping the old print-based system alive, despite concerns about its fitness for purpose and their own survival prospects in a fully digital economy.

Neither corporate size nor scale is an exclusive requirement for implementing new information systems in the Internet world. Small, entrepreneurial organisations are being established at the fringes of the scientific research effort. They are masterminded by scientists who recognise the value in adopting different research procedures relevant to their own subject fields. Novel services are being used, including www.91lib.com, an illegal web platform popular in China, as well as Library Genesis, which some researchers use. Few researchers appear to regard ResearchGate as questionable; it has gone from being a disruptor service to becoming a mainstay. Other novel services are also emerging, including GitHub, MedSci, WeChat and YouTube (Nicholas 2018). Several of these organisations do not follow the traditional modus operandi within STEM, and incursions into sanctified intellectual property rights are made with impunity. They are often supportive of a more democratic and open approach to STEM. Some, such as Sci-Hub, go further, with political aspirations along the lines of offering a science commons or communism (Graber-Stiehl 2018). These new entrants challenge the established paradigm and undermine the classic subscription/licensing business, the mainstay of STEM journal publishers.

Full adoption of new research processes and STEM information dissemination procedures depends on cooperation among many stakeholders in the research arena – funders, governments, research institutions, universities, libraries, distributors, publishers and, notably, individual researchers. Each will have different requirements for what it judges to be an improved system, and each will have different take-up procedures for ensuring acceptance of an effective new paradigm. Their often-diversified, mutually competitive approaches lead to an overall conservative, cautious approach to new academic support processes within STEM. This could artificially hold back the rate of change which STEM needs to undergo and prolong its existing infrastructure well beyond its sell-by date.

Future Approach

The conclusion is that a major paradigm shift is underway. The current squabbles between existing STEM stakeholders around pricing, ownership and copyright are preventing fundamental, longer-term strategic issues from being addressed. In effect, extrapolation of past trends to provide a vision of the new future is no longer appropriate. Smooth growth lines will be broken as new digital opportunities emerge, as new procedures are developed, as new players arise, as new business models are adopted and as a new digitally-based information environment arrives. It is no longer an evolution but rather a distinctive break in the chain of progress – an industry revolution. Evidence of the trail of disaster left by an unwillingness to adapt in tune with a changing market environment can be seen in many areas of society, such as music and retail. The traditional legacy of STEM is of historical interest, but it is not relevant in coming to terms with the “perfect storm” or tsunami of integrated forces which are about to envelop the research scene.

Extrapolation of past trends could be replaced by more innovative forecasting techniques, embracing the agendas of all potential stakeholders and their approaches to STEM in its broadest conception. Such approaches, instead of relying on questionnaires and exploring past behavioural trends, could adopt a Delphic approach, incorporating views from left field as its database and evidence for analysis. An example is the “Research Futures” project from Elsevier and Ipsos, published in 2019, which included over 50 experts in its analysis of the impact of future developments on science to 2030.

Some may see the demise of the STEM industry (as currently structured) as a positive outcome, particularly those who have been critical of the high profit margins reaped by market leaders such as Elsevier in recent years.

However, it is incumbent on the STEM industry at large not to defend the indefensible but to marshal energies across all stakeholder sectors to develop new communication systems which are fit for purpose in a new scientific world where results are shared in a digital, fast and efficient way, whilst noise is eliminated and the quality of product and service is assured. New systems are required which are acceptable, affordable and sustainable – a challenging set of requirements given current circumstances and industry structures.

ACKNOWLEDGEMENTS

The difficulties researchers face in accessing relevant published outputs have been described in several recent studies, one of which was a research project undertaken by CIBER (formerly a department within University College London) in 2010 – the so-called “Gaps and Barriers” project (Rowlands and Nicholas 2011). As a member of the “Gaps” research team, this author investigated researchers’ information activities and what additional research would be required to understand their needs and habits. This would provide market evidence on which strategic assessments of the future of the STEM publishing process could be based.

However, interest in this topic dates back further, to the seven to eight years during which the author was a member of the senior management executive team at the British Library, when awareness arose of the generic challenges which broad sectors of UK society faced in getting hold of relevant published research literature. Discussions with the then director of information technology at the British Library, Richard Boulderstone, highlighted the distinction between meritocracy, or elitism, and democracy as underlying themes within the STEM sector. Prior to the British Library the author spent ten years undertaking market research and new business development within Elsevier Science in Amsterdam, and subsequently with Pergamon Press in Oxford. Contacts maintained with Elsevier staff and its alumni enabled current publishing perspectives to be analysed.

The author was also involved with several international intermediaries in the scientific communication arena, both in the UK and USA. As director of the Ingenta Institute in the early 2000s the author organised conferences to explore future developments in the information industry, and this interest was reinforced as co-editor of a newsletter entitled Scholarly Communications Report, published as an independent monthly bulletin from 1998 to 2010. It highlighted emerging challenges and new developments relevant to the STEM industry during this volatile period.

Therefore, the issues addressed in this study are viewed from the perspectives of each of the main stakeholders in the scientific information industry – scholarly publisher, librarian, intermediary, quasi-journalist, author and, most recently, honorary research associate at UCL – and the views are not restricted to, nor promoted by, any one group or sector of the information society. The issues tackled in this study are such that an impartial approach was vital, one without preconditioned sectoral agendas.

Scientific research and STEM publishing have been dominated in the past by a singleton approach, with individual researchers protecting the results of their work so that they gain kudos, promotion, tenure and/or further research funding. This led to “publish or perish” becoming a driving force in authorship, thereby exposing the competitive approach in the way publishing operated. However, neurological research during the past decade has highlighted the benefits of the group or “hive” mind, with many experts combining to move a project forward along several fronts and relying on unique and in some cases non-academic specialisms. This, combined with the evolution of Big Science research projects and Big Data outputs, has shown that scale, collaboration and cooperation have become important. The Internet has given wider access to the research process, not only reaching out to newly science-enfranchised sectors of society but also enabling them to participate as partners and collaborators. Group activity is becoming a powerful feature of STEM at the expense of singleton activity.

All these trends lead to democracy taking a greater role in the current and future debate about scientific research and Science. “The task of democracy”, Dewey claims, “is forever that of creation of a freer and more humane experience in which all share and to which all contribute”. For STEM, this involves bringing the disenfranchised knowledge workers into the scientific endeavour. But whilst in the early days of the Internet, during the first period of Internet growth from 1990 to 2000, there was enthusiastic support for a principled approach to keeping the Internet non-commercial, this has changed in the current third phase of the Internet (2010 onwards), in which the large Silicon Valley innovators have made the Internet a data resource for capturing the mindset of the community and from this have earned billions in advertising revenues (Keen 2015). Purist democracy is potentially the net sufferer.

These issues provide the framework for this book. It combines the traditional professionalism within scientific research, with its protected and restricted access policies on research outputs, with the broader view of openness and collaboration which the digital economy and the Internet have brought about. There are inherent clashes in culture and approach between the two, and this analysis explores some of the areas where dangerous conflicts do and may occur.

Many external experts were contacted for their views. However, none bears responsibility for the structure of the book, nor for its content, interpretations, conclusions or recommendations.

Though research ethics required that comments from those interviewed as part of a postgraduate study were accurately encapsulated, this is nonetheless a post-positivist assessment of the current state of scientific publishing, overlain by as much hard evidence as is available in support of the main finding: that change is imminent, substantial and will affect a wide spectrum of research communication activities. Key among these is the impact change will have on a wider community of scholars beyond academia, notably the “unaffiliated knowledge workers” or UKWs. Their inclusion within a future STEM research process will help achieve a wider, healthier, more open and more democratic research society.

1 INTRODUCTION

REASONS FOR INVESTIGATING THIS TOPIC

In recent years, funding agencies and the media have focused on the commercial and operational challenges facing the scientific publishing system, usually referred to as STEM (science, technology, engineering and mathematics) publishing. By contrast, comparatively little attention has been given to longer-term strategic issues.

There is concern about the rationale behind an industry amounting to $25.2 billion a year, of which almost $10 billion comes from journals, and which generates profit margins of 35–40% for the handful of leading academic publishers who dominate the sector. These margins are sometimes far greater than those of the global tech companies which have come under recent public scrutiny, such as Apple, Facebook, Twitter and Google. Besides concern about haemorrhaging science funds away from the research sector into a few profitable commercial operations, there is a related worry about the reliance of these publishing companies on a paywall business model which restricts access to those sitting behind academic walls, at the expense of serving a wider science-aware community. The current system is elitist and highly delineated. Commentaries by pundits in STEM are assessed to see whether they offer solutions to resolve such commercial concerns.

Other concerns focus on optimising research output formats to meet emerging new user demands in a digital world; on attracting participation from groups with wider, more practical or applied but nevertheless relevant skill sets; on ensuring an equitable quality control system is in place which judiciously eliminates noise and information overload; and on identifying the status and role of STEM as a utility within society. In theory, participation by knowledge workers and a more ‘research aware’ public could enhance R&D efficiency and generate additional outcomes and resources which would in turn feed back into providing support for future funding activities in research. A virtuous circle would be created. If appropriate changes are required to the communication system, what would this mean for the health of the scientific information process and the commercial viability of the current stakeholders involved?

There are problems which prevent a healthy STEM publication system being introduced.

The dysfunctionality of the present STEM publication system has been highlighted by The Guardian journalists Monbiot (2011) and Brown (2009); by academics such as Gowers (2014), Murray-Rust (2014) and Allington (2013); by independent observers such as Susskind (2015); by government agencies such as the UK Office of Fair Trading (UKOFT 2002); by Jisc, and within the Finch report (RIN 2012); and by American commentators such as Nielsen (2009), Esposito (2013) and Shirky (2008), among many others. These informed commentators pointed to the many weaknesses facing STEM in adapting to a rapidly changing socio-technical and business environment, and to the alleged avarice of several STEM publishers.

The benefit in analysing this topic is that it is both timely and significant. New sources of revenue could be tapped if business models could be introduced which push all the right buttons required by a digital research community and which also make science results available to a new science-aware audience. The challenge is to bring the changes which are likely to occur together within one approach whilst ensuring that the objective of a sustainable and viable business proposition for the industry as a whole can be met.

Some vital questions need to be answered. For example, how can a viable commercial future for STEM be reconciled with the growing political pressures to achieve openness? How can the need to disseminate highly technical, esoteric research results be combined with broadening the reach of these results in dumbed-down versions suitable for a more general but interested audience? How can book and journal publishing survive in the face of totally new formats for research dissemination which are based on raw datasets and non-text media, and what agencies will emerge to support new forms of data accreditation and dissemination? How can social media and social networking be integrated effectively into a traditional, formal approach to STEM?

As will be explained in the following report, scientific communication is currently in a volatile and vulnerable state, which raises many questions about how it can be structured over the next decade. The need for the current system to be fixed is highlighted by the results of an Ithaka S+R survey of UK academics (Ithaka 2016), which reported that “There is a growing interest from academics in reaching audiences outside those in academia with their research [findings]”. Compared with an earlier Ithaka study in 2012, there was a significant increase in respondents who said that non-academic professionals, undergraduate students and the general public are all important audiences to target with research findings. These wider communities of potential users have not been considered in the traditional marketing strategies adopted by STEM publishers. An academic, insular position, rather than an open, democratic one, has been the norm. A translation of content to convey the essence of the expert’s conclusions, whilst enabling a vast community of disenfranchised knowledge workers to be reached, is lacking in scale.

The above were the starting points, providing both the stimulus and the setting for this analysis’s conceptual framework.

SCOPE FOR THIS REPORT

Several practical issues are covered in this analysis.
– This is primarily a UK-focused research project, even though the issues are global. Comparative international data in this area is often lacking. It is recommended that future iterations of this project should add to the available international statistical evidence. In the meantime, it remains a weakness of the STEM system that quantified, quality data about market aspects of the business is currently lacking; this needs to be addressed in a coordinated and systematic way.
– This project is commercial and strategic as well as academic in its approach. A commercial assessment of market size, trends and business models informs on how far STEM has become dysfunctional and identifies the extent of change currently occurring. The strategic focus addresses the viability of new digital means of communicating specialised information and how users are changing. Future investments in the STEM infrastructure will be dictated by how confident organisations are that there is a socially acceptable, commercially viable and strategically sustainable business model underlying the output of research results in future. These issues are tackled through the prism of a commercial and business approach whilst also recognising that an academic approach, which includes independence and structural rigour, is also relevant.
– This book’s content is based on science, technology, engineering and mathematics (stm, STM, S&E or STEM) rather than broader scholarship, albeit that there is also a fragmented approach within STEM disciplines in their respective adoptions of digital information systems. A physicist is different, in information terms, from a humanist; a biologist from an econometrician. Even within individual scientific disciplines there are different informational sub-cultures.
– It is an independent, impartial study, based on the experiences of the author, who has been part of organisations involved in all stages of the research cycle – from publishing (Elsevier Science, Pergamon Press), librarianship (the British Library) and intermediaries (Ingenta, Faxon, Blackwells) to consultancy (DJB Associates), authorship (of books with de Gruyter and as editor of an independent monthly newsletter) and postgraduate research (University College London). Relying on any single existing stakeholder to make balanced assessments would risk traditional cultures distorting the picture. Impartiality is important at this juncture, particularly when feelings are running high over the activities of certain stakeholders.

For example, threats of boycotts against commercial journal publishers – currently in vogue – reflect the failings of the system more than they promote realistic solutions and sustainable, unbiased strategies for the future.

AIMS

The aim is to review the trends towards greater democracy and openness within scientific research and STEM communications. There is a triple aspect to this aim. The first is to evaluate external developments, encapsulated within the terms “perfect storm” or “tsunami”, analysed in terms of their implications both for UKWs specifically and for the STEM industry generally. Secondly, the information needs and habits of so-called unaffiliated knowledge workers (UKWs) in the UK are assessed. Finally, these assessments are placed within the context of the present STEM information system and its future structure.

All three aspects are linked. An analysis of the STEM information process will inform whether it is fit for purpose in a rapidly changing information world. Notably, it assesses the implications which the current structure of STEM has for those communities which are not included in the mainstream STEM effort. At stake is the health of science communication during the upcoming decade as stakeholders cope with a combination of disruptive technologies and social change. It raises questions about the development of effective UK national science, research and information policies, as well as how extensively a mantle of democracy could enshroud these.

In conclusion, recommendations based on the analyses described above are offered. The intention is to produce recommendations which enable scientific information and research communications to move forward using viable and sustainable platforms that meet the different requirements of both old and new stakeholders and of both established and new market sectors.

OBJECTIVES

Derived from the above aims, the objectives for this study include:
– Describing the impact on the STEM industry of migrating from print through hybrid to digital publishing
– Providing an analysis of relevant statistical sources on demographic trends
– Monitoring usage patterns of STEM research outputs

– Identifying public concerns expressed by recognised experts regarding the current STEM publication process, and assessing their relevance
– Exposing the culture conflict between meritocracy or elitism and democracy in STEM information exchange
– Bringing together publishing, financial and policy concepts which provide understanding about the extent and direction of emerging trends in STEM communications
– Reviewing the emerging technical options for STEM which developments in IT and the Internet are creating
– Reviewing the impact which social media and social networking have in transforming the communication and publication processes
– Assessing the impact of the various open access (OA) routes on facilitating ease and freedom of access
– Evaluating the impact that the changing nature of STEM communications will have on existing stakeholders (notably publishers, librarians and intermediaries)
– Reviewing the position of learned societies as providers of innovative STEM services
– Pinpointing the role which unaffiliated knowledge workers (UKWs) currently have in the STEM information process
– Assessing the extent and nature of the information needs of knowledge workers
– Identifying factors which prevent knowledge workers from engaging in science research on an equal basis with academics and those in corporate R&D
– Establishing policies and strategies to enable unaffiliated knowledge workers to be more fully integrated into the overall scientific research system
– Providing a strategic vision which enables STEM to migrate from a traditional mode to a creative, open and interactive service in the future

2 LITERATURE ANALYSIS

STEM Described in Informal Literature

Prominent industry-watchers have given their views on the current STEM scene. They frequently adopt the moral high ground, looking at the political, social and economic consequences to society resulting from perpetuation of the present, largely commercially orientated, publication system. Many of these watchers have therefore been critical of the status quo.

Social media and social networks have become platforms upon which topical issues have been, and are, openly and energetically discussed. It is in this forum that problems and options for STEM are raised, such as questioning whether traditional STEM publishing processes are fit for purpose, whether the system is dysfunctional, and whether the information needs of UKWs are a casualty of its failings. However, such discussions are ad hoc and uncoordinated, and in some cases couched in emotive terms. They are notable for being conducted at the margins of the STEM debate, which is dominated by operational features. So far, the needs of knowledge workers generally do not appear as front-page issues in social media, where the focus is usually on the iniquities of the existing players – notably large commercial journal publishers. What is exposed in the informal literature is the tension within the sector, which is an important cornerstone of this book.

Leading writers about the STEM scene include:

Michael Nielsen, author
“Is scientific publishing about to be disrupted?” was a question raised by Nielsen in a blog dated June 29, 2009 (Nielsen 2009). His premise was that several related industries have been sidelined in recent years because they were unable to cope with emerging trends facing their operations. He cited the print newspaper industry, music and minicomputers as examples. The leaders of these industries were not, he claims, either stupid or malevolent; rather, the underlying structure of their industries, primarily their scale of operations, was unsuitable for new market conditions. The immune systems of these industries were protective of an established organisational structure, and this ran counter to the openness and the demands for change in the delivery of data and information which have emerged on the back of a technological revolution.

Nielsen asserted that STEM publishing is about to face the same disruption. He claimed that large publishing houses will need to compete with new companies which focus on meeting specific new digital preferences in the information industry. In effect, he claims that traditional publishers will have to traverse “the valley of death” to survive. He pointed out that senior positions in larger scientific publishing houses are rarely held by technologists; most publishing management have strong business and/or editorial skills. In ten to twenty years’ time, Nielsen claims, “scientific publishers will be technology companies. Their foundation will be technological innovation and most key decisions made will be by people with deep technological expertise”. This builds on the escalating rate of change created by the technological revolution. Nielsen further points out that there is a flourishing ecosystem of start-ups in scientific publishing which experiment with new ways of communicating research, radically different in approach from journals. They are better prepared to cope with changing techno-market conditions and emerging democratic trends than current STEM publishers wedded to serving academia (Nielsen 2009).

Lessons can be learned from the new giants that have emerged on the information scene (Google, Facebook, Twitter, Microsoft, Apple, Amazon, Netflix). They have been successful in exploiting a free and open industry sector. By reaching out to wider global communities and taking smaller individual payments for the services provided, revenues flow in. Many smaller payments from a much larger audience are a healthier business proposition than reliance on a few customers who regularly complain about high subscription prices (see the section on A Dysfunctional STEM).

As identified by Nielsen, the immune system of scientific communication is strong in protecting traditional publishing formats and systems. Conservatism prevails. The question is whether the existing scale of operations will be enough to sustain publishers given the economic, financial, social and technological challenges they face (as described in later chapters). Nielsen argues that technological management should become the DNA of STEM publishing in the future.

George Monbiot, The Guardian
Another strong critique of STEM publishing came from the journalist George Monbiot in an article printed in The Guardian on August 29, 2011 (Monbiot 2011). His claim was that it is not possible, from their current actions, to recognise a picture of flexible, rapidly reactive large commercial publishers rushing to embrace the new millennium.

There has been, according to Monbiot, a lack of leadership from publishers in switching from the traditional publishing paradigm to new, untested ones. This is because the commercial risks involved are unknown and possibly unpalatable. Why throw away a regular and stable gross margin of almost 40% on a serial subscription service, the basis of their revenues, in favour of something which would likely yield a lot less? It could be corporate suicide (Monbiot 2011). However, this has led to many seeing the scientific publishing industry as greedy and non-responsive to new market needs.

According to Monbiot, “who are the most ruthless capitalists in the western world? Whose monopolistic practices make Walmart look like a corner shop and Rupert Murdoch a socialist?” His vote goes not to the banks, the oil companies or the health insurers, but instead to STEM publishers: “Of all corporate scams, the racket they run is most urgently in need of referral to the competition authorities”. “Without [access to] current knowledge, we cannot make coherent democratic decisions”. But according to Monbiot, “the publishers have slapped a padlock and a ‘keep out’ sign on the gates”. Downloading a single article published in one of Elsevier’s journals costs $31.50, while Springer charges €34.95 and Wiley-Blackwell, $42. And the journals (publishers) retain perpetual copyright: “If the researcher wants to look at a printed letter from 1981 that can cost a further $31.50”.

Though the local research library may have access to the required item, libraries have been hit by budgetary constraints: “The average cost of an annual subscription to a chemistry journal is $3,792. The most expensive primary research journal is Elsevier’s Biochimica et Biophysica Acta at $20,930”. Though academic libraries cut subscriptions to make ends meet, journals still consume 65% of their collection budgets, which means they have had to reduce the number of books they buy, and pressures are being exerted on staff and facility budgets, as well as on curation and storage. In addition, not everyone is able to make use of a nearby university research library. Unless one is affiliated with the library – as a student or member of staff – the terms of the licensing agreement between publisher and research library are such that the unaffiliated would be turned away from accessing online information published by STEM publishers.

Monbiot laments that STEM publishers get their articles, their peer reviewing (vetted by other researchers) and even much of their editing for free. Also, the material they publish was commissioned and funded by the tax-paying public through government research grants and academic stipends. But to see it, the general public, knowledge workers and much of academia must pay for it again.

Publishers claim they need to impose these charges because costs of production and distribution are recouped from a small (research library) market, and that they add value because they “develop journal brands and maintain and improve the digital infrastructure which has revolutionised scientific communication in the past 15 years”. However, an analysis by Deutsche Bank reached different conclusions: “We believe the publisher adds relatively little value to the publishing process. If the process really was as complex, costly and value-added as publishers claim that it is, 40% margins wouldn’t be available” (Monbiot 2011). Far from assisting the dissemination of research, the big publishers impede it, as their long turnaround times can delay the release of findings by a year or more. However bad the situation is for academics and researchers, it is far worse for the laity. Independent researchers who try to inform themselves about important scientific issues are charged high access rates. It contravenes the Universal Declaration of Human Rights, which says that “everyone has the right freely to. . . share in scientific advancement and its benefits”. In the USA, in support of open access, Dr Stuart Schieber, Director of the Office for Scientific Communication at Harvard University, has pointed to Thomas Jefferson’s claim that “the most important bill in our whole US [legal] code is that for the diffusion of knowledge among the people.” (Harvard University 2012). These are important mantras around which this study is built. Empowering the many with the results of society’s scientific progress (and not keeping it locked away for the wealthier academic/industrial institutions) offers attractive social benefits for a democratised and healthier STEM information system. It was inevitable that Monbiot’s criticisms would be challenged. In one of the leading publisher journals, The Scholarly Kitchen, on September 1, 2011, Kent Anderson in the U.S. claimed that the arguments put forward by Monbiot were “uninformed, unhinged and unfair – the Monbiot rant” (Anderson 2011). Others who are closer to the publishing industry feel that Monbiot has a jaundiced view of the commercial STEM publishers and is extreme in his arguments. However, the Monbiot “rant” does provide a catalogue of issues being debated at present about STEM among those who are outside the industry looking in.

Sir William Timothy Gowers, Cambridge University

From a different perspective, a respected UK scientist who supports the sentiments expressed by Monbiot is the Cambridge-based mathematics Professor, Sir Timothy Gowers. He saw Elsevier as the villain in draining profits from the general science budget into the hands of a few non-science-based financiers. In
2012, Gowers wrote an article for The Sunday Times which ignited a campaign for authors and readers to boycott Elsevier publications (Gowers 2012) as it is they which typify the high cost approach to scientific communication. Such a campaign from within the research community is not new. There had been an earlier outcry against commercial journal publishers led by Dr Michael Eisen in the USA which resulted in the formation of the Public Library of Science (PLoS, an innovative free to access publication). Thirty-four thousand signatories were collected in Eisen’s campaign complaining about STEM’s business model. Within weeks of the UK-based Gowers appeal, 9,000 scientists globally had signed up to the petition pledging to refrain from editing, publishing or sponsoring articles in any of Elsevier’s over 2,000 journal titles. The stimulus for the campaign, from what The Sunday Times referred to as this “thoughtful academic”, was partly the high profits generated by Elsevier, and partly from the effects which the economic downturn was having on science budgets, including libraries (Gowers 2012). Gowers believed that publishers such as Elsevier were ruthless in cutting off journal supplies to the captive market they serve – research libraries. There were barriers stopping attempts to negotiate better deals on the package of journals within their portfolio of “big deals”. This included preventing librarians discussing and comparing the financial terms each library had negotiated with the publisher under pain of the imposition of legal sanctions. More recently (2017/8), the main German and Swedish universities have threatened to boycott Elsevier journal titles unless a better license agreement could be reached which would radically reduce the costs of the Elsevier journal package. At the time of writing this issue has still not been resolved. According to Gowers, the Internet is overcoming the stranglehold which journal publishing has had. New forms of communication are being created, relegating the published journal article to that of being a version of record (VoR): “Interesting research gets disseminated long before it gets published in official journals so the real function that journals are performing are the validation of papers” [and giving credibility to the author]. Given that published articles are no longer the primary communicator in progressing science, it seemed, to Gowers, a travesty that Elsevier should have earned £768 million for its private investors in 2011 from its archival activities in the public scientific arena. Had this been recirculated within the scientific research sector, the amount of valuable new research results would have been considerable, and all society would have benefitted, not just a few financiers and shareholders. As long as authors are distanced from the commercial activities of STEM publishing giants, and readers are separated from the purchasing decision by
the research library and their collection development policies, the status quo will continue. Gowers’ call for action was an attempt to highlight dysfunctionality within the industry. The conflict was essentially between freedom and openness of science in the Internet which was clashing with the profitability targets set by the owners of publishing conglomerates. He invited mathematicians from Cambridge University to give their views on the importance or otherwise of continued access to Elsevier journals as part of their research efforts. He concluded “most people would not be inconvenienced if they had to do without Elsevier’s products and services, and a large majority were willing to risk doing without them if that would strengthen the bargaining position of those who negotiate with Elsevier”. Although complaining about the actions of the market leader in itself is not an indicator of the dysfunctionality of the publishing system, it does suggest that there may be alternative directions for STEM to consider in future.

Andrew Brown, The Guardian

In the February 5, 2009 issue of The Guardian, Andrew Brown offered another complaint about the STEM publishing system. Brown pointed out that the government paid universities to conduct research for public benefit. The authors of research results are paid nothing; the peer review is done for free, by academics employed and paid for by universities. The results are then sold back to the universities who supported the research in the first place: "This is poor value for governments. It is also difficult for those outside a university who may want to learn, and that's a situation the web has made more tantalising". Almost all journals are indexed and references to them can be found on Google Scholar, PubMed Central and other leading comprehensive online data sources: "So the truth is out there. But it will cost you" to get access to the full report (Brown 2009). One answer, he claimed, was to promote free scientific publishing, and also free access to the immense quantities of data that lie behind most published papers.

Daniel Allington, Open University

Daniel Allington, professor of sociology at the Open University, wrote a blog about the role of such free or open access (Allington 2013). Though he initially felt that Green Open Access (a particular form of open access) was an improvement over
the traditional toll-based or subscription-based publication system, he subsequently felt that the wrong questions were being asked. Open Access was being proposed as a solution to a range of STEM problems which had little to do with one another, and very little to do with creating an effective scholarly communication process. Support for the green Open Access movement (which involves depositing research outputs in a local institutional depository and the latter then making the results available free of charge to anyone who requests them) ignores that there is still a need to fund institutional repositories. This means that the financial pain is switched from one institutional account (the library) to another (the institutional repository). Overall costs of producing scientific publications would remain essentially the same for both Subscription-based and Open Access publishing as long as quality control is exerted over what is published. Allington describes an alternative open access model which could be adopted if the goal is to make published material available to a wider circle of knowledge workers. This alternative requires funding agencies to include “nontechnical summaries” written by recipients of research grants in support of their research application, and to have these summaries, together with the conclusions from the final research outcome, posted on the funding agencies’ web sites. These would be freely accessed. This does exist in some instances but is not heavily used. This is probably as much due to a failure in promoting the system as to any weakness in the concept. It is one which has been proposed by Esposito as part of a new tertiary publication service from the publishing industry (see Nautilus by Esposito 2007). This concept has potential value in both speeding up the dissemination of STEM publications and extending their reach into new markets. Alice Bell, a researcher in science and technology policy, had made a similar proposal in Times Higher Education in 2012 in which she argued that Open Access may lead to clearer write-up of results if it is understood that a broader audience could then be able to understand the contents of the article (Bell 2012). However Allington was not convinced and claimed that “To translate a research article from a technical register into everyday English would. . . make it more ambiguous or more verbose”. In either instance, it would be worse from the perspective of the primary target audience of researchers and experts in the field: “Open Access is one thing; expecting researchers to take up the task of public education by radically changing the manner in which they communicate among themselves is quite another”. Writing popular science articles is vastly different from reporting on a highly technical research project. The real challenge facing the Open Access movement is that a substantial number of academics and knowledge workers cannot be bothered searching
around for free copies online. Could this be a major barrier, or is it one which can be addressed in future through improved alerting, search and delivery systems? The openness, transparency, interactivity and cooperation features offered by the web and the Internet could provide the mechanisms to smooth the timely delivery of relevant information to specific target audiences. External technological changes could become a powerful force in creating innovative systems to activate latent demand. Allington was not convinced that social media would provide an adequate surrogate for the journal. Relying on LinkedIn, Twitter, Facebook and other social networks to provide a secure and reliable platform for scientific communication runs counter to a scientific culture which includes a strict code of peer review and assessment. Meanwhile the debate rages as to what may be inconsequential disagreements over where the funds for supporting scientific publishing should come from – Green, Gold, Grey, Hybrid Open Access, subscriptions, licences, PPV or Big Deals (see later). In the meantime, strategically significant issues of future organisational structures are left pending.

Peter Murray Rust, Cambridge University

In a blog dated February 2014, Peter Murray Rust, Reader in Chemistry at the University of Cambridge, took issue with the fact that publishers are not subject to regulatory controls over pricing:

    Scholarly publishing industry is almost unique in that it provides an essential service on an unregulated monopoly basis. In other words, the industry can do what it likes (within the law) and largely get away with it.

This points to academic publishers’ having an unusual form of monopoly. It is not a particularly concentrated market. Even the largest player, (Elsevier) publishes only 17% of all articles. Nor do publishers control the means of distribution (i.e. the Internet). Authors have plenty of options if they want to publish elsewhere. But elements of a monopoly situation are reflected in the high profit margins of the big five commercial publishers: The “customers” are the University libraries who seem only to care about price and not what the service actually is. As long as they can “buy” journals they largely don’t seem to care about the conditions of use (and in particular the right to carry out content mining). In many ways they act as internal delivery agents and first-line policing (on copyright) for the publishers. This means that the readers (both generally and with institutional subscription) have no formal voice. “Because publishers have no regulatory bodies overseeing their
operations, they operate effective micro-monopolies. Readers have no choice what they read – there is no substitutability. They can either subscribe to read it or they are prevented by paywalls. If they have access they can either mine it or they are subject to legal constraints. Publishers can go a very long way in upsetting its readers without losing market share”. (Murray Rust 2014)

Assessment of Commentaries

Comments found in social media reflect pressures building up within the STEM industry. Social media and social networks have become platforms within which questions about the state of STEM publishing are being debated. These support key issues addressed in this report by questioning whether the traditional STEM publishing system is fit for purpose, whether it is dysfunctional and whether neglect of UKW information needs is one of many consequences. There is evidence of innovative communication services now emerging in response to concerns from researchers themselves (such services include Mendeley, ResearchGate, Knovel) rather than initiated by publishers and librarians. This includes new reading patterns being adopted by researchers at the coalface as described by Jeffrey (Jeffrey 2012) and Murray Rust (Murray Rust 2014). These are more relevant in meeting the aims of this study rather than descriptions of small-scale historical user studies contained in published, formal literature. STEM publishing relies on traditions established in a print-based publishing system. These traditions do not migrate well into the worlds of digital communications and the Internet. Problems are exposed in this migration – such as commercial activities of large commercial publishers; questionable support for an open access publishing solution; relative decline in library budgets – all of which ignore strategic issues such as structural formats required of the STEM system in the new millennium. Tinkering around the edges is not an option when radical social, technological and economic developments are taking place in the context within which STEM operates and will operate in future. In the complaints in social media about STEM there is one significant omission (though several commentators make passing reference to it). It is the stimulus to STEM which would come through opening access to a wider community of science-aware knowledge workers. UKWs are currently priced out and locked out from being active participants in STEM (Brown 2016). One recommendation which emerges from literature is that more evidence is required on user habits and needs of these unaffiliated knowledge workers so that business models can be constructed as part of strategic initiatives to
ensure that STEM develops a healthy approach to meeting information needs over the next decade. These models would need to be absorbed and integrated within developments taking place in the wider information society.

The Information Society

The concept of an "information society" is recent. In broad terms, in the eighteenth century the UK was an agrarian economy, moving on to an industrial economy in the nineteenth century and the service sector in the early twentieth century. It was not until the middle of the twentieth century that the emergence of information and knowledge economies was recognised. Only now has the role of knowledge workers become a topic for analysis. The business consultant Drucker, in his book The Landmarks of Tomorrow (Drucker 1959), is credited with coining the term "knowledge workers". He was followed by Machlup who provided a systematic analysis of knowledge within the US economy in his book The Production and Distribution of Knowledge in the United States (Machlup 1973). Machlup was followed by a similar but more extensive quantitative study by Porat in The Information Economy: Definition and Measurement (Porat 1977). Other contributors to the knowledge industry debates included Bell in The Coming of Post Industrial Society: A Venture in Social Forecasting (Bell 1973) and Castells in his The Rise of the Network Society (Castells 1996). Most recently, Webster (Webster 2014) drew attention to what he felt was the darker side of information developments. This has been added to by Nicholas Carr in his 2015 book The Glass Cage – Who Needs Humans Anyway? (Carr 2015), which postulates the growing influence of technology in effecting social decisions in a digital information economy. A disturbing feature affecting the information society has emerged with the growth of the Internet in the early 1990s. This is the willingness of individuals to contribute information about themselves to the social platforms, and thereby allow themselves to be exploited for economic or political gain by third party organisations. The balance between information which should remain personal and private, and information which should be part of the public domain, is a complicated one. It has highlighted some invidious developments in social media, such as the willingness of individuals to engage in communication and information exchange in an aggressive, hostile way. The extent to which governments should become involved in controlling such behaviour and exploitation of the new information economy is currently under consideration.

The Information Economy

As indicated above, a rigorous treatment of the size of the knowledge worker sector was first undertaken by Porat (1977). He collected data about information activity in the US economy. Porat proposed a conceptual framework for defining information activities and how to quantify such activity. US society was divided by Porat into six sectors: three information sectors, two non-information sectors, and a household sector. These information sectors produce and distribute all information goods and services in support of the economy. The two non-information sectors supply the physical or material goods and services whose value or use does not primarily involve information, but nevertheless have relevance. The household sector supplies labour services and consumes final goods. Porat identified 26 information industries that constitute the primary information sector. He also referred to the contribution from secondary information sectors which accounted for 82 non-information industries. Porat concluded that 25.1% of the USA Gross National Product in 1967 was bound up with the primary information economy, which is where information is exchanged as a commodity. The secondary information economy includes information activities produced for internal consumption by government and other organisations, where it is embedded in some other good or service and is not explicitly exchanged; this amounted to an additional 21% of the US economy. According to Porat, nearly half the labour force held an information-related occupation. The information sector became dominant, rising from a low of 15% of the workforce in 1910 to account for over 53% of all labour income by 1967. Conceptually, Porat claimed that "information" cannot be classified as one distinct sector – such as mining – but rather the production, processing and distribution of information goods and services should be an "activity" operating across all industries. There are other unique aspects to information which have become folklore. Information has become a utility which does not depreciate over time; in fact, its value is enhanced with age and usage. Information also has a non-rival aspect in that it does not preclude others from making use of the same information without anyone incurring a loss. Furthermore, information is cumulative, "building on the shoulders of giants" as new research results build upon past endeavours. Finally, information is now digitisable, which creates new opportunities for its dissemination through digital networks using different business paradigms. However, in practice STEM has surrounded research outputs with constraints, such as copyright and intellectual property rights protection, which affect their open availability.

These factors, taken together, emphasise that STEM-related information is a unique commodity but is potentially undergoing change in both function and process as we move into a digital world. Machlup produced a similar treatment to Porat on the US information economy (Machlup 1973). Machlup’s accounting schema began with five major classes of knowledge production, processing, and distribution, and 30 industries that were classified into (i) education, (ii) research and development, (iii) media or communication, (iv) information machines and (v) information services. Machlup’s estimate of total knowledge production of $133,211 billion (1958) compares with Porat’s $71,855 billion for the same year. The difference is accounted for in the latter’s exclusion of secondary information services. Relevant to STEM information, there is the quaternary sector which covers everything from universities and higher education to the pharmaceutical industry, computer software and technology start-ups – sectors that involve the extensive use of knowledge to create something new of value. It does not cover financial services, banks and medicine as they are part of the service economy, or the tertiary sector. Historical studies provide an interesting sideline to the role of electronic information as a transformative process. An iconic assessment was delivered by McLuhan in 1964 (in his book Understanding Media: The Extensions of Man). In this book he coined the term “the medium is the message” (McLuhan 1964). This encapsulated the idea that it is not information content itself that is critical but rather the means with which it is transmitted. According to McLuhan, a popular medium, whether printed newspapers or online databases, molds what we see and how we see it. He was an early advocate that the medium can work its magic on the individual’s nervous system: it not only “supplies the stuff of thought but also shapes the process of thought” (Carr 2010). Another approach to understanding the information economy has been put forward by Ackoff, who promoted the idea of there being an information pyramid (Ackoff 1989). At the base of the pyramid are societies’ vast data resources. Data by itself has limited value. Above this is information, which has become a problem as “information overload” has entered the vocabulary (one of the first mentions being by Toffler in his 1970 book Future Shock (Toffler 1970). By the mid 1990s the concept of knowledge and understanding had been built on top of the information stratum. Whilst information has become “structured data”, knowledge has become “actionable information”. Knowledge is about the filters in place, reducing the “fire hose” of what is available to that which we need to know. Filtering and linkages are a key phenomenon of the digital age, as they were in the printed era. At the top of the pyramid is wisdom.

Fig. 1: Ackoff's information pyramid (Data at the base, then Information, then Knowledge, with Wisdom at the apex). Data is the raw material on which the information economy is based. Information is data that is processed to be useful; it provides answers to "who", "what", "where" and "when" questions. Knowledge is the application of data and information and answers the "how" questions. Understanding is an appreciation of "why". Wisdom is evaluation based on knowledge and understanding. See Ackoff, R. 1989. "From Data to Wisdom". Journal of Applied Systems Analysis 16: 3–9.

Wisdom requires broad connectivity, whereby associations and decisions are drawn from an ever-wider range of experiences that enable the assignment of more generalised values (Greenfield 2014).
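
Ackoff's hierarchy can be made concrete with a deliberately simple worked example. The short sketch below is not taken from Ackoff or from this study's data; the readings, threshold and rule are invented purely to illustrate how raw data becomes structured information, and how a filter then turns that information into "actionable" knowledge.

    readings = [18.2, 18.9, 19.4, 21.7, 24.3]   # data: raw hourly temperature readings (hypothetical)

    # Information: structured data answering "what" and "when".
    information = {
        "mean_temp": sum(readings) / len(readings),
        "trend": "rising" if readings[-1] > readings[0] else "falling",
    }

    # Knowledge: "actionable information" -- a filter that answers "how" to respond.
    def recommend(info, comfort_limit=23.0):
        if info["trend"] == "rising" and info["mean_temp"] > comfort_limit - 3:
            return "switch cooling on"
        return "no action needed"

    print(information)              # {'mean_temp': 20.5, 'trend': 'rising'}
    print(recommend(information))   # switch cooling on

Wisdom, in Ackoff's sense, sits above any such rule: judging whether the objective the rule optimises is the right one to pursue at all.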

The Networked Society

The new elephant in the room is the Internet and World Wide Web. From humble beginnings in the early 1990s, conceived by academics, libertarians and theorists who promoted a decentralised and free platform for information exchange, the dominant feature in the latter part of the 2000s has been the arrival of a small group of organisations which now dictate the direction which the information economy is taking. These include the so-called GAFA group – Google, Apple, Facebook and Amazon. Each has sought to capitalise on information collected about the behaviour of its users for commercial purposes. This has become an issue of concern as these organisations exploit the privacy of individuals, becoming "Big Brothers" in their monitoring and manipulation of the users' online information data to create distinct profiles which advertisers pay to use. The concern is that such data is freely given to GAFAs, as individuals use their platforms as a modern communication device, or in leaving a "digital trail"
whilst searching the Internet for items of relevance. These datasets are then mined by organisations for purposes not originally intended, primarily as a source for effective targeted advertising. A consequence is that owners of these Internet-media platforms have in a few years become dollar billionaires. In so doing, they are leaving a trail of disaster among companies which relied on the traditional trading arrangements before the onset of online and the Internet (Keen 2015). The few employed within the GAFA circle of companies come at the expense of thousands made unemployed by the decline of traditional industries. Growing pressure on and by government agencies to achieve a balanced approach to the future information economy, one which does not endorse monopolists with their power base over information-about-people, is evident. Meanwhile, STEM information sector is also in the frontline for a torrid future unless it can adapt to what is essentially an uncertain future, created by the arrival of new, global information platforms.

Global Information Trends

European copyright-intensive industries now account for almost 9,400,000 direct and indirect jobs and contribute nearly 510 billion Euros to the European GDP (European Observatory 2013). This represents a distinctive economy, one which is watched over with some interest by pundits and governments alike. Any weaknesses in its future could have repercussions on society and the overall economy. As such, control measures are sought, supported by international law, which will ensure that the creative aspects of the information or networked society are protected. The output of peer-reviewed science and engineering journals, books and conference proceedings indicates the current scale of scientific research activity. Data made available in 2019 show that worldwide S&E publication output continued to grow, reaching 2.4 million journal articles in 2017, with the United States and China being the two largest producers (17% and 19% respectively of the world total). However, when grouped together, the European Union (at 26%) produced more S&E publication output than either the United States or China. Globally, S&E publication output grew at an average annual rate of 4% between 2007 and 2017. Over the same time period, the share of internationally co-authored S&E publication output also increased from 17% to 22%. The world devoted 1.7% of its gross domestic product (GDP) to R&D in 2007, a share that has remained stable since 2001. In monetary terms this translated into US$ 1,146 billion in 2007, an increase of 45% over 2001. Despite global
economic problems, countries still spend on R&D as a way of stimulating new economic growth and buying themselves out of austerity (Unesco 2010). There are difficulties in identifying and applying an accurate measurement for growth in science. One measure is to relate it to the growth in scientific publications. Though Mabe and Amin (Mabe and Amin 2001; 2003) authored articles in the early 2000s to show that science publishing and science grew at a steady 3% to 3.5% per annum, this has been challenged by some authorities, and figures in excess of 4% have been suggested (+4.7% per annum by Thomson Reuters, a leading organisation in informatics at that time which provided data on scientific publishing) (Adams et al. 2009). The problem is comparing like with like. Whether just the core natural science journal outputs have been looked at (which show a slower growth rate than some of the newer sci/tech disciplines); whether conference proceedings are included (with conference proceedings more important in scientific fields with high growth rates); whether other social media are included; whether institutional repository holdings are included; whether primary datasets are linked to or integrated – these all have implications on the overall growth estimates for science and STEM publishing. There are also regional differences in R&D. BRIC countries (Brazil, Russia, India and China) and VISTA (Vietnam, Indonesia, South Africa, Turkey and Argentina) have until recently been expanding their research commitments at a faster rate albeit from a low base. The developed world is maintaining a steady annual increase in support for R&D. However, China, India and other Asian countries have more recently become powerhouses in global research activity. The needs of developing countries for research information mirror the requirements of unaffiliated knowledge workers within the UK. Both suffer exclusion from the fulcrum of research activity – both stand at the periphery looking in at the western academic-based research effort, being unable to interact or become fully involved with its research on a level playing field. Both the developing nations and UKWs in developed countries face the challenge of breaking down the exclusivity generated from past barriers erected to scientific information access and interaction. A fundamental question which underpins this analysis is how equitable and evenly spread commoditised information really is, particularly among academically trained users of research information. And if, as is the starting hypothesis for this research, there are barriers in place preventing interested parties from gaining equal and ease of access to required information sources, how can this situation be remedied? The underlying fear is that traditional information support services will go the same way as the music and newspaper industries. The consequence from the introduction in the music sector of online companies such as Napster and
Scour, manned by a few IT-skilled staff, was the disruption of a $36 billion industry "that shrank to $16 billion because of libertarian badasses. . . [who] invented products that destroyed [the music industry's] core values" (Keen 2015, 191).
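
Returning to the growth figures debated earlier in this section: the practical difference between a 3% and a 4.7% annual growth rate is easiest to see as a doubling time. The snippet below is a simple compound-growth calculation using only the rates quoted above; it involves no additional data and is offered purely as a back-of-envelope illustration.

    import math

    # Doubling time implied by a constant annual growth rate.
    for rate in (0.03, 0.035, 0.04, 0.047):
        years_to_double = math.log(2) / math.log(1 + rate)
        print(f"{rate:.1%} per year -> output doubles in about {years_to_double:.0f} years")
    # Roughly: 3.0% -> 23 years; 3.5% -> 20 years; 4.0% -> 18 years; 4.7% -> 15 years.

Over thirty years the gap compounds to roughly 2.4-fold growth in output at the lower rate against roughly 4-fold at the higher one, which is why the measurement dispute matters.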

Environmental Developments

The consensus among observers of the STEM scene is that it is on the cusp of major changes. All stakeholders in STEM, notably publishers and librarians, will need to adapt to changes which are taking place as the industry migrates from a print-based format of research dissemination to the current hybrid approach, and in the future to a fully digital information world. The conditions creating this new digital world are outside the control of existing players in the STEM business. Environmental conditions are part of an ever-increasing sophistication within society driven by improvements in technology, social adaptation, organisational developments and changes in business practices. STEM publishers have hitherto largely remained immune from the severity of such developments, adopting a position which has strong roots in a legacy system of print-based publishing and a reward system which is conservative in structure. However, there are some powerful agents for change which will have an impact on this cautious, traditional STEM industry sector.

A "Perfect Storm"

A "perfect storm" arises when several unrelated factors come together to create dramatic consequences. In this instance, the factors are sociological, technical, political/administrative, economic/commercial, openness-related and trends in the research process itself. The consequence is that STEM communication will undergo a change in its size, nature and shape over which it will have little control. The factors which lead to this changed environment are partly general trends and partly significant individual drivers unique to STEM.

3 ENVIRONMENTAL AGENTS FOR CHANGE

This chapter focuses on the various external changes which impact on researchers and their support services. It also looks at publishing developments, policy issues and changes in science, and how these relate to STEM and its alleged dysfunctionality. This approach offers a "world view" affecting both the STEM information industry and the disenfranchised knowledge workers (Pickard 2013). This world view comes from an analysis by acknowledged industry experts. The intention is to see whether there is consensus. During this analysis, several driving forces have been identified and presented as models. They constitute the building blocks on which this book's account of STEM's difficulties and UKW developments is based.

Chaos Theory

Even with the aid of the research paradigm of world events and conceptual paradigm modelling, determining the direction which STEM and the knowledge worker sectors will take is not straightforward. Forecasting the future is fraught with difficulties under any circumstances. Aspects of chaos theory can be applied. There are several variables which have influence on trends in the scientific domain. Some have high relevance in promoting a changed paradigm, whereas others have a marginal effect. The analogy frequently used is the "Butterfly Effect" – a butterfly beats its wings in the Amazon basin, and this influences weather conditions on the other side of the world (a small numerical illustration of this sensitivity to initial conditions is sketched at the end of this section). Such is the variability, volatility and unforeseen consequences arising from variables involved in an unstable environment. Though chaos theory has its roots in mathematics, its applicability spreads beyond to the information industry and can be used as an analytical tool in assessing STEM's future. Many dynamic external systems interfere with research and science, which accentuates the difficulty in making effective long-term predictions of the future industry structure. There is no one single determinant, no linear extension of the effects of one variable which will lead to an accurate assessment of change. At best, the situation could be tackled through a Delphi model of scenario building, using the talents, expertise and knowledge of experts and authorities in various sectors of STEM and beyond. There is little evidence of this happening at a high level, though the European Commission has set guidelines in support
of the European information economy (such as through the current Horizon 2020 and Innovation Union). What few vision statements exist rarely bring together the two challenges of an alleged dysfunctional STEM and also the inclusion of a wider knowledge worker community within the same approach. The following section describes the building blocks for concepts and models which can explain the mechanics for the changes which could take place in the UKW space. These are:

Structure of Following Chapter

TECHNOLOGICAL TRENDS
– Technological Advances
– The Internet and the Web
– Web Versus Apps
– Challenges to the Internet
– Mobile Devices (Including Smartphones)
– Table – Technical Trends

SOCIOLOGICAL CHANGES
– Neurological Studies
– Natural Group Size
– Cognitive Surplus
– Table – Sociological Trends

DEMOGRAPHY
– The Digital Scholar
– The Net Generation
– Demographic Data on UK Researchers

CULTURE
– Cultural Adaptation

LEGAL ISSUES
– Creative Commons
– Copyright Clearance Centre
– Publishers Versus Sci-Hub
– Publishers Versus ResearchGate
– Implications

SOCIAL MEDIA
– Adoption of Social Media Within Society
– GAFAs
– Summary

COMMERCIAL ISSUES
– "Openness" and Open Society
– Open Access (OA)
– Sci-Hub and LibGen
– Freemium
– Freemium and STEM Publishers
– "The Long Tail"
– "Tipping Points"
– "Product Life Cycle"
– Economies of Scale
– Table – Economic/Commercial Trends

PUBLISHING AND STEM DEVELOPMENT
– "Information Overload"
– "The Twigging Phenomenon"
– Vertical Search
– "Wisdom of the Crowd"
– Cult of the Amateur
– "Serendipity"
– Miscellanised Information

RESEARCHER BEHAVIOUR
– Typology of Researchers/Users
– The SuperJournal Project
– Patterns of STEM Use
– Ofcom
– Gaps and Barriers
– Early Career Researchers

WORK PROCESS
– Sharing Results
– Collaboratories
– Designed Serendipity

RESEARCH DEVELOPMENT
– Data and Datasets
– Workflow Processes
– Artificial Intelligence and Cognitive Computing
– Role of Machine Learning

SCIENCE POLICY ISSUES
– "Tragedy of the Commons"
– National and Centralised Policy Directives
– Brexit
– Future of the Professions
– "Valley of Death"
– Table – Political/Administrative Trends

The above demonstrate the extent of the "digital wildfire" which is affecting society in general (World Economic Forum 2013, Global Risks Report, 8th edition). The next sections of this book will look at each of the above in more detail.
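
The sensitivity to initial conditions mentioned above can be shown with a minimal numerical sketch. It uses the logistic map, a standard textbook toy model rather than anything specific to STEM; the starting values and the size of the perturbation are arbitrary choices made only for illustration.

    # Logistic map x -> r*x*(1-x) in its chaotic regime (r = 4.0).
    # Two starting points differing by one part in a million soon diverge:
    # the "butterfly effect" that undermines long-range linear forecasting.

    def trajectory(x0, steps, r=4.0):
        xs = [x0]
        for _ in range(steps):
            xs.append(r * xs[-1] * (1 - xs[-1]))
        return xs

    a = trajectory(0.400000, 50)
    b = trajectory(0.400001, 50)    # a barely different starting assumption

    for step in (1, 10, 25, 50):
        print(step, round(abs(a[step] - b[step]), 6))
    # The gap grows from about 0.000001 at step 1 to order 1 well before step 50.

The same qualitative behaviour, small differences in assumptions producing wholly different outcomes, is what limits any linear extrapolation of STEM's future structure.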

TECHNOLOGICAL TRENDS

The end of the twentieth century saw innovation in communications just as dramatic as had occurred with the introduction of moveable type by Gutenberg in the fifteenth century, and the subsequent launch of scientific journals in the mid seventeenth century.

Technological Advances

– Underlying the power of information technology is Moore's Law (Moore 1965). Moore, former chairman of Intel, pointed out that every eighteen months the number of transistor circuits etched onto a computer chip doubled. This law has held for the past 50 years – a tenfold increase in memory and processing power every five years. For example, the cost of a single transistor in 1961 was approximately $10; by 1968 the cost had fallen to $1. In 2009, Intel's processor chips had about two billion transistors, which gives a per-transistor cost of 0.000015 cents. As a technological driver it drives the current fall in prices for personal computers and the increased availability of devices such as smartphones, laptops and tablets. The uniqueness of the microchip – essentially just sand – is in how innovatively it is compiled. In future, developments in quantum computing will make an additional impact on efficiencies in hardware technology.
– At the same time as Moore's Law was having its effect on hardware there were rapid technical improvements taking place in telecommunications. These telecoms advances are improving interpersonal online connectivity. The total bandwidth of the communications industry, driven by developments in data compression made possible through fiber optic strands through which information can pass, is tripling every year. This effect is referred to as "Gilder's Law" (Gilder 1993).
– A further law is "Metcalfe's". Metcalfe, developer of Ethernet, observed that the value of the network that is created by the above is proportional to
the square of the number of people using it. The value to one individual of a telephone depends on the number of friends, relatives and business acquaintances that also have phones – double the number of friends and the value to each participant is doubled and the total value of the network is multiplied fourfold. The triple effect of faster, better, cheaper technologies – affecting processing, storage and bandwidth – come together online, which is why there are so many free information services available, such as Google, YouTube, Flickr, Facebook, Amazon etc. These services can be free to the user because operating costs are negligible, spread over a global market, and significant revenues come from sponsors and advertisers who revel in the wider audiences open to them. Technological advances provide the means whereby content – books, journals, articles, data and supporting multimedia – can flow more quickly and efficiently through the research system. In addition, these technical developments create an infrastructure which supports a different environment to print-based information – one which is heavily digital-orientated and within which both knowledge workers generally and academic researchers specifically can operate and collaborate. The situation is not static; it is highly dynamic. Some computer scientists believe that new “neuromorphic” microchips which have machine-learning protocols hard-wired into their circuitry will boost computers’ learning ability even further in future. Taking this a step further, if computers are advancing so rapidly, and if the natural state of people’s mental responses are slow and error prone, why not take the human factor out of the equation altogether and build self-contained technical systems? “We need to let robots take over” declared Kevin Kelly in a 2013 Wired article (Kelly 2013). In his view this would improve efficiency, eliminate errors and reduce costs/prices. Apocryphal, and perhaps disturbing in its implications. There is also an ongoing refinement to powerful global search engines. Gateway services such as Google Scholar, Medline, Yahoo, PubMed, Scirus and Web of Science are important initiatives in raising awareness of relevant sources of information such as in published research articles. These platforms and services enable links to remote information sources to be made more transparent, quicker and easier. They are also reliant on technological progress achieved and being made in the future. This highlights the concept espoused by McLuhan (McLuhan 1964) in his book Understanding Media in which he foresaw the significance of the media itself in dictating the impact of the information revolution. His enigmatic
aphorism – “the medium is the message” – has become a popular mantra supporting the idea that content matters less than the medium itself in influencing how we think and act, that technology will trump editorial as the information industry reaches maturity. These technological advances come in two forms – there are technological changes which arise from improved automation, increasing the efficiency of current book/journal publishing and online processes. There are also technological advances which arise from innovation, also referred to as “disruptive technologies”, which create new paradigms different from print-based services. The combination of automation and innovation is changing the operations, the information support systems and the future paradigm for STEM. The STEM information scene which existed ten years ago is substantially different from that which exists today and will be radically different again within the next five years driven by such technological advances. Technology provides new opportunities for existing and different mechanisms for conducting STEM research. However, there are several pundits (including Carr (2016)) who claim that the focus on automation and machine learning, and their emphasis on technical efficiencies, could lead to balanced cultural values being undermined. In effect, software development will be carried along on a wave which will change the work and leisure activities of society and diminish creativity as knowledge workers grapple with the increasing dominance of screenbased information services and their distractions. Nevertheless, in the short-term, greater availability of relatively inexpensive devices such as laptops, netbooks, eBook readers, smartphones, Google Glass and tablets has increased access paths for potential readers of digital information. Networks of users are being created, with communication being both cheap and reliable. There is now an established technical and information infrastructure in place, ready to support delivery of research output in whatever format. Both knowledge workers as well as academic researchers are beneficiaries of all these technological trends and bring them closer to the doorstep of the digital revolution in scientific research.
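
The compounding implied by the three laws described earlier in this subsection can be made concrete with a little arithmetic. The short sketch below is purely illustrative: it encodes the popularised versions quoted above (transistor counts doubling roughly every eighteen months, bandwidth tripling annually, network value growing with the square of the number of users), and its starting values are round-number assumptions rather than measured industry data.

    # Round-number assumptions, not measured industry data.

    def moores_law(start_count, years, doubling_period_years=1.5):
        """Transistor count after `years`, doubling every eighteen months."""
        return start_count * 2 ** (years / doubling_period_years)

    def gilders_law(start_bandwidth, years, factor_per_year=3):
        """Total bandwidth after `years`, tripling annually."""
        return start_bandwidth * factor_per_year ** years

    def metcalfe_value(users):
        """Network value taken as proportional to the square of the user count."""
        return users ** 2

    print(round(moores_law(1, 5), 1))                  # ~10.1 -- the "tenfold every five years"
    print(gilders_law(1, 5))                           # 243 -- bandwidth up 3**5 over five years
    print(metcalfe_value(200) / metcalfe_value(100))   # 4.0 -- double the users, quadruple the value

On the same basis, the per-transistor figure quoted earlier follows from spreading a chip price of a few hundred dollars (an assumed figure) across roughly two billion transistors.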

The Internet and the Web

The impact of the Internet on scientific communication cannot be exaggerated. It has transformed the information-seeking habits of researchers. According to Weinberger in his book Too Big to Know (Weinberger 2012) there are several aspects to the role which the Internet has taken:
– The Internet connects many people. The worldwide population is estimated at over 7.6 billion (in 2017), of which 4.02 billion are connected to the Internet (and over 2.7 billion were allegedly users of Facebook in 2018). This offers a huge reach into global society, achieved within the past three decades using Internet protocols. – This has spawned concepts such as “the wisdom of crowds” (Surowiecki 2004) which challenges the authority of the expert (Nichols 2017) and highlights the importance of input from an audience with different specialist experiences. It has led, for example, to crowd-sourcing (Howe 2006) and citizen science as important research trends. It supports the notion that increased democracy may become a feature of STEM in future. – In the past, working as a singleton in small teams on carefully defined research topics may have been appropriate and practical. In future, it may be judged inefficient to cope with new research problems. By combining the skills, experiences, knowledge and network contacts from disparate research areas in unison, such collaborative sharing could lead to the right solutions being found more quickly. The Internet facilitates such crossdisciplinary, cross-institutional, cross-global interaction. It has resulted in the “hive mind” or “collective intelligence”, the latter concept having been described by Neilsen in his book Reinventing Discovery – the new era of Networked Science (Neilsen 2011). – In a related vein, it has been calculated that everyone is six social connections away from each other. This is the number of links in the communication chain which has to be gone through in order to reach someone who may be initially unknown to the individual. This “six degrees of separation” is reduced by the Internet and similar technological trends to no more than four other people and may eventually become zero as “friends of friends” becomes an integral part of online communication. It has an impact on the speed and efficiency of online communication. – The Internet is cumulative. The Net retains everything posted to it and makes the historical record easily accessible. This has negative aspects, but also positive in that it is an open record or archive of all that has been said that is important, indifferent and/or innovative. – The “cloud” of linked computer power provides almost limitless storage of digital records. It is the basis for building on the past, for “standing on the shoulder of giants”, for establishing precedence even if it lacks the professionalism involved in permanent archiving and curation. – The Internet is flexible. It allows for unprecedented back and forth communication through online services. Millions of people can participate, but
equally small groups of tens or more can take part in highly specialised and targeted dialogue through bespoke forums on the Internet. The importance of the Internet is that it is bi-directional. It allows messages to be both sent and received. This is different from other media, notably printbased publishing in STEM, and has led to services such as Wikipedia, YouTube, Flickr, Huffington Post etc. taking off. According to Weinberger, “the complex, multiway interactions the Net facilitates means that networks of experts can be smarter than the sum of their participants” (Weinberger 2012). It opens a new approach to research interaction and effectiveness whereby the dominant force is no longer the skills of a few experts but rather the interaction of a broadlybased crowd. This supports the inclusion of expertise from as wide a group of researchers – including knowledge workers outside academia. There are sociological aspects through which usage of the Internet is still being investigated, but early research described in Greenfield’s book (Mind Change) suggests that: (a) People tend to lose themselves when going online (b) Only true physical friendship succeeds and is lasting; online friendships do not (c) There is a strong link between excessive Internet use and autism. Autism involves a distrust of empathy, hence people losing themselves on the Internet (d) Internet use can exacerbate an otherwise poor social relationship None of these are directly relevant to the research community; it merely highlights that there are unexpected sociological consequences from moving into a fully Internet-based information society which may need consideration as the Internet intrudes more into STEM information communities. Unlike print-based publishing, the Net’s extensive connectivity does not respect physical boundaries which is a feature of academic research. It enables uncredited people outside academia to mix online with those inside university research centres. It allows various directions to be followed, to get involved in conversations tangential but nevertheless occasionally significant, with wider research communities: “The Net refuses to keep information apart from communication and apart from sociality” (Weinberger 2012). Much of new social media is dependent upon the infrastructure provided by the Internet. According to data from InternetWorldStatistics.com on July 29, 2012, the following global penetration of Internet has been measured.

Tab. 1: World Internet Usage and Population Statistics (2011). The table lists, for each world region (North America, Oceania/Australia, Europe, Latin America, Asia, Africa) and for the world total, the population in millions, the number of Internet users in millions, Internet penetration as a percentage of population, and the growth in usage. Sources: Internet usage and world population statistics are for December 2011. Population numbers come from the US Census Bureau; Internet usage data comes from the International Telecommunications Union (World Internet Usage and Population Statistics, 2011).

A cultural divide exists, between those generations which grew up before digital technology and those who know no other world than one in which interactive digital technology has become ubiquitous. Each generation expects unique requirements for information to be met; in many respects, these expectations are defined by attitudes or major events which took place during formative years, but also rely on the current and future availability of new effective communications technology. This is particularly true of the “digital generation” in which broadband, iPods, mobile phones, laptops and iPads became all-pervasive and essential features of current lifestyles and online video gaming and interactivity gain increasing popularity. It affects the way the new generations communicate in their daily lives. An international survey conducted in 2009 reported that Europeans spent eight hours per week online (a 30% increase from 2005) which amounts to 30% of their leisure time. In China, 44% of individuals’ leisure time was similarly spent online. Though the use of TV viewing has grown during this period of Internet use, this is not true of other media types such as newspapers, magazines and books. Key functions which can be found in this new digital communal world are “connections” between information artefacts; “links” between items; “transparency” and “openness”. These are now an intrinsic feature of the Internet culture. There are also virtues such as “publicness”, “generosity” and “listening”
which build up trust in the communication system. In addition, other aspects such as efficiency and technical competency are offered (Neilsen 2011) The Internet and Web’s richness in content and process carries with it an aspect which has potential limitations. The Web is a technology which supports forgetfulness. Researchers will increasingly rely on Internet search tools and large data sources to fill in the blanks when conducting research. Reliance on an analytical framework developed by the individual to cope with progress in research is no longer necessary – the computer and telecoms come to their aid, and potentially take over some of the intellectual analysis. User behaviour will change as a consequence.

Web Versus Apps

The Internet itself is also in transition. It was the transport vehicle for the World Wide Web, but recent years have seen the decline of the Web in favour of semi-closed platforms and applications or Apps. The latter have witnessed the rise of the iPhone model at the expense of HTML which constrains Google-like crawling: "Dedicated platforms often just work better or fit better into their lives" (Anderson and Wolff 2010, 3). They are designed for a single purpose and an optimal mobile experience. A similar dichotomy exists with online access. As a proportion of US Internet traffic, the Web had declined from a peak of about 50% in 2000 to 23% in 2010 and is still shrinking. The emphasis on specific platforms such as Facebook and iTunes has emerged. Video (51%) and peer-to-peer communication (23%) had by 2010 taken dominant positions on the Internet, all of which indicates that use is increasingly made of the Internet, not specifically the Web. This issue is highlighted graphically by the different technological positions adopted by two leaders in the information industry, Steve Jobs and Bill Gates. Bill Gates adopted an open architecture for the Microsoft operating system which in a short time became an industry standard. Many computer manufacturers licensed the system and built it into their own products, in different ways for different purposes. The result was rapid adoption of Microsoft software but within a plethora of competing products, potentially confusing the end user in the process. The alternative model pursued by Apple was end-to-end integration to create a uniform customer experience – hardware, software, content and applications. Both models were successful, with both approaches leading to technical progress. Though only a few of the evolving Internet-based services focus on scientific research, it raises the probability that traditional inequities in the scientific
communication process could be overcome as further technical advances are introduced. Recent developments on the Internet support the idea that there will be more interaction, particularly among knowledge workers, as the so-called “generative Web” takes hold where openness prevails. This makes the basic assumption, however, that the mindset within the research community is amenable to adopting new technical advances in communications – that they relate proactively with technical developments. There are many indications that in the broader world of online entertainment and communications such adaptation to the opportunities and constraints of IT and the Internet is occurring.

Challenges to the Internet

The Internet is nonetheless facing growing challenges as sceptics highlight problems in how it is being exploited. These centre on the data circulating freely in cyberspace, which is captured by some companies and used for purposes not agreed to by the providers of that data, particularly where the data is private, confidential and personal. Several organisations or platforms collect such data from a variety of sources and build profiles of users covering all aspects of their lives. They then sell this data to advertising companies, or to companies monitoring political or economic attitudes. There is no openness or transparency in how they allegedly operate, in contrast to the intent of the early pioneers of the Internet, who saw it as an open and free public service.

As indicated earlier, their reliance on monitoring online activity, and linking this with other demographic data resources, gives them the ability to read the minds of their online clients. This could result in abuse unless safeguards are introduced to limit what is done with collections of data about usage behaviour. Safeguards could come either through voluntary protocols developed and implemented by the main online organisations, or through government intervention to ensure that fairness and equity are enforced. The likelihood of voluntary controls being agreed, given the “get rich quick” nature of the largest Silicon Valley-based players (GAFA in particular), seems distant. However, the European Commission has become aggressive in opposing the commercial activities of several of the leading media platforms, notably Google and Amazon, and Facebook is also coming under scrutiny. These actions may change the culture of the Internet, returning it to the original public service mentality of the pioneering Internet developers (such as Bob Kahn and Vint Cerf) and World Wide Web developers such as Sir Tim Berners-Lee.


As described by Andrew Keen in his book The Internet is not the Answer (Keen 2015), the legacy being left by the large media platforms is a divisive society. Rather than uniting the community by enabling equal and free access to information, it polarises society, with the few gaining great benefits and the many being sidelined (and in some cases rendered unemployed in their thousands). The fallout from this conflict, and how it gets resolved, is relevant for the future of the STEM business. It sets the guidelines within which research-based support services will operate in the future. It opens the door for government intervention to ensure stability, equity and security within the economy. In the 2010s there are indications of an awakening of concern by public authorities about the activities of organisations such as Google, Apple, Facebook, Snapchat, Uber and Twitter.

Mobile Devices (including Smartphones)

There is also a switch taking place at the individual level, with migration from desktop to laptop to handheld devices (such as smartphones) to wearable units as mechanisms for information gathering and use. This trend puts an onus on how STEM information could be formatted to suit this personalised technological infrastructure (Nicholas and Clark 2013a).

There are more than 6 billion mobile phone subscriptions throughout the world. Of these, about 2 billion are smartphone users with connections to the Internet, a figure estimated to double to 4 billion by 2020 (Economist 2015). In their 2015 Communications Market Report, Ofcom found that smartphones had overtaken laptops as the most popular device for getting online (Ofcom 2015). Two thirds of the UK population now own a smartphone, using it for nearly two hours every day to browse the Internet, communicate, access social media, bank and shop online. Ofcom also found that a third (33%) of Internet users see their smartphone as the most important device for going online, compared with 30% who preferred their laptop. The rise in smartphone use marks a clear shift since 2014, when just 22% turned to their phone first and 40% preferred their laptop.

Within STEM specifically there is anecdotal evidence of smartphones being used to access STEM material, though more in the form of metadata and abstracts than full-text articles and reports. Approximately 10% of STEM usage is from mobile devices, higher in areas such as clinical medicine (Ware and Mabe 2015).


According to CIBER Research, the mobile revolution will result in further disintermediation within scientific communication. With the change in screen size available on smartphones and tablets comes a change in the way information about research output is formatted, sought and delivered online. Research undertaken by CIBER on the EU Europeana project involved an analysis of the usage logs of this cultural, multimedia website (Nicholas and Clark 2013a; 2013b). The information behaviour of 150,000 Europeana mobile users was examined in 2012 and compared with that of desktop users.

The main findings were that mobile users are the fastest-growing sector of the user community, with a growth rate five times greater than that of PC and desktop users. Mobile telephony generates a “time shift” in behaviour: visits differ from those made using desktops. Mobile phone visits are information “lite” – typically shorter, less interactive, and with less content viewed per visit. Use takes on a social rather than an office rhythm, peaking at nights and weekends. Many Europeana site visits occurred on Saturday nights for mobile users; for fixed devices such as PCs the peak was Wednesday afternoons.

The stimulus behind the growth of mobile telephony for scientific information and cultural purposes is that people trust their mobile and smartphones, which are convenient and ubiquitous. It also indicates a merger of entertainment and scholarship through the medium of this small portable device. It appears “instead of information-seeking and reading taking place in the library and office, it will take place on the train, coffee shop, and around the kitchen table” (Nicholas and Clark 2013a). The varied environment and context changes the nature of searching and reading, according to CIBER:

While the first transition, from the physical to digital, transformed the way we seek, read, trust, and consume information, until relatively recently the environment and conditions in which scholars conducted these activities had not really changed – it was still largely in the library or office, sometimes the home. However, with the second transition to the mobile environment, information behaviour is no longer mediated or conditioned by the office or library (and its rules and impositions), but by the street, coffee shop, home; in a nutshell by current social norms (Nicholas and Clark 2013a).
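
The kind of log comparison CIBER reports can be illustrated with a small sketch. The record layout, field names and sample values below are hypothetical and are not drawn from the Europeana logs; the sketch simply shows how visit length, content viewed and peak visiting days might be compared across device types.

```python
# Illustrative comparison of mobile and desktop visits from a usage log,
# loosely in the spirit of the analysis described above. The field names
# and the sample records are hypothetical, not Europeana data.
from collections import defaultdict
from statistics import mean

# Each record: (device_class, weekday, hour_of_day, pages_viewed, minutes_spent)
sessions = [
    ("mobile",  "Saturday",  21, 2, 3.0),
    ("mobile",  "Saturday",  22, 1, 2.5),
    ("mobile",  "Sunday",    20, 3, 4.0),
    ("desktop", "Wednesday", 15, 8, 12.0),
    ("desktop", "Wednesday", 14, 6, 9.5),
    ("desktop", "Tuesday",   11, 7, 11.0),
]

by_device = defaultdict(list)
for device, day, hour, pages, minutes in sessions:
    by_device[device].append((day, hour, pages, minutes))

for device, visits in by_device.items():
    days = [v[0] for v in visits]
    peak_day = max(set(days), key=days.count)        # most common visiting day
    avg_pages = mean(v[2] for v in visits)           # content viewed per visit
    avg_minutes = mean(v[3] for v in visits)         # visit length
    print(f"{device}: peak day {peak_day}, "
          f"{avg_pages:.1f} pages/visit, {avg_minutes:.1f} min/visit")
```

On such toy data the mobile visits come out shorter, with less content viewed and a weekend peak, which is the shape of the behavioural contrast CIBER describes.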

Mobiles are part of a consumer purchasing process; they are used to search for information prior to purchase, during the process itself, and to make the actual purchase (Nicholas and Clark 2014b). It is possible that knowledge workers, who are also digital consumers, will have scope for moving down an analogous purchasing route when accessing STEM information in future. There is also a pricing and charging mechanism to be considered, one more closely linked to apps than to traditional printed-publication pricing.
In practice there is already a procedure for paying for services through mobile phones which could be used for STEM purchases. Suggestions such as cryptocurrencies, including Bitcoin, have also been made, though a cloud may hang over introducing such risky transactional systems into a staid, conservative STEM research environment. The various technical trends are brought together in the following figure.

Fig. 2: Technical trends, developed for this book by the author. The figure groups the trends converging on researcher behaviour: access devices (PCs and laptops, mobile/smart phones, datasets and logs), Internet and Web services (listservs, blogs, Skype and webinars, Google and Yahoo, discipline-focused search, open systems and links, new information services) and their drivers (grass roots, publisher, library/other, and search services).

SOCIOLOGICAL CHANGES

Behavioural changes are occurring at the level of the individual researcher and of the group project, and are also being driven by general demographic trends within society. Hitherto ignored in assessing STEM are the changes taking place in an individual researcher’s brain.
This is a new area of research, one which is gaining prominence as it relates to other important social and medical challenges such as confronting ageing, reducing dementia, and tackling Alzheimer’s, Parkinson’s and other neurological diseases. If there are changes in the way the brain operates, and these can be identified and related to the STEM research process, then such changes could figure within the design criteria for future STEM products, services and processes.

Neurological Studies

Neuroplasticity describes how synapses and neurons within the brain adapt to changing stimuli. It is the mechanism which enables individuals to adapt to, and cope with, external challenges in tandem with internal genetic traits.

Within the brain there are neurons, each brain (roughly 1.4 kg) containing approximately 85 billion of them and generating trillions of signals between them. Neurons have central cores (or somas) which carry out functions common to all cells, and two kinds of tentacle-like appendages – axons and dendrites – which transmit and receive electric impulses. When a neuron is activated, a pulse flows from the soma to the tip of the axon, where it triggers the release of chemicals known as neurotransmitters. These flow across the synapse and attach themselves to a dendrite of a neighbouring neuron, creating a dense mesh of circuitry within the brain which triggers or suppresses electric pulses. It is through the flow of neurotransmitters across synapses that neurons communicate with one another. Thoughts, memories, emotions – all emerge from the electrochemical interactions of neurons.

Every time a task is performed a set of neurons in the brain is activated. If they are in proximity, the neurons join through the exchange of synaptic neurotransmitters such as the amino acid glutamate. If the same experience is repeated, the synaptic links between the neurons grow stronger through both physiological (the release of higher concentrations of neurotransmitters) and anatomical (the generation of new neurons) effects. By the same token, synaptic links can also weaken from lack of activity. The brain is constantly changing as it adapts to different conditions: it can reorganise itself after accidents to limbs (the “phantom limb” syndrome among amputees) or the loss of senses such as sight, the remaining senses being enhanced to mitigate the loss of functionality.

The plasticity of the brain, its coping with changed circumstances, is a factor in dictating how researchers are adapting to the technologically-driven new communication media.
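
The strengthening-through-repetition and weakening-through-disuse just described can be caricatured in a toy Hebbian-style update rule. The sketch below is a deliberately crude illustration rather than a neurophysiological model; the function name and the learning and decay rates are arbitrary assumptions.

```python
# Toy illustration of synaptic strengthening through repetition and
# weakening through disuse (a caricature of Hebbian plasticity, not a
# model of real neurophysiology; the rates are arbitrary).
def update_weight(weight, co_active, learning_rate=0.2, decay=0.05):
    """Strengthen the link when two neurons fire together; let it decay otherwise."""
    if co_active:
        weight += learning_rate * (1.0 - weight)   # repetition strengthens, saturating at 1
    else:
        weight -= decay * weight                   # inactivity weakens the link
    return weight

w = 0.1
for _ in range(10):
    w = update_weight(w, co_active=True)           # repeated joint activity
print(f"after repeated use: {w:.2f}")
for _ in range(10):
    w = update_weight(w, co_active=False)          # link left unused
print(f"after disuse: {w:.2f}")
```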

The individual’s actions are determined not only by nature but also by nurture: “The genius of our brain’s construction is not that it contains a lot of hardwiring but that it doesn’t” (Carr 2010). Evolution has given us a brain that can literally change its mind repeatedly. This compact, light, intricate, versatile, often self-repairing, water-cooled facility – the brain – becomes an important aspect to be included in designing future STEM information services. It also highlights that biological memory in humans is different from digital memory in computers. The human mind is constantly being reworked; hardwired digital memory is essentially static. This difference is important, as it suggests that the flexibility of the human mind can provide innovation and creativity which is not necessarily a feature of an automation-led society. The implications of extending such research to individuals, groups and the whole community are significant.

A report published by a research team headed by Professor Eleanor Maguire, a cognitive neuroscientist at University College London (UCL), showed that the brains of London’s taxi drivers change and grow as they develop their knowledge of the city’s streets. The part of the brain dealing with navigation – the posterior hippocampus – was larger among the test case of 16 taxi drivers than among their peers (Maguire et al. 2000). Taxi drivers constantly use the navigational part of the brain to optimise taking passengers to their destinations. More disturbing is that taxi drivers enlarge their posterior hippocampus at the expense of the adjacent anterior hippocampus, which reduces their ability to cope with other spatial processing tasks. It suggests the brain is like a muscle: it can expand and be enhanced by constant use and exercise, sometimes at the expense of other brain functions.

In Mind Change, Baroness Greenfield claims technologies are creating a new environment and our minds are physically adapting: they are being “rewired” (Greenfield 2005; 2015). Though her conclusion of a rewired brain, particularly among children, is not universally accepted – critics point to her reliance on anecdotal data rather than extensive evidence-based results – her opinions have nevertheless sparked much debate.

Flitting Behaviour

One view proposed by Greenfield and others is that reading habits are affected, and that short items are preferred over longer ones. A switch to skimming and away from in-depth reading has consequences. Greenfield claims that by ingesting only small bytes of information – rather than getting involved in detailed linear reading – the NetGeners (the Internet Generation) will fail to develop the intellectual skills necessary for higher order thinking.
They may develop a digital version of ADD – attention deficit disorder – zigzagging between ideas without the contemplative finishing of anything (Greenfield 2005). This flitting behaviour can be related to the use of global search engines, which highlight the more popular results from an online query. They also provide easy links to related articles which again drill down to the specific issue rather than facilitating in-depth reading of contemplative texts across different disciplines. There is a trade-off: the willingness to get absorbed in the unfolding, contemplative argument of a three-hundred-page book conflicts with the benefits of having access to many more external influences, and therefore of serendipitous creative thought. For many, the thought of reading a book from beginning to end has become old-fashioned. In effect, it represents a move from a stable, linear progression to a complex, interactive and disparate information world.

Several pundits challenge the benefits of digital use. Carr (2008) wrote in an article in Atlantic Magazine (July/August 2008), and again in his book The Shallows (2010), that Google was making us stupid. His argument was that the snippets of information which the information explosion has created, and made available through Google, come at the expense of in-depth reading of books and articles. There is no longer any “quiet space” into which end users can retreat. He claimed “A new intellectual ethic is taking hold. The pathways in our brain are [again] being rerouted”. Carr also emphasises that the synapses require reinforcement to remain “live” (Carr 2008; 2010). He no longer reads articles or books, as he and his colleagues find it increasingly difficult to concentrate on lengthy text. Other researchers have shown that the process of online searching effects changes within the dorsolateral prefrontal cortex, changes which are not apparent in those who rely on printed literature for their knowledge input (Small 2008).

The result is that the mode of communication will change as our reading patterns change. It could be that lengthy descriptions in scientific reports become passé as the main vehicle for future scientific communication, with snippets, abstracts, synopses or, more likely, granularised parts of a report or dataset becoming more important. The ease of looking something up on Google is not only transforming memory strategies – with search engines an outsource for recalling objective facts – but also the thought process itself. Eric Schmidt, chairman of Google, became almost apologetic about his institutional affiliation when he reported “I worry that the level of interrupt, the sort of overwhelming rapidity of information. . . is in fact affecting cognition. It is affecting deeper thinking. I still believe that sitting down and reading a book is the best way to really learn something, and I worry that we’re losing that” (Schonfeld 2009).


Distraction

Another example of the problems generated by the rise of easily accessible digital information is that it creates an overload in a researcher’s working memory, which in turn affects the ability to assimilate new information into the larger long-term memory. Cognitive overload of short-term memory is created not only by the increased information available on the Net but is also compounded by the many hypertext links introduced into online text and, more recently, by additional links to multimedia content. Unlike in education, where such resources can be harnessed and directed at improving the educative process, with STEM research information too many distractions can be introduced, which can be counterproductive. Such distractions have been shown in research studies to reduce the effectiveness of the research process.

There is a dichotomy: on the one hand the Net empowers researchers with the ability to keep up to date, while on the other it creates too many distractions which run counter to a clearly delineated research project. This is the dichotomy facing Google. On the one hand it aims to be the world’s online library; but in doing so it is in the business of facilitating distraction. It acts against contemplative reading in favour of generating frequent clicks, which in turn feed into Google’s lucrative AdWords advertising algorithm.

Though these are considered controversial views in some quarters, they do imply that traditional information habits may be modified as a result of developments taking place in neurological processes. More specifically, this has implications for how information should be formatted to cope with the needs and habits of a wider knowledge worker audience which does not, and has not, depended on lengthy, specialised textual treatises for its information. These differences between the traditional and emergent researchers also find their roots in the structure of the social group.

Natural Group Size

In assessing groups rather than individuals, Dunbar, an Oxford University anthropologist, suggested that the “natural size of the [social] group” is about 150 individuals (Dunbar 1992). Dunbar claimed that “this limit is a direct function of relative neocortex size, and that this in turn limits group size. . . the limit imposed by neocortical processing capacity is simply on the number of individuals with whom a stable inter-personal relationship can be maintained” (Dunbar 1992).
These are relationships in which an individual knows who each person is in their contact network, and how each person relates to every other person. They can be considered “friends” in contemporary online parlance. The Dunbar number is about the same as the number of people in a typical pre-industrial village, a professional army unit, a century in the Roman army, a Hutterite farming community and, more relevantly, a scientific sub-speciality in its early days of development.

However, social networks and services such as Facebook have created a new form of social bonding which replaces the traditional idea of the natural group size. Facebook, LinkedIn, WhatsApp and Snapchat all enable many thousands of people to group together in a communication network without creating tensions and pressures on the system or the people involved. NetGeners use services such as Twitter which are orders of magnitude larger, far more sophisticated and much more efficient than the print-based networks available to older generations served by printed books, journals and written correspondence. This extension beyond the Dunbar figure, with hundreds or thousands of people forming part of a researcher’s social network, enables researchers outside the traditional world of a close-knit academic audience to be reached. Unaffiliated knowledge workers can therefore more easily be brought within the developing capacities of the technology-enhanced neocortex. Democracy becomes an essential corollary of the emerging digital world, generally and notably within STEM. This increase in democratic involvement through online media platforms comes with several downsides, however: it enables misuse of users’ behaviour patterns for the sake of commercial gain (see Challenges to the Internet, above).

Cognitive Surplus

Improvements in productivity over the centuries have given individuals more personal free time, enabling them to perform more non-work-related activities (Shirky 2010). This could provide the basis for greater participation in network science, collective intelligence and the use of social media. It allows individuals to be not just consumers (of broadcast material) but also active participants and producers. It is claimed that TV watching in the UK involves over one trillion hours of “free time” being spent (Ofcom 2014).
If only a small percentage of this were to switch over to science-based sharing and collaborative platforms, it would mean a radical change in the structure of the information and entertainment industry. This is the untapped latency which exists in society and which could be made available, provided the right motivations to participate in STEM can be found.

Fig. 3: Sociological trends (see Brown, D. 2016, p. 23). The figure relates researcher behaviour to educational trends (educational attainment, generational differences, university challenges), research and R&D trends (workflow processes, “Big Science”, datasets and logs) and sociological factors (information overload and the “Google makes stupid” debate, social media, collaboration and sharing, automation, mapping the brain and neurological changes, communication versus verification, social interaction).

DEMOGRAPHY

At the vanguard of advances into the new digital world is the child. Research studies referred to earlier show that the developing brain of a child is more plastic, and responds more malleably to experiences, than the adult mind (Greenfield, Mind Change, 2014, 25). This provides context for several demographic changes which could dominate the future global research scene.


The Digital Scholar

Have these new informal communication channels made an impact on formal scientific communication? So far there is little evidence of a breakthrough. As Weller writes in his book The Digital Scholar (Weller 2011):

These emerging themes [crowdsourcing, lite connections, online networks] sit less comfortably alongside existing practices and can be seen as a more radical shift in research practice. A combination of the two is undoubtedly the best way to proceed, but the danger exists of a schism developing between those who embrace the new approaches and those who reject them, with a resultant entrenchment to extremes on both sides. This can be avoided in part by the acknowledgement and reward of a new form of scholarship. (Weller 2011)

The schism reflects different patterns of behaviour between researchers as either digital immigrants or digital natives. This was highlighted during interviews conducted by phone for this report among UK researchers. The “dinosaur” sector, which focuses on formal publications and personal contacts as the prime source for scientific updates, accounted for half the respondents (digital immigrants). The other half were self-confessed “grubbers” among all that social media has to offer (digital natives). This emphasises a difference in approach to information between the two groups. Schonfeld made the point that academics have been groomed by the Internet experience:

Academics’ expectations for user experience are not set by reference to improvements relative to the past, but increasingly in comparison with their experiences on consumer Internet services and mobile devices (Schonfeld 2015).

Science communication is slowly adapting to these changes, and to a world where Google dominates the search space and Amazon makes online purchasing easy; where eBay and PayPal set the parameters for selling and buying individual items; where Skype and Viber make connections and communications interactive, cheap and easy; and where Facebook, Twitter and LinkedIn open awareness of personal and professional activities. With new paradigms being explored as society moves towards openness, and as technology platforms are created to satisfy emerging social trends for rapid communication, there is an opportunity to rewrite the scientific communication manual and not be constrained by past practices and the survival needs of current stakeholders. The manual should include chapters on integrating knowledge workers generally within the STEM system.


The Net Generation

There is the rise within society of the so-called “NetGeners” (born since the emergence of the Internet), also labelled Generation Y. Their reliance on informal, digital information sources differs from the traditional reliance on formal printed books and journals as the primary means of scientific communication. The following table summarises the main generational classifications currently in use:

Tab. 2: “The Generations” – a UK overview.

Social sector         Proportion of total (Ofcom)   Born within years   Related definitions
Pre-boomers           …%                            up to …
Boomer Generation     …%                            …–…                 Baby boom generation
Generation X          …%                            …–…                 Baby bust generation
Net Generation        …%                            …–…                 Millenniums or Generation Y
Next Generation       …%                            …–present           Generation Z

These “generations” reflect a gradual transition from the print age into the digital, with the younger generations seeing the digital options as more attractive than earlier print-based information systems. Such generational typology exists as much in the sector of knowledge workers as within the affiliated. Support for this generational divide comes from anecdotes included in published works by Greenfield, Carr and Tapscott (Greenfield 2014; Carr 2010; Tapscott 2008).

Marc Prensky, an American technologist, coined the term “Digital Native” for someone defined by their perceived outlook and abilities with reference to digital technologies. By contrast, “Digital Immigrants” are those who have adopted aspects of new technology but still have one foot in the past (Prensky 2001). Digital Natives know no other existence than through the culture created by the Internet, laptops and mobiles: “They can be freed from the constraints of local mores and hierarchical authority and, as autonomous citizens of the world, they will personalize screen-based activities and services while collaborating with, and contributing to, global social networks and information services” (Greenfield 2014, 4–5).


Another advocate for the change in the way digital natives behave is Donald Tapscott. Tapscott’s books – Growing up Digital (Tapscott 1998) and Grown up Digital (Tapscott 2008) – are based on evidence collected from the new generation (NetGeners). He claims:

NetGeners exhibit a powerful social conscience. They are more participative within society, more collaborative, and supportive of greater openness than earlier generations. They make considerable use of Internet communication tools now available to them. NetGeners have different mindsets and skills, created through early exposure to IT, interactive gaming, and the Internet. These skills are just as relevant as the old linear skills learnt by the Baby Boomer generation but are more appropriate in taking advantage of the opportunities which informatics, the Internet and digital communication systems offer.

In pre-digital times, during the Boomer and Generation X periods, the focus was on collecting “eyeballs” and establishing site stickiness, but overall it meant using static presentation platforms for broadcasting to audiences. The big change came with XML, which allowed collaboration and interactivity in creating communities with like interests: “The old Web is something you surfed for content. The new Web is a communications medium that enables people to create their own content” (Tapscott 2008). The Net generation is in many respects the antithesis of the TV generation. This shift from one-way broadcast media to interactive media is profound. The distinction between bottom-up and top-down organisational structures is also at the heart of the new generation, with the NetGeners relating more closely to a more democratic, bottom-up approach.

Tapscott explored eight characteristics which differentiate the Net Generation from earlier generations. These include:
– Freedom is prized
– Customisation and personalisation of things for their own specific needs become important
– Collaboration, not diktats from above, is used to produce new extended relationships
– Traditional organisational structures and procedures are scrutinised more intensely
– Integrity and openness are demanded, as is transparency
– They want to have fun, be entertained and play
– Speed is a prerequisite
– Innovation becomes an essential feature of life

They influence each other through so-called N-fluence Networks – online networks of NetGeners who, among other things, discuss brands, companies, products and services. They do this by creating online content.
This can be in the form of blogs, wikis, bulletin boards, tweets or other online combinations. Some 40% of US teens and young adults have their own blogs, according to the US Pew Research Center (Duggan 2013a; 2013b). In this way “they are democratising the creation of content, and this new paradigm of communication will have a revolutionary impact on everything it touches”. It suggests that the writing is on the wall for broadcasting services – TV as well as newspapers, and possibly parts of the traditional science communication process.

As Shirky pointed out in his book Here comes Everybody (Shirky 2008), two decades ago the supply of published information created its own demand. Now demand is creating its own supply, which means user needs are driving the creation of products, and users are making their own results available online. Scarcity is no longer an issue in an era of massive digitisation, data compilation, cloud storage and tumbling technology costs. Instead, the elimination of noise and redundancy becomes the obstacle.

It also suggests a new working relationship with social institutions. NetGeners take an active part in creating new products and services which match their customised and personalised needs. This activity was first identified by Toffler when he referred to the “prosumer” (Toffler 1970). Tapscott extends this to “prosumption” – the interaction of consumption with production to influence the creation of useful products and services. Where barriers are put in place to restrict such collaboration, the NetGeners use social networks to convey their concerns. New bespoke information services for the researcher are being developed and launched by the research community itself, the popularity of ResearchGate being an example. This heralds a new approach to the publication of research output.

Whilst the main stakeholders argue over the merits or otherwise of promoting “free access” to research output, the more challenging need is to provide end users with what they need in a format which is wanted, in a manner which is interactive and collaborative, at a price which is acceptable, and within an overall context which enables all participating stakeholders to achieve a reasonable and sustainable financial and social return. This affects academics as much as knowledge workers in general.

Demographic Data on UK Researchers

The following table shows whether those who completed UK higher education remained in academia or moved into non-academic centres and potentially increased the cohort of the unaffiliated knowledge worker.


Overall, it shows that 70.2% went into forms of non-academic work, whereas approximately 15% remained in academia. Each year the number of unaffiliated knowledge workers continues to grow at a rate faster than in academia.

Tab. 3: Destinations of leavers by activity and level of qualification obtained 2015/16 (see HESA, 2014).

Activity                  Postgraduate   First degree   Other undergraduate   Total
Full-time work            …              …              …                     …
Part-time work            …              …              …                     …
Work and further study    …              …              …                     …
Further study             …              …              …                     …
Unemployed                …              …              …                     …
Other                     …              …              …                     …
Total                     …              …              …                     …

The following chart shows the distribution of new entrants to various trades and occupations among graduates of UK universities (HESA 2014). It shows how significant the professions are in attracting newly qualified graduates. The professions are a key sector in expanding the number of knowledge workers within UK society. To quote from an article by Park (CEO of DeepDyve, a Silicon Valley-based company selling published articles to individuals irrespective of their affiliation):

“In 1930, 25% of the US population of 122 million lived on farms and only 3.9% of the population had a college degree. Fast forward to 2006: just 2% of Americans live on farms, the US population had nearly tripled, 17% of Americans held a bachelor’s degree and nearly 10% a graduate degree” (Park 2009).

In the UK there is a similar growth in the proportion and number of the educated population, as the government seeks to increase attendance rates at higher education institutions. This leads to a more informed and scientifically aware society, and as such is a further stimulus for wider dissemination and understanding of scientific research results beyond traditional academic and research centres.
Fig. 4: Employment by Standard Occupational Classification, % male leavers to full-time occupations, 2012/13 (HESA 2014).

Targets set by successive governments in the UK, whereby college- and university-trained students increase as a proportion of British society, support this trend towards greater scientific awareness. Though graduate enrolment of 49% of the population has been claimed in a UKDBIS report, the actual percentage is nearer 35% (Ball 2013). In 2011 there were 20,076 PhDs awarded, making the UK the fourth largest producer of PhD graduates globally. The volume of PhDs awarded is an indicator of new research talent about to come on tap. A commitment to raising educational attainment in both developed and developing countries also creates an environment in which the dissemination of relevant research results falls on more fertile ground than existed in earlier decades.

The increase in the proportion of knowledge workers within UK society is occurring faster than for academics or for those in corporate R&D. This is fuelled by the growth in graduate and postgraduate output from the higher education system, with far more going into private and public service than remaining in academia on completion of their studies. This is indicated in the following table made available by HESA. There is a distinction in employment patterns according to discipline: in medicine and dentistry 92.2% go straight into the workplace, whereas the subject generating the greatest numbers going into further study (32.3%) is law. Nevertheless, paid workers outside further study (academia) are numerically almost five times more numerous than the graduates remaining within academia. Each year the “tail” of knowledge workers gets larger.


Fig. 5: Full-time first degree leavers 2012/13. (Source: HESA, 2014)

The flow of people through higher education and into a career can be categorised as a “pipeline” of talent, but one that narrows as individuals pass through it and are “siphoned off” into careers outside academia (Elsevier 2013). The following table gives a more detailed breakdown of the split of graduates between full-time employment and further study by occupational discipline in 2008/09. The key point is that of the 430,000 UK graduates in 2009 (only a portion of whom are reflected in the table above), 14% stayed on to become “privileged” or “affiliated” scientific information users, whereas the majority took employment in various professions and businesses in the UK, both public and private, and in effect became potentially unaffiliated from the mainstream of academic support services, such as access to scientific books and journals in research libraries.

The following table gives a breakdown of the disciplines and professions which took graduates as first-time employees in 2011/12. Nearly 50% of the leavers were in the sciences, with business studies, social science and the creative arts also figuring as important contributors to the UK knowledge economy. This is a global phenomenon among western countries, surpassed in its importance only by some rapidly expanding Asian Tiger countries.


Tab. 4: UK output of graduates into knowledge-based occupations, 2008/09. Higher Education Statistics Agency (HESA) 2014; Higher Education – Statistics for the United Kingdom, HESA June 2015. For each profession the table gives the overall numbers, those in paid work (numbers and %) and those in further study (numbers and %). The industrial and engineering occupations covered are IT strategy and planning, civil engineers, mechanical engineers, chemical engineers, design and development engineers, electronic engineers, production and process engineers, planning and quality engineers, quantity surveyors, bioscientists/biochemists, and physicists and geologists. The service occupations covered are medical professions, dentists, opticians, software professions, solicitors and lawyers, other legal professions, management and business, management accountants, psychologists, social science research, social workers, probation officers, public service, architects, town planners and veterinarians. Sub-totals and an overall total are given. (Source: HESA, June 2015)

Tab. 5: Destination of UK university leavers who obtained first degrees, by subject area and activity, 2011/12. Destinations of UK-domiciled leavers who obtained qualifications through full-time study, HESA 2010 (data for 2008/09, Table 3a). For each discipline the table gives the numbers entering full-time work, part-time work, full-time study and part-time study. The disciplines covered are medicine and dentistry, subjects allied to medicine, biological sciences, veterinary sciences, agriculture and related subjects, physical sciences, mathematical sciences, computer sciences, engineering and technology, architecture and building (with a total for science), social studies, law, business and administration, mass communications, languages, history and philosophy, creative arts and design, education and combined studies, together with a total for all subjects.

It is therefore no longer a domestic issue within the UK but one which has global relevance, the issue being how a traditional, introspective STEM information sector can reach out to include a broader audience which is increasingly science-aware.

CULTURE

It is suggested that STEM publishing is not robust. It does not serve users of research well.
It is restrictive, limiting usage exclusively to an institutional clientele and preventing the individual researcher, who may be on the fringes of the research effort, from having easy access (see STEM Dysfunctionality). In making these claims it must be emphasised that there is no single culture common to all scientific disciplines. Each discipline has its own peculiarities, shaped by circumstances deep within the science itself and the community built around it. There are also individual circumstances – personality, ability to adapt to change, access to funds, confidentiality issues – all determinants at grass-roots level which affect an individual’s ability to adapt to change.

There are nevertheless impacts on the research community which suggest that change in the STEM process will occur. It arises from the culture of scientific research clashing with the emerging cyberculture. Cyberculture does not encourage the development of the attention spans necessary for deep and sustained thought (see previous section). It fails to provide a conceptual framework that gives meaning to the world of research. It accentuates the here and now, not creativity and novel thinking.

Another powerful feature of the emerging culture of science is the extent of networking activity, which places emphasis on “cooperation”, “sharing” and “collaboration” to an extent not seen in the era of printed scientific communication. Not only is this evident in “Big Science” (Price 1963), highlighted in international projects such as the Large Hadron Collider (LHC), but it is also seen in the changed working practices of researchers in other disciplines and, in some instances, in the activities of the general public. The process of organising many researchers from all walks of life and from around the globe requires a skill set different from that required under the former practice of “little science” or individualistic research. If organised effectively, big science attracts wide community participation on a research topic, and thereby includes specialist knowledge workers as well as the affiliated – especially where a diverse set of skills proves valuable in tackling a multidisciplinary research problem.

However, such changes in culture and working practices raise new questions. Can the organisation of these emerging collaborative networks ensure that common procedures, standards and acceptable practices are adopted by all participants in the network, both academic and non-academic? What inducements or sanctions can be imposed to ensure consistency and quality by all those active within these new research networks? This is a question currently also facing the large established professions as they confront decomposition (Susskind 2015).


A further question is the extent to which researchers at the front line of research in academia and the corporate world are prepared to reveal their research activities to those without experience or qualifications in scientific research: whether these third parties can become active participants in a research team, and whether a broader platform of research provides sufficient additional benefits to compensate for the added effort of creating a network of participation and ensuring that it operates effectively.

There are therefore questions about how much trust and faith is placed in both the established, print journal-based system and the new informal digital communication systems by those who are actual users and, perhaps more significantly, by those who are still latent and potential users. What judgements are used to decide how much credibility to give different forms of published output? Some of these issues were addressed by an investigation funded by the US-based Sloan Foundation and involving the University of Tennessee, CIBER and several publishers. The study assessed behaviour patterns and wishes among scholars and was completed in November 2013 (Tenopir and Nicholas 2014; Nicholas et al. 2014a).

No firm conclusions can be drawn on how cooperation and collaboration will develop in a fully digital STEM world. The openness underlying informatics structures suggests that wide participation will emerge, but it is unsafe to speculate at this stage how the sociology of science will change, and whether and when a fully UKW-embraced culture will emerge.

Cultural Adaptation

A common theme among industry observers is that greater democratisation is likely to occur in the provision of, and access to, information. Changes in the entertainment and consumer sectors will spill over into the STEM communication arena and result in further changes to acceptable modes of scientific communication in the future (Esposito 2012a). The focus has moved to how users of publications are altering their habits in line with the demands set by the new information environment.

Authors such as Gladwell (The Tipping Point, 2000), Anderson (The Long Tail, 2004; Free, 2008a), Surowiecki (Wisdom of the Crowd, 2004), Nielsen (Reinventing Discovery, 2011), Weinberger (Everything is Miscellaneous, 2007; Too Big to Know, 2012), Tapscott (Growing up Digital, 1998; Grown up Digital, 2008; co-author of Wikinomics, 2006), Shirky (Here comes Everybody, 2008), Weller (The Digital Scholar, 2011) and Carr (The Glass Cage, 2016) are some of the many writers who have pointed out that technological developments have an impact on research output, researcher activity and the underlying sociology of science. This in turn has affected user behaviour, which raises the question of how research results should be formatted, not only to satisfy those remaining in academia but also the knowledge worker community in the evolving digital information economy.


It has led writers and researchers such as Nielsen (2009), Monbiot (2011), Gowers (2014), Brown A (2009), Allington (2013) and Murray-Rust (2014) to suggest that the current STEM publishing system is no longer fit for purpose and needs to be changed to cope with new socio-technical conditions.

One significant development was the introduction of the concept of the “Long Tail”. The long tail (Anderson 2004; 2009a) claims that the Internet offers less sales concentration than the 80:20 concentration traditionally held, under the Pareto Principle, for print. Evidence is available to support the contention that the Internet’s long tail has dramatically changed the business profiles of many items available on the Internet. The long tail can also be applied to the audience profile for scientific research output, with researchers in academia being the “core” and unaffiliated knowledge workers being part of the “tail”. In many industries (music, for example) the tail exceeds the core in size and significance (a simple numerical sketch is given below). It is not only the audience which is subject to the long tail concept – it can also be applied to the output of STEM publications, with a few core articles being balanced by the many specialist and esoteric reports in each research field which would meet the needs of a long information tail in society.

Concerns remain about how social media will impact on researchers, and this also has a bearing on creating the long tail. It can lead to information overload, the inability to absorb and analyse the flood of digital information being generated on the various media platforms, including social media. Carr writes that “There is no Sleepy Hollow on the Internet, no peaceful spot where contemplativeness can work its restorative magic. There is only the endless, mesmerizing buzz of the urban street”. Constant use of the Internet can overwhelm all quieter modes of thought. According to Attention Restoration Theory (ART), when individuals are not being bombarded by external stimuli their brains can relax; their working memories are no longer taxed by a stream of distractions. Too much noise is a distraction to deep analysis and thought processes. As quoted by Greenfield:

Social networking sites could worsen communication skills and reduce interpersonal empathy; personal identities might be constructed externally. . . obsessive gaming could lead to greater recklessness, a shorter attention span and an increasingly aggressive disposition; heavy reliance on search engines and a preference for surfing, rather than researching, could result in agile mental processing at the expense of deep knowledge and understanding (Greenfield 2014, 279–80).
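
As a rough numerical sketch of the long-tail arithmetic referred to above (the item counts and demand figures are invented purely for illustration and are not Anderson’s data):

```python
# Illustrative long-tail arithmetic: a small "core" of popular items each
# attracts heavy demand, while very many niche items each attract little.
# The figures are invented to show how the aggregated tail can outweigh
# the core, in contrast to a strict 80:20 Pareto concentration.
core_items, demand_per_core_item = 20, 1000        # a small head of popular items
tail_items, demand_per_tail_item = 10000, 5        # a very long tail of niche items

core_demand = core_items * demand_per_core_item    # 20,000
tail_demand = tail_items * demand_per_tail_item    # 50,000
total = core_demand + tail_demand

print(f"core share: {core_demand / total:.0%}")    # roughly 29%
print(f"tail share: {tail_demand / total:.0%}")    # roughly 71%
```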

Meyer (1997) also challenges the role and value of another aspect of digital activity: “multitasking”. It results in less deliberative thought, with less consideration given to thinking a problem through.
Multitasking involves spending too much time on distractions, which precludes entering the quiet space needed for contemplative thought. It remains to be seen whether the Greenfield and Meyer arguments turn out to be more accurate than those put forward by Shirky (2008), Weinberger (2007), Tapscott (2008) and others, who focus on the benefits of using a variety of informatic services.

In a related vein, Lieberman has explored an individual’s need to connect with other people, a need he claims is as fundamental as that for food or shelter (Lieberman 2013). He asserts that because of this our brain uses its spare capacity to learn about the social world – other people and our relation to them. It is believed that we must commit 10,000 hours to master a skill (Gladwell, Outliers, 2008). According to Lieberman, each of us has spent 10,000 hours learning to make sense of people and groups by the time we reach ten years of age. Sharing, cooperation and collaboration – key aspects if knowledge workers are to be engaged in the research process – have their foundation in these sociological processes developed in formative years. “Sharing” of information has become a critical aspect of the digital, Internet world and of scientific research. The brain’s formative wiring leads to restraint on selfish impulses for the greater good.

Another impact which the social context is having on the individual’s ability to adapt to a new digital information environment was put forward by Nicholas Carr as an extension of the arguments in The Shallows (Carr 2010). In his later book The Glass Cage (2016), Carr describes how those who determine developments in automation, through software development, do so to achieve efficiencies and not necessarily in support of broader cultural values. In effect, he suggests, we are increasingly devolving many creative and innovative processes onto the machine and away from human control: “The computer, introduced as an aid to reduce the chances of human error, ends up making it more likely that people. . . will make the wrong move” (Carr 2016, 92).

Besides the machine and artificial intelligence destroying traditional concepts of innovation and originality, another attack comes from the new aggregators which dominate the information space. Google, Facebook and Amazon “consider originality to be a highly overrated ideal, even a pernicious one” (Foer 2017). Their dominance as intermediaries creates the impression that the masses have little creative potential, which justifies their being force-fed. In so doing, these intermediaries set out to dismantle the structures which have protected traditional ideas of authorship. Copyright laws have been weakened or breached, reducing the livelihood of many writers. Silicon Valley has championed a theory of creativity different from that of the traditional individual genius: it emphasises the virtue of collaboration, of achieving something magnificent by working with other people.


Though this supports the concept of democratisation within the information society, it comes at a price: conformity, and acceptance of the powerful role of a few global organisations whose commercial agenda differs from the basic ethos of science. Their operational principle is that wisdom can be found by amassing massive data in datasets, and that by scouring through these, new ideas or interpretations can be made: “That is the essence of Google’s ranking of web sites; of Amazon’s recommendation algorithms, and Facebook’s News Feed – all extrapolated from the data sourced from the wisdom of crowds” (Foer 2017). Amazon has done much to undermine the profitability of publishers; Facebook and Google go one better by choosing not to pay for or own content. Writers take control as they are provided with the raw material on the web with which to “mash up” and produce novel findings relevant to their own specific needs.

The peril to society from the network is the creation of a “hive mind” (Kelly 2013). According to Foer (2017), such a development deadens disagreement and strangles originality. Faith in culture diminishes, replaced by a mania for data. Society is on the cusp of an age of algorithmically derived art and ideas. Instead of experimentation and novelty, data is propelling research into a formulaic approach which could be manipulated. Whilst working together is a strong cultural feature and has beneficial outcomes, “this should not be allowed to replace contemplation, moments of isolation, where the individual’s mind can follow its own course and reach its own conclusions” (Foer 2017).

These cultural issues provide a mix which suggests that democracy is basic to human needs, nowhere more evident than in researchers’ needs to share, communicate and collaborate. It questions the basic premise behind traditional journal article publishing as the mainstay of science, which centres on a competitive, individualistic approach with meritocracy or elitism at its core. However, it is not plain sailing towards an open, democratic information society: there are devils along the way which stifle originality, whether Silicon Valley values, society’s love affair with social media, or the protectionism which the publishing industry has adopted to prevent democracy being achieved.

LEGAL ISSUES

Protection of intellectual property rights over the information contained in publications has been an ongoing concern of scientific publishers and their trade associations. Publishers feel they deserve to be compensated for the activities they undertake to provide quality publications for their end users.
However, there is a countervailing force which suggests that research publications, largely funded and created by the public sector, should be open and available to all – that STEM should be a public utility, not subject to commercial barriers put in place to control access.

Copyright describes the rights relating to the publication and distribution of research. It governs how authors, funders, publishers and the wider public can use, publish and distribute research articles in particular. The protective device is the use of subscriptions, licenses or paywalls set by publishers to control access. The issue of maintaining subscriptions or paywalls versus openness exerts a powerful influence over the present and future structure of the STEM industry. It bears witness to several legal cases in which the protectors of the existing IP (intellectual property) system are challenging ventures developed by organisations new to STEM which threaten to undermine traditional publishing operations. Some key organisations, stakeholders and protocols in this aspect of STEM include the following:

Creative Commons (CC)

Creative Commons is an American-based non-profit organisation aimed at expanding the range of creative works available for others to share and build upon legally. The organisation has released several proformas known as Creative Commons licenses, available free of charge to anyone. These licenses allow creators to determine which rights they reserve, and which rights they choose to waive for the benefit of recipients or other creators. The CC organisation was founded in 2001 by Lawrence Lessig, Hal Abelson and Eric Eldred with the support of the Center for the Public Domain. Prior to Lessig's appointment at Harvard he was a professor of law at Stanford Law School. He also entered the US Presidential race in 2015. Creative Commons licenses do not replace copyright but are based on it. They replace the individual negotiations for specific rights between copyright owner (licensor) and licensee which would be necessary under an "all rights reserved" copyright regime, with a "some rights reserved" regime employing standardised licenses for re-use cases where no commercial compensation is sought by the copyright owner. The result is a low-cost copyright-management regime, benefiting both copyright owners and licensees. Lawrence Lessig claimed in 2009 that because 70% of young people obtain digital information from illegal sources, the law should be changed. Lessig also argued "there is a different class of amateur creators that digital technologies have... enabled, and a different kind of creativity has emerged therefore".

Copyright Clearance Center (CCC)

Copyright Clearance Center is based in Danvers, Massachusetts. It provides collective copyright licensing services for corporate and academic users of copyrighted materials. CCC creates agreements between rights holders, primarily academic publishers, and then acts as their agent in arranging collective licensing for institutions and one-time licensing for document delivery services, course packs and other access and uses of texts. CCC's primary purpose is to "further the economic interest of publishers and copyright owners" and its founders (a group of publishers) had no "interests of any substance beyond the creation of a device to protect their copyright ownership and collect license fees". The CCC is a broker of licenses, earning a 15% commission on the fees it collects. The company passes around 70% of its revenues to rights holders in the form of royalty payments, with the remaining 30% kept by the company as a fee for its services. Amsterdam-based RightsDirect, the wholly owned European subsidiary of Copyright Clearance Center, was established in 2010 and provides copyright licensing services for European-based companies for print and digital content in books, journals, newspapers, magazines and images.

Publishers Versus Sci-Hub

Sci-Hub is a website with over 70 million academic papers and articles available for direct download. It bypasses publisher paywalls by allowing access through educational institution proxies. Sci-Hub stores papers on its own caches to speed up the processing of future requests. Sci-Hub was founded by Kazakhstani graduate student Alexandra Elbakyan in 2011, as a reaction to the high cost of research papers behind paywalls, typically US$30 each when bought on a per-paper basis. In 2015, Elsevier and the American Chemical Society filed a legal complaint in New York City against Sci-Hub alleging copyright infringement, and the subsequent lawsuit led to a loss of the original sci-hub.org domain. Following the first domain loss, Sci-Hub has gone through several domains, some of which have been blocked in various countries. Sci-Hub has been controversial, lauded by parts of the scientific and academic communities and condemned by many publishers.

Publishers Versus ResearchGate

In October 2018, the American Chemical Society and Elsevier filed a lawsuit against ResearchGate in the US District Court in Maryland. This lawsuit focused on what the plaintiffs describe as ResearchGate's intentional misconduct in its online file-sharing and download service, where the alleged dissemination of unauthorised copies of published journal articles constitutes an infringement of the copyrights owned by the American Chemical Society (ACS), Elsevier and other STEM journal publishers. It is pointed out by the plaintiffs that the lawsuit is not about researchers and scientists collaborating in their research efforts, such as asking and answering questions, or promoting themselves and their projects. Nor is it about researchers sharing their research findings, raw data or pre-prints of articles. It is solely about the free dissemination of published articles. Founded in 2008, ResearchGate is a for-profit business that owns and operates an online social network and file-sharing service aimed at scientists, researchers and related professionals. It is claimed that, in deliberate violation of publishers' rights, ResearchGate uploads infringing copies to computer servers it owns or controls. In addition, ResearchGate lures individuals into uploading copies by encouraging use of a "request full-text" feature, and by misleadingly promoting the concept of "self-archiving". According to the plaintiffs, ResearchGate is aware that, as a result, it has turned the RG Website into a focal point for massive copyright infringement. In the legal suit filed by ACS and Elsevier, "ResearchGate's infringing activity is no accident. Infringing copies of published articles are a cornerstone to ResearchGate's growth strategy". ResearchGate uses the infringing copies to increase traffic to its website, its base of registered users, its digital content, its revenues and its attractiveness as an investment by venture capitalists. ResearchGate has established itself as an important free service for the research community worldwide. The deliberations in the law courts will have a significant effect on the extent of the barriers which users will face if the traditional means of disseminating STEM holds sway.

Implications

These activities point to the difficult situation facing STEM, with one side claiming loudly that it is "dysfunctional", and the other claiming that STEM provides a valuable quality service which warrants protection. Such differences in approach to the dissemination of research results suggest that a breakdown in the traditional STEM market structure is inevitable.

The forces which impel society towards a more digital and open structure imply that the supply industry – the publishers – would need to adapt to the new environment and reduce their reliance on paywalls in protecting their existence. The legal challenges raised by publishers reflect their attempts to protect their existing profitable system. The risk associated with changing the paradigm to something new – the fear of the unknown – is a powerful stimulus to ensuring that change is limited to gradual adjustments rather than radical revision of the STEM system. Yet the combined effects of the technological, social, economic, commercial, administrative and research trends all point to a radical change about to take place, against which the legal protectionism sought by publishers will be frustrated in due course. The answer appears to be that the legal actions by publishers are at best a temporary device which may hold back the tide but will fail to stop a new STEM structure emerging which relies on processes and procedures alien to the commercial journal publishers as currently configured. A new system, with new players and new business models geared around open sharing, threatens the mid-term viability of the giants in the STEM publishing sector. The extent, diversity and strength of the various forces will in their combination provide the basis for a new STEM structure in future. A significant push towards a more open and less formal system of STEM dissemination may come from the rapid adoption of social media as an accepted way of communicating information about research outputs. The next section explores the position which social media has captured in the decade or so during which it has transformed communication towards, arguably, a more democratic process within society in general.

SOCIAL MEDIA

The way communication is taking place in modern society suggests that large sections of UK society join online networks, ranging from personal interest groups to corporate research networks, in order to find information, solve problems, build new services or forge new relationships (Tapscott 2008; Shirky 2008). Participation in Wikipedia-like services, seeking and disseminating knowledge freely for everyone, has become a way of life, particularly among younger generations. Populism is seen to be a strong political as well as social movement within global societies, facilitated in part by the availability of new open channels of communication – so-called social media. Populism is akin to democracy in offering an extensive platform on which interests can be shared.

Adoption of Social Media Within Society

During the past two decades, social media has exploded onto the information scene, having been marginal or non-existent in the previous millennium. The extent of the increase in its use, and the ability to move easily between various social media platforms, has opened new ways to undertake research. The following table illustrates just how intricately involved social media had become by 2015 at a global level.

Tab. 6: Summary of social media penetration, 2015. Sources: Statistics from news reporting sources including The Times Business dashboard, November 30, 2011. Also from Statistica, The Statistical Portal (Statistica 2015). The table summarised, as of January 2015, the world total population, the number of active Internet users, the number of active social media accounts, the number of unique mobile users and the number of active mobile social accounts, giving for each the number of users (in millions), the percentage penetration of the total population, and year-on-year growth.

Similar penetration can be seen in the UK. The rise of social media and social networking has been notably dramatic within the consumer and entertainment sectors of UK society. It has changed the information profile and activity of consumers. Data on the social media services listed below was collected in 2016/7 from a variety of published sources and online services, as well as from the websites of the principal social media services. The data indicates the scale of the service concerned, though the figures provide little indication of the intensity or regularity of use, particularly in a STEM context.

Tab. 7: Numbers of users per social media. The table summarised, for each service (Facebook, Google, Yahoo, YouTube, LinkedIn, Twitter, Instagram, Myspace, Q-Zone, Mendeley and Wikipedia), its geographical coverage, global and UK user numbers, content volumes, advertising revenues and corporate value, and mobile usage.

– Facebook: Facebook revolutionised the way a growing number of digital users communicate with each other. There were 1,440 million active Facebook members (June 2015) – one in seven of the world's population. Additional sites provide an additional 2 million users to link into Facebook, while there were 35 million users in the UK alone (Twitter item on Facebook, August 24, 2015). It began as a free but closed system and generated a club-like adherence from dedicated users, away from the general search engines.
Facebook became a parallel universe to the Web. Though advertising has not been monetised as much as at Google (US$0.40 per month per visitor for Facebook compared with Google's US$2.11), investors have bought into Mark Zuckerberg's entrepreneurship in the IPO (initial public offering on the stock exchange), which gave him control in 2015 over voting rights as well as remaining chairman, CEO and founder of Facebook. The service is, however, under scrutiny (in 2018) for allowing the data on its users to be exploited for gain by itself and other organisations without seeking and receiving consent from the individuals concerned.
– Google: Google has also dominated the way both affiliated and unaffiliated knowledge workers search for relevant information. The number of monthly unique Google searches was 1.5 billion, which gave Google a 75% share of the online search engine market. There were 300 million monthly active users of Google+ and the number of Google+ unique "mobile" monthly users was 20 million (October 2013). The global number of Gmail users has been put at 900 million (Google web site, October 2013).
– LinkedIn: During the first quarter of 2015 LinkedIn had 364 million members, up from 296 million the year earlier. It is a professionally-focused media platform and therefore a target service for unaffiliated knowledge workers. In 2014 most revenues came from talent solutions, online recruiting, marketing solutions and premium subscriptions. In the UK the main categories of LinkedIn users in 2015 were:
– 12,530 journalists
– 48,679 solicitors
– 374,711 engineers
– 4,083 farmers
These categories give some indication of the spread of unaffiliated knowledge workers in the UK using just one of the available social media platforms. LinkedIn was acquired by Microsoft in June 2016.
– ResearchGate: ResearchGate is a service used to exchange articles among and between peers, circumventing publishers. It has become highly popular among researchers despite antagonism from leading STEM journal publishers.

ResearchGate was founded in 2008. The company's mission was to connect the world of science and make research open to all. Thirteen million researchers have made more than 140 million connections on the network and on a daily basis share over half a million updates about their research. ResearchGate has completed four rounds of financing from intermediaries and venture capitalists. However, in early October 2017, the Coalition for Responsible Sharing – which includes Elsevier, Wiley, Wolters Kluwer, ACS and Brill – claimed that up to seven million copyrighted articles (40% of ResearchGate's total content) had been obtained illegally and that all attempts to work with it to find ways for it "to run its service in a copyright-compliant way" had failed. Nevertheless, it emerged that Springer Nature was in discussions with ResearchGate to come to an amicable agreement, details of which at the time of writing were not available.
– Snapchat: Snapchat is an image-messaging service launched by Evan Spiegel and his co-founders in 2011. It is a free service particularly popular in the United States, India, the UK, Norway and Indonesia. In March 2017, it underwent a $5 billion IPO even though the business model supporting the service remains unclear. Despite this the shares were heavily oversubscribed, indicating the popularity with which some social media sites are held.
– WhatsApp: Media reports in early 2016 detailed over 1 billion WhatsApp users.
– Twitter: In January 2015, 500 million tweets were sent per day by 302 million active users (April 2015). Though 30,000 people a day were signing up to tweet, and Donald Trump, US president, finds this an important resource for keeping in touch with the voting public, Twitter's growth has so far failed to match market expectations.
– YouTube: YouTube has over one billion users. Half of YouTube's views are made using mobile devices (smartphones, tablets). The company generated $4 billion in revenues and projected four billion video views per day. The number of monthly unique visitors to YouTube in the UK amounted to 19.1 million (September 2013).
– Instagram: There are 300 million active monthly users of Instagram. The number of US-based users in 2015 was 77.6 million or 28% of the US population, while 13% of
Internet users use Instagram. Facebook paid $1 billion for the company in 2012. There has been a rapid growth in the photo-based Instagram to the extent that it is challenging the text-based Facebook and Twitter for market appeal.
– QZone: QZone has 644 million active users, the majority of whom are based in China (May 2014). It now accounts for 40% of the world's social media users.
– Mendeley: As of November 2012, Mendeley had 2 million users. 31% were in biosciences and medicine, 16% were from the physical sciences, 13% were engineers and 10% were computer and IT specialists. On average, each user collected 142.8 papers and spent 1 hour 12 minutes per day studying the literature (Mendeley 2012). The company was acquired by Elsevier in 2013.
– Myspace: Started in August 2003, it was acquired by Rupert Murdoch (News Corporation) in July 2005 for $580 million. It was then bought by Specific Media/Justin Timberlake for $35 million in 2011. It has 150 employees (January 2015), well down from its peak of 1,600 employees a decade earlier. There were 300 million monthly videos viewed from 50.6 million monthly users (January 2015). Though it was overtaken by Facebook, Myspace still remains an important web property.
– Wikipedia: A "wiki" (from the Hawaiian word for "quick") is a web site that users can directly alter or add to at no charge. Wikipedia, an encyclopedia built as a wiki, includes 35 million articles, which are being added to at a rate of 750 new articles each day from approximately 69,000 contributors. The number of registered users is 25.5 million, creating an estimated half a billion usages each month. It is an example of a collective project in which participants seek no direct financial gain. It illustrates where the available technology and a change in cultural values has led to a democratic, open and free social media service.

GAFAs

There is no smooth pathway leading to openness and transparency in STEM. New technology and systems are emerging which stand in the way. This point was highlighted in a book by Franklin Foer entitled World Without Mind (Foer 2017), where he describes the way in which several new social media
companies – notably Facebook, Google, Amazon and to a certain extent Apple (or GAFA) – are destroying the creative processes within society. They are criticised for controlling the way individuals seek and receive information, not for good social reasons, but rather to capitalise on the power it gives them to dominate in the commercial sector through generating advertising revenues. It is problematic as these companies build on their dominance of the community's mindset to move into other areas, so extending their tentacles, with each move relying on their understanding of individuals' needs which comes through collection of usage data. For example, Google aims to build driverless cars, manufacture phones and "conquer death" (Foer 2017). Amazon has moved beyond online catalogues to produce TV shows and design drones. They all have their heritage in the communes and psychedelic world of California in the 1970s rather than in the financial and capitalistic worlds of commerce. As such, their initial corporate aims based on democracy and collectivism are being dissolved as market forces propel them towards authoritarianism and control. These new behemoths on the information scene assume that individuals are hooked on the value of being part of a network, to be part of the "wisdom of crowds", and as such Google, Apple, Facebook and Amazon compete to create unique networks within which individuals can operate: "They are shredding the principles that protect individuality". They appear to disrespect the value of authorship with their hostility to intellectual property. For them information should be free. Commercial transactions are created from the knowledge of consumer behaviour built up and inherent within their networks: "Their algorithms suggest the news we read, the goods we buy, the friends we invite into our circle" (Foer 2017). They have become arbiters of what is delivered to the wider community. This power of the GAFA threatens to distort the future STEM information process by marginalising individual creativity in favour of group behaviour and sharing, with the asset base on which such sharing takes place – information – being under the control of the GAFAs. Their astuteness lies in deciding not to own the content but rather to make third-party content available as comprehensively, and to as wide an audience, as possible. What that gives them is "clicks" and therefore an awareness of user needs. However, this does not guarantee quality, as the filtering process so important within STEM is left to the wider community, which in turn can give rise to spurious claims, sensationalism and "fake news". This is compounded by the esoteric nature of much of the STEM published output, too technical for the average layperson to understand and assimilate. As suggested by Foer, the tech companies are destroying something precious, and the individual's most precious asset, attention, is being abused.

Google appears to focus on creating an artificial brain, using machines to outsource our thinking; Facebook arose out of the computer hacking world, and Zuckerberg's focus is on engineering solutions to promote sharing and make human beings predictable. Amazon, by harnessing technology and the Internet, is establishing itself as the gatekeeper to a world in which it is easy to buy items. Because these companies anticipate human behaviour through their respective algorithms, users become easy targets for manipulation. Democracy, so strong a feature of the new information world, loses out to central control by a few commercial undertakings, and they in turn are now being investigated by national authorities.

Summary

Social networks are ubiquitous. Based on self-reporting by over 2,000 individuals over 12 years of age in the USA (Arbitron 2013), usage percentages were reported for each of the main social networking services: Facebook, LinkedIn, Twitter, Myspace, Google and Instagram.

These usage percentages represent a wider community – one which knowledge workers inhabit – rather than just the STEM audience. For example, Google usage would figure more prominently in the research community, and Facebook would not be so dominant in the exchange of STEM material. Nevertheless, the indications are that social networks need serious consideration in the future evolution of the STEM information system. The following chart illustrates how several of the main social media services relate to the Internet. It highlights that the numbers of users of each of these services are in the millions, in some cases billions. It also puts into perspective the small inroads which the large commercial journal publishers have made in developing STEM-related activities as a service supporting global digital natives.

Fig. 6: Social media services in relation to the Internet (approximate numbers of users): Internet 2,300 million; Facebook 2,000 million; YouTube 1,000 million; iTunes 500 million; LinkedIn 364 million; Twitter 302 million; Instagram 300 million; Google 190 million; eBay 150 million; Myspace 51 million; Flickr 51 million; Wikipedia 26 million; Elsevier and Wiley approximately 1 million each. (See Brown, D. 2016, p. 189.)

So far, conservatism in STEM authorship has been powerful in protecting the established print-derived paradigm. The traditional system has been reinforced by the existing reward system which in turn is mainly evaluated and assessed through citation metrics. The effectiveness of authors in research is judged on whether their output appears in the highest cited impact factor
journals. Users of research output rely on the published research article in reputable journals as their main source for credible information. Social media and social networking sites could worsen communication skills and reduce personal empathy (Greenfield 2014): “Personal identities might be constructed externally and refined to perfection with the approbation of the audience as priority”. Obsessive gaming could lead to recklessness; reliance on search engines and preference for surfing rather than researching could result in agile mental processing at the expense of deep knowledge and understanding. Seeking friends on social media could get out of hand and become obsessive. Adoption of social media therefore carries some negativity for society, scientific research and STEM.

COMMERCIAL ISSUES

A vocal complaint about STEM is that publishers' preference for a toll-based, paywalled system of access to their publications prevents everyone outside the closed academic/R&D circles from reading and using published research results. To counter this, other business models have emerged. One which has many supporters is an open access (OA) model, which relates more generally to open science developments and the ambitions of the founding fathers of the Internet. This offers benefits to non-academic knowledge workers in enabling them to gain easy and free access to STEM results. Another set of commercial policies could emerge as blockchain developments take hold and begin to impinge on STEM. This is something which may have future appeal as the basis for a flexible digital business model.

"Openness" and Open Science

"Openness" is a broad movement influencing cultural change. It is a move from tight control and meritocracy to openness and democracy. Specifically relating to STEM, it replaces "toll-based access" with "free at the point of usage" as the underlying business model. The concept that "Information wants to be free" was claimed by Brand as early as 1984 (Brand 1984). He highlighted that information is a strange "product"; it does not follow conventional rules of losing its value over time (depreciation) or through frequent usage. The more it is used the more valuable information can become. Obscurity is fatal for information – it needs to be seen, recognised and built on, and in so doing does not obey traditional laws of wastage. Brand pointed out that although information wants to be free, it also wants to be expensive – the two concepts struggle against each other (Brand 1984). This thought has also been pointed out by Google's Hal Varian (Varian 2000), who wrote: Because the marginal cost of reproducing information tends to be very low the price of an information product, if left to the market place, will tend to be low as well. What makes information products economically attractive – their low reproduction costs – also makes them economically dangerous (Varian 2000).

The danger is that free access runs up against publishing's tradition and legacy. Openness conflicts with those forces protecting intellectual property and copyright, which are powerful social cornerstones with strong commercial support
and historical roots. This alternative view has been put forward by Cory Doctorow in his book Information Doesn't Want to be Free (Doctorow 2015), in which he challenges the current state of protectionism in the creative industries. His point is not that information wants to be free but rather that people do not want to be overprotected. Protection of rights usually incorporates a charging mechanism, and the extent of this charge is the crux of tensions between current STEM stakeholders.

Open Access (OA)

There are several types of open access, each colour-coded. More details on each are given in the Business Plan section of this book. Each type – grey, green, gold, platinum, hybrid – has its own supporters. So far none has offered a sustainable and acceptable commercial approach for STEM. As described by Richard Poynder in his blog on LibLicence (Poynder, April 9, 2018), "a multitude of issues have arisen since the 2002 Budapest Open Access Initiative (BOAI) adopted the term in order to promote the idea of research being made freely available on the internet. It has also led to a great deal of debate and disagreement over the best way of making open access a reality". He claims we are now arriving at the point where consensus is growing around the idea of publishers converting (or "flipping") their journals from a subscription model to one of the open access models, notably the gold model. This is being spearheaded by the OA2020 Initiative. One implication of this would be widespread use of the pay-to-publish model (the gold OA model). Instead of readers paying to access other researchers' papers, authors would pay to have their research published by traditional journal publishers. This would be by means of article-processing charges (APCs) levied on the author or his/her agency. Currently, APCs are around $3,000 for each accepted manuscript. Amongst many other issues, this has an impact on the ability of researchers from lesser developed countries to have their research published – the need to make such upfront payments poses a significant budgetary problem for them and their institutions. However, Jeffrey MacKie-Mason, university librarian at UC Berkeley, argues that engineering a mass conversion of subscription journals to OA appears to be the only practical way of achieving open access in the near term, and that while a global flip presents challenges... we cannot expect open access "to remedy all inequities" (see https://poynder.blogspot.co.uk/2018/04/northsouth-and-open-access-view-from.html).
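By way of a rough illustration of why a wholesale "flip" raises budgetary questions, the sketch below combines the c. US$3,000 APC figure cited above with purely hypothetical institutional outputs (the article counts are invented for illustration and are not drawn from any study discussed here); it simply shows how per-article charges scale with research output.

    # Hypothetical illustration of annual APC spend under a full "gold" flip.
    # APC echoes the c. US$3,000 figure cited above; article counts are invented.
    apc = 3_000  # US dollars per accepted manuscript

    annual_output = {
        "large research-intensive university": 5_000,  # hypothetical articles per year
        "small research centre": 150,                  # hypothetical articles per year
    }

    for institution, articles in annual_output.items():
        print(f"{institution}: {articles} articles -> US${articles * apc:,} per year in APCs")

Under these invented assumptions the productive institution would carry a bill two orders of magnitude larger than the small centre, which is the distributional effect explored in the study discussed below.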

COMMERCIAL ISSUES

73

A report from the University of California and the California Digital Library published in July 2016 addressed the consequences of a "flip" from a subscription-based publication system to a full (gold) open access business model (University of California, Davis 2016). The conclusion from this $800,000 study funded by the Mellon Foundation confirmed previous reports: that the move to the gold system, involving payments by authors or their institutions, results in increased costs for highly productive institutions relative to smaller research centres. The report also highlighted the split between those users who want to see everyone else pay for the publication of their research outputs and authors who do not want to pay for their own results to be published; as such, gold remains a low priority. The other main OA business model is referred to as green. This involves the automatic deposit of the researcher's manuscript in a local institutional repository (IR), and the IR would make such articles freely available to all at no additional cost. The cost would be in establishing the institutional repository in the first place and managing the manuscript flow on an ongoing basis, usually by the library. Though it is possible to imagine a world that switches from subscriptions to one of the open access business models, it nevertheless appears that publishers' revenues from subscriptions are currently as high as ever. The reason, as argued by Alex Holcombe and Björn Brembs (Holcombe 2017), is that publishers control prestigious, legacy journals with high impact factors. Researchers are compelled to publish in these journals to enhance their careers and reputation, even if they are more expensive (for the librarian) than alternative, less prestigious journals. It is an author-driven policy rather than one catering for end users' needs. A prominent champion in support of the UK open access movement has been Jisc. It funded research studies, several of which suggest that if open access could be implemented by all relevant parties then "The increased impact of wider access to academic research papers could be worth approximately £170 million per year to the UK economy" (Read 2011). Similarly, funding agencies such as the Wellcome Trust have been vocal in their support for open access for the dissemination of results arising from their investments in biomedical research (Wellcome Trust 2008a; 2008b). Against such libertarian organisations there is the combined power of commercial publishers, notably through trade associations such as the IPA (International Publishers Association), the International STM Association (STM), the Association of Learned and Professional Society Publishers (ALPSP) and the Association of American Publishers (AAP). These associations have even deeper pockets with which to fund lobbying activities aimed at supporting a commercial
approach to scholarly communication and protection of copyright and the subscription model. Toby Green from the OECD in Paris identified that the scholarly publishing industry had lost its way in the new digital millennium. The issues facing traditional publishing outlined above meant that problems were being stored up, which even experiments with open access (OA) could not resolve. The challenges raised by Sci-Hub were indicative of a general malaise (see earlier under Legal Issues). The headline of the article, "We've failed: Pirate black open access is trumping green and gold and we must change our approach", indicates OECD's strategic thinking (Green 2017). The key points he made included:
– Sci-Hub has made nearly all articles freely available using a black open access model, leaving green and gold models in its dust.
– Why, after 20 years of effort, have green and gold open access not achieved more? Do we need "to think again"?
– If human nature is to postpone change for as long as possible, are green and gold open access fundamentally flawed?
– Both open and closed publishing models depend on bundle pricing paid by one stakeholder, the others getting a free ride. Is unbundling a fairer model?
– If publishers changed course and unbundled their product, would this open a legal, fairer route to 100% open access and see off the pirates?
The evidence above says that green and gold open access models are not the revolutionary business models we need because, if they were, then they would have in excess of 80% market share already, instead of the 20% for the two options combined, and the pirates which have emerged would be looking elsewhere for opportunities. There are some sustainable open access successes such as BioMed Central, PLoS and arXiv, but their share of all articles remains marginal. The bottom line is that for both green and gold open access to succeed many actors need to change what they do (Green 2017). As claimed by Green, for the green open access model to become a success, all six principal organisations involved in the scholarly communication process need to change aspects of their behaviour together. The green open access business model "fails if any one actor does not change or fails to cooperate with others" (Green 2017). The same applies in getting gold OA adopted. That we have managed to get roughly halfway there in terms of articles being open access in some form is, in this light, not a bad result, but it has taken nearly two decades and the pace of change for journal articles may be slowing (Boselli and Galindo-Rueda 2016; SIMBA 2016).

Nevertheless, pilot experiments still emerge as commercial publishers seek to find a way through the difficulties of open access versus subscription business models. Elsevier in particular has been experimenting with so-called "mirror journals". These journals are fully gold open access but share the same editorial board, scope and peer review policies as their existing "parent" journal from which they emerge. They are companion journals, offering the same level of visibility and discoverability as the parent journal. The choice is open to the author about which type of exposure, and which business model, they are willing to accept – subscription-based, hybrid or mirror. It is not only publishers which are experimenting with new open access business models: "In a new article in the open-access journal PLOS Biology, several US-based researchers are proposing a so-called Plan U" (for "universal"). This is similar to the concept proposed by Allington in the UK (2013) whereby funding agencies include a synopsis of their funded projects on their web sites, which should be made available immediately and for free. The US scientists similarly "call on the organizations that fund research – government agencies such as NIH and charities like the Howard Hughes Medical Institute – to require the scientists they support to post drafts of their papers on free websites called preprint servers before submitting them to academic journals" (PLoS Biology 17 (6): e3000273).

Sci-Hub and LibGen

As described earlier, a recent development which could influence the future of open access and scholarly communications is Sci-Hub, which publishers see as a true pirate. Sci-Hub was founded by Kazakhstani graduate student Alexandra Elbakyan in 2011 as a reaction to the high cost of research papers hidden behind paywalls. It now has a website with over 64.5 million academic papers and articles available for download (February 2018). It bypasses publisher paywalls by facilitating access to online articles through educational institution proxies, often capitalising on leaked logins and inadequate security controls. There are some 400 research universities included in the Sci-Hub network. Sci-Hub also stores papers on its own repository. Sci-Hub is not, according to pundits, open access (OA) in its purist form. It is a unique service lost in the mists of what is acceptable and possible in the provision of access to published articles and books. Some claim its sole proposition is to provide bootleg merchandise, functioning as a parasite on the capitalist system. Others see something more insidious in a system which has political overtones. Sci-Hub is controversial, lauded by parts of the scientific
and academic communities and condemned by publishers: "Sci-Hub could have harvested nearly all scholarly literature" (Himmelstein et al. 2017); "if true, Sci-Hub has single-handedly won the race to make all journal articles open access" (Green 2017). Sci-Hub faces legal challenges from several large STEM publishers. In 2015, Elsevier filed a legal complaint in the USA alleging copyright infringement, and the subsequent lawsuit led to a loss of the original sci-hub.org domain. In 2017, a similar lawsuit was filed by the American Chemical Society. Following the first domain loss, Sci-Hub has moved through several domains, some of which have been blocked in other countries. What Sci-Hub lost in international legitimacy it more than made up for in the explosion in traffic resulting from the promotion of its non-commercial activities. What caused Elbakyan, the creator of Sci-Hub, to take on the large international STEM publishers with such impunity? It is partly due to her belief in the untapped potential of the human brain, or "global brain", as an intelligent network that could facilitate information storage and retrieval – "driving communication between people in real time the way that neurons fire together wire together" (Graber-Stiehl 2018). In this context, publisher paywalls seem like "plaques in an Alzheimer's-riddled mind, clogging up the flow of information". Another stimulus was her belief in "communism" and the openness which underlies such political principles. However, this is not something which has been endorsed by Russian policies in practice; Russia baulked at taking on the global IP community, despite its weak intellectual property rights which made it a hub of piracy in the past. This resulted in Russia bickering with Elbakyan (even though Elbakyan is rumoured to be in hiding in Russia to escape further legal suits). On the other hand, the explosion in usage from 2012 by Chinese researchers led Sci-Hub to share an approach with a similar service, LibGen, and combine article and book delivery free of charge on a massive scale, albeit from a scattered collection. Funding for the Sci-Hub service came from crowdfunding, with donations and Bitcoins being sent in to keep the operation going. Though not operating with high margins – unlike large commercial journal publishers – its low-cost operation and minimal financial goals have worked to its advantage in building up global reach. Elbakyan's background in Kazakhstan as a postgraduate student in computers and religious studies provided the academic training which led to her appreciating the value of scientific information freely available, unrestricted by publisher paywalls. Her understanding of computer technology allowed automated programmes to be developed which checked with each university server until they found sought-after articles. Even deactivation of the PayPal account (under pressure from publishers) which had been used to
access university credentials failed to dent the growth of Sci-Hub. Such inroads into secure university sites raise questions about the efficacy of backroom controls within universities and academia. As far as publishers are concerned, it appears Elbakyan "plans to handwave away any more law suits and play whatever cat and mouse she must" (Graber-Stiehl 2018). In the UK the adoption of Sci-Hub as an easy-to-use and free document delivery service has been significant, with 1,200 downloads per day or approximately 350,000 per annum. As such, it eats into the one million document delivery requests satisfied annually by the British Library Document Supply Service, which has been the global market leader in document delivery for some time, and for which it charges. Sci-Hub operates in the murky global world of intellectual rights interpretations and has shown flexibility in adjusting its technical operations to withstand the cease and desist orders that come from publishers. It is one of the new agencies and intermediaries which have emerged from the interaction of STEM with new technological methods for information dissemination in a digital world.

Freemium

Openness and free are not the same. Free specifically relates to a commercial transaction whereas openness is a process. "Free at the point of usage" has been advocated by Chris Anderson (Anderson 2009b) in his book Free – The Future of a Radical Price. He suggests that the Internet will encourage free access to basic information services (such as research articles), with commercial returns needing to be sought from other related activities, such as offering premium services, seeking advertising, gaining sponsorships etc. Charging any price, even a few pence, for accessing a research article online would stifle latent demand, according to Anderson: "Give a product away and it can go viral. Charge a single cent for it and you're in an entirely different business, one of clawing and scratching for every customer" (Anderson 2008a). The difference between "cheap" and "free" is what venture capitalist Josh Kopelman calls the "penny gap" (Kopelman 2007). In the digital marketplace the most effective price is no price at all. Anderson gave examples where novel business models have been used, including cross-subsidies (giving away a digital recording to sell a TV cable service) and freemiums (offering Flickr for free while selling the superior FlickrPro to more serious users; the same with LinkedIn). Other examples include the music industry in recent years, where appearances, streaming, festivals and merchandising reflect that new sources of income can be made from avenues other than just selling music on CDs.

Fig. 7: The Penny Gap. Josh Kopelman, MD, First Round Capital, 2007.

Anderson suggested that a zero price should be considered for the main product (the refereed journal article) and freemium pricing applied to related premium products. Zero pricing would open the market for the article and would be sustainable if there were sufficient interest within the sector in the associated premium products from which payments would be obtained. This is
a drastic course but one which would bring in knowledge workers as information users, though not necessarily as buyers. Adopting a freemium pricing policy is a big ask for a traditional, conservative and protectionist-focused STEM journal publishing sector, and not one which has been readily adopted by the commercial STEM publishing industry.

Freemium and STEM Publishers

Some publishers with foresight have, however, put the freemium business model into effect. Toby Green, from OECD's Directorate of Public Affairs and Communication, wrote an opinion piece for Learned Publishing which described a freemium project which OECD launched in 2012, and the reasons behind this action (Green 2017). The key feature behind freemium is "unbundling" – breaking down an established product or service (such as a journal article) into component parts or functions, and for each separate function establishing whether it can be a chargeable entity and, if so, how much. The prime example of how unbundling has worked in practice is the airline industry. What used to be a ticket sale to get a person from one place to another is now a confusion of seat selection, excess baggage charges and food and drinks in flight – and even, as one airline is rumoured to have considered, use of the washroom – an unbundling brought about by the emergence of no-frills budget operators and new technology. Breaking down and analysing the commercial aspects of each function can be applied to scholarly journals in a similar way to the airline industry. Offering a basic service to readers for free would qualify as gratis open access (Suber 2012). The question arises: if everyone could read all scholarly content for free, would there be sufficient value in additional services to generate the revenues needed to fund both the article publication process and the various spin-off and unbundled premium services? Green offers some examples of where "unbundling" could take place:
– On the author side, peer review management, copy-editing and language services could be offered along with services to promote publications and make them more accessible to non-technical audiences.
– On the reader side, services could include higher-utility versions of articles, books and data sets; access to productivity tools like downloadable citations, semantically driven navigation and alerting services; and tools to report impact to funders and employers.
– Librarians might value rich metadata and feeds to build catalogue databases, usage reports and user support services.
– Funders and employers could be offered reports by subject area.

In unbundling the product, low-cost airlines democratised air travel, reducing prices such that a new and large general market for flying emerged. Likewise, in democratising scholarly communications by offering a basic service to all for free, readership expands and could stimulate a market for services beyond the existing customer base, tapping into new budgets. As with other digital businesses which build large audiences with free services, there will be value in "selling the audience" to advertisers and others with an interest in accessing niche audiences. There is some evidence that a wider readership can be attracted if only the works are available at low or no cost. OECD's experience is that total readership expanded more than fivefold following the introduction of their read-only, free-of-charge service, and it did not cannibalise its ability to earn revenues from premium editions and associated services. OECD found that 15% of accesses to their content are to premium versions, a service which only commenced in 2012. One of the keys to a successful freemium business is a constant reassessment of the boundaries between what should be provided for free and what could be paid for. A premium service today could become commoditised and a basic service tomorrow. A baseline, free, service would include the ability to print and save for offline reading. The structure of an unbundled scholarly market could focus on three types of reader. The main group would be individuals who are satisfied with the free, read-only version and who would pay nothing. The next largest group could be those at large institutions (universities, companies etc.) which might purchase premium versions based on the assumed value of the premium features to their patrons. Thirdly, there will be some ad hoc users who need the utility of the premium versions but are not at a subscribing institution; they will pay as they go. Much of the assumption about the ability to achieve a viable base for premium services is dependent on there being a long tail of potential new users who can be attracted into paying for such charged-for additional premium services.
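A minimal sketch of the freemium arithmetic implied here may help; every figure below is invented for illustration (only the 15% premium share echoes the OECD experience noted above), and the sketch simply shows that the model's viability turns on whether the small paying segment covers the cost of publishing for the large free-riding readership.

    # Hypothetical freemium model: free read-only access for all readers,
    # with revenue drawn only from the minority of premium accesses.
    readers = 100_000            # annual accesses to free, read-only editions (invented)
    premium_share = 0.15         # share of accesses that are premium (echoes the 15% above)
    premium_fee = 60.0           # assumed average revenue per premium access, US$ (invented)
    publishing_cost = 750_000.0  # assumed annual cost of the publishing programme (invented)

    premium_revenue = readers * premium_share * premium_fee
    breakeven_share = publishing_cost / (readers * premium_fee)

    print(f"Premium revenue:          US${premium_revenue:,.0f}")
    print(f"Break-even premium share: {breakeven_share:.0%} of accesses")

Under these invented numbers the 15% premium share would more than cover costs; halve the assumed premium fee and it would not, which is why the boundary between free and paid services has to be reassessed constantly.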

"The Long Tail"

Anderson, editor of Wired magazine, unleashed a global debate with an article entitled "The Long Tail" in October 2004 (Anderson 2004; 2009a). The term has caught on in technology and media circles. The long tail is the large portion of content that is thought to be of residual value to companies catering for mass audiences. Anderson claimed that this residual portion of the demand curve
(see below) is both significant and in many instances profitable. It opens a market opportunity which needs to be addressed in assessing business models. "The long tail" is the thousands of products that are not number one bestsellers. In the digital world, these products are booming because they are unrestricted by the demands of physical retail space, as was the case in the pre-digital age. What once had to be stored and accessed from buildings and shelved in warehouses now lives on in computer memory and can be retrieved quickly, easily and inexpensively through online systems. The long tail concept can also be applied to the market base. Just as much of the demand in a digital world is spread across many small, traditionally undiscovered products, so the long tail also applies to market conditions. Many users of these products have hitherto been excluded from becoming active buyers because of the price factor. Reducing the price brings new demand as well as supply into the equation. Anderson claims there is still demand for big "cultural buckets" or hits (such as subscription-based journals), but this is no longer the only market. The hits now compete with an infinite number of niche markets. The mass of niches always existed, but as the cost of reaching them fell – consumers finding niche products and niche products finding consumers – this concept became a dynamic cultural and economic force. The long tail also has implications for the structural and organisational aspects of industry. On the supply side, there are a few large commercial and society publishers complemented by thousands of smaller publishers. On the demand side, users of published information are mainly in the university and corporate research centres worldwide, but they are surpassed in numbers by trained and educated knowledge workers in wider society. In an era of almost limitless choice, many consumers will gravitate towards popular mass-market items, but just as many will move towards items that only a few, niche-market people want. Until recently, mass-market entertainment ruled the industry. However, in a digital age the tail exceeds the core markets in many instances. Specialisms flourish because they can be easily followed through a simple process of accessing required items stored digitally on computers. Niche products available for niche markets are made available in a way which challenges the dominance of a "hit"-focused culture. Latent demand is unlocked by technology. "Amazon has found that 98% of its top 100,000 books sell at least once a quarter" (Anderson 2009a). Perhaps more revealing is that a further one-third of their sales come from titles not in the top group, suggesting that the market for books not held by the average bookstore is already one third the size of the existing market. "Apple has said that each of their one million tracks in iTunes
has sold at least once"; "Netflix reckoned that 95% of its 25,000 DVDs... (was) rented at least once a quarter": these experiences quoted by Anderson show the power of the long tail in changing industry paradigms and raising the spectre of the long tail as an important business concept (Anderson 2009a).
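One informal way to see why these residual sales matter is to model sales against popularity rank as a power law – a common simplification in discussions of long-tail markets, not a formula from Anderson's own text, and with the exponent and cut-offs below purely illustrative:

    s(r) \approx C\, r^{-\alpha}, \qquad
    \text{tail sales} \;=\; \sum_{r=R+1}^{N} s(r)
    \;\approx\; \frac{C}{1-\alpha}\left(N^{1-\alpha} - R^{1-\alpha}\right),
    \qquad 0 < \alpha < 1 ,

so that, for a shallow exponent (alpha below 1), the aggregated sales of the many low-ranked items keep growing as the catalogue size N expands, and can come to rival the sales of the top R "hits" once storage and search costs no longer limit the size of the catalogue.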

Fig. 8: Theory of the Long Tail – popularity plotted against products, showing the "head" of bestsellers and the long tail of niche products in the new marketplace. Based on Wikipedia's description of Anderson's Long Tail, 2004.

As described by Brynjolfsson et al. (Brynjolfsson 2011), We find consumers’ usage of Internet search and discovery tools, such as recommendation engines, are associated with an increase in the share of niche products. We conclude that the Internet’s Long Tail is not solely due to the increase in product selection but may also partly reflect lower search costs on the Internet. If the relationships we uncover persist, the underlying trends in technology portend an ongoing shift in the distribution of product sales.

The claim made in this report is that tipping points and long tail concepts – both related in their approach to identifying new marketing opportunities – are significant business drivers towards effecting change in the STEM publishing sector over the next five years. Both are also strong influencers in offering the ability to bring knowledge workers as users into the scientific effort.

"Tipping Points"

The "Tipping Point" indicates that a traditional approach, such as in communicating scientific research results, may undergo dramatic change as a result of external events taking place. These events are often outside the control of the affected community.

In 2000, Gladwell pointed out that there is not always a smooth transition from one business paradigm to another (Gladwell 2000). According to Gladwell in his book The Tipping Point – How Little Things Can Make a Big Difference, change and innovation do not always take hold for logical reasons. Gladwell claims that successful ideas, products and messages behave as "epidemics" or "viruses" (Gladwell 2000). They are contagious. He also suggests that there are three rules which set off an epidemic:
1. The first is the Law of the Few. A few individuals can have a significant effect in creating change. They are described as connectors, mavens and salesmen. Connectors know many people – the average personal contact network is claimed to be about 150 people (Dunbar 1992), or 262 for the average Facebook user (Arbitron 2013) – but connectors know many more due to social networking. They have extensive personal and online contact networks with whom they communicate. They are on first-name terms with the movers and shakers in industry. Mavens are individuals who are well informed and share their knowledge willingly; they accumulate and disseminate knowledge. They are not persuaders – they are teachers. Finally, salesmen have the power of persuasion to extend the reach of product or service sales. They tend to be subtle in their approach. Their arguments cannot be resisted. Elements of these three processes influence change. There are candidates who could be considered connectors, mavens and salesmen in the current controversies over aspects of scientific communication – notably in Open Access (OA) adoption. Experts such as Suber (Harvard), Harnad (University of Southampton) and Guedon (University of Montreal) may be considered members of the law of the few.
2. The second rule of the epidemic is "stickiness" in the message. For electronic publishing this can be a technologically "better" information service – a key stickiness factor. The message should have recognisable appeal and offer tangible benefits. New alternative research output systems fall into this category if the value to the end user of adopting these new ways of disseminating research results is clearly apparent.
3. The final rule is the power of context. Epidemics are sensitive to prevailing conditions. Starting an epidemic involves a different set of human profiles: innovators, early adopters, early majority, late majority and finally the laggards. The first two are visionaries and risk takers, whereas the early and late majority and laggards avoid risks; they are pragmatists. There is a chasm between the two groups. This is where the connectors, mavens and salesmen have a role in generating the epidemic. They translate the message from the first group to the second.


The issue here is that there is a social mechanism behind changing attitudes. This is as relevant in electronic publishing as elsewhere. It means technological efficiency by itself is not enough. There also needs to be a social drive and support mechanism in place. Have the more significant aspects of electronic publishing reached a tipping point? Several have, whereas others still have some way to go. STEM publishing still needs the connectors, mavens and salesmen to be more active – for example, several recent author studies show that, despite the claimed advantages for authors of having their articles published in open access format, as many as 90% of authors are still unconvinced (Nicholas 2010b). Only 20% of all published research articles are available in open access form. "Tipping point" issues have not yet taken hold across the board within the research community. This is relevant for extending the market for research outputs across all of society, as there is a high degree of fragmentation within knowledge worker groups, and each fragment has little experience in adapting to and using STEM publishing systems. Ensuring that epidemics take hold in every knowledge worker sector may prove difficult. However, in combination with the following concept, which describes how new STEM products and services reach maturity, it suggests that there are systemic market forces at work which can change individuals' attitudes.

"Product Life Cycle"

At a product level, the Gartner "Hype" Cycle demonstrates that different digital products are at various stages in development, and all go through a period of hype and disillusionment before settling down on an even keel (Gartner 2014). As a concept, the Hype Life Cycle model helps organisations understand that the path to market acceptance for a product/service is not smooth and that disappointing results may reflect that it is at a particular point in the product life cycle. Stages within STEM can be plotted onto the Hype model (Fig. 9). It is a dynamic model as well as a subjective one. For example, although institutional repositories (IRs) could be placed at an early stage – at the "technology trigger" – in recent years there has been commitment by funding agencies to move them up the slope and beyond the "peak of inflated expectations". Web 2.0 may be entering the "trough of disillusionment" according to several pundits, whereas digital rights management (DRM) may be on the "slope of enlightenment" or even have reached the "plateau of productivity". Social networking and social publishing for the science community may still be anticipating a "peak of inflated expectations" and therefore have some way to go before they become productive tools within STEM.


Fig. 9: The Gartner Hype Life Cycle. Based on Gartner’s Hype Life Cycle in Wikipedia.

The importance of this concept is that good new ideas may fail, or be slow to be adopted, just as often as they achieve dramatic early commercial success. There is a "right time" and "right conditions" for new services to be introduced successfully (the "tipping point"). It is not easy to predict how external factors will dictate where on the hype cycle a product or service may fall. Unaffiliated knowledge workers are bystanders in seeing how such new products/services succeed or fail along the hype cycle. However, in future their combined influence may have the effect of changing the position of new digital information products or services on the Gartner Hype Life Cycle model.

Economies of Scale

New STEM products and services which meet conditions set by the digital research environment require investments to be made in technology and IT expertise. The structure of STEM publishing is such that large commercial STEM publishers – for example, Elsevier (part of RELX Group), Springer S&BM (and Nature), Informa (Taylor and Francis) and Wiley (Wiley-Blackwell) – operate at one end of the scale/size spectrum, and the small, highly specialised and focused learned society publishers or university presses at the other. It is not just a matter of corporate size. It is also focus, sophistication and professionalism in the adoption of modern commercial/business practices. The tail of the STEM publishing sector may not always have the corporate scale, resources or even commitment to undertake heavy and risky investments in new informatics areas.

Traditionally, learned societies saw their publication programme as a service for members, making sure that the latest research in their field was published through the learned society's imprimatur and own journal. In recent decades several learned societies began subcontracting management of their publications to larger commercial companies to achieve increased sales. Economies of scale are important for STEM when reaching out to a global audience. On their own the small publishers cannot cope – in collaboration with the scale offered by larger publishers they can achieve greater outreach. Sophisticated sales and marketing apparatuses are required which can only be sustained if there is an extensive range of products/services being offered. It also requires investment in production and IT skills to support the transition from print to digital publishing, which adds to the cost base. Small publishers and learned societies are unable to commit such resources if it means that other aspects of their corporate mission are compromised (see the later chapter on Learned Societies). Larger commercial publishers offer their sophisticated online infrastructures and support services to small learned societies. In effect, larger publishers are buying market share on the back of the smaller learned society publishers.

Increasing corporate size to benefit from economies of scale is not without negative consequences. According to Professor Nerissa Russell (anthropology), chair of Cornell University Faculty Library Board: "There's been tremendous consolidation in the publishers, and things that used to be published on their own by learned societies are now being contracted out to these commercial publishers. There are about five commercial publishers, and they're jacking up the prices to make money because they can." (March 2014, Faculty Senate Meeting, Cornell University).

This raises problems for knowledge workers, as their usage and publishing activities are determined by policies set by the commercial sector, and these policies are often more restrictive and more expensive. However, the more cost-effective digitisation of research output becomes, the quicker the traditional barriers created by economy of scale can be lowered, and learned societies can then regain their position as primary providers of STEM. This issue is explored later with a focus on the role of learned societies in the digital age. The following chart plots some of these economic and financial trends and how they come together in changing the behaviour of researchers.


Fig. 10: Economic/Commercial Trends (see Brown, D. 2016, p. 26). The chart brings together costs of publishing, openness and the free "frustration gap", refereeing, OA developments (green/gold/grey), electronic publishing, mandates, datasets, pay-per-view (PPV) and personal subscriptions, publisher dominance, institutional subscriptions, funding problems, Big Deals and licensing, new competitors, pricing models and industry structure, and links them to researcher behaviour.

PUBLISHING AND STEM DEVELOPMENTS

STEM publishing provides an infrastructure for weeding out noise and inaccuracies in research outputs, enabling researchers to receive what is relevant, accurate and useful. The role of the refereeing system is crucial. However, there are challenges to the current two-blind refereeing system which most journal publishers adopt to ensure that only quality articles are incorporated into the minutes of science. The refereeing system remains the main solution to one of the leading problems facing the STEM industry – that there are too many research outputs for an individual researcher to cope with. There is an information overload.


"Information Overload"

Discussions about research information and users often centre on how, in a digital society, people cope with too much information from too many sources. An early proponent of the concept of "information overload" was Toffler (Toffler 1970), who claimed in his book Future Shock that information overload was a psychological syndrome experienced by individuals, rendering them confused and irrational. Whereas information overload was considered a psychological problem by Toffler, it is now viewed as a cultural condition. Industry observers worry that it is not that too much information is being received but rather that not enough knowledge is being assimilated.

Two possible solutions have emerged. One is that individuals rely on algorithms and large databases to collect and sift information on their behalf. Alternatively, there is the social construct of using colleagues, societies, professionals and friends to point individuals to relevant items (Dean and Webb 2011). According to Shirky (2008), any problem we have with information overload is a filtering problem. Whilst in a pre-digital world we relied on traditional quality sources such as experts and peers, newspapers, journals and textbooks for filtering, in the digital world there has been a shift towards the informal and social media to provide filtering services. Jan Velterop has commented that there is now more information available in any given field than anyone can find using traditional literature searches: "The time is long gone when it was possible to go to a small set of journals to find pretty much all needed" (Velterop 2014).

There are two problems resulting from overload. Firstly, researchers arrive at conclusions based on a small subset of the information in the area concerned. The Internet has accelerated the collapse of communications between experts and the layperson by offering an apparent shortcut to erudition. Secondly, Velterop claimed there is much duplication of research, as researchers are no longer aware of similar and related activity going on elsewhere. As both these factors will only add to the noise and volume within the system, "the situation promises to get progressively worse" (Velterop 2014). Furthermore, academic discovery often takes place in the interfacial areas between disciplines, increasing the amount of information that needs to be scanned.

The Internet is weakening the ability of researchers and knowledge workers alike to do basic research. It is not like a library, with all its search aids and filters to weed out noise. The Internet is more like a giant repository where anyone can deposit anything, good or bad, right or wrong. It is an environment without regulation, which opens the door to content being driven by marketing,
commerce, politics and the uninformed decisions of laypersons rather than relying on qualified experts. Each person is now walking around with more information on their smartphone or laptop than ever existed in the entire Library of Alexandria. However, most of this is of a low quality. According to Sturgeon's Law, 90% of everything is rubbish (Sturgeon, "The Claustrophile", Galaxy, August 1956). The sheer size of the Internet, with its over one billion websites, and the inability to separate meaningful knowledge from random noise, means that good information will always be swamped by inferior data.

Another aspect is that there is a finite amount of time available for researchers to find what they are looking for. A researcher's focus could therefore become too narrow, so that they are sidelined by new developments and spend too much time going down blind alleys. In addition, disciplines change their shape and direction, which could be overlooked by individuals in their focus on a small, specialist area. More worrying is the impact which the trend towards "multitasking" is having on the ability of researchers and academics to absorb information. Multitasking introduces a distraction. From studies on students it appears that performance when multitasking is lower in comparison with those who do not multitask (Greenfield 2014, 230). Increased availability and use of Facebook, Skype and messaging systems impact negatively on concentration levels. Besides information overload in content, the digital native faces information overload through new media formats, as both the medium and the message create an excessive flood of information and data (McLuhan 1964). The net result is that users of research information have to find their own way through the morass of information with which they are confronted; some succeed, others become "confused and irrational" (Toffler 1970). The problem gets worse as the structure of science splinters in different directions, each creating new sources for research outputs and publications.
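If overload is, as Shirky argues above, essentially a filtering problem, then part of any answer is algorithmic. The fragment below is a toy illustration of that idea rather than a description of any real discovery service: incoming titles are scored against a researcher's weighted interest profile and only the strongest matches survive. The profile terms, weights, article titles and cut-off are all invented.

```python
# A toy illustration of algorithmic filtering as a response to overload:
# score each incoming item against a researcher's interest profile and keep
# only the best matches. Real discovery services use far richer signals;
# the articles and profile terms below are invented.
from collections import Counter

interest_profile = Counter({"gene": 3, "editing": 3, "crispr": 5, "ethics": 1})

articles = [
    "CRISPR gene editing in crop plants",
    "Survey of deep learning hardware",
    "Ethics of human gene editing trials",
    "Thermal properties of novel alloys",
]

def relevance(title: str) -> int:
    """Sum the profile weights of the words appearing in the title."""
    return sum(interest_profile[word] for word in title.lower().split())

# Keep only items scoring above a (hypothetical) threshold, best first.
shortlist = sorted((t for t in articles if relevance(t) > 3),
                   key=relevance, reverse=True)
for title in shortlist:
    print(relevance(title), title)
```

Production services layer citation data, usage signals and machine learning on top of this kind of scoring, but the principle – delegating the first sift to software rather than to the reader – is the same.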

"The Twigging Phenomenon"

New subject areas are created continuously as frontiers of science are pushed out ever further. Science fragments into smaller sub-disciplines – the so-called "twigging phenomenon" (Small 2006). Each new sub-discipline is a collective ground for a group of like-minded researchers to unite and create a new learned society, and this leads to the development of a common forum for the exchange of relevant information – a society-spawned learned journal. Such groups want the
published output from their members to receive international recognition, respectability and visibility. A commercial/marketing support service is required. A similar process is undertaken within commercial journal publishers; their in-house editors are constantly looking for new publishing ventures which support new research areas. Fig. 11 is indicative of how such twigging or splintering has occurred in one sub-discipline of physics – the same process is replicated in most areas of science.


Fig. 11: Example of “twigging” in physics disciplines.

Editorial barriers to entry are low in STEM publishing and are falling even further as "economies of scale" bite. It is a competitive process, with each publisher seeking to be the first to establish a journal or book series in any new research area, to sign up the best editor-in-chief, to invite support from a reputable editorial board, and to offer the best service to authors in the topic. Each such title would be unique even though like-minded titles focus on the same area. They are not necessarily substitutable, as each carries different reports and research findings. They differ only in perceived personal assessment of quality and brand, something which is still subjective and unquantifiable (impact factors notwithstanding).

The twigging process adds a further dynamic to STEM publishing – as research frontiers are breached, with new research areas, topics and disciplines emerging, the scope for including more and more specialists within the research network increases. Twigging therefore caters for the "long tail" of publishing. It appeals to those unaffiliated knowledge workers who seek only the
occasional specialist publication and research involvement of a temporary nature. However, this structural feature of science makes the information problem facing researchers worse – their exposure to information overload is increased as the frontiers of science are extended. This is where the refereeing service offered by STEM publishing comes into play. Refereeing eliminates those articles which are questionable, fraudulent and irrelevant. Rejection rates are high in some of the quality journals/magazines – as many as 90% of submitted manuscripts are rejected by industry-leading journals. The question raised by many pundits is whether this important filtering and refereeing process remains fit for purpose in a digital age. There are alternative systems which technology is making available.

Vertical Search

People are faced with a bewildering array of information formats (magazines, billboards, TV, blogs, bulletin boards etc.), which Morville (2011) – and Toffler (1970) before him – claims leads to a loss of literacy. Morville wrote in his book Ambient Findability about finding one's way through this flood of information within the many published formats (Morville 2011). Ambient findability is less about the computer than the complex interactions between humans and information. All our information needs will not be met easily, he claims. Information anxiety will intensify, and we will spend more time rather than less searching for what we need. "Information overload" faces a resurgence in the digital age.

Search engines are not necessarily up to the task of finding what the end user wants. They tend to be out of date and inaccurate. However, they are trying to rectify some of the emerging weaknesses by improving their technology. For example, they undertake SEO – Search Engine Optimisation – ensuring that software throws up results that are most relevant to the end users based on the search terms provided. Whilst search engines pride themselves on speed and specificity, this excludes the subsequent activity the end user goes through in bypassing "splash" pages and other interferences in reaching the required source data (splash pages on a website are what the user first sees before being given the option to continue to the main content on the site).

Faced with the above, there are opportunities for new types of vertical search services. Vertical search engines, distinct from a general web search engine such as Google, focus on a specific discipline. The vertical content area may be based on topicality, media type, or genre of content, such as a research
area. Google cannot be as precise and filtered as a targeted vertical search service (Battelle 2005). The question is whether these vertical or niche services can pull back search activity from the entrenched position large generic search engines have established (Gardner and Inger 2012). This requires hard work and investment, but few publishers have shown any inclination thus far to create such platforms, either individually or collaboratively. Consequently, researchers and knowledge workers are not offered a good service. They depend on the broad sweep of sources included within Google and similar general search engines. Knowledge workers rely on "good enough" results until an improvement in ambient findability can be achieved and filters down to more general knowledge workers. Ambient findability as described by Morville is closely associated with the development of portals and hubs (see later). A vertical front-end search system could link into communities to provide a seamless search and delivery system. Both could then feed into an SDI-equivalent (selective dissemination of information) personalised STEM service. This will be taken further in a later discussion on future STEM trends.
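The difference between a generic engine and a vertical one can be reduced to a single constraint: the query is evaluated only against records tagged as belonging to the discipline in question. The sketch below uses a handful of invented records to show how the same ambiguous query term returns different, cleaner result sets once it is scoped by discipline; it is an illustration of the principle, not of any existing product.

```python
# A minimal sketch of "vertical" search: the same keyword query, but scoped to
# records from one discipline only, so a match in an unrelated field never
# competes with subject-specific material. The records are invented examples.

records = [
    {"discipline": "oncology",  "title": "Imaging markers in breast cancer screening"},
    {"discipline": "oncology",  "title": "Cell cycle checkpoints and tumour growth"},
    {"discipline": "astronomy", "title": "Star formation in dwarf galaxies"},
    {"discipline": "astronomy", "title": "Cancer constellation survey of variable stars"},
]

def vertical_search(query: str, discipline: str) -> list[str]:
    """Return titles containing the query term, restricted to one discipline."""
    q = query.lower()
    return [r["title"] for r in records
            if r["discipline"] == discipline and q in r["title"].lower()]

print(vertical_search("cancer", "oncology"))    # the disease, from oncology records
print(vertical_search("cancer", "astronomy"))   # the constellation, a false friend
```

A production vertical service would add discipline-specific thesauri, ranking and richer metadata, but the scoping step shown here is what distinguishes it from a broad web crawl.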

"Wisdom of the Crowd"

It is implicit in this book that the present STEM publication system appears to cater for the "elite" or meritocracy in science. There is a closed network of specialists who determine what is published. Authors and readers are essentially the same individuals. Appreciation and understanding of an article's content are made by a small network of specialist referees having a similar professional and academic standing. "Wisdom of the Crowd" has been held up as a challenge to such specialism, concentration and meritocracy or elitism. It has been shown that the summed knowledge and experience of numerous people can exceed the performance and results of a few skilled experts under certain circumstances. As a social construct it is another example of a move towards "democratisation".

In his book The Wisdom of Crowds, Surowiecki suggests that answers, and therefore ultimately wisdom, stem from asking a wide group of people their opinion on a topic. "Wisdom of the crowd" takes the position that citizens contribute to a common end from the perspective of their separate personal interests, backgrounds and experiences. In many cases the combined results of the group's opinion turn out to be more accurate than relying on the views of a few individual experts (Surowiecki 2004). Surowiecki's theory was based on practical observation. In 1906, Galton witnessed bets being placed on the weight of an ox at a west-country fair in
England. He found that the average estimate from the 800 or so participants was almost accurate. The conclusion from this and similar experiments was that groups do not have to consist of clever people to be smart. Consensus among the masses is achieved through a mathematical truism – if enough people participate, the individual errors in their estimates, positive and negative, cancel themselves out, leaving a near-accurate result. Surowiecki suggests that one should stop chasing an expert and instead ask the crowd to reach informed decisions.

Expertise – as the foundation of the refereeing system – can be challenged. Experts are more likely to disagree than to agree. There are some spectacular examples of the refereeing system getting things wrong – bad research being applauded, important research trivialised and plagiarism institutionalised – though these are admittedly untypical. Such failings can be attributed to the current global community of referees being overworked (Ware 2005). Ware highlights that a small number of referees are responsible for a significant share of articles reviewed – an unhealthy reliance by STEM on the voluntary actions of a dedicated few reviewers. They take part as volunteer reviewers because they have an emotional commitment to their research area which transcends financial returns. This may not be a typical motivation of researchers who pursue their own research priorities.

New online systems can compete with traditional assessment systems in terms of speed, interactivity and effectiveness in stimulating community involvement. Blogs and wikis are examples of grass-roots services providing a framework on which alternative refereeing services can be based, revelling in the democracy which is inherent within the Internet. They are based on the institutionalisation of sharing, cooperation and collaboration – all powerful facets of the digital/Internet world. Social media and social networking processes, enabled by the Internet's openness and interactivity, could support a different assessment system – one which leads to new publication formats (such as "the article of the future" (Elsevier 2011; Zudilova-Seinstra 2013)) rather than relying on a "static" published article as the traditional form of scientific communication.

In society there are websites which ignore the expert and use mass participation to create content, assessments and reviews. For example, RottenTomatoes.com uses the wisdom of the crowd to rate films and movies. Tripadvisor.com does the same for hotels and travel services. It allows anyone to be a critic, not just those who are professionally trained. Even Google's core search system is created using wisdom of the crowd – the PageRank algorithm (which highlights items most relevant to a specific search) is based on actions by the crowd of its online users. The crowd's vote is encapsulated in the web logs, and search
results are ranked according to the term's web log popularity. Amazon, eBay and similar online services look to the wisdom of the crowd to improve their services. They point to items that end users might be interested in buying, based on the experiences of similar activity by their crowds of users. Wikipedia is an example of a product which has been built using the wisdom of the crowd in a structured way.

For "crowd", one could substitute "unaffiliated knowledge workers". They are not all experts, but in using the sum of all their experiences they are as likely as not to come up with assessments, based on numerical superiority, which are comparable in accuracy with the current two-blind refereeing system. It is another way in which UKWs could become more active in STEM in future. It would need an alternative refereeing structure to be developed to formalise such a process. The power of technology and social change may provide a solution.
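The "mathematical truism" behind Galton's ox can be checked in a few lines. The simulation below assumes 800 guessers (the figure quoted above) whose errors are independent and unbiased; the true weight and the spread of individual errors are invented for the illustration. Under those assumptions the crowd's average lands far closer to the truth than a typical individual does – though the cancellation argument fails if the errors share a common bias.

```python
# A small simulation of the error-cancellation argument behind "wisdom of the
# crowd": many noisy, independent guesses average out close to the true value.
# The ox weight and the error spread are invented for the sketch.
import random
import statistics

random.seed(42)
TRUE_WEIGHT = 1_198                     # hypothetical "true" ox weight (lbs)
guesses = [TRUE_WEIGHT + random.gauss(0, 120) for _ in range(800)]

crowd_estimate = statistics.mean(guesses)
typical_individual_error = statistics.mean(abs(g - TRUE_WEIGHT) for g in guesses)

print(f"crowd estimate:            {crowd_estimate:.0f} lbs")
print(f"crowd error:               {abs(crowd_estimate - TRUE_WEIGHT):.1f} lbs")
print(f"typical individual error:  {typical_individual_error:.1f} lbs")
```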

Cult of the Amateur

Nevertheless, Keen, in his book The Cult of the Amateur (Keen 2007), was critical of the growing democratisation within media and suggests that it is destroying something valuable within society – quality, relevance and expertise. He claims that empowering the amateur, a consequence of passing control to "the crowd", undermines the authority of experts who have spent years building up personal knowledge and experience. Adopting the principle of wisdom of the crowd would give experts the same status as an ignorant bystander, in his opinion. Mumford called such a situation "a state of intellectual enervation and depletion hardly to be distinguished from massive ignorance" (Mumford 1974). Keen claims it is necessary for experts to sift through and decide what is important and what is not. Otherwise we are left to make our own way through the mass of "white noise" without gatekeepers being there to provide selection, advice and assistance.

STEM publishing provides the service whereby experts make judgements on quality, a valuable service for end users and one for which they should be expected to pay. Traditional two-blind refereeing (whereby two experts give their opinions on whether an article merits publication) is a service which has to be organised – it is not something which emerged automatically, and nor is it free to administer and coordinate.

Tom Nichols (2017) also makes the distinction between laypeople and professionals. Volunteers or knowledge workers do what interests them at any given
time, whilst professionals employ their expertise every day: "The enthusiasm of interested amateurs is not a consistent substitute for the judgements of experts" (Nichols 2017). As an example, Nichols points to the varied coverage within Wikipedia, with the ad hoc approach to item descriptions defeating the often-claimed universality of the service. It is skewed toward technical, Western and apparently male-dominated content – a point which has been identified elsewhere (in MIT Technology Review 2013, for example). The value of Wikipedia lies less in the comprehensiveness and reliability of its content than in its bringing together of much data in a stable format. However, knowledge is more than just an assembly of factoids. The Internet creates the false impression that the opinions of many are tantamount to facts. As such it is a facility in support of creating "wisdom of the crowd", whereas what it overlooks is the complexity of many of the issues which face researchers. It is the scrutiny from experts which enables the knowledge base to grow effectively, and not to be drowned out by social media noise.

However, though there are aspects of the "cult of the amateur" which are important, the STEM communication industry is not static – it is being driven by environmental developments which change the industry, including the way relevancy is created and targeted to those who most need it. Refereeing could become more open and transparent in a digital world because of general support for open systems and new interactive media formats. Though the challenges made by Nichols and Keen are fair, they will be eroded as and when demography provides a more enlightened and extensive crowd or audience. At that point, the "cult of amateurs" becomes an informed, extensive and workable network of "amateur scientists" who could interact online to comment on the quality of research in an analogous way to that seen with many social media services (Google, Amazon et al.).

Wisdom of the crowd raises questions of whether the professionalism inherent in the production of content creates a barrier to widespread dissemination of research output in a digital world, and whether the existing in-house editing skills within publishers, and content management expertise within libraries, restrict the ease of access which researchers need to become active STEM participants.

"Serendipity"

In a webinar organised by the Society of Scholarly Publishers entitled "The Changing Discovery Landscape," David McCandlish noted, "Not only am I trying to keep up with developments in my specialty, but I'm also trying to find cool ideas from other areas where I'm not an expert." As a quantitative evolutionary
biologist at Cold Spring Harbor Laboratory in the USA, he monitors literature in his own neighbourhood of evolutionary biology and genetics, but also roams more widely across other fields, such as maths, physics and engineering. This is a pattern found among many scientists, where a discipline impinges on subject areas sometimes very different from their core subject. New ideas may emerge from keeping in touch with a broader coverage of published information and results. It enables the research process to be more imaginative, creative and less introspective. There are also eureka moments. New ideas and new research findings spring up from nowhere. Reliance on traditional publication systems does not provide the steps which lead to such novel developments. Application of computer technology, automation and machine intelligence can smooth the path towards eureka moments. Even then, an analytic assessment, based on real life experience, is still required to determine whether the new development is worthy of further effort.

Miscellanised Information

In his book Everything is Miscellaneous – The Power of the New Digital Disorder, Weinberger, a marketing consultant and fellow of Harvard Law School, extols the virtue of redundancy in a digital age as a counter to professionalism (Weinberger 2007). Metadata facilitates access to objects in different ways. Digitising everything enables information to take different formats. Whereas in the printed world the "leaf can only be on the one branch at any one time", in a digital world this is no longer true. Metadata provides the link to objects in multiple ways. "Messiness" now becomes a virtue – unacceptable in a print world, but in a digital world entirely possible, as many links can be followed to the required published objects. Weinberger claims that in a print world there was a finite resource to transmit information and hence knowledge. In the new digital world there is no such limit.

According to Weinberger, compilation of metadata should not be confined to the rigid structures of traditional classification/cataloguing. Rather, metadata should be developed, enhanced and built on by the community itself for different purposes, in a sort of Wikipedia style of ongoing improvement. In effect, Weinberger claims that the elaborate classification of information will be made redundant by what he refers to as the "third order". The "third order" is social networking/social collaboration through Web 2.0. Weinberger's arguments rest on the concept of the "wisdom of the crowd" (Surowiecki 2004) and the power of metadata fragmentation (Evans and Wurster 2000). Both
result in a greater variety of digital information and communication formats ("miscellaneous") and the decline of the expert (classification). Authority therefore comes under attack. The owner (publisher) of an item was historically in a powerful position. In the online world it is now the end user who is empowered. Sifting occurs through polling and rating systems (Digg, Twitter, Amazon, Google etc.) rather than through reliance on blind refereeing systems. There is latency in the information process enabling new players to join, and new technology allows them to do so. In some areas this is more apparent (astronomy, for example) than in others, and Weinberger fails to comment on subject areas which rely on a corpus of carefully vetted and structured information to advance the cause of science (such as biomedicine). "Standing on the shoulders of giants" (Bernard of Chartres, twelfth century; Newton 1668) has little place in Weinberger's analysis.
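Weinberger's point that a digital "leaf" can hang from many branches at once is, at bottom, a statement about metadata structures. The sketch below, using an invented record and invented facet names, indexes one object under several facets so that it can be reached along any of them – something a single physical shelf mark cannot do.

```python
# A minimal sketch of Weinberger's "leaf on many branches": one digital object
# carries several metadata facets at once, so it can be reached from any of
# them, unlike a single physical shelf location. Facet names and values are invented.
from collections import defaultdict

record = {
    "title": "Thermal adaptation in alpine insects",
    "facets": {
        "discipline": ["ecology", "entomology", "climate science"],
        "method": ["field study"],
        "region": ["Alps"],
    },
}

# Build an index from every facet value back to the title:
index = defaultdict(list)
for facet, values in record["facets"].items():
    for value in values:
        index[(facet, value)].append(record["title"])

# The same item is now discoverable along several different "branches":
print(index[("discipline", "entomology")])
print(index[("region", "Alps")])
```

Community tagging of the kind Weinberger favours simply extends such an index with whatever additional facets users themselves supply.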

RESEARCHER BEHAVIOUR

Below is a selection of market reports on user behaviour in the sciences. They are either recent or backward-looking, and at best are based on assessing researcher behaviour using log analyses to follow their online "exhaust trails". At worst, they rely on impressions of past experiences reported by individual researchers, which give little indication of what the information needs and habits of researchers in a future digital age might be. They are quantitatively unrepresentative and could be spurious in their conclusions. Neither approach is therefore useful for indicating future STEM behaviour.

Typology of Researchers/Users

Studies undertaken in the past apply typologies to individuals according to their ability to interact with published scientific information. The profiles vary from the totally switched off to those who become gatekeepers, selecting, filtering and redistributing information on behalf of colleagues. The latter are the "mavens" (see "Tipping Points" and Gladwell 2000). In demonstrating that researchers have different needs, such typologies can also be applied to the latent audience for STEM – the UKWs. This might indicate the proportional distribution of user types in the various UKW communities. All the approaches stress that a single approach to embracing the UKW audience would be unrealistic – segmentation of the UKW communities in terms
of disciplines, culture and user typology is necessary, in the same way as for academic researchers.

The SuperJournal Project

A major study on researchers' behaviour was funded jointly by Jisc and several STEM publishers, the "SuperJournal" project (Pullinger and Baldwin 2002). The structure they identified for researchers adapting to e-journals was based on questionnaire returns received from 2,500 users. This created a database from which two main types of users were identified, regular digital users and occasional digital users. Within these two categories several user types were identified.

Profile of users in the SuperJournal Project

Regular Users
– Enthused: frequent use of a large number of journals (at least …), usually of the full text. Mainly social scientists. There were … users in total.
– Journal-focused: very frequent use of specific journals, half their time being spent on the full text. … were identified.
– Topic-focused: access titles once every six weeks or so; use on average … journals. More social scientists than natural scientists. … users in total.
– Article-focused: access once every two months; use only one journal, sometimes reaching into the full text. Mainly natural scientists retrieving known articles. … in total.

Occasional Users
– Bingers: used the service for a short period of time, intensively, and did not return.
– Explorers: used the online service extensively, making several repeat visits.
– Window shoppers: those who came into the online service, looked around, and then left. Mainly natural scientists.

Different typologies have been identified in other studies. The first systematic attempt at a typology of researchers was undertaken by the Faxon Institute in 1991/92. Its unpublished multi-client study showed that almost 50% of researchers were either "information zealots" or "classic scientists" and 16% were "information anxious" (Faxon Institute 1991/92). A more elaborate typology of all sections of the US community, not just researchers, is undertaken at regular intervals by the Pew Internet and American Life project. Ten different user typologies have been identified, ranging from "Omnivores" through "Lackluster Veterans" to "Indifferent" and "Off the Network" (Pew Research 2008).


Targeted analyses of STEM researchers have been made by Professors King and Tenopir in the United States. Typology was not the key feature of their many reports, but rather an assessment of use based on critical incident feedback.

Patterns of STEM Use

The main results from the investigations undertaken by King and Tenopir can be summarised as follows:

Tab. 8: Understanding patterns of STEM use.

Readings per researcher                 No. of articles         No. of hours
In academia                             …                       …
In industry                             …                       …

From journal subscriptions
Year                                    Personal subscription   Library collection
…                                       …%                      …%
…/…                                     …%                      …%

Age of article read                     Over …% from journals over … year old
Time spent on research                  …,… hours spent on all research activities per annum

Type of document read
Format                                  University researcher   Non-university researcher
Research journals                       …                       …
Trade journals                          …                       …
Professional books                      …                       …
External reports / grey literature      …                       …
Internal reports                        …                       …
Other materials                         …                       …
Total                                   …                       …

Tab. 8 (continued)

Disciplinary differences
Discipline                              Average number of articles read per annum
Medical researchers                     …
Paediatricians                          …
Engineers                               …
Social scientists and psychologists     …
Tenopir and King based their reports on responses received from over 25,000 scientists, engineers, physicians and social scientists during the past 30 years. These represent academic-affiliated researchers. Within this period, the average number of articles read by these largely university-based scientists has risen from 150 (1977) to 172 (1984), 188 (1993) and 216 articles between 2000 and 2003. However, this is not consistent across disciplines, as indicated above. Tenopir and King also established that during the same thirty-year period the average number of personal journal subscriptions held by individuals fell from nearly six to under two. This was compensated for by access through central library holdings and interlibrary loan support services (Tenopir and King 2000).

Ofcom

The Office of National Statistics commissioned a study by Ofcom, the UK government's telecommunications watchdog, in August 2014. The results painted a picture of how the young generation have adapted to new communication technology in the UK. According to Ofcom, the advent of broadband in 2000 has created a generation of digital natives. "These younger people [children under 16] are shaping communications," claimed Rumble, head of Ofcom's media research. "They are developing fundamentally different communication habits from older generations, even compared to what we call the early adopters, the 16-to-24 age group" (Ofcom 2014). Industry pundits consider preferences of millennial children a better indicator of the future than those of innovative young adults.

The most remarkable indicator is time spent talking by phone. Two decades ago, teenagers devoted their evenings to using home telephones. However, for those aged 12 to 15, phone calls now account for about 3% of time spent communicating. For all
adults, this rises to 20%. Today's children do most of their remote socialising by sending digital messages or through sharing photographs and videos. "The millennium generation is losing its voice," Ofcom claims. Over 90% of their device time is message-based: chatting on social networks such as Facebook, sending instant messages through services such as Twitter or WhatsApp, or even sending traditional mobile phone text messages. On the other hand, 2% of children's time is spent emailing, compared with 33% for adults.

Away from their phones, 12- to 15-year-olds have a different relationship with other media. A seven-day diary showed live television accounts for just 50% of viewing for this age group, compared with nearly 70% for adults. They spend 20% of their time viewing short video clips, for example on YouTube. Young adults aged 16 to 24 are active consumers of almost all media, devoting 14 hours and seven minutes each day to communications if the time spent multitasking, for example texting while watching TV, is included. However, their use of radio and print-based media has all but disappeared.

Such change in behaviour patterns is critical at all ages, but particularly among the generation which will soon feed into the nation's community of researchers. Not only will it affect the way future academic-based researchers are able and willing to conduct their communication activities, but it will also set new demands to modify the communications systems and infrastructure to allow a wider audience of knowledge workers to become active participants in research, whatever their institutional affiliation.

Gaps and Barriers

A study into the difficulties facing users in accessing research output was undertaken by CIBER (Rowlands and Nicholas 2011). In this study, journal articles were considered critical to discovery. Nevertheless, 11.5% of all researchers described their level of journal access as "poor" or "very poor". For university researchers, the proportion fell to only 5.4%, but rose to 19.8% for knowledge workers in small and medium-sized enterprises and 22.9% in manufacturing. Faced with barriers to access, the general response was "simply to give up and find something else", which does not augur well for efficiency and productivity.

The study also pointed out that "there are around 1.8 million professional knowledge workers in the UK, many working in R&D intensive occupations (such as software development, civil engineering, consultancy) and in small firms who are currently outside of subscription arrangements. The needs of this sector of the economy demand greater policy attention" (Rowlands and Nicholas 2011).


The UK industrial sectors reporting the poorest levels of journal access included the motor industry, utilities companies, metals and fabrication, construction, and rubber and plastics, although, clearly, R&D also takes place in all these industries. In the CIBER study, nearly half (45.8%) of the researchers reported they had difficulty accessing the full texts of journal articles they needed on ten or more occasions over the previous 12 months. It is not possible to quantify the knock-on effects of this "failure at the library terminal". A spectrum of outcomes is possible, from mild frustration to more serious outcomes such as repeating an experiment unnecessarily or losing out on a grant. There is also confusion about licensing and particularly walk-in rights, especially for accessing e-resources.

Pay-per-view business models constitute a disincentive to accessing research publications. There was widespread reluctance to pay for individual articles at the prices currently being asked by publishers and document suppliers, and a minority of researchers (26.3%) claimed that they had strong objections in principle to this mode of access. Nevertheless, there were indications of a substantial market for pay-per-view, and that this could grow further if acceptable business models and prices could be set. 12.6% of respondents to the CIBER survey said they might consider buying individual journal articles in the future, and this proportion rises to 43.8% in the case of conference papers.

Early Career Researchers

CIBER Research is, at the time of writing, completing a three-year study of early career researchers being undertaken on behalf of the Publishing Research Consortium. The target audience is mainly doctorate holders or doctoral candidates, under 35 years of age, who are active in research. Extensive interviews were conducted among over 100 researchers in a cross-section of seven countries, including the UK, the USA and China.

The central message from these mainly face-to-face interviews was that this growing sector of the overall research community was firmly committed to a paper-driven form of information use. They were happy with the refereeing system as it currently stands and were concerned about any proliferation of an open review system, as this could attract unwelcome and risky comments. Nevertheless, there is a growing adoption of some social media in finding, communicating and sharing information through such social media platforms as ResearchGate, LinkedIn and, to some extent and surprisingly, Twitter.


As indicated above, sharing and cooperation are important functions, and this is also apparent within early career researchers (ECRs). ECRs recognise the volatility of the research environment in which they are active. They feel unable or reluctant to take advantage of the new opportunities in STEM information systems because of the risks in following an uncertain social media path. Nevertheless, there is active growth in online communities, and these communities are champions for tightening up rules governing authorship. Open access (OA) publications, from an initial antipathy towards them, are now seen as a positive way to achieve greater outreach and speed of publication. Within the geographical coverage of this ECR study, the UK and USA show little pressure to support change, whereas China, Malaysia and France seem to welcome more widespread acceptance of a different communication model. Throughout there is still a great deal of support for the journal publishing system and its attendant referee process. Standing proud from the crowd is ResearchGate, which has become a pillar of scholarly communication and an increasingly important platform for scientific exchange.

These initial results from CIBER Research put a dampener on what has been written earlier about the strength of the forces for change facing the STEM community. There are several points to be made. Firstly, the CIBER research is a small study, investigating in depth approximately 100 researchers out of a universe of over 7 million worldwide, and therefore may not be fully representative. Secondly, ECRs are mainly drawn from the academic community, and one of the great harbingers of change may come from the incorporation of researchers outside academia, such as in the professions, SMEs and, notably, a science-aware general public. Finally, the train is rapidly leaving the station for the current cadre of researchers, driven on by the coalescence of the "perfect storm" factors described earlier.

WORK PROCESSES

The transition from a singleton-based research process to global collaborative Big Science projects is not linear. It occurs in stages. The starting point is the research process itself, with a strong scientific ethic dictated by the need to ensure quality of the scientific record. During the 1990s, office standardisation processes were increasingly adopted by researchers. These included back-office services such as email, project management procedures, standard forms for fund applications, protocols etc.


During the early years of the current decade, new technology tools also became available which, together with social media, have led to a further stage in the research transition – that of systematised research. It involves applying sophisticated technology to replace what was traditionally manual effort. Robotics in surgery and dentistry and CAD in architecture are instances of the new approaches. A further stage is the externalisation of the research effort. This involves the creation of, and access to, a wider range of research outputs. The next stage is still in development, but with the speed of technology and social changes currently underway, the paradigm for research is likely to change again in the foreseeable future (Susskind 2015).

Sharing Results

Underpinning the above is the assumption that researchers are prepared to share information and experiences with others without prevarication or equivocation, particularly with and among unaffiliated knowledge workers. This is a critical issue for future behavioural patterns – the willingness to "share" at all stages in the research process. In his book Cognitive Surplus, Shirky (2010) comments on studies, many built around game theory, suggesting that there is within society a willingness to collaborate and share rather than to be selfish in protecting one's research. This more collaborative approach has been supported – from a neurological perspective – by Lieberman in his book Social – Why our Brains are Wired to Connect (Lieberman 2013). An improved sharing of common resources occurs when all participants act without compulsion. There is a social mechanism which supports this – it is not destructive or aggressive but one which treats individuals as being non-exploitable. It is a founding principle of the Internet.

This support for sharing leads to participation in the communication process by a far wider group than has hitherto been the case. It enables knowledge workers to share their thoughts and experiences with academics/researchers on a collegiate and level playing field basis. However, sharing information is not always as ingrained into the scientist's psyche as participation in some large collaborative projects might suggest. Most career-minded scientists have little incentive to contribute to open-sharing sites and instead the focus is on doing what has been done over decades – to "publish or perish", to create articles which are published and cited and give international recognition for the research achievements of the individual scientist concerned: "Their data is the raw record of experimental observations and may lead to
important new discoveries" (Neilsen 2011). It is the special edge which a researcher's quality publications would give them over their peer group with whom they compete for funding, promotion and international recognition. The fear by such traditionalists is that open sharing creates opportunities for stealing results before they become attributable to the original author. It also allows erroneous results to be disseminated. The history of scientific development is strewn with examples of scientists stealing data from others, and plagiarising other works. It is the dark side of science and scientific publishing.

Another barrier to sharing comes from research which leads to patent applications being made, or to the development of a commercial product. This is where basic research, historically undertaken within universities, comes up against the proprietary aspects of applied research, undertaken in industry but also increasingly within universities under contract. It needs a change in behaviour and administrative procedures within corporations and research institutes if sharing is to succeed across those disciplines and projects which have an industrial and commercial application.

However, this protection of the individual's research activity conflicts with neurological trends which research into social networking is highlighting. Neurologists point to the importance of dopamine as a stimulant for enjoyment within the brain. Small releases of dopamine occur whenever an individual participates in a social networking site such as Facebook. According to Greenfield (Greenfield 2015, 110), "Going on Facebook is physically and/or physiologically exciting". Dopamine has the same effect on gambling, arousal, reward seeking and addiction – blips of dopamine which occur when experiencing interaction on social networks become not only rewarding but also, allegedly, compulsive (Greenfield 2015). Harvard researchers have demonstrated that sharing personal information activates the reward systems in the brain in the same way as food does (Tamir et al. 2012). It leads to the social mind accepting loss of some aspects of privacy – no longer holding back on making personal information available – in favour of sharing. According to Mark Zuckerberg, from Facebook, "People have really gotten comfortable not only sharing more information and different kinds, but more openly and with more people and that social norm is just something that has evolved over time" (McCullagh 2010). Zuckerberg also said that "There is a huge opportunity to get everyone in the world connected, to give everyone a voice and to help transform society for the future … As people share more, they have access to more opinions from the people they trust about the products and services they use. This makes it easier to discover the best products and improve the quality and efficiency of their lives" (McCullagh 2010).


There is a psychological disposition for self-disclosure, brought to fruition in a social networked world, which trumps traditional adherence to personal privacy: "If identity is now constructed externally and is a far more fragile product of the continuous interaction with 'friends', it has been uncoupled from the traditional notion of, and need for, privacy" (Greenfield 2015). With a reduced commitment to privacy, the gates are opened to support greater sharing in a social networked environment. The early Web tried to instil "stickiness" on sites, keeping the reader on site for as long as possible. From this has come a second Web model of research, which focuses on hyperlinking in and out of an article. An even newer model offers the ability to look behind each page to the source and enables the reader to "fan out across a territory chased by our interests" (Weinberger 2012), a more open and democratic system. This issue will be referred to in the chapter on Citizen Science, where sharing and the "hive mind" have become significant aspects of the emerging research activity. The point is made that sharing, cooperation and collaboration exert a powerful new influence on the emerging digital natives in a way unknown among researchers in a printed world.

Collaboratories

Much research involves teams of specialists acting in close cooperation, and not only within universities – significant public and private research projects are being undertaken across many institutional types, both in the UK and globally. The dominance of the single researcher breaking new research ground is fading and is being replaced by moves towards Big Science (Price 1963). Their coordination is guided by an "invisible hand" (Smith 1776) towards joint discovery and innovation. The premise behind this can again be found in basic psychology, where the individual is a social animal, enveloped by social networks from birth. Brain circuitry interacts with others from early on and leads to the research community being part of a large integrated neural network. It is the basis of research collaboratories (Greenfield 2014, in Mind Change).

A collaboratory, as defined by Wulf, is a "centre without walls", in which researchers can perform their research without regard for physical location, interacting with colleagues, accessing instrumentation, sharing data and computational resources, (and) accessing information in digital libraries (Wulf 1989). A collaboratory is an environment in which participants make use of computing
and communication technologies to access shared instruments and data, as well as to communicate with others. Neilsen describes several collaborative projects which harnessed the micro-expertises of individuals with different skills who would not normally come together (Neilsen 2011). This has led to networks of collaborators which can include academics, professionals and the general public, with a demand for effective real-time and fast communication support services. Existing collaboratories include the Biological Sciences Collaboratory; Collaboratory for Adaptation to Climate Change; Marine Biological Laboratory; Molecular Interactive Collaboratory Environment (MICE); and the Collaboratory for Microscopic Digital Anatomy (Wikipedia). From 1992 to 2000, financial budgets for scientific research and development of collaboratories in the United States ranged from US$447,000 to US$10,890,000 and the total use ranged from 17 to 215 users per collaboratory (Sonnenwald 2003). Such collaboration was originally defined by Udell as “Designed Serendipity” and adapted by Neilsen in his book Reinventing Discovery: The New Era of Networked Science (Neilsen 2011).

Designed Serendipity

Designed serendipity is where intractable technical problems facing a researcher are unlocked by finding the right expert at the right time to give the right solution. That person can be anywhere in the world. They can be outside the conventional groups found in the research process. In the past, such identification was difficult – with social media it has become easier. Collaboration is key to designed serendipity. It is claimed that when we try to resolve a problem on our own, most of our ideas lead to dead ends. But when many people address the problem, interaction increases through the “network and multiplier effects”. This happens more often when the number and diversity of participants increase. The problem solving goes “critical” and “viral”: “Once the system goes critical the collaborative process is self-sustaining. That jump qualitatively changes how we solve problems, taking us to a new and higher level” (Neilsen 2011). For those involved in designed serendipity, it helps to focus on specific issues where a researcher has a special insight and advantage. In the digital world, circulation of personal profiles online describing levels of expertise assists in matching skills to a collaboratory project. Increasing the granularity of action points within the research topic also improves prospects for more individuals with different expertises to become

involved – expertises which may not necessarily be available within the academic world. Knowledge workers generally bring practical and applied knowledge and give new insights on how pure research problems can be resolved within budget and time constraints. There are some essential aspects to designed serendipity. In essence:
– In society there is a tremendous amount and range of expertises
– This expertise can address small elements of an overall problem to be solved
– The expertise is often latent
– Social tools enable such latent micro-expertises to be identified, activated and harnessed
– These online tools create an “architecture of attention”
– Collectively, this harnessing exceeds the expertise of any one individual
– A series of such modular approaches may be necessary for large social projects to be managed successfully
As a workflow process it builds on sharing, the use of online tools and collaboration to produce a more effective collective intelligence. It adds another dimension to the democratisation of the research process and enables the wider market of knowledge workers – with their individual, often unique or esoteric skills and expertises – to be embraced within the scientific research process. Participants in many online social groups do not necessarily agree with current copyright rules, but instead adopt their own version of what is acceptable – and that excludes profiteering from others’ socially-created work. They do not want to inhabit a world of commerce but rather seek affirmation and recognition from peers: “Within the community purity of motivation inside the community matters more than legality of action outside it. Intrinsic motivations take precedence over extrinsic motives” (Shirky 2010). This indicates that an open, democratic information society embraces the different needs of a new audience hitherto locked out of the old system. It brings in social groups which are found in “the long tail” of the information industry – the unaffiliated knowledge workers. Children and especially teenagers are at the vanguard of the advance into democratic research processes. The developing brain of a child is more plastic, and responds more malleably to experiences, than the adult’s brain (Greenfield 2014, 25). This flexibility and adaptation to external digital experiences will be a significant driver in dictating the shape of the STEM dissemination structure in years to come, and it will come from the youth rather than the current silver-haired generation of researchers.
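The matching step which underpins designed serendipity can be pictured as a simple overlap between the skill tags circulated in personal profiles and the tags attached to finely-grained action points. The following is a minimal sketch of that idea; the profile names, skill tags and scoring rule are illustrative assumptions, not drawn from any actual collaboratory platform.

```python
# Illustrative only: match circulated expertise profiles to a finely-grained
# action point within a collaboratory project by counting overlapping skill tags.

def match_experts(profiles, task_tags, top_n=3):
    """Rank profiles by how many of the task's tags they cover."""
    scored = []
    for name, skills in profiles.items():
        overlap = len(set(skills) & set(task_tags))
        if overlap:
            scored.append((overlap, name))
    return [name for _, name in sorted(scored, reverse=True)[:top_n]]

# Hypothetical profiles of researchers and unaffiliated knowledge workers.
profiles = {
    "amateur_astronomer": ["image-processing", "telescope-operation"],
    "retired_statistician": ["survey-design", "regression", "r"],
    "lab_technician": ["sample-prep", "chromatography", "regression"],
}

# One small, well-defined element of the overall problem.
task_tags = ["regression", "survey-design"]

print(match_experts(profiles, task_tags))
# ['retired_statistician', 'lab_technician']
```

In such a scheme the granularity of the action points does much of the work: the finer the task tags, the more latent micro-expertises register a non-zero overlap and can be drawn into the project.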


RESEARCH DEVELOPMENT

A new organisational landscape is taking shape. Young scientists in emerging scientific research fields rely on hundreds of data banks and use electronic laboratory notebooks to record and then share results, adopting blogs and wikis for rapid communication. They become part of multi-disciplinary global research teams, applying social media for the exchange of information nuggets, publishing results in open access journals and including open source versions of the software used in their experiments. Science has become more multi-media in its approach and produces multiple formats as primary outputs. These outputs, in whatever format, are just as critical to researchers in both affiliated and non-affiliated contexts as the text-based write-up in the published research article. The issues facing datasets and data compilations have highlighted this.

Data and Datasets

A study by Stanford University in 2005 entitled “How much information?” claimed that five exabytes of information were being created each year. Of this, paper-based information represented a mere 0.01% of the total. Digital information stored on hard discs was the main source. Even this estimate of five exabytes is the tip of the iceberg when more recent social trends are considered – information is held in store every time individuals drive through traffic monitoring areas, with every flight taken and with every online purchase made. “Digital footprints” are left everywhere as part of daily life and recorded within datasets. More recently, estimates of worldwide “big data” ranging from three exabytes to 23 exabytes have been proposed. One attempt to put this in context was made by Eric Schmidt (chairman of Google), who claimed that three exabytes are created each day, and that as much data is created every two days as was created from the dawn of civilisation to 2013. With rapidly evolving technologies that range “from genome sequencing machines capable of reading a human’s chromosomal DNA in half an hour (circa 1.5 gigabytes of data) to particle accelerators such as the Large Hadron Collider at the European Organisation for Nuclear Research (CERN), which generates close to 100 terabytes of data a day, researchers are awash with information and data” (Hannay 2014). In STEM circles this has led to opportunities to make use of digital data as a primary resource. Access to digital raw material or data from related research studies becomes significant (Anderson 2008b). Building on the shoulders of

giants has taken a new twist as these giants take the form of robotic accumulations of hard data deposited in subject-based and institutional repositories. The main disciplines where large STEM datasets exist include astronomy, bioinformatics, environmental sciences, physics and demography. These data stores can often be accessed for free, and users are able to feed their own research results into the dataset. Data has become the new Intel. National and international organisations are making infrastructural commitments to create e-Science and e-Research through investing in grids and data networks. The NSF report on cyberinfrastructure in the US (“Cyberinfrastructure Vision for 21st Century Discovery”, March 2007) and investigations on data and infrastructure by OSI in the UK are instances of policy-setting initiatives to cope with the “data deluge” (Hey et al. 2003). Bringing small, isolated datasets into the publicly accessible domain can be a challenge. It has been claimed, anecdotally, that such small author-created datasets, often held in cabinets or drawers in the researcher’s office, amount in aggregate to two to three times the total amount of data currently within the large curated datasets of Big Science. The STEM publishing industry has been slow to provide a platform in support of such data collections – only recently has there been industry collaboration on issues such as data citation principles (including the international DataCite network), incorporating the Resource Identification initiative and developing altmetrics usage and evaluation (NISO). Berners-Lee’s vision of the semantic web has datasets as an intrinsic part of the future intelligent web. The challenge which optimising dataset access poses is something which all parties in STEM need to address. There is a role for unaffiliated knowledge workers to provide input to and receive output from such data collections.
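To give a feel for the orders of magnitude quoted above, the figures can be annualised with some back-of-envelope arithmetic. The run-rate assumptions below (a sequencer working continuously, the LHC generating data every day of the year) are simplifications made purely for illustration.

```python
# Back-of-envelope annualisation of the figures quoted above (illustrative).
GB = 10 ** 9
TB = 10 ** 12
PB = 10 ** 15

genome_bytes = 1.5 * GB                      # one genome read in ~30 minutes
runs_per_year = 2 * 24 * 365                 # assume two runs an hour, non-stop
print(genome_bytes * runs_per_year / TB)     # ~26 TB per sequencer per year

lhc_bytes_per_day = 100 * TB                 # LHC output quoted above
print(lhc_bytes_per_day * 365 / PB)          # ~36.5 PB per year
```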

Workflow Processes

As part of the increase in data available to researchers, there has been an impact on traditional workflow processes. The workflow process leads researchers through a whole series of stages which culminate in a completed report published within a respected scientific journal: A workflow consists of an orchestrated and repeatable pattern of business activity enabled by the systematic organization of resources into processes that transform materials, provide services, or process information. It can be depicted as a sequence of operations, declared as work of a person or group, an organization of staff, or one or more simple or complex mechanisms (Wikipedia definition of Workflow).


Publication of research results occurs at a late stage in the research cycle – prior to that, several project phases have need for access to certain types of information. From initial researching of the idea using information from the peer group, to finding out about competitive studies and teams as found in literature relevant to the idea, to looking for collaborators, sharing research data, undertaking original research and in some cases investigating commercial returns – each phase places a demand on access to different types of information, data and publications. Some of this information can be found in informal sources available through social media, others in the formal records of science. Have authors of research articles taken on board the tools available through social media and applied them to their current research procedures and workflows? A study undertaken by CIBER on behalf of the Charleston Observatory (CIBER 2011) addressed this question. In an analysis of 2,414 online questionnaire responses from authors of research reports who – a priori – were assumed to have an interest in social media, the results suggested that there was some way to go before there was universal adoption of social media within the wider research processes. Adoption by the STEM research community seemed at best peripheral in 2011. In the CIBER study, seven areas of the research life cycle were identified, and the social tools which corresponded with each were identified. Respondents then rated their use of social media according to their own workflow. The most popular social media tool was “collaborative authoring”. The next most popular use was “conferencing”. The areas of social media in descending order of popularity were:
– collaborative authoring (just over 50% used this process)
– conferencing
– scheduling
– social networking
– image sharing
– blogging, microblogging
– social tagging.
Most respondents – these were self-selected as being more likely to be pioneers of social media adoption – only made use of one or two of the above processes. The highest correlations occurred between those who used blogs, microblogs and social tagging (CIBER 2011). It appears that the scope of social tools – to enhance speed and extend the global communication of results – was not then integrated within the main workflow of the researcher/author community, the goals of which are to improve research output, disseminate the results and enhance personal esteem. Without

mandates being imposed by funding agencies, the application of social tools to the scientific publishing process remains (currently) experimental and marginal. There are alternative workflow processes which may be considered, ones which rely on the Internet, flexible media and full digital publishing for their sustenance. Some pundits feel that blockchain technology might provide a more efficient information and data dissemination system, one which is very different from the current linear workflow process. Working within a blockchain would mean that whenever researchers create or interact with content, in whatever way and at whatever stage, their interaction would be stored in a single decentralised platform without the intervention of publishers. Having information shared on the blockchain provides the opportunity for a marketplace for research in which labs or groups specialise in specific aspects of the research workflow. Some labs could collect the data, others could carry out the statistical analysis, generally accelerating the process towards collaboration, sharing and greater democracy. Building on datasets and related concepts such as “mash-ups” (combining several types of information content and sources in one single search strategy) has resulted in research activity embracing novel features of workflow management. Integrating UKWs into the complete work process in research could be feasible. Different skill sets and expertise are required at all stages of a research project. The technical and innovative research of the topic is only part of the workflow process – surrounding it with commercial, administrative, political and related contextual inputs takes research from a micro-level activity to the macro level, from an academic exercise to one which includes practical skills. The more this occurs, the greater the need for knowledge workers, with their specialist, practical and professional skills, to be included in the future STEM information structure.
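A minimal sketch of the record-keeping idea behind the blockchain scenario described above: each interaction with content, at whatever stage of the workflow, is appended as a block whose hash depends on everything that came before it, so the shared history cannot be quietly rewritten. This is a toy illustration of the principle only, not a description of any particular platform.

```python
import hashlib
import json
import time

def add_block(chain, event):
    """Append a research-workflow event, chained to the previous block's hash."""
    prev_hash = chain[-1]["hash"] if chain else "0" * 64
    block = {"event": event, "timestamp": time.time(), "prev_hash": prev_hash}
    payload = json.dumps(block, sort_keys=True).encode()
    block["hash"] = hashlib.sha256(payload).hexdigest()
    chain.append(block)
    return block

chain = []
add_block(chain, {"actor": "lab_A", "action": "deposited dataset", "object": "trial-7.csv"})
add_block(chain, {"actor": "lab_B", "action": "ran statistical analysis", "object": "trial-7.csv"})
add_block(chain, {"actor": "lab_A", "action": "posted preprint", "object": "draft-v1"})

# Any later tampering with an earlier event breaks the hash chain.
print(all(chain[i]["prev_hash"] == chain[i - 1]["hash"] for i in range(1, len(chain))))  # True
```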

Artificial Intelligence and Cognitive Computing

What was originally referred to as artificial intelligence began to be called “cognitive computing” by researchers in the 1990s, to indicate a science designed to teach computers to think like a human rather than simply building and improving on an established intellectual infrastructure. Cognitive computing refers to the development of computer systems, informed by biological input and modelled on the human brain. Examples of the route which such intelligent application of technology is taking include IBM’s Watson computer. It is a technology platform that uses natural language processing and machine learning to give insight into large amounts of unstructured data (see www.ibm.com/WatsonAnalytics). The important attribute

of Watson is not so much the way it outperforms individual specialists in solving problems but rather the way it does so – differently from the human approach. It provides answers through the sheer weight of analysis of vast amounts of historical data. The underlying technology leads to even more innovative developments in such areas as quantum computing, offering more powerful approaches to the analysis of large stores of data. A further example of the growth of interest in artificial intelligence can be found in voice-controlled systems. These include Alexa from Amazon, Cortana from Microsoft, Siri from Apple and Google Assistant. They all allow end users to gain information by speaking to the machine rather than pressing keys on keyboards. They each allow adjustments to be made in response to the speaking style and typical needs of the user. In some areas, technology will undoubtedly outperform the human innovator. However, if autonomous machines are set loose, moral codes of behaviour need to be translated into software code. These will need to reflect human qualities such as creativity, affection and regret. As a society, we could rue the day that artificial intelligence/cognitive computing makes decisions which impact on society’s general moral code. As noted by Carr (2016, 187), “The age of ethical systems is upon us”. Carr furthermore questions whether computers and artificial intelligence will ever become substitutes for the human brain and mind: “Since we are nowhere near disentangling the brain’s hierarchy . . . the fabrication of an artificial mind is likely to remain an aspiration for generations to come” (Carr 2016). Such developments and challenges, if integrated into a future STEM research process, will change the landscape for publishers, librarians, researchers and knowledge workers. Books and journals become less relevant as Big Data and its manipulation through AI take over. However, there is no consensus on what form this trend in Big Data manipulation will take. Data manipulation services are tools; it will require innovators and entrepreneurs to rework these tools into useful artificial intelligence services. The likelihood is that each subject area, discipline or large research project will develop its own portfolio of data services, cherry-picking the most appropriate processes from the available new software support algorithms. The result is that a pedestrian approach to scientific research may in future be overtaken by a close human/computer encounter which, besides improving the research process itself, will generate innovative ways to share and disseminate output which could be more open and less academia-focused than in the past. Still to be resolved are the constraints which need to be set on protecting the human cultural codes.
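The “sheer weight of analysis” point can be illustrated at toy scale: even a crude statistical pass over unstructured text surfaces recurring patterns without anything resembling human reasoning. The snippet below is such a toy – a simple term-frequency count over invented abstracts – and bears no relation to Watson’s actual pipeline.

```python
from collections import Counter
import re

# Invented snippets standing in for a large corpus of unstructured text.
abstracts = [
    "Gene expression changes were observed after drug treatment.",
    "The drug reduced expression of the target gene in all trials.",
    "No expression change was seen in the control group.",
]

stopwords = {"the", "of", "in", "all", "was", "no", "after", "were"}
tokens = []
for text in abstracts:
    tokens += [w for w in re.findall(r"[a-z]+", text.lower()) if w not in stopwords]

# Frequency alone already surfaces the recurring theme across the documents.
print(Counter(tokens).most_common(3))
# [('expression', 3), ('gene', 2), ('drug', 2)]
```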


In the meantime, the heavy emphasis being placed on artificial-intelligence-supported research by those countries investing most in it – notably China and South Korea – may provide an insight into the direction which cognitive computing and artificial intelligence will take in STEM.

Role of Machine Learning

Machine learning is a construct of artificial intelligence (AI) that is capable of learning from the new data fed into it. It does this using algorithms to interpret the data. The data can be either structured or unstructured, and the algorithms offer a variety of ways in which it can be handled so that the machine detects results not readily obtainable through conventional research techniques. As indicated above, several pundits (Lanier 2011; Carr 2016) point to the impact which AI, machine learning and the ubiquity of software and IT developments are having on the human creative process. Carr, in his book The Glass Cage, argued that new automation developments undermine fundamental cultural values as we migrate from introducing efficiencies in the workplace, based on powerful computing capabilities, to a subtle takeover of some of the decision-making activities which have been the preserve of the human mind. This is in part a result of software developers pushing the boundaries of what automation and AI can do. They push these boundaries without being constrained by those seeking to ensure that issues such as morality and respect for cultural values are prioritised. Businesses employing software developers seek productivity and profit, to reduce labour costs and streamline operations. The option of enhancing individual professional skills and expertise, or creating a “better world”, is not part of the software developers’ agenda. When it comes to the development of commercial software, abstract concerns about the fate of humanity cannot compete with the drive to save time and money for the funding organisation: “Many of the problems that bedevil automated systems stem from the failure to design human-machine interaction to exhibit the basic competencies of human-human interaction” (Carr 2016). Nevertheless, computers that can learn from their mistakes and construct new ways of approaching unfamiliar data can be an invaluable asset in almost every industry and area of life, including scientific research. They assist with data mining. The idea behind AI and machine learning is to build theories from learning rather than having to analyse data according to a predetermined theory. It is iterative in its approach. It goes through the incoming data multiple times until a successful pattern is identified. In this respect machine learning

differs from artificial intelligence more broadly, in that the latter is more structured in its data analysis whereas machine learning adapts to the available evidence. The corollary is that in future it may be necessary for society to place limits on the development and application of automation, machine learning and perhaps data mining more generally. A shift in emphasis may be needed, from what might be deemed “progress” and technological advancement towards a position in which social and personal improvements can be sustained and flourish. The implication, according to Carr, is that a new form of Luddite may emerge to ensure that robotics does not take over and change the direction which society takes. A policy of “acceptable social determinism” focusing exclusively on human welfare would emerge (Carr 2016, 163). This is a workflow issue which may emerge in the next decade as one which the STEM industry, professionals and UKWs will need to address as a basic issue of policy. It will have its effect on the structure which scientific publishing takes – whether it becomes fully digital, online, immediate and transparent, or whether it hangs on to the norms and values established in a print-dominated publication system.
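The iterative behaviour described above – repeated passes over the same incoming data until a stable pattern emerges, rather than testing a predetermined theory – can be seen in even the simplest clustering loop. A minimal sketch with invented measurements:

```python
# A bare-bones k-means loop: the 'pattern' (two cluster centres) is refined
# on each pass over the data until it stops moving. Data are made up.
data = [1.0, 1.2, 0.8, 5.0, 5.3, 4.9]
centres = [0.0, 10.0]                      # arbitrary starting guess

for _ in range(20):                        # repeated passes over the same data
    groups = [[], []]
    for x in data:
        nearest = min(range(2), key=lambda i: abs(x - centres[i]))
        groups[nearest].append(x)
    new_centres = [sum(g) / len(g) if g else centres[i] for i, g in enumerate(groups)]
    if new_centres == centres:             # pattern has stabilised
        break
    centres = new_centres

print(centres)   # ≈ [1.0, 5.07] – two groups found without a prior hypothesis
```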

SCIENCE POLICY ISSUES

Research policy can be established to meet broad contextual demands from society, which require considerations beyond the more immediate issues determining science funding. Science and research policies need to take into account broader, societal, sometimes global issues which might run counter to the old parochial and established ways of doing research and disseminating its results.

“Tragedy of the Commons”

A high-level approach includes the concept of the Tragedy of the Commons, as described by Hardin in an article published in 1968. This involves analysing the conflict over resources which occurs between meeting individual interests and resolving the common good. The term derived from a situation identified by Forster Lloyd in his 1833 book on population and was then popularised by Garrett Hardin in his classic essay The Tragedy of the Commons (Hardin 1968). Under the “tragedy of the commons”, public common land in medieval times was grazed upon by cattle or sheep until such a time as one extra beast tipped the scales on animal overpopulation and made the local commons useless for all. Nothing could survive on

the shared land; all cowhands/shepherds would suffer, not just the owner of the last beast added. “Ruin is the destination toward which all men rush”, claimed Hardin. This collapse would happen quickly and totally, and would be irreversible. In modern-day parlance, it features in concerns about global warming and its destruction of the world as we know it. The unwritten assumption by critics of the STEM communication process as it existed until the 1980s was that scientific publishing was a microcosm headed in the same direction – that at some stage the collective library budget, the finite resource for purchasing most scientific communications (or the commons), would be insufficient to cope with an ever-expanding STEM output (Hardin’s beasts). The system would self-destruct under the strain. As publishers of scientific publications followed their separate agendas, the stresses would become ever greater and the collapse of the system more imminent and catastrophic. The Tragedy conceptualises, in a way unintended by Hardin, the problem facing the pre-digital publication system, in which publishers produced books and journals on an uncoordinated basis, unrelated to the constrained budgets facing the world’s research libraries. The expansion of a nation’s R&D effort bears little relationship to the budgets being allocated by individual institutions to their libraries. Research funding, and the research articles it generated, witnessed a steady increase, whereas library funds for collection development were in relative decline. This indicated that there was something inherently flawed about the traditional, mainly serials-based publishing system, which perpetuated a disconnect between the forces of demand and supply. The two forces were uncoordinated. There was a disaster waiting to happen. However, this Tragedy of the Commons did not occur within STEM publishing. The switch from a print-based publication system to a hybrid and increasingly digital one has produced (albeit temporarily) solutions which have given flexibility to the buying system and enabled more information to be bought without causing the library system to collapse. “Big Deals” and “collective licences” are examples whereby libraries could get more bytes for their bucks. Nevertheless, despite these, there is still an underlying mismatch between the forces of supply and demand in the STEM publishing system. The demand side is severely hampered by being artificially restricted to a small institutional sector of society. The new digital age extends the time horizon over which a tragedy of the commons could take place. The finite resource is being extended as the Internet makes scarcity less of a factor. The paradigm has changed, and with this changed paradigm the forces of demand and supply for STEM research output are being realigned.
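The pre-digital squeeze described above is, at heart, a compound-growth mismatch. A purely illustrative calculation: the 3.5% annual growth in article output is quoted in the industry statistics later in this book, while the 1% growth in library budgets is an assumption made here solely for the sake of the example.

```python
# Illustrative only: article output growing at 3.5% a year against a library
# budget assumed (not stated in the text) to grow at 1% a year.
output_growth, budget_growth, years = 0.035, 0.01, 20

coverage = 1.0            # start: the budget buys 100% of the output it used to
for _ in range(years):
    coverage *= (1 + budget_growth) / (1 + output_growth)

print(round(coverage, 2))  # ≈ 0.61 – the same budget covers ~61% of output after 20 years
```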


The inclusion of knowledge workers within this realigned paradigm could be valuable in ensuring that any new STEM information system works effectively and efficiently and becomes sustainable. Increasing the annual audience for STEM (through graduates migrating from academia into private industry, and a general improvement in “science awareness” in society) adds a new dimension to balancing the supply and demand equation. As such, unaffiliated knowledge worker involvement and participation in the new STEM digital age is critical and valuable. Such issues need to be included within the policy debates on science and information policies by national and international funding agencies. These debates need to be about more than just open access: they should also address the inclusion of new information audiences within future STEM systems, to achieve equilibrium in the supply and demand equation, to ensure maximum and optimal dissemination of research results to the widest possible community and to ensure that “the tragedy of the commons” is not repeated in a digital environment.

National and Centralised Policy Directives

Long-term strategies relating to the output of research and its impact are being set by government agencies such as the UK Department for Business, Innovation and Skills (UKDBIS) and the Higher Education Funding Councils (HEFCs). These policies consider short-term economic conditions and national austerity programmes. They seek to balance the annual science budget with other immediate, competing social policies and missions. Both public and private funding agencies in the sciences support new measurement standards for assessing the impact and effectiveness of research outputs. Within the UK this includes HEFCE and its RAE and REF assessments. More recently, as part of the Higher Education and Research Act of 2017, the existing seven research councils have been brought together with the Higher Education Funding Council for England in a unified body. This is UKRI, or United Kingdom Research and Innovation, currently led by the immunologist Sir Mark Walport. The authority of science lies in its tradition and universal acceptance of quality judgement – notably through impartial assessments and refereeing. However, stimulating originality requires dissent, arguing against or building upon established processes through innovation. The tension facing researchers in balancing these two drives – acceptance of present systems and the need for originality – becomes more strident as improvements in technology tilt the balance in favour of innovation and originality. These two processes, tradition and dissent, are difficult for funding agencies to balance and resolve.


At a more practical level, a study of UK life scientists and their information support needs (University of Edinburgh 2009) highlights inconsistencies in information search behaviour within the same discipline, and even within research teams in the same discipline. Exerting a one-size-fits-all policy by science funding bodies runs counter to the practical realities of research activity at the coal face. Furthermore, a glimpse into the future undertaken by Elsevier with support from the market research company Ipsos MORI during 2018 (Elsevier and Ipsos 2019) suggested that during the next ten years the role of public-funded research will change, with fewer resources coming from public bodies and a greater share from the private sector. This will have implications for the published output of research, as issues of competition and corporate secrecy restrict the widespread dissemination of research articles to the wider public. In fact, the suggestion from this report is that the research article will diminish in importance as other media and information services become more prominent as communication channels in science – this from a study commissioned and undertaken by the world’s largest publisher of scientific research articles. The same study also pointed to the rise of the Far Eastern countries, notably China, in dictating the agenda for future research. Their assumed focus on applied research over blue-sky basic research will again tilt the direction which science, research and STEM will take in future years. There is little indication of international agreement on how the challenges to the world economy can be met, or on whether global alignment can be achieved. National political and social policies may determine the direction of STEM rather than a collaborative, international approach. Within such constraints, effective support from funding agencies requires more research by policymakers on user behaviour patterns in specific disciplines before fund allocations are made which might disrupt effective research activity. The Edinburgh study shows how, even within seven sectors of biomedical research, the variation in information needs is dramatic, with each subdiscipline requiring a different national funding approach.

Brexit

Finally, there are special instances where political issues have an impact on STEM – issues such as Brexit. The details of the decisions taken, whether a soft or hard Brexit, or whether a further referendum will be undertaken, are still to be clarified. Whatever decision is made, it will have an impact on the ability of UK-based institutions to share research projects and funding opportunities with EU member countries.


This issue has resulted in some of the more prominent names in science pleading for a deal on Brexit to avoid damaging British and European research. A letter in October 2018 to Theresa May and Jean-Claude Juncker was signed by 29 Nobel Laureates and six winners of the prestigious Fields medal. Science needs “the flow of people and ideas across borders”, it says. It comes as a survey found that many scientists are considering leaving the UK. According to the results of a study undertaken by Elsevier and Ipsos MORI in 2018/19, one of the projections for the future is for a disruption of EU funding for R&D within the community as a result of Brexit (Elsevier and Ipsos 2019). Sir Paul Nurse, one of the signatories and a Nobel prize-winner for his research on the cell cycle, said: “The message is, ‘take science seriously’.” Science can help tackle global challenges like treating disease, generating clean energy and guaranteeing food supplies, the letter says, but to do that it needs to bring together the most talented researchers. And it says Britain and the EU “must now strive to ensure that as little harm as possible is done to research”. The two main issues facing scientists in the UK are funding and freedom of movement. Over the years of Britain’s membership of the EU, its scientists have secured more European grants than the country has paid into Europe’s R&D budget. Sir Paul reckons that without a deal, British science could lose about £1bn a year. The other worry is that without freedom of movement, the brightest scientific talent may be put off by the bureaucracy of having to apply for a UK work visa. In an internal survey, staff at the elite Francis Crick Institute in London – the largest biomedical research centre under one roof in Europe – were asked for their views on Brexit. Of the roughly 1,000 scientists on staff, 40% are from EU countries, and a priority was to find out what they might do after Brexit. 78% of the EU scientists said they were “less likely” to stay in the UK, while 97% of those who responded said a no-deal Brexit would be bad for UK science. Sir Paul Nurse claimed that “At the moment Britain is at the top of the tree; we are considered widely around the world to be the best and we are in danger of losing that top position if we don’t get this right.” The many changes taking place in global funding support for scientific research by the year 2030 suggest that the prominent position the UK currently holds will not continue in the future.

Future of the Professions

The professions form a significant part of the UKW community that could be included within an improved STEM system. As with STEM itself, there are changes taking place which

challenge whether the existing infrastructures for the professions – learned societies – are fit for purpose in a digital world. As will be analysed in a later chapter, there are questions whether the traditional approach to running established and entrenched societies can survive, or whether such societies will find a new role in the digital future. The external developments which threaten to undermine STEM publishing also apply to the professions. As described by Susskind (2015), the old established and inward-looking professions are about to be overtaken by new approaches to the services which they traditionally provided to their members. The question is about the nature of the replacements which will emerge. This becomes a critical strategic issue, compounded in this case by whether UKWs will be included in the emerging systems (Susskind 2015). In farming out the more mundane traditional functions and services offered by established professional societies, new sub-professions could be created. It is possible that these could also have information profiles similar to those of UKWs. As such, the UKW sector would become a beneficiary of a restructuring of the professions. As explained by Susskind in their conclusions (Susskind 2015, 303–308), society faces a choice: whether the professions will be allowed to continue with aspects of enclosure, giving professional members protection to invest in and benefit from their knowledge base through imposing a charging mechanism, or whether to promote liberation, whereby the knowledge base of the profession becomes part of the commons, a public utility, more openly available. There is a democratising trend underway in the main learned societies which coincides with the emergence of disenfranchised knowledge workers within the scientific research process.

“Valley of Death”

Another feature of the STEM publishing system is organisational attitude towards new technology and its adoption in developing new STEM products/services. As indicated above, the traditional model for scientific publishing is based on print technology which goes back centuries. In future, more digital and networking technologies will be introduced. An analogy for this is that print-based publishing is on the downward slope of a valley, dictated by tradition and a print-based culture. On the other hand, the Internet and digital publishing have created new ways of disseminating information online, and these processes are increasing, driven by the forces of the “perfect storm”. This can be represented by the upward slope of the valley. The greater the rate of digitisation/networking, the steeper the upward slope.


At the bottom of the valley is where the two cultures collide. The challenge facing all STEM stakeholders is to take the best and most durable parts from the print culture, from the downslope, and integrate them with the best of the upslope technologies in a way which gives publishers and librarians viable and long-term survival, and researchers a valuable service. This is implicit in meeting the investors’ financial support for the industry – if the valley floor is not traversed in a logical and commercially sustainable way then their investment funds will be diverted elsewhere. The concept described is that of a “valley of death” (Fig. 12).


Fig. 12: The “Valley of Death” – the decline in print set against the growth in digitisation, with a cultural shift required to cross the valley floor. Source: Brown, “Access to Scientific Research”. Berlin: de Gruyter, January 2016.

This collision between print and digital without an effective transitional strategy means that UKWs are potential beneficiaries as old barriers to access are brought down and new more open digital opportunities are created.

Summary

It is no longer an option to view the creation and use of STEM information as an issue whose problems will resolve themselves gradually, and which can rely on established procedures in coming to terms with external challenges. Both sociological and technical change are dynamic, volatile and significant in the digital world. It is a “tsunami” rather than a gradual evolution. These various political and administrative trends are summarised in the following figure:


Fig. 13: Political/Administrative Trends – science policy, research policy, research assessments, universities and funding agencies, national R&D support, twigging phenomena, “openness”, interactivity and collaboration, disciplinary differences, researcher behaviour, alternative business models, publisher size and ownership, document delivery and PPV, learned societies, APCs versus subscriptions, library issues, pricing models and industry structure. (See Brown, D. 2016, p. 28)

The consequence is that the map of the user population for STEM in future needs to be redrawn. Opportunities arise for including a wider audience within STEM, and in so doing create new business models to enable a larger community to be reached. But before these opportunities can be realised it is necessary to see how robust the current STEM industry is and whether it is fit for purpose in addressing current problems, let alone in adapting to future challenges.

4 STEM DYSFUNCTIONALITY

Findings described in this section refer to STEM publishing as currently configured. The chapter assesses its sustainability and viability before focusing on external developments which will bring change to its operations. The strategic implications of such changes will be analysed through the prism of conceptual models used in other sectors of society but applicable to the unique features of the STEM information sector. It follows on from an earlier chapter where the status of the UKW researcher as user was analysed. This chapter examines the barriers facing researchers which result from the current structure of STEM publishing. Whether this industry structure is likely to accommodate the growing number of changes which face it, or whether it will be remodelled in line with dictates set by emerging disruptive technologies, will be reviewed.

STEM PUBLISHING SECTOR

Industry Facts and Figures

The following figures are sourced from a variety of external data sources – some reliable and based on public data, some anecdotal and more indicative. They bring together available data from an industry sector which has been sparing in its provision of information for strategic assessment purposes. The numbers given below are an indication of the size of the worldwide STEM publishing system and its component parts.

Statistics on the Size of the Scientific Communication Industry

Funding Scientific Research:
– In 2013, world expenditure on R&D amounted to US$1,478 billion, compared with $1,132 billion in 2007 (Unesco 2015).
– The investment in scientific research is approximately $178 billion (CEPA 2008).

Publishing Revenues:
– The scientific information industry worldwide – books, journals and databases – generated revenues estimated at $23.5 billion in 2011 and $25 billion

in 2013 (Outsell 2011; Outsell 2013), or approximately 1.5% of global R&D expenditure.
– The major part of the scientific information industry was the time spent in searching and accessing ($16.4 billion). $2 billion was in library access, and $1.9 billion in unpaid peer review (CEPA 2008).
– Scientific and technical revenue growth from 2010 to 2011 was 4.3% (to $12.8 billion), and medical grew 2% to $10.7 billion (Outsell 2011).
– Scientific, technical and medical journal revenues alone in 2013 were an estimated $10 billion (Outsell 2013). Though only 40% of total STEM revenues, the large and growing revenues from search engines account for much of the STEM business.
– Journal publishing revenues in the UK come from library subscriptions at academic institutions (68% to 75% of total) and corporate subscriptions (15% to 17%). This amounted to £112 million from universities and £75 million from all other sources (RIN 2012).
– Charges levied by publishers to enable authors’ articles to be read by all – gold and hybrid open access – and not just by subscribers to journals are approximately $172 million (Bjork 2012a), or 1.8% of journal subscriptions. APC charges vary from $2.5k to $3.4k per accepted article.
– One publisher which has switched from a journal subscription to a gold open access business model – Hindawi, based in Egypt – allegedly had a surplus of $3.3 million on revenues of $6.3 million in 2012 (showing that new business models can be as lucrative as traditional ones under certain circumstances).
– The commercial scientific journal publishing industry is dominated by five key players. These include Elsevier, which publishes over 2,500 journal titles, and Springer Nature with a similar number. Wiley-Blackwell, Taylor & Francis and Sage are also key players. These are commercial companies which have conflicting missions in meeting shareholder expectations as well as satisfying user demand for publications.
– There are many learned societies which balance their activities in support of education and training programmes for their members whilst also maintaining commercial viability from publishing activities. There are 315 members from 39 countries in the UK-based Association of Learned and Professional Society Publishers (ALPSP), but this is only a small part of learned society publishing worldwide (see ALPSP website, www.alpsp.org).
– The UK-based publishing industry occupies a major position within global scientific publishing and generates approximately £800 million in annual export revenues. Both the USA and the Netherlands are also prominent centres for commercial scientific publishing.


– In broad terms, 52% of global STEM revenues come from the USA, 32% from Europe/Middle East, 12% from Asia/Pacific and 4% from the rest of the world (Ware 2012a).
– The full cost of publishing a journal article is estimated at £3,000.

Users:
– There are 7.8 million researchers worldwide. Since 2007, the number of researchers has risen by 21% (Unesco 2015).
– OECD has reported 8.4 million researchers (or 6.3 million full-time equivalent) for 2011. This is mainly for OECD countries but includes a few key non-OECD countries (China and Russia).
– Unesco (2015) also provides a regional breakdown of these researchers (in ’000s) for the World, the United States, Europe, Asia and the UK.

– The Economist (October 19, 2013) puts the number of researchers at 6–7 million worldwide (Economist 2013).
– Academia.edu, a new information service, estimated the number of researchers at 17 million (though this figure includes postgraduate students).
– Ware (2012a) estimates the number of users at between 6.5 and 9 million worldwide.
– The EU remains the world leader for the number of researchers, with a 22.2% share. Since 2011, China (19.1%) has overtaken the USA (16.7%). Japan’s world share has shrunk from 10.7% (2007) to 8.5% (2013) and the Russian Federation’s share from 7.3% to 5.7% (Unesco 2015).
– Approximately 30 million worldwide are readers of science-related literature.
– There are 110,000 people employed in the STEM industry globally (40% in the EU) with a further 20–30,000 directly supporting STEM (Ware 2015).
– There were 132 million tertiary level students worldwide in 2004 (Weller 2011).
– 4,500+ research-based institutions exist in 180 different countries.


– 9,227 universities are listed in 204 countries (168 universities in the UK alone).
– The National Science Foundation (NSF 2014) estimates the number of workers in the science and engineering workforce in the US alone as being between 5 million and 19 million in 2010.
– NSF also estimates that 5.4 million college graduates are employed in science and engineering occupations in the US. This includes 2.4 million in computers/mathematical sciences; 1.6 million in engineering; 597,000 in the life sciences; and 320,000 in the physical sciences (NSF 2014).
– Scientists and engineers with S&E doctorates were split between 46% in the business sector and 45% in education in the USA (NSF 2014).
– Small companies are important employers of US S&E graduates – companies with fewer than 100 employees employ 37% of graduates.
– Unemployment rates for those in S&E occupations are lower than those for the overall US labour force – 4.3% in S&E compared with 9.0% in the US overall in 2010 (NSF 2014).
– Between 1960 and 2011 the number of workers in S&E occupations grew at 3.3% per annum, compared with 1.5% per annum for the overall US workforce (NSF 2014).
– 20% of researchers are repeat authors of journal articles (Ware 2015).
– There were about 2,500 million article downloads from publisher web sites each year (plus an additional 400 million from other web sites) (ICSTI 2011).
– The universe of “knowledge workers” is approximately 500 million (Microsoft 2010).
– Academia.edu has 5 million scientists as users of its services.
– 30 million article citations are made per annum.
Standing out from the above data is the sheer range in estimates – reliability and consistency in such data sources is missing. In this respect, the STEM publishing industry differs from many other sophisticated industry sectors where sharing of market and other non-confidential commercial data occurs to the advantage of participating companies. There is no independent collaborative global agency serving all types of STEM publishers (the International Association of STM Publishers notwithstanding) with a focus on collecting macro-level statistics and promoting common strategic approaches to ensure the industry’s ongoing viability in the face of generic new competition. The indications are that there is a core of 7–8 million researchers worldwide, but with concentric rings surrounding these which include postgraduates

(10 million), R&D workers (50 million) and knowledge workers generally (over 400 million). There are 260,000 R&D staff in the UK (Unesco 2015).

Output:
– Globally, 3 million STEM manuscripts are submitted each year to scientific journal publishers (ICSTI 2011).
– Only 1.85 million articles were published in 2012 (the rest were rejected in their current form, but are frequently and subsequently recycled into other journals).
– Article output and journal titles are increasing by at least 3.5% to 4% per annum (in line with research activity) (Mabe and Amin 2001; Mabe 2003).
– 28,135 scientific journals (refereed, scientific, still active) were being published in 2014. A further 6,450 were non-English (though they are available in 55,311 different formats).
– Approximately 500 new STEM journals are launched each year.
– The number of STEM publishers is estimated at between 5,000 and 10,000. There is a long tail of single-journal publishers who may not regard themselves as primarily being publishers.
– 650 publishers, responsible for 11,350 journals (or 40% of the total), are members of the main publisher trade associations (Ware and Mabe 2015).
– Of these, 480 publishers (73%) and 2,334 journals (20%) were not-for-profit publishers (Ware and Mabe 2015).
– 40 million articles are available digitally, back to the early 1800s.
– 2.5 billion documents are downloaded from publisher websites annually (Ware and Mabe 2015).
Overall, there are 1.85 million journal articles published each year in 28,000 journals. There are five to ten core commercial journal publishers, though these represent a small proportion of the global journal publishing industry of up to 10,000 companies. There is a long tail of small, esoteric and specialised publishers.

Editorial:
– 5,000 new editors are recruited by publishers each year, to add to the current total of 125,000 editors (ICSTI 2011).
– There are 350,000 editorial board members.
– Over 5 million referees are included in the quality control system.
– 30 million+ author/publisher communications take place each year.
– 230,000 open source projects are available.
The STEM journal publishing sector involves management of a large network of editors, referees and authors – a crucial administrative function performed by

traditional publishers. The costs of maintaining this network are supported by subscriptions and document delivery charges.

Intermediaries:
– Search services such as Google and Yahoo are expanding more rapidly than the STEM industry as a whole.
– Google alone accounted for 87 billion online search queries in 2009 out of a global total of 131 billion.
– In 2009 Wikipedia accounted for 55.6 million online searches.
– Traditional intermediaries in the academic sector – journal subscription agencies and booksellers – have faced a torrid time over the past two decades, and many have ceased operations (the most recent example being Swets, which declared bankruptcy in September 2014) (Against the Grain 2014).
– Disintermediation by publishers has become a commercial strategy adopted by a few dominant players.
Sources: based in part on a report at the ICSTI Summer Conference, Beijing, by Hugo Zhang (Managing Director of Elsevier Science and Technology, China) (Zang 2012); Ulrich’s list of titles; “The STM Report”, Ware and Mabe, STM Association, 2012.
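Some of the headline figures above can be roughly reconciled with one another. The following treats the quoted numbers as approximate and simply cross-checks them; no figures beyond those already given in this section are introduced.

```python
# Rough cross-checks of figures quoted in this section (all approximate).
journal_revenue = 10e9          # STM journal revenues, 2013 estimate
articles_per_year = 1.85e6      # articles published per year
apc_revenue = 172e6             # gold/hybrid APC charges (Bjork 2012a)

print(round(journal_revenue / articles_per_year))      # ≈ 5405 dollars of revenue per article
print(round(100 * apc_revenue / journal_revenue, 1))   # ≈ 1.7% – close to the 1.8% quoted

# US S&E workforce growth of 3.3% a year, 1960–2011 (NSF 2014):
print(round(1.033 ** (2011 - 1960), 1))                # ≈ 5.2-fold growth over the period
```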

In 2013, Elsevier undertook a comparator survey among key countries which provided the following as a summary of UK conditions:

UK Researchers: 262,303 researchers in 2011
– Increased at 0.9% pa (2007–2011)
– Ranked fifth among leading comparator countries in 2011
– Represents 3.9% of the global total (2011)

UK Higher Education (based) Researchers: 163,505 in 2011
– Increased at 2.1% pa (2008–2012)
– Represents 62.3% of UK Researchers (2011)

UK PhD Graduates: 20,076 in 2011
– Increased at 3.4% pa (2007–2011)
– Ranks fourth amongst comparator countries (2011)
– Represents 62.3% of UK Researchers (2011)


UK Research Mobility: 71.6% of researchers were internationally mobile (1996–2012)
– Ranks second among comparator countries
– 3.3% net total outflow of active researchers (Elsevier 2013)

Strengths of STEM Sector

As a service industry in support of the creation and dissemination of research results, the scientific communications sector has many advantages. It is a solid sector, with an established and stable history. There is a well-developed editorial and marketing infrastructure in place, provided by STEM publishers, who manage a large global network of contributors to the STEM information dissemination process. The sector has the support of authors and researchers worldwide on the basis that the system ensures the maintenance of quality and offers credibility as a reward. There are powerful brands in which authors and users place their trust. A network of reputable, in some cases physically spectacular and imposing, libraries act as custodians of STEM’s archive. Funding for research is provided by public and charitable organisations committed to ensuring that society benefits from investment in quality research projects. This translates into a conservatism which acts as a brake on dramatic changes being made to traditional ways of operation. This conservatism is reinforced by the very structure of the STEM publishing system, which combines a few “boulders”, or well-established large publishers, with many “pebbles”, or small publishers (Leadbeater 2009), as indicated in the section on the Long Tail (Anderson 2004). The large publishers have too much at stake to risk changing a winning business formula. The small publishers lack the economies of scale to create sustainable alternative business models. The new entrants coming on the scene are the “containers”, gathering in one place the “boulders” and “pebbles” to create a new paradigm. STEM is therefore faced with a confusing cocktail of new threats. Whilst its players focus on immediate actions to improve operational efficiencies, they neglect tackling out-of-the-box strategic challenges: “The debate between quick scientific gains and long-term public investment in basic and high-risk research to enlarge the scope of scientific discoveries has never been so relevant” (Unesco 2015). This has led to the outcry in the formal and informal literature that stakeholders in the dissemination of STEM output have become dysfunctional.


Fig. 14: The STEM publishing industry in context – researchers worldwide 7.7 million; global R&D spend $1,146 billion; STEM publishing industry $26 billion; STEM journals $9.4 billion; STEM articles 1.8 million pa (+3.5% pa growth); academia, corporate R&D and research institutes; worldwide academic/research institutions 4,500 to 10,000; readers worldwide 30 million; knowledge workers worldwide 500 million (est.), of which USA 50 million and UK 11 million. (See Brown, D. 2016, p. 42)


Underpinning the current STEM publishing system are business models which were developed under the (print-based) paradigm. These models come in different forms from different agencies in the STEM structure. They are summarised in the next section.

BUSINESS MODELS

Any business model in operation may no longer be viable if any of the following conditions apply (Dawson et al. 2016):
– Customers have to cross-subsidise other customers
– Customers have to buy the whole thing for one small bit they may want
– Customers cannot get what they want where and when they want it
– Customers get a user experience that does not match global best practice
At least two of the above conditions apply to STEM, potentially three. In addition, the main STEM business models do not take into account the external developments identified earlier which impact on STEM, and as such they are rendered additionally vulnerable (see “The economic essentials of digital IT strategy”, McKinsey Quarterly, March 2016). There are different corporate missions behind each of the stakeholders which determine the extent to which profit and efficiency maximisation is sought. These range from the high profit margins sought by large commercial STEM publishers to the more socially-focused projects initiated by researchers whose aim is to develop effective informal communication channels, collaboration and sharing which become part of the gift economy. There are also several business models employed by different sectors of the STEM industry. The following lists these models – not only those of STEM publishers but also those employed by librarians, intermediaries and researchers.

Publisher Initiated

Serial subscription and site licensing model (including e-document downloads):
– The structure for publication of research articles and conference proceedings has evolved over three and a half centuries. The research journal has become the main delivery vehicle for updated reporting of scientific results.
– Subscriptions (income from sales of journal titles) and licences (contracts between publishers and libraries) are the main business models used.
– These subscriptions and licences depend for their success on the health of the institutional research library budget.


– The subscription model faces a budgetary challenge as the gap between scientific output (and prices) and library budgets grows.
– Nevertheless, subscriptions/licences remain the base business model from which several large commercial publishers generate healthy profit margins (in some cases 30–40% of revenues).
– The impact of OA on the subscription business has not been great thus far. It is believed that a considerable degree of duplication (across both the green and gold routes) limits the inroads which OA has been making overall.
– Built on the subscription/licensing system are the so-called “big deals”, which offer libraries much more of a publisher’s output for a (small) additional charge.
– Included within a subscription or licence are details of who is entitled to download electronic versions of articles from the licensed package. This restricts access to the main patrons of a research library, and excludes a wider audience such as knowledge workers generally.
– As such, subscriptions/licences create a padlock on research literature, open only to those few organisations which have the necessary funds to gain access.
– OA has created a situation whereby the combination of open access and subscription business models means that publishers are receiving more, not less, revenue. This occurs through the interaction of the elasticities in supply and demand of the two business models.

Online individual document purchase (from the publisher site):
– Individuals not entitled to access online journals can purchase an article directly from the publisher’s website by paying an article charge set by each publisher.
– Publishers are hesitant in supporting sales of individual articles in case doing so cannibalises the subscription business model, though there is insufficient evidence to suggest that this is the case.
– There is often a high price deterrent set by publishers which detracts from more sales of individual articles – individual article prices delivered electronically are sometimes more than $30.
– Publishers face higher internal administrative costs associated with collecting large numbers of small payments for individual article sales, which puts stress on their existing institutional-focused operational structures.
– A price reduction could help establish the elasticity of demand, leading to optimal business strategies being developed to cater for the long tail of knowledge worker demand (a simple illustrative sketch follows the DeepDyve figures below).


– DeepDyve has disclosed document ordering activity on an unnamed but representative publisher’s STEM platform, covering total site traffic per year, non-institutional traffic per year, document delivery sales per year, the average article price, the number of document delivery transactions, and the resulting document delivery conversion rate of non-institutional site visitors (correspondence with William Park, CEO, DeepDyve, September 2010).
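The pricing point in the final bullet above can be made concrete with a small, purely illustrative calculation. The following Python sketch assumes a hypothetical pay-per-view price cut and a hypothetical constant price elasticity of demand; none of the figures are drawn from DeepDyve or from any publisher, and the elasticity value is an assumption chosen only to show the mechanics.

# Illustrative only: how a pay-per-view price cut might affect article revenue
# under an assumed (hypothetical) constant price elasticity of demand.

def projected_revenue(price, baseline_price, baseline_sales, elasticity):
    # Constant-elasticity demand: sales scale with (price / baseline_price) ** elasticity.
    sales = baseline_sales * (price / baseline_price) ** elasticity
    return price * sales, sales

# Hypothetical starting point: $30 per article, 1,000 purchases a year.
baseline_price, baseline_sales = 30.0, 1_000
elasticity = -1.5  # assumed: demand is fairly price-sensitive

for price in (30.0, 20.0, 10.0, 5.0):
    revenue, sales = projected_revenue(price, baseline_price, baseline_sales, elasticity)
    print(f"price ${price:>5.2f} -> ~{sales:,.0f} sales, revenue ~${revenue:,.0f}")

With an assumed elasticity below −1, total revenue rises as the price falls – the essence of the wider-sales/lower-margin argument made later in this chapter; with an elasticity closer to zero the same cut would simply reduce revenue, which is why establishing the actual elasticity matters.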

The Chief Executive Officer of DeepDyve, William Park, also claimed that the estimated proportion of visitors to the DeepDyve site that were non-institutional ranged from 35–60% (Park 2009). Many of these could be counted as UKWs. A more acceptable pricing strategy could have transformed a proportion of these latent non-institutional visitors into individual article purchasers. DeepDyve has since refocused its attention from academia towards SMEs in the light of difficulties in reaching the established market with its database of 15 million articles from 12,000 titles (as of October 2017), 75% of which are STEM titles.

Read and Publish
In the meantime, publishers are being offered an open access business model which locks them into OA contracts and article delivery. European policymakers are promoting this, with the result that substantial revenue reductions could face publishers. These deals are the so-called Read and Publish (RAP) agreements. Publishers are taking what appears to be a cautious approach to them. The risk to publishers is that, by flipping journals in one major region to OA, they will devalue the subscription bundle for subscribers in general. If buyers sign up to pay more to publishers under RAP models, then pay-to-read subscription models will gradually fade away. However, the cautious nature of policymakers is such that a global flip that avoids the need for a transitional higher-priced RAP model seems highly unlikely.


Library Initiated

Institutional Repositories (IRs):
– Research institutions have established digital institutional repositories (IRs), often maintained by the library, which include all research material published by in-house research staff (Crow 2002).
– Theses, notably e-Theses, are an important element in most university-based institutional repositories. The British Library offers EThOS, a national collection of the university e-Theses available on university institutional repositories in the UK.
– Items deposited in IRs can usually (subject in some cases to a time moratorium) be accessed by anyone – affiliated as well as unaffiliated – for free.
– Researchers often find it time-consuming to deposit their research results in their local IR and see no personal benefit from doing so.
– To overcome this reluctance to make such deposits, researchers are often mandated by funding agencies to submit their outputs to preferred institutional or subject-based repositories.
– Institutions gain value from exposing to the world the quality of research being undertaken by their researchers – a reputational rather than financial service.
– The potential conflict with publishers over the impact of IRs on articles made available by publishers was explored in an EU-funded PEER research project, with no conclusive resolution (Wallace 2011).

Document Delivery:
– Printed copies of articles can be purchased by individuals through specialised document delivery centres. National scientific agencies have been crucial in establishing such operations, notably the British Library Document Supply Centre (Appleyard 2010) in the UK, but also in France, Canada and South Korea. Private companies have also been established in the past to deliver documents on demand.
– E-delivery of such documents is possible with the agreement of the publisher (with charges including processing costs and royalties, as against processing costs alone for direct supply by the publisher to the end user).
– National and international formal document delivery traffic has declined rapidly during the past decade (by 75% in the case of BLDSC).
– Document delivery centres perform a valuable service by creating a comprehensive catalogue of all electronic articles (such as the BL’s ETOC) and establishing a centralised one-stop purchasing centre (thereby avoiding searching across thousands of publishers’ web-based silos).


Interlibrary Loans (ILL):
– Sharing published resources between research libraries is mainly through loans of physical books rather than delivery of research articles.
– The ILL process imposes a significant cost burden on large, comprehensive research libraries in supporting those libraries with smaller collection budgets.
– It represents more of a fall-back for libraries if they cannot serve their local patrons from any other source.

Walk-in access:
– Walk-in access gives an individual researcher open access to a library’s physical/printed collection (different rules apply for walk-in access to digitally held material).
– Walk-in procedures for e-articles are currently being negotiated but remain subject to licensing terms agreed with the publishers.
– There can be geographical and transport barriers to external researchers accessing such printed collections – the physical distance to/from home or one’s office, for example.

Alumni:
– Several projects involve experimentation with delivery of publications to a university’s alumni without authentication barriers being put in place (ProQuest Udini).
– In the past alumni have mainly been excluded from publishers’ licences and have not been able to retain the access rights they enjoyed whilst they were members of the institution’s library.

Public Library Access:
– The Publishers Licensing Society (PLS) is conducting trials to see whether a sustainable business can be developed whereby public libraries can be brought within nationally negotiated licences (Faulder and Cha 2015).
– Similarly, Jisc Collections is conducting trials to see whether SMEs can be included within national academic licences.

National Licensing Models:
– Problems in organising a feasible structure and getting buy-in from a representative set of publishers and libraries have prevented national licences being established in England. Scotland, however, has a scheme in place.
– Complex issues such as cross-institutional use, home access, etc. would all be resolved through a broad licence model.


– National licences require coordination and central funding – in effect a new collaborative business model – and would only operate in select areas and with commitment from all sectors, including public funding agencies. Though there is some movement in redefining the scope of licences, these are long and protracted discussions with publishers, who need to protect their commercial interests by not allowing access rights to be given away too freely.

Intermediary Initiated

Subject Based Repositories:
– Related to the above, there are a few subject areas where preprints of articles are deposited and made available for free. These include biomedicine (PubMed), physics (arXiv) and the social sciences (SSOAR).
– Funding comes from a variety of sources.
– There are several subject-based site licensing initiatives. SCOAP3 is one such scheme, under which libraries and publishers agree to worldwide free access to a selection of physics journals (Anderson 2008).
– This differs from institutional repository access in that it involves a collection of publishers rather than a consortium of libraries.

Pay-per-view (PPV):
– Pay-per-view (PPV) has the potential to reach beyond the library into wider knowledge worker sectors, though the barrier is often the high price set by publishers for the delivery of individual articles through intermediaries.
– There are experiments taking place, DeepDyve being an example, where articles are “rented” rather than “bought”.
– A key feature of these PPV initiatives is that they could address a new market sector for scientific publications if the issue of acceptable pricing levels could be resolved.

Premium Subscription Services (Freemium):
– These usually focus on specific disciplines, with a broader range of information services being offered – not just journal articles. They include a package of services.
– The business model involves offering a basic level of service for free, but charging for access to premium service levels.


Intermediaries have been struggling to maintain a niche for themselves within the printed business model for STEM, and have suffered disintermediation for decades. They have been squeezed as both publishers and libraries vie to promote their own commercial interests or protect their budgets. However, this does not preclude intermediaries developing new niches with new business models in the future, particularly in partnership with the research community and employing new technology.

Author/End User Initiated

Social Networking and Social Media:
– “Perfect Storm” forces are attracting greater adoption of Web 2.0 services.
– Social networking may be the process whereby scientific communication is interfaced with the needs of knowledge workers in the future.

Mixed Initiatives

Open Access:
– Open access allows the end user “free at the point of usage” access to published articles.
– There are a number of open access systems for STEM material: green, gold, platinum, grey or hybrid routes (Guedon 2004; Harnad 1994).
– The green (author self-archiving) model operates within the institutional repository setting, enforced in some cases by mandates issued by funding agencies to ensure deposit of the research output (see above under Library Initiated).
– The gold (author pays) model requires an author (or their affiliation) to pay a fee to the publisher to have their article published.
– The hybrid model involves a journal offering a subscription model within which gold articles can be included (paid for by the author) and accessed for free. However, it opens itself to “double dipping”, with publishers charging twice for access to the same article.
– The EC pilot project for post-FP7 grants rejected hybrid journals (Plan S). Norway has gone the same route.
– The reason why many hybrid journals are sought after by researchers is not quality, but prestige, status and visibility. One way to improve Plan S would be to stress this crucial distinction.


– An unintended consequence of Plan S might be to drive faculty to turn to their universities more for APC support than they already do, so as to preserve their freedom to publish wherever they feel will most likely advance their careers.
– Several decades after the launch of open access services, only a minority of articles are open access.
– Nevertheless, open access has the virtue of linking with broader developments in society – open information systems generally.
– However, open access, particularly of the gold variety, lays itself open to fraud and unacceptable quality control. Journals are being launched, variously described as “predatory” or “illegitimate”, which exploit the charging mechanism underlying gold open access.
– Open access is also challenged by commercial organisations, particularly operations such as Sci-Hub and ResearchGate, which sail close to the wind legally.

Across the above business models there is growing support for more openness. This is balanced in the research sector only by a traditional conservatism and reliance on the established reward system. Should the reward system affecting researchers change, so that a formal published article in a respected refereed international journal no longer carried the weight it has in the past with funding agencies, this would do much to create a swing to open access publishing at the expense of some (but not all) subscription-based publishing.

A DYSFUNCTIONAL STEM

Tensions Within the Existing System

Scientific journal publishers provide formal publication of research output in a structured and quality-controlled way, and research libraries became the agencies allocated funds with which to purchase published scientific research. This dualism became the principal operational infrastructure enabling a smooth transfer of high-level knowledge within the scientific community. However, in so doing it has operated under two conflicting business cultures.
– On the one hand, the published scientific output, or “Supply”, continues to grow. The leading driver for the growth in article supply is competition between researchers attempting to advance their careers by publishing more and better articles. It can lead to unexpected consequences such as “salami publishing” (splitting research results into many separate publications),


duplication of research outputs, plagiarism, etc. These increase the supply of available publications, but essentially supply is stimulated by competition between researchers for the visibility from which to compete for research funds. Within a society which exploits the results of research activity to improve economic performance and international competitiveness, increased R&D funds have been made available. As reported in the Unesco Science Report in 2015 (Unesco 2015), most countries see research and innovation as key to fostering sustainable economic growth and furthering development. The combination of micro-level competitiveness (at the personal level) and macro-level support for research (at the national level) is a powerful stimulus for continued growth in research output, at rates of over 3.5% per annum overall (Mabe and Amin 2001; Ware 2012b).
– “Demand” is driven by an unrelated set of issues. Research publications are purchased through the library’s collection development budget. As science continues to grow (Mabe 2001; 2003), as student numbers worldwide increase at about 6% per annum (Unesco 2010), and as technology offers an ever-widening range of applications arising from research, research library budgets would need to grow at a comparable rate – in excess of 5–10% per annum – to cope with demand. This has not been achieved. At best, library budgets have remained static in real terms in recent decades; at worst, their buying power has diminished despite serial collection budgets being prioritised at the expense of other library activities.

There has been an ongoing increase in the unit pricing of STEM publications over the years despite the above imbalance between supply and demand. The rate of price escalation can be seen from the following graph of book and serial costs in the UK from 2002 to 2012, compiled by the Library and Information Statistics Unit (LISU 2012) based at Loughborough University. The graph shows the decline in real spending on books and journals in the UK university sector despite upward growth in academic staff and students – the people being served by the libraries. Costs have increased in tandem with the divergence of supply and demand forces. This is in part because the move towards digital publishing has added costs, as it became necessary to offer additional services, including the publication of datasets alongside the journal article and new standards for content capture and metadata inclusion. The percentage growth in periodical prices by discipline in recent years has also been collated by LISU and shows that price increases by the main publishers serving academic libraries have been between 5% and almost 8% per annum for the main STEM journals.
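A rough, purely illustrative calculation shows how quickly these growth rates diverge. The Python sketch below assumes output growing at 3.5% per annum against a library budget that is flat in real terms; the starting values are arbitrary index numbers, not figures taken from LISU or any other source.

# Illustrative: the widening gap between research output (growing ~3.5% pa)
# and a library budget that is flat in real terms, both indexed to 100 in year 0.

output_growth = 0.035   # per annum, as cited for STEM article growth
budget_growth = 0.0     # assumed flat in real terms

output_index, budget_index = 100.0, 100.0
for year in range(0, 21, 5):
    gap = output_index / budget_index - 1
    print(f"year {year:>2}: output index {output_index:6.1f}, "
          f"budget index {budget_index:6.1f}, gap {gap:+.0%}")
    for _ in range(5):
        output_index *= 1 + output_growth
        budget_index *= 1 + budget_growth

After two decades at these assumed rates the volume of output has roughly doubled while purchasing power has stood still, which is one way of reading the “frustration gap” discussed later in this chapter.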


Fig. 15: Total expenditure on books and periodicals (total and real book spend, total and real periodical spend, in £ million) set against academic staff and student numbers (000s), 2002–03 to 2011–12. LISU, Loughborough University, UK.

Research library budgets in the UK have increasingly been committed to purchasing serials in the hard or natural sciences, largely because of increases in journal prices in these disciplines. This contrasts with their commitment to purchasing monographs, which has declined in relative terms over the years. The main problem is that research libraries’ funding sources are divorced from the production and output of research results. As such, research libraries have difficulty maintaining a credible collection in an era of “digital information overload” (CEPA 2008; CEPA & Ware 2011).

Further Paradox in STEM Publishing

Traditional scientific publishing displays other distinctive characteristics, unique to STEM and different from other trade publishing sectors.


Average prices             2013        % increase    2012        % increase    2011
                                       13 over 12                12 over 11
Social sciences            £594.96        5.7        £565.51        4.4        £520.62
Science                    £1,672.37      4.9        £1,560.87      5.5        £1,429.52
Medicine                   £980.40        6.9        £1,035.86      6.2        £819.53
Technology                 £979.28        5.3        £1,112.42      4.6        £867.22
Humanities                 £230.10        7.7        £212.30        5.0        £199.14
General                    £449.37        6.9        £270.93        3.6        £281.73
No of Titles (1)           27,117                    24,470                    24,343
Average all subjects (2)
  U.K.                     £817.75        5.7        £792.98        5.1        £686.29
  USA                      $1,188.63      6.5        $1,106.06      4.5        $1,023.79
  EURO Region              €884.32        4.7        €909.77        5.5        €828.76

Fig. 16: Average growth in periodical prices in the UK by subject area. LISU, Loughborough University, UK.

– There is a “closed circle” of scholarship whereby authors seek recognition and esteem for their work rather than expecting a financial return from publication. They give up rights over their published results to third parties (publishers) in order to have their research made available to their peers throughout the world. Achieving worldwide recognition has been the main reward sought, as this often becomes the source of additional funded research, academic tenure, visibility and/or personal career advancement. An Ithaka S+R survey involving almost 7,000 UK academics in 2015 identified that “since 2012 there has been a substantial increase in the share of academics that shape their research outputs and publication choices to match the criteria they perceive for success in tenure and promotion processes” (Ithaka 2016). The problem is that the success criteria revolve around citations and the so-called Journal Impact Factor (JIF), which is increasingly criticised as being at best an inelegant measurement and at worst destructive to science (Bohannon 2016).


In the meantime, no money changes hands for this service of allegedly indicating journal quality – it is a “gift economy”.
– Parallel to this is payment for the quality control, production and dissemination of research results. This is handled by research funding agencies, publishers, subscription agents and institutional libraries, the latter being provided with the funds with which to buy the output from the publishers. Money is at the heart of this process. It is a “transactional economy”.
– In the UK, 54% of the research undertaken is funded by industry. In the USA it is approximately 66%, with over 70% of the research actually performed in industry (NSF 2014). Despite this, most research articles are written by academia-based authors.
– At the same time, authors are also readers, and in this capacity seek unrestricted access to publications from other authors. Whilst they may accept some control over access to their own publications, they want a different publishing model – more open – for everyone but themselves.

The proposition is that there is a migration from a transactional to a gift economy as the sector moves from meritocracy or elitism towards openness, transparency and democracy. This has implications for the market structure for STEM publications in the future, making the current business model unsustainable, particularly during the period of migration (see “valley of death” above). It also affects whether UKWs can be brought into the mainstream research system. The essential weakness in the demand side of the STEM equation was highlighted in the commentary accompanying the Periodical Prices analysis of 2018 in the US:

Whether applied to a failed business arrangement or an ancient form of torture, “death by a thousand cuts” signifies a larger outcome that occurs as the cumulative result of multiple smaller events. For the periodicals market, the lingering impact of the Great Recession is one such gash. Libraries receiving a declining percentage of overall university budgets, another. Promotion and tenure systems based upon publication, yet another. Research output evaluation based upon citation metrics, 6% serials cost inflation each year since 2012—cut, cut, cut. (“Death By 1,000 Cuts – Periodicals Price Survey 2018” in The Library Journal, by Stephen Bosch, Barbara Albee, and Kittie Henderson, April 23, 2018)

The STEM publication system is out of balance, which has led to emotional commentaries by all the main stakeholders.


Antagonism and Mutual Recriminations

The current STEM system is characterised by increasing antagonism. One contributor to the debate (Wikoff 2015) claimed:

I would love to see publishers, vendors, authors, and librarians sit down and talk straight about what can be done to reach that shared goal because right now, it feels like we are on the edge of a freefall where academic publishing is increasingly not sustainable, and all the parties are just more entrenched than ever. It’s very, very hard to get people to set that stuff aside and work together towards making it all work. I don’t know if it can be done, but we are not getting there the way we’ve been operating up to now – in a competitive, antagonist way.

In the USA, Harvard University threw its weight behind complaints against STEM publishers. Its concerns were expressed in a letter sent in April 2012 by Harvard’s Faculty Advisory Council alleging a crisis in the acquisition of scientific journals. The letter – entitled “Major Periodical Subscriptions Cannot Be Sustained” – reported an “untenable situation facing the Harvard Library” in which “many large journal publishers have made the scientific communication environment fiscally unsustainable and academically restrictive”. A few scientific journals, it said, cost upwards of $40,000 a year each: “Prices for online content from two providers have increased by about 145% over the past six years, which far exceeds not only the consumer price index, but also the higher education and the library price indices”. It concluded that “Major periodical subscriptions, especially to electronic journals published by historically key providers, cannot be sustained” (Harvard University 2012).

Other complaints come from the research community. For example, in 2013, four academics from the University of Leicester’s School of Management attempted to debate the steadily increasing prices of publications in a journal published by Taylor and Francis, only to have the article censored by the publisher. This led to the editorial board threatening to resign unless the article was reinstated. The debate was to appear in the journal Prometheus: Critical Studies in Innovation; its title – “Publisher, be Damned! From Price Gouging to the Open Road” – included comments which criticised the excessive profits made by commercial publishers.

Equally difficult have been relations between some editorial boards and their publishers. Elsevier saw the editors of its Journal of Informetrics resign in protest at a business model they considered too expensive; the editorial board set up a competing journal. This illustrates the different positions which authors, editors and publishers are taking over STEM business issues.


The “Frustration Gap”

The imbalance between supply and demand forces can be illustrated by comparing a country’s national research and development budget with its expenditure on research libraries over the same period. For the United States such a comparison is shown below, where the growing gap between the supply and demand systems becomes evident. The divergence of the two lines – the supply of, and effective demand for, scientific material – has been referred to as the “frustration gap”.

A further metric showing how librarians have difficulty coping with the growth in publishing output is the falling share of the library budget. Between 1994 and 1999 the library budget as a percentage of the institutional budget within leading higher education (university) institutions in the UK was approximately 3.8%. Since then there has been a gradual decline each year, down to 3.4% in 2005. As far as Research Libraries UK (RLUK) members are concerned, the reduction has been equally marked, having fallen from 2.09% in 2007 to 1.82% in 2009 (SCONUL Annual Returns). The following figures illustrate the decline in the library share across the main types of higher education establishment in the UK. In the USA there has been a similar downward trend, with library expenditure as a percentage of institutional spending falling from 3.83% in 1974 to 1.8% in 2011 (ARL 2008; Davis 2014).

Whether these reductions reflect increased competition for the institutional budget, or the declining relevance attached to the research library and its role within the institution, is difficult to judge. But there is concern that libraries cannot use the same financial metrics to advance their case as other departments within the same research institution (CIBER 2009). The payback which comes from an expansion of the library is less easy to demonstrate, whereas other departments can point to an increase in student numbers or research grants to support their future funding programmes. Libraries are an intangible infrastructural support service – in many cases seen as desirable rather than as an absolute necessity when it comes to the institution’s annual budgeting cycle.
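The scale of this decline can be expressed as an average annual rate. The short Python sketch below uses only the US figures quoted above (a 3.83% share in 1974 falling to 1.8% in 2011) and simple compound arithmetic; it is an illustration, not a reworking of the underlying ARL or NSF data.

# Illustrative: the average annual rate at which the library share of
# institutional spending declined, using the US figures quoted in the text.

share_start, share_end = 3.83, 1.8   # per cent of institutional budget
years = 2011 - 1974

annual_change = (share_end / share_start) ** (1 / years) - 1
print(f"Implied average change in library share: {annual_change:.1%} per year")
# -> roughly -2% per year, i.e. library budgets grew about two percentage
#    points more slowly than their parent institutions, compounded over 37 years.

The same arithmetic applied to the UK figures quoted above (3.8% falling to 3.4%) gives a decline of a broadly similar order.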

The “Serials Crisis”

The combination of the above has led to the “serials crisis” facing the scientific community. Governments in the US, UK, Australia and Canada, and the European Commission, have all expressed concern about the problems in disseminating research results.

Fig. 17: US Academic R&D expenditure and ARL library budgets, 1976–2003 – growth in research and library spending shown as academic R&D (constant $) against average ARL library expenditure (constant $), indexed to 1976 = 100; the widening divergence is annotated as the main cause of the crisis. Sources: ARL and NSF.


Fig. 18: Percentage UK institutional budget spent on their research libraries. SCONUL 2006.

The crisis is financial. Journal subscription prices are rising at a faster rate than most other indices, but the price rises set by publishers are neither consistent nor universal. Bergstrom and Bergstrom (2004) examined the rates charged by publishers, comparing those of for-profit companies with those of not-for-profits. The authors concluded that a journal page published by a for-profit publisher is between three and five times as expensive as a journal page published by a not-for-profit publisher. Highly cited journals are perceived to be of better quality, which allows for-profit publishers to charge even higher prices for such journals (McCabe 2004; Dewatripont et al. 2006). There are no consistent standards for the price setting of journal titles; pricing is based on what the publisher feels the market can bear.

Several measures have been taken to reduce the financial pressures on libraries. “Big deals”, enabling libraries to get more bytes for their buck, have been introduced by larger publishers and publishing consortia. These allow more journals to be delivered at lower per-unit costs, provided the library commits to taking more of the publisher’s output. It means that the total amount paid to each publisher increases – the extent being determined by how large the publisher’s list is and what proportion had already been subscribed to by the institution. Publishers have also offered the ability for end-users to buy


individual articles on demand from their websites, at what is generally considered by both the library and end-user communities to be extremely high prices (see the earlier section on Business Models). Coming to terms with the imbalance between supply and demand is essential for a healthy information service. Unfortunately, an open-minded approach to relating what is perceived as a “need” to an appropriate “serviceable system” is not being pursued by traditional STEM stakeholders. Many of the factors outlined above create concern and criticism about the current state of STEM information. Key among them is the corporate mission being pursued by the large commercial publishers.
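The “big deal” trade-off described above – a lower cost per title but a higher total payment – can be illustrated with a short, purely hypothetical calculation. The Python sketch below uses invented figures for list size, subscribed titles and prices; it is not based on any actual publisher offer.

# Hypothetical "big deal" arithmetic: the cost per title falls while the
# total amount paid to the publisher rises.

# Before: the library subscribes title-by-title (invented figures).
subscribed_titles = 300
average_title_price = 1_200          # pounds per title per year
core_spend = subscribed_titles * average_title_price

# After: a big deal bundles the publisher's full list for a surcharge.
bundle_titles = 2_000
bundle_surcharge = 0.15              # assumed 15% on top of existing spend
bundle_spend = core_spend * (1 + bundle_surcharge)

print(f"Title-by-title: £{core_spend:,.0f} for {subscribed_titles} titles "
      f"(£{core_spend / subscribed_titles:,.0f} per title)")
print(f"Big deal:       £{bundle_spend:,.0f} for {bundle_titles} titles "
      f"(£{bundle_spend / bundle_titles:,.0f} per title)")

The cost per title falls sharply, which is how big deals are marketed, while the cheque written to the publisher still grows – exactly the dynamic described in the preceding paragraphs.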

Investors Versus Customers

A significant part of scientific journal publishing is commercially driven, dominated by a handful of international journal publishers for whom the interests of corporate and private shareholders are just as important as satisfying the needs of users in the research community. Even those organisations which nominally operate under the banner of “non-profit” (such as learned societies) often pursue commercialism (“surpluses”) as intensely as their for-profit competitors. It is inconceivable that the interests of shareholders seeking optimal financial returns can be reconciled with those concerned with the efficacy of the mechanism for supplying a “public good”, given the practices currently in place.

Commercial journal publishers must persuade their owners, investors and the financial sector that they are acting in their best interests. The financial sector scrutinises company balance sheets to see whether they meet short-term commercial expectations and are suitable candidates for investment. Whilst publishers focus on editorial strengths, global marketing coverage or new product launches, the City is interested in financial returns, profits, margins and business relationships. It is the financiers who are important in determining where invested funds go, particularly as several of the large publishers are owned by venture capitalists, whose primary aim is to see that their investment in the company achieves payback within a specific, often short-term, period.

Investment services, such as the media equity researchers at Exane BNP Paribas and Bernstein Research, keep a watchful eye on the financial reports issued by each commercial journal publisher (Poynder 2012b). They then report their findings back to the financial community at large. This means that a scholarly publishing company needs to produce financial figures each year which are strong and healthy. However, users and libraries who currently buy their STEM products become alarmed when such “healthy”


figures go beyond what is judged acceptable, leading to claims against publishers of “price gouging”, greed, and creating “dysfunctionality”. The following table shows the revenues and operating profits which several of the largest commercial, university press and society publishers declared in their 2013 corporate statements. These figures include revenues from other publishing activities, not just scientific journals, though the latter often exert a heavy and positive influence on the overall results.

Tab. 9: Revenues and profits from the major STM journal publishers (2013), giving revenues (£ million), operating profits (£ million), profitability ratio and financial year end for Elsevier (RELX), Wiley, Taylor & Francis, Oxford University Press, Cambridge University Press and the American Chemical Society. (Source: published annual reports available in national and trade magazines and newspapers.)

The 30% profit margin among the major international commercial publishers of scholarly journals is real. The French organisation EPRIST, which collects information on French public research institutions, published an assessment in March 2016 of the financial results of six multinational publishers, which between them represented 38% of the total STEM market. The profit margins reported for each of the key publishers were:
– Elsevier (RELX): 36.7%
– Springer/Nature: 39.0%
– Wolters Kluwer: 24.2%
– Wiley-Blackwell: 45.9%
– Thomson Reuters (then still the owner of Web of Science): 31.2%
– Informa (owner of Taylor & Francis): 36.8%
(Source: EPRIST, France, March 2016)
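What an operating margin of this size means in cash terms can be shown with a line or two of arithmetic. The Python sketch below applies the margins quoted above to a purely hypothetical £1 million of subscription revenue; the revenue figure is invented and serves only to scale the example.

# Illustrative: what the quoted operating margins mean per £1m of revenue.
# Margins are those cited above (EPRIST, March 2016); the revenue is hypothetical.

margins = {
    "Elsevier (RELX)": 0.367,
    "Springer/Nature": 0.390,
    "Wolters Kluwer": 0.242,
    "Wiley-Blackwell": 0.459,
    "Thomson Reuters": 0.312,
    "Informa (Taylor & Francis)": 0.368,
}

hypothetical_revenue = 1_000_000  # £1m of subscription/licence income (invented)

for publisher, margin in margins.items():
    operating_profit = hypothetical_revenue * margin
    print(f"{publisher:<28} margin {margin:.1%} -> "
          f"£{operating_profit:,.0f} operating profit per £1m of revenue")

Put another way, at a 36.7% margin roughly 37p of every pound of revenue is operating profit before tax and dividends.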


The consolidation of the scientific publishing industry has been the topic of much debate within and outside the scientific community, especially as it relates to publishers’ high profit margins. However, the share of scientific output published in the journals of these major publishers, as well as its evolution over time and across disciplines, has not yet been fully analysed. From an analysis of 45 million documents indexed on the Web of Science over the period 1973–2013, covering both the natural and medical sciences and the social sciences and humanities, it appears that Reed-Elsevier (now RELX), Wiley-Blackwell, Springer and Taylor & Francis increased their share of the published output, especially since the advent of the digital era (mid-1990s). Combined, these top four prolific publishers accounted for more than 50% of all papers published in 2013. The conclusion is that a discussion of the economics of scholarly publishing is timely (see Larivière et al. 2015).

In 2014, Elsevier Science was undoubtedly targeted as the main villain of the piece. In the past, its operating profit margin of almost 40% had largely escaped public scrutiny. In early 2018, Elsevier announced profits of more than £900 million and unchanged margins of 36.8%. The academic publishing division of Informa, which includes the publishers Taylor & Francis and Routledge, made more than £160 million in 2016, with a profit margin of 38%. Wiley achieved a margin of 29.6% in 2017, earning $252 million (£183 million). Between these three STEM publishers, more than £1.25 billion a year was siphoned out of the research system into largely private/corporate hands. Many millions went to shareholders (Informa and Elsevier’s parent company RELX collectively paid out nearly £900 million in dividends), while tens of millions more went to senior executives (the chief executive of RELX was paid more than £10 million in 2016).

Several publishers, such as Springer S&BM/Nature Publishing Group, Sage and Emerald, do not disclose their financial results because they are privately held or in closed ownership. Nevertheless, the drive for healthy financial returns also exists for these privately held companies. It is apparent from the above figures that commercial journal publishers are making profits well beyond what could be regarded as acceptable in a business environment which relies on a “gift economy” sustained by the public sector.

The following graph shows the numbers of journal titles which the main publishers produced in 2014. These figures should be seen in the context of there being 28,135 STEM journal titles published annually worldwide. It illustrates the degree of concentration within STEM publishing. The Office of Fair Trading (UKOFT 2002) looked at the potential monopoly situation facing the STEM journal market resulting from the merger of two large commercial publishers (Elsevier and Harcourt) and, in a statement issued in September 2002, claimed there was evidence that the market for STEM journals


Fig. 19: Number of journals in publishers’ portfolios. Data from publishers’ web sites, trade publications and media reports.

was not working well. The complaint was that commercial journal prices were too high in comparison with education and research institutional budgets. However, it was considered then (in 2002) that it would be inappropriate for the Office of Fair Trading to intervene.

It should also be mentioned that a leader in STEM journal publishing in the 1970s and 1980s was Robert Maxwell. Despite his vision of seeing, at an early stage, the role which informatics would take in the STEM publishing sector, his notoriety as a “person unfit to run a public company” (Department of Trade, 1982) will remain his main legacy (alongside questions about his demise). STEM publishing was no longer a “gentleman’s profession” under his control. Nevertheless, there has been a spate of merger and acquisition (M&A) activity among commercial journal publishers, notably in the 1980s and 1990s. More recent deals include Wiley’s takeover of Blackwell Scientific in 2007 and Thomson Reuters’ sale of its science business to the venture capitalists Onex and Baring for $3.5 billion in July 2016. Springer S&BM has recently been sold by one venture capital group to another, with a final twist being the announced merger of Springer and Nature in January 2015. Elsevier has been purchasing a number of information intermediaries in recent years to extend its portfolio of research support services.

Despite STEM publishing being described as monopolistic, inefficient, inequitable and dysfunctional, the traditional journal publishing model remains intact. The answer to such a dichotomy may be found in the writings of the economics laureate Kahneman, who suggests that there are other motives besides efficiency


in dictating corporate action. Kahneman, in Prospect Theory: An Analysis of Decision under Risk (Kahneman 1979), suggests that “losers” always fight harder than “winners” to protect their interests. This means that it will always be a harder struggle to effect change than to preserve the status quo. The publishing industry will preserve its lucrative business rather than take risks in exploring unknown and expensive business paradigms. A similar point was made by the business guru Peter Drucker, who wrote that even when everybody has accepted that change is unavoidable, it is treated like death and taxes – to be postponed as long as possible, since no change would be infinitely preferable (Drucker 1999). Change would also need to be made synergistically across all the stakeholders, not confined to just one of them.

The consequence is inherent sector conservatism: market control remains with a few large commercial and society publishers, and, as technology, globalisation and social factors all change, the dominance of a few companies over the publishing scene may continue. The dominant players have the resources to survive market upheaval for longer than their smaller competitors. This dominance does not bode well for UKWs, who are not beneficiaries of the business model which large commercial publishers prefer to use.

The groundswell of concern about the dysfunctional nature of scientific journal publishing has reached significant proportions (Nielsen 2009; Monbiot 2011; Gowers 2014; Brown 2009; Murray Rust 2012). The question is whether the situation has become sufficiently unacceptable to generate intense interest from politicians and the science community at large in seeing an equitable and fair system introduced for STEM. Much will depend on how powerful the agents for change, the “perfect storm” factors, become and how they force a new approach to STEM to come into effect. Conservatism may prove to be less of a barrier and more of a stimulus in seeing a new age of STEM emerging.

The difficulty is that the dysfunctionality of STEM is in part created and sustained by the researchers themselves. Insofar as they remain wedded to the need to be seen to be productive, as indicated by their cited articles being published in the more prestigious scientific journals, they have little inclination to change their publishing habits. Established quality control through the refereeing system operated by branded journals represents a serious impediment to change – that is, until an acceptable alternative appears.

Having pointed out the various support mechanisms and challenges facing commercial journal publishers, the context within which UK publishers operate will now be considered. This includes establishing where the UK fits into the global scheme of science research.


UK Status in Scientific Research

On the one hand, the UK’s research programme has become increasingly global. Collaboration exists between researchers in other countries, and science itself is moving towards an open and democratic system which ignores geographical boundaries. On the other hand, UK research is competitive and introspective, seeking to support national innovation and scientific excellence in its universities. It also protects the intellectual property rights of the UK-based researcher/author. This is a difficult balance – between internationalism and nationalism – for STEM to sustain, both politically and operationally.

There is a claim that the UK “punches above its weight” in science, and this includes a well-roundedness of research across most disciplines (UKDBIS 2009; Elsevier 2013). Whilst the UK represents just 0.9% of the global population, 2.6% of R&D expenditure and 3.3% of researchers, it accounts for 9.5% of downloads, 11.6% of citations and 15.9% of the world’s most highly-cited articles (UKDBIS 2009). Amongst its comparator countries, the UK has overtaken the US to rank first by field-weighted citation impact (an indicator of research quality). The UK is recognised as being highly productive in terms of researchers, articles and citation outputs per researcher as well as per unit of R&D expenditure. The following chart puts these issues in a quantitative context. However, both the UK’s share of worldwide R&D and its share of worldwide researchers have been falling, as reflected in the following table.

Tab. 10: UK’s share of worldwide R&D and of worldwide researchers, by year (Unesco 2015).

This has had its impact on the UK’s share of worldwide scientific publications over the years. In 2007, the UK’s share was 14% of scientific articles; by 2013 this had fallen to 6.9% (Unesco 2015). During phone interviews with individuals in the UK academic sector and abroad, international collaboration and researcher mobility were seen as important in maintaining the UK’s prominent position as a scientific research nation. UK researchers are not only highly collaborative and mobile across national borders, they are mobile between academic and corporate sectors within the UK.


Fig. 20: UK’s share of global demographics and STEM (2009) – 0.9% of global population; 2.6% of global funding for research; 3.3% of researchers; 7.9% of papers published; 9.5% of downloads; 11.6% of world citations; 15.9% of the world’s most highly cited articles. Based on a graph in The Royal Society’s “The Scientific Century: Securing our Future Prosperity” (Taylor 2010; UKDBIS 2009; Elsevier 2013).

Traditional institutional and geographic boundaries are breaking down with respect to research activity. This is important for knowledge workers, as the research structure is becoming more open, interactive, de-fossilised and global. Several countries have business sectors which invest more in R&D as a percentage of their gross domestic product (GDP) than the UK; in the international stakes the UK falls behind the Asian tiger economies.

Tab. 11: R&D performed by businesses as a share of GDP in 2013, for South Korea, Japan, the USA, Germany and the UK (Unesco 2015).


As shown by the Unesco figures above, South Korea, Japan and the USA all exceed the UK in the proportion of GDP invested in domestic R&D programmes. Different figures are available in Elsevier’s 2013 survey of the UK research sector for the Department for Business, Innovation and Skills (BIS). The Elsevier study indicates that the balance of R&D funding between the higher education and business sectors in the UK differs from that of most comparator countries. The Elsevier report for BIS (Elsevier 2013) also shows the distribution of funding sources and the sectors where R&D was performed in 2011.

Tab. 12: Overall UK research and development (GERD) by source of funding, split across the business/private sector, higher education, government R&D and other sources (£ billion and % of total).

Tab. 13: UK R&D by sector of performance, split across the same four sectors (£ billion and % of total).

R&D performance is proportionately greater in the higher education sector in the UK, and lower in the business/private sector, than in most of the other main R&D-intensive countries. “R&D as a business sector is considered a driver of short-term economic growth” (Elsevier 2013), possibly because it excludes longer-term scenario-building and vision development. Within the UK there is little consistency in the statistics given for R&D spend within the non-academic sector specifically. The following table gives figures drawn from two sets of national R&D data.

The “Multiplier Effect”

Building on the environmental changes described in previous chapters is the “network and multiplier effect”. This effect increases the rate at which overall change is brought about through the interaction of the many “perfect storm” concepts.


Tab. 14: UK R&D in UK professional and engineering sectors, giving numbers employed, numbers of firms and R&D expenditure (ONS data and BIS data, in £m) for engineering and industrial professions (including IT, telecommunications, oil and gas, aerospace and defence, civil, mechanical, electrical and electronics engineering, pharmaceuticals and chemistry) and for service professions (including medical practitioners, software professionals, lawyers, management consultants, accountants, architects, dentists, psychologists and social science researchers), together with sub-totals and an overall total. Sources: (A) Table SB2 – Expenditure on R&D performed in UK Businesses: 2001 to 2008, UK Business Enterprise Research and Development Statistical Bulletin, 2008 (11 December 2009); (B) UK Department for Business, Innovation and Skills – Scoreboard.


The multiplier accentuates the speed with which a new scientific communication process takes hold. Combining a series of small changes to variables, as with the “perfect storm” factors, can result in a major disruption to the traditional way of doing business. This is a central message of this report.

Summary

Many pundits claim that the traditional system is creaking. It is no longer fit for purpose (McGuigan and Russell 2008) because of the inbuilt tensions within the industry sector. There are tensions:
– between trust and fear within the changing knowledge environment
– between collaboration and competition in the conduct of research
– between privacy and openly sharing the results of research
– between transparency and control over quality research output
– between meritocracy/elitism and democracy in governance
– between profits and openness as business models
– between traditionalists supporting the existing system and innovators seeking something better

This chapter demonstrates the complexities currently inherent within digital scientific communication. The overall conclusion is that financial barriers are not the only reason why STEM publishing has difficulties in adjusting to the digital environment, nor why the wider knowledge worker sector has remained on the sidelines. Social and technological deterrents are also relevant. Expanding the range of publishing formats would do much to break down the monolithic business model of subscriptions/licences. A wider audience of UKWs within the professions, SMEs and citizen scientists could be reached if greater flexibility were introduced by the STEM community. In so doing, the balance between supply of and demand for STEM information could be accommodated. It moves the business from a low-sales/high-margin operation to a wider-sales/lower-margin activity. The latter provides the basis for healthier business models to be introduced. This is crucial to ensure that supply and demand forces become aligned and participatory, less disparate and confrontational, as well as financially sustainable in a new STEM environment.

The STEM publishing industry is currently stuck in a time warp, conditioned by profitability expectations and an ownership structure which limit any attempts to experiment with risky new paradigms that might address some of the criticisms being levelled against it by industry pundits. It is sustained


by the intransigence of the authors of research publications and their institutional affiliations. However, the serials crisis remains a problem facing all stakeholders in STEM. The lack of collaboration between publishers and librarians in finding a way through current difficulties inherited from the print legacy, and the unwillingness of publishers to cooperate on meaningful strategies to migrate STEM smoothly into a digital environment, are potentially destructive. Complacency dominates STEM publishing, with operational decisions taking precedence over strategic initiatives.

This is the context which faces STEM. It stems from a dysfunctional, print-focused base and yet faces a new and unknown set of frontiers over which it has no control. The determinants of the future publishing environment are mainly external to the sector and – acting in concert – are powerful. The rate of change could be rapid. How much of a brake the inherent conservatism within STEM will place on this rate of change is unknown, but it is likely to be whittled down over time as the “perfect storm” takes effect. The question is whether these inherent tensions within STEM will surface and lead to a revolution rather than an evolution in the way STEM operates over the next five to ten years. It is a question which can be answered in several ways, depending on the status, background and experience of those who address it. From the perspective of an independent observer, major change in STEM will happen, but it will still have as its basis the traditional book and journal, to satisfy the conservatism within the research sector. The balance between innovative and traditional aspects of STEM is difficult to assess without more futuristic studies being undertaken to provide evidence of motivation and impact. One important ingredient in this crystal ball is the speed of assimilation of new trends in STEM communication. The implications which change is having on existing and potential future STEM publishing formats are assessed in the next sections.

THE SCIENTIFIC JOURNAL

Functions of the STEM Journal

Researchers have traditionally published their results in peer-reviewed papers, through which theories and evidence are tested and proven. Science is built on this and, as a corollary, over centuries the scientific journal has become the means for communicating quality-controlled results within the research community.


Journals perform four key functions which led to their dominance – registration of results, refereeing, authentication and dissemination. In a print-only world, and in the early days of digitisation, these functions were what the research community needed. But as the Internet, the web and digital support services became powerful, the relevance of some of these core functions was re-assessed. Fast communication of the latest research data is important in a volatile scientific environment, and traditional printed journals do not perform this readily. In creating a quality product (the article’s Version of Record), various actions are undertaken to meet the four core functions of a journal. However, most build a time-lag of several months – in some cases years – into the publication cycle of a print or e-journal. This is not what modern science needs. As a result, alternative mechanisms for communicating the latest research outputs have emerged. New communication formats – social media and social networking, workflow procedures, collaboratories, shared datasets, “mash-ups” and data mining – are more in line with the digital capabilities and infrastructure within which authors/researchers now operate, and will continue to operate more extensively in future.

Reviews and Refereeing

The refereeing system has come under scrutiny. Double-blind refereeing of a manuscript by at least two knowledgeable experts became the basis for ensuring that quality was maintained (Ware 2005). Despite several well-documented individual failings, plagiarism and frauds, the refereeing system worked reasonably well. But in practice the conventional refereeing process is slow, ponderous and reliant on the goodwill and community spirit of a small group of referees (Ware 2005). New online, interactive and transparent systems for validating research have emerged. Whilst still relying on an assessment by one’s peers, online refereeing, including pre- and post-publication assessments, has been tested.

The practice of journal editors asking independent, usually anonymous, experts to scrutinise manuscripts and reject those deemed flawed has become central to the scientific journal process. However, the tradition of according importance to papers labelled as “peer reviewed” has become questionable. An increasing number of journals that claim to review submissions in this way do not bother to do so. Not coincidentally, this seems to be leading some academics to inflate their publication lists with papers that might not pass quality scrutiny. Experts debate how many journals falsely claim to engage in peer review. Cabells, an analytics organisation based in Texas, has compiled a blacklist of those which it believes are guilty of deception. The company uses 65 criteria to

THE SCIENTIFIC JOURNAL

161

determine whether a journal should go on its blacklist. Cabells' list totalled around 8,700 journals in 2018, up from over 4,000 a year earlier. Another list, which grew to around 12,000 journals, was compiled until recently by Jeffrey Beall, a librarian at the University of Colorado. Using Beall's list, Bo-Christer Björk, an information scientist at the Hanken School of Economics in Helsinki, estimated that the number of articles published in questionable journals ballooned from about 53,000 a year in 2010 to more than 400,000 currently. He estimates that 6% of academic papers by researchers in America appear in such journals. These figures compare with an estimated total of 1,800,000 articles being published per annum.

Service to Authors

The journal still has a role to play, but its main service is no longer that of facilitating immediate communication; instead it extends recognition and authority to the author. Having an article published in a reputable journal means the author can use the publication details to gain peer recognition, tenure, career enhancement and/or additional funding as part of their CV (curriculum vitae) submissions. All these are essential activities, but they are not critical for end users and knowledge workers who are seeking to keep up to date with the latest research developments, particularly those on the fringes of the academic system. To quote an item from Velterop (an ex-traditional publisher) (Velterop 2012):

We use journals not for conveying information, but for protecting scientific reputations and for fostering career prospects... Hanging on to the old (subscriptions) in order to achieve the new (open access) may have been considered a suitable strategy ten years ago, but what it delivers is at best a form of open access that's likely to be merely 'ocular access' and of limited use to modern science, in contrast to the benefits that come with a radical change to full open access (no rights limitations, commercial or technical), not just to the equivalent of text on paper, but to all the potential that can be released from text, tables, graphs and images in electronic format.

Velterop's qualifications for making statements about the future of STEM publishing come from his senior management experience at Elsevier, Academic Press and BioMed Central, and from his role as a developer of new STEM initiatives in the digital world. The role of the journal is made more complex because of the many spin-off services which have emerged. The following chart, developed by Michael Mabe, CEO of the International Association of STM Publishers, shows the relationship between formats as part of the scientific method for undertaking and disseminating scientific research.


[Figure: stages of the scientific method – private creation, discussion and revision, criticism, public evaluation, confirmation, and acceptance and integration – mapped against widening audiences (co-workers, invisible college, speciality, discipline, public) and the corresponding formats (first draft, draft manuscript, draft for comment, seminar/workshop/conference, preprint, peer-reviewed journal paper, review paper, reference work, monograph, textbook, science journalism).]

Fig. 21: The Journal and the Scientific Method. Personal communication from Michael Mabe (CEO, Int. STM Association).

A further issue which challenges the reputation of the scientific journal is that of "negative results". It is as useful to publish the outcome of research which fails to prove a point as it is to publish results which are supportive. Negative results enable time and resources to be saved, by not following others down paths that have proven to be unsuccessful. However, publishing "negative results" could, in some instances, be seen as admitting research failure, and as such researchers may be unwilling to lend their name to the results. Negative results now account for 14% of published papers, down from 30% in 1990, according to a report in The Economist of October 19, 2013 (The Economist 2013). Forty-four per cent of scientists who have carried out unsuccessful research (in terms of the original research specification) have been unable or unwilling to have their results disseminated. The stigma of being seen to be involved with research having negative results appears to be a restriction on openness.

Similarly, complaints have been made in the pharmaceutical sector that published research cannot always be replicated. Again, The Economist reports that biotechnology venture capitalists found that half the papers describing research projects could not be repeated. Amgen found that it could only reproduce six of the 53 "landmark" studies in cancer, and Bayer found that it could only repeat 17 of 67 similarly important papers (The Economist 2013). Nature undertook a survey of researchers and found that over 70% were unable to replicate
another scientist's experiment (Kirchherr 2017), and that 52% highlighted this as a significant issue. In 2011, a project was launched by the Center for Open Science which aimed to replicate 100 different studies published in 2008 in the field of psychology. The results again showed that whereas 97% of the original studies reported a statistically significant effect, this was reproduced in only 36% of attempts. In another study, 50% of life science research could not be replicated. Fourteen per cent of scientists claim to know a researcher who has fabricated entire datasets. This is another dark side of STEM publishing.

Experts from the library sector have also suggested that STEM journal publishing faces problems. A serial "subscription" is, in library terms, a "publication that is intended to be continued indefinitely". For example, when a library subscribes to a journal, it is saying to the publisher: "I'll pay you up front to send me all the articles published in Journal X for a year, regardless of how many of the articles turn out to be of any actual use or interest to my patrons" (Anderson 2013a). In the print world, the librarian had no choice but to buy articles that way, but in an online environment that level of built-in superfluous waste is no longer necessary ("unbundling"), and shrinking library budgets make it much harder to justify. It makes more sense to pay only for those items that are wanted and get used (Anderson 2013a). Other pundits claim we are operating in an economy that has been shaped by the inefficiencies of the print environment. In one study, the authors noted that "as many as 50% of papers are never read by anyone other than their authors, referees and journal editors". The same study claimed that 90% of papers published are never cited (Eveleth 2014). There is a long tail of articles which are read infrequently. According to Rick Anderson, Associate Dean in the Library at the University of Utah (Anderson 2011), the long-term solution will not involve libraries paying for articles their patrons do not want, because the money to do so is no longer available.

In focusing on what the library will need to collect on behalf of its research community, there are several motives which drive patrons to change their behaviour patterns. These include:
– The ability to communicate with peers
– Being effective in undertaking research itself
– Socialising with researchers and other social groups
– Sharing the results of their efforts
– Building online communities with those having similar interests
– Becoming involved in partnerships with others
– Competing effectively in their chosen area of research
– Becoming involved in crowd sourcing (Susskind and Susskind 2015, 176)


Another indictment of the existing STEM journal publishing system can be found in the claim that 50% of all significant scientific discoveries are the result of a complete change of direction – the eureka moment – rather than of painstaking, incremental research efforts over years. These include such items as pacemakers, safety glass, artificial sweeteners and plastic. These emerged despite, or in parallel with, published outputs in the traditional research journal. They were serendipitous, emerging from nowhere, least of all from a rigorous assessment of the recent journal literature. Based on the above, the scientific journal offers few tangible advantages to users of research in a digital age, whether they are academics or unaffiliated knowledge workers. These issues question the value of the research journal and the article as carriers of research results.

Alternatives to the STEM Journal

As reported by Velterop (2012) in a blog in which he describes the role of the journal:

Very few journals are indeed 'journals' (in the sense of presenting 'daily' updates on the state of knowledge), except perhaps the likes of PLOS One and arXiv. So what we traditionally think of as journals have had their heyday. They functioned as an organising mechanism in the time that was useful and necessary. That function has been taken over, and become far more sophisticated, by computer and web technology. That doesn't mean journals, as an organising concept, will disappear any time soon. I give them a few decades at least. I see articles also change in the way they are being used and perceived. They will more and more be 'the record' and less and less a means of communication. One reason is the 'overwhelm' of literature (see e.g. Fraser & Dunstan, on the impossibility of being expert (Fraser and Dunstan, 2010)). 'Reading' to 'ingest' knowledge will be replaced by large-scale machine-assisted analysis of, and reasoning with, data and assertions found in the literature. Organisation of the literature in the current prolific number of journals — and the concomitant fragmentation it entails — will be more of a hindrance than a help. Initiatives such as nanopublications (http://nanopub.org) and, in the field of pharmacology, OpenPHACTS (http://www.openphacts.org), are the harbingers of change.

Discovery tools, especially gateway services such as Google Scholar, PubMed, Scirus and the Web of Science, have made research literature more visible to more people more conveniently than ever before, but discovery and access are not the same. Researchers vent frustration over the limited range of journal titles available to them at their institution in the free text comments section attached to the CIBER “Gaps and Barriers” survey (Rowlands and Nicholas 2011). Many respondents were especially resentful when they found something that looked useful but for which they encountered a pay wall.


A key issue here is the tension between the "article economy" (what readers want) and the "journal economy" (the dominant business model for information supply in the form of subscriptions or site licences). The provision of simple (preferably free or inexpensive) mechanisms to deliver information at the article rather than the journal level would extend the reach of research. Blogs, wikis, moderated listservs, BlogTalkRadio, online seminars and webinars have emerged as exemplars of new grass-roots information creation, thriving on the open networking and the democracy which the Internet has brought about. They are available for free and have extensive, global reach. This suggests a new form of communication based on the above could become the cornerstone of STEM – a communication system referred to as social publishing, based around Web 2.0 or even Web 3.0.

As an example, Barend Mons, Associate Professor at Rotterdam University, was instrumental in developing Knewco, a company which focused on disseminating "nuggets" of biomedical information, rather than the full text describing research results, through the adoption of Knowlet technology and the semantic web. It morphed from a scientific concept into a commercial operation in 2011 (renaming itself Personalized Media Communications) to develop personalised content recommendations and adverts for readers. It has moved the goalposts away from journals/articles to other artefacts (information nuggets) which meet researchers' information needs. This is one of several new STEM information services developed by and for researchers in the field.

Padley, CEO of Semantico, gave a wake-up call on "what the article of the future is really about" (Padley 2014). He suggested that if we were called upon to design the ecosystem for scholarly communication today it would not look like it does now. He pointed out that print does have permanence, but such permanence is not necessarily what science needs. Features which should be designed into the "article of the future" include the ability to change the article and not to lock it into the past. It needs to be interactive and updateable. It also needs to be executable, so that it leads through links into other sources and media. As researchers interact directly with each other's data it becomes possible for them not just to publish science online but to create new science online. Padley also thinks the article of the future should be reproducible. This means providing access not only to the raw data but to the software with which to manipulate the data:

This takes us a long way away from a world where recognition of academic results is more or less dependent on text... contained in a document. However, for many publishers that mind-set is proving hard to change (Padley 2014).


A further extension of this is text and data mining (TDM); if this takes hold, will anyone in future still want to read papers in the traditional way? It would be wrong to claim that commercial journal publishers have avoided experimentation with STEM publishing. Elsevier, having substantial financial resources, has committed to several research projects to define a future for itself. One of these was the so-called "Article of the Future", which included new ways of presenting and navigating through articles, now being included in several Elsevier titles. The project design was based on interviews with some 800 researchers (Zudilova-Seinstra 2013) and made available through Elsevier's Content Innovation and ScienceDirect programmes. These isolated projects and viewpoints need to be set against the panoply of other processes which relate to and impact on published research results. Several of these have been embraced by wider society and open up the potential for more general knowledge workers to become involved in scientific research.

Alternatives to the Journal Article

Individual article supply, such as through document delivery agencies, has been a feature of STEM publishing for decades. Pioneered by what became known as the British Library Document Supply Centre in Boston Spa, Yorkshire, it enabled the purchase of individual articles from a vast centralised collection. Other document delivery centres were established, both publicly and privately funded, in several countries. However, they were controversial as far as STEM publishers were concerned. There was a fear that document delivery would undermine the sale of subscriptions as libraries sought to provide relevance in their collections instead of accumulating material (Brown 2004). This was never proven, despite several studies undertaken at the time (Artemis, ADONIS, OASIS). During the 1990s and 2000s, the sale of articles through docdel (document delivery) began to decline, with unit sales from the BLDSC falling from four million per annum in the mid-1990s to one million in the late 2000s. Alternative options have emerged to usurp what many had claimed was an "article economy" (Brown 2003).

Independently of this trend, the Association of Research Libraries (ARL 2008) commissioned the Ithaka organisation in the USA to investigate other new formats which were becoming available and being used by scholars to communicate research results. Their report, "Current Models of Digital Scientific Communication", was published in November 2008. It highlighted that there were several ways to disseminate scientific information, and whilst still peripheral (at that stage) in
most subject areas – with the mainstream refereed journal as the accepted mode of communication – there were nevertheless subtle changes taking place. The ARL/Ithaka survey identified eight main types of digital scientific resources available at the time. These were:
– E-journals
  – This includes e-journals which allow immediate access to newly published articles
  – Some e-journals included multimedia, data visualisations and large datasets (such as JoVE: Journal of Visualized Experiments)
– Reviews
  – Though highly rated as a service, it takes a long time to write and edit each review, so timeliness can be an issue
  – They are mainly of appeal in the humanities
– Preprints
  – Two key e-print resources were described, including arXiv (physics) and the Social Science Research Network (SSRN)
– Encyclopaedias and other reference works
  – Includes the Encyclopedia of Life, which encourages contributions from the lay public, although subsequent vetting is necessary
– Data
  – The Protein Data Bank was given as an example of mass participation in creating a global digital data resource
– Blogs
  – Blogs were seen as updated versions of the traditional listservs. Unlike discussion lists, blogs are more tightly controlled as to who can participate
  – Of value in that they give frequent updates of researchers' opinions and early thoughts, rather than just facts which might have passed their sell-by dates
  – However, blogs only represent interim stages, not the final results
– Hubs and Portals
  – Combine many formats within a single portal or collection of items
– Discussion forums
  – Listservs, message boards etc. are still used heavily in many disciplines
  – Not used to work through ideas, however; more of a broadcast medium for research updates

The above list is not exclusive, and the report was completed over ten years ago. This explains why the more "e-communicative" forms are not included in the list – services such as Skype, YouTube, Facebook, Twitter, LinkedIn, WhatsApp, Snapchat etc. – all of which could become part of the scientific communication
system without necessarily being involved in developing the final record of scientific progress. Also excluded from the ARL/Ithaka list were aggregations of links to other sites, software, digital copies of print content, industry newsletters and teaching-focused resources. Nor was Wikipedia, which was seen more as a consumer-focused service. The point in highlighting these new digital resources is that the unaffiliated (UKWs) have opportunities to participate in such new social platforms, more so than they had under the "elitist" and controlled journal publication system. It gives the unaffiliated scope to sit at the same table as their academic peers in creating and developing products/services. It is another example of the Internet facilitating more democracy within the scientific information system.

Metrics

There is a distinction to be made between metrics supporting library management and metrics serving the specific needs of the researcher. Traditional bibliometrics studied the use and application of library resources. How frequently is a book circulated or a digital resource accessed? How often is an average article in a journal cited over the space of a few years? These are important questions for librarians to ask, relating to the use of their collections and the return on their investments in published resources.

Altmetrics provides both granular and research-based sets of information. Some of the questions which could be addressed include: How has this article been received? Across all publications, how often has this researcher's work been accessed? Are there methods, other than citations, that could assess the impact of a researcher's work? Instead of using journal-focused measures of quality (such as the Impact Factor), the altmetrics community separates assessments of quality from the formats in which authors choose to publish their results. This does not mean the demise of traditional metrics, which will continue to be used in specific instances, but it does highlight the distinction between specific bibliometrics and aggregated altmetrics. It also suggests that attention focused on altmetrics may give a clearer picture of user needs for STEM and the variety of needs across new (disintermediated) and old (affiliated) research sectors. But authors are currently trapped inside a giant prisoner's dilemma: individually, their careers are still judged on the established journal-based measures, so few can afford to be the first to abandon them.

One active participant in the metrics arena is Elsevier. In addition to its collection of over 2,500 journal titles, its creation of the SCOPUS service and its derivative SciVal (a benchmarking tool that helps compare activities of client institutions with those of other institutions), and building services such as Pure
(a researcher- or institutionally-focused aggregation tool), Elsevier has more recently been acquiring companies complementary to its publishing activities, in particular extending the platform on which altmetrics can build. It announced the availability of CiteScore, an Elsevier-developed variant of the Impact Factor: "This variant of citation analysis is based on an average citation ranking but includes all references to a publication over a three-year period, whereas the Impact Factor is derived only from research articles and only from selected sources over two years" (Carpenter 2017). Elsevier has also acquired Plum Analytics, which, together with Mendeley (acquired by Elsevier in 2016, and which provides citation management data, or data showing that people have added an article to their reference managers), are externally created services which pundits feared would lose their independence and impartiality once they were absorbed into the Elsevier network. The jury is still out on this, particularly on whether other information suppliers would be willing to work with these innovative services under Elsevier ownership (Carpenter 2017).

With the departure of Thomson Reuters in 2016, and the acquisition of several of its intellectual properties by corporate investors (Onex and Baring Private Equity), Clarivate Analytics has taken on the mantle of being a significant player in the bibliometric arena. Building on established intellectual assets such as the Web of Science and the Derwent Patent Index, Clarivate has developed several new metric services in support of the researcher effort.

An overall consideration is that the dominant use of metrics can lead to simplified and inaccurate assessments of research. It can lure researchers into pursuing high rankings first, and good research second. There are some developments underway which will have an impact on the value placed on metrics in STEM in future. Some of these developments are explored in the next chapter.
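As a rough illustration of the difference between the CiteScore- and Impact-Factor-style calculations described above, the following Python sketch contrasts a three-year, all-document-type citation ratio with a two-year, research-article-only one. It is illustrative only: the field names and sample data are invented, and the real metrics apply Scopus- and Web of Science-specific rules about document types and source coverage.

```python
# Illustrative sketch: journal-level citation ratios in the spirit of CiteScore
# (3-year window, all document types) versus the Impact Factor (2-year window,
# citable research items only). Data are invented.

def citation_ratio(items, cite_years, item_years, citable_types=None):
    """Citations received in cite_years to items published in item_years,
    divided by the number of qualifying items."""
    pool = [i for i in items
            if i["year"] in item_years
            and (citable_types is None or i["type"] in citable_types)]
    if not pool:
        return 0.0
    cites = sum(c for i in pool
                for y, c in i["citations_by_year"].items() if y in cite_years)
    return cites / len(pool)

journal_items = [
    {"year": 2016, "type": "article",   "citations_by_year": {2018: 4, 2019: 6}},
    {"year": 2017, "type": "article",   "citations_by_year": {2018: 2, 2019: 3}},
    {"year": 2018, "type": "editorial", "citations_by_year": {2019: 1}},
]

# CiteScore-style: 2019 citations to everything published 2016-2018.
citescore_like = citation_ratio(journal_items, {2019}, {2016, 2017, 2018})
# Impact-Factor-style: 2019 citations to research articles published 2017-2018.
if_like = citation_ratio(journal_items, {2019}, {2017, 2018}, {"article"})
print(citescore_like, if_like)
```

The wider denominator and longer window in the first calculation are what make the two figures diverge for the same journal.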

Summary

According to a report published by Elsevier in February 2019:

The research article is still valued as a channel for communicating the stories behind discoveries, but has become atomized with the growth in popularity of electronic lab notebooks and other tools that facilitate fragmentation of the research and publication process. This means the article has evolved into a notebook-style paper containing (as applicable) experimental methods, data and observations, source code, and claims and insights (Elsevier and Ipsos 2019).


The role of preprint servers could also be a harbinger of change, which overall may lead to a "fall in manuscript submissions to journals, leading to closure of some titles and the failure of some publishers" (Elsevier and Ipsos 2019). There are a number of other specific challenges to the centrality of the research journal and the journal article which are evolving.

FUTURE STEM COMMUNICATION TRENDS

There is a future focus to this section of the report. It speculates on alternatives to the journal and the research article as carriers of STEM research results, as reflected in the writings of experts. Such speculation currently lacks supporting evidence and is subjective. It is based on opinion and experience rather than facts and data, but it is useful nonetheless if it exposes potential new directions which STEM could take in coming to terms with the many external drivers for change.

Several thematic areas with a fluid time horizon for implementation exist, ranging from the near future (mobile and social technologies) to over ten years into the future (artificial intelligence, machine learning, cognitive developments, quantum computing and the automation of knowledge). Each brings alternative carriers for research results. These differ from those offered by the static and conventional books and journals. Each new carrier has implications for how publishers and libraries come to terms with future developments (Brown and Boulderstone 2008).

However, the past few decades give grounds for caution in anticipating the direction which new information technologies will take. The launch by IBM of the personal computer in 1981 was entirely unexpected; by the same token, the Web's arrival in the early 1990s was unanticipated; social media, which emerged on the scene more recently, has also had a radical impact. None of these could have been forecast with conviction before the event. Similarly, it is impossible to anticipate what will capture the imagination of the STEM sector in the next five to ten years. Something equally profound may emerge. There is unlikely to be a linear progression from what we currently have as a research dissemination system. Several instances of such change are identified in the following sections. None are definitive, but the exercise of pursuing possible lines of future technological development affecting STEM is infinitely preferable to assuming there will be no change at all.


Repackaged STEM Publications

Joseph Esposito, a US-based commentator, applied the Nautilus concept to the outward drift in information demand and supply. He suggests a different publishing model is required to reach those knowledge workers further out, discipline-wise, from the original source of content creation (Esposito 2007). From the requirement for rapid access to primary research findings, the nature of demand may change towards secondary (abstracts and metadata) and tertiary (review, reference) material the further one goes from the centre of scientific activity. The implication is that a new opportunity could open up to satisfy the demands of "the long tail". This opportunity is more tertiary, informed and educational in nature than the high-level primary scientific reporting for academic researchers found in research journals and datasets.

According to Esposito, it is possible to envisage scientific communications as a spiral: the inner spiral represents the researcher's closest colleagues; the next spiral outwards is for people in the field but not working on the topic of interest to the author; one more spiral out and there is the broader discipline; beyond that are adjacent disciplines, until one moves to scientists in general, highly educated laypersons, university administrators, government policymakers and investors; and ultimately there are the outer spirals, where the general public and consumer media sit.

Fig. 22: Nautilus model of scientific communications (Esposito 2007).

Something may be lost in translation as research information moves outward from the core research. Without accuracy in translation, the loss would be great, as errors in interpretation could develop. In effect, to translate a research article from its technical register into everyday English would (depending on the approach taken) make it more ambiguous or more verbose, and therefore worse than the original article from the perspective of the primary audience. This is a stylistic issue which needs to be addressed (see later).

At each spiral away from the centre, the role of the publisher potentially changes. Esposito offers the scenario that publishing and open journalism could take on not just the primary role of certifying the original research result, but also a tertiary role of interpretation for much wider audiences – to overcome ambiguity and verbosity. Unaffiliated knowledge workers would be major beneficiaries in such a scenario.

Esposito believes that the future of communications needs to be based on the infrastructure of consumerism (Esposito 2010; 2012b). This is because in a networked world the "number of nodes connected to a network matter [Metcalfe's Law] and the consumer market has the big numbers". The issue which needs
addressing is how to layer academic needs and interests onto a consumer market's platforms, using modern tools such as Google, iPhone, Facebook, Twitter, FigShare and LinkedIn.

Allington, however, makes the point that it is not part of a researcher's mission to be involved in public education about STEM (Allington 2013). They are different functions, with high-level descriptions of research results not sitting well alongside lay descriptions of the same results. Highly competent researchers are not necessarily the best educators or journalists.

There are also other services which tap into the need to provide "translation" of research results to reach a wider market. A UK start-up entitled Kudos, established in 2012/13 by a group of ex-STEM publishers, provides authors and publishers with ways to reach the public audience. This arose because Kudos' initial investigations indicated that 84% of authors contacted felt more could be done to "raise the visibility, impact and usage of their work" (Smith 2013). Over 125,000 researchers (in 2017) had signed up to use Kudos' free platform, and
according to a press release, Kudos' tools allow researchers to measure the performance of their publications and to track the effects of their efforts in improving visibility. In recognition of their approach, ALPSP voted Kudos as having launched the most innovative industry product in 2015. In April 2017, Kudos announced a partnership with US-based Editage, a company which helps researchers create plain-language summaries of their work to enable a wider audience to be reached. Editage, which commenced operations in 2002, has offices in the Far East as well as the USA, and has edited nearly 750,000 papers on behalf of 200,000 authors. The partnership initially involves a co-branded web site being established. This is a practical example of a service which can reduce the current exclusivity of STEM articles and which pursues the democracy theme underlying many of the Internet's features. Enabling more people to appreciate the implications of a successful research project in an understandable way would go a long way to meeting the needs of UKWs and ensuring a healthier STEM information system. However, the practical aspect of seeing that both "core" and "long tail" parts of the science community are delivered reports in a useful way may only be possible with the introduction of more customised/profiled information delivery systems (RSS, alerts) and a greater range of publication formats (hubs or portals).

New Approaches to Scientific Communication

There are a few indicators of the way STEM researchers may adapt to the new informatics environment. Comments in a blog by Professor Jeffery, Director of International Relations at STFC, focus on the way technology and usage are beginning to interact around the researcher:

Technologically the user interface level is likely to be semantic web/linked open data driven from below by a formal information system with very rich metadata and services linked as business processes for research discovery, analysis, manipulation, mining and communication from the researcher 'workbench'. Interaction by speech and gesture rather than mouse and keyboard will become the norm (Jeffery 2012).

Text and data mining, enabling new ideas and relationships to be found from access to research material on different platforms, is critical in some biomedical areas. Collaboratories, involving teams from across continents sharing and exchanging research results, are springing up in many natural and life science areas.
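As a simple illustration of the kind of pattern-finding TDM enables, the minimal Python sketch below counts how often pairs of terms co-occur in a set of abstracts, a crude signal of a possible relationship. The abstracts and term list are invented for illustration; real biomedical TDM pipelines rely on far richer entity recognition and on licensed corpora.

```python
# Minimal sketch of one common text-mining step: counting term co-occurrence
# across abstracts as a rough indicator of related concepts. Data are invented.
from itertools import combinations
from collections import Counter

abstracts = [
    "gene X expression is elevated in tumour samples treated with drug A",
    "drug A inhibits gene X and reduces tumour growth in mice",
    "protein Y interacts with gene X in cell signalling",
]
terms = ["gene x", "drug a", "tumour", "protein y"]

co_occurrence = Counter()
for text in abstracts:
    present = [t for t in terms if t in text.lower()]
    for pair in combinations(sorted(present), 2):
        co_occurrence[pair] += 1

for pair, count in co_occurrence.most_common():
    print(pair, count)  # e.g. ('drug a', 'gene x') appears together twice
```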


The social dynamics within the scientific community were summarised by Professor Jeffery (2012), again on the GOAL listserv:

A new generation of researchers is entering the system. They live in the Web 2.0 generation now and will evolve with whatever comes next. They are still impressed by what can be done with http, html/xml and urls; they don't imagine a world without them. They will expect immediate interaction with hyperlinked multimedia. They have little or no respect for legalities or long-established traditions. Glory is counted by 'likes' or 'friends' and – as an aside – is more quantitative and reproducible than existing 'glory' metrics (especially the dreaded impact factors and related indices). They may want peer review as it is now but seem to manage the rest of their lives using online recommendations from peers they either know or respect or both. They will certainly expect to live in a research world with wider conversation including social/economic/political commentators which links with the 'outputs – outcomes – impact' agenda.

A similar point was made in the book edited by Bartling and Friesike (Bartling et al. 2014): Researchers all over the world use modern communication tools such as social networks, blogs, or Wikipedia to enhance their scientific expertise, meet experts, and discuss ideas with people that face similar challenges. They do not abandon classical means of scientific communication such as publications or conferences, but rather they complement them. Today we can see that these novel communication methods are becoming more and more established in the lives of researchers; we argue that they may become a significant part of the future of research.

As reported by Peter Murray-Rust (Cambridge University) on the GOAL bulletin board on August 10, 2012, the 250,000 people who helped create the Open Street Map project and get it accepted as being among the highest quality and most useful cartographic service were not old-school cartographers: They came from all walks of life, including cyclists and ramblers. Wikipedia didn’t come from converted academics, it came from people outside academia and encyclopedias. Academia (with a very few exceptions) howled it down and it has succeeded in spite of this. (Murray-Rust, GOAL, August 2012)

Studies, several funded by government agencies, have investigated alternative methods of scientific communication. The UK Finch report (RIN 2012) addressed conflicting opinions about ways to disseminate research findings. With hindsight it was probably a mistake for the membership of the Finch study group to be dominated by organisations representing the existing supply chain – funders, publishers and librarians – who have an interest in seeing minimal disturbance to the status quo (albeit allowing for some marginal changes). More radical would have been to have had the committee dominated by
researchers working at the many frontiers of science, those with a view on strategic trends in science and its communication needs: a delphic group. But Finch nevertheless provided a catalyst for the subsequent debate about appropriate business models for journal publishing in the emerging digital world.

The University of California issued a policy statement for 2020 (UC 2018) which states that "Our goal is to promote, through concerted and sustained action, and with clear purpose aligned with our public mission, a scholarly communications system for research publications that does not rely on toll access". The policy is based on detailed discussions with researchers at UC institutions. Campus-level faculty library committees voted to support this OA 2020 initiative. The consensus was that the future system of scholarly communication "will be diverse and continually evolving. APC models, community investment models, academy-controlled and supported infrastructure, the evolution of preprint and other forms of early dissemination to accommodate new models of peer review and validation, will all be part of the mix". Which of those models will win out, and in which disciplines or communities, will involve a process of discovery and experimentation among all stakeholders. As stated, "We're all engaged in a fascinating journey whose unfolding we have an opportunity to influence, but the ultimate shape of which will only be fully known in hindsight" (UC 2018).

From the range of alternative STEM publishing options for the future, several are highlighted below as potentially bringing technology to the aid of improved STEM access.

Online Communities

In both the publishing and library worlds there have been experiments by information providers to become mediators of scholarly interaction and discourse in a digital world. However, it has been the convention for publishers to conceal what they are doing for competitive reasons. The general impression is that, apart from a select few large commercial and learned society publishers, such experimentation has been muted. There are a few examples of online communities in non-research-focused sectors of society. In medicine there is PatientsLikeMe; in education there is Edmodo; in divinity there is BeliefNet; in journalism there is GlobalVoices; in consultancy there is OpenIDEO; in the taxation area there is AnswerXchange; and in architecture there is WikiHouse (Susskind 2015, 224). Aggregations of different types of information – formal printed articles, moderated bulletin boards, social media, data compilations, news, appointments, legal issues – constitute the scope and content of online communities.


Most publisher-generated blogs inform the community what they are doing, rather than stimulating interaction and community collaboration. Wiley and Elsevier are known to be experimenting with community building, often with groups of journals. The correct formula has yet to be discovered and shared.

Portals and Hubs

Digital portals or hubs include a range of information services specifically targeted at the needs of a defined community. They can include access to e-journals, reviews, e-prints, conference papers, grey literature, blogs and/or newsletters, all with the aim of consolidating relevant information targeted at a researcher's profile. Decisions on what should be included in the portal would lie with a "gatekeeper" within the information service, a maven who would be fully aware of all the issues which the targeted community could face and the information content it would need. Examples of such "one-stop shops", hubs or portals include IBMS BoneKEy, a web portal of the International Bone and Mineral Society, and Information for Practice, which is directed towards social work and practice. Another portal is the Alzheimer Research Forum. However, they are costly to create and maintain, and a variety of business models need exploring to ensure their viability. Advertising and corporate sponsorship may be involved, as may flexible premium pricing models.

SDIs and Alerts

A STEM service popular in the early days of digital publishing was SDI (selective dissemination of information). It has since been revamped as alerts and RSS (Really Simple Syndication) services. It has potential for improved linking of supply with demand for STEM output. It is based on a structured research profile of the user, which is then matched against the metadata of newly incoming publications. A match between profile demand and the latest information supply would trigger delivery of the item to the researcher. The SDI profiled system could also predict what items may be relevant to the target audience, based on what the individual had expressed an interest in over the recent past, and proactively supply information in advance. It is similar to the system which Amazon uses to stimulate sales – using records of past online purchases to recommend relevant and related items of future interest, following the digital trail left by an online researcher and matching this trail with the metadata associated with incoming items/documents/datasets.
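A minimal sketch of such a profile-matching alert is given below, assuming incoming items carry simple keyword metadata. The profile terms, sample items and the threshold rule are invented for illustration; a production SDI service would use much richer metadata, relevance ranking and behavioural signals.

```python
# Minimal sketch of an SDI/alert matcher: deliver an item when enough of the
# researcher's profile terms appear in the item's keyword metadata.
from dataclasses import dataclass

@dataclass
class Item:
    title: str
    keywords: set

def matches(profile: set, item: Item, threshold: int = 2) -> bool:
    """Trigger an alert when at least `threshold` profile terms match."""
    return len(profile & item.keywords) >= threshold

researcher_profile = {"crispr", "gene editing", "off-target effects"}

incoming = [
    Item("Off-target effects of CRISPR screens",
         {"crispr", "off-target effects", "screening"}),
    Item("A survey of library budgets", {"libraries", "budgets"}),
]

alerts = [i.title for i in incoming if matches(researcher_profile, i)]
print(alerts)  # -> ['Off-target effects of CRISPR screens']
```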


This opens up scope for a business model to be developed which is granular in nature. Such an SDI scheme would offer advantages, such as relevance and timeliness, over the current scheme of broadcast publishing. This single integrated approach, covering research information needs from soup to nuts and drawing on a wide range of sources rather than focusing on one format (journal articles), is in keeping with the successful business strategy adopted by industry innovators such as Steve Jobs in his development of Apple (Isaacson 2011). Making sure that all information bases were covered – Macintosh, iPad, iPod, iPhone, and including the iTunes store – through one easily accessible system meant that Jobs could outmanoeuvre and outcompete other key players at the time: "A magical walled garden where hardware, software, [content], and peripheral devices worked well together to create a great 'user experience'" (Isaacson 2011). According to Jobs, "Some people say 'Give the customers what they want'. But that's not my approach. Our job [at Apple] is to figure out what they are going to want before they do. I think Henry Ford once said, 'if I'd asked customers what they wanted they would have told me a faster horse'" (Isaacson 2011). Such a holistic, integrated STEM system could be a development from the combined SDI and portal approach for researchers outlined above. There are other approaches to delivering STEM results in an efficient way, and these could become the agenda for a delphic study organised in a sophisticated and professional way.

Blockchain Systems

Blockchain technology covers several online transactional activities, including the Bitcoin financial system. Blockchain technology can manage many kinds of digital assets, such as educational and medical records, and affects industries such as publishing, retail and manufacturing, healthcare and government. Considering its potential impact, the current state of the blockchain is seen as analogous to the early days of the Internet (see "Blockchain for Research" by Joris van Rossum at Digital Science/Nature Publishing). Blockchain in the science sector allows for decentralised, self-regulating data, creating a shared infrastructure where all transactions are saved and stored. As scientific information involves a large, diverse and dynamic body of information and data that is collaboratively created and shared, the research process lends itself to blockchain technology. Working on a blockchain would mean that whenever researchers create or interact with content, in whatever way and at whatever stage, their interaction will be stored on a single platform. It becomes a certified and stored item. Cryptography is also used to ensure that transactions cannot be altered, thereby achieving a permanent record of progress.
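A minimal sketch of the hash-chaining idea behind such a record is shown below: each research event is stamped with the hash of the previous entry, so later tampering with any earlier entry is detectable. This is illustrative only and is not a real blockchain – the event names are invented, and there is no distribution, consensus mechanism or digital signing.

```python
# Minimal sketch of a hash-chained record of research events. Illustrative
# only: no peer-to-peer network, consensus or signatures are included.
import hashlib, json, time

def add_block(chain, event):
    prev_hash = chain[-1]["hash"] if chain else "0" * 64
    block = {"event": event, "timestamp": time.time(), "prev_hash": prev_hash}
    payload = json.dumps(block, sort_keys=True).encode()
    block["hash"] = hashlib.sha256(payload).hexdigest()
    chain.append(block)

def verify(chain):
    """Recompute each hash and check the links; any alteration breaks the chain."""
    for i, block in enumerate(chain):
        expected_prev = chain[i - 1]["hash"] if i else "0" * 64
        body = {k: v for k, v in block.items() if k != "hash"}
        recomputed = hashlib.sha256(
            json.dumps(body, sort_keys=True).encode()).hexdigest()
        if block["prev_hash"] != expected_prev or block["hash"] != recomputed:
            return False
    return True

record = []
add_block(record, "dataset deposited")
add_block(record, "manuscript submitted")
add_block(record, "peer review completed")
print(verify(record))  # True; editing any earlier event makes verify() fail
```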


Blockchain systems can also include a payment or transaction system, one such being Bitcoin: "Financial transactions can use its own currency in an authorised and encrypted way" (van Rossum 2017). It uses digital signatures and includes a peer-to-peer network. The latter involves members of the public using their computers to validate and date-stamp transactions; these participants can receive financial rewards for work done, in the form of bitcoins. A big advantage that blockchain systems bring is that they support an open and decentralised approach, which means that there is no single owner and everyone has access to the same information. Moreover, in a blockchain for research, critical aspects of scholarly communication such as trust, credit, universal access and anonymity can be realised and safeguarded. Its potential relates to almost all stages in the researcher's workflow. Blockchain becomes a workflow process which has the potential to pull together some of the industry changes identified earlier. It is a new paradigm for STEM information, one which rests heavily on the openness and transparency of the Internet. It is different in its approach to STEM, and as such it marginalises the role of scientific journal publishers into that of providing quality support services for authors in creating formal records of science, while the mainstream of research activity is conducted on new platforms catering for the diversity of digital assets emerging from the scientific research process. Besides enhancing the role of democracy, blockchain systems potentially disintermediate some of the current roles of commercial journal publishers. However, recent concern about the over-hype surrounding cyber payment systems raises the question of whether parts of the blockchain system may be less robust than speculators had originally claimed.

Personalised Search Systems

In recent years there has been a surge of science-focused search engines and browser extensions. Several in the scientific search sector focus on workflow improvements. There is a drive towards solving access control issues, with several competing options for browser plug-ins to synchronise with institutional credentials and follow a user across various databases and platforms. However, no one has yet cracked the code for connecting discovery to full-text delivery. As of 2018, the newest entrants in the discovery marketplace had yet to come to terms with meeting the full needs of researchers in their workflow activity.


Artificial Intelligence (AI)

Despite concerns about control over the direction which artificial intelligence may take, and about how programmers may be required to develop systems incompatible with the mores and social needs of future generations, revolutionary developments are nevertheless occurring. AI can be applied to the STEM publication process by ensuring that manuscripts are checked for logic and consistency and for compliance with in-house publishing standards. In future, large technology and data analytics companies could become the curators and distributors of scientific research (Elsevier and Ipsos 2019).
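The simplest versions of such checks are rule-based screens run before human review. The Python sketch below is a minimal, invented example of that idea – flagging an over-long abstract, missing references and citation markers with no matching reference entry – and is far cruder than the machine-learning tools publishers are actually experimenting with.

```python
# Minimal sketch of rule-based manuscript screening prior to human review.
# The checks and the sample manuscript are invented for illustration.
import re

def screen(manuscript: dict) -> list:
    problems = []
    if len(manuscript.get("abstract", "").split()) > 300:
        problems.append("abstract exceeds 300 words")
    if not manuscript.get("references"):
        problems.append("no references supplied")
    body = manuscript.get("body", "")
    # Flag citation markers such as [12] that have no matching reference entry.
    cited = {int(n) for n in re.findall(r"\[(\d+)\]", body)}
    missing = cited - set(range(1, len(manuscript.get("references", [])) + 1))
    if missing:
        problems.append(f"citations without reference entries: {sorted(missing)}")
    return problems

paper = {
    "abstract": "A short abstract.",
    "body": "Earlier work [1] and [3] motivates this study.",
    "references": ["Smith 2018"],
}
print(screen(paper))  # -> ['citations without reference entries: [3]']
```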

Summary

The description 'Digital Wildfire' has been applied to what is happening as a result of technological developments within the research process and in the researcher's mind (World Economic Forum 2013, Global Risks Report 8th edition). The consensus is that:
(a) STEM industry trends are obscured behind veils of corporate secrecy
(b) There are concerns that STEM is not fit for purpose in a digital age
(c) There is a lack of leadership from existing stakeholders in creating or experimenting with new paradigms
(d) There is a meritocracy or elitism about the current system, whereas a more open and democratic system would embrace the STEM information needs of a broader market and audience
(e) There is insufficient knowledge of, or awareness of, the needs and habits of knowledge workers
(f) There are new systems appearing (such as blockchain and artificial intelligence) which adapt to opportunities arising from new technology, and create a new information structure and paradigm for STEM

An assessment will be made of one area from which new approaches to STEM could emerge (unaffiliated knowledge workers, or UKWs), followed by an analysis of an existing stakeholder (learned societies) to see how it can adapt to the new paradigm for STEM whilst also meeting the needs of unaffiliated knowledge workers.

5 UNAFFILIATED KNOWLEDGE WORKERS

As indicated earlier, unaffiliated knowledge workers or UKWs are a sector of society which has been excluded from formal scientific activity because of administrative constraints inherited from a traditional print publishing paradigm. Lack of market research into the information behaviour of UKWs has been a constraint facing this project. Several studies have been done but, almost exclusively, within and for the higher education sector. These studies provide what might be a misleading picture of the current scene: a snapshot in time, based on small samples of the overall universe of over seven million researchers worldwide. In addition, they exclude a wider potential audience. A further problem with relying on these surveys is that they describe events during which, at worst, the print paradigm prevailed, and at best a hybrid world of researchers coping with a mix of print and digital. There are few studies which assess the extent of total immersion in a digital world by all types of researchers and across all formats, including social media, datasets, artificial intelligence and cognitive computing services.

Longitudinal user studies would assist in assessing how research behaviour changes over time and how researchers cope with an industry sector in transition. A study commissioned by the Publishing Research Consortium and undertaken by CIBER Research Ltd. involved a three-year longitudinal study of early career researchers' information needs (those under 35 years of age and of at least postgraduate status), and a hundred face-to-face interviews with contacts in several countries. Designing new information services to address change is difficult, subjective, personal and constrained by unrepresentative sampling. The demand side of STEM is determined by many forces, both internal and external, most outside the control of existing STEM agencies. Including such determinants in a quantitative and qualitative way is problematic.

This chapter explores the evidence which exists on user behaviour among a group peripheral to the mainstream academic audiences – in this case the UKWs. The evidence base on usage is even more sparse among the knowledge worker communities than among the research-based academics.

OVERVIEW OF UKWs

The following social groups are identified as areas of unaffiliated knowledge workers:


– Many professionals, outside academia, rely on published STEM research results to maintain professional standards of practice through the adoption of the latest developments. These can be reported in specialised research journals and trade magazines, as well as in other forms of research output. In many instances access to research results is affected by protective pricing levels and layers of copyright. As such, research results are not easily accessible by individual professionals operating in local, often small, private practices.
– There is emphasis – in a society striving to achieve improved lifestyles and economic growth and to overcome financial austerity – on providing support for researchers within small and medium enterprises (SMEs). They are at the forefront in developing innovative products and services. SMEs feed off easy and regular access to information arising from progress at scientific research frontiers. Several SMEs are spawned within university laboratories and subsequently floated off into industry, either in partnership with the university or as private ventures. However, once the umbilical cord with academia has been cut, easy access to relevant STEM publications, print or online, is restricted. The publishers' main targets are larger, wealthier research institutions and corporations. These large corporations may have information and documentation centres, and they are more likely to have sufficient funds to buy subscriptions to required STEM journals. SMEs, on the other hand, are not so fortunate and need to find other ways to keep informed.
– There are many "citizen scientists" or "amateur scientists" who have chosen to pursue careers outside academia and corporate R&D but have retained an interest in the subject of their early academic training, or have developed new scientific interests. Mass collaboration by this group can be seen in global scientific projects such as the Sloan Digital Sky Survey (SDSS) in astronomy. This leads to large data webs being created with participation from thousands of amateur scientists, alongside academics.
– Another professional area is the agrarian, horticultural and related land management industries. They are reliant on science and technology to sustain efficiency and generate higher crop yields. Easy access to high-level research results in genetic engineering, veterinary science, environmental data and biological research is beneficial to this community. Again, these professional sectors have financial barriers to overcome in gaining access to required scientific publications.
– There are lobbyists and charities in the private sector that bolster their missions with evidence drawn from scientific publications, and are pushing for change to limit the onset of global climate change, to eradicate pollution, to reduce plastics, to improve social conditions, to save energy etc. In addition, science writers and journalists also feed on accessible scientific literature which may or may not be easily available to them.
– Consultancies which have clients in science-related industries are also interested in STEM material. The current business model would not be a strong deterrent to buying required material, whereas searching for and finding material in the dark areas of the Internet and social media might. Consultancies can be found in both public and private sectors.
– There are administrators and advisers who are at the fringes of the academic publishing system but nevertheless influence the directions which research takes. There are also policymakers in government and among funding agencies involved with implementing research programmes.
– Voluntary associations are also included as having unfulfilled needs for ease of access to relevant STEM material.
– Even for those operating within the UK higher education system, in universities and research institutes, having access to scientific information is often not easy because of the barriers which operate even within academia. This group would also include alumni and friends of the university. It includes impatient academics unwilling to wait for required texts to be supplied through traditional document supply channels (see Dysfunctionality of STEM).
– Other knowledge workers operate within public libraries, trades unions, Chambers of Commerce and the Confederation of British Industry (CBI).
– There are (information) disenfranchised communities and knowledge workers operating in developing countries. Their approach, involving leapfrogging into specialist information systems, could also affect a global future publishing paradigm for STEM.
– A requirement for access to research information can be found in areas such as engineers working on remote offshore installations where no library facilities exist, but where technical solutions may still be required, culled from past experiences reported in publications.
– Venture capitalists working in financial institutions which are preparing to invest in new science-based businesses may have an occasional need to find out more about a project and its underlying science before committing an investment.
– Individuals retraining or developing new skill sets, distance learners, and those facing geographical challenges in accessing research libraries are also potential users of an open STEM information system.
– Patients who are seeking everything there is to know about the illness from which they or their relatives are suffering – to know as much if not more than their over-stretched general practitioner.


[Figure: main categories of knowledge workers – alumni, City and financial workers, entrepreneurs and innovators, policymakers and research funders, government officials, professionals, researchers in academia, researchers in industry, small and medium-sized enterprises, voluntary workers, citizen scientists, those retraining, charities, agriculture, patients and healthcare workers, developing economies, and the general public.]

Fig. 23: Overview of main areas of knowledge workers. (See Brown, D. 2016. p254)

Although the unaffiliated knowledge worker (UKWs) sectors are large and diffuse, there are three main target groups which are the focus of this analysis:
– The Professions
– Small and Medium Enterprises (SMEs)
– Citizen Scientists or armchair scientists

In addition, the above three are sandwiched between two other groups which have influence over the direction which communication within the STEM research sector will take. These include:
– Academics in academia who also face access barriers
– The General Public, those without advanced level academic qualifications or professional experience in the sciences


Each of these groups may open up new opportunities for a STEM information system in future, thereby helping to reduce the current imbalance between supply and demand for specialist information and services.

KNOWLEDGE WORKERS

It has been estimated that there are about 500 million knowledge workers globally (Microsoft 2010), only 30 million of whom are in academia or corporate research. A more specific estimate gives 50 million knowledge workers in the US alone, only eight million of whom are in academia, the rest being in the private sector (see NSF 2010a; 2010b; Padley 2014). Mabe (CEO of the International STM Association) suggested that some 35–40 million were non-institutional knowledge workers (Mabe 2009). Based on these estimates there could be between 200 million and 500 million knowledge workers worldwide. This is a large latent market for scientific material, which compares with the over seven million actual (affiliated) researchers worldwide identified by Unesco (Unesco 2015).

“Virtually nonexistent only 100 years ago, knowledge workers now make up the largest slice, 40%, of the American workforce” claimed business management guru Drucker over 50 years ago. He further suggested that “Knowledge worker productivity is the biggest of the 21st century management challenges… (it is the) only real competitive advantage in a global economy” (Drucker 1959).

The knowledge worker sector continues to grow. According to Morgan Stanley’s economist Roach, “This is, by far, the most rapidly growing segment of white-collar employment. Over the past seven years… knowledge worker employment growth has averaged 3.5% per annum, sufficient to account for 73% of total white-collar employment growth over this period” (Roach 2007). Forrester Research claimed that American workers spend $404 billion annually, or 11% of all wages, looking for information to do their jobs; giving employees the right tools in a data- and knowledge-driven workplace is imperative (Forrester 2001).
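The arithmetic behind these market-sizing claims is simple. The following is a minimal illustrative sketch in Python, using only the round figures quoted above; the variable names and the idea of expressing the gap as a multiple are this illustration’s own, not taken from any of the cited studies:

# Illustrative arithmetic only, based on the round estimates quoted above
# (Microsoft 2010; Unesco 2015). Variable names are hypothetical.
AFFILIATED_RESEARCHERS = 7_000_000             # affiliated researchers worldwide (Unesco 2015)
GLOBAL_ESTIMATES = (200_000_000, 500_000_000)  # low and high knowledge worker estimates

for total in GLOBAL_ESTIMATES:
    unaffiliated = total - AFFILIATED_RESEARCHERS
    multiple = unaffiliated / AFFILIATED_RESEARCHERS
    print(f"{total // 1_000_000}m knowledge workers: "
          f"{unaffiliated // 1_000_000}m unaffiliated, "
          f"roughly {multiple:.0f} times the affiliated population")

Even at the conservative end of these estimates, the unaffiliated population is more than twenty times the size of the affiliated research community.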

Knowledge Workers in the UK

There are an estimated 32.54 million people in work in the UK (Office for National Statistics 2011), compared with 2.5 million in UK academia (HESA 2010). However, the definition applied to knowledge workers by ONS is broad and ranges from the trades to highly skilled professionals working closely with academics. In a 2009 report to the British Government (the Panel on Fair Access to the Professions) it was estimated that almost half of UK knowledge workers – some 11 million people – were in “the professions”. As this included creative arts and public sector workers, the number in the core professions was about half of that, or 5.5 million.

These estimates differ from data made available by the Department for Business, Innovation and Skills (UKDBIS 2011), which focuses on core R&D activity. UKDBIS concluded that there were 1.8 million R&D workers in the UK in 2009 (see below): “Of these a considerable but uncertain proportion are unaffiliated, without corporate library or information centre support” (Rowlands and Nicholas 2011). This highlights that data on the demographics of knowledge workers and UKWs is statistically weak.

The following table gives the breakdown of the “official” knowledge workers in the UK as derived from the Office for National Statistics (ONS 2011).

Tab. 15: Broad sector knowledge workers (1-digit SOC code). Reworking of Office for National Statistics (Labour Force Survey) data, 2011. The total knowledge worker column includes figures for “Associate professional and technical” workers.

UK knowledge workers by SIC sector | All workers in sector | Managers and senior officials | Professional occupations | Total KW | % KW
Mining | … | … | … | … | …
Manufacturing | … | … | … | … | …
Electricity, gas | … | … | … | … | …
Construction | … | … | … | … | …
Wholesale/retail | … | … | … | … | …
Hotels/restaurants | … | … | … | … | …
Transport, storage | … | … | … | … | …
Financial | … | … | … | … | …
Real estate, renting | … | … | … | … | …
Public & defence | … | … | … | … | …
Education | … | … | … | … | …
Health/social work | … | … | … | … | …
Other community, social and personal | … | … | … | … | …
Total | … | … | … | … | …


Not all the above knowledge workers are at the cutting edge of research, nor do they all require access to the latest STEM developments. But some are – and it is possible that their effectiveness could be enhanced if they could gain the same level of access to relevant research literature as their peers and colleagues in academia. The breakdown of the numbers of research and development professionals by category in the UK can also be seen in data presented by the Department for Business, Innovation and Skills (UKDBIS 2011):

Tab. 16: Numbers of R&D professionals in UK business sectors. Sources: “The sectoral distribution of R&D”, 2009 R&D Scoreboard, UK Department for Business, Innovation and Skills.

Professional sector | Numbers employed (BIS) | Percentage of total professions
Industrial & Engineering professions |  |
IT strategy & planning | … | …
Civil engineers | … | …
Mechanical engineers | … | …
Chemical engineers | … | …
Design & development | … | …
Electronics engineers | … | …
Production & process | … | …
Planning & quality engineers | … | …
Quantity surveyors | … | …
Bioscientists, biochemists | … | …
Pharmaceutical | … | …
Physicists, geologists | … | …
Subtotal | … | …
Service sector |  |
Medical profession | … | …
Dentists | … | …
Opticians | … | …
Software professionals | … | …
Solicitors, lawyers | … | …
Legal profession nec | … | …
Management, business | … | …
Accountants | … | …
Management accountants | … | …
Psychologists | … | …
Social science researchers | … | …
Social workers | … | …
Probation officers | … | …
Public service | … | …
Architects | … | …
Town planners | … | …
Veterinarians | … | …
Subtotal | … | …
TOTAL | … | …

The above table shows the extent of the UK’s service economy – over 70% of R&D professionals are in the service sector, with software and medicine responsible for a quarter of all such professionals. Engineers account for 14%. There is a large spread of professionals in other areas – some 30 identified above, and even this list is not exhaustive. Not all UKWs would be interested in advanced-level STEM material (publications in nuclear physics or bioinformatics, for example), but many could be interested in the output of less specialised and esoteric subjects. As observed by Brienza, “the number of people who might learn from research results is always going to be greater than the number likely to actually seek out what has been written up” (Brienza 2011, 168).


There is no easy means of estimating what proportion of knowledge workers are “affiliated” to a central purchasing scheme for published scientific content. This area requires additional study through targeted, in-depth “niche” sector analyses.

In addition, this is only a small part of the total potential market. It excludes a vast audience of “citizen scientists”, those who have a general interest rather than a career requirement in following scientific developments. They are excluded from ONS’s 11.1 million UK knowledge workers, not included in UKDBIS’s 1.8 million R&D professionals, nor counted among the 2.5 million in academia (HESA 2014).

From the above, it is evident that knowledge workers represent a diffuse social grouping. More demographic investigation is required to map the extent of this diversity. All that is being claimed here is that there is an unquantified market available for STEM. Nevertheless, there may be a variety of untapped markets which a restructured STEM process could serve. The following analysis of a few individual UKW areas points to the need for greater sophistication in assessing the potential for STEM within the knowledge worker sectors.

PROFESSIONALS

What is a Profession?

There is no accepted brief definition of a “profession”. Several groups stand out, however, including the professions served by learned societies – lawyers, doctors, accountants, librarians. But there are many which are less obvious – management consultants, local authority workers, journalists. In 2009, a report for the UK Government identified 130 different professional sectors (Panel on Fair Access to the Professions, Unleashing Aspiration, 2009).

A profession can be defined as a collection of individuals who share a “formal education” requirement (Wilson 1989). However, to distinguish professionals from academics, the former have a practical knowledge base as well as a theoretical one. Also, a professional is someone who receives important occupational rewards from a reference group whose membership has undergone similar specialised formal education and training and accepts a group-defined code of conduct and practice. The main features of a profession are therefore:


Main Features of Professions
– There are conditions of entry
– There are rigorous standards
– Members operate within specific and unique rules of conduct
– The profession is either self- or statutory-regulated
– The profession has accountability
– The profession offers training and support
– There is a knowledge base (which often has roots in formal higher education/research)
– It has a distinctive and identifiable social mission (Wilson 1989)

Professionals perform specialised, unique and scarce services. They also pay as much attention in their work to the judgement of their peers (other professionals) as to their customers (Shirky 2008). There is an acceptance by each professional to abide by a set of standards, procedures and approaches to ensure that the profession’s reputation is not tarnished or compromised.

Those professions having roots in science subjects and higher education are comparable with academic researchers in their understanding of scientific concepts, and as such are on a level playing field in understanding the content of STEM publications. Their problem is that there is no level playing field for ease of access to the required published papers.

There are several features distinguishing professionals from academic-based researchers:

Differences Between Professions and Academics
– They differ in their respective responses to peer pressure
– They have different funding drivers
– They rely on precedent created by practical experience
– They differ in their success criteria
– They do not seek global recognition to the same extent
– Their main allegiance is to their professional association
– They operate outside the closed (elitist) system of scientific research

A profession exists to solve a problem, one that requires scarce or unique knowledge and expertise or experience in reaching a solution. However, by trading on such scarcity, professions have been criticised by sociologists for acting not so much as benevolent custodians of their knowledge and expertise as jealous guardians of their specific knowledge base, to the detriment of society in some instances.

Indicative List of Current Professions

The following is a partial, indicative list of professions which operate in the UK: accountants, actuaries, advocates, architects, archivists, audiologists, dentists, diplomats, economists, engineers, financial analysts, ICT professionals, journalists, lawyers, medical doctors, military officers, neuroscientists, nurses, occupational therapists, optometrists, pharmacists, philosophers, physicians, pilots (airline), professors, psychologists, scientists, social workers, software engineers, speech and language pathologists, statisticians, surgeons, teachers, translators/interpreters and veterinarians.

The above list is incomplete – there are more professions and sub-professions, and many more emerging as digital society advances and creates new services. This is particularly noticeable in the ICT, financial services and business sectors.

Challenges Facing the Professions

A sociological challenge faces the status of the “professional” or expert within society. In the past, the public deferred to authorities which had earned influence and acceptability through their skills, knowledge and expertise. However, things have changed, and such authority has been eroded. New information and communication technology has given voice and self-confidence to people who hitherto deferred to authority and experts. The erosion has arisen as individuals make use of the information and data available through the Internet. In finding something which is “good enough”, individuals, including UKWs, find sources which are relevant to their needs and prejudices (Nichols 2017). They operate a filtering system which often favours items supporting their previous convictions.

What is unclear is whether a professional or expert is an individual who puts service to society above their own riches and profits. There is a suggestion that the “grand bargain” between society and the traditional professions, in which society grants professions control over their own affairs, is breaking down, and that profits are increasingly trumping customer focus or needs (Susskind 2015). This means that the highly structured and inflexible professions as we know them – lawyers, medical doctors, accountants – will be supplemented or overtaken by a new set of proxy-professions in future. These rely on different skill sets, involving both automation and innovation, to take over the professions’ more mundane and repetitive tasks. In effect, they will create bypass strategies around parts of the profession’s code of practice, and by doing so break the profession’s traditional stranglehold. As such, they can support the movement towards more openness and democracy for information within society. Increasing numbers of IT-literate sub-professions will emerge with differing, less specialist and less exclusive approaches to career goals. The core of professional skills will remain, but many aspects of their functions will devolve to others, including online support services. However, these new para-professionals, as well as their predecessors, the traditional elitist professionals, still need ease of access to relevant research output.

The problem facing professions is that they seek to protect themselves by focusing on their skill set without considering new ways of doing similar work arising from digital and Internet developments. An analysis which explores the potential redundancy of some professionals was included in the book The Future of the Professions: How Technology Will Transform the Work of Human Experts (Susskind 2015). The authors describe how “increasingly capable systems” – from tele-presence to artificial intelligence – will fundamentally change the way the “practical expertise” of specialists is made available within society. The book predicts the decline of the protected infrastructure surrounding today’s professions. In an Internet society, according to Richard Susskind and Daniel Susskind, we will neither need nor want doctors, teachers, accountants, architects, the clergy, consultants, lawyers and many others to work as they did in the twentieth century.

Three reasons are given for a reduction in employment within the professions (and the rise of para-professions). The first is that computers will continue eroding the advantage professionals currently have in performing certain tasks. Secondly, new latent demand will be accommodated within the capabilities of machines. Thirdly, whilst machines cannot yet take on moral responsibilities, there is still much which can be devolved to them. Together these mean that the remaining core professional functionality will be insufficient to keep professional employment at today’s scale (Susskind 2015, 291–2). The authors claim that current professions are antiquated, opaque and no longer affordable, and that the expertise of the best professionals benefits only a few members of the general public. In this new era, when machines can out-perform human beings at many tasks, new occupations will arise.
– In education, the “sage on the stage” method of teaching is being complemented, or replaced, by companies providing “adaptive” or “personalised” online learning, which uses computers to assess the needs of the individual student and provide them with “intelligent learning systems”. A tailored approach to instruction is offered, as opposed to reliance on the traditional teacher/student interaction, which is more general and systemic. MOOCs (Massive Open Online Courses) have opened a new approach to education from centres such as MIT, Harvard, Stanford and the Open University.
– In healthcare, the provision of personal health records from cradle to grave, matched against indicators of potential illnesses, could provide a more personal service than the delayed advice and consultancy currently available from the GP. Many doctors in future may find themselves taking on the role of human sensors, collecting information for a decision-making computer to analyse and prescribe.
– The legal profession, for long the archetype of a protective profession, has become more “open” in the UK in recent years as the monopoly over offering legal services has been liberalised, which has benefited paralegal and non-legal agencies.
– Journalists have seen the emergence of online-only newspapers, sourced in part by individuals at the site of a newsworthy event rather than professional journalists in absentia. Printed daily newspapers have been in decline, and usage of online news access in the UK has risen from 20% to 55% in seven years (Susskind 2015): “Professional journalists have come to depend unhealthily on Facebook and Google” (Foer 2017). As informal blogs and tweets become de rigueur, Silicon Valley values have come to dominate, and editorial increasingly resembles advertising.

Methods of charging for professional services – the business model – are also changing. Many professions are moving away from an hourly charging rate to fixed fees for outputs. This is in response to criticisms over the high hourly costs charged for traditional professional services.

The conclusion – that the established professions will be disrupted – is based on in-depth research by Susskind across more than ten professions. Professions as they currently exist may be subject to change, but the main functions they perform will in most cases continue, albeit in different guises. Though professional regimes may change, Susskind believes future roles will include:


New Emerging Professions
– Craftspeople
– R&D workers
– Designers
– Assistants
– Knowledge engineers
– Systems providers
– Para-professionals
– Process analysts
– Data scientists
– Empathisers
– Moderators
– Systems engineers

The above list of emerging professional groups is again not comprehensive: “Even highly trained analysts and other so-called knowledge workers are seeing their work circumscribed by decision support systems that turn the making of judgements into a data-processing routine” (Carr 2016, p17).

There are indications within some established professions that changes in the way they control, collect and assimilate STEM information are taking place. Whilst this suggests growing liberalisation for professions – a reduction in the control which the profession exerts over the services it provides – such splintering of the professional service package still needs control procedures to be built into the operations of these new para-professional groups. Professional organisations are therefore going through the same painful adjustment to the digital world that the STEM publishing industry is facing. Even though there are few claims that the professions are “dysfunctional” – the claim made against STEM information services – society is nonetheless moving on, driven by technological enhancements in new directions. Researchers and the professions need to work on common problems, such as access to STEM information. There is a commonality in the strategic needs to be resolved for both academics and the professions.

Summary of Professionals

Information about professionals’ needs for, and use of, STEM information is lacking in the public domain. A recommendation from this study is that stakeholders be made aware of the way STEM information is generated in the academic world, and how it can be transmitted to and within individual professional sectors so that the professions can benefit. The trend appears to be a breakdown of the barriers protecting key professions, and an opening up to more democratic participation by new para-professionals who are versed in the application of new information technology. Better information is needed about how such trends will develop, to inform what package of information services would be appropriate for each target group and what overall information infrastructure would be required. This would cover an analysis of the challenges facing the professions as society itself adapts to the new millennium, as well as the environmental changes which affect the attitudes of traditional society members, and the variety of procedures used by different disciplines. It is a moving picture, different from traditional STEM. The role of learned societies, as protective agencies for some professions, is important in this respect, and this issue is explored later in this book.

SMALL AND MEDIUM ENTERPRISES – SMEs

Defined as private or public organisations with fewer than 250 employees (Ware 2009b), SMEs offer significant market potential in a developed or developing economy (ONS 2014). They are often the source of the innovation which determines the direction society and the economy take. They often include society’s pioneers, innovators and entrepreneurs. Many entrepreneurs base their operations on the latest scientific developments; the initial inspiration may have come from research which took place in academia, research institutes or large corporations. In this respect, the needs of SMEs are like those of academics. Where they differ is in not having easy access to the latest STEM developments as reported in the scientific media.

Professional Versus Commercial Ethic

SMEs are constrained from openly revealing the results of their own research for competitive reasons. They operate under a commercially focused, profit-generating motive. As reported by Kornhauser, there is often conflict between the professional ethic of the researcher and organisational goals (Kornhauser 1962). The researcher within academia seeks solutions which are usually open for universal benefit; within a corporation, however, this drive is transformed into benefits which meet the company’s commercial requirements. There are also different hierarchical structures between academia and industry – the reliance on organisational authority as opposed to technical expertise and professional autonomy is an area of tension. The incentive systems are different – personal recognition for results achieved (the scientific or professional ethic) versus financial rewards (the commercial ethic). Organisations attach more value to the marketing, commercial and legal issues within which the company operates, whereas curiosity motivates academic researchers. Whilst such differences may be greater in large corporations, the tension between professional and organisational ethics resonates even within SMEs.


Organisational Size

Though 250 employees (or a turnover of up to £36 million) is used as the upper limit for an SME, a more realistic breakpoint would be the small, energetic and entrepreneurial companies with fewer than 50 staff (defined as “small” by the EC in Recommendation 2003/361). Those with 50–249 employees are classed as medium-sized enterprises. Other statistical compilations include a category of “micro businesses”, those with fewer than 10 employees (European Commission 2003).

Tab. 17: Sizes of SMEs as defined by the European Commission.

Company category | Employees
Medium-sized | fewer than 250
Small-sized | fewer than 50
Micro-sized | fewer than 10
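Expressed programmatically, the headcount bands described above give a simple classification rule. The following is an illustrative sketch only, using headcount alone; the full EC definition also applies turnover and balance-sheet ceilings, which are omitted here:

# Minimal sketch of the employee-count bands described above
# (European Commission Recommendation 2003/361). Headcount only;
# the turnover and balance-sheet tests are not applied.
def ec_size_band(employees: int) -> str:
    if employees < 10:
        return "micro"
    if employees < 50:
        return "small"
    if employees < 250:
        return "medium-sized"
    return "large (outside the SME definition)"

for headcount in (4, 35, 180, 600):
    print(f"{headcount} employees -> {ec_size_band(headcount)}")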