Annals of Theoretical Psychology 19
Craig W. Gruber Benjamin Trachik Editors
Fostering Innovation in the Intelligence Community Scientifically-Informed Solutions to Combat a Dynamic Threat Environment
Annals of Theoretical Psychology, Volume 19

Series Editors: Craig W. Gruber, Department of Psychology, American University, Washington, DC, USA; Jaan Valsiner, Niels Bohr Centre of Cultural Psychology, Aalborg University, Aalborg, Denmark; Matthew G. Clark, Northeastern University, Boston, MA, USA; Sven Hroar Klempe, Department of Psychology, Norwegian University of Science and Technology, Trondheim, Norway
The Annals of Theoretical Psychology is devoted to understanding theoretical developments and advances in psychological theory. This series is designed to further the dialogue on theoretical issues in the field of psychology and to unify the discipline through a theoretical synthesis of ideas on key issues of debate. Core themes of the Annals vary from one volume to another, moving beyond a focus on one particular aspect or approach to theory. Each book consists of invited and submitted papers and commentaries that explore a facet of innovative theory in psychology. Of particular interest is moving the discussion and exploration of theory into application for use in research, practice and teaching, taking into account the globalized nature of contemporary psychology. The enduring objective of the Annals of Theoretical Psychology is the exploration of key concepts that require further inquiry, dialogue, and theoretical integration within psychology and related fields.
Editors Craig W. Gruber Department of Psychology Decision Sciences Laboratory American University Washington, DC, USA
Benjamin Trachik U.S. Army Medical Research Directorate-West Walter Reed Army Institute of Research Joint Base Lewis-McChord, WA, USA
Consent for Publication: The authors consent to publish these data. Material has been reviewed by the Walter Reed Army Institute of Research. There is no objection to its presentation and/or publication. The opinions or assertions contained herein are the private views of the authors and are not to be construed as official, or as reflecting the views of the Department of the Army or the Department of Defense. The investigators have adhered to the policies for protection of human subjects as prescribed in AR 70-25.

ISSN 0747-5241    ISSN 2512-2207 (electronic)
Annals of Theoretical Psychology
ISBN 978-3-031-29806-6    ISBN 978-3-031-29807-3 (eBook)
https://doi.org/10.1007/978-3-031-29807-3

This is a U.S. government work and not under copyright protection in the U.S.; foreign copyright protection may apply © 2023, Corrected Publication 2023

All rights are reserved by the Publisher, whether the whole or part of the material is concerned, specifically the rights of translation, reprinting, reuse of illustrations, recitation, broadcasting, reproduction on microfilms or in any other physical way, and transmission or information storage and retrieval, electronic adaptation, computer software, or by similar or dissimilar methodology now known or hereafter developed. The use of general descriptive names, registered names, trademarks, service marks, etc. in this publication does not imply, even in the absence of a specific statement, that such names are exempt from the relevant protective laws and regulations and therefore free for general use. The publisher, the authors, and the editors are safe to assume that the advice and information in this book are believed to be true and accurate at the date of publication. Neither the publisher nor the authors or the editors give a warranty, expressed or implied, with respect to the material contained herein or for any errors or omissions that may have been made.
The publisher remains neutral with regard to jurisdictional claims in published maps and institutional affiliations. This Springer imprint is published by the registered company Springer Nature Switzerland AG The registered company address is: Gewerbestrasse 11, 6330 Cham, Switzerland
For the brave members of the intelligence community who will move forward doing the hardest thing there is: engaging and implementing change. This book is also dedicated to the memory of our friend and colleague Sean Sheppard, Ph.D., whose resilience, bravery, and dedication serve as an example of the impact social scientists can have on large organizations. He serves as a model for all present and future DOD personnel.

Thanks

The editors wish to thank our co-authors, chapter authors, and lab members, who have been an essential part of this project. A special thanks to Lila Silverstein, our Assistant Editor, who was essential in ensuring the project ran on time and to completion. Although an undergraduate student, she shone like a true star throughout this project. The editors also thank our spouses, Heather and Shelby, and children Davis, Elizabeth, Alexandra, and Weston. You are all as much a part of seeing this through as anyone.
Innovation, Culture, and the Idiographic Nature of Change
What my co-authors, co-editors, and I seek to do in this volume is to demonstrate not only that Ubiquitous Technical Surveillance (UTS) is an issue that truly permeates today's society, but also that it necessitates a systematic approach to the challenges and opportunities associated with minimizing the potential (and actual) leaks of personal and governmental information. Specifically, we refer to the Intelligence Community (IC) as well as government writ large. While our approach focuses on systematic reforms within the US Government and IC, the lessons, challenges, and opportunities present themselves across all democracies. As research has demonstrated, innovation is perceived as a challenge (Standing et al., 2016; Temes, 2023; Institute for Innovation in Large Organizations, 2022). We contend that the true nature of the challenge is not innovation in and of itself, but rather how innovation is presented and perceived by the individual. Innovation sparks change, and as shown by Thelen and colleagues (Thelen & Smith, 1994; Thelen, 2000; Thelen & Bates, 2003; Thelen & Ulrich, 1991; Thelen et al., 2001), change is a dynamic process. Change is dynamic in part because of the nature of people, specifically individuals, and the organizational systems they inhabit. As a result, there can be no "one-size-fits-all" approach to change and change implementation. As change agents and leaders, we need to ensure that the change we propose and lead accounts for the nature of the individual (Clark & Gruber, 2017). Change can be perceived as threatening to individuals or to the general status quo, and we need to model that the change we are advocating is not a threat to the personal constructs of an individual, nor irritating to others (who may have thought that change is coming too late).
Leading for change is a challenge, and by utilizing best practices and the holistic leadership styles needed to incorporate many people, change implementation can be successful (Clark & Gruber, 2017; Trachik et al., 2021). Additionally, for innovation and change to be successful, one needs to model that the change is not threatening. When people see things happening, and behavior modeled for them, they learn about the behavior and the reactions and consequences of that behavior (Bandura, 2004). By seeing innovation and change successfully modeled, individuals can see how it will impact them, in both a positive and
negative light. Perhaps the biggest challenge that a large bureaucracy faces is change. In fact, bureaucracies are designed to be resistant to change (Burton, 1993). One of the reasons for people's resistance to change is that change requires thinking. Change is a cognitive process, and cognitive processes require attention and time; tasks such as grammar, math, or other computational work all demand cognitive processing. When we talk about being overwhelmed, that is a form of maximizing our perceived cognitive load. When individuals step away, or "take a deep breath", that can lessen the cognitive load and/or stress associated with the activity. The challenge in change is how to present it so that it is not perceived as overly taxing on cognitive capacity. To help people digest change, we need to ensure that the change does not cause increased cognitive load. Selye (1976) refers to this cognitive load as stress. When stress enables people to engage in problem solving and work through problems, it is referred to as eustress: stress [cognitive load] which enables individuals to work through a problem and reach a solution. The opposite side of that equation is distress, which blocks and prevents individuals from gaining a solution framework; here stress becomes an immobilizer. When my co-authors and I talk about innovation and change in the Intelligence Community, specifically around habits involving UTS, we need the stress, or cognitive load, to be appropriate for each individual. In essence, we speak of the need to tailor the modality of change to match the needs of each user. For example, linear thinkers may need daily refreshers tied to each computer log-in to remind and refresh them on the habits necessary to minimize spillage and increase Operational Security (OPSEC) surrounding their digital life.
For divergent and parallel thinkers, a constant or even consistent refresher tied to each computer log-in would be frustrating and, as one individual put it, "It's like teaching a pig to dance. It wastes your time and annoys the pig." By tailoring the refresher and reminder about being careful in a UTS world, we will need to develop and implement solutions which are as unique as each individual: a truly idiographic approach to ensure that each person's needs are met. We meet them where they are, and teach and lead them where they need to go. This process is strikingly parallel to differentiated instruction (Saphier & Gower, 1994). Organizational reform can also minimize the cognitive demands required to enact and cope with change. By approaching innovation from an organizational strategy and leadership perspective, the conditions for innovation can be fostered rather than fixating on specific demands of personnel. What educators, good leaders, and positive change agents have in common is an understanding that it is essential to effectively and appropriately communicate the necessity, rationale, and outcome of change (Clark & Gruber, 2017). In the UTS environment, this is even more important; in the context of the IC, it is essential. To this end, we have included chapters on fostering innovation in both private and public organizations, leadership and team strategy, lessons from other research domains (e.g., artificial intelligence and sport), and, lastly, chapters with real-world examples of the execution of the aforementioned strategies.
We hope that this sets the stage for action in mitigating the issues that UTS presents for the IC and the "Whole-of-Government".

Craig W. Gruber
American University
Washington, DC, USA
Works Cited

Bandura, A. (2004). Swimming against the mainstream: The early years from chilly tributary to transformative mainstream. Behavior Research and Therapy, 613–630.
Burton, J. G. (1993). The Pentagon wars: Reformers challenge the old guard. Naval Institute Press.
Clark, M., & Gruber, C. (2017). Leader development deconstructed. Springer.
Grimsley, S. (2017). Systems approach to management: Theory, lesson, & quiz. Retrieved from study.com
Gruber, C. W. (2008, September 26). New approaches for teaching personality: Humanistic cognitive behaviourism: A new theoretical framework for development and personality. 4th International Conference on Research and Access on Developmental Education. San Juan, Puerto Rico.
Institute for Innovation in Large Organizations. (2022, December 15). ILO resources. Retrieved from ILO Institute: https://www.iloinstitute.net/resources/
Saphier, J., & Gower, R. (1994). The skillful teacher. Research for Better Teaching.
Selye, H. (1976). The stress concept. Canadian Medical Association Journal, 115(8), 718.
Standing, C., Jackson, D., Larsen, A. C., Suseno, Y., Fulford, R., & Gengatharen, D. (2016). Enhancing individual innovation in organisations: A review of the literature. International Journal of Innovation and Learning, 19(1), 44–62.
Thelen, E. (2000). Grounded in the world: Developmental origins of the embodied mind. Infancy, 3–28.
Thelen, E., & Smith, L. B. (1994). A dynamic systems approach to the development of cognition and action. Bradford/MIT Press.
Thelen, E., & Bates, E. (2003). Connectionism and dynamic systems: Are they really different? Developmental Science, 378–391.
Thelen, E., & Ulrich, B. (1991). Hidden skills: A dynamic systems analysis of treadmill stepping during the first year. Child Development, 104.
Thelen, E., Schöner, G., Scheier, C., & Smith, L. (2001). The dynamics of embodiment: A field theory of infant perseverative reaching. Behavioral and Brain Sciences, 34–86.
Trachik, B., Oakey-Frost, N., Ganulin, M. L., Adler, A. B., Dretsch, M. N., Cabrera, O. A., & Tucker, R. P. (2021). Military suicide prevention: The importance of leadership behaviors as an upstream suicide prevention target. Suicide and Life-Threatening Behavior, 51(2), 316–324.
Van Geert, P. (2000). The dynamics of general developmental mechanisms: From Piaget and Vygotsky to dynamic systems models. Current Directions in Psychological Science, 64–68.
Van Geert, P., & Steenbeck, H. (2005). Explaining after by before: Basic aspects of a dynamic systems approach to the study of development. Developmental Review, 408–442.
Contents

1  Ubiquitous Technical Surveillance: A Ubiquitous Intelligence Community Issue  1
   Craig W. Gruber, Benjamin Trachik, Catherine Kirby, Sara Dalpe, Lila Silverstein, Siobhan Frey, and Brendon W. Bluestein

2  Innovation in the National Security Arena  19
   Brendon W. Bluestein, Matthew A. Templeman, and Craig W. Gruber

3  Addressing Emerging Threats to the Intelligence Community Through Scientifically Informed Team and Leadership Strategy: The Case for the Integration of Research and Program Development  37
   Benjamin Trachik

4  Learning and Reasoning with AI: Restructuring Intelligence Organizations Around Innovation  57
   Bruce Goldfeder, Justin Davis, Mitchell Pillarick, and Ramesh Menon

5  Cross-Disciplinary Innovation Within the Intelligence Community: Evidence from Research on Sport and Military Expertise  81
   Bradley Fawver, Brady S. DeCouto, Benjamin Trachik, Michael Dretsch, and A. Mark Williams

6  The Transnational Threat of Radicalization Through the Use of Online Gaming Platforms  113
   Sujeeta Bhatt and Janna Mantua

7  A Report from the Field: How Large Groups Are Collaborating in New Ways, Driven by New Technologies  133
   Peter Temes

8  Innovation in Large Organizations: Structuring Organizations to Combat Emerging Threats: Developing a Theoretical Model  141
   Catherine Kirby

9  Innovation in Large Organizations: Structuring Organizations to Combat Emerging Threats: Exploration Through Case Studies  157
   Catherine Kirby

10  Resistance to Change and Cognitive Bias in Organizational Change: An Application to the Irish Garda Police Force  177
    Siobhan Frey and Craig W. Gruber

Correction to: Fostering Innovation in the Intelligence Community  C1
Craig W. Gruber and Benjamin Trachik

Index  197
The original version of this book was revised. A correction is available at https://doi.org/10.1007/978-3-031-29807-3_11
About the Editors
Craig W. Gruber, Ph.D., is on the faculty of American University, where he is Research Associate Professor and Director of the Decision Sciences Laboratory. Previously he served as the Associate Vice-President for Innovation Campus Programs at Northeastern University. He has written on courage and presented nationally and internationally on security, intelligence, and forensic science. He developed Northeastern University's Master's degree programs in Homeland Security and Strategic Intelligence and Analysis and is Director of Operations for Image Insight Inc. Dr. Gruber is on the Board of Visitors of The Hill School, and he also serves on the board of the International Association for Intelligence Education, where he is the Treasurer. He is the Editor-in-Chief of Annals of Theoretical Psychology and has written introductory psychology textbooks and numerous articles.

Benjamin Trachik, Ph.D., currently serves as the Chief of Clinical Psychology Research for the U.S. Army Medical Research Directorate-West, a forward directorate of the Walter Reed Army Institute of Research (WRAIR). Prior to enlisting in the Army, Dr. Trachik worked as a licensed clinical psychologist for the Department of Veterans Affairs, where he received several certifications in empirically supported treatments for PTSD and engaged in related research efforts. As a principal investigator, he has received millions of dollars in research funding to execute research studies designed to optimize the cognitive, emotional, and psychological functioning of Soldiers. These studies are in the areas of suicide prevention, leadership strategy, and organizational reform, with the goal of improving Soldier psychological health and wellbeing. Dr. Trachik maintains a robust publication record in the areas of suicide, PTSD, and leadership and engages in active collaborations with government organizations, universities, and private industry. Dr. Trachik also works closely with the Army Resilience Directorate, developing and evaluating tangible products for Army-wide training and translating research into actionable strategies to improve Soldier wellbeing. Dr. Trachik currently serves as an ad hoc reviewer for multiple scientific journals and as a subject matter expert on several WRAIR research initiatives, military working groups, and unit-directed consultations.
Assistant Editor
Lila Silverstein is an undergraduate student at American University in Washington, DC, and a member of the Decision Sciences Lab. Lila is currently working towards two Bachelor of Arts degrees, one in Psychology and one in Political Science, and will graduate with both in May 2024. She joined the Decision Sciences Lab after taking Social Psychology and Research Methods in Psychology courses with Dr. Craig Gruber, an Editor of Annals of Theoretical Psychology and Director of the lab. Lila has done research projects through American University on courage and wealth inequality. Additionally, Lila was a Research Assistant for Thrive DC, a local homeless shelter, where she researched the effects of COVID-19 on DC's unhoused population. Lila hopes to continue social psychology research, focused on COVID-19's effect on international supply chains.
Contributors
Sujeeta Bhatt, Ph.D., is a Research Staff Member at the Institute for Defense Analyses (IDA). She has research, research management, and publication expertise in behavioral science (neuroscience, psychology, and cognition), credibility assessment, and evidence-based interrogation approaches, translating technical research for diverse audiences across Department of Defense (DoD), Intelligence Community, and Federal Law Enforcement settings. She served as Senior Program Officer with the National Academies of Sciences, Engineering, and Medicine, where she led two large cognition-focused consensus studies. Prior to that, Dr. Bhatt was a Research Scientist at the Defense Intelligence Agency (DIA) detailed to the Federal Bureau of Investigation's High-Value Detainee Interrogation Group (HIG), and before that she was an Assistant Professor in the Department of Radiology at the Georgetown University Medical Center on detail to DIA/HIG.

Brendon W. Bluestein, Ph.D., has served as a Psychologist in the US Army for over 23 years across conventional, aviation, intelligence, and special operations units in support of training, operations, and assessment programs. He is published in the fields of Operational Psychology, Psychoneuroendocrinology, Neuropsychology, Domestic Violence, Assessment Psychology, and Pain Management. Bluestein has been married for 26 years, has four children, and resides in the National Capital Region.

Sara Dalpe, B.S., is a graduate student in American University's MA in Psychology program. Sara obtained her Bachelor's degree from the University of Virginia, where she double-majored in Psychology and Spanish. After graduating from UVA, Sara worked at UVA's Institute of Law, Psychiatry, and Public Policy as a lab manager for a research project exploring the media strategies through which extremist groups radicalize women. She also assisted in preparing expert witness testimonies for high-profile sexual abuse litigation. Before coming to American University, Sara obtained clinical experience working in an in-patient behavioral hospital for the treatment of eating disorders, and then providing in-home applied behavior analysis therapy for children diagnosed with autism spectrum disorders. At American University, Sara works in the Decision Sciences Lab. Additionally, Sara works on a
study with researchers from the University of California at Berkeley on the effects of the COVID-19 pandemic and the 2020 US Presidential election on individuals with intersectional identities. Sara's graduate thesis explores how the COVID-19 pandemic has interfered with undergraduate freshmen's adaptation to college.

Justin Davis, M.A./M.S., serves as a Strategic Competition Lead for the Defense Intelligence Agency (DIA) Chief Technology Officer, with a focus on Artificial Intelligence, Machine Learning, Quantum Technologies, and the intersections of cybersecurity. Major Davis has served in a variety of active-duty enlisted positions in satellite communications and aviation engineering. He served as a Crew Commander at the Cheyenne Mountain Systems Center, defending and maintaining the US Missile Warning and Space Surveillance communications systems supporting North American Aerospace Defense Command, United States Northern Command, and United States Strategic Command (USSTRATCOM). In the Air Force Reserves, Major Davis served as Deputy Chief of Cyber Fires and Future Capabilities, Headquarters Air Intelligence, Surveillance, and Reconnaissance at the Pentagon; as Deputy Chief of Fires and Effects at United States Indo-Pacific Command; and as Chief of Targeting at United States Cyber Command (USCYBERCOM). He most recently served as Technical Director to the current operations division at USCYBERCOM, overseeing essential technology including joint command and control systems which leverage advanced analytics and machine learning. Major Davis is a distinguished graduate of two Air Force career training programs. He also has extensive cybersecurity experience working in private-sector finance, transportation, consulting, and technology organizations.
He holds a Master of Science in Business Analytics from Notre Dame, a Master's in Public Administration from Arkansas State University, a Bachelor of Science in Psychology, and associate degrees in aviation operations and electronic systems technology. He is also a certified private pilot and holds industry certifications in cybersecurity.

Brady S. DeCouto, Ph.D., received Bachelor of Science and Master of Science degrees in Kinesiology from Jacksonville University, and a Ph.D. in Kinesiology with an emphasis on Cognitive and Motor Neuroscience from the University of Utah. His experience has provided him with expertise on a diverse array of topics spanning human cognition and movement science, including visual attention, skill learning, biomechanics, stress coping, sociocultural factors, sport psychology, and sensory integration. Dr. DeCouto is interested in optimizing human performance through studying developmental factors, psychological factors, and perceptual-cognitive skills associated with elite performance. He has conducted research investigating visual and neural correlates of expert performance, sensory integration under anxiety, global and local attention in elite athletes, and sociocultural influences on athlete health and performance. He has worked closely with US Ski and Snowboard, leading papers on talent identification in youth alpine ski racers. In 2022, he joined the Florida Institute for Human and Machine Cognition (IHMC) as a Postdoctoral Fellow and Senior Research Associate. At IHMC, Dr. DeCouto's research endeavors have been targeted at understanding how human cognition fits
into future technological initiatives. Consequently, he has pursued work with exoskeleton skill learning, stressful decision-making with artificial intelligence, simulation combat training, neurostimulation, and long-term stress monitoring. He also currently assists with data analyses for US soccer talent identification, continuing to pursue his interest in the influence of developmental factors on high-level sport participation and performance.

Michael Dretsch, Ph.D., is the Director of the US Army Medical Research Directorate-West of the Walter Reed Army Institute of Research. He has a Ph.D. in Experimental Psychology from the University of Hull, England, UK, and completed a Post-Doctoral Fellowship in Cognitive Neuroscience/Addiction Research at the University of Wisconsin-Madison. Dr. Dretsch deployed to Iraq in 2009 as a member of a special team for the Office of the Surgeon General to assess the psychometrics of neuropsychological instruments for screening mild traumatic brain injury. His myriad contributions to the Army include regular participation and consultation in numerous working groups, committees, and advisory panels in the DoD, academia, and NATO, as a subject matter expert in cognitive neuroscience, Soldier performance, and combat-related neurologic and psychologic injury (i.e., traumatic brain injury and posttraumatic stress). His research has resulted in four Magna Cum Laude Merit Awards from 2015 to 2017 at various conferences in the field of neuroimaging.

Bradley Fawver, Ph.D., earned a Bachelor of Science degree in Psychology from Clemson University and a Ph.D. in Health and Human Performance from the University of Florida, and completed a Postdoctoral Fellowship in Cognitive and Motor Neuroscience at the University of Utah. His interdisciplinary expertise forms a bridge between the domains of sport and performance psychology, skill acquisition, biomechanics, and rehabilitation science. Dr. Fawver's research interests include studying how psychological factors (e.g., emotion, attention, perceptual-cognitive biases, coping/regulation skills) influence skill acquisition and motor performance under challenging or stressful conditions, as well as the development of expertise broadly speaking within military, sport, and clinical domains. In 2020, Dr. Fawver joined the US Army Medical Research Directorate-West (MRD-W) as a Research Psychologist. At MRD-W, Dr. Fawver has been the principal investigator on various research projects aimed at improving Soldier resilience, neurocognitive function, and operational performance under stress. To date, he has authored over 80 scholarly works, including peer-reviewed manuscripts, book chapters, and conference proceedings. He also serves as an ad hoc reviewer for several scientific journals and maintains active mentorship of graduate students.

Siobhan Frey, B.A., is a current student at American University in Washington, DC, and a member of the Decision Sciences Lab. Siobhan is currently working towards her Bachelor of Arts degree in Psychology with a minor in French and a Master's in Special Education. Siobhan will graduate with her B.A. in December 2022 and her M.A. in May 2024. Siobhan has been working in the Decision Sciences Lab
at American University since her third year. Siobhan decided to further pursue social psychology research after taking a Research Methods in Psychology course with Dr. Craig Gruber, an Editor of this volume. Siobhan is a member of Psi Chi, the international honor society in psychology; she was inducted into the American University chapter of Psi Chi in April 2021 for her high psychology and cumulative GPAs. During the spring of 2022, she studied at University College Dublin, where she learned about Irish history and the history of the police forces in both the Republic of Ireland and Northern Ireland.

Bruce Goldfeder, M.A., is the Senior Engineering and Technical Advisor to the Defense Intelligence Agency (DIA) Chief Technology Officer (CTO). As the Senior Engineering and Technical Advisor, he provides subject matter expertise in the field of artificial intelligence and machine learning (AI/ML). Mr. Goldfeder serves on the DIA CTO Emerging Technology team, providing AI/ML prototyping, technology augmentation for ongoing AI/ML programs, and future technology solutions advancing DIA mission needs. Mr. Goldfeder received his Bachelor's degree in Electrical Engineering from Lehigh University and his Master's degree in Computer Science from The George Washington University, and is currently a Ph.D. candidate in the George Mason University Computational Sciences and Informatics program. Mr. Goldfeder, a former Air Force Officer, has worked for over 25 years as a contractor supporting multiple Department of Defense and Intelligence Community agencies, and the Defense Advanced Research Projects Agency.

Catherine Kirby, M.A., graduated cum laude from Rice University, where she majored in Political Science. Catherine obtained her Master's degree from the George Washington University in Security Policy Studies. Simultaneously, she worked for Capital One as a Business Analyst in anti-money laundering and fraud.
She wrote multivariate algorithms to detect suspicious and criminal activity and built alignment across multiple operational groups to develop and implement Capital One’s first-ever financial exploitation of vulnerable adults (FEVA) model. She also investigated opportunities for communication infrastructure change to increase the detection of fraud across the enterprise’s products. As a credit card fraud analyst, she managed the gradient boosting models and fraud rules engine. She implemented a short-term fraud model and developed a new fraud model that doubled fraud count capture and increased deterrent effects of the machine learning model portfolio. Catherine has also worked at the Defense Intelligence Agency (DIA) where she provided subject matter expertise to government agencies working to understand financial ubiquitous technical surveillance (UTS) and its impact on operations as the Director of the UTS Finance Working Group. She fostered collaboration between defense schoolhouses to pursue understanding and alignment on best practices for developing curriculum and teaching on the UTS environment and risks to students. Currently, Catherine works to develop machine learning models for government agencies.
Janna Mantua, Ph.D., is a Research Staff Member at the Institute for Defense Analyses (IDA). She has a background in Psychology and Behavioral Neuroscience and is completing an additional degree in Military Studies with a concentration in Irregular Warfare. At IDA, she contributes to and leads projects on influence operations, human-system interaction, and war gaming. Prior to working at IDA, she worked for the US Army. There, she was the Lead Scientist on a research team that focused on optimizing cognitive, physical, and operational performance in special operations populations during military operations. She also assisted with partner force psychological assessment in Afghanistan.

Ramesh Menon, MBA, is the Chief Technology Officer (CTO), Office of the Chief Information Officer (CIO), Defense Intelligence Agency (DIA). Mr. Menon earned a Bachelor's degree in Electrical Engineering and a Master's degree in Business Administration (MBA) from California State University, and he has co-authored three books. Mr. Menon is responsible for technology and strategy planning, experimentation, and emerging technology solutions architecture supporting DIA and IC priorities, including FVEY partners. He serves as the CTO and authoritative information technology expert for DIA, and as the senior DIA technologist and interface to the Intelligence Community, Department of Defense, Military Service Intelligence Centers, Combatant Commands, international partners, national labs, industry, and academia. As the CTO, Mr. Menon serves as the Principal Advisor for artificial intelligence to the Director and collaborates with other IC and DoD AI leaders. Mr. Menon was a CTO in the private sector at IBM and a Chief Architect at the Johns Hopkins University Applied Physics Laboratory, leading artificial intelligence, cloud, cyber, and strategic national security initiatives. Mr. Menon is a member of the IEEE congressional delegation on science and technology and a frequent guest lecturer at Johns Hopkins University. He was a member of the Brookings Institution AI expert panel on autonomous weapon systems and served on the space innovation council workgroup. He is also a mentor to the AFRL Catalyst Campus and NASA Frontier Development Labs, supporting OSINT and Edge AI to operate in contested spaces.

Mitchell Pillarick III, M.A./M.S., serves as a Data Scientist for the Data and Digital Innovation (DDI) Directorate's Tech Team. The Tech Team is responsible for sound Data Engineering offerings at the National Geospatial-Intelligence Agency (NGA), ensuring the Data Scientists have the tools, processes, and data access needed to enable their workflow. Additionally, Mitch is the Machine Learning Operations (ML-Ops) platform Product Owner and the Systems Engineer for DDI's Synthetic Data Generation (SynGen) effort. Furthermore, he has a patent pending for work in developing a technique for generating synthetic geospatial vector data for software validation and machine learning model training. Mitch served as a contractor at NGA from 2011 until 2016. In 2008 he received a commission in the US Army as a Military Intelligence Second Lieutenant. He continues to serve, now at the rank of Major, in NGA's DIMA Joint Reserves unit. During his time at NGA, Mitch has worked as a Full Motion Video Analyst, Imagery Analyst, Geospatial
Analyst, Software Developer, Data Engineer, and Data Scientist. In the Army, Mitch has held positions as the Tactical Intelligence Officer (S2) of a light infantry battalion, the Analytical Control Element (ACE) Chief of a division headquarters, and an Imagery Analysis team lead, as well as an assortment of required and assigned duties. Mitch holds a Bachelor's degree in Physics from the University of Missouri, a Master of Arts in Intelligence Operations from the American Military University, and a Master of Science in Data Science from the University of Missouri. He is also a second-year Ph.D. student at the University of Missouri-Columbia's Institute for Data Science and Informatics.

Peter Temes, Ph.D., is the Founder and President of the Institute for Innovation in Large Organizations. Formerly president of the Antioch New England Graduate School and The Great Books Foundation, he has been a Dean at Northeastern University and a faculty member at Harvard University. His books include Teaching Leadership: Essays in Theory and Practice; The Just War: A Reflection on the Morality of War in Our Time; The Power of Purpose; and We the People: Human Purpose in a Digital Age. He holds a Ph.D. in Humanities from Columbia University, where his research centered on the American Civil Rights Movement.

Matthew A. Templeman, US Army Retired, served as an engineer, diplomat, and intelligence professional in the US Army for over 25 years at the tactical, operational, and strategic levels. He received a Bachelor of Science degree from the United States Military Academy at West Point, a Master of Science in Engineering Management from the Missouri University of Science and Technology, and a Master of Arts in Latin American Studies from the University of Texas at Austin. As an Army officer, he led and guided organizational transformation within major commands and agencies. He currently provides organizational enhancement and strategic optimization consulting through the Gray Knight Advisory Group, helping companies address the challenges of growth and transformation.

A. Mark Williams is a Senior Research Scientist at the Florida Institute of Human and Machine Cognition. His research interests focus on the neural and psychological mechanisms underpinning the acquisition and development of expertise. He has published over 250 peer-reviewed journal articles across numerous fields. He has written and edited 19 books, 77 book chapters, 60 professional articles, and 118 journal abstracts, and he has delivered more than 200 keynote and invited lectures in over 30 countries. He is Editor-in-Chief of the Journal of Sports Sciences, Research Quarterly for Exercise and Sport, and Human Movement Science, and has sat on numerous editorial boards for academic journals. His work has been funded by federal agencies in Australia and the United Kingdom as well as by industry partners (Nike and Umbro), and he is currently funded by the Department of Defense and DARPA.
Chapter 1
Ubiquitous Technical Surveillance: A Ubiquitous Intelligence Community Issue Craig W. Gruber, Benjamin Trachik, Catherine Kirby, Sara Dalpe, Lila Silverstein, Siobhan Frey, and Brendon W. Bluestein
1.1 Statement of the Problem and Background Literature

With Ubiquitous Technical Surveillance contextually pervasive, change must happen sooner rather than later to keep pace with nation-state competitors. Only with a culture of innovation can the IC develop and implement novel approaches to UTS that protect our workforce and give us a decisive edge over our global competitors. Culture is the social cement that binds an organization together and sets the stage for the operational permissions and limitations within its inherent constraints. An innovative cultural environment is characterized by creative thinking and implementation from leaders, teams, and officers harnessing the potential within and external to the organization. With this in mind, we ask the question, "How do we create and propagate a culture that not only accepts, but encourages, the development and implementation of innovative approaches to difficult and acute problems like UTS?"

Well-intended leaders have implemented many efforts toward changing culture on both the micro- and macro-organizational levels. They approach cultural change by identifying problems within their organizational culture and developing solutions based on current theory and practice within organizational psychology. Leaders can often apply various strategies, underpinned by strong academic models, to combat discovered problems with organizational culture. This type of cultural change can be both research- and practice-based. There is ample literature available on the leadership aspects of change (see Clark & Gruber, 2017). In terms of the technical and research aspects of change, we bring some research relevant in both obvious and subtle ways to the UTS challenge.

Ubiquitous Technical Surveillance (UTS), defined as widespread data collection, processing, and analysis, presents one of the most acute, generalized threats and opportunities facing the broader Intelligence Community (IC), Department of Defense (DoD), and the USG. Foreign intelligence and security organizations capitalize on marketing infrastructure and other means to compromise the safety and security of US service members, IC officers, and others in order to collect on and exploit US interests. UTS affects all tradecraft—technical, operational, and administrative—in every contested physical, technical, and cyber domain. In the current digital age, there is little separation between the digital narratives of our professional and personal lives. Managing this narrative well provides the security, protection, and freedom of movement to execute successful operations; however, the current lack of awareness of these issues and lack of adoption of recommended actions prevent successful narrative management. IC and DoD culture provides the foundation on which awareness will either promote change or resist it. Therefore, the USG requires a broad, well-coordinated, and planned strategic approach to cultural change that will enhance receptivity to the DEXCOM-endorsed LOEs of May 2021.

The original version of this chapter was revised. The correction to this chapter is available at https://doi.org/10.1007/978-3-031-29807-3_11

C. W. Gruber (*) · C. Kirby · S. Dalpe · L. Silverstein · S. Frey
Department of Psychology, Decision Sciences Laboratory, American University, Washington, DC, USA
e-mail: [email protected]; [email protected]; [email protected]

B. Trachik
U.S. Army Medical Research Directorate-West, Walter Reed Army Institute of Research, Joint Base Lewis-McChord, WA, USA

B. W. Bluestein
US Army, Washington, DC, USA

This is a U.S. government work and not under copyright protection in the U.S.; foreign copyright protection may apply 2023, Corrected Publication 2023. C. W. Gruber, B. Trachik (eds.), Fostering Innovation in the Intelligence Community, Annals of Theoretical Psychology 19, https://doi.org/10.1007/978-3-031-29807-3_1
Framing the Problem: IC Current Culture and Reception to Change

Well-intended leaders have implemented many efforts toward changing culture on both the micro- and macro-organizational levels. They approach cultural change by identifying problems within their organizational culture and developing solutions based on current theory and practice within organizational psychology. Leaders can often apply various strategies, underpinned by strong academic models, to combat discovered problems with organizational culture.

Culture Proposition

Culture is the social cement that binds an organization together and sets the stage for the operational permissions and limitations within its inherent constraints. An innovative cultural environment is characterized by creative thinking and implementation from leaders, teams, and officers harnessing the potential within and external to the organization. Only with a culture of innovation can we develop and implement novel approaches to UTS that protect our workforce and give us a decisive edge over our global competitors. With this in mind, we ask the question, "How do we create and propagate a culture that not only accepts, but encourages, the development and implementation of innovative approaches to difficult and acute problems like UTS?" There is extensive research and writing that addresses this issue. With Ubiquitous Technical Surveillance contextually pervasive, change must happen sooner rather than later to keep pace with nation-state competitors. This change can be both research- and
practice-based. There is ample literature available on the leadership aspects of change (see Clark & Gruber, 2017). In terms of the technical and research aspects of change, we bring some research relevant in both obvious and subtle ways to the UTS challenge. We will present a brief overview of multiple research and literature approaches that demonstrate the research behind innovation and change associated with UTS.
1.2 Markov Chains

Markov chains are processes where "the description of the present state fully captures all the information that could influence the future evolution of the process" (Baudoin, 2010). In other words, "a Markov (or memoryless) strategy is a behavior strategy in which every decision rule depends only on the current state" (Filar & Vrieze, 2012). As many UTS systems can be thought of in terms of Markov chains, understanding this model is vital to informing the discussion of cultural change as it pertains to UTS. Markov chain models can be found throughout UTS applications; they can be used not only to analyze web navigation, but also to make predictions about things such as future navigation trends for users and personalization that may enhance a user's experience (Gupta et al., 2016; Langville & Meyer, 2006). Beyond its application to the UTS space, Markov chains can also help generate insights on organizational culture and cultural change.

Given the omnipresent monitoring enabled by UTS-embedded technology, some view such monitoring as negative; however, UTS can also be used to detect terrorism (Sreenu & Saleem Durai, 2019). The vastness of UTS monitoring makes the apparent lack of response to threats of domestic terrorism in the USA puzzling. Markov chains can provide a model to explain both the problem of, and the solution to, security and safety online. To reiterate, "a Markov (or memoryless) strategy is a behavior strategy in which every decision rule depends only on the current state" (Filar & Vrieze, 2012).
When applying this to problems of foreign intelligence and security organizations capitalizing on UTS systems to exploit US interests, it is important to understand how US service members, officers, and others can compromise their own, and the USA's, safety and security. Beyond simple OPSEC and purposeful obfuscation, the inadvertent electronic trail of an individual may persist. Markov chains can be used to calculate a risk value and the probability of attack associated with network security (Sun et al., 2017). Understanding the risk value of an online attack, that is, the cost of loss multiplied by the probability of occurrence, can help inform US service members and officers of how important their online safety and security is to the USA. When it comes to protecting online safety and security, one can model the choice to be safer and more secure using a Markov chain (Blanchet et al., 2016). The Markov chain model also outperforms a similar model, the Multinomial Logit Model (MNL), in terms of calculating the expected benefit, which in this case represents increased security and safety (Desir et al., 2020).
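The risk-value idea can be illustrated with a short sketch in which an attack is modeled as a simple linear chain of stages, each of which either succeeds or ends the attack. All stage probabilities and the breach cost below are invented for illustration; they are not values from Sun et al. (2017).

```python
# Illustrative only: probabilities and costs are assumptions.
# An attack is modeled as a linear chain of states
# (recon -> foothold -> compromised); a failed stage ends the attack,
# so the probability of reaching the absorbing "compromised" state is
# the product of the stage-success probabilities.
stage_success = {"recon": 0.4, "foothold": 0.5}

p_compromise = 1.0
for p in stage_success.values():
    p_compromise *= p

cost_of_loss = 250_000.0                  # assumed dollar cost of a breach
risk_value = cost_of_loss * p_compromise  # risk = cost of loss x probability

print(p_compromise)  # 0.2
print(risk_value)    # 50000.0
```

Lengthening the chain (more stages an adversary must traverse) multiplies in additional factors below one, which is one way to quantify how layered countermeasures drive down the risk value.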
Overall, Markov chains not only help explain the problems with online safety and security; they also provide the basis for solutions that can make US service members and officers safer and more secure while online.
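As a minimal sketch of this kind of modeling, consider a hypothetical two-state chain in which an officer's daily online behavior is either "secure" (following recommended countermeasures) or "lax". The transition probabilities below are assumptions for illustration, not empirical values.

```python
# Hypothetical two-state Markov chain of daily online behavior.
# Transition probabilities are illustrative assumptions.
P = {
    "secure": {"secure": 0.9, "lax": 0.1},  # secure habits mostly persist
    "lax":    {"secure": 0.3, "lax": 0.7},  # lax habits occasionally improve
}

def step(dist):
    """One step of the chain: new[s] = sum over r of dist[r] * P[r][s]."""
    return {s: sum(dist[r] * P[r][s] for r in P) for s in P}

# Iterate from a fully lax starting population; the distribution converges
# to the chain's stationary distribution.
dist = {"secure": 0.0, "lax": 1.0}
for _ in range(200):
    dist = step(dist)

# Analytically, the long-run share of "secure" days is 0.3 / (0.1 + 0.3) = 0.75.
print(round(dist["secure"], 3))  # 0.75
```

An intervention that raises the lax-to-secure transition probability raises the stationary share of secure behavior, giving a simple way to quantify the payoff of the cultural changes this chapter discusses.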
1.2.1 Bayesian Inferences

1.2.1.1 Bayesian Inference as It Relates to Ubiquitous Technical Surveillance

Implementing a Bayesian framework for quantifying probabilities, both common and complex, allows us to navigate various environments of unknown vulnerability simultaneously and efficiently. When utilizing this framework, we begin with base rates, preferably statistical as opposed to causal base rates, to anchor our trajectory. As new information arises, we interweave that data with our prior beliefs to analyze the diagnosticity of the case at hand (Kahneman, 2011). Essentially, the inclusion of new material into this framework establishes a continuous process of learning and updating probabilities; this never-ending game is akin to a Markov chain, as the chain evolves randomly and perpetually.

The actuarial application of Bayesian models to technical surveillance and security has already been theorized in detail. For instance, Bayesian game theory-based solutions may improve traditional cyber defense networks, as these games gather and rely on imperfect or probabilistic information through continuous observation of the network (Dahiya & Gupta, 2021). Bayesian decision network models can also apply to risk management; implementation of the model can occur using a three-step process of risk assessment, risk mitigation, and risk validation and monitoring (Khosravi-Farmad & Ghaemi-Bafghi, 2020). This Bayesian model allows new information to update strategies for combating probabilities of exploitative vulnerability, which can provide a framework on which UTS culture change solution sets can rely. This is evident across the IC and the government: organizations engage in a Bayesian game unaware that they are actively playing it. The Office of Personnel Management (OPM) data breach of 2014 evidences this dynamic (Office of Personnel Management, 2015).
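Before returning to the OPM example, the updating process described above can be made concrete with a small numerical sketch. The base rate and likelihoods below are invented for illustration.

```python
# Illustrative Bayesian update; all numbers are assumptions.
prior = 0.01                   # base rate: a session is hostile reconnaissance
p_anomaly_if_hostile = 0.80    # likelihood of an anomalous pattern if hostile
p_anomaly_if_benign = 0.05     # likelihood of the same pattern if benign

# Bayes' theorem:
# P(hostile | anomaly) = P(anomaly | hostile) * P(hostile) / P(anomaly)
p_anomaly = p_anomaly_if_hostile * prior + p_anomaly_if_benign * (1 - prior)
posterior = p_anomaly_if_hostile * prior / p_anomaly

print(round(posterior, 3))  # 0.139
```

A single anomalous observation lifts the estimate from 1% to roughly 14%, and the posterior then serves as the prior for the next observation, which is exactly the continuous learning-and-updating loop described above.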
As OPM lacked the cyber security framework to generate awareness and mount a proper defense against an intrusion, the adversary gained chronic access to DoD members' critical personal and occupational data. In light of events such as these, we hope to expand the efficacy of security by changing the mindset of security for the individual agent, and we can do this by modeling Bayesian games. Having the Markov property, the Markov decision process is a single-agent stochastic game; however, unlike general stochastic games, there is only one agent involved: the individual. And it is at this point that the interfaces between UTS and the individual are most profound. We see that no one aspect occurs alone, hence our pivot to exploring systems theory.

Systems theory describes the functioning of a model and how it can be
“opened” to effects by its surrounding environment (Grimsley, 2017). There are two intrinsic properties of systems theory: simple, interactive changes as the basis of a complex system, and the self-organization of a phenomenon emerging from its components' interactions (Van Geert, 2000). In systems theories, stable, and sometimes maladaptive, patterns persist until either an internal or external factor disrupts the habit (Thelen & Bates, 2003; Thelen et al., 2001); such a system can be altered by its environment or by itself (Van Geert, 2002), thereby behaving as a Markov chain (Schön, 2006). For instance, the systems theory of Humanistic Cognitive Behavioral Theory (HCBT), described by Gruber (2008, 2011), focuses on the interactions among cognition, behavior, environment, and courage in decision-making. We aim to expand HCBT's use to improving decision-making focused on safety regarding Ubiquitous Technical Surveillance. With UTS being, well, ubiquitous, choices to be safe are of the utmost importance, but the decision for a person to be safe involves many factors. Using HCBT (Gruber, 2011), we can identify what factors go into making decisions, and how to affect them so that the habitual pattern of the system (ignorance of safety measures) changes to the pattern we want (proactively being safe when using technology). Each of the four factors plays a role in changing safety decisions: cognition, one's thoughts about the effects of technological safety on oneself or one's job; behavior, acts that promote safety during technology use and thereby change environmental effects; environment, since with UTS all around, one must proactively be safe while online; and courage, the psychological determination needed to change a habit, especially when replacement habits involve more effort. HCBT can be used to model the change in decision-making over the time that the new habit is created, all while tracking measures of change.
This allows us to generalize the change in safety decision-making and apply the new safety behavior to other situations. Aside from HCBT and courage, systems theory provides a robust demonstration of the interconnectedness of decisions and actions and will play an essential role in exploring the decision sciences of second- and third-order decision effects as we design both systems and IT interfaces and apparatuses to combat UTS. Viewing an organization's information technology apparatus as a system of systems allows employees to apply general and specific security strategies to complex and dynamic missions (El-Hachem et al., 2021). Assisting employees in diversifying the application of security measures beyond direct organization contact allows them to broaden their threat assessment and remediation plans and to apply appropriate countermeasures. To this end, game theory can provide a framework for understanding potentially contradictory incentives and is therefore commonly used in analyzing interactions from a security standpoint. More specifically, game theory can frame employees as players in a game who behave rationally and attempt to maximize their potential payoff. In the case of UTS, employees balance personal and career motivations with organizational goals and security. Specifically, social media use and a general online presence are used as means of social engagement as well as career exploration and enhancement. However, in the presence of artificial intelligence and machine learning algorithms, this information can present an ideal
resource for adversaries intent on large-scale data detection with the goal of threat detection (Zhu et al., 2021). When applying game theory to cyber security, one player is defined as the network defender and the other as the network disrupter. The defender's goal is to take preventative actions that stop the disrupter from exploiting network vulnerabilities to obtain resources, whereas the disrupter's job is to inflict maximal damage on the network and/or obtain information (Alavizadeh et al., 2021). Game theory also espouses an economical approach to cyber security in which the goal is not the elimination of all potentially vulnerable security behaviors, but a self-reinforcing protection system that increases incentives and encourages beneficial behaviors beyond just the elimination of malicious activity (Anwar & Kamhoua, 2020). Specifically, UTS as a security threat can best be described as a two-person game with one player as the obfuscator and the other as the detector. Neither player knows the other's specific strategy, but each player understands the other's strategy space (Oh et al., 2017). In this situation, game theory suggests that if there is no single optimal technological solution to detection prevention, then a variation of strategies is best. Another key dynamic regarding social media is the interplay between trust and privacy (Manshaei et al., 2011). Establishing trust requires the disclosure of personal information; however, the cost of disclosure can be suboptimal in certain situations. In game theory, the tradeoff between trust and privacy can be described as a dynamic Bayesian game. Establishing a solution set to this type of game is based on the perfect Bayesian equilibrium, which can identify the best response for each player (Anwar & Kamhoua, 2020). Even with game and systems theories in hand, we turn to examining specific behaviors in individuals which we can reasonably foresee.
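Before doing so, the claim that "a variation of strategies is best" can be sketched as a toy zero-sum obfuscator-versus-detector game. The channels and payoffs below are illustrative assumptions, not values from Oh et al. (2017).

```python
# Toy zero-sum game: the obfuscator hides activity in channel A or B, and the
# detector monitors one channel. Detector payoff is the detection probability.
def detection_rate(p_monitor_a, p_hide_a):
    """Detection probability given each side's mixed strategy over channels."""
    return p_monitor_a * p_hide_a + (1 - p_monitor_a) * (1 - p_hide_a)

def guaranteed_rate(p_monitor_a):
    """Detection rate against an obfuscator who best-responds (minimizes it)."""
    return min(detection_rate(p_monitor_a, p_hide_a) for p_hide_a in (0.0, 1.0))

# A pure detector strategy guarantees nothing, because the obfuscator simply
# uses the unmonitored channel; the 50/50 mix guarantees detection half the
# time no matter what the obfuscator does.
print(guaranteed_rate(1.0))  # 0.0
print(guaranteed_rate(0.5))  # 0.5
```

The mixed strategy is the game's equilibrium for the detector: any predictable (pure) monitoring policy can be fully evaded, which is the game-theoretic argument for varying one's defensive strategies.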
With that we turn to the Theory of Planned Behavior (Ajzen, 1991), a model with which one can anticipate a specific behavior by analyzing cognitive self-regulation. An extension of the theory of reasoned action (Ajzen & Fishbein, 1980; Fishbein & Ajzen, 1975), the theory of planned behavior acknowledges an individual's intention to perform a specific behavior and explores the dispositional factors that influence intention. The intention to perform a behavior can be predicted with high accuracy when analyzed as a combination of behavioral beliefs, normative beliefs, and control beliefs. Additionally, the behavior itself transitions from a state of intention to actuality via two mediating factors: perceived behavioral control and actual behavioral control. The former signifies the perception an individual has regarding the ease or difficulty of performing the behavior, while the latter takes into consideration non-motivational factors, such as the availability of opportunities and proper resources. As it relates to ubiquitous technical surveillance and participating in procedures to promote and protect security, we can utilize the theory of planned behavior to identify particular beliefs that may need to be strengthened or transformed altogether to increase intentions and, most importantly, to produce actual behaviors to "play the game." The determinants of intention, as mentioned above, are threefold: behavioral beliefs, normative beliefs, and control beliefs. First, behavioral beliefs
entail attitudes toward the behavior, measured as the degree to which an individual judges a behavior as favorable or unfavorable. If we designate the behavior as an act of promoting security, say, employing two-factor authentication, then we would be interested in discovering an individual's attitude toward that behavior. Does the individual have favorable or unfavorable opinions attached to two-factor authentication? Do they evaluate security as a whole as positive or negative? Next, we identify normative beliefs, which encompass subjective norms and the perceived pressure to perform or not perform the behavior. Subjective norms occupy every situation in which any behavior, including acting upon security measures, occurs; from societal proclamations dispersed via the media, to supervisory recommendations from one's employer, to casual mentions by close peers, an individual consumes normative beliefs about security from every outlet. For every news article that declares "you HAVE to make an effort to be more secure, or it will be too late!," there is a competing article persuading its readers that security is an uncontrollable force and, as a result of its supposed impossibility, an unobtainable illusion. For every instance in which a boss reminds an employee to follow a specific information security policy, there is a colleague, or even a peer outside one's occupation, invalidating the command: there is no point, we have no power over it, people will always be able to override or hack us anyway if they really want to. However, findings suggest that personal considerations, including personal feelings of moral obligation or responsibility to perform, tend to override the influence of perceived social pressure (Gorsuch & Ortberg, 1983; Pomazal & Jaccard, 1976; Schwartz & Tessler, 1972). Regardless, it is essential that we analyze the normative beliefs that influence an individual's intentions to engage in security-promoting behaviors.
Finally, the aforementioned perceived behavioral control influences whether an individual considers security protection easy to achieve or is overwhelmed by its imagined difficulty. Additionally, both past experiences and expected obstacles shape perceived behavioral control. Has the agent in the security game had negative or positive past experiences? Has their security been compromised in the past, and if so, how are those memories impacting their current perceived behavioral control? As for anticipated obstacles, is the individual worrying over possible impediments that might occur the next time they play? All of these factors and influences aggregate to shape an individual's self-determination and perceived behavioral control. Moreover, not only do these control beliefs combine with normative beliefs and behavioral beliefs to influence intentions, but they all influence each other. The theory of planned behavior also postulates that perceived behavioral control not only influences intention but mediates the likelihood of the intention producing the specific behavior. Thus, if the intention is constant, then perceived behavioral control, along with actual behavioral control, will strengthen or weaken the translation from intention to the performance of the behavior. For example, suppose person A and person B both intend to undertake security behaviors, and both have the resources to do so, but person A has more perceived behavioral control than person B and is thus more confident that he will be able to maintain this habit and
win the game of security, whereas person B has less perceived behavioral control and is playing not to lose the game; then person A is expected to perform, and to continue performing, the actual behaviors. However, it is imperative that persons A and B both have adequate knowledge about the behaviors, on top of having access to the necessary resources, without new and unfamiliar elements, to possess perceived behavioral control. In essence, if an individual is not certain of, or does not know enough about, the security measures, then perceived behavioral control may not be strong enough to transform intention into behavior.

Given this, we now briefly turn to a unified model of Information Security Policy Compliance. Prior research has investigated behavioral theories to explain information systems security (ISS) compliance. Among them are the theory of neutralization (ToN) (Sykes & Matza, 1957); the health belief model (HBM) (Becker, 1974); protection motivation theory (PMT) (Rogers, 1975); deterrence theory (DT) (Gibbs, 1975); the theory of planned behavior (TPB) (Ajzen, 1991); and the theory of interpersonal behavior (TIB) (Triandis, 1977). The theory of neutralization describes rationalizations by which an individual justifies violating an accepted norm or policy to him or herself. Siponen and Vance (2010), Barlow et al. (2013), and Teh et al. (2015) investigate neutralization as a method for reckoning with noncompliant ISS safety behaviors. Becker's (1974) health belief model introduces the construct of risk and one's decision to engage in safety measures as a result of the severity and susceptibility of a threat. Ng et al. (2009) explored secure emailing behaviors through the lens of the HBM. Moreover, Rogers' (1975) protection motivation theory also explains behaviors related to health threats but introduces the idea that individuals, when threatened, do not respond rationally or unemotionally.
Liang and Xue (2009) adapted PMT to analyze computer users' behaviors in response to technology-based threats by proposing their own technology threat avoidance theory (TTAT). Another behavioral model that has been used to describe ISS compliance behaviors is the popular deterrence theory proposed by Gibbs (1975) to explain criminal behavior. DT has been used consistently to explain ISS non-compliance (D'Arcy, 2009; Lee et al., 2004; Theoharidou et al., 2005), including computer abuse and crimes. Additionally, the theory of planned behavior, as described in the previous section, has often been used in ISS research (Bulgurcu et al., 2010; Galletta & Polak, 2003; Mishra & Dhillon, 2006). TPB was adapted from the theory of reasoned action (Fishbein & Ajzen, 1975), as was Triandis' (1977) theory of interpersonal behavior. TIB advances TRA by including additional social factors, predictors of attitudes, affective components, and facilitating conditions. Pee et al. (2008) utilized a TIB framework for predicting non-work-related computing in the workplace, and Vance et al. (2012) referenced TIB to explain habits in ISS policy violations. To develop their new behavioral model, Moody et al. (2018) dissected 11 previously established behavioral models into constructs and compared similar constructs across theories as well as unique constructs within theories. They created a large questionnaire with items representing 27 constructs and analyzed response data from 178 participants using univariate analyses of variance. The constructs that correlated most highly with ISS-related cognitions and behaviors most generally reflected constructs and patterns from TIB. However, the factor of
Fig. 1.1 Refined UMISPC
“attitude” in TIB was removed, as it did not show a relevant correlation in the factor analysis. Similarly, the authors removed TIB's constructs of “rewards,” “penalties,” and “facilitating conditions.” TIB's group of “social factors,” which includes “subjective norms,” “roles,” and “self-concept,” was condensed to simply “role values.” The refined model, officially labeled the Unified Model of Information Security Policy Compliance (UMISPC), is depicted in Fig. 1.1. As the figure illustrates, the theory predicts two resulting behaviors: intention to comply with ISS policies, defined as the inclination to engage in a specific behavior, and reactance, defined as denying that an ISS problem exists and thus not engaging in compliant behaviors. The factors that influence outcomes of intention versus reactance are habit, role values, neutralization, and fear. Specifically, habit, role values, and fear combine to elicit intention, while fear and neutralization merge to evoke reactance. Moody et al. (2018) define habit as a “regular tendency that does not require conscious thought to be compliant with the ISS policy,” and role value as the belief that the ISS compliance act “is appropriate, justified, and acceptable, keeping in mind the nature of the work and the task the person is performing.” Fear, a negative emotional response to stimuli, is anticipated by threat, which combines an individual's perceived susceptibility to, and the perceived severity of, a potential harm. Response efficacy, the “perceived effectiveness of the behavior in mitigating or avoiding the perceived threat,” precedes threat and fear.
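The UMISPC relationships just described can be summarized in a toy scoring sketch. The functional forms, the weights, and the direction of the response-efficacy effect below are all illustrative assumptions, not the structural estimates reported by Moody et al. (2018).

```python
# Toy numeric sketch of UMISPC relationships; all weights are assumptions.
def umispc(habit, role_values, response_efficacy,
           susceptibility, severity, neutralization):
    """All inputs on a 0..1 scale; returns (intention, reactance)."""
    # Threat combines perceived susceptibility and severity; response
    # efficacy precedes threat and fear (dampening direction assumed here).
    threat = susceptibility * severity
    fear = threat * (1 - 0.5 * response_efficacy)
    # Habit, role values, and fear combine to elicit intention to comply;
    # fear and neutralization merge to evoke reactance.
    intention = 0.4 * habit + 0.4 * role_values + 0.2 * fear
    reactance = 0.5 * fear + 0.5 * neutralization
    return intention, reactance

# A habitual, role-aligned employee vs. one who rationalizes noncompliance.
compliant = umispc(0.9, 0.9, 0.7, 0.5, 0.6, 0.1)
rationalizer = umispc(0.2, 0.3, 0.7, 0.5, 0.6, 0.9)
print(compliant[0] > rationalizer[0], rationalizer[1] > compliant[1])
```

Even in this crude form, the sketch reproduces the model's qualitative prediction: the habitual, role-aligned profile scores higher on intention, while the rationalizing profile scores higher on reactance.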
As mentioned previously, habit, role values, and fear (and thus response efficacy and threat) are hypothesized to predict intention to comply with ISS policies, while reactance results from fear (and thus response efficacy and threat) and neutralization, defined as “rationalized thinking that allows one to justify departure from compliance intentions.” The developers of UMISPC note that, because the theory implicates response efficacy as a significant influence on threat, organizations could, and should, consider emphasizing the importance of complying with information security
policies to employees as a means of reducing information security breaches (Moody et al., 2018). We believe this suggestion is akin to strengthening perceived behavioral control within the theory of planned behavior (Ajzen, 1985), which is why we reference both theories to explain ISS compliance culture within government agencies. As the research above demonstrates, change can be instigated and facilitated through many modalities and with varying levels of success. We postulate that the most robust manner of implementing behavioral change around UTS may rest with Kotter’s 8-step process, applied with an eye toward systems theory. The common ground that everyone brings to the UTS discussion is that change needs to happen sooner rather than later. One way to incite this change of culture is by using Kotter’s 8-Step Process, which involves the following: (1) creating a sense of urgency, (2) building a guiding coalition, (3) forming a strategic vision and initiatives, (4) enlisting a volunteer army, (5) enabling action by removing barriers, (6) generating short-term wins, (7) sustaining acceleration, and (8) instituting change (Kotter, 2008). This model integrates leadership, empowerment, and systems thinking (Alhogail & Mirza, 2014), provided implementers follow each step through to its fullest extent (Kotter, 2017). The first step, creating a sense of urgency, means identifying the need for change and communicating that immediate action is necessary (Kotter, 2008; Juneja, 2021). The second step, building a guiding coalition, requires assembling and then guiding, coordinating, and communicating the activities of a volunteer coalition of “effective people” (Kotter, 2008). The third step is to form a strategic vision and initiatives by clarifying a vision of a future that is decidedly different from the past and current culture.
The fourth step is the stage at which the coalition recruits a multitude of people who urgently want the culture change to happen, whom Kotter names a “volunteer army” (Kotter, 2008). The fifth step, enabling action by removing barriers, strips away hierarchies and inefficient processes so that people have the freedom needed to affect the culture. The sixth step, generating short-term wins, means recognizing successes early and often, so that tracked progress builds confidence within the volunteer army (Kotter, 2008; Juneja, 2021). The seventh step, sustaining acceleration, includes pushing the culture change until the vision becomes reality (Kotter, 2008). Finally, the eighth step, instituting change, connects the new behaviors with organizational success and communicates this to the team, ensuring the team continues those behaviors until new habits replace the old ones (Kotter, 2008; Juneja, 2021). Kotter’s 8-Step Model is a dual operating system: it evolves over time rather than inducing a sudden change of culture, which often leaves employees annoyed and stressed (Kotter, 2012). Rather, this type of system “grows over time, accelerates action over time, and takes on a life of its own that…differ[s] from company to company in the details” (Kotter, 2012). This is useful for changing the culture surrounding Ubiquitous Technical Surveillance, as each organization has its own specific culture to change, and the 8 steps can be applied to each situation to make progress toward the same culture change. Overall, the majority
(>75%) of people who go through the 8-step process perceive each step as achieving its goal (Laig & Abocejo, 2021). Additionally, the process as a whole has been found to be “an effective tool to bring about transformational change” because it “address[es] the importance of embedding a desired transformation into organizational culture through regular and repetitive encouragement, feedback, reinforcement and recognition of success” (Auguste, 2013). Beyond companies, in which change can be imposed and implemented, the true unit of change is the individual. All of the metrics and systems designed to invoke organizational change rely on the individuals in that organization to change and to adapt. How we motivate individuals toward change is therefore a key component of the change mission. Motivation is commonly divided into intrinsic and extrinsic subtypes. Intrinsic motivation comes from within: a person does something because they want to and because it is personally rewarding (Sennett, 2021; Gruber, 2008). Intrinsic motivation can be understood and studied as a person’s “perceived value or contribution” (Herath & Rao, 2009). Extrinsic motivation comes from one’s surroundings: a person does something to avoid punishment or to obtain a reward (Sennett, 2021). Extrinsic motivational factors include fear (Johnston & Warkentin, 2010), penalties (Herath & Rao, 2009), and social pressure (Herath & Rao, 2009) or influence (Ifinedo, 2014). When it comes to UTS and ISS compliance, the principal extrinsic motivator is the fear appeal, a persuasive message that employs the element of threat (Johnston & Warkentin, 2010). Fear appeals work well as extrinsic motivational messages for ISS compliance because the perceived severity of the threat is itself a significant extrinsic motivational factor (Johnston & Warkentin, 2010).
Although fear appeals often increase compliance with security measures, a fear appeal phrased in a way that diminishes people’s perceived ability to act securely will be rejected and ignored (Johnston & Warkentin, 2010). Relatedly, Herath and Rao (2009) found that the “severity of penalties had significant but negative impact on policy compliance.” The other extrinsic motivational factors they studied (certainty of detection, normative beliefs, and peer behavior), as well as the intrinsic factor they studied (perceived effectiveness), had significantly positive impacts on policy compliance, as shown in Figs. 1.2 and 1.3 (Herath & Rao, 2009). The integrated theoretical model created by Vance et al. (2012) uses “habit [as] a determinant of the cognitive mediating process of protection motivation.” It is notable for the way its cognitive mediating processes separate the two motivation subtypes: the three factors under threat appraisal (vulnerability, perceived severity, and rewards) are all extrinsic motivators (Vance et al., 2012), while the three factors under coping appraisal (response efficacy, self-efficacy, and response cost) are all intrinsic motivators. The interconnectedness of intrinsic and extrinsic motivation is essential to placing the “individual” within the “system” being changed and acted upon.
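The sign pattern reported by Herath and Rao (2009) can be caricatured as a simple weighted score. Everything below, including the weights and the scoring function, is our hypothetical illustration of that pattern (penalty severity negative, the remaining factors positive), not their fitted statistical model:

```python
# Toy illustration of the factor sign pattern; all weights are hypothetical
# placeholders, chosen only to show direction of effect.
FACTOR_WEIGHTS = {
    "certainty_of_detection": +0.4,   # extrinsic, positive impact
    "severity_of_penalties": -0.2,    # extrinsic, negative impact
    "normative_beliefs": +0.3,        # extrinsic, positive impact
    "peer_behavior": +0.3,            # extrinsic, positive impact
    "perceived_effectiveness": +0.5,  # intrinsic, positive impact
}

def intention_score(factors: dict) -> float:
    """Weighted sum of standardized factor scores (toy sketch, not a fitted model)."""
    return sum(FACTOR_WEIGHTS[name] * value for name, value in factors.items())

# A respondent scoring uniformly high on every factor: the penalty term
# drags the total down even as the other factors push it up.
example = {name: 1.0 for name in FACTOR_WEIGHTS}
print(f"toy intention score: {intention_score(example):.2f}")  # prints: toy intention score: 1.30
```

The point of the sketch is only the directionality: increasing penalty severity lowers the score while every other factor raises it, mirroring the reported findings.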
Fig. 1.2 Intrinsic and extrinsic motivation in information security behaviors
Fig. 1.3 Research model results