“Digital learning is the new normal in higher education. The group of experts assembled in this book share important ideas and trends related to learning analytics and adaptive learning that will surely influence all of our digital learning environments in the future.”
—Charles R. Graham, Professor, Department of Instructional Psychology and Technology, Brigham Young University

“The concept of personalized and adaptive learning has long been touted but seldom enacted in education at scale. Data Analytics and Adaptive Learning brings together a compelling set of experts that provide novel and research-informed insights into contemporary education spaces.”
—Professor Shane Dawson, Executive Dean Education Futures, University of South Australia

“Moskal, Dziuban, and Picciano challenge the reader to keep the student at the center and imagine how data analytics and adaptive learning can be mutually reinforcing in closing the gap between students from different demographics.”
—Susan Rundell Singer, President of St. Olaf College and Professor of Biology

“We are currently living in a digital age where higher education institutions have an abundance of accessible data. This book contains a series of chapters that provide insight and strategies for using data analytics and adaptive learning to support student success and satisfaction.”
—Norman Vaughan, Professor of Education, Mount Royal University, Calgary, Alberta, Canada
“An important book that comes at a critical moment in higher education. We are swimming in an ocean of data and this book from some of the country’s top researchers and practitioners will help us make sense of it and put it in the service of student success.”
—Thomas Cavanagh, PhD, Vice Provost for Digital Learning, University of Central Florida

“Data Analytics and Adaptive Learning is an excellent addition to the canon of literature in this field. The book offers several valuable perspectives and innovative ways of approaching both new and old problems to improve organizational outcomes.”
—Jeffrey S. Russell, P.E., PhD, Dist.M.ASCE, NAC, F.NSPE, Vice Provost for Lifelong Learning, Dean for Div. of Continuing Studies, University of Wisconsin-Madison

“Data is used to customize experiences from buying an item to booking travel. What about learning—a uniquely human endeavor? This book contextualizes the complex answers to that question, shedding light on areas with promise: learning analytics, adaptive learning, and the use of big data.”
—Dale Whittaker, Senior Program Officer in Post-Secondary Success, Bill and Melinda Gates Foundation

“Data Analytics and Adaptive Learning presents a timely and wide-ranging consideration of the progress of adaptive learning and analytics in levelling the educational playing field, while providing necessary cautions regarding the drawing of too many conclusions in what is still a nascent area.”
—Frank Claffey, Chief Product Officer, Realizeit Learning
“Data Analytics and Adaptive Learning provides insights and best practices from leaders in digital learning who outline considerations for strategies, change management, and effective decision-making related to data. As higher education expands its work in digital learning and utilizing data for decisions, this book is a must read!”
—Connie Johnson, Chancellor, Colorado Technical University

“Data analytics and adaptive learning comprise two of the most relevant educational challenges. This book provides excellent research approaches and analysis to answer practical questions related to digital education involving teachers and learners.”
—Josep M Duart and Elsa Rodriguez, Editor-in-Chief & Editorial Manager of the International Journal of Educational Technology in Higher Education, the Universitat Oberta de Catalunya

“Data, analytics, and machine learning are impacting all jobs and industries. For education, the opportunities are immense, but so are the challenges. This book provides an essential view into the possibilities and pitfalls. If you want to use data to impact learners positively, this book is a must-read.”
—Colm Howlin, PhD, Chief Data Scientist, ReliaQuest

“Data Analytics and Adaptive Learning helps the educational community broaden its understanding of these two technology-based opportunities to enhance education, looking at very different complementary contributions. Congratulations to the authors.”
—Alvaro Galvis, Professor at University of Los Andes, Bogotá, Colombia

“The menus, dashboards, and pathways to effective data analytics and adaptive learning can be found in this massively timely and hugely impactful juggernaut.”
—Curtis J. Bonk, Professor of Instructional Systems Technology and adjunct in the School of Informatics, Indiana University Bloomington
“Adaptive learning and learning analytics—should we use both or choose one? Do they imply organizational transformation? What works and what does not? In my opinion, the book is valuable reading for those seeking the answers to their questions.”
—Maria Zajac, Associate Professor (emeritus) at Pedagogical University Cracow and SGH Warsaw School of Economics, Certified Instructional Designer, Poland

“Data analytics and adaptive learning platforms can direct support as needed to at-risk students, helping to create more equitable outcomes. This volume contains a timely collection of studies that examine the impact of these approaches.”
—John Kane, Director of the Center for Excellence in Learning and Teaching at SUNY Oswego

“This book shines a spotlight on the potential for data analytics, adaptive learning, and big data to transform higher education. The volume lights the way for those brave enough to embrace a new paradigm of teaching and learning that enacts a more equitable and person-centered experience for all learners.”
—Paige McDonald, Associate Professor and Vice Chair, Department of Clinical Research and Leadership, The George Washington University School of Medicine and Health Sciences

“Deftly weaving adaptive learning and analytic theory and practice together, the authors offer numerous examples of how these methods can help us address academic barriers to student success. Their work significantly strengthens the fabric of knowledge on how adaptive learning can benefit students (and faculty).”
—Dale P. Johnson, Director of Digital Innovation, University Design Institute, Arizona State University
“The authors of this book convince us that the concepts of data analytics and adaptive learning are tightly integrated, and the book provides insights on different aspects related to utilization of intelligent technologies and how to approach the learning cycle at different stages.”
—Eli Hustad, Professor in Information Systems, The University of Agder

“Student success is a fundamental mission for all educational institutions. This book explores the current opportunities within analytics, adaptive learning, and organizational transformation to generate wide-scale and equitable learning outcomes.”
—John Campbell, Associate Professor, Higher Education Administration, School of Education, West Virginia University

“This book brings together top scholars making the connection between data analytics and adaptive learning, all while keeping pedagogical theory on the central stage. It’s a powerhouse driven in equal parts by excellence and innovation, providing vision for educators on the quest for learner success across the spectrum.”
—Kimberly Arnold, Director of Learning Analytics Center of Excellence, University of Wisconsin-Madison

“Once again, a dream team of faculty, researchers, thought leaders, and practitioners comes up with this defining, must-read book for every institutional leader and teacher who is invested in the success of every student. This book, based on years of research and practice, gives the ‘how-to’.”
—Manoj Kulkarni, CEO at Realizeit Learning

“The chapters in this book bring a desperately needed clarity and a depth of understanding to the topic of data and analytics, adaptive learning and learning more generally in higher education. You will leave this book smarter about these topics than you started and both you and higher education will be the beneficiary.”
—Glenda Morgan, Analyst, Phil Hill & Associates
DATA ANALYTICS AND ADAPTIVE LEARNING
Data Analytics and Adaptive Learning offers new insights into the use of emerging data analysis and adaptive techniques in multiple learning settings. In recent years, both analytics and adaptive learning have helped educators become more responsive to learners in virtual, blended, and personalized environments. This set of rich, illuminating, international studies spans quantitative, qualitative, and mixed-methods research in higher education, K–12, and adult/continuing education contexts. By exploring the issues of definition and pedagogical practice that permeate teaching and learning and concluding with recommendations for the future research and practice necessary to support educators at all levels, this book will prepare researchers, developers, and graduate students of instructional technology to produce evidence for the benefits and challenges of data-driven learning.

Patsy D. Moskal is Director of the Digital Learning Impact Evaluation in the Research Initiative for Teaching Effectiveness at the University of Central Florida, USA.

Charles D. Dziuban is Director of the Research Initiative for Teaching Effectiveness at the University of Central Florida, USA.

Anthony G. Picciano is Professor of Education Leadership at Hunter College and Professor in the PhD program in Urban Education at the City University of New York Graduate Center, USA.
DATA ANALYTICS AND ADAPTIVE LEARNING Research Perspectives
Edited by Patsy D. Moskal, Charles D. Dziuban, and Anthony G. Picciano
Designed cover image: © Getty Images

First published 2024
by Routledge
605 Third Avenue, New York, NY 10158
and by Routledge
4 Park Square, Milton Park, Abingdon, Oxon, OX14 4RN

Routledge is an imprint of the Taylor & Francis Group, an informa business

© 2024 selection and editorial matter, Patsy D. Moskal, Charles D. Dziuban, and Anthony G. Picciano; individual chapters, the contributors

The right of Patsy D. Moskal, Charles D. Dziuban, and Anthony G. Picciano to be identified as the authors of the editorial material, and of the authors for their individual chapters, has been asserted in accordance with sections 77 and 78 of the Copyright, Designs and Patents Act 1988.

All rights reserved. No part of this book may be reprinted or reproduced or utilised in any form or by any electronic, mechanical, or other means, now known or hereafter invented, including photocopying and recording, or in any information storage or retrieval system, without permission in writing from the publishers.

Trademark notice: Product or corporate names may be trademarks or registered trademarks, and are used only for identification and explanation without intent to infringe.

Library of Congress Cataloging-in-Publication Data
Names: Moskal, Patsy D., editor. | Dziuban, Charles, editor. | Picciano, Anthony G., editor.
Title: Data analytics and adaptive learning : research perspectives / Edited by Patsy D. Moskal, Charles D. Dziuban, and Anthony G. Picciano.
Description: New York, NY : Routledge, 2024. | Includes bibliographical references and index.
Identifiers: LCCN 2023020017 (print) | LCCN 2023020018 (ebook) | ISBN 9781032150390 (hardback) | ISBN 9781032154701 (paperback) | ISBN 9781003244271 (ebook)
Subjects: LCSH: Education—Data processing. | Learning—Research. | Educational technology—Research. | Computer-assisted instruction. | Blended learning—Research.
Classification: LCC LB1028.43 .D32 2024 (print) | LCC LB1028.43 (ebook) | DDC 370.72—dc23/eng/20230505
LC record available at https://lccn.loc.gov/2023020017
LC ebook record available at https://lccn.loc.gov/2023020018

ISBN: 978-1-032-15039-0 (hbk)
ISBN: 978-1-032-15470-1 (pbk)
ISBN: 978-1-003-24427-1 (ebk)

DOI: 10.4324/9781003244271

Typeset in Sabon by Deanta Global Publishing Services, Chennai, India
To colleagues who did all they could during the COVID-19 pandemic to provide instruction and services to students.
CONTENTS
About the Editors xvi
Preface xix
Acknowledgements xx
Contributors xxi

SECTION I
Introduction 1

1 Data analytics and adaptive learning: increasing the odds 3
Philip Ice and Charles Dziuban

SECTION II
Analytics 21

2 What we want versus what we have: Transforming teacher performance analytics to personalize professional development 23
Rhonda Bondie and Chris Dede

3 System-wide momentum 38
Tristan Denley

4 A precise and consistent early warning system for identifying at-risk students 60
Jianbin Zhu, Morgan C. Wang, and Patsy Moskal

5 Predictive analytics, artificial intelligence and the impact of delivering personalized supports to students from underserved backgrounds 78
Timothy M. Renick

6 Predicting student success with self-regulated behaviors: A seven-year data analytics study on a Hong Kong University English Course 92
Dennis Foung, Lucas Kohnke, and Julia Chen

7 Back to Bloom: Why theory matters in closing the achievement gap 110
Alfred Essa

8 The metaphors we learn by: Toward a philosophy of learning analytics 128
W. Gardner Campbell

SECTION III
Adaptive Learning 145

9 A cross-institutional survey of the instructor use of data analytics in adaptive courses 147
James R. Paradiso, Kari Goin Kono, Jeremy Anderson, Maura Devlin, Baiyun Chen, and James Bennett

10 Data analytics in adaptive learning for equitable outcomes 170
Jeremy Anderson and Maura Devlin

11 Banking on adaptive questions to nudge student responsibility for learning in general chemistry 189
Tara Carpenter, John Fritz, and Thomas Penniston

12 Three-year experience with adaptive learning: Faculty and student perspectives 211
Yanzhu Wu and Andrea Leonard

13 Analyzing question items with limited data 230
James Bennett, Kitty Kautzer, and Leila Casteel

14 When adaptivity and universal design for learning are not enough: Bayesian network recommendations for tutoring 242
Catherine A. Manly

SECTION IV
Organizational Transformation 263

15 Sprint to 2027: Corporate analytics in the digital age 265
Mark Jack Smith and Charles Dziuban

16 Academic digital transformation: Focused on data, equity, and learning science 280
Karen Vignare, Megan Tesene, and Kristen Fox

SECTION V
Closing 301

17 Future technological trends and research 303
Anthony G. Picciano

Index 323
ABOUT THE EDITORS
Patsy D. Moskal is Director of the Digital Learning Impact Evaluation in the Research Initiative for Teaching Effectiveness at the University of Central Florida (UCF), USA. Since 1996, she has served as the liaison for faculty research involving digital learning technologies and in support of the scholarship of teaching and learning at UCF. Her research interests include the use of adaptive learning technologies and learning analytics toward improving student success. Patsy specializes in statistics, graphics, program evaluation, and applied data analysis. She has extensive experience in research methods including survey development, interviewing, and conducting focus groups and frequently serves as an evaluation consultant to school districts, and industry and government organizations. She has served as a co-principal investigator on grants from several government and industrial agencies including the National Science Foundation, the Alfred P. Sloan Foundation, and the Gates Foundation-funded Next Generation Learning Challenges (NGLC). Patsy frequently serves as a proposal reviewer for conferences and journals, and is a frequent special editor of the Online Learning journal, in addition to serving as a reviewer for the National Science Foundation (NSF) and the U.S. Department of Education (DoE) proposals. In 2011, she was named an Online Learning Consortium Fellow “In recognition of her groundbreaking work in the assessment of the impact and efficacy of online and blended learning.” She has coauthored numerous articles and chapters on blended, adaptive, and online learning and is a frequent presenter at conferences and to other researchers. Patsy is active in both EDUCAUSE and the Online
Learning Consortium (OLC). She serves on the EDUCAUSE Analytics & Research Advisory Group and co-leads the EDUCAUSE Evidence of Impact Community Group. She currently serves on the OLC Board of Directors as its President.

Charles D. Dziuban is Director of the Research Initiative for Teaching Effectiveness at the University of Central Florida (UCF), USA, where he has been a faculty member since 1970 teaching research design and statistics, and where he was the founding director of the university’s Faculty Center for Teaching and Learning. He received his PhD from the University of Wisconsin, USA. Since 1996, he has directed the impact evaluation of UCF’s distributed learning initiative examining student and faculty outcomes as well as gauging the impact of online, blended, and adaptive courses on students and faculty members at the university. Chuck has published in numerous journals including Multivariate Behavioral Research, The Psychological Bulletin, Educational and Psychological Measurement, the American Education Research Journal, Phi Delta Kappan, The Internet in Higher Education, the Journal of Asynchronous Learning Networks (now Online Learning), The EDUCAUSE Review, e-Mentor, The International Journal of Technology in Higher Education, Current Issues in Emerging eLearning, The International Journal of Technology Enhanced Learning, and the Sloan-C View. He has received funding from several government and industrial agencies including the Ford Foundation, Centers for Disease Control, National Science Foundation and the Alfred P. Sloan Foundation. In 2000, Chuck was named UCF’s first ever Pegasus Professor for extraordinary research, teaching, and service, and in 2005 received the honor of Professor Emeritus. In 2005, he received the Sloan Consortium (now the Online Learning Consortium) award for Most Outstanding Achievement in Online Learning by an Individual. In 2007, he was appointed to the National Information and Communication Technology (ICT) Literacy Policy Council. In 2010, he was made an inaugural Online Learning Consortium Fellow. In 2011, UCF established the Chuck D. Dziuban Award for Excellence in Online Teaching in recognition of his impact on the field. UCF made him the inaugural collective excellence awardee in 2018. Chuck has co-authored, co-edited, and contributed to numerous books and chapters on blended and online learning and is a regular invited speaker at national and international conferences and universities. Currently he spends most of his time as the university representative to the Rosen Foundation working on the problems of educational and economic inequality in the United States.
Anthony G. Picciano holds multiple faculty appointments at the City University of New York’s Hunter College, USA, Graduate Center, and the School of Professional Studies. He has also held administrative appointments at the City University and State University of New York including that of Vice President and Deputy to the President at Hunter College. He assisted in the establishment of the CUNY PhD Program in Urban Education and served as its Executive Officer for 10 years (2007–2018). Picciano’s research interests include education leadership, education policy, online and blended learning, multimedia instructional models, and research methods. He has authored 20 books, including one novel, and numerous articles. His Educational Leadership and Planning for Technology is currently in its fifth edition (Pearson Education). He has been involved in major grants from the US Department of Education, the National Science Foundation, IBM, and the Alfred P. Sloan Foundation. He was a member of a research project funded by the US Department of Education—Institute for Education Sciences, the purpose of which was to conduct a meta-analysis on “what works” in postsecondary online education (2017–2019). In 1998, Picciano co-founded CUNY Online, a multi-million-dollar initiative funded by the Alfred P. Sloan Foundation that provides support services to faculty using the Internet for course development. He was a founding member and continues to serve on the Board of Directors of the Online Learning Consortium (formerly the Sloan Consortium). His blog, started in 2009, has averaged over 600,000 visitors per year. Picciano has received wide recognition for his scholarship and research including being named the 2010 recipient of the Alfred P. Sloan Consortium’s (now the Online Learning Consortium) National Award for Outstanding Achievement in Online Education by an Individual. Visit his website at: anthonypicciano.com
PREFACE
The vast amount of data available through students’ engagement with online instructional programs and coursework, such as adaptive learning, provides rich research opportunities on how best to use these systems to improve students’ success in education. Data Analytics and Adaptive Learning: Research Perspectives centers on recent original research conducted by some of the most talented scholars in the field, with a specific focus on data analytics and adaptive learning techniques. In addition, several chapters focus on the organizational change that these topics influence.

The past decade has seen advances in instructional technology in adaptive and personalized instruction, virtual learning environments, and blended learning, all of which have been augmented by analytics and its companion big data software. Since 2020, the coronavirus pandemic has resulted in a remarkable investment in myriad online learning initiatives as education policymakers and administrators pivoted to virtual teaching to maintain access to their academic programs. Every indication is that when a new (post-pandemic) normal in education emerges, it will be more heavily augmented by instructional technologies including data analytics and adaptive learning. The strength of technology is that it constantly changes, grows, and integrates into society and its institutions. It is an ideal time to collect and view the evidence on the main topic of this book to determine how it is stimulating advances in our schools, colleges, and universities.
ACKNOWLEDGEMENTS
We are grateful to so many who helped make Data Analytics and Adaptive Learning: Research Perspectives a reality. First, our colleagues at the Online Learning Consortium, both past and present, for providing an environment that fosters inquiry as presented in our work. Second, Dan Schwartz of Taylor & Francis and his staff provided invaluable design, editorial, and production support for this book. Third, we are most grateful to Annette Reiner and Adyson Cohen, Graduate Research Assistants at the Research Initiative for Teaching Effectiveness of the University of Central Florida, for the untold hours of editorial work on this book. There is no real way we can properly thank them for their efforts on our behalf. Lastly and most importantly, however, to the authors of the outstanding chapters found herein: this is not our book, it is a celebration of their outstanding research in data analytics and adaptive learning. Heartfelt thanks to you all. Patsy, Chuck, and Tony
CONTRIBUTORS
Jeremy Anderson executes a vision for coupling digital transformation and
data-informed decision-making as Vice President of Learning Innovation, Analytics, and Technology at Bay Path University, USA. He also is building an incremental credentialing strategy to create alternative pathways to good-paying jobs. Prior to this role, he advanced business intelligence, data governance, and original student success research as the inaugural Associate Vice Chancellor of Strategic Analytics at Dallas College. Jeremy publishes and presents nationally on analytics and teaching and learning innovation. He holds an EdD in interdisciplinary leadership (Creighton University), an MS in educational technology, and a BA in history education. James Bennett has been promoting adaptive learning in post-secondary education for over a decade. His work has included developing new processes for evaluating adaptive learning assessments and the implementation of “learning nets” that employ adaptive learning systems across multiple courses and across all courses in an entire program. In addition to his participation in large-scale adaptive learning projects, James has authored a number of publications on adaptive learning topics. Rhonda Bondie is Associate Professor and Hunter College director of the
Learning Lab and Lecturer at the Harvard Graduate School of Education, USA. Rhonda began pursuing inclusive teaching as an artist-in-residence and then spent over 20 years in urban public schools as both a special and general educator. Rhonda’s co-authored book, Differentiated Instruction Made Practical, translated into Portuguese, is used by teachers in more
than 30 countries to support their work of ensuring all learners are thriving every day. Rhonda’s research examines how teachers develop inclusive teaching practices and differentiated instruction expertise throughout their career using new technologies. Gardner Campbell is Associate Professor of English at Virginia Commonwealth University (VCU), USA, where for nearly three years he also served as Vice Provost for Learning Innovation and Student Success and Dean of VCU’s University College. His publication and teaching areas include Milton and Renaissance studies, film studies, teaching and learning technologies, and new media studies. Prior to joining VCU, Campbell worked as a professor and an administrator at Virginia Tech, Baylor University, and the University of Mary Washington. Campbell has presented his work in teaching and learning technologies across the United States and in Canada, Sweden, Italy, and Australia. Tara Carpenter teaches general chemistry at UMBC. In teaching large,
introductory courses, she is interested in using evidence-based pedagogical approaches that will enhance the learning experience of today’s students. Incorporating online pedagogies has proven to be critical to her course design. One of her primary goals is to help her students, who are primarily freshmen, make the transition from high school to college by placing particular emphasis on utilizing effective learning strategies. She is intrigued by the science of learning and investigating student motivation as they take responsibility for the degree that they will earn. Leila Casteel is a nationally certified family nurse practitioner serving over 27 years in both clinical practice and academia. She received her BSN, MSN, and DNP from the University of South Florida in Tampa, USA, and strives to integrate innovation into all aspects of learning, teaching, and health care. She currently serves as the Associate Vice President of Academic Innovation for Herzing University. Baiyun Chen is Senior Instructional Designer at the Center for Distributed Learning at the University of Central Florida, USA. She leads the Personalized Adaptive Learning team, facilitates faculty professional development programs, and teaches graduate courses on Instructional Systems Design. Her team works in close collaboration with teaching faculty members to design and develop adaptive learning courses by utilizing digital courseware to personalize instruction that maximizes student learning. Her research interests focus on using instructional strategies in online and blended teaching in the STEM disciplines, professional
development for teaching online, and the application of adaptive technologies in education. Julia Chen is Director of the Educational Development Centre at The Hong Kong Polytechnic University and courtesy Associate Professor at the Department of English and Communication. Her research interests include leveraging technology for advancing learning, English Across the Curriculum, and using learning analytics for quality assurance and enhancement. Julia has won numerous awards, including her university’s President’s Award for Excellent Performance, First Prize of the Best Paper award in Learning Analytics, the Hong Kong University Grants Committee (UGC) Teaching Award, and the QS Reimagine Education Awards Silver Prize in the category of Breakthrough Technology Innovation in Education. Chris Dede is Senior Research Fellow at the Harvard Graduate School of Education, USA, and was for 22 years its Timothy E. Wirth Professor in Learning Technologies. His fields of scholarship include emerging technologies, policy, and leadership. Chris is a Co-Principal Investigator and Associate Director of Research of the NSF-funded National Artificial Intelligence Institute in Adult Learning and Online Education. His most recent co-edited books include: Virtual, Augmented, and Mixed Realities in Education; Learning engineering for online education: Theoretical contexts and design-based examples; and The 60-Year Curriculum: New Models for Lifelong Learning in the Digital Economy. Tristan Denley currently serves as Deputy Commissioner for Academic
Affairs and Innovation at the Louisiana Board of Regents. His widely recognized work that combines education redesign, predictive analytics, cognitive psychology and behavioral economics to implement college completion initiatives at a state-wide scale has significantly improved student success and closed equity gaps in several states. He also developed and launched the nexus degree, the first new degree structure in the United States in more than 100 years. Maura Devlin is Dean of Institutional Effectiveness and Accreditation at
Bay Path University, USA, overseeing curricular quality, assessment of student learning, accreditation, and compliance initiatives. She is Project Director of a Title III Department of Education grant to develop a Guided Pathways framework and reframe student support initiatives, undergirded by technology platforms. She holds a PhD in educational policy and leadership from UMass Amherst and has published on holistic approaches to assessment, data-driven course design, adaptive learning, and promoting
student engagement in online environments. She is passionate about equitable access to and completion of quality degree programs. Charles (Chuck) Dziuban is Director of the Research Initiative for Teaching
Effectiveness at the University of Central Florida (UCF), USA, and Coordinator of Educational Programs for the Harris Rosen Foundation. He specializes in advanced data analysis methods with an emphasis on complexity and how it impacts decision-making. Chuck has published in numerous journals including Multivariate Behavioral Research, the Psychological Bulletin, Educational and Psychological Measurement, the American Education Research Journal, the International Journal of Technology in Higher Education, and the Internet in Education. His methods for determining psychometric adequacy have been featured in both the SPSS and the SAS packages. Alfred Essa is CEO of Drury Lane Media, LLC, Founder of the AI-Learn
Open Source Project, and author of Practical AI for Business Leaders, Product Managers, and Entrepreneurs. He has served as CIO of MIT’s Sloan School of Management, Director of Analytics and Innovation at D2L, and Vice President of R&D at McGraw-Hill Education. His current research interests are in computational models of learning and applying AI Large Language Models to a variety of educational challenges, including closing the achievement gap. Dennis Foung is a writing teacher at the School of Journalism, Writing and
Media at The University of British Columbia, Canada. He has a keen interest in computer-assisted language learning and learning analytics. Kristen Fox has spent 20 years working at the intersection of education, innovation, digital learning equity, and workforce development and is a frequent author and presenter. She has been an advisor to institutions and ed-tech companies. Her work has included the development of a framework for equity-centered ed-tech products and she has published national research on the impact of the pandemic on digital learning innovation. Kristen previously worked as a Special Advisor at Northeastern University, where she led innovation initiatives. Prior to that, she was a Managing Vice President at Eduventures where she led research and advised institutions on online learning. John Fritz is Associate Vice President for Instructional Technology and New
Media at the University of Maryland, Baltimore County, USA, where he is responsible for leading UMBC’s strategic efforts in teaching, learning, and technology. As a learning analytics researcher and practitioner, Fritz
focuses on leveraging student use of digital technologies as a plausible proxy for engagement that can nudge responsibility for learning. Doing so also helps identify, support, and scale effective pedagogical practices that can help. As such, Fritz attempts to find, show, and tell stories in data that can inspire the head and heart of students and faculty for change. Colm Howlin is Chief Data Scientist at a leading cybersecurity company, where he leads a team applying machine learning and artificial intelligence to solve complex business and technology problems. He has nearly 20 years of experience working in research, data science and machine learning, with most of that spent in the educational technology space. Phil Ice is currently Senior Product Manager for Data Analytics and Intelligent Experiences at Anthology. In this capacity, he works with various parts of the organization to create predictive and prescriptive experiences to improve learning outcomes, engagement, and institutional effectiveness. Prior to joining Anthology, Phil worked as faculty at West Virginia University (WVU) and University of North Carolina, Charlotte (UNCC), Vice President of R&D at APUS, Chief Learning Officer for Mirum, and Chief Solutions Officer at Analytikus. Regardless of the position, he has remained passionate about using analytics and data-driven decision-making to improve Higher Education. Kitty Kautzer is Chief Academic Officer at Herzing University, USA, and
has previously been appointed as Provost of Academic Affairs. Prior to joining Herzing University, Kautzer served as the Chief Academic Officer at an educational corporation. She also had an 11-year tenure with another institution, where she held several leadership positions, including Vice President of academic affairs and interim Chief Academic Officer. Lucas Kohnke is Senior Lecturer at the Department of English Language
Education at The Education University of Hong Kong. His research interests include technology-supported teaching and learning, professional development using information communication technologies, and second language learning/acquisition. His research has been published in journals such as the Journal of Education for Teaching, Educational Technology & Society, RELC Journal, The Asia-Pacific Education Researcher, and TESOL Journal. Kari Goin Kono is a senior user experience designer with over 10 years of
experience in online learning and designing digital environments within higher education. She has an extensive research agenda geared toward supporting faculty with inclusive teaching practices including digital
accessibility and co-construction as a practice in equitable design. She has been published in the journals Online Learning, Current Issues in Emerging ELearning, the Journal of Interactive Technology and Pedagogy, and the NW Journal of Teacher Education. Andrea Leonard spent nearly a decade designing and teaching both online
and hybrid chemistry courses at the University of Louisiana at Lafayette before joining the Office of Distance Learning as an Instructional Designer in 2019. Andrea’s education and certification include a BSc in chemistry from UL Lafayette and an MSc in chemistry with a concentration in organic chemistry from Louisiana State University. She is also a certified and very active Quality Matters Master Reviewer. Her research interests include the discovery and application of new adaptive and interactive teaching technologies Catherine A. Manly is a Postdoctoral Researcher at the City University of
New York, USA, and a Professor of Practice in higher education at Bay Path University, USA. She earned her PhD in higher education from the University of Massachusetts Amherst. She brings a social justice lens to quantitative investigation of transformational innovation. Her research aims to improve affordable postsecondary access and success for students underserved by traditional higher education, particularly through the changes possible because of online and educational technologies. Her work has been published in journals such as Research in Higher Education and Review of Higher Education. Patsy D. Moskal is Director of the Digital Learning Impact Evaluation at the University of Central Florida (UCF), USA. Since 1996, she has been a liaison for faculty research of distributed learning and teaching effectiveness at UCF. Patsy specializes in statistics, graphics, program evaluation, and applied data analysis. She has extensive experience in research methods including survey development, interviewing, and conducting focus groups and frequently serves as an evaluation consultant to school districts, and industry and government organizations. Currently, her research focuses on the use of learning analytics, adaptive learning, and digital technologies to improve student success James R. Paradiso is Associate Instructional Designer and the Affordable Instructional Materials (AIM) Program Coordinator for Open Education at the University of Central Florida, USA. His main areas of research and professional specialization are open education and adaptive learning— with a particular focus on devising and implementing strategies to scale
open educational practices and engineering data-driven learning solutions across multiple internal and external stakeholder communities. Thomas Penniston leverages institutional, academic, and learning analytics to inform course redesigns and improve student engagement and success. He earned his PhD through the University of Maryland, Baltimore County’s (UMBC) Language, Literacy, and Culture program, and has extensive experience with quantitative, qualitative, and mixed methods designs. Penniston has been involved in education for over two decades, teaching students ranging in age and skill-level from early elementary to doctoral, in both domestic and international settings (including as a Peace Corps Volunteer in Moldova). He has worked in online and blended learning in different capacities for the majority of those years as an instructor, builder, and administrator. Anthony G. Picciano is Professor at the City University of New York’s
Hunter College and Graduate Center, USA. He has also held administrative appointments including that of Senior Vice President and Deputy to the President at Hunter College. He has authored or co-authored 20 books, numerous articles, and edited 10 special journal editions. He was a founder and continues to serve on the Board of Directors of the Online Learning Consortium. Picciano has received wide recognition for his scholarship and research including being named the 2010 recipient of the Alfred P. Sloan Consortium’s National Award for Outstanding Achievement in Online Education by an Individual. Timothy M. Renick is the founding Executive Director of the National Institute for Student Success and Professor of Religious Studies at Georgia State University, USA. Between 2008 and 2020, he directed the student success efforts of the university, overseeing a 62% improvement in graduation rates and the elimination of all equity gaps based on students’ race, ethnicity, or income level. Renick has testified on strategies for helping university students succeed before the US Senate and has twice been invited to speak at the White House. Mark Jack Smith is Vice President of Human Resources at Petroleum GeoServices (PGS) in Oslo, Norway. He has extensive experience in developing and leading Human Resources (HR) teams, talent management processes, and knowledge management initiatives. Mark has also contributed to learning and development through his research and publications, including a chapter in Blended Learning Research Perspectives, Volume 3 and a lecture at the Learning 2020 Conference. Prior to joining PGS, Mark held senior
HR and knowledge management positions at PricewaterhouseCoopers, and McKinsey & Company. Mark earned an MSc degree in information science from the Pratt Institute. Megan Tesene is an advocate and higher education strategist who partners
with a broad range of postsecondary leaders and constituencies across the United States to support public universities in the adoption and implementation of evidence-based teaching practices and educational technologies. Her work centers on enhancing pedagogy, improving accessibility, and building institutional capacities to support equity in higher education. Megan is a social scientist by training with applied expertise in program evaluation, faculty learning communities, and equitable digital learning initiatives. She previously worked at Georgia State University, where she managed interdisciplinary research projects leveraging educational technologies across undergraduate gateway courses. Karen Vignare is a strategic innovator leveraging emerging technologies
to improve access, equitable success, and flexibility within higher education. Karen engages a network of public research universities committed to scaling process improvement, effective use of educational technology, and using data to continuously improve learning outcomes. Karen previously served as a Vice Provost, at the University of Maryland University College, the largest online public open access institution where she led innovations in adaptive learning, student success and analytics. She has published extensively on online learning, analytics, and open educational resources. Morgan C. Wang received his PhD from Iowa State University in 1991. He is the funding Director of the Data Science Program and Professor of Statistics and Data Science at the University of Central Florida, USA. He has published one book (Integrating Results through Meta-Analytic Review, 1999), and over 100 articles in refereed journals and conference proceedings on topics including big data analytics, meta-analysis, computer security, business analytics, healthcare analytics, and automatic intelligent analytics. He is an elected member of International Statistical Association, data science advisor of American Statistical Association, and member of American Statistical Association and International Chinese Statistical Association. Yanzhu Wu has over 15 years of experience in the instructional design and
technology field. She earned her Master’s and Ph.D. in Instructional Design and Technology from Virginia Tech. She currently works as an Instructional Designer for the Office of Distance Learning at the University of Louisiana
at Lafayette. As an instructional designer, she is passionate about creating engaging and effective learning experiences that meet the needs of various learners. Prior to joining UL Lafayette, she served as assistant director for the Office of Instructional Technology at Virginia Tech for several years.

Jianbin Zhu is a senior biostatistician in the Research Institute of
AdventHealth and a Ph.D. candidate in the Statistics and Data Science Department of the University of Central Florida. He received his first Ph.D. degree in Engineering Mechanics in 2011 and is currently seeking another Ph.D. degree in Big Data Analytics. He has ten years’ experience in statistics and data science. His research areas are statistical analysis, big data analytics, machine learning, and predictive modeling.
SECTION I
Introduction
1
DATA ANALYTICS AND ADAPTIVE LEARNING
Increasing the odds

Philip Ice and Charles Dziuban
Data analytics and adaptive learning are two critical innovations gaining a foothold in higher education that we must continue to support, interrogate, and understand if we are to bend higher education’s iron triangle and realize the sector’s longstanding promise of catalyzing equitable success. The opportunity and peril of these pioneering 21st century technologies (made available by recent advances in the learning sciences, high performance computing, cloud data storage and analytics, AI, and machine learning) require our full engagement and study. Without the foundational and advanced insights compiled in this volume and continued careful measurement, transparency around assumptions and hypothesis testing, and open debate, we will fail to devise and implement both ethical and more equitable enhancements to the student experience – improvements that are both urgent and vital for realizing higher education’s promise of delivering greater opportunity and value to current and future generations of students.
—Rahim S. Rajan, Deputy Director, Postsecondary Success, Bill & Melinda Gates Foundation
This book contains chapters on data analytics and adaptive learning—two approaches used to maximize student success by removing obstacles and increasing the favorable odds. Although considered separately, these approaches are inextricably bound to each other in a complex educational environment, with the shared premise that information from data will improve the learning process. Analytics has an adaptive component, and the interactive nature of adaptive learning is fundamental to analytics (Dziuban et al.,
2020). They come from common pedagogical theories. Both practices address a fundamental problem in higher education. Because of the severe asymmetry (Taleb, 2012, 2020) in virtually all aspects of student life, the playing field is not even close to even.

The asymmetry problem: it is pervasive
Findings from the Pell Institute (2021) show that students living in the bottom economic quartile have an 11 percent chance of completing a college degree. In terms of opportunity costs, the odds against these youth are 9:1. On the other hand, students living in the top economic quartile have a 77 percent chance of completing college. Based on financial position alone, the odds in favor of their college success are 3:1. Friedman and Laurison (2019) characterized this as “the following tail wind of wealth advantage,” which positions people from the privileged class as seven times more likely to land in high-paying, elite positions; they constitute the primary candidate pool for job entry because of the inequitable economic, cultural, social, and educational status that they inherit. In a recent study reported in Scientific American, Boghosian (2019) models free market economies, showing that resources will always flow from the poor to the rich unless markets compensate for wealth advantage by redistribution. This economic asymmetry, according to him, is the root cause of opportunity and educational inequity.

Additionally, the cost of postsecondary education is crippling. Hess (2020) puts the accumulated college debt in the United States at more than 1.7 trillion dollars, which primarily encumbers the bottom economic quartile. If that debt were a gross domestic product (GDP), it would be the thirteenth largest economy in the world. Mitchell (2020) presents evidence that families in the lowest economic quartile carry the largest proportion of the encumbrance. Mitchell further demonstrates that for the period 2016–2021 the largest increase (43%) impacted Black students, who are overrepresented in the lowest economic quartile. A canon of recent literature documents the unequal opportunity (asymmetry) in our country and educational system (Benjamin, 2019; Eubanks, 2018; Jack, 2019; McGhee, 2021; Mullainathan & Shafir, 2013; O’Neil, 2016; Taleb, 2020; Wilkerson, 2011).
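For readers less used to odds language, the conversion behind these figures is simple arithmetic (a standard probability-to-odds formula, not a calculation taken from the Pell report):

$$ \text{odds against} = \frac{1-p}{p}, \qquad \text{odds in favor} = \frac{p}{1-p}. $$

Treating an 11 percent completion rate as roughly one in ten gives odds against of about $0.9/0.1 = 9$, the 9:1 figure above, while a 77 percent rate gives odds in favor of about $0.77/0.23 \approx 3.3$, or roughly 3:1.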
Analytics and adaptive learning: possible solutions

Data analytics: I see you

Data analytics has a long history in the corporate sector, especially in customer relations management, where companies realized that understanding client interests and buying preferences increased their conditional
profit margins (Chai et al., 2020; Corritore et al., 2020). Furthermore, universities all over the world are exploring analytics to improve student success. In fact, that process is a priority for educational and professorial organizations such as the Digital Analytics Association (n.d.) and the Society for Learning Analytics Research-SoLAR (2021), both of which emphasize the importance of analytics for data scientists. Moreover, an EDUCAUSE Horizon Report: Data and Analytics Edition (2022) features these topics: Artificial Intelligence, Analytics, Business Intelligence, Data Management Planning, Data Privacy, Data Security, Data Warehouse, Diversity, Equity and Inclusion, Digital Transformation, Enterprise Information Systems, and Student Information Systems. This list, by its sheer comprehensiveness, shows how analytics has intersected with all aspects of higher education.

At the institutional level, analytic procedures that predict at-risk groups or individuals commonly use student information system variables to develop their models: SAT, ACT, High School GPA, Major GPA, Institution GPA, International Baccalaureate Completion, Transfer Status, Gender, Minority Status, Dual Enrollment, and others. Strategies involve methods such as logistic regression (Pampel, 2020) predicting the binary variable success or not, neural networks (Samek et al., 2021), classification and regression trees (Ma, 2018), and other predictive techniques. Data scientists must weigh the risk of errors in identifying students who genuinely need support and would benefit from augmented instruction. This is analogous to the costs of Type I and Type II errors in statistics. However, this is where the analysis ends and judgment comes into play (Cornett, 1990; Setenyi, 1995; Silver, 2012). Institutions must assume responsibility for their decisions—data never make them. Invariably their predictive models are good; however, there is a caveat. Often the best predictors, such as institutional GPA, are not tractable: they classify very well, but there is little that can compensate for academic history and only a small chance that such variables will be responsive to instruction. This is the analytics conundrum. To counteract poor academic performance, analysts must identify surrogate variables that respond to instruction and increase the odds of student success. This is a fundamental challenge faced by all analytics procedures. The model that works perfectly well in one context fails in another. The question becomes not which model but which model for which context.
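To make the modeling step concrete, the sketch below fits a logistic regression to invented student-information-system variables and flags an at-risk group at an arbitrary probability cutoff. It is a minimal illustration of the general approach described above, not any institution's actual model; every variable name, value, and threshold is hypothetical.

```python
# Illustrative sketch only: a logistic-regression "at-risk" model built from
# hypothetical student-information-system variables. All data are synthetic.
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
n = 1000
students = pd.DataFrame({
    "hs_gpa": rng.normal(3.0, 0.5, n).clip(0, 4),
    "sat_total": rng.normal(1100, 150, n).clip(400, 1600),
    "institution_gpa": rng.normal(2.8, 0.7, n).clip(0, 4),
    "transfer": rng.integers(0, 2, n),
    "dual_enrollment": rng.integers(0, 2, n),
})

# Synthetic outcome: the chance of success rises with the GPA and test variables.
logit = (-6.0 + 1.2 * students["hs_gpa"] + 1.0 * students["institution_gpa"]
         + 0.002 * students["sat_total"])
students["success"] = rng.binomial(1, 1.0 / (1.0 + np.exp(-logit)))

X = students.drop(columns="success")
y = students["success"]
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
p_success = model.predict_proba(X_test)[:, 1]
print("AUC:", round(roc_auc_score(y_test, p_success), 3))

# The probability cutoff is a policy decision, not a statistical one:
# it trades false alarms (Type I) against missed students (Type II).
at_risk = p_success < 0.5
print("Students flagged as at-risk:", int(at_risk.sum()))
```

Where to set the cutoff is exactly the judgment call noted above: lowering it flags more students (more false alarms), raising it misses more students who needed support.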
Another approach to the analytics problem has been to place control of the process in the hands of learners through student-facing dashboards where they can monitor their progress on a continuous basis (Williamson & Kizilcec, 2022). They can check against a number of course markers, comparing their status either to others in the class or against predetermined criterion values. The assumption is that when students monitor themselves, empowerment cedes to them and encourages engagement. The alternative is to provide the dashboard metrics to instructors and advisors so they can monitor student progress on a continual basis, identifying those who might be at risk in certain aspects of the learning trajectory. However, analytics in its present state is a work in progress, best characterized by the concept of emergence in complex systems (Mitchell, 2011; Page, 2009) rather than by strategic planning and design.

Adaptive learning: let's make it up as we go
Adaptive learning, the second area addressed in this book, increases the odds of student success just as data analytics does (Levy, 2013; Peng & Fu, 2022). However, it approaches the risk problem in an alternative way. Whereas analytics identifies potentially at-risk students, adaptive learning assesses where students are in the learning cycle and enables their achievement at the most effective pace. Carroll (1963) outlined the tenets of adaptive learning when he posited that if learning time is constant, knowledge acquisition will be variable. For instance, if all students spend one semester in college algebra, their learning outcomes will vary. However, if the desired outcome is to have students at equivalent achievement status at the conclusion of their studies, how long they will have to accomplish that will differ for individuals or equivalent-ability groups. Carroll (1963) partitioned the numerator of his time-allowed-divided-by-time-needed formulation into opportunity, perseverance, and aptitude. The intersection of those elements in the numerator creates second, meta-level learning components: mediated expectations, potential progress, and likelihood of success. All three hinge on learning potential through self-expectations, growth, and prior odds of success. Although the notion of adaptive learning is simple, its execution is not, because the interactions among the elements are more important than the elements themselves. This is what gives adaptive learning the emergent property found in complex systems.
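Spelled out, the ratio at the heart of Carroll's model is usually rendered as follows (a standard textbook statement of the 1963 model, not a quotation from this chapter):

$$ \text{degree of learning} = f\!\left(\frac{\text{time actually spent}}{\text{time needed to learn}}\right) $$

Holding time constant (everyone gets one semester) forces the degree of learning to vary across students; holding the degree of learning constant forces the time required to vary, which is precisely the condition adaptive platforms are designed to accommodate.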
Essa and Mojorad (2020) did an extensive numerical analysis of the Carroll model, validating that time is a key element in the learning process and that perseverance increases knowledge gains. However, for that conclusion to hold, the model must function in a learning environment that meets an instructional quality threshold. As usual, there is no substitute for excellent teaching.

The adaptive principle has been a foundation of effective teaching and learning for decades, but without effective technology platforms to support the autocatalytic nature of the process, implementation was not possible. The adaptive workload for instructors, departments, or colleges simply prevented the process from happening. However, in recent years that has changed, and there are now platforms available for supporting adaptive learning. Their functioning has not been without glitches and problems, but by working with universities and corporate partners the systems and their underlying algorithms have improved. One of the advantages of contemporary adaptive technology, unlike analytic approaches, is its ability to provide instructors, advisors, instructional designers, and coaches with real-time and continuously updated student progress markers (Dziuban et al., 2018), such as:

1. baseline status
2. scores on individual learning modules
3. course activity completion rate
4. average scores across modules
5. course engagement time
6. interaction with course material
7. interaction with other students
8. revision activities
9. practice engagement
10. growth from the baseline

These markers accumulate for individual students, groups, or entire classes, creating the possibility of continuous feedback in the learning process (Dziuban et al., 2020). Of course, depending on the platform, the metrics for the adaptive elements will take on different formats and their availability may vary across providers, but they create an interactive learning environment. This resonates with contemporary students, who view an effective learning experience as one that is responsive, nonambiguous, mutually respectful, highly interactive, and maximizes learning latitude—all curated by an instructor dedicated to teaching (Dziuban et al., 2007).

Metaphorically, Pugliese (2016) described adaptive learning as a gathering storm that requires meaningful conversation among institutions, vendors, and instructional stakeholders to accomplish the benefits of personalized learning. He asserted that progress would accrue from student-centric instructional design. At the time, he framed adaptive algorithms as machine learning, advanced algorithms, rule-based, and decision trees. However, the storm he predicted, like so many innovations in education, never really developed because the customary initial enthusiasm mediated into a more considered process. The storm passed quickly. However, three organizations, The University of Central Florida, Colorado Technical University, and Realizeit (adaptive learning platform
provider) followed his advice about a broader conversation and formed a multiyear cooperative research partnership (Dziuban et al., 2017). Through their work, they were able to show that the adaptive environment impacts students’ engagement differently, in a way roughly analogous to learning styles. They demonstrated that the real-time learning variables provided by the platform predicted student risk early in their courses. Further, they demonstrated that the predictive lift of the student markers changed over the trajectory of a course and found critical points in a course past which intervention is no longer effective. Since Pugliese’s (2016) article, considerable understanding of adaptive learning has emerged in higher education and the corporate sector. Both cultures have come to realize the potential of the adaptive method for learning and training. However, like analytics, adaptive learning is a work in progress because, as Forrester (1991) cautions, one can never accurately predict how an intervention will ripple through a complex system.
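As a concrete, and entirely hypothetical, illustration of how the real-time markers listed earlier might be gathered for one student and turned into an early signal, consider the sketch below. The field names, values, and flagging rule are invented for illustration and are not drawn from Realizeit or any other platform discussed here.

```python
# Illustrative sketch only: collecting adaptive-platform markers for one student
# and applying a naive early-risk rule. Names, values, and thresholds are invented.
from dataclasses import dataclass, field

@dataclass
class AdaptiveMarkers:
    week: int
    baseline_score: float                               # 1. baseline status
    module_scores: list = field(default_factory=list)   # 2 and 4. individual and average module scores
    completion_rate: float = 0.0                        # 3. course activity completion rate
    engagement_hours: float = 0.0                       # 5. course engagement time
    material_interactions: int = 0                      # 6. interaction with course material
    peer_interactions: int = 0                          # 7. interaction with other students
    revisions: int = 0                                  # 8. revision activities
    practice_items: int = 0                             # 9. practice engagement

    @property
    def growth(self) -> float:                          # 10. growth from the baseline
        if not self.module_scores:
            return 0.0
        return sum(self.module_scores) / len(self.module_scores) - self.baseline_score

def early_risk_flag(m: AdaptiveMarkers) -> bool:
    """Flag a student whose completion and growth both lag early in the course."""
    return m.completion_rate < 0.5 and m.growth <= 0.0

week3 = AdaptiveMarkers(week=3, baseline_score=0.55,
                        module_scores=[0.50, 0.52], completion_rate=0.40,
                        engagement_hours=2.5, material_interactions=12,
                        peer_interactions=3, revisions=1, practice_items=10)
print(early_risk_flag(week3))  # True: a candidate for outreach while it can still help
```

In practice, as the partnership's findings suggest, both the markers that matter and the point past which outreach stops helping shift over the course trajectory, so any rule like this one would need to be re-estimated as the course unfolds.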
This section begins with a consideration of a problem in physics that leads to chaos—the three-body problem (Wood, 2021). The difficulty arises when the gravitational forces of three bodies interact—for instance, the earth, moon, and sun. Interestingly, Newtonian physics can easily identify the relational pattern between two bodies, such as the earth and moon. However, introduce a third body (as we do in this section) and the relationships become far more difficult, if not impossible, to identify. The system becomes chaotic (at least in physics), with an ever-changing, unpredictable cycle of relationships. Whimsically, reports suggest that the problem made Newton's head ache, as he complained to Halley. The concept was the basis of an award-winning science fiction trilogy by Cixin Liu (The Three-Body Problem, 2014; The Dark Forest, 2015; Death's End, 2016). The aliens, the Trisolarans, lived on a planet with three suns that exerted chaotic influences on them, causing complete unpredictability for such things as night and day, the seasons, and weather catastrophes. One might wonder why we introduced this concept. For one, education in the digital age is just as complicated as problems in physics, especially when we consider our own three-body problem—the influences that teacher, student, and content exert on one another. The Trisolarans have nothing on educators in the twenty-first century. Yoda's admonition reminds us that although we address the connection between data analytics and adaptive learning, we would be remiss in ignoring the big data elephant in education. Big data comes closer to Pugliese's storm than either of the other two and has revolutionized our thinking about sampling, estimation, hypothesis testing, measurement error, and
decision-making. Industry has capitalized on big data to build analytics that produce astonishingly precise predictions about customer behavior, hiring practices, and strategic planning (Aljohani et al., 2022; Sheehy, 2020). Floridi (2016) argues that big data analysis can identify small but important patterns that remain obscured under any other method. Big data has had a similar impact on education (Dziuban et al., 2016). By examining institutional acceptance and success, registration patterns, diversity and inclusion, dropout prevention, transfer status, change in major, effective major selection, and other characteristics of student life, universities have used big data to improve marketing and enrollment projection (Social Assurity, 2020). In recent years, big data has played a critical role in understanding and predicting trends in the COVID-19 pandemic. Without the capability to process and simplify the data daily, the pandemic would have taken a much greater toll on human life. There is a strong connection among big data, data analytics, and adaptive learning (Figure 1.1). The intersection of big data and analytics creates a new science of small-pattern prediction where traditionally those relationships were unobservable. Data analytics and adaptive learning have reinforced each other, making adaptive analytics a viable alternative to more traditional methods. Combining big data with adaptive learning has enabled the identification of learning patterns that would be unstable in most instructional situations.
FIGURE 1.1 The Three-Body Education Data Problem. (Venn diagram of Big Data, Data Analytics, and Adaptive Learning; the pairwise overlaps are labeled Small Patterns Prediction, Stable Learning Patterns, and Adaptive Analytics, and the center is labeled the Education Data Trifecta.)
In effect, the learning trifecta creates a three-body education data solution rather than a chaotic problem. The connections are real, strong, meaningful, and informative. There are additional connections; for instance, there is an almost exact correspondence between the elements of adaptive learning and the principles of supply chain management: supplier–instructional designer, manufacturer–instructional designer/instructor, distributor/wholesaler–adaptive platform, retailer–instructor, customer–student (Johnson et al., 2020). All three of our bodies, data analytics, adaptive learning, and big data, produce results that correspond closely with the epidemiological paradigm: clinical–learning issues, patient–student, global–institutional (Tufte, 2019). A systematic mapping of the literature shows artificial intelligence and big data analytics being combined with adaptive systems (Kabudi et al., 2021). Adaptive learning creates the foundation for a flipped classroom (Clark et al., 2021). Big data analytics can identify the components of a class that will lead to a high degree of student satisfaction (Wang et al., 2009). These are examples of the connections that currently exist; more are emerging daily.
Fair and balanced
This chapter emphasizes the potential of data analytics, adaptive learning, and the third body, big data. However, each of these processes has weathered critiques and criticisms that are worthy of note.
1. Adaptive learning can isolate students, reducing instructors to a facilitation role and stripping away their teaching leadership (Kok, 2020). Further, there have been technological difficulties with the platforms that are at odds with instructional objectives (Sharma, 2019). Adaptive learning also requires large investments of time and resources, especially for content-agnostic platforms (Kok, 2020).
2. Data analytics can overlook important variables and inappropriately substitute available data for the metrics of real interest—the data bait and switch (Kahneman, 2011). Analytics can fall prey to noise that obscures important findings (Silver, 2012). Analytic models are simplifications of the world and can lead to misleading portrayals (Silver, 2012). Unfortunately, most analytics algorithms are opaque and hidden, and therefore immune to feedback (Benjamin, 2019). Data security is a perennial concern (Hartford, n.d.).
3. Big data can amount to cherry-picking at an industrial level, collecting so many variables that spurious relationships overwhelm meaningful information (Taleb, 2012). Problems and misinformation in interpreting correlations in big data arise from collinearity (Marcus & Davis, 2014). There is always the danger of inherent bias, with no transparency and little hope of correction (Almond-Dannenbring et al., 2022).
All data analytics professionals, adaptive learning specialists, and big data scientists should pay attention to and respect these criticisms and cautions. The responsibility is considerable and is best summarized by Tufte (2001), speaking of visual presentations. Paraphrased, his principles become:
1. show the data
2. ensure the reader responds to the substance of your findings, not your methods
3. do not distort with the presentation
4. use presentation space wisely
5. make large data sets understandable
6. encourage the reader to make meaningful comparisons
7. show results at multiple levels
8. make sure the presentation has a purpose
9. integrate what you say with what the data say
Behind the curtain
Complicating the effective convergence of big data, advanced analytics, and adaptive learning is the system onto which they will be overlaid: higher education. While all organizations have unique histories and characteristics, higher education may be one of the most complex. It is steeped in tradition and built upon interconnected political, philosophical, and cultural layers that are frequently not explicitly codified but, rather, are retained as tacit knowledge by members of the Academy (Hill, 2022). Within this context, successful implementations will require infrastructural transformation, diligent oversight of processes, and complex change management. Earlier, big data's role in helping uncover trends during the COVID-19 pandemic was noted. However, COVID-19 was also responsible for driving the increased use of data in higher education. Traditional colleges and universities are highly dependent upon ancillary cash flows, such as housing, concessions, and parking, to meet the demands of their operating budgets (Deloitte United States, 2020). When the move to online was necessitated by COVID-19, many institutions began to realize alarming deficits, which, despite federal financial intervention, ultimately resulted in numerous closures and increased scrutiny related to expenditures and outcomes (Aspegren, 2021).
Labor in the clouds
Given the increased pressure campuses are now facing, the demand for evidence-based approaches to both strategic and tactical decisions has increased dramatically. However, a large percentage of institutions do not have the ability to federate their data, because of siloed and often antiquated systems. In a 2021 analysis of data-related issues in higher education, the Association of Public and Land-grant Universities interviewed 24 representatives from the Commission on Information, Measurement, and Analysis and found that the issue of data silos was noted as a major barrier to effective data analysis by all members, with only one institution stating that they had a centralized data repository (Nadasen & Alig, 2021). To understand why these silos persist it is necessary to examine the two factors necessary to facilitate change: infrastructure and personnel. Though cloud computing is seemingly ubiquitous in contemporary society, colleges and universities have been laggards with respect to implementing these systems, as a function of cost and timelines. The process of moving from a traditional on-premises implementation to a cloud-based solution can involve 12–18 months and costs that can quickly reach the mid-seven-figure range. For many institutions, projects with these types of costs and duration are simply not tenable, given already strained budgets, making the prospects of developing an infrastructure capable of leveraging the power of big data, advanced analytics, and predictive learning grim at best. In other instances, individual departments within an institution may have already undertaken to move their operations to the cloud. However, the question here becomes which cloud? AWS, Microsoft Azure, and Google Cloud are not interoperable and require the use of a cross-cloud replication service, such as Snowflake (Sharing Data Securely across Regions and Cloud Platforms—Snowflake Documentation, n.d.), with this type of integration having high costs and long timelines as well. While costs and a high pain threshold for lengthy implementations are the primary factors associated with infrastructure development, the rapidly increasing salaries of skilled IT and data science personnel are also a limiting factor. While these groups have always been in high demand, COVID-19 and the Great Resignation catalyzed a stratospheric increase in salaries that will continue to accelerate as technology expands to meet the needs of industry and society at large (Carnegie, 2022). Higher education institutions will need to invest in individuals to fill these roles and will likely find it difficult to fund compensation that is competitive with other alternatives in the market. Developing mechanisms to develop both the workforce and infrastructure necessary to realize the promise of big
data, analytics, and adaptive learning, at scale, is already a daunting task, one that will become increasingly difficult as more institutions embark on this journey.
Lava lamps and learning
In the movie The Big Short, a quote frequently attributed to Mark Twain is displayed at the point where it becomes apparent that no one really knows what is going on with collateralized debt obligations:
It ain't what you don't know that gets you into trouble. It's what you know for sure that just ain't so.
As data scientists and the practitioners with whom they collaborate begin utilizing increasingly sophisticated machine learning techniques, keeping this warning in the front of one's mind is critical. By 2013, computational power had reached the point where deep learning, a branch of machine learning based on artificial neural networks, became feasible: multiple layers of processing are connected to extract increasingly higher-level features from data. In other words, it is doing what humans hopefully do every day—learning by example. A classic illustration is that if you show a deep learning algorithm enough pictures of dogs, it will be able to pick the pictures of dogs out of a collection of animal pictures. Over time, deep learning's capabilities have increased, and certain techniques have been able to move, to a degree, from the specific to the general using semi-supervised deep learning: giving the algorithm some labeled examples and then letting it discover similar features on its own (Pearson, 2016).
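As a concrete, runnable stand-in for the dog-picture example, the sketch below uses scikit-learn's bundled handwritten-digits dataset (rather than photographs) to train a small neural network purely from labeled examples and then checks how well it recognizes digits it has never seen:

```python
from sklearn.datasets import load_digits
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

# "Learning by example": the model is shown labeled images and nothing else.
digits = load_digits()
X_train, X_test, y_train, y_test = train_test_split(
    digits.data, digits.target, test_size=0.25, random_state=0
)

clf = MLPClassifier(hidden_layer_sizes=(64, 32), max_iter=1000, random_state=0)
clf.fit(X_train, y_train)

# Accuracy on images the model has never seen (typically well above 90%).
print(clf.score(X_test, y_test))
```

The point is not the particular library: nothing about what a "4" looks like is programmed in; the network infers it from the examples alone.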
While this technology is extremely impressive and is used by virtually every tech and marketing company, it is also problematic in that we do not fully understand how it works. Because it is modeled after the way biological neural networks operate, it is informative to consider how a person thinks. They may see an apple, classify it as such based on prior knowledge, apply both subjective and objective analysis to it, consult their internal systems to see if they are hungry, and then decide whether to eat the apple. This scenario may take only a few seconds, but there are billions of neurons involved in the process, and even if we were able to scan the person's brain while those interactions were occurring, we could not make sense of the data. Nor, in many cases, would the person be able to tell you what factors they considered, in what order, to arrive at the decision, mirroring the situation in which data scientists find themselves with deep learning. Compounding the problem, researchers at Columbia University recently created a semi-supervised deep learning experiment in which the algorithm detected variables that it could not describe to the researchers. To determine whether artificial intelligence could analyze complex motion and surface new variables, the researchers showed the algorithm examples of well-understood phenomena, such as double pendulums, from which additional variables not considered by humans were detected. Then the team asked it to examine fire, lava lamps, and air dancers. In the case of flame patterns, 25 variables were returned. However, when queried about their nature, the program was incapable of describing them, even though the predictions of flame action and patterns were largely accurate (Chen et al., 2022). Even now, some of the companies engaged in machine learning applications for higher education have publicly stated that they do not fully understand the models that are created (Ice & Layne, 2020). While disconcerting, at least the variables that are being fed to machine learning algorithms can be understood and analyzed using more conventional statistical techniques to provide some degree of interpretability. However, imagine a scenario in which a deep learning model was trained on data traditionally used to predict phenomena such as retention, was then allowed to ingest all the data that exist on campus, and went on to produce highly accurate predictions about which students were at risk, but was unable to provide researchers with an explanation of what it was measuring. While this scenario may at first appear theoretical, the speed with which deep learning is beginning to understand complex systems is quite stunning. What if the deep learning model above was trained on data that had inherent, though unintentional, biases? The algorithm does not have a sense of bias or an understanding of ethics; it simply applies what it has learned to new data, even if that means transference of bias. While deep learning is certainly the current darling of the analytics world, collection, analysis, and reporting processes have historically relied on models, categorization, and labeling based on human input. While humans still have some touchpoints with the model on which deep learning is trained, it is important to remember that the complete history of human research is distilled and embedded within those examples. This process has the potential to generate models that are oversimplified or incomplete, or even harmful through the perpetuation of stereotypes and culturally repressive practices. As such, it is imperative that practitioners are thoughtful about the processes and tools they are creating and that institutions exercise appropriate oversight to ensure equity (2022 EDUCAUSE Horizon Report: Data and Analytics Edition, 2022).
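A deliberately simplified sketch of that bias transference, using synthetic data and a hypothetical "at-risk" label (none of this reflects any real institution's data):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 5000
group = rng.integers(0, 2, n)       # a demographic attribute that should be irrelevant
ability = rng.normal(0.0, 1.0, n)   # the factor that should drive the outcome

# Historical labels encode a bias: students in group 1 were flagged "at risk"
# more often than ability alone warrants.
logit = -ability + 0.8 * group
at_risk = rng.random(n) < 1.0 / (1.0 + np.exp(-logit))

model = LogisticRegression().fit(np.column_stack([ability, group]), at_risk)

# Two hypothetical students with identical ability but different group membership:
# the model, having learned from biased labels, scores them differently.
print(model.predict_proba(np.array([[0.0, 0], [0.0, 1]]))[:, 1])
```

Nothing in the fitting procedure "knows" that the group term is unfair; it is simply reproduced.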
Despite the potential pitfalls, it is important to note that machine learning also has the potential to identify historically ingrained biases, using techniques such as gradient boosting and random forest analysis. Traditionally, linear, rule-based approaches have been good at identifying a few highly predictive variables when looking at issues such as retention, but they are found lacking when attempting to detect the interaction of numerous small variables that only have significant predictive power when considered holistically (Using Machine Learning to Improve Student Success in Higher Education, 2022).
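A minimal sketch of that contrast on synthetic data (the features and the "retention" framing are illustrative only): a linear model struggles when the signal lives in an interaction between two individually weak variables, while a tree ensemble picks it up.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
n = 4000
X = rng.normal(size=(n, 6))                 # six weak "student life" variables
# Retention depends on an interaction: neither variable predicts much alone.
retained = (X[:, 0] * X[:, 1] > 0).astype(int)

linear = LogisticRegression(max_iter=1000)
trees = GradientBoostingClassifier(random_state=0)

print("linear  :", cross_val_score(linear, X, retained, cv=5).mean())  # near chance (~0.5)
print("boosted :", cross_val_score(trees, X, retained, cv=5).mean())   # well above chance
```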
If you build it, they still might not come
Higher education has a long history of resisting change. The tripartite requirements of research, teaching, and service are demanding for tenure-track faculty, while those in clinical positions and community colleges typically carry an often onerous class load. These pressures cause faculty to resist change, while their nonacademic counterparts within the university are similarly stretched across multiple tasks. Among the areas with the greatest resistance to adoption is technology integration. Aside from taking time away from other roles, this is an area in which faculty, administrators, and staff often find themselves in novice status, creating significant discomfort (Conklin, 2020). With respect to big data, analytics, and their ability to drive adaptive learning, it is essential that users be able to derive insights related to the problems they are addressing. However, before those insights can be generated, users need to fully understand what data elements or entities represent, how to interpret findings, and what constitutes ethical use of data. For these reasons, it is imperative that institutions ensure the transformation to a data-driven culture is prefaced by data literacy initiatives, with multiple levels, focusing not only on those who will generate the data but also on those who will consume it or be affected by the decisions of those who do (2022 EDUCAUSE Horizon Report: Data and Analytics Edition, 2022). While the creation of a data-driven culture will ultimately result in increased efficiencies across all facets of the university, such initiatives are costly, with timelines that can easily exceed those related to infrastructure development. Here, maintaining commitment and focusing on positive outcomes are key to long-term success.
Three times two
To understand the immensity of successfully transforming higher education, it is necessary to step back and conceptualize the three-body data
problem within the context of the three institutional forces that are responsible for creating, maintaining, and governing the integration of big data, analytics, and adaptive learning. However, as societal demands on the Academy continue to increase, it is imperative that these forces be reconciled. It is our hope that this book is a step in that direction. That said, perhaps this preface is best concluded with another quote from Master Yoda:
Do. Or do not. There is no try.
References
2022 EDUCAUSE horizon report: Data and analytics edition. (2022, July 18). EDUCAUSE. https://library.educause.edu/resources/2022/7/2022-educause-horizon-report-data-and-analytics-edition
Aljohani, N. R., Aslam, M. A., Khadidos, A. O., & Hassan, S.-U. (2022). A methodological framework to predict future market needs for sustainable skills management using AI and big data technologies. Applied Sciences, 12(14), 6898. https://doi.org/10.3390/app12146898
Almond-Dannenbring, T., Easter, M., Feng, L., Guarcello, M., Ham, M., Machajewski, S., Maness, H., Miller, A., Mooney, S., Moore, A., & Kendall, E. (2022, May 25). A framework for student success analytics. EDUCAUSE. https://library.educause.edu/resources/2022/5/a-framework-for-student-success-analytics
Aspegren, E. (2021, March 29). These colleges survived World Wars, the Spanish flu and more: They couldn't withstand COVID-19 pandemic. USA TODAY. https://eu.usatoday.com/story/news/education/2021/01/28/covid-19-colleges-concordia-new-york-education/4302980001/
Benjamin, R. (2019). Race after technology: Abolitionist tools for the new Jim Code. Polity.
Boghosian, B. M. (2019). The inescapable casino. Scientific American, 321(5), 70–77.
Carnegie, M. (2022, February 11). After the great resignation, tech firms are getting desperate. WIRED. https://www.wired.com/story/great-resignation-perks-tech/
Carroll, J. B. (1963). A model of school learning. Teachers College Record, 64(8), 723–733.
Chai, W., Ehrens, T., & Kiwak, K. (2020, September 25). What is CRM (customer relationship management)? SearchCustomerExperience. https://www.techtarget.com/searchcustomerexperience/definition/CRM-customer-relationship-management
Chen, B., Huang, K., Raghupathi, S., Chandratreya, I., Du, Q., & Lipson, H. (2022, July 25). Automated discovery of fundamental variables hidden in experimental data. Nature Computational Science, 2(7), 433–442. https://doi.org/10.1038/s43588-022-00281-6
Clark, R. M., Kaw, A. K., & Braga Gomes, R. (2021). Adaptive learning: Helpful to the flipped classroom in the online environment of COVID? Computer Applications in Engineering Education, 30(2), 517–531. https://doi.org/10.1002/cae.22470
Conklin, S. (2020). Using change management as an innovative approach to learning management system. Online Journal of Distance Learning Administration. https://eric.ed.gov/?id=EJ1311024
Cornett, J. W. (1990). Teacher thinking about curriculum and instruction: A case study of a secondary social studies teacher. Theory and Research in Social Education, 28(3), 248–273.
Corritore, M., Goldberg, A., & Srivastava, S. B. (2020). The new analytics of culture. Harvard Business Review. https://hbr.org/2020/01/the-new-analytics-of-culture
Deloitte United States. (2020, May 29). COVID-19 impact on higher education. https://www2.deloitte.com/us/en/pages/public-sector/articles/covid-19-impact-on-higher-education.html
Digital Analytics Association. (n.d.). https://www.digitalanalyticsassociation.org/
Dziuban, C., Howlin, C., Johnson, C., & Moskal, P. (2017). An adaptive learning partnership. EDUCAUSE Review. https://er.educause.edu/articles/2017/12/an-adaptive-learning-partnership
Dziuban, C., Howlin, C., Moskal, P., Johnson, C., Griffin, R., & Hamilton, C. (2020). Adaptive analytics: It's about time. Current Issues in Emerging eLearning, 7(1), Article 4.
Dziuban, C., Moskal, P., Parker, L., Campbell, M., Howlin, C., & Johnson, C. (2018). Adaptive learning: A stabilizing influence across disciplines and universities. Online Learning, 22(3), 7–39. https://olj.onlinelearningconsortium.org/index.php/olj/article/view/1465
Dziuban, C. D., Hartman, J. L., Moskal, P. D., Brophy-Ellison, J., & Shea, P. (2007). Student involvement in online learning. Submitted to the Alfred P. Sloan Foundation.
Dziuban, C. D., Picciano, A. G., Graham, C. R., & Moskal, P. D. (2016). Conducting research in online and blended learning environments: New pedagogical frontiers. Routledge.
Essa, A., & Mojarad, S. (2020). Does time matter in learning? A computer simulation of Carroll's model of learning. In Adaptive instructional systems (pp. 458–474). https://doi.org/10.1007/978-3-030-50788-6_34
Eubanks, V. (2018). Automating inequality: How high-tech tools profile, police, and punish the poor. St. Martin's Press.
Floridi, L. (2016). The 4th revolution: How the infosphere is reshaping human reality. Oxford University Press.
Forrester, J. W. (1991). System dynamics and the lessons of 35 years. In K. B. D. Greene (Ed.), Systems-based approach to policymaking. Kluwer Academic. http://static.clexchange.org/ftp/documents/system-dynamics/SD1991-04SDandLessonsof35Years.pdf
Friedman, S., & Laurison, D. (2019). The class ceiling: Why it pays to be privileged. Policy Press.
Hartford, L. (n.d.). Predictive analytics in higher education. CIO Review. https://education.cioreview.com/cioviewpoint/predictive-analytics-in-higher-education-nid-23571-cid-27.html
Hess, A. J. (2020, June 12). How student debt became a $1.6 trillion crisis. CNBC. https://www.cnbc.com/2020/06/12/how-student-debt-became-a-1point6-trillion-crisis.html
Hill, J. (2022, June 24). The importance of knowledge management in higher education. Bloomfire. https://bloomfire.com/blog/knowledge-management-in-higher-education/
Ice, P., & Layne, M. (2020). Into the breach: The emerging landscape in higher education, students. In M. Cleveland-Innes & R. Garrison (Eds.), An introduction to distance education: Understanding teaching and learning in a new era. Harvard University Press. https://www.taylorfrancis.com/chapters/edit/10.4324/9781315166896-8/breach-ice-layne
Jack, A. A. (2019). The privileged poor: How elite colleges are failing disadvantaged students. Harvard University Press.
Johnson, A., Dziuban, C., Eid, M., & Howlin, C. (2020, January 26). Supply chain management and adaptive learning. Realizeit Labs. https://lab.realizeitlearning.com/research/2020/01/26/Supply-Chain-Management/
Kabudi, T., Pappas, I., & Olsen, D. H. (2021). AI-enabled adaptive learning systems: A systematic mapping of the literature. Computers and Education: Artificial Intelligence, 2. https://doi.org/10.1016/j.caeai.2021.100017
Kahneman, D. (2011). Thinking, fast and slow. Farrar, Straus and Giroux.
Kok, M.-L. (2020, February 14). Strengths and weaknesses of adaptive learning: A case study. eLearning Industry. https://elearningindustry.com/strenghts-weaknesses-adaptive-learning-paths-case-study
Levy, J. C. (2013). Adaptive learning and the human condition. Pearson.
Liu, C. (2014). The three-body problem. Head of Zeus.
Liu, C. (2015). The dark forest. Head of Zeus.
Liu, C. (2016). Death's end. Head of Zeus.
Ma, X. (2018). Using classification and regression trees: A practical primer. IAP.
Marcus, G., & Davis, E. (2014, April 6). Eight (no, nine!) problems with big data. The New York Times. https://www.nytimes.com/2014/04/07/opinion/eight-no-nine-problems-with-big-data.html
McGhee, H. (2021). The sum of us: What racism costs everyone and how we can prosper together. One World.
Mitchell, J. (2020, December 7). On student debt, Biden must decide whose loans to cancel. Wall Street Journal. https://www.wsj.com/articles/on-student-debt-biden-must-decide-who-would-benefit-11607356246
Mitchell, M. (2011). Complexity: A guided tour. Oxford University Press.
Mullainathan, S., & Shafir, E. (2013). Scarcity: Why having too little means so much. Times Books.
Nadasen, D., & Alig, J. (2021, May). Data analytics: Uses, challenges, and best practices at public research universities. Association of Public and Land-grant Universities.
O’Neil, C. (2016). Weapons of math destruction: How big data increases inequality and threatens democracy. Crown. Page, S. E. (2009). Understanding complexity. The Great Courses. Pampel, F. C. (2020). Logistic regression: A primer, 132. Sage Publications. Pearson, J. (2016, July 6). When AI goes wrong, we won’t be able to ask it why. https://www.vice . com /en /article / vv7yd4 /ai - deep - learning - ethics - right - to -explanation Peng, P., & Fu, W. (2022). A pattern recognition method of personalized adaptive learning in online education. Mobile Networks and Applications, 27(3), 1186–1198. https://doi.org /10.1007/s11036 - 022- 01942-6 Pugliese, L. (2016, October 17). Adaptive learning systems: Surviving the storm. Educause Review. https://er.educause.edu/articles/2016/10/adaptive-learning -systems-surviving-the-storm Samek, W., Montavon, G., Lapuschkin, S., Anders, C. J., & Müller, K. R. (2021). Explaining deep neural networks and beyond: A review of methods and applications. Proceedings of the IEEE, 109(3), 247–278. https://doi.org/10 .1109/ JPROC . 2021. 3060483 Setenyi, J. (1995). Teaching democracy in an unpopular democracy. Paper presented at the what to teach about Hungarian democracy conference, Kossuth Klub, Hungary. Sharing data securely across regions and cloud platforms—Snowflake documentation. (n.d.). https://docs.snowflake.com/en/user-guide/secure-data -sharing-across-regions-plaforms.html Sharma, G. (2019, January 19). Discussing the benefits and challenges of adaptive learning and education apps. ELearning Industry. https://elearningindustry.com/ benefits-and-challenges-of-adaptive-learning-education-apps-discussing Sheehy, M. D. (2020, February 4). Using big data to predict consumer behavior and improve ROI. Brandpoint. https://www.brandpoint.com/ blog/using-big -data-to-predict- consumer-behavior-and-improve-roi/ Silver, N. (2012). The signal and the noise: Why so many predictions fail–But some don’t. Penguin Books. Social Assurity. (2020, February 2). The emergence of big data and predictive analytics in college admissions decisions. Social Assurity. https://socialassurity. university/blog/big-data-and-predictive-analytics-college-admissions Society for Learning Analytics Research (SoLAR). (2021, March 24). https:// www.solaresearch.org/ Taleb, N. N. (2012). Antifragile: Things that gain from disorder. Random House. Taleb, N. N. (2020). Skin in the game: Hidden asymmetries in daily life. Random House Trade Paperbacks. The Pell Institute. (2021). Indicators of higher education equity in the United States. Penn AHEAD. Tufte, E. R. (2001). The visual display of quantitative information. Graphics Press. Tufte, E. R. (2019). Beautiful evidence. Graphics Press LLC. Using machine learning to improve student success in higher education. (2022, .mckinsey .com /industries / August 1). McKinsey & Company. https://www
20 Philip Ice and Charles Dziuban
education /our-insights/using-machine-learning-to-improve-student-success-in -higher- education Wang, M. C., Dziuban, C. D., Cook, I. J., & Moskal, P. D. (2009). Dr. Fox rocks: Using data-mining techniques to examine student ratings of instruction. In M. C. Shelley II, L. D.Yore, & B. Hand (Eds.), Quality research in literacy and science education: International perspectives and gold standards. Dordrecht, the Netherlands: Springer. Wilkerson, I. (2011). The warmth of other suns: The epic story of America’s Great Migration. Vintage. Williamson, K., & Kizilcec, R. (2022). A review of learning analytics dashboard research in higher education: Implications for justice, equity, diversity, and inclusion. LAK22: 12th International Learning Analytics and Knowledge Conference. https://doi.org /10.1145/3506860. 3506900 Wood, C. (2021, May 5). Physicists get close to taming the chaos of the ‘threebody problem’. LiveScience. https://www.livescience.com /three-body-problem -statistical-solution.html
SECTION II
Analytics
2 WHAT WE WANT VERSUS WHAT WE HAVE
Transforming teacher performance analytics to personalize professional development
Rhonda Bondie and Chris Dede
We want educators to be self-directed, critically reflective, lifelong learners. However, in the current teacher education system, educators are rarely asked to analyze their own teaching in order to direct their own professional learning. Starting with teacher preparation and continuing through ongoing professional development (PD) throughout a teaching career, teachers have little control over the required topics of PD and typically engage in large group “one-size-fits-some” experiences. These industrial PD experiences may not seem relevant to the school context, student learning, or individual teacher needs. Our current career-long teacher education exemplifies what Argyris and Schön (1992/1974) refer to as beliefs that do not match the theory in use. In short, the methods used for teacher learning are antithetical to the teaching practices we desire P–12 students to experience. What we need instead is to create burden-free, interactive, and collaborative learning environments; engage teachers in recognizing how their prior experiences may shape perceptions and actions; adjust learning to meet the individual strengths and needs of teachers; provide teachers with immediate and useful feedback; and develop agile teacher thinking—the ability to think critically to address challenges as student learning unfolds. PD must model and reinforce how teachers should use teaching practices as tools to guide their interactions with students. In addition to learning teaching practices, PD must examine the potential impact of these strategies on students as human beings, adapt to the context where teaching practices are implemented, and explore how students’ interactions with the teacher and peers will in turn impact their learning, motivation, and
identity formation. However, differentiated, personalized, and adaptive learning are infrequently used or researched in teacher PD. We know very little about how to get what we want. To address this shortfall, the authors have conducted novel studies using immersive technologies and analytics that show both the promise of achieving what we want in teacher education and the challenges that must be overcome. This chapter explores moving from what we have to what we want through the purpose, perils, and promise of three analytic approaches to examining teaching expertise, using simulated classroom interactions delivered via immersive technologies that generate rich behavioral data streams.
Mixed reality simulations to evoke data-rich teacher performances
Mixed reality simulations (MRS) offer opportunities to practice teaching through interactions with avatar students in a virtual classroom (Bondie & Dede, 2021). The avatars, controlled by a human simulation specialist, respond to teaching practices and may also initiate challenges, feedback, and coaching. Teaching practices learned through experiences in the virtual classroom may build confidence and skills that transfer to interactions with real students. For PD providers, MRS provide benefits, such as a standardized experience for assessing growth in skills (Bondie et al., 2019). By leveraging the technology’s affordances (e.g., online access, immersive learning, standard challenges, and pausing or restarting), MRS can redefine and transform field experiences by increasing opportunities for differentiated instruction, personalization, and formative assessments in ways that are not possible through in-person field experiences (Bondie et al., 2021).
Data analyses we have: our experimental study
Our experimental study used block randomization to measure the effect of alternative interventions (differentiated coaching versus written self-reflection prompts) on increasing the use of student-centered, high-information feedback for teachers at different points in their career trajectory. Participants comprised 203 teachers (76% female) from three different types of teacher PD programs: suburban preservice university preparation (n = 68), rural school district early career induction (n = 66), and urban experienced teacher PD through a not-for-profit education organization (n = 68). The programs engaged teachers at different points along a continuum of teaching experience while also representing different geographic
areas. Our PD was embedded into existing courses held by each program during the 2020–2021 school year. Like many teacher educators (e.g., Dieker et al., 2017; Cohen et al., 2020), we sought to test the potential of PD through MRS. However, our goals were to use the technology affordances of MRS—such as the capacity to erase or speed up time and instantly change locations—to implement differentiated and personalized professional learning. Following the simulations, we explored how data analytics can be used to answer what worked for whom under what conditions and what teachers should do next in their professional learning.
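A minimal sketch of block-randomized assignment of the kind described above (participant identifiers and arm names are illustrative, not the study's):

```python
import random

def block_randomize(participants, block_of, arms=("coaching", "self_reflection"), seed=2021):
    """Randomly assign participants to intervention arms within each block
    (here, the PD program), keeping the arms balanced inside every program."""
    rng = random.Random(seed)
    assignment = {}
    for block in set(block_of.values()):
        members = [p for p in participants if block_of[p] == block]
        rng.shuffle(members)
        for i, person in enumerate(members):
            assignment[person] = arms[i % len(arms)]
    return assignment

# Example: six teachers across two of the three program types.
programs = {"t1": "preservice", "t2": "preservice", "t3": "induction",
            "t4": "induction", "t5": "preservice", "t6": "induction"}
print(block_randomize(list(programs), programs))
```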
Results: what we have—behavior without context
A common method for analyzing teacher performance in MRS is for researchers to count the frequency of desired teacher behaviors and create a score for each teacher (Dieker et al., 2017; Mikeska et al., 2019; Cohen et al., 2020). For example, Dieker and colleagues (2017) used the Danielson framework and the Teacher Practice Observation Tool (TPOT) (Hayes et al., 2013) to score teacher performance, both in real classrooms and in the virtual simulator, using a scale of “level 1 being limited observation of the skill to level 4 being mastery of the skill” (p. 68). Cohen et al. (2020) created an “overall quality score (ranging from 1 to 10) that reflected the extent to which teacher candidates supported student avatars in creating classroom norms … and redirecting off-task behaviors” (p. 16). Mikeska et al. (2019) used a rubric to measure preservice teacher skill in facilitating argument-focused discussions, considering five dimensions of the teaching practice (e.g., attending to student ideas responsively and equitably, and facilitating a coherent and connected discussion) and components of those dimensions (e.g., incorporates key ideas represented in students’ prework, elicits substantive contributions from all students, makes use of student thinking and overall coherence of the discussion, makes the content storyline explicit to students, connects/links ideas in substantive ways) (pp. 132–133). These rubrics break down a complex teaching practice into observable component parts. All components are assumed to have equal value for all avatar students and are expected to be used in teaching situations across contexts. Rather than tallying the frequency of specific teaching behaviors in relation to student needs, a holistic score is given for an overall performance. Holistic rubrics are commonly used in teacher education to gauge overall changes in the use of a teaching practice (Danielson, 2013). However, our current approach to measuring teacher performance lacks an account of the types of actions within the teaching practice that changed
or a recognition that component parts may be more challenging for a teacher to learn or more beneficial for students to receive. These overall scores are useful measures for researchers and policymakers, but they do not provide enough specificity to differentiate future PD. Further, there is no recognition of the responsiveness of the teaching practice to individual student needs; only a count of teacher actions is provided, rather than the series of interactions. Given the holistic data analytics many studies are using, it is difficult for teacher educators to provide the personally focused PD that is needed to improve teaching practices and responsiveness to student learning. In this chapter, we demonstrate how using analytics to evaluate a series of teacher–student interactions provides the ability to examine teacher growth in offering specific types of feedback and responsiveness to individual student avatars. For our study (Bondie et al., 2022), we coded each teacher feedback utterance with a value: low information (100), clarifying the student response (200), valuing the student response (300), correcting (400), and pushing student thinking beyond the task (500). Our dependent variable was a weighted mean of the different types of feedback teachers offered during the 7-minute (500 second) simulations. To create the weighted mean, we multiplied the frequency of each type of feedback by its value (100 to 500), summed those products, and divided by the total number of feedback utterances. We gave greater value to the types of feedback that were more beneficial for student learning because that feedback provided students with specific, actionable information (Hattie & Timperley, 2007). We found that the weighted mean provides a more nuanced score than an overall holistic score.
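A minimal sketch of that calculation (labels and variable names are ours, not the study's):

```python
# Feedback codes used in the study: higher values = more beneficial feedback.
FEEDBACK_VALUES = {
    "low_information": 100,
    "clarify": 200,
    "value_student": 300,
    "correct": 400,
    "prompt_thinking": 500,
}

def weighted_feedback_mean(utterances, values=FEEDBACK_VALUES):
    """Weighted mean of coded feedback utterances from one simulation.

    `utterances` is a list of feedback-type labels, one per utterance.
    Summing the value of each utterance and dividing by the count is
    equivalent to multiplying each type's frequency by its value and
    dividing by the total number of utterances.
    """
    if not utterances:
        return 0.0
    return sum(values[u] for u in utterances) / len(utterances)

# Example: five utterances coded from one simulation.
sim_utterances = ["low_information", "clarify", "clarify", "prompt_thinking", "value_student"]
print(weighted_feedback_mean(sim_utterances))  # 260.0
```

Because the weights are passed in as an argument, the same function also accommodates the teacher-set weighting schemes discussed later in the chapter.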
Differences between the interventions
Using data analysis techniques, we were able to determine that the treatment of differentiated coaching, in contrast to the control of differentiated written self-reflection prompts, significantly improved teachers’ use of high-information feedback (Bondie et al., 2019). Figure 2.1 highlights the impact of coaching on teacher use of high-information feedback and also illustrates teacher growth over multiple MRS exposures. All teachers participated in a baseline simulation, called Sim 0, prior to engaging in the PD. Following the PD and prior to Simulation 1 (Sim 1), the teachers were randomly assigned to the interventions of differentiated coaching or self-reflection. After these interventions, Simulation 2 (Sim 2) reflects the outcome of the teacher’s personalized practice decision (i.e., to restart the simulation, continue where the simulation was paused, or ask a new question). Figure 2.1 shows that the control group (i.e., self-reflection) improves over three exposures and continues to increase its use of high-information feedback with each additional exposure.
FIGURE 2.1 Weighted Mean of Teacher Feedback Type in Mixed Reality Simulation by Treatment (Coaching versus No Coaching)
FIGURE 2.2 Weighted Mean of Teacher Feedback Type in Mixed Reality Simulation by Teaching Experience (Preparation, Early Career, and In-service)
Figure 2.2 displays the total sample disaggregated by level of teaching experience to highlight how teachers at different points in their career trajectory grew in the skill of offering high-information feedback across the three simulations. Figure 2.2 differentiates among preservice teachers participating in a university preparation program, school district early career educators in an induction program, and in-service teachers to show that each group benefited from the MRS in different ways. We expected preservice teachers to have the lowest mean of high-quality feedback, experienced teachers to have the highest mean, and early career teachers
to be in the middle, close to their preservice peers. As expected, experienced teachers began the simulation with higher weighted means than preparation and early career educators. Surprisingly, early career teachers demonstrated, on average, lower weighted means of high-information feedback than both pre- and in-service teachers.
What we want—specific teacher interactions
The weighted mean offers the opportunity to add more personalization to the analysis of teacher performance. For example, teachers could be invited to set goals for their feedback practice by determining the weight of each different type of feedback. As an illustration, teachers may want to work on valuing the student perspective; therefore, they would give valuing student responses the highest weight (× 5) and clarifying questions the next highest (× 4), because clarifying questions ensure that teachers understand the student perspective. This simple change allows teachers to set personalized goals while engaging in the same MRS. With this approach, researchers are modeling co-construction of the valuing system for teacher performance and are demonstrating how data analytics can change the power structure of a learning experience. In this way, both teachers and researchers can set meaningful goals and use the MRS as a means to measure changes in teaching practices. The opportunity to set the value system of the weighted mean is a step forward. However, to transform teacher education, we need data analytics that illustrate teaching practices as interactions with students. Figure 2.3 displays the mean frequency of each type of feedback teachers offered to students across three trials by treatment condition. This display enables us to see the specific changes teachers made in both the frequency and types of feedback as they moved from one simulation trial to the next. Importantly, we can see that teachers in both the treatment and control conditions were able to increase feedback that prompted student thinking and to decrease low-information feedback. Interestingly, teachers in the treatment group (coaching) also decreased the total frequency of feedback offered. This suggests that coaching may have helped teachers to be more precise and specific with their feedback.
FIGURE 2.3 Teacher Feedback Types across Three Trials by Condition (Coaching versus Self-reflection)
Figure 2.4 displays stacked bar graphs of the frequency of the types of feedback teachers offered by level of teaching experience. We see increases in prompting student thinking (dark gray) and decreases in low-information feedback (light gray). We also see preservice and experienced teachers offering fewer feedback utterances with each exposure. The middle section displays how early career teachers increased the total number of feedback utterances across the three MRS exposures. The stacked bar graphs provide a window into the types of feedback that teachers offered and specific areas
where additional PD or coaching might increase the quality of teacher feedback. For example, future PD might address prompting thinking and valuing the student perspective—the two types of high-information feedback that remained infrequent across all three MRS exposures. While this data analysis enables teacher educators to provide differentiated instruction focusing on the specific types of feedback that teachers need to practice most, we are still missing the essence of teaching: the interactions with students.
FIGURE 2.4 Frequency of Each Type of High Information Feedback by Teacher Experience Level (Preparation, Early Career, and In-service). (Stacked bar graph; the y-axis shows mean teacher feedback frequency, the x-axis shows Sim 0 through Sim 2 for the Preparation, Early Career, and In-service groups, and bars are stacked by feedback type: Low Information, Clarify Student Response, Value Student, Correct, and Prompt Thinking.)
What we need—lessons measuring adaptive teaching through student interactions
Purpose
Researchers and course instructors could use the outcomes from the checklists of observed behaviors during the simulations to tailor future
instruction to better meet the needs of the teachers. Using algorithms, PD could adapt and personalize to support teachers in expanding their teaching strategy repertoire.
Perils
As demonstrated in the weighted mean analytics, researchers often measure teaching behaviors without considering their alignment with the teacher’s identity and unique strengths, the values of the teacher’s practice, and the particular teaching and learning context. In addition, studies rarely assess the impact on student learning, and they use rubrics that generalize teaching strategies, as if the teaching behavior were the end goal rather than P–12 student growth. One way to tackle these challenges is to report teacher performance in terms of the distribution of feedback given to individual avatar students. Figure 2.5 displays this approach, illustrating for the entire sample the percentage of each type of feedback individual avatars received across three simulations.
FIGURE 2.5 Given the Total Teacher Sample, the Frequency of Each Feedback Type Offered to Individual Avatar Students
The researcher value system weighted prompting student thinking with the highest weight, yet teachers used that specific type of feedback the
least. For example, although Ethan received the most feedback, he received primarily low-information and clarifying feedback. This may be a result of the simulation script. We had three standardized challenges, where at a planned time in the simulation (e.g., 15 seconds, 1 minute, and 2 minutes) a specific avatar student interrupts the teacher with a question, “Why am I in this group?” or a statement, “I don’t understand why my answer is wrong.” Ethan is the first student to interrupt the teacher; therefore, he may receive more feedback due to that interruption. These data analytics begin to provide a dashboard of student learning. For example, we see that Savannah received more valuing feedback than any other student. Researchers could investigate this to determine whether being on the far left drew visual attention as teachers read from left to right, whether Savannah as a White girl reflected the teacher sample and therefore teachers valued her thinking more than the other avatars, or whether Savannah’s initial response, planned by researchers in the MRS script, provided more opportunity for teachers to use valuing feedback. It is difficult to determine the factors that may have led to differences in the feedback that students received during the MRS. We need greater context in the teacher performance dashboard.
Promise—automated data analyses
Computer programming languages such as Python along with natural language processing can be used to automate the analyses of recorded interactions in the mixed-reality classroom. For example, every teacher
utterance can be coded based on the language used to communicate the teacher’s intentions during simulations, identified as spoken to individual students, and organized by time elapsed and duration. Figure 2.6 provides a detailed view of how teaching practices changed over repeated practice. The x-axis displays elapsed time in the simulation and the y-axis indicates the type of feedback offered to students (i.e., 100 = low information, 200 = clarify, 300 = value, 400 = correct, and 500 = prompt thinking). This graph enables us to see when during the simulation individual avatars received feedback, as well as the duration and type of the feedback provided. This type of analysis measures teacher growth through teachers’ interactions with students, exploring issues of equity in terms of the quality, frequency, duration, and priority of teacher feedback given the need presented in the avatar student’s initial response to the assigned task. It transforms the outcome of teacher education from an overall score on a standardized demonstration of teaching practices to a responsive series of teaching interactions with students. Given the ability to measure equitable teacher–student interactions as the new outcome of teacher education, we can reimagine teacher education more broadly. For example, this data analysis could be used live, during simulations, where algorithms could evoke avatar students to provide feedback and challenges individualized to the strengths and needs the teacher is currently demonstrating. The avatar students could also support teachers in successfully implementing a teaching strategy. Finally, returning to the weighted mean discussion, each teacher, along with PD developers, could establish the weighting system to value the types of feedback the teacher most needs to develop in their teaching practice. Clearly, we are on the journey from what we have to what we need.
FIGURE 2.6 Measuring Change in Teacher Feedback to Individual Students as Time Elapsed During Three MRS Exposures
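A minimal sketch of the kind of automated coding and per-avatar timeline described above. The study used natural language processing; the keyword rules here are a deliberately crude stand-in so the data flow is visible.

```python
import re
from dataclasses import dataclass

# Illustrative only: rules are checked in priority order and return the
# feedback code for the first match; 100 (low information) is the default.
RULES = [
    (r"\bwhy\b|\bwhat if\b|\bexplain your thinking\b", 500),  # prompt thinking
    (r"\bnot quite\b|\bincorrect\b|\bactually\b",      400),  # correct
    (r"\bgreat point\b|\bi like how\b|\binteresting\b", 300),  # value student
    (r"\bdo you mean\b|\bcan you say more\b",           200),  # clarify
]

@dataclass
class Utterance:
    avatar: str        # which avatar student the teacher addressed
    start: float       # seconds elapsed in the simulation
    duration: float    # seconds the utterance lasted
    text: str

def code_feedback(text: str) -> int:
    """Return the feedback code (100-500) for one teacher utterance."""
    for pattern, code in RULES:
        if re.search(pattern, text.lower()):
            return code
    return 100  # low information

def avatar_timelines(utterances):
    """Group coded feedback by avatar, ordered by elapsed time (cf. Figure 2.6)."""
    timelines = {}
    for u in sorted(utterances, key=lambda u: u.start):
        timelines.setdefault(u.avatar, []).append((u.start, u.duration, code_feedback(u.text)))
    return timelines

sample = [
    Utterance("Ethan", 15.0, 4.0, "Do you mean the second group?"),
    Utterance("Savannah", 42.0, 6.0, "Great point; why do you think that works?"),
]
print(avatar_timelines(sample))
```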
Factors that complicate these comparisons
The early career teachers were required to participate in our professional development as part of their induction program, with the PD taking place during school hours. The preparation teachers engaged in our PD as assignments in their required course work for state teacher certification, and the in-service teachers were required to complete PD hours; however, they chose our PD from a catalog of many options. So, while the pre- and in-service teachers were required to complete our PD as part of a larger program, there was some choice in registering for courses. In addition, other logistical factors may have contributed to differences in teacher learning in the MRS, such as early career teachers’ stress in engaging in PD during the school day before teaching. The time period between Simulation 0 and Simulation 1 was also extended for early career
teachers, who completed the baseline simulation in the summer and then the PD, Sim 1, and Sim 2 at the start of the school year. The differences in early career teacher performance in the MRS, and the logistics surrounding it, raise questions for further exploration about the effectiveness of required PD that takes place during the school day for beginning teachers’ learning of instructional practices (Bondie et al., 2022). In addition to logistics, other factors may have contributed to the lower levels of high-information feedback for early career teachers, such as adjusting to the responsibilities of classroom teaching. Perhaps early career teachers felt greater dissonance between the current students in their classrooms and the avatar students in the virtual classroom. Teachers in preparation had limited experience with which to compare the avatar students, and experienced teachers were practiced in adapting to different classes of students; therefore, a future study might look more closely into how early career teachers feel about the avatar students and the simulated classroom. Postsimulation surveys suggested that teachers at all experience levels enjoyed the simulation and found the MRS experience to be relevant and useful. However, early career teachers did not increase their use of high-information feedback in ways similar to preparation and experienced teachers. It is important to note that in our study all early career teachers were teaching in the same school district, so our data are limited and not generalizable; however, even these limited data help researchers form new questions regarding how teachers at different points in their career trajectory may benefit from using MRS for professional learning. For example, researchers might explore ways to give teachers greater autonomy, even within a required PD topic and format. In addition, given the greater use of online meetings and learning, options in the logistics of required PD might also be explored.
Tailoring PD to teachers at different career points
Examining teacher performance at different career points when engaging in the same simulation begins to illuminate how teaching practices may increase in quality during preparation programs when teachers are not yet responsible for students and may decrease in quality during the early career stage as teachers adjust to their new responsibilities and then increase again as teachers gain experience throughout their career. These data may also be used to develop PD interventions that aim to move preparation and early career teacher performance closer to the performance of experienced teachers and to reduce a dip in teaching practices as teachers transition from the support of preparation programs to teaching on their own in their early career. Unlike a teaching observation with
real students, MRS offer a classroom performance assessment that can be returned to as a standard measure of teacher performance over time. Teachers typically improve dramatically in their first years of teaching (Papay & Laski, 2018); however, many teachers leave the profession during that time (Ingersoll et al., 2018). By leveraging data analytics and MRS as an assessment tool, we may be able to design more effective PD, reducing the years previously required to develop teaching expertise and increasing the retention of early career teachers. This data analysis (Bondie et al., 2019) is based on the frequency of specific types of feedback offered during the 7-minute simulation, with greater weight given to feedback types that do more to prompt student thinking. For example, clarifying student responses (weighted × 2) was weighted lower than prompting a student’s metacognition (weighted × 4). The weighted mean score as a dependent variable uses frequencies of specific types of feedback, which moves closer to what we need to begin differentiating professional development based on specific aspects of teacher performance. This analysis begins to answer the question of “what works for whom”: specifically, did teachers with different experience levels change more or learn more during repeated practice in MRS? However, we need more analytic techniques than what we have to understand the changes in teacher learning for the treatment (coached) group from Sim 1 to Sim 2, as well as the steady growth of early career teachers across the three exposures versus the growth and decline of preservice teachers. More detailed data analytics are needed to understand how to use MRS to rapidly develop teaching expertise so that preservice and early career teachers’ performance can come closer to that of their in-service peers. Reaching higher quality teaching more quickly could have profound effects in the field, such as greater job satisfaction and retention. Given the high rate of teachers leaving the field, understanding how to efficiently build teaching expertise has never been more urgent. Lessons from our study illuminate the promise of future data analysis techniques.
Analytics for adaptive immersive teacher learning
Analytics for adaptive immersive teacher learning

We want transformative professional learning that adapts to teacher strengths, needs, and specific personalized goals. Educators need to feel in control of their professional learning as shapers of their own career goals. Teacher learning experiences must expand knowledge and skills in the context of interactions with students—the essence of teaching. PD needs to provide teachers with meaningful feedback that helps them reflect on their current performance and monitor change in their teaching throughout their career.
However, what we have is a one-size-fits-some pattern of PD that is often not seen by teachers as relevant to their own students and teaching practices. Further, our current data analyses use metrics that do not provide the specificity needed to differentiate PD based on individual teacher learning needs or to provide actionable feedback that will support teachers in transferring learning into daily teaching practices. Even more troublesome is the lack of common measures that teachers engage with throughout their career, so that teachers can track growth in their teaching practices from their preparation program, through early career induction, and throughout their in-service experiences with students. Given the limited access to schools for student teaching during the pandemic and the promise of safe practice without the concern of harming real children, the body of knowledge based on research using MRS in teacher education is rapidly growing. For example, Cohen et al. (2020) used an experimental design with repeated MRS exposures, enabling the researchers to make causal claims regarding the impact of coaching on changes in teacher behavior. However, we do not know how the specific types of feedback teachers offered may have changed (e.g., valuing the student perspective, offering more specific corrections, or challenging student thinking). More importantly, we do not know the extent to which teacher feedback was responsive to student learning needs and distributed equitably during the MRS. If researchers embrace new data analytics that center teacher–student interactions and how teachers value their own practices, then teacher educators can push toward a transformation in teacher education that reflects our goals for student learning. The learning analytics necessary are possible with modern immersive technologies; researchers simply need to use them. We have demonstrated that, with Python-powered data analytics, we can provide teachers with the exact feedback type offered to each avatar student as learning unfolded during the MRS. Analytics are key to getting what we want and are increasingly attainable if we choose to take this path.

References

Argyris, C., & Schön, D. A. (1992 [1974]). Theory in practice: Increasing professional effectiveness. Jossey-Bass.
Bondie, R., Dahnke, C., & Zusho, A. (2019). Does changing "one-size fits all" to differentiated instruction impact teaching and learning? Review of Research in Education, 43(1), 336–362. https://doi.org/10.3102/0091732X18821130
Bondie, R., & Dede, C. (2021). Redefining and transforming field experiences in teacher preparation through personalized mixed reality simulations. What Teacher Educators Should Have Learned from 2020, 229. https://www.learntechlib.org/primary/p/219088/
Bondie, R., Mancenido, Z., & Dede, C. (2021). Interaction principles for digital puppeteering to promote teacher learning. Journal of Research on Technology in Education, 53(1), 107–123. https://doi.org/10.1080/15391523.2020.1823284
Bondie, R., Zusho, A., Wiseman, E., Dede, C., & Rich, D. (2022). Can differentiated and personalized mixed reality simulations transform teacher learning? Technology, Mind, and Behavior. https://doi.org/10.1037/tmb0000098
Cohen, J., Wong, V., Krishnamachari, A., & Berlin, R. (2020). Teacher coaching in a simulated environment. Educational Evaluation and Policy Analysis, 42(2), 208–231. https://doi.org/10.3102/0162373720906217
Danielson, C. (2013). The framework for teaching evaluation instrument. The Danielson Group.
Dieker, L. A., Hughes, C. E., Hynes, M. C., & Straub, C. (2017). Using simulated virtual environments to improve teacher performance. School–University Partnerships, 10(3), 62–81.
Gabriel, R. (2010). The case for differentiated professional support: Toward a phase theory of professional development. Journal of Curriculum and Instruction, 4(1), 86–95. https://doi.org/10.3776/joci.2010.v4n1p86-95
Hattie, J., & Timperley, H. (2007). The power of feedback. Review of Educational Research, 77(1), 81–112.
Hayes, A. T., Straub, C. L., Dieker, L. A., Hughes, C. E., & Hynes, M. C. (2013). Ludic learning: Exploration of TLE TeachLivE™ and effective teacher training. International Journal of Gaming and Computer-Mediated Simulations (IJGCMS), 5(2), 20–33.
Ingersoll, R. M., Merrill, E., Stuckey, D., & Collins, G. (2018). Seven trends: The transformation of the teaching force. Consortium for Policy Research in Education, Research Report #2018-2.
Mikeska, J. M., Howell, H., & Straub, C. (2019). Using performance tasks within simulated environments to assess teachers' ability to engage in coordinated, accumulated, and dynamic (CAD) competencies. International Journal of Testing, 19(2), 128–147. https://doi.org/10.1080/15305058.2018.1551223
Papay, J. P., & Laski, M. (2018). Exploring teacher improvement in Tennessee: A brief on reimagining state support for professional learning. TN Education Research Alliance.
3 SYSTEM-WIDE MOMENTUM
Tristan Denley
Momentum year strategies
Over the last decade, there has been a growing focus on strategies to improve not only access to higher education but also the successful completion of credentials of value. These efforts have ranged from the introduction of new classroom pedagogy approaches to the creation of new models of higher education. Throughout much of the last decade, I too have focused on these strategies, but from the unique perspective of what can be done at the level of a state's higher education system. First at the Tennessee Board of Regents (2013–2017) and then at the University System of Georgia (USG) (2017–2021), I worked with the faculty, staff, and leaders of those institutions to study what works in systems of higher education and, conversely, what does not; to diagnose the root causes of inequities in higher education and how they can be removed; and to identify the barriers we have historically, and often unknowingly, placed in the way of our students' successes, and the ways we might take their journey to new heights. To carry out this work across a whole system of institutions requires knitting evidence-based initiatives and research findings into a coherent framework of change and working with campuses to implement that shared vision. The work I undertook, which has come to be known as the Momentum Year (Denley, to appear), was the first comprehensive student success strategy to address American college completion at a statewide scale. In this chapter I will discuss in detail three threads of the overall Momentum Year strategy, the theoretical framework behind these
threads, and the strategies we developed and implemented, based on that theory, to radically impact student success.

A structural approach to system-wide curricula
Any modern higher education system must teach a vast array of coursework to meet the curricular requirements of all its programs of study. For instance, in the University System of Georgia's 26 institutions, each semester the system's roughly 300,000 undergraduate students study over 4,600 different courses. However, more than half of the 2 million student-course enrollments lie in approximately 30 of those classes. This observation in no way implies that the other 4,570 courses have no role to play, or that we need far fewer courses—if anything we probably need more. But it does point to a peculiarity of curricular structure: course enrollment is highly concentrated. Over the last 20 years there has been a growing understanding of the structure of complex networks like that of coursework in higher education (Watts & Strogatz, 1998; Albert & Barabási, 2002). These small-world networks can be used to analyze a wide range of systems in nature, technology, and society—from the World Wide Web to the spread of viruses. One feature of this type of network is the existence of hub vertices—nodes in the network with an overabundance of connections. These hubs play a disproportionately large role in the connective structure of the overall network, both enabling effective flow around the network and fragmenting the network when they are damaged or removed (Pastor-Satorras & Vespignani, 2001; Albert et al., 2000). By studying the course transcripts of graduates across three state systems, I was able to establish that the system's course structure itself forms a small-world network. The highly enrolled classes that comprise the lion's share of the enrollment are the hubs in this network. But consequently, as well as being highly enrolled, they also play a disproportionately critical curricular role in the overall learning structure of the system—successful learning in these classes disproportionately leads to further success; lack of success in these classes leads to failures elsewhere. More formally, we define the graduate-transcript graph G_S to have vertex set V, the set of courses that appear on the undergraduate transcript of any graduate from institution or system S, and edge set E, defined by joining course c1 to course c2 whenever the two courses appear together on a graduate's transcript. Network small-worldliness has been quantified by the coefficient σ, calculated by comparing the average clustering coefficient C and average path
length L of the graph to those of a random graph with the same number of vertices and the same average degree, C_r and L_r:

σ = (C / C_r) / (L / L_r)
The degree to which σ exceeds 1 measures the graph's small-worldliness. We constructed the graduate-transcript graph for both the Tennessee Board of Regents (TBR) institutions and the institutions in the University System of Georgia (USG). For TBR, σ = 15.15; for USG, σ = 123.57. Figure 3.1, which displays the degree distribution of the TBR system, shows that it follows an inverse power law, characteristic of small-world networks (Denley, 2016). The USG distribution follows a similar pattern. This structural curricular analysis suggests that deepening student learning in these "hub" classes will reach across the student body very quickly and will also influence student success across the breadth of the curriculum.
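As a rough illustration of this analysis, the sketch below builds a graduate-transcript graph from hypothetical transcript records and computes σ directly from the definition above using networkx. The transcript data, the restriction to the largest connected component, and the single random comparison graph are simplifying assumptions for illustration, not the procedure used in the studies cited here.

```python
import itertools
import networkx as nx

# Hypothetical input: one list of course IDs per graduate's transcript.
transcripts = [
    ["ENGL1101", "MATH1111", "PSYC1101", "HIST2111"],
    ["ENGL1101", "MATH1113", "BIOL1107", "PSYC1101"],
    ["ENGL1101", "POLS1101", "MATH1111", "BIOL1107"],
]

# Vertices are courses; join two courses whenever they co-occur on a transcript.
G = nx.Graph()
for transcript in transcripts:
    G.add_edges_from(itertools.combinations(set(transcript), 2))

# Work on the largest connected component so average path length is defined.
G = G.subgraph(max(nx.connected_components(G), key=len)).copy()

C = nx.average_clustering(G)
L = nx.average_shortest_path_length(G)

# Random comparison graph with the same number of vertices and edges
# (hence the same average degree). A single sample is used here for brevity;
# in practice one would average over many random graphs.
R = nx.gnm_random_graph(G.number_of_nodes(), G.number_of_edges(), seed=42)
R = R.subgraph(max(nx.connected_components(R), key=len)).copy()
C_r = nx.average_clustering(R)
L_r = nx.average_shortest_path_length(R)

sigma = (C / C_r) / (L / L_r)
print(f"sigma = {sigma:.2f}")  # values well above 1 indicate a small-world structure
```

For larger graphs, networkx also provides a built-in sigma function that estimates the same coefficient against randomized reference graphs.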
FIGURE 3.1 Degree Distribution (y-axis: count of courses; x-axis: degree value, 0–1,200)

We have used this theoretical observation to steer several
system-level curricular transformation initiatives, designed to impact student success at scale. For instance, in 2016, the Tennessee Board of Regents began an initiative to increase accessibility to instructional materials for students with disabilities. To ensure maximum impact, the work concentrated on these "hub" vertices. We carried out this structural analysis at each of the 19 universities and colleges and identified the 30 hub courses at each. We provided training to faculty representatives from each of the 30 hub classes at each institution to apply an accessibility rubric to the delivery of that class on their campus. The combined data from these analyses enabled a coordinated system approach to increasing accessibility that was recognized with an award from the National Federation of the Blind and has also enabled faculty in each course to frame accessibility implementation plans for their courses. This small-world network approach has also informed course redesign strategies. Beginning in Fall 2013, TBR invited teams of faculty from across the system to propose a re-envisaged structure for one of these highly impactful "hub" classes on their campus. They were asked to propose a new approach, together with an assessment structure, with the intention that this new format could enable more students to learn more material more deeply, and consequently to be more successful in that class. Support was provided for the successful teams from funds provided by the system office as both recognition and incentive for the work to be done. Between 2013 and 2015, more than 80 teams of faculty were supported in this way to develop their strategy, implement their pilots, and collect their impact data. The projects involved over 14,000 students and 160 faculty at 18 campuses, in courses from 16 disciplines. Faculty designed new course structures drawing on the full spectrum of contemporary techniques, including supplementary instruction, learning communities, and flipped and hybrid classroom models, as well as the use of technologies in a variety of learning settings. In the University System of Georgia, I built on this approach to course redesign, as part of the Gateways to Completion project, utilizing the small-world network approach to identify the courses in which redesign promised the greatest opportunity for impact. We define a catapult course as a course for which there is evidence that deepening the learning in that course has an impact on the student's graduation rate, as well as on the outcomes in that course itself. Let g_c : {A, B, C, D, F} → [0, 1] be the grade distribution for a course c, and let b_c : {A, B, C, D, F} → [0, 1] be the conditional probability that a student who takes course c and earns a given grade graduates. So,
b_c(x) = P(student graduates | student takes course c and earns grade x)
Let g′_c : {A, B, C, D, F} → [0, 1] be defined by g′_c(A) = g_c(A) + g_c(B), g′_c(B) = g_c(C), g′_c(C) = g_c(D), g′_c(D) = g_c(F), and g′_c(F) = 0. We define the learning impact for course c by

I(c) = Σ_{x ∈ {A, B, C, D, F}} g′_c(x) b_c(x) − Σ_{x ∈ {A, B, C, D, F}} g_c(x) b_c(x).
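To make these definitions concrete, the sketch below computes g_c, b_c, g′_c, and the learning impact I(c) from hypothetical student-course records, and also anticipates the enrollment proportion e(c) and the ranking by I(c) · e(c) defined in the next paragraph. The column names, data format, and toy records are illustrative assumptions, not the system's actual data model.

```python
import pandas as pd

GRADES = ["A", "B", "C", "D", "F"]

# Hypothetical student-course records: one row per student-course-grade.
records = pd.DataFrame({
    "student": [1, 1, 2, 2, 3, 3, 4, 4],
    "course":  ["MATH1111", "ENGL1101"] * 4,
    "grade":   ["B", "A", "D", "C", "F", "B", "C", "A"],
    "graduated": [1, 1, 0, 0, 0, 0, 1, 1],
})

def learning_impact(df: pd.DataFrame, course: str) -> float:
    """I(c): expected gain in graduation probability if every grade rose one letter."""
    rows = df[df["course"] == course]
    # g_c: observed grade distribution; b_c: graduation rate conditional on grade.
    # With real data, b_c would be estimated from large samples; unobserved grades
    # default to 0 here purely to keep the toy example short.
    g = rows["grade"].value_counts(normalize=True).reindex(GRADES, fill_value=0.0)
    b = rows.groupby("grade")["graduated"].mean().reindex(GRADES, fill_value=0.0)
    # g'_c: shift the distribution up one letter grade (B -> A, ..., F -> D; no F's).
    g_shift = pd.Series(
        [g["A"] + g["B"], g["C"], g["D"], g["F"], 0.0], index=GRADES
    )
    return float((g_shift * b).sum() - (g * b).sum())

def enrollment_share(df: pd.DataFrame, course: str) -> float:
    """e(c): proportion of graduates whose transcript includes course c."""
    grads = df[df["graduated"] == 1]
    took = grads.loc[grads["course"] == course, "student"].nunique()
    return took / grads["student"].nunique()

# Rank courses by I(c) * e(c) to surface candidate catapult courses.
courses = records["course"].unique()
ranking = sorted(
    ((c, learning_impact(records, c) * enrollment_share(records, c)) for c in courses),
    key=lambda pair: pair[1],
    reverse=True,
)
print(ranking)
```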
Ideally, to study catapult courses, we would like to study the impact of quality learning on graduation rates. Instead, to carry out this study at a system scale, we used grades as a proxy. The function b_c measures how graduation rate varies across the students who take course c and earn the various grades. The function g′_c models the grade distribution in a class in which some intervention has been implemented that results in each student receiving a grade one letter higher than they would otherwise have received. This is intended to model the effect of deepening learning in that course. The learning impact I(c) then measures how much the graduation rate would increase for an average student in the deeper-learning version of the course. To identify those courses where there is the greatest opportunity to impact graduation rates through this approach, we define e : {all courses} → [0, 1] so that e(c) is the proportion of graduates who take course c at some point in their degree. In this way we can identify the courses that have the maximal opportunity to impact student success by deepening learning. These catapult courses are the courses for which I(c) · e(c) is greatest. We carried out this analysis for each course at each institution in the University System of Georgia, using student-course-level data from 2011 to 2018. Using this approach, we were able to identify the learning impact for each course at each institution. In 2017, the University System of Georgia began its second cohort of a course redesign initiative, Gateways to Completion, in partnership with the John Gardner Institute. In this cohort, we used the learning impact approach to enable campuses to select the courses that would be most strategic to engage in a redesign strategy to deepen learning. To this end, we provided a bubble-chart visualization to each campus. Each bubble represents a course c: the y-axis represents the learning impact of the course, I(c); the x-axis represents e(c), the proportion of graduates who took that course; and the size of the bubble represents the overall impact I(c) · e(c). The bubble chart highlights those courses in which there is the greatest opportunity to increase graduation rates in a particular subject area by
deepening learning, and the greatest opportunity to increase graduation rates across a whole campus. Campuses used the visualization to select the courses on their campus that would lead to the greatest campus-wide impact within the context of their faculty and student body (Figure 3.2).

FIGURE 3.2 Campus-Wide Course Impact (y-axis: deeper course learning impact; x-axis: student enrollment impact; labeled courses include English Composition I, American Government, College Algebra, and Intro to Psychology)

This work involved over 75 teams of faculty in multiple disciplines. The Gardner Institute provided technical support and a network improvement community environment for these teams of faculty to explore evidence-based strategies that would deepen learning and close equity gaps. Many of these course redesigns resulted in just those outcomes. The specifics of the interventions that these faculty teams implemented are captured in detailed case studies (Gateways to Completion Case Studies, n.d.). All told, Gateways to Completion impacted more than 750,000 students' learning experiences.

The centrality of English and Mathematics
A meta-analysis of the catapult courses shows that while the complete list varies somewhat from institution to institution, there are two subject areas that appear consistently: English Composition and Introductory Mathematics. The small-world graph structure suggests that success and failure alike spread like a pandemic from these two learning areas across
the curriculum. Consequently, success, or lack thereof, in this coursework is about much more than passing a math or an English course. It is about graduating or not. This theory is borne out in real data. In the University System of Georgia, before 2018, students who were successful in Freshman Math and Freshman Writing in their first year were more than ten times more likely to graduate (66 percent) than those who were not (6 percent). These differences in graduation rates remained true even when the data were further disaggregated by student demographic variables and by preparation. Suffice it to say, improving student success in Introductory Mathematics and English courses played an enormous role not only in improving Mathematics and English outcomes, but also in a student's likelihood of completing credentials of value. Although there is a rich collection of strategies that can improve student learning in Mathematics and English courses (Guide to Evidence-based, n.d.; Understanding and Teaching Writing, 2018), ironically the most fundamental barrier is the traditional approach to developmental education. Despite the intention of providing a supportive pathway for less well-prepared students, the long sequence structure of traditional remediation means that those students rarely complete the credit-bearing courses required for graduation (Remediation, 2012). It is the structure that creates a roadblock to student success. Changes in remedial education that place students directly into gateway courses, accompanied by intensive tutoring alongside the credit-bearing course (corequisite remediation), have been shown to significantly improve these outcomes, in some cases doubling or even tripling student success rates in Freshman Math and English courses (Denley, 2021b; Denley, 2016; Increasing Student Success, 2019; Logue et al., 2019). This was certainly the case in both Tennessee's and Georgia's colleges and universities.

Scaling the corequisite model
Galvanized by the centrality of student success in English and Mathematics, the USG undertook a detailed analysis of the data comparing the effectiveness of three approaches to developmental education that were being used across the system in both English and Mathematics. The three approaches were a traditional developmental sequence; the Foundations model in which students enroll in a single semester of remediation requiring successful completion prior to enrolling in a college-level course; and the corequisite model. To compare the effectiveness of these approaches, we compared the rates at which students were able to successfully complete a college-level English course and a college-level Mathematics course
(college algebra, quantitative reasoning, or math modeling) within one academic year. Because the preparations of incoming student populations across the system vary considerably, we disaggregated the data using common uniform measures of preparation: ACT math/writing sub-scores, SAT math/writing sub-scores, and high-school GPA. The results were striking and mirrored the results of a similar analysis from the Tennessee Board of Regents (Denley, 2016). We will discuss both English and Mathematics but will begin with Mathematics. The results disaggregated by ACT math sub-score are displayed in Figure 3.3. These data include both students who entered with a standardized test score and those who did not; students without a score are included in the "Total" percentage rates. Throughout, we will refer to "passing a college-level class" as earning at least a "C" grade. As Figure 3.3 shows, the students who were educated using the corequisite model were more than twice as likely to complete a college-level class with a grade of "C" or better when compared with their peers who used either of the other two prerequisite approaches. Indeed, while the success rates more than doubled overall, the gains were not only for the most prepared students. In fact, the largest gains in success rates were experienced by students with the weakest preparation. The data for the other measures of preparation were similarly compelling.
FIGURE 3.3 Percentage of students who passed a college-level math class within one academic year (y-axis: 0 to 80 percent)
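As a rough illustration of the kind of comparison described above, the sketch below computes one-year college-level math completion rates by developmental-education model and ACT math sub-score band from hypothetical student records. The column names, score bands, and toy data are assumptions for illustration, not the USG or TBR data.

```python
import pandas as pd

# Hypothetical student-level records. "model" is the developmental-education
# approach; "passed_college_math" indicates earning at least a "C" in a
# college-level math course within one academic year.
students = pd.DataFrame({
    "model": ["traditional", "foundations", "corequisite", "corequisite",
              "traditional", "foundations", "corequisite", "traditional"],
    "act_math": [16, 17, 16, 18, 15, 19, 15, 18],
    "passed_college_math": [0, 0, 1, 1, 0, 1, 1, 0],
})

# Disaggregate by ACT math sub-score band (bands chosen here for illustration).
students["act_band"] = pd.cut(
    students["act_math"], bins=[0, 16, 18, 36], labels=["<=16", "17-18", ">=19"]
)

# Pass rates (percent) by model and preparation band, plus an overall "Total".
by_band = (
    students.pivot_table(
        index="model", columns="act_band", values="passed_college_math",
        aggfunc="mean", observed=False,
    ) * 100
)
by_band["Total"] = students.groupby("model")["passed_college_math"].mean() * 100
print(by_band.round(1))
```

A table of this shape, one row per developmental-education model and one column per preparation band, is the form of comparison summarized in Figure 3.3.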