SPRINGER BRIEFS IN OPEN AND DISTANCE EDUCATION
Paul Prinsloo · Sharon Slade · Mohammad Khalil
Editors
Learning Analytics in Open and Distributed Learning
Potential and Challenges
SpringerBriefs in Education
Open and Distance Education

Series Editors: Olaf Zawacki-Richter, University of Oldenburg, Oldenburg, Niedersachsen, Germany; Junhong Xiao, Shantou Radio & Television University, Shantou, Guangdong, China

Editorial Board: Trisha Craig, Yale-NUS College, Singapore, Singapore; Ursula Glunk, University College Freiburg, University of Freiburg, Freiburg im Breisgau, Baden-Württemberg, Germany; You Guo Jiang, Boston College, Chestnut Hill, MA, USA; Rui Yang, Faculty of Education, University of Hong Kong, Hong Kong, Hong Kong; Akiyoshi Yonezawa, Tohoku University, Sendai, Japan
Developing human capital through education and training is crucial to social and economic progress. However, despite efforts to achieve equity and learning opportunities for all, resource constraints and lack of knowledge and skills can overwhelm the capability of government and non-government agencies, institutions and teachers to provide the required levels of education and training by conventional means. More and more providers are recognising that open, distance and online means of delivery have an important role to play, both in providing formal schooling and tertiary education and informal and nonformal education and training for the countless millions wishing to upgrade their skills, knowledge and competences anytime, anywhere, at their own pace, thus making the lifelong learning for all agenda a reality.

This book series examines ways in which open and distance education can empower and enable individuals, groups and even entire communities to develop the knowledge and skills necessary for life and work in the 21st century, help to reduce poverty and inequality, achieve independent and sustainable development, and meet the demands of 21st century knowledge economies and open societies.

The books in this series are designed for all policy-makers, planners, managers, teachers and trainers, researchers, and students who are involved in or interested in applying open, distance and e-learning methods and technologies in informal and nonformal lifelong learning; schooling; technical and vocational education and training; higher education; workplace training and professional development; community development and international aid programmes; and serving the needs of minority groups, the disabled and other disadvantaged persons. They combine an up-to-date overview of theories, issues, core concepts and/or key literature in a particular field with case studies and practical advice in ways that will meet the needs of busy practitioners and researchers. They address such issues as access and equity, distance teaching and learning, learner support and guidance, costing, technology, assessment and learning analytics, quality assurance and evaluating outputs, outcomes and impacts, cultural factors, learning pathways and credit banking, accreditation, leadership, management, policy-making, and professional development for organisational renewal and change.

Researchers interested in authoring or editing a book for this series are invited to contact the Series Publishing Editor: [email protected]. All proposals will be sent out for external double-blind review. Review reports will be shared with proposers and their revisions will be taken into consideration. The completed manuscript will be reviewed by the Series Editors and editorial advisors, and will also be sent for external review, to ensure quality before formal publication.

Abstracted/Indexed in: Scopus
More information about this subseries at https://link.springer.com/bookseries/15238
Paul Prinsloo · Sharon Slade · Mohammad Khalil
Editors
Learning Analytics in Open and Distributed Learning
Potential and Challenges
Editors

Paul Prinsloo, Department of Business Management, University of South Africa, Pretoria, South Africa
Sharon Slade Earth Trust Abingdon, UK
Mohammad Khalil Centre for the Science of Learning & Technology University of Bergen Bergen, Norway
ISSN 2211-1921 ISSN 2211-193X (electronic)
SpringerBriefs in Education
ISSN 2509-4335 ISSN 2509-4343 (electronic)
SpringerBriefs in Open and Distance Education
ISBN 978-981-19-0785-2 ISBN 978-981-19-0786-9 (eBook)
https://doi.org/10.1007/978-981-19-0786-9

© The Editor(s) (if applicable) and The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd. 2022, corrected publication 2022

This work is subject to copyright. All rights are solely and exclusively licensed by the Publisher, whether the whole or part of the material is concerned, specifically the rights of translation, reprinting, reuse of illustrations, recitation, broadcasting, reproduction on microfilms or in any other physical way, and transmission or information storage and retrieval, electronic adaptation, computer software, or by similar or dissimilar methodology now known or hereafter developed.

The use of general descriptive names, registered names, trademarks, service marks, etc. in this publication does not imply, even in the absence of a specific statement, that such names are exempt from the relevant protective laws and regulations and therefore free for general use.

The publisher, the authors and the editors are safe to assume that the advice and information in this book are believed to be true and accurate at the date of publication. Neither the publisher nor the authors or the editors give a warranty, expressed or implied, with respect to the material contained herein or for any errors or omissions that may have been made. The publisher remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

This Springer imprint is published by the registered company Springer Nature Singapore Pte Ltd. The registered company address is: 152 Beach Road, #21-01/04 Gateway East, Singapore 189721, Singapore
Foreword
This volume is a welcome contribution to a field not yet well understood, namely the contribution that big data, data mining, analytics and Artificial Intelligence will make to open, distance education. We are only at the beginning of understanding how these phenomena and practices serve interests, wittingly and unwittingly, rather than being innocent technological innovations at the service of all. The editors have themselves been at the forefront of revealing this in the field of learning analytics, insistently asking for a more complex reading of this fast-developing field of practice than simply that of a technology-supported practice for supporting students. In this volume they draw on a range of international authors to develop a wider range of questions whose urgency they have foregrounded. The chapters include both theoretical development and more concrete case studies that nevertheless open up theoretical perspectives. At their simplest the questions revolve around the issue of ‘whose data is it?’ From there we are led variously to consider the extent to which learners are subjects or objects, and related challenges of privacy and surveillance. While we are given evidence of the ways in which learning analytics can provide the basis for positive intervention and support of students whose progress through a module may have stalled, we are also challenged by the ways in which AI can embody the biases of its creators in its algorithms. Thus co-working robots, or cobots, in AI-driven learner support systems, will prioritise some categories of learners and may ignore others. More positively, there are notable innovative developments that are sketched here, including those of social learning analytics, where shared practice can enhance the role of learners themselves and diminish the risk of accretion of power in a learning and teaching landscape towards the managers. Also interesting is the enabling of social annotation on text, digitally supported, that makes networked discourse possible. Last but not least is the argument that the long-established framework of constructivism now needs to make way for the demanding complexity of sociocultural accounts of social, cultural and historical perspectives that dynamically construct and are constructed by participants in educational contexts. There is then in this volume a compelling and prescient account of the benefits and risks of big data and its associated practices in learning and teaching online. In this, education is no different from so many areas of human endeavour that the digital
revolution has reconstructed in ways that can liberate or oppress. This volume helps us to understand better. There is no more important task.

Alan Tait
Professor Emeritus of Distance Education and Development, The Open University, Milton Keynes, UK
Fellow, Centre for Distance Education, University of London, London, UK
Contents
1 Introduction: Learning Analytics in Open and Distributed Learning—Potential and Challenges
Paul Prinsloo, Sharon Slade, and Mohammad Khalil

2 The Potential of Learning Analytics for Intervention in ODL
Billy Tak-Ming Wong

3 A Global South Perspective on Learning Analytics in an Open Distance E-learning (ODeL) Institution
Angelo Fynn, Jaroslaw Adamiak, and Kelly Young

4 Learning Analytics in Open and Distance Higher Education: The Case of the Open University UK
Avinash Boroowa and Christothea Herodotou

5 Mobile Multimodal Learning Analytics Conceptual Framework to Support Student Self-Regulated Learning (MOLAM)
Mohammad Khalil

6 Designing a Social Learning Analytics Tool for Open Annotation and Collaborative Learning
Jeremiah H. Kalir

7 Situating Learning Analytics for Course Design in Online Secondary Contexts
Joshua Quick and Rebecca C. Itow

8 Ethical Considerations of Artificial Intelligence in Learning Analytics in Distance Education Contexts
Leona Ungerer and Sharon Slade

9 Conclusion
Paul Prinsloo, Sharon Slade, and Mohammad Khalil

Correction to: A Global South Perspective on Learning Analytics in an Open Distance E-learning (ODeL) Institution
Angelo Fynn, Jaroslaw Adamiak, and Kelly Young
Chapter 1
Introduction: Learning Analytics in Open and Distributed Learning—Potential and Challenges

Paul Prinsloo, Sharon Slade, and Mohammad Khalil
Background to the Rationale of the Book

As part of the series, SpringerBriefs in Open and Distance Education, we invited expressions of interest aiming to critically explore and map the potential and challenges of learning analytics in the specific context of open and distributed learning. Since the emergence of learning analytics as a research focus and practice in 2011, the field has matured into a rich, interdisciplinary field and praxis (Ferguson, 2012; Wong & Li, 2019). As the field has matured, there is substantial evidence that learning analytics shapes pedagogy, student retention strategies and the strategic allocation of institutional resources (Gašević et al., 2015; Leitner et al., 2017; Lim et al., 2019). Having said that, the learning analytics field is also coming to terms with its imperfections, and the lack of surety that it positively contributes to student success (Ferguson & Clow, 2017; Kitto et al., 2018). Currently, much of the focus of research, whether empirical, theoretical or conceptual, originates from residential or traditional forms of educational delivery. Distance education as sui generis, a unique form of educational delivery (Peters, 1996), has evolved from its early roots in postal correspondence education to a variety of delivery mechanisms including offline, digitally supported, internet supported and fully online learning (Evans & Nations, 1992; Guri-Rosenblit, 2009; Holmberg, 2005; Moore & Kearsley, 2012; Peters, 1996, 2010). A number of genealogical
models (Garrison, 1985, 2000; Guglielmo, 1998; Lauzon & Moore, 1989; Moore & Kearsley, 2011; Taylor, 1995, 2001) map the evolution of distance education through its generations or periods. In these models, advances in technology are consistently portrayed as the most important factor in the evolution of distance education. Technology was and continues to be highlighted as seminal in bridging the geographical and physical aspects of distance between students and their institutions. In this context though, ‘distance’ should be seen as a “multi-dimensional construct” (Heydenrych & Prinsloo, 2010, p. 6; see also Bayne et al., 2014; Edwards, 1994; Evans, 1989; Marsden, 1996; Spector, 2009; Usher, 2002), with, inter alia, distances created by different onto-epistemological assumptions and approaches emerging from and entangled in dynamic human, informational and non-human agency and materialities (Gunter et al., 2020). Despite the prominent role of technology in the evolution of distance education, other factors, such as political, environmental, socio-economic and legal ones, continue to shape distributed forms of teaching and learning—as can be seen in the dramatic reshaping of educational delivery in response to the global spread of Covid-19 (Czerniewicz et al., 2020; Williamson et al., 2020), and the broader phenomena of the ‘unbundling’ (McCowan, 2017; Morris et al., 2019) and ‘rebundling’ of educational provision (Shaw et al., 2020). Amid the “hyperporous” (Shaw et al., 2020, p. 733) ‘walls’ characterising (higher) education, there remain the more traditional ‘forms’ or categories of educational provision such as residential (whether face-to-face, online or blended), and distributed learning (such as distance, blended, online or open and distance learning). The ‘openness’ of delivery refers to, inter alia, admission requirements, accessible formats of materials, pedagogy, registration periods, forms of assessment and/or accreditation, as well as the financial cost of the learning experience and summative assessments. A particular form of distance education that emphasises ‘openness’ is Open and Distance Learning (ODL). It is important to note that whilst not all distance or distributed forms of delivery are open, all ODL institutions involve distance and distributed learning (e.g., Belawati & Baggaley, 2009; MacIntosh, 2005). Distance education may encompass a variety of forms: some more ‘traditional’ forms which use a range of technologies, some in the form of Open and Distance Learning (ODL), and others in the form of distributed online/blended learning such as Massive Open Online Courses (MOOCs), or fully online course offerings at traditional campus-based institutions. These different forms all share, in one way or another, the characteristic of students being detached from their institution with regard to space, place and/or time. Being separated in these ways allows students flexibility and choice, but may also result in feelings of isolation, a lack of peer support and restricted access to on-campus resources. As such, open and distributed learning providers may be less aware of student progress in their courses, and of the support which some students may need at particular junctures in their learning journeys. In the call for contributions we were particularly interested, then, in exploring the potential of learning analytics in assisting institutions in distributed learning
contexts to make more informed choices regarding resource allocation, pedagogy, student support and assessment. While traditional distance education institutions have always collected and analysed student data, this mostly informed operational and strategic planning and resource allocation, or was used for quality assurance purposes. As such, the collection, analysis and use of student data in many open, distance and distributed forms of educational delivery has typically been categorised as academic analytics, rather than learning analytics. [For a discussion of the difference between academic and learning analytics see Siemens (2013).] Learning analytics as the measurement, collection, analysis and use of student data (demographic and behavioral) has the potential to inform not only pedagogy and student learning, but also the appropriate allocation of resources. Learning analytics focuses on improving student learning—to positively impact student retention and success. In the light of historical and persisting concerns about student retention and success in open and distance learning (Subotzky & Prinsloo, 2011), collecting, analysing and using student data may facilitate more appropriate pedagogical strategies and practices, materials, assessment (both formative and summative), resource allocation and planning, and contribute to student retention and success. Institutions have access to a greater volume, variety and granularity of student data than ever before (e.g., Kitchin, 2014). Further, developments in Artificial Intelligence (AI), machine learning and neural networks open up new opportunities for the analysis and use of student data. In the context of open and distributed learning environments, such developments offer huge potential to scale student support and learning, to provide real-time advice and personalised learning journeys, and to break or ameliorate the impact of the iron triangle of cost, quality and access in distance, open and distributed learning (Daniel et al., 2009; Power & Morven-Gould, 2011). At the same time, these new approaches also raise concerns pertaining to, for example, the potential for bias, the ‘black box’ of algorithmic decision-making, and a lack of oversight and accountability (Prinsloo, 2017; Vytasek et al., 2020). In moving forward, we should critically consider the potential for harm and prejudice for those students who may already be at risk (see, for example, Macgilchrist, 2019; Pardo & Siemens, 2014; Prinsloo et al., 2018; Slade & Prinsloo, 2013; Weber, 2016).
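To make the kind of analysis referred to above concrete, the sketch below combines a handful of hypothetical demographic and behavioural indicators into a deliberately transparent, rule-based risk score. The field names, weights and threshold are invented for illustration only and are not drawn from any institution or chapter in this book; a production system would rely on validated models and, as noted above, careful auditing for bias.

```python
# A deliberately transparent (non-'black box') illustration of combining demographic
# and behavioural student data into a simple risk score. All field names, weights and
# thresholds here are hypothetical and for illustration only.
from dataclasses import dataclass

@dataclass
class StudentRecord:
    prior_qualification: bool    # administrative/demographic data from registration
    first_generation: bool
    logins_last_14_days: int     # behavioural data from the learning platform
    assignments_submitted: int
    assignments_due: int

def risk_score(s: StudentRecord) -> float:
    """Return a score in [0, 1]; higher suggests the student may need proactive support."""
    score = 0.0
    if s.logins_last_14_days == 0:           # behavioural signal: disengagement
        score += 0.4
    if s.assignments_due and s.assignments_submitted < s.assignments_due:
        score += 0.3                         # behavioural signal: missed assessment
    # Demographic proxies such as the two below can encode bias and must be audited.
    if not s.prior_qualification:
        score += 0.2
    if s.first_generation:
        score += 0.1
    return min(score, 1.0)

student = StudentRecord(prior_qualification=False, first_generation=True,
                        logins_last_14_days=0, assignments_submitted=1,
                        assignments_due=3)
print(risk_score(student))  # 1.0 -> candidate for proactive contact
```

A rule-based score of this kind is easy to inspect and to explain to students and staff, which is precisely what the ‘black box’ concern above suggests may be lost when opaque algorithmic models are used instead.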
Rationale for the Book

The rationale for this book was threefold: Firstly, as learning analytics matures, there is strong evidence that it provides greater understanding of the complexities in teaching and learning, for both educators and students. It assists educational institutions in responding to student vulnerabilities, needs and potential by improving pedagogical strategies, appropriate cognitive load, appropriate assessment strategies, allocation of resources and student support. Secondly, much of the published research on the collection, analysis and use of student data has been produced in and by researchers in traditional, face-to-face educational contexts, predominantly
in the Global North. While notions of a Global North and Global South are highly contested (Arora, 2016; Milan & Treré, 2019; Toshkov, 2018), the divide between the research produced in each is clear. A single publication, such as this book, will not address this imbalance on its own—however, the Call for Expressions of Interest specifically sought proposals from researchers from the Global South. Thirdly, the title of this book alerts its readers to a specific interest in the potential, analyses and risks of implementing learning analytics in open and distributed learning contexts. As indicated in the Preface, open and distributed learning share a physical (permanent or temporal) separation between students and their providing institution for a host of reasons, e.g., flexibility, choice or admission requirements. Such separation—in time or space from the providing institution, educators and often also peers and support networks—requires different skills and resilience of students, as well as different strategies, resources and often technologies to support students with timely, effective, appropriate, and ethical support. Having access to operational, academic and student data, as well as the technical infrastructures and human resources needed to collect and analyse it, has always been essential in educational provision, even before learning analytics. The emergence and maturation of learning analytics has occurred in the nexus of access to increasing amounts, variety, velocity and granularity of data, the role of evidence-led management and teaching, advances in data science and analytics, and the increasing prominence of online learning. This is of particular importance to open and distributed learning providers as educators are often “teaching in the dark” (Prinsloo, 2017) for much of the student learning journey. In this context, they use data collected during admission and registration processes, combined with learning behavioral data (whether in the form of formative assessment, engagement data trails, inquiries, etc.) to adapt pedagogy, and to respond to students’ cognitive, affective, and administrative needs. The separation of open and distributed learning providers from students creates unique challenges for institutions, educators, managers, and course support teams in having a clear understanding of student progress. Providing students with information regarding their learning, whether, inter alia, in the form of feedback on assignments, discussion forum engagement, or student-facing dashboards, is an increasingly important part of how open and distributed learning institutions help students make more informed decisions. The aim here was to explore and map, possibly for the first time, how open and distributed learning providers use, or might use, student data to get a sense of the specific challenges and potential of learning analytics in those contexts. It is important to note that our assumption is not that residential or traditional educational providers have solved the different challenges and fully realised the potential of learning analytics. On the contrary, the assumption underpinning this book is simply a recognition of the unique challenges and opportunities that providing and facilitating learning in open and distributed learning environments offers. Consider, for example, two open and distance learning institutions represented in two of the chapters here: the Open University in the UK (OU) and the University of South Africa (Unisa).
Though there are many differences between these two institutions around, for example, admission requirements, funding, programs offered
and business models, they share the characteristic of offering quality learning opportunities at scale. While the number of students in traditional forms of educational delivery can be determined by the size of lecture halls, accommodation facilities and number of staff, open and distributed learning has long attempted to offer learning at scale, making use of a range of available technologies. An overview of 27 open universities (Mishra, 2017) provides an informative perspective. The report indicates that low throughput and retention rates are often associated with open and distance learning and are of particular concern. The dominant form of teaching involves “self-learning materials (SLMs) that are available in print and, with increasing frequency, online. Digital SLMs generally contain audio–video lectures, electronic assignments, self-practice quizzes and self-assessment exercises” (Mishra, 2017, p. 8). The key priority of these institutions is to increase the effectiveness and appropriateness of student support, followed by strengthening online learning infrastructures and quality assurance (Mishra, 2017). These institutions are specifically looking at online approaches to address student needs. To get a sense of what teaching at scale might look like, and the role of student learning data in making it possible, more effective and more appropriate, consider what it would mean to provide learning opportunities for over 150,000 students (as in the case of the OU) or over 350,000 students (in the case of Unisa). How many teaching staff (both contract and full-time) are needed? What technologies are available in the context of provision? How can access be provided to thousands of students, while also ensuring that the learning experiences are of high quality, against a backdrop of financial sustainability? To provide context for how the phenomenon of scale plays out at the OU and at Unisa, consider the following (Mishra, 2017):

The OU offers around 120 qualifications and a total of 400 courses using a range of open, online, and a small number of face-to-face strategies to facilitate learning for 168,167 students. The OU has 2,552 full-time teaching staff and 4,402 part-time staff.

Unisa offers 624 programs and a total of 2,974 courses through a combination of face-to-face and online tutorials, as well as printed and online materials, for 351,160 students. Unisa has 2,159 full-time teaching staff and 9,095 part-time staff.
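As a rough illustration of what these figures imply for instructor-student ratios (discussed below), the short sketch that follows simply divides students by total teaching staff. It treats full-time and part-time staff as equivalent, which overstates capacity, so the resulting ratios are indicative only.

```python
# Rough student-to-staff ratios derived from the Mishra (2017) figures quoted above.
# Illustrative only: full-time and part-time teaching staff are treated as equivalent,
# which overstates real teaching capacity.
institutions = {
    "OU":    {"students": 168_167, "full_time": 2_552, "part_time": 4_402},
    "Unisa": {"students": 351_160, "full_time": 2_159, "part_time": 9_095},
}

for name, figures in institutions.items():
    staff = figures["full_time"] + figures["part_time"]
    ratio = figures["students"] / staff
    print(f"{name}: ~{ratio:.0f} students per member of teaching staff")
    # prints roughly 24 for the OU and 31 for Unisa
```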
The statistics above present a glimpse of scale—not only in terms of student numbers, but also the number of programs and courses, instructor-student ratios, needs in terms of student support and ICT infrastructures, openness in terms of registration and tuition periods, and so forth. There is, however, another consideration, which is how scale impacts cost (of provision, but also cost for students) and quality. The notion of the “iron triangle” refers to access, quality and cost as three interdependent vectors forming a triangle, where a change in one vector impacts the other two (Daniel et al., 2009; Power & Morven-Gould, 2011). For example, increasing the number of students allowed to register for a particular program or course in an open and distributed setting may bring down the running costs of the course due to economies of scale, but might also raise concerns about ensuring quality. The issue of quality might be addressed by appointing more part-time instructors, which in turn raises the cost (Hülsmann, 2004, 2016). There are some authors who propose
that the assumptions and epistemologies foundational to the iron triangle be revisited, e.g., Kanuka and Brooks (2010), Power and Morven-Gould (2011), and Prinsloo and Slade (2014). It falls outside the scope of this introductory chapter to unpack the iron triangle in more detail. The reality, though, is that the increasing numbers of students who enrol through open and distributed learning in a particular program or course have implications for the quality of the learning experience as well as the financial and reputational sustainability of the providing institution (and costs to students). And this is where data, and particularly student learning data, offer huge potential as well as risks. As institutions respond to learner needs, a careful balance is needed between evidence that the intervention and allocation of resources will bear fruit, and the availability of resources (Prinsloo & Slade, 2014, 2017). Growing resource constraints are but one of the challenges facing educational provision. Others include increasing competition between institutions, new markets, advances in technology, the massification of education, as well as changes in funding frameworks in a climate of public education funding constraints and the broader unbundling and re-bundling of educational provision (e.g., Altbach et al., 2019; Lewis & Shore, 2019; Shaw et al., 2020). Enter data, and in particular, student (learning) data. Institutions attempt to not only collect, analyse and use data to ensure competitiveness and sustainability, but also to address endemic concerns about student dropout and retention in specifically open and distributed learning environments. Data and data analytics offer a particular data imaginary to educational institutions, providing access to “speedy, accessible, revealing, panoramic, prophetic and smart” solutions (Beer, 2018, p. 22; see also Williamson, 2017). Not only do data analytic vendors and learning platform providers offer fast, at-the-touch-of-a-button analytics, they also remove the need for institutional expertise since the analysis is often a ‘black-box’ solution. These analytics dangle the promise of previously unseen patterns in data, as well as panoramic overviews of student learning journeys that entail prophetic and smart solutions. Combine the challenges faced by open and distributed learning institutions with the promises of such a data imaginary, and it is relatively easy to understand the lure of learning analytics in this context. Combining student data harvested at registration and admission with that gifted in discussion forums and with automated collection of login and engagement patterns constitutes a change from ‘teaching in the dark’, allowing institutions and students to make more informed and appropriate decisions. From the outset, as editors, we hoped to address the current gap in published/reported theoretical, conceptual and empirical research focusing on learning analytics in open and distributed environments. We believe that the specific nature of open and distributed learning offers unique challenges, but also unique potential, for learning analytics as a research focus and practice/praxis. In the context of the rich and vast body of theoretical and empirical research spanning the evolution of distance and distributed learning, the chapters in the book build on and further expand this body of research, albeit with reference to the potential and challenges of learning analytics in open and distributed contexts.
When we invited scholars, researchers and practitioners to submit proposals to map the potential and challenges of learning analytics, it was unthinkable that the global (higher) education sector would be disrupted by the Covid-19 pandemic. Prior to Covid-19, there were signs that the strict boundaries and differences between the traditional residential and traditional open, distributed and distance learning contexts were increasingly becoming porous, and forms of educational delivery were becoming not only unbundled, but also rebundled. As educational institutions responded to the pandemic, the porousness of the traditional boundaries increased as traditional face-to-face institutions moved to Emergency Remote Online Teaching and Learning (EROTL). While it falls outside the scope of this introductory chapter to discuss the overlaps and important differences between EROTL and established open, distance and distributed learning theories and practices, what is pertinent in the scope of the book is the fact that student data, and especially student learning data, became more important than ever before. While one may assume that traditional institutions were more severely disrupted than open, distance and distributed learning institutions and forms of delivery, we should not underestimate the immense impact of Covid-19 on open, distance and distributed learning institutions and forms of delivery as well. Not only were staff and students affected, but many systems and processes in open, distance and distributed learning institutions were also put under severe strain. As institutions, systems and processes adapted to the effects of the pandemic, student behavioral data became even more important for open, distance and distributed learning institutions too. Although the chapters in this book were written pre-Covid-19, as editors we are convinced that they provide important insights into the potential and challenges of learning analytics for open, distance and distributed learning institutions and forms of delivery. The seven chapters included here represent a wide range of geopolitical and institutional contexts. There are two chapters from the US, two from Europe, two from South Africa and one from Hong Kong. Three of the chapters are from two of the largest open distance education providers in the world, namely the Open University in the UK and the University of South Africa (Unisa), while the others represent research from what might be regarded as more traditional educational providers. Further, this variety is enhanced by a case study chapter based in a K-12 environment, and a further chapter which considers a multimodal approach. The book concludes by providing some directions for future research in learning analytics in open and distributed learning environments, as well as pointers for evidence-informed strategies for policy makers and practitioners/educators.
Target Audience

We would hope that a varied audience will find the chapters in the book of value, including individuals connected to or having an interest in open, distance and
distributed learning environments, such as managers, policy-makers, learners, instructors, academic planners, ICT staff, faculty, financial managers, researchers, practitioners, curriculum developers, instructional designers, and administrators.
Structure of the Book

Although covering a broad range of conditions in terms of institutional size, location and educational setting, the following chapters offer insight into uses of learning analytics in an open, distributed learning context. Each seeks to make clear how learning analytics links to the focus of the chapter, and each offers key findings or critical propositions for considering learning analytics in the context of open and distributed learning.

Chapter 2, from Billy Tak-Ming Wong of Hong Kong, provides useful context for the remainder of the book, exploring the issues particular to open and distributed learning, providing a sound theoretical base, and summarising the evolution of intervention strategies. The chapter approaches learning analytics from a perspective of intervention strategies, discussing a range of foci and practices and highlighting the ways in which learning analytics can support these.

Fynn et al. of the University of South Africa expand on some of the issues flagged in the first chapter, highlighting many of the realities of open, distance learning from the perspective of the Global South. Chapter 3 looks specifically at higher education at scale by means of a case study of a large South African university. The introduction and implementation of learning analytics in this context face a number of challenges, many associated with the availability of quality, relevant data. It provides a useful illustration of the potential realities for many institutions considering the introduction of learning analytics and offers a number of relevant pointers, concluding with a view that learning analytics systems cannot simply be transplanted from the Global North to contexts within the Global South—they must be fit for purpose and tailored appropriately.

In Chapter 4, Avinash Boroowa and Christothea Herodotou of the Open University in the UK provide the perspective of another large open, distance learning institution—this time as one of the leading higher education providers in the field of learning analytics research and implementation. The chapter gives an overview of two specific learning analytics initiatives implemented across the wider university—the first seeking to inform the learning design of online courses, and the second using predictive learning analytics to identify students potentially at risk in order to trigger proactive interventions. In both cases, lessons learned are shared. The authors argue that, inter alia, institutional readiness is key to the success of learning analytics implementation, including staff readiness (digital and data literacy skills) as well as having the necessary technical infrastructure in place. In addition, they address the need for consideration of ethical perspectives to ensure that student interests and welfare are safeguarded.
Chapter 5 examines learning analytics through the lens of self-regulated learning. In this chapter, Mohammad Khalil of the University of Bergen discusses a mobile multimodal learning analytics approach to support learners, instructors and instructional designers. The approach recognises the increasing use of mobile devices to deliver learning content and the additional opportunities offered to capture a rich set of contextual and temporal learner data. Like other forms of distributed learning, mobile learning faces a number of challenges, in particular a relative lack of insight into how best to support self-regulated learning strategies. Given the increasing prevalence of mobile learning and the persisting need to improve retention in distributed settings, this chapter provides constructive insight for a range of stakeholders.

In Chapter 6, Jeremiah Kalir of the University of Colorado introduces the uses and benefits of the social and technical practice of annotation, the addition of notes to texts. As the author states, social annotation is a particular genre of learning technology that enables information sharing, interaction with other learners and collaboration. The annotation of online material facilitates user familiarity with new ideas and the co-construction of new knowledge. Further, the collaborative activity mediated by social annotation generates digital traces that can be gathered, analyzed, reported, and interpreted as discourse data. In this chapter, Kalir discusses the development of a public social learning analytics dashboard that reports group-level social annotation and encourages open learners’ discourse and collaboration. In designing the dashboard, the developers have tackled a number of issues, including the trade-off between analytic insight and potential ethical concerns around data capture and storage. A key strength of this book is the recognition of context. Kalir posits that the creation of actionable insight about the social production of knowledge should “honor how knowledge is socially situated and openly-networked, how discourse and activity emerge from and occur because of participatory group processes, and how analytic insight should reflect knowledge construction in context”.

Much of the focus in learning analytics is on its application within higher education and, in open and distributed learning contexts, on learning at scale. In Chapter 7, Quick and Itow provide an example of how learning analytics may be used to support course design in online secondary contexts by examining individual learning. In doing this, they recognise that many online learners are often amongst the least equipped for online learning. Quick and Itow discuss the relevance of situative theories of learning. They state that combining behavioral, cognitive, and sociocultural perspectives encourages multiple perspectives and frameworks in the development of learning analytics for online learning systems, and provide a set of principles used to develop situatively informed analyses. The principles are illustrated by application to an online high school embedded within a large US university to demonstrate insights regarding the intersection between online course design and student activity.

Finally, Chapter 8 provides a discussion of ethical issues with a lens on applications of artificial intelligence (AI) in an open and distance learning context. In many ways, artificial intelligence represents what many might see as the future for the delivery of online education.
Ungerer and Slade’s chapter outlines the range of systems and applications typically associated with educational provision, e.g., the automation of basic administrative tasks and the use of intelligent tutoring systems.
As well as highlighting the potential benefits, the authors discuss a number of ethical concerns—such as the introduction of bias, privacy and consent. The chapter offers a broad overview of the topic, concluding that AI will doubtless play a growing part in the provision of open and distributed learning, whilst recognising that it is unlikely to fully replace human educators.

Acknowledgements

The editors wish to acknowledge and thank all of the contributing authors for their forbearance and patience in the production of this book. All have provided valuable insight and pointers for the implementation of learning analytics across a broad range of open and distributed learning contexts. We offer special thanks to Professor Alan Tait for providing the foreword for this book—Alan’s expertise and his standing within the open, distance learning community is second to none. We are grateful that he made time to review the chapters and provide ongoing encouragement. Finally, we thank the team at Springer for providing their support for this book. It is hoped that it will serve as a useful guide to learning analytics in a crucial, but often underrepresented, context.
References

Altbach, P. G., Reisberg, L., & Rumbley, L. E. (2019). Trends in global higher education: Tracking an academic revolution. Brill.
Arora, P. (2016). Bottom of the data pyramid: Big data and the global south. International Journal of Communication, 10(19), 1681–1699.
Bayne, S., Gallagher, M. S., & Lamb, J. (2014). Being ‘at’ university: The social topologies of distance students. Higher Education, 67(5), 569–583.
Beer, D. (2018). The data gaze: Capitalism, power and perception. Sage.
Belawati, T., & Baggaley, J. (Eds.). (2009). Distance education in Asia: The PANdora guidebook. http://www.pandora-asia.org/guidebook/PDEG-ed1.pdf
Czerniewicz, L., Agherdien, N., Badenhorst, J., Belluigi, D., Chambers, T., Chili, M., de Villiers, M., Felix, A., Gachago, D., Gokhale, C., Ivala, E., Kramm, N., Madiba, M., Mistri, G., Mgqwashu, E., Pallitt, N., Prinsloo, P., Solomon, K., Strydom, S., … & Wissing, G. (2020). A wake-up call: Equity, inequality and Covid-19 emergency remote teaching and learning. Postdigital Science and Education, 2(3), 946–967.
Daniel, J., Kanwar, A., & Uvalić-Trumbić, S. (2009). Breaking higher education’s iron triangle: Access, cost, and quality. Change: The Magazine of Higher Learning, 41(2), 30–35.
Edwards, R. (1994). From a distance? Globalisation, space-time compression and distance education. Open Learning: The Journal of Open, Distance and e-Learning, 9(3), 9–17.
Evans, T. (1989). Taking place: The social construction of place, time and space, and the (re)making of distances in distance education. Distance Education, 10(2), 170–183.
Evans, T., & Nations, D. (1992). Theorising open and distance education. Open Learning, 7(2), 3–13.
Ferguson, R. (2012). Learning analytics: Drivers, developments and challenges. International Journal of Technology Enhanced Learning, 4(5/6), 304–317.
Ferguson, R., & Clow, D. (2017). Where is the evidence? A call to action for learning analytics. In LAK ’17: Proceedings of the Seventh International Learning Analytics and Knowledge Conference (pp. 56–65). ACM.
Garrison, D. R. (1985). Three generations of technological innovation in distance education. Distance Education, 6, 235–241.
Garrison, D. R. (2000). Theoretical challenges for distance education in the 21st century: A shift from structural to transactional issues. International Review of Research in Open and Distance Learning, 1(1), 1–17.
Gašević, D., Dawson, S., & Siemens, G. (2015). Let’s not forget: Learning analytics are about learning. TechTrends, 59(1), 64–71.
Guglielmo, T. (1998). Computer conferencing systems as seen by a designer of online courses. Educational Technology, 38(3), 36–43.
Gunter, A., Raghuram, P., Breines, M. R., & Prinsloo, P. (2020). Distance education as sociomaterial assemblage: Place, distribution, and aggregation. Population, Space and Place, 26(3), e2320.
Guri-Rosenblit, S. (2009). Diverse models of distance teaching universities. Encyclopedia of Distance Learning, 2, 727–733.
Heydenrych, J. F., & Prinsloo, P. (2010). Revisiting the five generations of distance education: Quo vadis? Progressio, 32(1), 5–26.
Holmberg, B. (2005). The evolution, principles, and practice of distance education (pp. 37–88, 104–105). BIS-Verlag der Carl von Ossietzky Universität Oldenburg.
Hülsmann, T. (2004). Low cost distance education strategies: The use of appropriate information and communication technologies. The International Review of Research in Open and Distributed Learning, 5(1). https://doi.org/10.19173/irrodl.v5i1.175
Hülsmann, T. (2016). The impact of ICT on the costs and economics of distance education: A review of the literature. Commonwealth of Learning. Retrieved from http://oasis.col.org/handle/11599/2047
Kanuka, H., & Brooks, C. (2010). Distance education in a post-Fordist time: Negotiating difference. In M. Cleveland-Innes & D. R. Garrison (Eds.), An introduction to distance education (pp. 81–102). Routledge.
Kitchin, R. (2014). The data revolution: Big data, open data, data infrastructures and their consequences. Sage.
Kitto, K., Shum, S. B., & Gibson, A. (2018, March). Embracing imperfection in learning analytics. Paper presented at LAK ’18, Sydney, Australia. http://users.on.net/~kirsty.kitto/papers/embracing-imperfection-learning-final.pdf
Lauzon, A. C., & Moore, G. A. B. (1989). A fourth-generation distance education system: Integrating computer-assisted instruction and computer conferencing. Journal of Distance Education, 3(1), 38–49.
Leitner, P., Khalil, M., & Ebner, M. (2017). Learning analytics in higher education: A literature review. In A. Peña-Ayala (Ed.), Learning analytics: Fundamentals, applications, and trends (pp. 1–23). Springer.
Lewis, N., & Shore, C. (2019). From unbundling to market making: Reimagining, reassembling and reinventing the public university. Globalisation, Societies and Education, 17(1), 11–27.
Lim, L. A., Gentili, S., Pardo, A., Kovanović, V., Whitelock-Wainwright, A., Gašević, D., & Dawson, S. (2019). What changes, and for whom? A study of the impact of learning analytics-based process feedback in a large course. Learning and Instruction. https://doi.org/10.1016/j.learninstruc.2019.04.003
Macgilchrist, F. (2019). Cruel optimism in edtech: When the digital data practices of educational technology providers inadvertently hinder educational equity. Learning, Media and Technology, 44(1), 77–86.
MacIntosh, C. (Ed.). (2005). Perspectives on distance education: Lifelong learning and distance higher education. COL and UNESCO.
Marsden, R. (1996). Time, space and distance education. Distance Education, 17(2), 222–246.
McCowan, T. (2017). Higher education, unbundling, and the end of the university as we know it. Oxford Review of Education, 43(6), 733–748.
Milan, S., & Treré, E. (2019). Big data from the South(s): Beyond data universalism. Television & New Media, 20(4), 319–335.
Mishra, S. (2017). Open universities in the Commonwealth: At a glance. Commonwealth of Learning. Retrieved from http://oasis.col.org/handle/11599/2786
Moore, M. G., & Kearsley, G. (2011). Distance education: A systems view of online learning. Cengage Learning.
Moore, M. G., & Kearsley, G. (2012). Distance education: A systems view of online learning. Wadsworth-Cengage Learning.
Morris, N., Swinnerton, B., & Czerniewicz, L. (2019). Unbundling education: Mapping the changing nature of higher education in South Africa-ESRC. Impact, 2019(1), 44–46.
Pardo, A., & Siemens, G. (2014). Ethical and privacy principles for learning analytics. British Journal of Educational Technology, 45(3), 438–450.
Peters, O. (1996). Distance education is a form of teaching and learning sui generis. Open Learning, 11(1), 51–54.
Peters, O. (2010). Distance education in transition: Developments and issues (5th ed.). BIS-Verlag der Carl von Ossietzky Universität Oldenburg.
Power, T. M., & Morven-Gould, A. (2011). Head of gold, feet of clay: The online learning paradox. The International Review of Research in Open and Distributed Learning, 12(2), 19–39.
Prinsloo, P. (2017). Fleeing from Frankenstein’s monster and meeting Kafka on the way: Algorithmic decision-making in higher education. E-Learning and Digital Media, 14(3), 138–163.
Prinsloo, P., & Slade, S. (2014). Educational triage in open distance learning: Walking a moral tightrope. International Review of Research in Open and Distributed Learning, 15(4), 306–331.
Prinsloo, P., & Slade, S. (2017, March). An elephant in the learning analytics room: The obligation to act. In Proceedings of the Seventh International Learning Analytics & Knowledge Conference (pp. 46–55).
Prinsloo, P., Khalil, M., & Slade, S. (2018). User consent in MOOCs: Micro, meso, and macro perspectives. International Review of Research in Open and Distributed Learning (IRRODL), 19(5), 61–79.
Shaw, P., Green, P., Gration, M., Rhodes, C., Sheffield, D., & Stone, J. (2020). Within these hyperporous walls: An examination of a rebundled online learning model of higher education. Australasian Journal of Educational Technology, 36(5), 85–101.
Siemens, G. (2013). Learning analytics: The emergence of a discipline. American Behavioral Scientist, 57(10), 1380–1400.
Slade, S., & Prinsloo, P. (2013). Learning analytics: Ethical issues and dilemmas. American Behavioral Scientist, 57(10), 1510–1529.
Spector, J. M. (2009). Reconsidering the notion of distance in distance education. Distance Education, 30(1), 157–161.
Subotzky, G., & Prinsloo, P. (2011). Turning the tide: A socio-critical model and framework for improving student success in open distance learning at the University of South Africa. Distance Education, 32(2), 177–193.
Taylor, J. C. (1995). Distance education technologies: The fourth generation. Australian Journal of Educational Technology, 11(2), 1–7.
Taylor, J. C. (2001). Fifth generation distance education. Instructional Science and Technology, 4(1), 1–14.
Toshkov, D. (2018, November 6). The ‘Global South’ is a terrible term. Don’t use it. [Web log post]. http://re-design.dimiter.eu/?p=969
Usher, R. (2002). Putting space back on the map: Globalisation, place and identity. Educational Philosophy and Theory, 34(1), 41–55.
Vytasek, J. M., Patzak, A., & Winne, P. H. (2020). Analytics for student engagement. In A. S. Lampropoulos & G. A. Tsihrintzis (Eds.), Machine learning paradigms: Applications in recommender systems (pp. 23–48). Springer.
Weber, A. S. (2016). The big student big data grab. International Journal of Information and Education Technology, 6(1), 65–70.
Williamson, B. (2017). Big data in education: The digital future of learning, policy and practice. Sage.
Williamson, B., Eynon, R., & Potter, J. (2020). Pandemic politics, pedagogies and practices: Digital technologies and distance education during the coronavirus emergency. Learning, Media and Technology, 45(2), 107–114. https://doi.org/10.1080/17439884.2020.1761641
Wong, B. T. M., & Li, K. C. (2019). A review of learning analytics intervention in higher education (2011–2018). Journal of Computers in Education, 1–22.
Paul Prinsloo is a Research Professor in Open and Distance Learning (ODL) in the Department of Business Management, in the College of Economic and Management Sciences, University of South Africa (Unisa). Since 2015, he has also been a Visiting Professor at the Carl von Ossietzky University of Oldenburg, Germany. In 2019, the National Research Foundation (NRF) in South Africa awarded Paul a B3 rating, confirming his considerable international reputation for the high quality and impact of his research outputs. He is also a Fellow of the European Distance and E-Learning Network (EDEN) and serves on several editorial boards. His academic background includes fields as diverse as theology, art history, business management, online learning, and religious studies. Paul is an internationally recognised speaker, scholar and researcher and has published numerous articles in the fields of teaching and learning, student success in distance education contexts, learning analytics, and curriculum development. His current research focuses on the collection, analysis and use of student data in learning analytics, graduate supervision and digital identity.

Sharon Slade has worked in open, distance education for almost 20 years, as a senior lecturer at the Open University in the UK. She led the work on the development of the University’s policy on the ethical use of student data for learning analytics, arguably the first of its kind in higher education worldwide. She has since contributed to further framework developments, notably with Jisc, Stanford University and Ithaka S+R, New America and the International Council for Open and Distance Education. Sharon was an academic lead for learning analytics projects within the Open University, leading work around ethical uses of student data, operationalisation of predictive analytics and approaches aiming to improve retention and progression. Keynotes and publications include papers around student consent, the obligation to act on what is known, examining the concept of educational triage and broader issues around an ethics of care. She now works on data insight at Earth Trust, an educational and environmental charity near Oxford.

Dr. Mohammad Khalil is a senior researcher and lecturer in learning analytics at the Centre for the Science of Learning & Technology (SLATE) at the Faculty of Psychology, University of Bergen, Norway. Mohammad has a master’s degree in information security and digital criminology and a Ph.D. degree from Graz University of Technology in learning analytics in Massive Open Online Courses (MOOCs). Khalil has rich international experience, having worked in four different countries since 2015. He has published over 50 articles on learning analytics in high-standard and well-recognized journals and academic conferences, focusing on understanding and improving student behavior and engagement in digital learning platforms using data science. His current research focuses on learning analytics in Open and Distance Learning (ODL), self-regulated learning, mobile learning, visualizations and gamification, as well as privacy and ethics. His personal website is: http://mohdkhalil.wordpress.com.
Chapter 2
The Potential of Learning Analytics for Intervention in ODL

Billy Tak-Ming Wong
Introduction

The provision of assistance for at-risk or underachieving learners has long been an essential part of open and distance learning (ODL). Usually referred to as intervention in academic and instructional areas, it generally aims to “prevent learners’ academic failure by monitoring their progress, providing additional instruction or support that matches learners’ needs and influencing their physical, intellectual and moral development” (Wong & Li, 2020, p. 8). In ODL, the practice of intervention may have more specific aims, such as enhancing student persistence and retention. Within the development of ODL, the focus of intervention has evolved alongside the emergence of learning analytics (Wong et al., 2018). Despite this, studies reporting empirical work on learning analytics intervention have not been prevalent. Rienties et al. (2017) suggest that this is a result of the lack of a comprehensive framework with evidence to guide intervention practice supported by learning analytics. However, analyses of case studies of learning analytics intervention by Wong (2017) and Wong and Li (2020) suggest that there is no fundamental difference between the nature of intervention with or without the support of learning analytics. Against this background, this chapter addresses the potential of learning analytics for intervention in ODL. It first reviews intervention in ODL as a basis for the discussion, covering the relevant theoretical foundation, as well as intervention practice in terms of the problems to be dealt with, existing approaches, and the emerging use of learning analytics over the past decade. It then discusses the potential of learning analytics in this context and the challenges which should be tackled.
B. T.-M. Wong (B) Hong Kong Metropolitan University, Ho Man Tin, Hong Kong e-mail: [email protected] © The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd. 2022 P. Prinsloo et al. (eds.), Learning Analytics in Open and Distributed Learning, SpringerBriefs in Open and Distance Education, https://doi.org/10.1007/978-981-19-0786-9_2
Intervention in ODL

Intervention in the context of ODL is commonly related to student dropout, which has long been a prime concern for institutions with retention rates often below those of conventional institutions. In spite of the costs associated with additional support provision, intervention strategies are largely deemed necessary to retain students—as a "do or die" issue enabling ODL institutions to survive in the education sector (Netanda et al., 2019). Intervention helps students to cope with the academic demands of an institution and meet their own academic expectations, and thus encourages them to stay until successful completion of their studies.

One important aspect of intervention is contact with students. The contact can be reactive (i.e. in response to students) or proactive (i.e. initiated by the institution). Since not all ODL students make contact with their institution, particularly those who are less confident, the key to retention is proactive contact—that is, institutions should take the initiative to reach students "who might not make contact with the student support system otherwise and may be more likely to drop out" (Simpson, 2004, p. 81).

The formulation of appropriate intervention strategies depends on a clear understanding of the reasons or patterns of students' withdrawal or non-completion of their studies. With differing aims, intervention may have various focuses—for example, on recruitment (to retain registered students before the course starts), retention (to keep students on the course), retrieval (to get students back onto the course), or reclamation (to get students back onto a subsequent presentation of the same course or a different one at some future time) (Netanda et al., 2019). Since students may exit at different stages, intervention should embrace various aspects of students' study life. For example, Tung (2012) reported a range of proactive intervention strategies, including those prior to the start of the learning journey, face-to-face contact and academic support, as well as marketing and communication. Accordingly, there were initiatives on preparation programmes; advice and guidance on course selection; orientation programmes; face-to-face tutorials; telephone hotlines; online support; regional support and learning centres; academic support and interpersonal contact by faculty members; proactive contact with inactive students; and retention bonus and incentive schemes to encourage student persistence.

In learning analytics, intervention often refers to the action to take after data generation (usually from learning platforms), tracking (to trace learners' data), and analysis (to retrieve information and generate patterns from the data) (Khalil & Ebner, 2015). It should be noted that conventional interventions and those supported by learning analytics may share similar goals, strategies and approaches. The role of learning analytics in the intervention process may lie in early identification of students who might need help; informing pedagogy, instructional design, and assessment strategies; as well as prioritising students for intervention and personalising assistance (Wong, 2019).
Related Models and Theories

Various models and theories have been proposed in the past decades to explain students' needs for intervention and to guide intervention practices. This section outlines a number which are potentially applicable to learning analytics intervention in ODL. While there is not a single model or theory which encompasses all the issues and factors involved in explaining student success or departure, Table 2.1 aims to summarise the key elements commonly covered in related models and theories, as well as the relevant focuses of intervention.

The focuses of intervention addressed in these models and theories generally revolve around interaction (student–student and student–teacher), students' learning experiences, course materials, course design, instructors' facilitation in online discussion, and student support. Such elements may influence students' satisfaction and participation, and eventually their persistence. The development of learning analytics has helped to inform and support such intervention practices.
Intervention Practices Over the Years

This section reviews intervention practice in the ODL context over the years. Major types of intervention are analysed in terms of the issues dealt with and intervention strategies, highlighting the emerging role and popularity of learning analytics.
Method

Relevant articles were collected from academic search engines including Web of Science, Scopus, and ProQuest.1 The search terms include [("intervention" or "intervene" or "retention strategies" or "reduce dropout" or "improve retention") and ("distance education" or "open education" or "online learning")]. Each article was checked and included only if the intervention practice was implemented in the ODL setting. A total of 71 articles were finally retrieved for analysis. The period of publication spans from the 1980s to the 2010s.

Based on the nature of the practices reviewed, the intervention strategies presented in different periods are summarised below in the following categories: (i) learning issues; (ii) learning needs; (iii) interpersonal interaction; (iv) online discussion; (v) collaborative learning; (vi) student engagement; and (vii) student dropout. The categories are not meant to be mutually exclusive, but to highlight the various focuses of the strategies and lay a foundation for discussion.
1 The search and selection were conducted on 3 August 2020.
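To illustrate how a Boolean search string of this form can be assembled and applied as a first screening pass, the following is a minimal, hypothetical sketch. The record fields and sample entries are assumptions for illustration only and do not reproduce the actual search performed for this chapter.

```python
# Illustrative sketch only: assembling the Boolean search string described above
# and applying a crude title/abstract screening filter to hypothetical records.
intervention_terms = ["intervention", "intervene", "retention strategies",
                      "reduce dropout", "improve retention"]
context_terms = ["distance education", "open education", "online learning"]

# Compose the query in the form accepted by most academic search engines.
query = "({}) AND ({})".format(
    " OR ".join(f'"{t}"' for t in intervention_terms),
    " OR ".join(f'"{t}"' for t in context_terms),
)
print(query)

def matches(record: dict) -> bool:
    """Keep a record if its title or abstract mentions at least one intervention
    term and at least one context term (a deliberately simple screening rule)."""
    text = (record.get("title", "") + " " + record.get("abstract", "")).lower()
    return (any(t in text for t in intervention_terms)
            and any(t in text for t in context_terms))

# Hypothetical records for demonstration.
records = [
    {"title": "Retention strategies in distance education", "abstract": "..."},
    {"title": "Gamification in face-to-face classrooms", "abstract": "..."},
]
screened = [r for r in records if matches(r)]
print(len(screened), "record(s) retained")
```

In practice the screening reported in the chapter was a manual check of whether each intervention was implemented in an ODL setting; the snippet above only mimics the keyword stage.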
Table 2.1 Key elements of the theories/models and focuses of intervention

Theory of transactional distance (Moore, 1973, 1993)
Key elements:
– Psychological and communication gap resulting from the geographical separation of teachers from learners
– Course design
– Student-content, student-instructor, and student–student interactions
– Appropriately structured presentations and interactions
Focuses of intervention: course design; interpersonal interaction (student-instructor and student–student)

Theory of guided didactic conversation (Holmberg, 1983)
Key elements:
– Guided conversation aiming at learning in its associated traits
– Presentation of learning matters
– Two-way communication between tutors and students
– Students' involvement in the learning process as a result of their personal relationships with tutors
Focuses of intervention: course materials; communication between students and teachers

Student integration model (Tinto, 1975)
Key elements:
– Students' integration into the academic and social systems of the institution
– Students' formal and informal interactions with the faculty and their peers
Focuses of intervention: interaction between students and tutors

Longitudinal-process model of dropout (Kember, 1989)
Key elements:
– Students' integration into the academic environment of the institution
– Balancing the requirements of distance learning with work, family, and social obligations
– Students' cost–benefit analysis by weighing the costs of studying against its perceived benefits
Focuses of intervention: students' learning experiences

Social presence theory (Short et al., 1976)
Key elements:
– The degree to which students feel socially present
– Students' active interaction and collaboration
– Students' sense of belonging and social cohesion in the community
– Students' feeling of being engaged in a virtual environment
– Students' participation in learning
Focuses of intervention: student participation

Community of inquiry model (Garrison et al., 2000)
Key elements:
– Computer-mediated communication
– Worthwhile educational experience embedded within a Community of Inquiry which is composed of students and teachers
– Learning within a community through the interaction of cognitive presence, social presence, and teaching presence
Focuses of intervention: student interaction and instructor facilitation in online discussion

Social-critical model (Subotzky & Prinsloo, 2011)
Key elements:
– Interactions between students and institutions throughout students' journey
– Students' and institutions' capital and habitus
– Domains and modalities of transformation
Focuses of intervention: student support; assessment; admission
Table 2.2 Intervention on learning issues

1980s
Student problems with the learning package
– Providing academic support at study centres and residential schools (Kember & Dekkers, 1987)

1990s
Improvement of student attendance
– Providing extra course materials and more clarification in tutorials, and encouraging student feedback (Stevenson et al., 1996)
Improvement of telephone consultation
– Using a voice mail system to provide answers for commonly asked questions (Carmichael, 1995)

2010s
Opportunities for practising language skills
– Pairing up students with peers who are native or highly competent speakers (Jilg & Southgate, 2017)
Improvement of students' computer skills
– Providing additional instruction on self-regulated learning strategies, and additional learning tasks for students (Tsai & Tsai, 2017)
Improvement of student resubmissions
– Providing support and guidance on writing an improved assessment (Pinchbeck & Heaney, 2017)
Learning Issues

Table 2.2 provides examples of intervention in various time periods which address learning issues. The issues range from problems with learning packages in the 1980s, and student attendance and telephone consultation in the 1990s, to improving students' language and computer skills and the resubmission of assignments in the 2010s. The intervention strategies over the years suggest a shift from relying primarily on instructors' own efforts to involving other students and leveraging technology as resources.
Learning Needs

Table 2.3 presents examples of intervention to address the long-standing issue of students' diverse learning needs. Intervention strategies have, by and large, offered a variety of options (for example, communication channels, learning materials, and instructional methods) to allow students to choose their preferred approach. In the 2010s, an attempt was made to use computational intelligence algorithms to analyse learner data and identify learning styles, with the aim of providing customised support services and a personalised learning environment.
Table 2.3 Intervention on learning needs

Issue: Meeting students' diverse learning needs

1980s
– Using multiple communication channels for the delivery of course information (Ostman & Wagner, 1987)

2000s
– Using multimedia materials and various instructional methods, as well as providing multiple activities or assignments for students to select their preferred ones (Chyung, 2001)

2010s
– Making available and accessible a variety of support for students to choose (Lee et al., 2011)
– Using intelligence algorithms to identify students' learning styles and personalise their learning environment (Bernard et al., 2017)
Interpersonal Interaction

Table 2.4 provides an overview of intervention approaches related to interpersonal interaction. Issues of enhancing contact with instructors and among peers have been consistent over the periods. The table shows that the focus of interventions has changed from providing more opportunities and channels for interaction in the early years to promoting student participation in online communication most recently. There has also been a trend for the emerging use of new online channels, such as social media.

Table 2.4 Intervention on interpersonal interaction

1980s
Increasing contact (student–instructor and student–student)
– Using a telephone tutoring system for tutors and students to initiate phone calls when needed (Scales, 1984)
– Assisting students to form study groups and assigning selected students as the contact person for communication with other students and the organisation of meetings (Amundsen & Bernard, 1989)

1990s
Increasing contact (student–instructor and student–student)
– Using more electronic communication channels, e.g. telephone answering machines and emails (Brown, 1996)
Enhancing student interaction in online environments
– Providing timely feedback to students, as well as early training on the appropriate etiquette in online communication (Vrasidas & McIsaac, 1999)

2000s
Increasing contact (student–instructor and student–student)
– Instructors making personal contact with students using online channels or the telephone (Chyung, 2001)
– Using the short message service (SMS) to initiate contact with students (Fozdar & Kumar, 2007)
Enhancing student interaction in online environments
– Having student-moderated discussion and letting students determine the discussion topics (Durrington & Yu, 2004)

2010s
Increasing contact (student–instructor and student–student)
– Increasing instructors' accessibility in office, and by email, telephone and online forum (Netanda et al., 2019)
– Building rapport with students by video updates, personalised feedback on assignments and online discussion, and personal emails (Glazier, 2016)
Enhancing student interaction in online environments
– Using online social networks to facilitate interaction (Thoms & Eryilmaz, 2014)
Table 2.5 Intervention on online discussion

Issue: Increasing student participation in online discussion

1990s
– Instructors posting frequently and promptly in response to students' messages; giving participation marks to students for online discussion; using the discussion record as a basis for assignment; and giving deadlines for students to participate in discussion (Bullen, 1998)

2000s
– Instructors designing discussion activities with clear goals and making students aware of such goals; providing guidelines to allow students to take different perspectives and share their views; maintaining a moderate presence in discussion boards; and providing students with regular feedback (Dennen, 2005)

2010s
– Giving students access to analytics results of online discussion for them to set personal goals for participation and monitor such goals (Wise et al., 2014)
Online Discussion

Table 2.5 presents examples of intervention seeking to improve student participation in online discussion. Intervention strategies before the 2010s mainly involved the instructors' planning of the discussion as well as feedback and guidance to students. In the past decade, there have been initiatives to give students access to analytics reflecting their participation in discussions for reference.
Collaborative Learning

Table 2.6 illustrates examples of intervention for enhancing collaborative learning. Early interventions focused more on increasing the interactivity among students so as to promote their collaboration. From the 2000s onwards, interventions concentrated on enhancing students' online collaborative learning. Data-mining tools have been used to assist instructors in addressing students' collaboration problems.
Table 2.6 Intervention on collaborative learning

Issue: Enhancement of students' collaborative learning

1990s
– Putting students into groups for discussion; giving learner control by allowing peer moderation and students to choose discussion topics themselves (Cifuentes et al., 1997)

2000s
– Assigning students randomly to collaborative learning groups (Bernard & Lundgren-Cayrol, 2001)
– Monitoring student collaboration with computer tools, so that instructors can offer timely assistance, adjust instructional strategies to enhance students' group learning, and communicate with students to help resolve their collaboration problems (Casamayor et al., 2009)

2010s
– Increasing students' awareness of the collaboration problem; preparing students for collaboration and teachers for facilitation; allowing students to practise with feedback given (Saqr et al., 2018)
Table 2.7 Intervention on student engagement

Issue: Enhancement of student engagement

1980s
– Providing a prompt response to students' assignments (Rekkedal, 1983)

1990s
– Providing a study guide to assist students' learning; stating clearly the lesson requirements and objectives (Tallman, 1994)
– Using computer tools to facilitate instructors to give effective assessment feedback (Stark & Warne, 1999)

2000s
– Using SMS to provide students with timely communication and support; support for micro-learning to deliver smaller units of course content (Fozdar & Kumar, 2007)

2010s
– Monitoring student engagement and identifying disengaged students with the aid of LMS data, and giving proactive contact to such students to provide personalised support (Rienties et al., 2016)
Student Engagement

Table 2.7 presents examples of intervention for enhancing student engagement. The early interventions were mainly confined to instructor feedback on student assignments. Later there were greater efforts to connect with students, and monitor their study progress and engagement, so that disengaged students could be identified early and personalised support given.
Student Dropout

Table 2.8 provides examples of intervention to address student dropout, which has long been a primary concern for ODL institutions. There has been a major evolution in approach from the 2000s onwards, with an increasing emphasis on proactive intervention.
Table 2.8 Intervention on student dropout

1980s
Reduction of student dropout
– Minimizing the turn-around time for student assignments (Taylor, 1986)
– Providing students with telephone tutoring for checking their progress and offering guidance or support (Scales, 1984)

1990s
Reduction of student dropout
– Improvement of instructors' contact with and support to students (Brown, 1996)

2000s
Reduction of student dropout
– Improvement of students' learning experience along their "learning journey" (Tresman, 2002)
Proactive improvement of student persistence
– Making telephone contact with each student for providing support, guidance and encouragement weeks before assignment submission (Gibbs et al., 2007)
– Using dropout prediction to intervene more promptly (Lykourentzou et al., 2009)

2010s
Reduction of student dropout
– Coordinating online dialogues with students and keeping abreast of their progress, in order to provide timely and individualised feedback (Donnelly & Kovacich, 2014)
Proactive improvement of student persistence
– Sending automated welcome emails to students to encourage early login to the course webpage (Smith et al., 2012)
– Using learning analytics to predict the probabilities of students' success and persistence, so as to take proactive and personalised actions in forms such as telephone contact and emails (Lawson et al., 2016)
This area has particularly been the focus of advances in learning analytics for identifying potential at-risk students. Interventions have been focused mainly on improving students' learning experiences, enhancing interaction, and providing a variety of learning support. The focus on personalisation has also been increasingly emphasised. The issues addressed in the intervention practices share similar focuses with the models and theories reviewed above. For example, many of the intervention practices addressed students' integration into the academic system of an institution and social relationships with peers and teachers, which have been emphasised in the student integration model (Tinto, 1975) and the community of inquiry model (Garrison et al., 2000). In this regard, those models and theories could serve as a theoretical foundation for relevant intervention practices.
The Potential of Learning Analytics

Based on intervention practices in ODL over the years, this section discusses the potential of learning analytics for addressing the need for intervention in this context.
Changing Modes of ODL with Technological Advances

The types of issue addressed by intervention practices are closely related to the changing modes of ODL. Early issues, such as those involving tutorials and telephone consultation, were largely related to the correspondence mode of distance education. With the emergence of computer-mediated communication and online learning, more efforts were directed toward topics related to online discussion and collaborative learning. The technological advances in ODL have helped to resolve some issues which had lasted for decades, such as the limited contact between students and instructors.

Meanwhile, the development of online and mobile study modes, as well as relevant technologies, has enabled intervention to be better supported with learning analytics, in terms of the availability of a wide range of data generated from online and mobile activities and advances in data mining and analytics techniques (Prinsloo et al., 2015). The identification of intervention needs has been made easier, more efficient and more timely, with the aid of analytical approaches such as the monitoring of students' real-time data. Such a development has emerged not only in ODL but also in conventional face-to-face learning. Their converging learning environments have made it possible for learning analytics solutions developed in face-to-face learning (Li & Wong, 2020a, 2020b) to also be applicable in ODL.
Cost-Effectiveness of Intervention

Cost-effectiveness and resource allocation have been major considerations in intervention practices (Li et al., 2018; Prinsloo et al., 2018). Earlier interventions, such as study centres and telephone calls, are known to incur high costs for ODL institutions. The electronic communication channels increasingly used in later interventions, such as SMS, emails, and online discussion boards, are in general less costly and able to reach a large number of recipients efficiently. Their growing popularity and the different ways to reach and interact with recipients have facilitated the formulation of various intervention strategies, such as building rapport with students, which does not involve high costs (Glazier, 2016). It is also fair to point out that large-scale electronic communications are not always opened by the intended recipients, so whilst the delivery is cost-effective, the outcomes may not always be.

The improvement in cost-effectiveness is also seen through the development of proactive intervention strategies supported by learning analytics for early identification of students' needs. It would be too late to be effective, and more costly, to intervene after students have decided to withdraw (Gibbs et al., 2007). Also, as Simpson (2004) contends, "one way of ensuring that interventions are as cost-effective as possible is to target them at students most likely to drop out rather than those most likely to succeed" (p. 85).
In this regard, the use of learning analytics to identify specific target groups for intervention not only helps to address the issue of cost-effectiveness, but also reflects the responsibility and obligation of ODL institutions to act with those groups who may need help (Prinsloo & Slade, 2017, 2018).
Personalisation in Intervention

The rising trend of personalisation in contemporary education can also be seen in intervention practices. Personalised learning and intervention have been made possible by learning analytics, in terms of time, place, content and method (Li & Wong, 2020c). Learning analytics serves to provide instructors with key information for deriving intervention plans, and assists students in adjusting their learning strategies and behaviours. Examples of personalisation in learning analytics interventions include the analysis of students' learning styles (Bernard et al., 2017) and learning behaviours (Choi et al., 2018) so that interventions can be tailored to meet their specific needs and preferences. There has also been the generation of personalised feedback on students' assignments (Glazier, 2016) and on their engagement in online discussion (Wise et al., 2014) to support their self-monitoring and adjustment. The identification of disengaged students (Rienties et al., 2016) and the prediction of students' success and persistence (Lawson et al., 2016) have also been seen. These various types of intervention reveal the common features of learning analytics: processing data, analysing patterns and trends, and generating reports and predictions.
Summary and Future Work

This chapter has highlighted the role and potential of learning analytics for intervention in the ODL context. The review findings suggest the development over the past decade of an emerging approach to intervention which is driven and supported by learning analytics. The findings have also highlighted ongoing trends in intervention practice in ODL, which cover the changing modes of ODL with technological advances, the cost-effectiveness of intervention, and personalisation in intervention. Despite such developments, it should be noted that intervention remains one of the most challenging areas in learning analytics (Rienties et al., 2017), and future areas of focus to address these challenges are recommended below:

• The effective practice of learning analytics intervention relies heavily on advances in human-algorithmic interaction (Prinsloo, 2019), in particular instructors' appropriate involvement in the analytics practice and interpretation of the analytics results to decide whether, when and how to intervene, in light of their experiences with and knowledge of each student. The professional development of their relevant knowledge and skills is key for the success of learning analytics intervention.
• While relevant models and theories were proposed to guide intervention practice in the early years, their relationship with and application to interventions supported by learning analytics have yet to be examined. Further work in this area will offer a better theoretical foundation for learning analytics intervention and insights into how conventional types of intervention could be adapted to the latest ODL environments enhanced by learning analytics.
• Appropriate measures should be developed to evaluate the effectiveness of learning analytics interventions, given that not every intervention will have positive effects (Kaliisa et al., 2021). Monitoring and evaluating the effects of intervention will enable timely adjustments or refinements to maximise the benefits of the intervention.
References Amundsen, C. L., & Bernard, R. M. (1989). Institutional support for peer contact in distance education: An empirical investigation. Distance Education, 10(1), 7–27. Bernard, J., Chang, T. W., Popescu, E., & Graf, S. (2017). Learning style identifier: Improving the precision of learning style identification through computational intelligence algorithms. Expert Systems with Applications, 75, 94–108. Bernard, R. M., & Lundgren-Cayrol, K. (2001). Computer conferencing: An environment for collaborative project-based learning in distance education. Educational Research and Evaluation, 7(2–3), 241–261. Brown, K. M. (1996). The role of internal and external factors in the discontinuation of off-campus students. Distance Education, 17(1), 44–71. Bullen, M. (1998). Participation and critical thinking in online university distance education. Journal of Distance Education, 13(2), 1–32. Carmichael, J. (1995). Voice mail and the telephone: A new student support strategy in the teaching of law by distance education. Distance Education, 16(1), 7–23. Casamayor, A., Amandi, A., & Campo, M. (2009). Intelligent assistance for teachers in collaborative e-learning environments. Computers & Education, 53(4), 1147–1154. Choi, S. P. M., Lam, S. S., Li, K. C., & Wong, B. T. M. (2018). Learning analytics at low-cost: At-risk student prediction with clicker data and systematic proactive interventions. Educational Technology & Society, 21(2), 273–290. Chyung, S. Y. (2001). Systematic and systemic approaches to reducing attrition rates in online higher education. American Journal of Distance Education, 15(3), 36–49. Cifuentes, L., Murphy, K. L., Segur, R., & Kodali, S. (1997). Design considerations for computer conferences. Journal of Research on Computing in Education, 30(2), 177–201. Dennen, V. P. (2005). From message posting to learning dialogues: Factors affecting learner participation in asynchronous discussion. Distance Education, 26(1), 127–148. Donnelly, W., & Kovacich, J. (2014). A phenomenological investigation of the problem of adult student attrition in community college online courses. The Exchange, 3(1), 34–43. Durrington, V. A., & Yu, C. (2004). It’s the same only different: The effect the discussion moderator has on student participation in online class discussions. Quarterly Review of Distance Education, 5(2), 89–100.
Fozdar, B. I., & Kumar, L. S. (2007). Mobile learning and student retention. International Review of Research in Open and Distance Learning, 8(2), 1–18. Garrison, D. R., Anderson, T., & Archer, W. (2000). Critical inquiry in a text-based environment: Computer conferencing in higher education. The Internet and Higher Education, 2(2–3), 87–105. Gibbs, G., Regan, P., & Simpson, O. (2007). Improving student retention through evidence based proactive systems at the Open University (UK). Journal of College Student Retention, 8(3), 359–376. Glazier, R. A. (2016). Building rapport to improve retention and success in online classes. Journal of Political Science Education, 12(4), 437–456. Holmberg, B. (1983). Guided didactic conversation in distance education. In D. Sewart, D. Keegan, & B. Holmberg (Eds.), Distance education: International perspectives (pp. 114–122). Croom Helm. Jilg, T., & Southgate, M. (2017). Students helping students: A learning partnership initiative for distance language learners. The Language Learning Journal, 45(2), 245–262. Kaliisa, R., Kluge, A., & Mørch, A. I. (2021). Overcoming challenges to the adoption of learning analytics at the practitioner level: A critical analysis of 18 learning analytics frameworks. Scandinavian Journal of Educational Research. https://doi.org/10.1080/00313831.2020.1869082 Kember, D. (1989). A longitudinal-process model of drop-out from distance education. The Journal of Higher Education, 60(3), 278–301. Kember, D., & Dekkers, J. (1987). The role of study centres for academic support in distance education. Distance Education, 8(1), 4–17. Khalil, M., & Ebner, M. (2015). Learning analytics: Principles and constraints. In Proceedings of World Conference on Educational Multimedia, Hypermedia and Telecommunications 2015 (pp. 1326–1336). AACE. Lawson, C., Beer, C., Rossi, D., Moore, T., & Fleming, J. (2016). Identification of 'at risk' students using learning analytics: The ethical dilemmas of intervention strategies in a higher education institution. Educational Technology, Research and Development, 64(5), 957–968. Lee, S. J., Srinivasan, S., Trail, T., Lewis, D., & Lopez, S. (2011). Examining the relationship among student perception of support, course satisfaction, and learning outcomes in online learning. Internet and Higher Education, 14(3), 158–163. Li, K. C., & Wong, B. T. M. (2020a). Trends of learning analytics in STE(A)M education: A review of case studies. Interactive Technology and Smart Education, 17(3), 323–335. Li, K. C., & Wong, B. T. M. (2020b). The use of student response systems with learning analytics: A review of case studies (2008–2017). International Journal of Mobile Learning and Organisation, 14(1), 63–79. Li, K. C., & Wong, B. T. M. (2020c). Features and trends of personalised learning: A review of journal publications from 2001 to 2018. Interactive Learning Environments, 29(2), 182–195. Li, K. C., Wong, B. T. M., & Ye, C. J. (2018). Implementing learning analytics in higher education: The case of Asia. International Journal of Services and Standards, 12(3/4), 293–308. Lykourentzou, I., Giannoukos, I., Nikolopoulos, V., Mpardis, G., & Loumos, V. (2009). Dropout prediction in e-learning courses through the combination of machine learning techniques. Computers & Education, 53(3), 950–965. Moore, M. G. (1973). Toward a theory of independent learning and teaching. The Journal of Higher Education, 44(9), 661–679. Moore, M. G. (1993). Theory of transactional distance. In D. Keegan (Ed.), Theoretical principles of distance education (pp. 22–38). Routledge. Netanda, R. S., Mamabolo, J., & Themane, M. (2019). Do or die: Student support interventions for the survival of distance education institutions in a competitive higher education system. Studies in Higher Education, 44(2), 397–414. Ostman, R. E., & Wagner, G. A. (1987). New Zealand management students' perceptions of communication technologies in correspondence education. Distance Education, 8(1), 47–63.
Pinchbeck, J., & Heaney, C. (2017). Case report: The impact of a resubmission intervention on level 1 distance learning students. Open Learning: The Journal of Open, Distance and e-Learning, 32(3), 236–242. Prinsloo, P. (2019). Tracking (un)belonging: At the intersections of human-algorithmic student support. In Proceedings of the Pan-Commonwealth Forum. http://dspace.col.org/handle/11599/3373 Prinsloo, P., Archer, E., Barnes, G., Chetty, Y., & Van Zyl, D. (2015). Big(ger) data as better data in open distance learning. International Review of Research in Open and Distributed Learning, 16(1), 284–306. Prinsloo, P., & Slade, S. (2017). An elephant in the learning analytics room: The obligation to act. In Proceedings of the Seventh International Learning Analytics & Knowledge Conference (pp. 46–55). Prinsloo, P., & Slade, S. (2018). Mapping responsible learning analytics: A critical proposal. In B. H. Khan, J. R. Corbeil, & M. E. Corbeil (Eds.), Responsible analytics & data mining in education: Global perspectives on quality, support, and decision-making. http://oro.open.ac.uk/55827/ Prinsloo, P., Slade, S., & Khalil, M. (2018). Stuck in the middle? Making sense of the impact of micro, meso and macro institutional, structural and organisational factors on implementing learning analytics. In Proceedings of the European Distance and E-Learning Network Annual Conference (pp. 17–20). Rekkedal, T. (1983). The written assignments in correspondence education. Effects of reducing turn-around time. An experimental study. Distance Education, 4(2), 231–258. Rienties, B., Boroowa, A., Cross, S., Kubiak, C., Mayles, K., & Murphy, S. (2016). Analytics4Action evaluation framework: A review of evidence-based learning analytics interventions at the Open University UK. Journal of Interactive Media in Education, 1(2), 1–11. Rienties, B., Cross, S., & Zdrahal, Z. (2017). Implementing a learning analytics intervention and evaluation framework: What works? In B. K. Daniel (Ed.), Big data and learning analytics in higher education (pp. 147–166). Springer. Saqr, M., Fors, U., Tedre, M., & Nouri, J. (2018). How social network analysis can be used to monitor online collaborative learning and guide an informed intervention. PLoS ONE, 13(3), 1–22. Scales, K. (1984). A study of the relationship between telephone contact and persistence. Distance Education, 5(2), 268–276. Simpson, O. (2004). The impact on retention of interventions to support distance learning students. Open Learning, 19(1), 79–95. Short, J., Williams, E., & Christie, B. (1976). The social psychology of telecommunications. Wiley. Smith, V. C., Lange, A., & Huston, D. R. (2012). Predictive modeling to forecast student outcomes and drive effective interventions in online community college courses. Journal of Asynchronous Learning Networks, 16(3), 51–61. Stark, S., & Warne, T. (1999). 'Connecting' the distance: Relational issues for participants in a distance learning programme. Journal of Further and Higher Education, 23(3), 391–402. Stevenson, K., Sander, P., & Naylor, P. (1996). Student perceptions of the tutor's role in distance learning. Open Learning, 11(1), 22–30. Subotzky, G., & Prinsloo, P. (2011). Turning the tide: A socio-critical model and framework for improving student success in open distance learning at the University of South Africa. Distance Education, 32(2), 177–193. Tallman, F. D. (1994). Satisfaction and completion in correspondence study: The influence of instructional and student-support services. American Journal of Distance Education, 8(2), 43–57. Taylor, J. C. (1986). Student persistence in distance education: A cross-cultural multi-institutional perspective. Distance Education, 7(1), 68–91. Thoms, B., & Eryilmaz, E. (2014). How media choice affects learner interactions in distance learning classes. Computers & Education, 75, 112–126.
Tinto, V. (1975). Dropout from higher education: A theoretical synthesis of recent research. Review of Educational Research, 45(1), 89–125. Tresman, S. (2002). Towards a strategy for improved student retention in programmes of open, distance education: A case study from the Open University UK. International Review of Research in Open and Distance Learning, 3(1), 1–11. Tsai, M. C., & Tsai, C. W. (2017). Applying online externally-facilitated regulated learning and computational thinking to improve students’ learning. Universal Access in the Information Society, 17, 1–22. Tung, L. C. (2012). Proactive intervention strategies for improving online student retention in a Malaysian distance education institution. MERLOT Journal of Online Learning and Teaching, 8(4), 312–323. Vrasidas, C., & McIsaac, M. S. (1999). Factors influencing interaction in an online course. American Journal of Distance Education, 13(3), 22–36. Wise, A. F., Zhao, Y., & Hausknecht, S. N. (2014). Learning analytics for online discussions: Embedded and extracted approaches. Journal of Learning Analytics, 1(2), 48–71. Wong, B. T. M. (2017). Learning analytics in higher education: An analysis of case studies. Asian Association of Open Universities Journal, 12(1), 21–40. Wong, B. T. M. (2019). The benefits of learning analytics in open and distance education: A review of the evidence. In M. S. Khine (Ed.), Emerging trends in learning analytics: Leveraging the power of education data (pp. 65–81). Brill. Wong, B. T. M., & Li, K. C. (2020). A review of learning analytics intervention in higher education (2011–2018). Journal of Computers in Education, 7(1), 7–28. Wong, B. T. M., Li, K. C., & Choi, S. P. M. (2018). Trends in learning analytics practices: A review of higher education institutions. Interactive Technology and Smart Education, 15(2), 132–154.
Billy Tak-Ming Wong is Senior Research Coordinator at Hong Kong Metropolitan University. He has been teaching and conducting research on technology-enhanced education and translation technology for more than 15 years. His research interests lie in the use of technology in education, open education, learning analytics, as well as language technology. He has been involved in various research projects related to open education, technology-enhanced education, and research capacity development, and has received paper awards at various academic conferences. He has published widely on the use and impacts of technology in education as well as language translation, and has served as an editor for Springer's Education Innovation Series and as a guest editor for journals such as the International Journal of Mobile Learning and Organisation and Interactive Technology and Smart Education.
Chapter 3
A Global South Perspective on Learning Analytics in an Open Distance E-learning (ODeL) Institution Angelo Fynn, Jaroslaw Adamiak, and Kelly Young
Introduction

In this chapter we provide a perspective on learning analytics within a South African distance education institution. The majority of scholarship on learning analytics is derived from residential institutions and, we would like to point out, predominantly from WEIRD (Western, Educated, Industrialised, Rich and Democratic) countries (Gašević, 2018). The provision of distance education and e-learning on the African continent faces a number of significant challenges (de Hart & Venter, 2013; Fynn, 2016; Liebenberg et al., 2012). It therefore stands to reason that the measurement, prediction and analysis of learning, together with the implementation of learning analytics, would similarly face challenges. The need for reflection on the development and implementation of learning analytics within a Global South context, what Prinsloo (2018) refers to as "making space", is therefore an important step prior to the entrenchment of learning analytics within these teaching and learning spaces. While this chapter is a representation of learning analytics within this context, it is not representative of the Global South experience as a whole. In the section below we further elaborate on the specific context of this chapter.
The original version of this chapter was revised: Missing text has been added in "The Resulting (Learning) Analytics" section. The correction to this chapter is available at https://doi.org/10.1007/978-981-19-0786-9_10 A. Fynn (B) · J. Adamiak · K. Young University of South Africa, Pretoria, South Africa e-mail: [email protected] © The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd. 2022, corrected publication 2022 P. Prinsloo et al. (eds.), Learning Analytics in Open and Distributed Learning, SpringerBriefs in Open and Distance Education, https://doi.org/10.1007/978-981-19-0786-9_3
Context

Gašević (2018) locates the adoption of learning analytics in the Global South within three key challenges, namely quality, equity and efficiency. These key challenges are also present within the context in which this chapter is situated and are further elaborated on below.

The South African basic education system is marked by inequitable access to quality schooling, with "no fee" schools serving the majority of students while receiving far fewer resources than the private schools that cater to the minority of students (Rogan & Reynolds, 2016). The difficulties faced by the basic education sector have a knock-on effect for tertiary education, in that they result in low participation and low throughput rates in higher education (Rogan & Reynolds, 2016). In terms of efficiency, the low throughput rate of the South African education system, which sits at 25% for contact institutions and 15% for distance education institutions, provides an indicator of the inefficient output of graduates (Department of Higher Education & Training, 2019).

The question of teaching and learning methods within this context is also pertinent to the purpose of the chapter. At the institution in question, progress toward online learning has been sporadic due to institutional infrastructure, inadequate student access to stable, reliable and fast internet, and the low level of digital literacy of both staff and students. A particular challenge in this regard is the blended learning approach followed by the institution, which still relies predominantly on correspondence teaching, with some face-to-face engagement in tutorial support, and which in turn leads to a data-poor environment for learning analytics. Better quality data are collected from students who have the socio-economic resources to access the ubiquitous internet available to those who can afford it, while students who do not have this access have data records with scant information on their learning processes outside of formative or summative assessment data. These inequalities in access further deepen the socio-economic divide, and a decontextualised learning analytics system runs the risk of entrenching these divides in data and policy.

Despite the challenges mentioned above, it is possible to develop learning analytics systems that, by considering the contextual challenges, are able to sketch a picture of learning at the institution. In this chapter we also depart from the position that data lie on a continuum ranging from poor to rich and that it remains possible to develop analytics almost anywhere along this continuum. We also depart from the position that institutions evolve constantly and that learning analytics can be a catalyst for developing the richness of data at an institution, provided that the value and utility of learning analytics are made apparent to the stakeholders at the institution. At Unisa, while the breadth of data is poor (that is, the number of variables for which data are available), the volume of data on the variables we do have is high. We therefore view the institution as somewhere midway along the continuum between data poor and data rich. With this data we were able, with reference to the context outlined below, to develop a set of descriptive analytics tailored for the institution.
Evolving Student Profiles

Though concern about student success and retention is not a new phenomenon, with research dating back decades (Astin, 1964, 1972; Cope & Hewitt, 1969; Iffert, 1958; Pantages & Creedon, 1978; Rossmann & Kirk, 1970), higher education institutions (HEIs) have struggled to significantly reduce the revolving door syndrome1 (Slade & Prinsloo, 2015). Nowhere is this challenge more pressing and topical than in South Africa, and at the University of South Africa (UNISA) in particular—which not only attracts one-third of all students enrolled in higher education in the country, but also increasingly attracts traditional students.2 This is not to say that non-traditional students are not entering UNISA's system, but rather that UNISA's enormous student profile (ca. 340,000 students) is undergoing substantial changes, with a body of evidence to suggest UNISA increasingly attracts younger, full-time students with no major occupational, social and/or family commitments (Council on Higher Education, 2019; Department of Higher Education & Training, 2019; Fynn et al., 2019)—a distance education trend reported over two decades ago by Holmberg (1995). The challenge associated with this evolving profile, particularly at this distance education institution, rests on identifying accurate (and reliable) definitions and indicators of success and retention—not only for non-traditional student populations (who begin, interrupt, and complete their studies as it suits them or as work, health, and/or family conditions permit; Holmberg, 1995), but for the traditional student populations at the institution as well, who wish to pursue full-time studies.
Learning Analytics

Although not proposed as the panacea for student success and throughput, the emergence of educational data mining, which describes and predicts patterns in educational settings (Lagman & Ambat, 2015), provides new tools to address the issues at hand (Acharya & Sinha, 2014; Jayaprakash et al., 2014). These data-driven approaches provide the prospect of more justifiable (and less risk-prone) choices that HEIs can take to achieve their envisioned retention and success goals.
1 Where students have unlimited access to the system but often without success.
2 A traditional student is commonly defined as a younger student, under the age of 25 years, often enrolling directly from high school, and attending university full-time with no major life or work responsibilities (e.g., full-time job or dependents) (Daiva, 2017; Dimmick, 2013; Tilley, 2014). A non-traditional student, by contrast, is a student who is often older, a commuter student and attends university or college on a part-time basis (due to occupational, social and/or family commitments) (Holmberg, 1995; Kasworm, 1990). While the distinction is clear, it must be noted that 'traditional' students entering UNISA's systems do not have access to full-time, face-to-face classes, and residences.
Fig. 3.1 Gartner analytic ascendancy model. Source Boyer and Bonnin (2016)
Lying at the heart of educational data mining are learning analytics and machine learning,3 with the latter defined as the process of learning a set of rules from instances or creating a classifier that can be used to generalise from new instances (Kotsiantis, 2007). The former, learning analytics, defined as "the measurement, collection, analysis, and reporting of data about learners and their contexts, for purposes of understanding and optimizing learning and the environments in which it occurs" (Long & Siemens, 2011, p. 32), enables HEIs to utilise vast data reservoirs to the potential benefit of students, instructors/course developers and support staff alike. All these aspects, which are transversal to analytics in general (Boyer & Bonnin, 2016), are well synthesised by the Gartner Analytic Ascendancy Model, as shown in Fig. 3.1.

The Gartner Analytic Ascendancy Model, shown above, illustrates the relationship between four sequential categories of analytics: descriptive, diagnostic, predictive, and prescriptive. The first and simplest step, descriptive analytics, uses historical data to answer questions around what happened (Detrick & Vipond, 2016). These analytics provide researchers with hindsight views of the research problem. Diagnostic analytics, by extension, builds on the answers that were formed during the first step, but uses various statistical methods to formulate and test hypotheses surrounding why things occurred. While the preceding types of analytics hinge on what has already happened (and why), predictive analytics focuses on what will happen or what is likely to happen in the future (Detrick & Vipond, 2016). During this third step, probabilities of the occurrence of a specific event are derived and analysed. The fourth and final step of Gartner's Analytic Ascendancy Model is predicated on prescriptive analytics, which proposes favourable outcomes and recommends courses of action needed to achieve those outcomes (de Jong, 2019).

3 While learning analytics and machine learning are fundamentally different processes, increasingly machine learning is being seen as a part of learning analytics.
In this regard, prescriptive analytics provides guidance around discrete decisions that can be taken to ensure beneficial results or mitigate potential threats (such as student attrition/dropout) (Detrick & Vipond, 2016).

With the aforementioned in mind, it seems plausible to suggest that reaching the third step in Gartner's Analytic Ascendancy Model, i.e. predictive analytics, has the potential to play an important role in determining and understanding the successful pathways of both traditional and non-traditional students at UNISA. Moreover, such an analytical approach has the ability to investigate the correlation between multiple variables and student success, as well as complex interactions among those variables, providing an increasingly nuanced view of student outcomes among these diverse student profiles (Collins, 2017).
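To make the four layers of the model concrete, the sketch below walks a toy set of student records through descriptive, diagnostic, predictive and prescriptive questions. The data, thresholds and the support-routing rule are invented assumptions for illustration; they are not UNISA data and not the authors' implementation.

```python
# Toy illustration of the four Gartner layers on hypothetical student records.
# All data, thresholds and rules here are invented for demonstration purposes.
import pandas as pd
from sklearn.linear_model import LogisticRegression

df = pd.DataFrame({
    "matric_avg": [62, 71, 55, 80, 68, 90, 58, 75],
    "modules":    [4, 6, 8, 3, 5, 2, 7, 4],
    "employed":   [1, 1, 0, 0, 1, 0, 1, 0],
    "passed":     [0, 1, 0, 1, 1, 1, 0, 1],   # outcome label
})

# 1. Descriptive: what happened?
print("Overall pass rate:", df["passed"].mean())

# 2. Diagnostic: why did it happen? (a simple association check)
print("Correlation of matric average with passing:",
      round(df["matric_avg"].corr(df["passed"]), 2))

# 3. Predictive: what is likely to happen for a new student?
model = LogisticRegression().fit(df[["matric_avg", "modules", "employed"]],
                                 df["passed"])
new_student = pd.DataFrame({"matric_avg": [65], "modules": [6], "employed": [1]})
p_pass = model.predict_proba(new_student)[0, 1]
print("Predicted probability of passing:", round(p_pass, 2))

# 4. Prescriptive: what should be done? (an invented support-routing rule)
action = "proactive tutor phone call" if p_pass < 0.5 else "routine email check-in"
print("Recommended action:", action)
```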
Case Study for Predictive Analytics at UNISA

Within the context described in the preceding sections, the authors undertook to develop a predictive model using learning analytics to explain and predict student success at UNISA. The authors drew on UNISA's existing socio-critical framework by Subotzky and Prinsloo (2011), the key student attributes mentioned therein (i.e., self-efficacy, locus of control, and attribution styles) and the available institutional data. Overall, the intention was to develop a predictive model from institutional datasets that could later be translated into a near real-time deployment for learning analytics.
Scoping the Available Data and Choosing an Analytical Technique

Available Data

When scoping the available data, it became apparent that UNISA's databases do not contain institutional data relating to the self-efficacy, locus of control, and attribution styles of its students. Although a couple of studies have been conducted at UNISA which touch on these aspects (see Liebenberg & Van Zyl, 2012, 2014), the data are somewhat dated and only pertain to approximately 275 students from the 2011 academic year. That being said, the authors requested the available demographic data together with data relating to registration (number of modules, previous matric4 performance). The data were predominantly categorical in nature, and included the following: gender, age, ethnicity, home language, employment status, disability status, matric performance, and workload.5
4 In South Africa, "matric" (otherwise known as "matriculation") is a term commonly used to refer to the final year of high school and the qualification received upon graduating from high school.
The final dataset was prepared by transforming variables into fewer categories, such as age and home language. New variables were also calculated based on the requirements of the case study—for example, a proxy for student outcomes was computed by summing students' final marks on all modules and obtaining a mean. The mean was then recoded to represent the varying levels of the dependent variable (pass or fail). The same process was followed when the students' matric marks were processed.
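As a rough illustration of this preparation step, the following sketch shows how such a pass/fail proxy might be derived with pandas. The column names, the 50% cut-off and the sample values are assumptions for illustration, not the variables or thresholds used by the authors.

```python
# Minimal sketch of the outcome-proxy preparation described above.
# Column names, the 50% cut-off and the sample marks are illustrative assumptions.
import pandas as pd

marks = pd.DataFrame({
    "student_id": [1, 1, 1, 2, 2, 3, 3, 3],
    "module":     ["A", "B", "C", "A", "B", "A", "B", "C"],
    "final_mark": [55, 62, 48, 40, 45, 70, 66, 72],
})

# Mean final mark per student across all registered modules ...
outcome = (marks.groupby("student_id")["final_mark"]
                .mean().rename("mean_mark").reset_index())

# ... recoded into the dependent variable (pass/fail), assuming a 50% pass mark.
outcome["outcome"] = (outcome["mean_mark"] >= 50).map({True: "pass", False: "fail"})

# Categorical predictors can be collapsed into fewer groups in the same spirit,
# e.g. banding a numeric age variable into broad categories.
students = pd.DataFrame({"student_id": [1, 2, 3], "age": [19, 34, 27]})
students["age_band"] = pd.cut(students["age"], bins=[0, 24, 34, 120],
                              labels=["under 25", "25-34", "35+"])

prepared = students.merge(outcome, on="student_id")
print(prepared)
```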
An Analytical Technique

Among various analytical techniques, Yasmin (2013) suggests that classification trees offer the best approach for prediction when dealing with student data that are predominantly categorical in nature. Classification trees explain the variation of a single response variable by repeatedly splitting the data into more homogeneous groups that are used for interactive exploration, description and prediction of patterns and processes by employing combinations of explanatory variables (Yasmin, 2013). This approach has several advantages, namely (1) flexibility in handling a broad range of response types, including numeric, categorical, ratings, and survival data, (2) invariance to monotonic transformations of the explanatory variables, (3) ease and robustness of construction, (4) ease of interpretation, and (5) the ability to handle missing values in both response and explanatory variables (Yasmin, 2013). Considering these advantages, it was adopted as the focal method in the current case study.

The Java implementation of the C4.5 algorithm6 was used to gauge the role of various factors as contributors to student success. These algorithms take a predefined "label", in this case whether a student has passed or failed, and then try to classify each (potentially) contributing factor to either outcome. This is useful not only because it produces a usable decision tree from which student success can be predicted at an individual level, but also because it paints a picture of which known factors play a significant role in that outcome.
5 Operationalised as the number of modules registered for (in total).
6 Developed by Ross Quinlan, C4.5 is an algorithm used to generate decision trees.
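For readers unfamiliar with this family of algorithms, the sketch below trains a small decision tree on hypothetical pass/fail records. It uses scikit-learn's CART-style tree with an entropy criterion as a stand-in for the Java C4.5 implementation referred to above, so it is an analogue rather than a reproduction; the features, encoding and data are invented for illustration.

```python
# Illustrative analogue only: a small decision tree on invented student records.
# scikit-learn's tree (CART with an entropy criterion) stands in for the Java C4.5
# implementation mentioned in the text; features and data are hypothetical.
import pandas as pd
from sklearn.tree import DecisionTreeClassifier, export_text

data = pd.DataFrame({
    "matric_avg": [91, 78, 83, 74, 69, 88, 72, 95, 60, 77],
    "modules":    [3, 6, 5, 8, 7, 4, 9, 2, 10, 5],
    "employed":   [0, 1, 0, 1, 1, 0, 1, 0, 1, 0],
    "outcome":    ["distinction", "pass", "pass", "fail", "fail",
                   "pass", "fail", "distinction", "fail", "pass"],
})

X = data[["matric_avg", "modules", "employed"]]
y = data["outcome"]

# Entropy-based splitting approximates C4.5's information-gain criterion.
tree = DecisionTreeClassifier(criterion="entropy", max_depth=3, random_state=0)
tree.fit(X, y)

# Inspect the induced rules, analogous to reading the nodes of Fig. 3.2.
print(export_text(tree, feature_names=list(X.columns)))

# Classify a new (hypothetical) student.
new = pd.DataFrame({"matric_avg": [76], "modules": [6], "employed": [1]})
print("Predicted outcome:", tree.predict(new)[0])
```

Reading the printed rules top-down mirrors how the chapter interprets its decision tree: each split names a variable and a threshold, and each leaf carries the predicted outcome for students who satisfy that chain of conditions.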
The Resulting (Learning) Analytics

Figure 3.2 below presents the decision tree for the 2008 cohort7 across all the colleges. Results from the decision tree revealed that the root node was a student's matric average. This is not surprising considering the spectrum of empirical research which has suggested a positive relationship between prior academic performance in matric and student success in South African HEIs (Eiselen & Geyser, 2004; Koen, 2007; Lourens & Smit, 2003; Scott et al., 2007; Sibanda & Lourens, 2003). In this regard, Visser and Hanslo (2005) have noted that performance on the South African matriculation certificate serves as the primary gatekeeper to success at university.

While one cannot deny the importance of previous academic performance in determining the successful pathways of students in higher education, the points at which the nodes split in the current case study are contentious and require cautious interpretation. For example, students who achieved an average above 90.5% in matric are likely to pass their studies with distinction. While this finding seems obvious, the second node where students are split is at an average of 77%. From there, the split—students who obtained an average between 81.9% and 86.7% in matric—appears to be problematic to classify from a conceptual viewpoint. For instance, students who achieve more than 86% are likely to fail while those who achieved 86.7% are likely to pass. Logically, the distinction between these passing grades is marginal (i.e. 0.7%) and it is questionable whether we can assume a significant difference between the two categories. The second issue is that the percentages at which the nodes occur only account for the top 1% (i.e. 75% and up) of performers on the matric examination.

Following the matric aggregates, the subsequent nodes were split based on workload, occupation, and gender, sequentially. Students who achieved less than or equal to 80.6% and who, in total, have enrolled for more than 41 modules over an eight-year period are likely to fail, while students with more than 34 modules but fewer than 41 modules are likely to pass. Students who have fewer than seven modules are then split by occupation, where full-time students with a matric average above 79% are likely to pass while those with a matric average below 79% are likely to fail. Employed students within the same module range are further split by gender, with females likely to pass and males likely to fail. Among students who are unemployed, females are likely to pass while males are likely to pass with distinction. This is shown in Fig. 3.2 below.

For those students who achieved less than 77%, an alternate branch of nodes presents factors for predicting success. Students with an average matric score less than or equal to 74.9% are likely to fail. At this particular node we encounter another of the limitations of the decision tree for predicting student success, since the node for students with an average less than or equal to 74.9% has the largest number (61,970) of students clustered within it. Attempts to further segment this node by adjusting the parameters of the analysis, such as the information gain threshold, produced no result, leading to the conclusion that the algorithm cannot isolate distinguishing variables within this group.
7 The rationale behind the choice of cohort was to allow for the maximum possible time for students to complete their 360- and 480-credit qualifications (i.e. eight and ten years, respectively) and therefore provide as complete data as possible (see UNISA, 2011).
Fig. 3.2 Model decision tree built from the 2008 cohort
leading to the conclusion that the algorithm cannot isolate distinguishing variables within this group. Students who achieved above 74.9% and who have more than 35 modules are more likely to pass. Students within the same performance range but who have fewer than 30 modules are further split by gender. Males who obtained above 74.9% (on average, in matric) and who have enrolled in more than 30 modules (in total) are likely to pass. Female students' success is further dependent on the college within which they are registered. Female students in the College of Economic and Management Sciences are likely to pass, while those registered in the College of Accounting Sciences are further influenced by their home language. Those who have English as their primary language are likely to pass, while those in the College of Accounting Sciences who have a language other than English as their primary language are likely to pass with distinction, an interesting finding considering that the language of teaching and learning (LoTL) at UNISA is English (University of South Africa, 2016). Students who have achieved above 74.9% in matric but who have fewer than 30 modules once again present an example where the logic of the algorithm output can be questioned. Students within this branch are typically registered for fewer than three modules; those with fewer than two modules are likely to pass with distinction, while those with more than two but fewer than three modules are likely to achieve a pass.
Understanding the (Learning) Analytics and Its Limitations

Once we had completed the analysis above, we undertook a stakeholder engagement process within the university to validate the model against the collective experience of the institution. We also gauged the readiness of the institution to accept and utilise the outcomes of the learning analytics process. During these consultations, a number of issues with the analyses became evident. Among these issues were indistinct results within the classification output, an oversimplification in defining the outcome class variable, and the lack of learning management system (LMS) data.
Indistinct Results

While it is acknowledged that decision trees provide a useful heuristic for handling trends within large datasets, the interpretation of these trees requires caution. Moreover, utilising decision trees as consistent business rules can be problematic, as the borders between cases are not always distinct. For instance, in the decision tree shown in Fig. 3.2 above, one of the nodes splits on 86%, where students with more than 86% fail while those with less than or equal to 86.7% pass. The logical and practical implementation of a decision based on this split poses several (contentious) issues.
First and foremost, the results are not in line with the commonly held assumption that higher matric averages predict a higher chance of success (which is generally the case within the tree). Secondly, the practical provision of student support for students within these categories is problematic, as the underlying split is driven by the information gain computed within the algorithm. This does not readily translate into an explicit indication of how to treat marginal cases such as those described here. The point being made is that decision trees should be employed as general guidelines in routing student support rather than as inflexible business rules to be implemented without consideration of factors outside of the data placed within the algorithm.
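One way to act on that point is to treat the tree's output as a probabilistic flag that routes students towards advisory follow-up, rather than as a hard rule. The sketch below is a hypothetical illustration only: the class label "fail", the threshold and the method name are assumptions and not part of the case study.

```java
// Hedged sketch: using a trained tree as a guide for routing support rather than
// as a crisp business rule. The "fail" label and 0.7 threshold are assumed.
import weka.classifiers.trees.J48;
import weka.core.Instance;

public class SupportRouting {
    // Returns true when a student should be referred for advisory follow-up.
    // The instance must carry a reference to its dataset so the class attribute is known.
    static boolean flagForSupport(J48 tree, Instance student) throws Exception {
        double[] dist = tree.distributionForInstance(student);
        int failIdx = student.classAttribute().indexOfValue("fail"); // hypothetical label
        double pFail = dist[failIdx];
        // Marginal cases (probabilities near the boundary) are better left to human
        // judgement than treated as a hard pass/fail split.
        return pFail >= 0.7;
    }
}
```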
Oversimplification of the Outcome Variable

As mentioned above, another issue inherent in the case study was the oversimplification of the defined outcome class (as simply pass, fail, or pass with distinction). At UNISA there is a range of outcomes, some of which include deferred pass, deferred fail, absent from examination, supplementary pass, and supplementary fail. By using an oversimplified outcome class, the model could not accurately represent the pathways of student success at the institution. While the model represented in Fig. 3.2 had a high classification accuracy, its validity comes into question in light of the class outcome definition used by the authors. We believe that this error arose because of the siloed nature of the institution, where the gap between the professional research units on student success and the teaching faculty is notoriously difficult to bridge. Working in isolation from the teaching faculty allowed the authors to develop models based on their own assumptions about the broader institution and the student body.
Lack of LMS Data

Current developments in the field of learning analytics have generated new opportunities to invigorate ODeL by utilising the immense volumes of clickstream data (i.e. LMS data) and learning from it (Zhang et al., 2019). Digital records from the LMS capture student data trails and activity streams, and these "provide valuable insight into what is actually happening in the learning process and suggest ways in which educators can make improvements" (Long & Siemens, 2011, p. 32). The literature on learning analytics is replete with studies which incorporate LMS data to predict student outcomes in distance education settings (Dietz-Uhler & Hurn, 2013; Falakmasir & Habibi, 2010; Macfadyen & Dawson, 2012; Minaei-Bidgoli et al., 2003; Smith et al., 2012). Lamentably, we did not utilise any data pertaining to students' university academic activities as they are captured within the LMS. However, considering the blended learning nature of ODeL at UNISA, this should
be (and will be) included in future analytics. This, and other future directions, are discussed next.
Future Directions

One of the primary lessons we took away from the model development and deployment was that any learning analytics system in the South African ODeL context should be enriched with data from the student and academic systems at the institution, such as LMS data. According to Long and Siemens (2011), without it the analytics constitute academic analytics, and not necessarily learning analytics (see Fig. 3.3 below). However, the aforementioned statement is only true if the argument is based on the scope of the case study, and not the method which the authors applied. The (learning) analytics approach applied here would undergo a similar chain of events when applied at a micro scale, that is, at the course level for use by instructors and students (which, according to Long & Siemens, 2011, is considered learning analytics). Regardless of whether the case study contained herein is considered academic analytics or learning analytics, the inclusion of LMS data is warranted. According to Zhang et al. (2019), analysing the unprecedented volumes of LMS data has the potential to help ODeL practitioners understand the success and retention trajectories of diverse sets of students, such as those currently enrolled at UNISA. A further consideration we shall apply in future analytics is to identify and quantify those institutional processes, procedures and cultural practices that hamper teaching
Fig. 3.3 Learning and academic analytics (Long & Siemens, 2011, p. 34)
and learning at the institution. This consideration is based on the assumption that the university is not apolitical, acultural or free of assumptions that may marginalise or inadvertently prohibit certain groups from effectively participating in the teaching and learning process.
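As a hedged illustration of what such enrichment could look like, the sketch below aggregates a hypothetical export of LMS click logs (assumed CSV columns: studentId, week, clicks) into weekly engagement features that could be appended to the demographic and registration data already used.

```java
// Minimal sketch under assumptions: a CSV export of LMS clickstream events with
// a header row and columns studentId,week,clicks; weeks numbered 0-52.
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

public class VleFeatures {
    public static void main(String[] args) throws Exception {
        List<String> lines = Files.readAllLines(Path.of("vle_clicks.csv")); // hypothetical export
        Map<String, int[]> clicksPerWeek = new HashMap<>();                 // studentId -> weekly totals
        for (String line : lines.subList(1, lines.size())) {                // skip header row
            String[] f = line.split(",");
            String studentId = f[0];
            int week = Integer.parseInt(f[1]);
            int clicks = Integer.parseInt(f[2]);
            clicksPerWeek.computeIfAbsent(studentId, k -> new int[53])[week] += clicks;
        }
        // Each student's weekly totals can now be appended as numeric attributes
        // alongside demographic and registration variables.
        clicksPerWeek.forEach((id, weeks) -> System.out.println(id + " week1=" + weeks[1]));
    }
}
```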
Conclusion: The Value of the Case Study Highlighted

Paradoxically, the value of our case study lay in the problems we encountered while moving through the learning analytics cycle. By facing particular problems, we are now able to circumvent or alleviate some of the stumbling blocks in the future.

Firstly, despite the fact that UNISA has a tremendous amount of data, some of the processes that purify and integrate this data are somewhat lacking. The genesis of the data gathering process at the institution, which extends over decades, has gone through disparate manual and computerised systems. As the business model of UNISA evolved and the numbers of students started to increase substantially, adjustments to the database systems were not always adequate or technically sufficient. This resulted in an agglomerate of current and legacy systems, which led to the loss of data integrity, a tremendous challenge when our data were retrieved. Moreover, although we could access the demographic and educational data, the data pertinent to the interaction of students with university digital systems (i.e. LMS data) is still deeply buried in the access logs and not easily extractable (at the moment). To exacerbate this situation, UNISA (despite its online presence on social media platforms) does not extract (and store) university-related social media data from its students, nor does it extract the data gathered from the student-card sensors (e.g. entrances to the registration buildings, lecture halls, or libraries). Such limitations need further consideration when developing further iterations of learning analytics systems.

Secondly, there is a need to implement a university-wide 'business' or 'middle' layer within the analytical process. This would consist of a platform that could easily retrieve data from institutional databases (with the necessary permissions/ethical clearances) and present it in a useful manner to all the parties that require it. In our case study, we extracted the data to a personal workstation and performed the analytic work there. While this may be sufficient for building a toy model or creating a proof of concept, its acceptability or scalability in a university environment with over 340,000 diverse students remains doubtful.

Thirdly, as the information is generated, there should be a facility that enables various presentations of data based on different user levels. In this regard, students should be able to extract their personal data and any information regarding their progress and prognosis, together with a referral mechanism (to support departments) for those identified as at risk. The lecturer's view, on the other hand, should contain the statistics of the modules that they facilitate, warning signals concerning at-risk students and a set of interventions the students can be advised to attend. The head/chair of department would have a supervisory view which would make provision for monitoring (of the interaction of the users and the system) and evaluation thereof.
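The following sketch illustrates the 'different user levels' idea in the third point above; all type names, fields and wording are hypothetical and intended only to show how one analytics record might be filtered into role-appropriate views.

```java
// Hedged sketch: hypothetical types showing how the same analytics record could be
// rendered differently for students, lecturers and heads of department.
public class AnalyticsViews {
    enum Role { STUDENT, LECTURER, HEAD_OF_DEPARTMENT }

    record StudentRecord(String studentId, String module, double riskScore, String referral) {}

    static String render(Role role, StudentRecord r) {
        return switch (role) {
            // Students see their own progress, prognosis and a referral route.
            case STUDENT -> "Your current risk level: " + r.riskScore()
                    + "; suggested support: " + r.referral();
            // Lecturers see module-level warning signals for their own students.
            case LECTURER -> r.module() + ": student " + r.studentId()
                    + " flagged (risk " + r.riskScore() + ")";
            // Heads of department see monitoring and evaluation summaries only.
            case HEAD_OF_DEPARTMENT -> r.module() + ": 1 student currently flagged";
        };
    }
}
```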
Lastly, we would like to underline a few elements outside, but parallel to, the technical development that could potentially render learning analytics successful at UNISA. The first is that learning analytics should not operate in a vacuum, but rather within a cohort of supportive stakeholders originating from all corners of the university. And secondly, the appropriate policies which lay down the compliance processes and ethical considerations should also be established and amended when necessary, a considerable challenge considering the sheer size and diversity of UNISA. The development of these policies must start by acknowledging that learning analytics systems cannot simply be transplanted from the Global North to contexts within the Global South.
References

Acharya, A., & Sinha, D. (2014). Early prediction of students performance using machine learning techniques. International Journal of Computer Applications, 107(1), 37–43. https://doi.org/10.5815/ijisa.2015.01.05
Astin, A. W. (1964). Personal and environmental factors associated with college dropouts among high aptitude students. Journal of Educational Psychology, 55(4), 219–227.
Astin, A. W. (1972). College dropouts: A national profile. Office of Research.
Boyer, A., & Bonnin, G. (2016). Higher education and the revolution of learning analytics. 2016 International Council for Open and Distance Education (ICDE) Presidents' summit. https://static1.squarespace.com/static/5b99664675f9eea7a3ecee82/t/5beb449703ce644d00213dc1/1542145198920/anne_la_report+cc+licence.pdf
Collins, B. (2017). Harnessing the potential of learning analytics across the university. Retrieved January 15, 2020, from https://edservices.wiley.com/potential-for-higher-education-learning-analytics/
Cope, R., & Hewitt, R. (1969). A typology of college student dropouts: An environmental approach. In The 1st New England educational research conference. Boston College.
Council on Higher Education. (2019). Vital stats: Public higher education 2017. Council on Higher Education (CHE).
Daiva, T. (2017). The concept of the nontraditional student. Vocational Training: Research and Realities, 28(1), 44–56. https://doi.org/10.2478/vtrr-2018-0004
de Hart, K., & Venter, J. (2013). Comparison of urban and rural dropout rates of distance students: University of South Africa research method. Perspectives in Education, 31(1), 66–76.
de Jong, Y. (2019). Levels of data analytics. Retrieved September 9, 2020, from http://www.ithappens.nu/levels-of-data-analytics/
Department of Higher Education and Training. (2019). Statistics on post-school education and training in South Africa: 2017. Department of Higher Education and Training (DHET).
Detrick, A. D., & Vipond, S. (2016). Learning analytics: A practical pathway to success. The eLearning Guild.
Dietz-Uhler, B., & Hurn, J. (2013). Using learning analytics to predict (and improve) student success: A faculty perspective. Journal of Interactive Online Learning, 12(1), 17–26.
Dimmick, M. A. (2013). Evaluating the efficacy of a hybrid nutrition course offered to on-campus and distance education students. Utah State University.
Eiselen, R., & Geyser, H. (2004). Factors distinguishing between achievers and at-risk students: A qualitative and quantitative synthesis. South African Journal of Higher Education, 17(2), 118–130.
Falakmasir, M. H., & Habibi, J. (2010). Using educational data mining methods to study the impact of virtual classroom in E-learning. Proceedings of EDM, 241–248.
Fynn, A. (2016). Ethical considerations in the practical application of the Unisa socio-critical model of student success. The International Review of Research in Open and Distributed Learning, 17(6). https://doi.org/10.19173/irrodl.v17i6.2812
Fynn, A., Liebenberg, H., & Van Zyl, D. (2019). The 2018/9 UNISA student profile survey. University of South Africa (UNISA).
Gašević, D. (2018). Include us all! Directions for adoption of learning analytics in the Global South. In C. P. Lim & V. L. Tinio (Eds.), Learning analytics for the global south. Foundation for Information Technology Education and Development.
Holmberg, B. (1995). The evolution of the character and practice of distance education. Open Learning: The Journal of Open, Distance and e-Learning, 10(2), 47–53. https://doi.org/10.1080/0268051950100207
Iffert, R. (1958). Retention and withdrawal of college students. Office of Education: United States Department of Health, Education, and Welfare.
Jayaprakash, S. M., Moody, E. W., Lauria, E. J. M., Regan, J. R., & Baron, J. D. (2014). Early alert of academically at-risk students: An open source analytics initiative. Journal of Learning Analytics, 1(1), 6–47.
Kasworm, C. E. (1990). Adult undergraduates in higher education: A review of past research perspectives. Review of Educational Research, 60(3), 345–372. https://doi.org/10.3102/00346543060003345
Koen, C. (2007). Postgraduate student retention and success: A South African case study. HSRC Press.
Kotsiantis, S. B. (2007). Supervised machine learning: A review of classification techniques. Informatica, 31, 249–268.
Lagman, A., & Ambat, S. (2015). Predictive analytics of student graduation using logistic regression and decision tree algorithm. In Proceedings of the Second International Conference on Digital Information Processing, Data Mining, and Wireless Communications (DIPDMWC) (pp. 41–48).
Liebenberg, H., & Van Zyl, D. (2012, December). Student profile pilot survey results (pp. 1–27). University of South Africa (UNISA).
Liebenberg, H., & Van Zyl, D. (2014). Report: Student profile survey 2014. University of South Africa (UNISA).
Long, P., & Siemens, G. (2011). Penetrating the fog: Analytics in learning and education. Educause Review, 46(5), 31–40.
Lourens, A., & Smit, I. P. J. (2003). Retention: Predicting first-year success. South African Journal of Higher Education, 17(2), 169–176.
Macfadyen, L. P., & Dawson, S. (2012). Numbers are not enough: Why e-learning analytics failed to inform an institutional strategic plan. Educational Technology and Society, 15(3), 149–163.
Minaei-Bidgoli, B., Kashy, D. A., Kortemeyer, G., & Punch, W. F. (2003). Predicting student performance: An application of data mining methods with an educational web-based system. 33rd Annual Frontiers in Education, 2003. https://doi.org/10.1109/FIE.2003.1263284
Pantages, T., & Creedon, C. (1978). Studies of college attrition: 1950–1975. Review of Educational Research, 48(1), 49–101.
Prinsloo, P. (2018). Context matters: An African perspective on institutionalizing learning analytics. In C. P. Lim & V. L. Tinio (Eds.), Learning analytics for the global south. Foundation for Information Technology Education and Development.
Rogan, M., & Reynolds, J. (2016). Schooling inequality, higher education and the labour market: Evidence from a graduate tracer study in the Eastern Cape, South Africa. Development Southern Africa, 33(3), 343–360. https://doi.org/10.1080/0376835X.2016.1153454
Rossmann, J., & Kirk, B. (1970). Factors related to persistence and withdrawal among university students. Journal of Counseling Psychology, 17(1), 56.
Scott, I., Yeld, N., & Hendry, J. (2007). A case for improving teaching and learning in South African higher education. Council on Higher Education: Higher Education Monitor, 6, 1–98.
Sibanda, E., & Lourens, A. (2003). Using logistic regression to identify the factors influencing the success of first year students at Technikon Pretoria. In SAAIR Forum (pp. 1–15). Pretoria.
Slade, S., & Prinsloo, P. (2015). Stemming the flow: Improving retention for distance learning students. In EDEN 2015 Annual Conference: Expanding learning scenarios: Opening out the educational landscape. European Distance and E-Learning Network.
Smith, V. C., Lange, A., & Huston, D. R. (2012). Predictive modeling to forecast student outcomes and drive effective interventions. Journal of Asynchronous Learning Networks, 16(3), 51–61.
Subotzky, G., & Prinsloo, P. (2011). Turning the tide: A socio-critical model and framework for improving student success in open distance learning at the University of South Africa. Distance Education, 32(2), 177–193.
Tilley, B. P. (2014). What makes a student non-traditional? A comparison of students over and under age 25 in online, accelerated psychology courses. Psychology Learning and Teaching, 13(2), 95–106. https://doi.org/10.2304/plat.2014.13.2.95
University of South Africa. (2011). UNISA admission policy. University of South Africa (UNISA).
University of South Africa. (2016). UNISA language policy. University of South Africa (UNISA).
Visser, A. J., & Hanslo, M. (2005). Approaches to predictive studies: Possibilities and challenges. South African Journal of Higher Education, 19(6), 1160–1176.
Yasmin, D. (2013). Application of the classification tree model in predicting learner dropout behaviour in open and distance learning. Distance Education, 34(2), 218–231. https://doi.org/10.1080/01587919.2013.793642
Zhang, J., Burgos, D., & Dawson, S. (2019). Advancing open, flexible and distance learning through learning analytics. Distance Education, 40(3), 303–308. https://doi.org/10.1080/01587919.2019.1656151
Angelo is an experienced researcher at Unisa with substantial experience in Monitoring and Evaluation; Quantitative and Qualitative Methodology; Community Engagement; Consultant Training; E-Learning; Open Distance Learning Pedagogy and Project Management. Other interests include Cognitive Psychology, Educational Psychology and Social Responsibility. His objective is to develop a culture of learning and teaching, as well as the technological infrastructure required to sustain this culture, that truly creates open access and advances the progress toward achieving Education for All.

Jaroslaw is a digital facilitator at Unisa, an IT professional, data scientist, academic, and artist with a wide range of experience in the oil industry, computer technologies, education, science, and entertainment. He is able to work on his own initiative as a leader, subject matter expert, and as part of a team, and has proven leadership skills involving managing, developing and motivating teams to achieve the company and their career objectives. He exhibits high analytical, design, problem solving and interpersonal skills and is proficient in business process automation on various IT platforms, dedicated to maintaining high-quality standards. His education comprises mathematics, physics, computer science, information systems, and music.

Dr. Kelly Young is a researcher in the Open Distance Learning Research Unit at the University of South Africa (UNISA) with an academic background in Psychology. She is highly knowledgeable with regard to student success models and predictions in the context of South African higher education and specifically in distance education. She has written papers appearing in journals such as the Journal of Psychopathology and Behavioural Assessment and the South African Journal of Education on topics ranging from psychometric analyses to cyberbullying. Her doctoral thesis examined psychological grit and its efficacy in determining student retention among postgraduate students enrolled at a South African distance education institution.
Chapter 4
Learning Analytics in Open and Distance Higher Education: The Case of the Open University UK

Avinash Boroowa and Christothea Herodotou
Introduction

The availability of online distance learning programmes has seen a sharp increase over the last decade. World events, technological and communication advances in audio, video and mobile learning and improved student support systems (Sewart et al., 2020) have resulted in an increasing number of higher education institutions, both distance and campus-based, offering online teaching and learning provision. In addition to online courses typically hosted in a Virtual Learning Environment (VLE), lifelong online learning resources are offered through massive open online platforms such as Futurelearn and Coursera, and more recently, through the provision of professional credentials designed to boost in-demand career skills (coined as microcredentials). Yet, online and distance higher education faces several unique challenges compared to traditional or campus-based education, such as the offering of high-quality learning experiences (e.g., Davis et al., 2019), the lack of an evidence-informed approach to designing learning materials (e.g., Herodotou et al., 2019) and high student drop-out rates (e.g., Bawa, 2016). In online and distance learning contexts, learners have been perceived as self-directed, driven by their own personal interests, and as individuals who make sense of the world through an inquiry-led approach to learning, that is, through manipulating, testing, observing and questioning (e.g., Bell et al., 2009; Song & Bonk, 2016). Yet, this may not hold true in practice. Transactional distance—the impact of time and distance (psychological, cognitive, affective) on learners' interactions with study material and on their communication with others—may inhibit learning engagement and the achievement of desired learning outcomes (Moore, 1993). In addition, the role of the teacher in supporting learners can be more challenging within online and
distance learning settings. A lack of face-to-face interaction can inhibit recognition of learning difficulties and the provision of appropriate support (Crawley et al., 2009). Further, the often-large number of students with whom an online teacher simultaneously interacts may result in limited scaffolding of the learning process and a greater likelihood of students failing or not completing their studies. Online and distance teaching and learning can be informed and supported by Learning Design (LD), that is “a methodology for enabling teachers/designers to make more informed decisions in how they go about designing learning activities and interventions” (Conole, 2012, p. 6). It is structured on socio-cultural theories of learning and emphasizes “active pedagogies” to achieve certain learning objectives and motivate students (Holmes et al., 2019, p. 311). Despite the existence of frameworks to support LD for open and distributed learning, there is a need to develop new theoretical perspectives that will identify how to implement scalable, effective, and personalised online and distance learning experiences (Zhang et al., 2019). Learning analytics have emerged as a promising innovation which could enhance the quality of online and distance higher education and offer learners personalised and responsive learning experiences. Higher education institutions now have the power to leverage historical student information such as engagement in the VLE and library swipes, together with demographic information, and create models that can visualise student learning journeys (e.g., Charleer et al., 2016) and support learning processes (Ferguson & Shum, 2012; Papamitsiou & Economides, 2014). Learning analytics can empower teachers to provide timely support to students by accessing student performance predictions, enabling them to effectively monitor and support thousands of students (e.g., Herodotou et al., 2019a, b). Such insights can both inform the design of online courses and qualifications by improving content that students struggle with or do not access often (e.g., Rienties et al., 2016a), and address student retention (e.g., Zacharis, 2015). Despite the promise to revolutionise online learning, the development and adoption of learning analytics in higher education remains relatively low. Several institutions are beginning to explore the use of learning analytics dashboards with students and teachers (e.g., Bodily et al., 2018; Scheffel et al., 2017) and have developed approaches to identify and support, in particular, those students deemed to be at risk of failing (e.g., Calvert, 2014; Herodotou et al., 2019a, b). Yet, these approaches are often described as “early adoption” or implementations at a “small-scale” (Dawson et al., 2018; Ferguson et al., 2016). Very few institutions have developed, tested and adopted learning analytics as their main organisational approach. The Open University (OU) in the UK is one of the first higher education institutions in the world to have enacted a university-wide implementation of learning analytics for its 170,000 distance learning students (Herodotou et al., 2020). 
In this chapter, we reflect on two major learning analytics initiatives which took place at the OU in the last six years and which facilitated the adoption of analytics across the university: (a) Analytics for Action (A4A) focused on the use of learning analytics to inform the learning design of online courses (Rienties et al., 2018) and (b) Early Alert Indicators (EAI) examined the use of predictive learning analytics in online teaching and their potential to identify students at risk and inform teachers
who can proactively intervene (Herodotou et al., 2020). We showcase how each initiative raised differing and shared challenges and highlight lessons learnt from their implementation. We aim to illuminate the risks and potential of learning analytics in the unique context of open and distance education to inform the development of appropriate pedagogical practices, material and assessment and effectively support students’ individual learning journeys online.
Settings: The Open University (OU) UK

The Open University (OU) is a world leader in open distance learning and one of the largest universities in Europe (approximately 170,000 students). It defines its mission as "open to people, places, methods and ideas" as it enables students with no previous qualifications to achieve their career and life goals by studying flexibly, at times and places that suit them. The OU's vision for learning analytics is to use and apply information strategically in order to retain students and help them progress towards completing their study goals. The OU was the first institution globally to adopt a policy on the ethical use of student data in learning analytics in 2014. To facilitate this, the OU developed and implemented operational mechanisms for generating and using analytic insights related to increasing student persistence and inserted these into key business cycles. The focus of this activity has been in the following three areas: (a) availability of data, (b) creation of actionable insight, and (c) impact on the student experience. This activity resulted in two major business/research initiatives spanning six years and enabling the adoption and large-scale implementation of learning analytics across the university.
Analytics for Action (A4A)

Analytics for Action (A4A) was a university-wide initiative that focussed on the use of learning analytics to inform the learning design of online courses. It resulted in the development of the Analytics 4 Action (A4A) Evaluation Framework (Rienties et al., 2016a), providing a comprehensive approach to identifying issues with student performance and progression and embedding these in the course evaluation enacted by student support, course, and qualification teams. The framework, developed in partnership with academics and learning design practitioners, can support the students' experience and learning outcomes during and after a course's completion, and enable the university to monitor the quality of offering and enhancement processes. The framework is operationalised in the form of the A4A toolkit (see Fig. 4.1). The toolkit enables the real-time monitoring of course performance on the premise that course leaders will identify and implement rapid interventions to current (and future) presentations which might improve teaching and learning quality. Analysis
Fig. 4.1 The Analytics for Action (A4A) toolkit
of relevant data indicates potential interventions that might be made with the aim of improving student engagement and retention. The A4A toolkit covers (a) the types of data available for review, (b) possible actions that could be taken in response to the data, and (c) methods of evaluating these actions. The process of monitoring course data is sustained by learning designers who provide support to course teams. Support for staff ranges from helping to interrogate data sources (Rienties et al., 2016b) such as student demographics, qualifications profiles, assessment submissions and scores, pass and completion rates, retention, overall engagement with VLE, etc. to identifying potential issues with the course presentation and suggesting remedial actions or interventions to improve performance. Such support takes the form of face-to-face workshops and support meetings as well as drop-in sessions hosted by learning designers. To prioritise the allocation of the finite A4A support available, faculties are asked to nominate courses that ‘need attention’. As an activity, A4A is now embedded in the institutional Quality Monitoring and Enhancement (QME) Process as one means of improving the quality and delivery of curriculum. In its fourth year (2019–2020) of mainstream activity, there are 46 undergraduate courses participating, with more than 400 teaching staff expected to be trained in the use of available data tools by the end of the current cycle. In a survey conducted in 2018, 91.4% of teachers were satisfied with the training sessions and 87% of faculty staff were satisfied with the data support meetings. A further 76% of participating staff agreed that the meetings had helped them to identify an action that could be taken in their courses in order to improve student outcomes.
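A minimal, hypothetical illustration of the kind of in-presentation check this monitoring implies is sketched below; the threshold, data access and method name are assumptions rather than part of the A4A toolkit itself.

```java
// Hedged sketch: compare this presentation's weekly assignment submission rate
// with the previous presentation and flag weeks that fall well below it.
// Arrays of per-week rates (0.0-1.0) are assumed to be supplied by the caller.
public class PresentationCheck {
    static void flagLowSubmissionWeeks(double[] currentRate, double[] previousRate) {
        for (int week = 0; week < currentRate.length; week++) {
            if (currentRate[week] < previousRate[week] - 0.10) {  // 10-point drop (illustrative)
                System.out.printf("Week %d: submission rate %.0f%% vs %.0f%% last year -"
                        + " consider a reminder or content review%n",
                        week, currentRate[week] * 100, previousRate[week] * 100);
            }
        }
    }
}
```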
Early Alert Indicators (EAI)

The Open University has an 'open entry policy', which means that students can register for study in most of the university's courses without requiring any prerequisites, e.g., a high school certificate. As opposed to filtered entry, an open entry policy is associated with much higher student withdrawal rates. To address this challenge, predictive learning analytics (PLAs) indicators were tested at the university to predict student completion and pass rates. These insights were provided to teachers in order to proactively intervene, help and prevent student failure. In particular, OU Analyse (OUA), a machine-learning predictive system, can produce predictions on a weekly basis as to whether or not students will submit their next teacher-marked assignment, and is able to present potential outcomes to teachers in a colour-coded dashboard to prompt action, i.e., proactively support and "save" those students from failing (see Fig. 4.2). Predictions draw from: (a) student demographics, (b) VLE engagement, (c) assessment data, and (d) outcomes from a previous presentation of the course. It is worth noting that the university is committed to the ethical use of student data for improving services and better supporting students. The OU Student Privacy Policy details how student personal data is used by the university, including the use of
Fig. 4.2 The OU analyse dashboard predicting students at risk of failing (note that the names are not real)
learning analytics for monitoring performance and evaluating teaching (see https://help.open.ac.uk/how-the-ou-uses-student-data). This is a highly innovative project across the higher education sector; the Higher Education Commission (2016) recognised the OU as the only institution that has made significant headway in using predictive analytics. OUA is one of the few analytics systems available in the world that has been tested and now rolled into business as usual across the university. Its systematic evaluation has shown that its use can improve student learning (performance and completion) on a large scale (Herodotou et al., 2020). What is particularly inspiring is that a forecast of future performance gives teachers actionable insights to provide timely support and enable students to succeed. Several small-scale pilots took place before 2018, through which the algorithms behind OUA were refined and the dashboard was tested with a small sample of teachers. A university-wide implementation took place in 2019, for which the performance of 161,261 students was predicted on 530 courses, with 1,774 teachers recorded as using OUA. OUA is now accessible via the teachers' homepage, allowing any teacher across the university to access their students' predictions. Student predictions are automated so that new machine learning models are generated automatically every week, providing quality checks and highlighting changes in the student performance data. In addition to providing direct support to teachers, the university is piloting a new version of the dashboard for students, with personalised study recommendations, with the aim of enhancing the student–teacher interaction in online and distance higher education. Rigorous evaluation has shown that (a) systematic use of OUA by teachers is one of the two most significant predictors of student course completion and pass rates (the other being best previous student performance) (Herodotou et al., 2019a); (b) OUA complements and enhances teaching practice by encouraging teachers to be more proactive and supportive of students; further, OUA has been shown to contribute to teachers' professional development and capacity to support students at risk (Herodotou et al., 2019a); (c) better student learning outcomes are recorded the more teachers use the system (Herodotou et al., 2019b); and (d) teachers had better student outcomes in the academic year in which they were accessing OUA than in previous years when they had no access (Herodotou et al., 2019b). Some illuminating quotes from teachers who used OUA state that: "I had a difficult group this year and without OUA, I think I would have lost around 4–5 of them but all of them made it till the end and passed". Another teacher explained: "One of the things that OUA a hundred per cent has made me do is being much more proactive in sending messages [to students] between assignments. I sort of feel that I am on top of what the students are doing." An example of how a teacher successfully used the dashboard is discussed below. This teacher was able to use the dashboard to provide timely support to a female engineering student from a Black and Minority Ethnic (BME) background with no prior higher education experience, and enable her to succeed. Prior to this, the student received 100% on the first assignment (a quiz) and 86% on her second assignment. However, in week 10 (see Fig. 4.3), the OUA flagged the student as unlikely to submit the third assignment.
Upon further inspection by the teacher, it emerged that the student had not accessed the VLE after submitting the previous
Fig. 4.3 The activity of a student in OUA. Highlighted in red are periods of low engagement with the course
assignment three weeks earlier. When the teacher contacted the student, it became apparent that the student’s lack of activity on the VLE was due to the birth of her third child. The student had not disclosed her pregnancy as she was unsure whether the university would allow her to carry on with her studies. Not only did the teacher resolve the misunderstanding, but also provided support enabling the student to get back on track. Subsequent monitoring of the student’s performance helped the teacher identify another occasion when the student had limited VLE activity (weeks 15–17 in Fig. 4.3) and was likely to fail to submit her next assignment. Again, the teacher was able to prevent the student from giving up by identifying the problem she was facing and providing timely support. The student eventually completed the course with an average score of over 80%.
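To indicate roughly how such a weekly run might be wired together, the sketch below trains a model on a previous presentation of a course and flags current students unlikely to submit their next assignment. OUA's actual algorithms and data pipeline are not described in detail in this chapter, so the library (Weka), classifier choice, file names and class label used here are all assumptions.

```java
// Hedged sketch of a weekly predictive run: train on the previous presentation
// (demographics, weekly VLE clicks, assessment scores, submission outcome as the
// class attribute) and score current students. All names are hypothetical.
import weka.classifiers.Classifier;
import weka.classifiers.functions.Logistic;
import weka.core.Instances;
import weka.core.converters.ConverterUtils.DataSource;

public class WeeklyRiskRun {
    public static void main(String[] args) throws Exception {
        Instances train = new DataSource("previous_presentation.arff").getDataSet();
        Instances current = new DataSource("current_presentation_week10.arff").getDataSet();
        train.setClassIndex(train.numAttributes() - 1);
        current.setClassIndex(current.numAttributes() - 1);

        Classifier model = new Logistic();   // any probabilistic classifier would do
        model.buildClassifier(train);

        int notSubmit = current.classAttribute().indexOfValue("not_submitted"); // assumed label
        for (int i = 0; i < current.numInstances(); i++) {
            double[] p = model.distributionForInstance(current.instance(i));
            if (p[notSubmit] > 0.5) {
                // Attribute 0 is assumed to be a nominal student identifier that, in a
                // real pipeline, would be excluded from the model inputs themselves.
                System.out.println("Student " + current.instance(i).stringValue(0)
                        + ": at risk of not submitting the next assignment");
            }
        }
    }
}
```

In OUA itself, such output feeds a colour-coded teacher dashboard rather than a console, and new models are generated automatically each week, as described above.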
Lessons Learnt from the Implementation of Learning Analytics at the Open University UK

As part of the evaluation of the uptake of predictive analytics across the university, the authors proposed a set of practical recommendations (see Fig. 4.4) for the implementation of predictive learning analytics (PLAs) in open and distance education. Figure 4.4 summarises early guidelines developed as a result of 20 in-depth interviews with stakeholders involved in the application of PLAs at the OU. These guidelines suggest that the successful implementation relies on careful consideration of a variety of factors, including stakeholders' multiple perspectives and time availability, research and provision of evidence of impact, and sustained communication. As a result of the experience gained at the OU, several further aspects have been identified which potentially influence the successful implementation of learning analytics initiatives. Table 4.1 summarizes the key elements and lessons learned from both initiatives. These fall under four main categories, and it is recommended that these are considered by other online and distance learning institutions preparing to implement learning analytics initiatives:

a) Clarity of vision: a clear and measurable understanding of learning analytics objectives and direction is needed to ensure that all stakeholders understand the scope of the activity and have expressed, discussed and resolved any conflicting perspectives. A clear plan should be developed detailing the objectives and timeline, the process through which these can be achieved and measured, potential barriers and mitigation strategies.

b) Consideration of ethical perspectives: Any concerns or opposing perspectives should be identified and addressed from the start to ensure that relevant challenges faced elsewhere (in other organisations or settings) have been discussed and solutions identified. Special attention should be given to ethical dimensions of the implementation, ensuring that student interests and welfare are safeguarded.
Fig. 4.4 Guidelines for the implementation of PLAs at the Open University UK (from Herodotou et al., 2019c)
c) Institutional readiness: A good understanding of the readiness of the institution to adopt and implement analytics can assist in engaging different stakeholders with project activities. In particular, (i) other unrelated or conflicting organisational change may have an impact on stakeholders' willingness to engage and commit themselves and derail or delay the activity, (ii) technical requirements should be considered to ensure that the necessary infrastructure is in place (e.g., data warehouse), and (iii) an understanding of staff readiness in terms of digital and data literacy skills, as well as technical expertise, should inform relevant support such as training about the use of new tools.

d) PLAs specific considerations: It is important to address any local issues. At the OU, PLAs were perceived by some as almost a 'self-fulfilling prophecy' which might potentially de-motivate users and impact on interactions between students
Table 4.1 Guidance on how to implement learning analytics in higher education: lessons from the Open University UK

CLARITY OF VISION

LEADERSHIP, VISION AND DIRECTION: Role of the sponsor: The sponsor should always be the visible project champion and be the point of escalation, when needed, to quickly resolve problems. The sponsor should have a thorough understanding of the project and where it needs to go. Communicating with clarity and purpose: The aim of each communication and the goals of the overall initiative should be clear. Additionally, it is important to be clear what is good practice and what is mandatory. If mandatory, it is also important to ensure that all management layers understand and are in support and can articulate the reasons.

COMMUNICATION: Specialist skills: The project team struggled initially with tailoring messages to stakeholders. A communications specialist would have known how to adapt comms to the various stakeholders, and indeed ensured that the right questions were asked about why the comms was needed.

SCOPE: Purpose: In order to avoid confusion amongst stakeholders and prevent time wastage, inclusion of non-strategically funded business as usual (BAU) tasks should be avoided. If there are no BAU tasks, then all time and budget would be appropriately used for those strategic priority tasks only. Change of scope: The scope of any long-term, multi-year project is likely to change several times. Indeed, this is appropriate given that requirements can evolve depending on institutional priorities. Agile methodology allows for iterative release, which can aid delivering something to see what works and amend as necessary. In a rapidly changing environment, the switch to using agile methodologies meant the project team found it easier to introduce alterations and changes in priorities. Delivery and handover: The tools developed through the Analytics Project were instrumental to successfully realise pedagogical benefits. It is important that the use of new tools is embedded in normal working practices.

MANAGING EXPECTATIONS RELATED TO OUTPUT: Increased involvement of end-users (teachers) at an early stage to feed into reporting requirements and specification to ensure that output provided meets expectations. Additionally, ensuring that analytics data/information provided to users comes with guidance that sets expectations of actions that can be taken from it. It is important for work to be part of or aligned to existing processes where possible to avoid perceived (or actual) duplication of effort.

GOVERNANCE: Having the ability to change membership and responsibility of the governance groups. Keeping the membership of these groups fluid reflects the different stages of the project. It is important to ensure Terms of Reference are updated accordingly and that group members are in support to ensure effectiveness of working.

TIMESCALES: The OU's experience was of a slower than expected uptake of analytics due to time needed for culture change, particularly where there was resistance. It may have been more appropriate to plan and execute the project in phases, to take account of the extent of culture change needed. However, this requires trust and flexibility from senior management in approving a project business case with less detail in the outline project plan.
Table 4.1 (continued) Clarity of purpose, roles and responsibilities: There can be confusion when implementing an institutional Analytics project around what is included, and which departments are involved. At the OU, there was a perpetual perception that the project was somehow separate from staff in partner units who were contributing to the project. Assumptions should not be made that staff talk to each other or disseminate information when asked. We should have made sure that we had faculty engagement earlier on in the project and clear strategy for engaging stakeholders.
STAKEHOLDER ENGAGEMENT
Prepare the ground for Cultural Change: It is helpful to have a workshop at the beginning of the project to identify group formation dynamics and have a change management specialist on hand to provide tools for the team when faced with the challenges associated with culture change. Use success stories to engage and develop champions: Where possible, identify and involve champions (or convertible skeptics) and produce case studies for what worked well. Use success stories or case studies which outline best practice; what has happened and what has resulted from it; and start developing these as early as possible to get further champions onboard and to reassure stakeholders.
MANAGING DEPENDENCIES
BENEFITS REALISATION
At the OU, there was a disconnect between the priority levels of two dependent pieces of work which complicated discussions over resourcing. Ensure that dependent projects/pieces of work have the same priority to ensure that neither is delayed or negatively impacted by the other. Advantages: Benefits mapping can be complicated and take significant time. That said, having the benefits mapped proved useful on many subsequent occasions including prioritisation of tasks within the project when there were insufficient resources of time, people or money. Limitations: The project was an enabler to provide faculties with the data and/or tools to make changes. This meant that the business impact was difficult to articulate. In addition, the project team had no influence over the scale and scope of the changes made by each faculty. As such, it was difficult to measure cost benefits of the project. A recognition of enabling projects, direct and indirect benefits would also be helpful.
CONSIDERATION OF ETHICAL PERSPECTIVES

CLEAR ETHICAL POSITION ON THE USE OF STUDENT DATA: As part of the project, the institution developed the Policy on the Ethical use of Student Data for Learning Analytics, which clearly stated the University's intentions around the use of student data. The policy was developed in consultation with staff and students.

DISSEMINATE ETHICAL POSITION: Merely publishing a policy on how an institution intends to ethically use student data is not enough. As an institution, we needed to communicate that the policy exists. This is in line with the GDPR requirements around transparency.
INSTITUTIONAL READINESS

FINANCE: Recruitment: The projects were constrained by budget planning. Staff were recruited following budgetary sign-offs, which meant some staff had shorter contracts, or that more budget was spent initially on contract staff until fixed-term staff began. Consider the implications of financial year budget cycles in project start-up and close-down phases.

RISK AND ISSUE MANAGEMENT: Risks and issues were managed in the project management team and dealt with or escalated as required. We fully utilized the project management team, which aided rather than resisted our ability to control the project.

PROJECT MANAGEMENT METHODOLOGY AND TOOLS: A lot of time was spent over-working what would be delivered as part of the project and providing paperwork. Keep paperwork to a minimum, adopt Agile methodology to allow for agility in scope of the project. Aim to limit complications such as moving staff/projects between programmes and departments.
PEOPLE (CAPACITY AND CAPABILITY): Subject matter expertise: We made the decision to pay several teachers for the work they did on the project - an invaluable contribution to our work. The involvement of teachers allowed the team to articulate the pedagogical benefits better. However, it should be recognized that paying teachers to be involved in research activities could bias the results, by highlighting particular voices and/or by recruiting a non-representative sample.

CONSIDERATIONS SPECIFIC TO PREDICTIVE LEARNING ANALYTICS (PLAs)

PLAs PERCEIVED AS A 'SELF-FULFILLING PROPHECY': Clarify the objective behind PLAs, which is the timely identification of students at risk and the provision of appropriate support in order to support students and help them to succeed.
CONCERNS RELATED TO ADDITIONAL WORKLOAD
Debunk myths by exploring and comparing non-PLAs support practices to PLA support practices.
and teachers. One response is to clarify the objectives, e.g. the timely identification of students at potential risk and the provision of appropriate support to help them to succeed. At the OU, we also pointed to evidence of impact showing that teachers acting upon PLAs were more likely to have better student performance (and this improvement was not related to more engaged teachers in general). In addition, OU stakeholders were concerned about additional workload for teachers. We managed to debunk this concern by exploring and comparing non-PLAs support practices to PLA support practices. Evidence suggested that student monitoring without PLAs was typically much harder in practice; being more time-consuming and less systematic, and associated with a greater risk of not effectively detecting those students likely to need support (Fig. 4.5). Large institutions are likely to have the prerequisites that support the adoption of learning analytics at scale. These range from the digital footprint of the organisation to the technological and data infrastructure that support the collection, curation, and availability of data, as well as the digital and data literacy of staff. However, not having these prerequisites should not preclude any educational institution from considering
Fig. 4.5 Areas of consideration for the implementation of learning analytics based on lessons learnt at the Open University UK
the use of learning analytics. Institutions can overcome technological and data infrastructure limitations by establishing processes that enable the systematic collection, curation, peer validation and collective use of student data. To successfully implement the use of analytics to support students in such an environment, an institution should first identify and agree the use of data indicators that are not dependent on technology but are rather based on traditional ways of performance measurement, such as formative and summative assessment scores, attendance reports, ongoing assessment scores, and teachers' intuition. The next step should be to establish rigour around teachers' regular recording and use of the data and embed this into teaching practice by, for example, structuring how often and when teachers should inspect and act on data indicators, and enabling communication between teachers teaching the same cohorts of students so that they can validate the need for support and coordinate efforts. Establishing processes that make teachers' use of the data regular should be supported by both institutional policy and staff training.
Conclusions

In this chapter, we showcased that the process of implementing learning analytics in open and distance higher education is not a straightforward one. It can be complex and dynamic, involving varied stakeholders with varied expectations, roles, and responsibilities across different levels. We reflected in particular on the experience of using analytics with online teachers at the Open University (OU) UK over the past six years, and commented on the conditions that can both help minimise obstacles, such as teachers' resistance to change, and promote the adoption of new analytical outputs and technology.
We presented and compared insights from the implementation of two university-wide initiatives, the first focusing on learning analytics to inform learning design, and the second examining the use of predictive learning analytics to identify students potentially at risk and facilitate proactive interventions. The OU has achieved significant progress in testing, implementing and widely adopting an evidence-based approach to learning analytics that could inform practices in other open and distance higher education institutions. Yet, this is not to argue that no challenges remain. A further issue to consider is the scalable implementation of approaches such as predictive learning analytics, especially when there are limited resources to employ teachers to communicate with students and moderate timely interventions. Models of peer learning and support could be put in place to support and solve 'common' learning difficulties, while an escalation process could allow for major issues to be dealt with by teachers and student support teams. We close this chapter by noting the importance of human interaction in facilitating and achieving learning outcomes. With appropriate technological support (such as predictive systems), teachers could focus greater time and effort on those students most in need of support, and contribute to the sustainable application of such models across open distance-learning education. Learning analytics is not a panacea; rather, how it is adopted and used determines the outcome.
References

Bawa, P. (2016). Retention in online courses: Exploring issues and solutions—A literature review. SAGE Open. Retrieved 20/2/19 from https://doi.org/10.1177/2158244015621777
Bell, P., Lewenstein, B., Shouse, A. W., & Feder, M. A. (Eds.). (2009). Learning science in informal environments: People, places, and pursuits. The National Academies Press.
Bodily, R., Kay, J., Aleven, V., Jivet, I., Davis, D., Xhakaj, F., & Verbert, K. (2018). Open learner models and learning analytics dashboards: A systematic review. Paper presented at the proceedings of the 8th international conference on learning analytics and knowledge.
Calvert, C. (2014). Developing a model and applications for probabilities of student success: A case study of predictive analytics. Open Learning: The Journal of Open, Distance and e-Learning, 29(2), 160–173. https://doi.org/10.1080/02680513.2014.931805
Charleer, S., Klerkx, J., Duval, E., De Laet, T., & Verbert, K. (2016). Creating effective learning analytics dashboards: Lessons learnt. In K. Verbert, M. Sharples, & T. Klobučar (Eds.), Adaptive and adaptable learning: 11th European conference on technology enhanced learning, EC-TEL 2016, Lyon, France, September 13–16, 2016, Proceedings (pp. 42–56). Springer International Publishing.
Conole, G. (2012). Designing for learning in an open world (Vol. 4). Springer Science & Business Media.
Crawley, F. E., Fewell, M. D., & Sugar, W. A. (2009). Researcher and researched: The phenomenology of change from face-to-face to online instruction. The Quarterly Review of Distance Education, 10, 165–176.
Davis, N. L., Gough, M., & Taylor, L. L. (2019). Online teaching: Advantages, obstacles and tools for getting it right. Journal of Teaching in Travel & Tourism, 256–263. https://doi.org/10.1080/15313220.2019.1612313
Dawson, S., Poquet, O., Colvin, C., Rogers, T., Pardo, A., & Gasevic, D. (2018). Rethinking learning analytics adoption through complexity leadership theory. Paper presented at the proceedings of the 8th international conference on learning analytics and knowledge. Ferguson, R., & Buckingham Shum, S. (2012). Social learning analytics: Five approaches. Paper presented at the 2nd International Conference on learning analytics and knowledge. Ferguson, R., Brasher, A., Cooper, A., Hillaire, G., Mittelmeier, J., Rienties, B., Vuorikari, R. (2016). Research evidence of the use of learning analytics: Implications for education policy. In R. Vuorikari & J. Castano-Munoz (Eds.), A European framework for action on learning analytics (pp. 1–152). Joint Research Centre Science for Policy Report. Herodotou, C., Rienties, B., Boroowa, A., Zdrahal, Z., & Hlosta, M. (2019). A large-scale implementation of predictive learning analytics in higher education: The teachers’ role and perspective. Educational Technology Research and Development, 67(5), 1273–1306. https://doi.org/10.1007/ s11423-019-09685-0 Herodotou, C., Hlosta, M., Boroowa, A., Rienties, B., Zdrahal, Z., & Mangafa, C. (2019). Empowering online teachers through predictive learning analytics. British Journal of Educational Technology, 50(6), 3064–3079. https://doi.org/10.1111/bjet.12853 Herodotou, C., Rienties, B., Verdin, B., & Boroowa, A. (2019). Predictive learning analytics ‘At Scale’: Guidelines to successful implementation in higher education. Journal of Learning Analytics, 6(1), 85–95. Herodotou, C., Sharples, M., Gaved, M., Kukulska-Hulme, A., Rienties, B., Scanlon, E., & Whitelock, D. (2019d). Innovative pedagogies of the future: An evidence-based selection. In Frontiers in Education (Vol. 4, p. 113). Frontiers. Herodotou, C., Rienties, B., Hlosta, M., Boroowa, A., Mangafa, C., & Zdrahal, Z. (2020). The scalable implementation of predictive learning analytics at a distance learning university: Insights from a longitudinal case study. The Internet and Higher Education, 45, 100725. Higher Education Commission. (2016). In X. Shacklock(Ed.), From bricks to clicks (pp.1–76). Higher Education Commission. Holmes, W., Nguyen, Q., Zhang, J., Mavrikis, M., & Rienties, B. (2019). Learning analytics for learning design in online distance learning. Distance Education, 40(3), 309–329. Moore, M. G. (1993). Theory of transactional distance. In D. Keegan (Ed.), Theoretical principles of distance education. Routledge. Papamitsiou, Z., & Economides, A. (2014). Learning analytics and educational data mining in practice: A systematic literature review of empirical evidence. Educational Technology & Society, 17(4), 49–64. Rienties, B., Boroowa, A., Cross, S., Kubiak, C., Mayles, K., & Murphy, S. (2016a). Analytics4Action evaluation framework: A review of evidence-based learning analytics interventions at the Open University UK. Journal of Interactive Media in Education, (1). Rienties, B., Boroowa, A., Cross, S., Farrington-Flint, L., Herodotou, C., Prescott, L., Mayles, K., Olney, T., Toetenal, L., & Woodthorpe, J. (2016b). Reviewing three case-studies of learning analytics interventions at the Open University UK. In Proceedings of the Sixth International Conference on Learning Analytics & Knowledge (pp. 534–535). Rienties, B., Herodotou, C., Olney, T., Schencks, M., & Boroowa, A. (2018). Making sense of learning analytics dashboards: A technology acceptance perspective of 95 teachers. 
International Review of Research in Open and Distributed Learning, 19 Scheffel, M., Drachsler, H., de Kraker, J., Kreijns, K., Slootmaker, A., & Specht, M. (2017). Widget, widget on the wall, am I performing well at all? IEEE Transactions on Learning Technologies, 10(1), 42–52. https://doi.org/10.1109/TLT.2016.2622268 Sewart, D., Keegan, D., & Holmberg, B. (2020). Distance education: International perspectives. Routledge. Song, D., & Bonk, C. J. (2016). Motivational factors in self-directed informal learning from online learning resources. Cogent Education, 3(1), 11. https://doi.org/10.1080/2331186X.2016.1205838
Zacharis, N. Z. (2015). A multivariate approach to predicting student outcomes in web-enabled blended learning courses. The Internet and Higher Education, 27, 44–53. https://doi.org/10. 1016/j.iheduc.2015.05.002 Zhang, J., Burgos, D., & Dawson, S. (2019). Advancing open, flexible and distance learning through learning analytics.
Avinash Boroowa has been working since 2012 on numerous strategic projects aimed at helping staff at The Open University (UK) embrace learning analytics with a view to facilitating cultural and operational change. Avinash helped develop the OU's Policy on the ethical use of student data for learning analytics, arguably the first of its kind in higher education worldwide. As Project Lead on the award-winning Early Alert Indicators Project, Avinash led a team piloting predictive analytics to help the Open University understand how best to use the data to provide personalised support to students and improve retention. Avinash currently leads the Apprenticeships Data & MI project, set up to improve data provision for the management of degree apprenticeships at the OU.
Christothea Herodotou is a Senior Lecturer (Associate Professor) at the Open University, UK, interested in the evidence-based design and evaluation of technologies for learning at scale (web-based platforms, mobile applications, digital games). She has been the evaluation lead of the Early Alert Indicators project, which enabled the use of predictive learning analytics across the Open University. She holds funding, as a co-Investigator, to improve the accuracy of predictive analytics of at-risk students, as measured by OUAnalyse, and to explore the design of personalised student-facing dashboards for distance education. She is also heavily involved in citizen science activities; she is the academic lead of the award-winning nQuire platform (nquire.org.uk) and a co-Principal Investigator on the international project LEARN Cit Sci, funded by NSF, Wellcome, and ESRC.
Chapter 5
Mobile Multimodal Learning Analytics Conceptual Framework to Support Student Self-Regulated Learning (MOLAM)
Mohammad Khalil
Introduction
Online distance learning is highly learner-centred, requiring different skills and competences from learners, as well as alternative approaches for instructional design, student support, and provision of resources. Learner autonomy and self-regulated learning (SRL) in online learning settings are considered key success factors that predict student performance (Broadbent, 2017; Papamitsiou & Economides, 2019; Zhao et al., 2014). SRL comprises processes of planning, monitoring, action and reflection (Zimmerman, 1990), and typically focuses on three key features of learners: (1) use of SRL strategies, (2) responsiveness to self-oriented feedback about learning effectiveness, and (3) motivational processes. SRL has been identified as having a direct correlation with students' success (Zimmerman, 1990), including improvements in grades and the development of relevant skills and strategies. Such skills and strategies are needed to become a successful lifelong learner. Earlier research suggests that learners may struggle in online, open and mobile learning environments when they do not use critical SRL strategies (Wong, Baars et al., 2019; Wong, Khalil et al., 2019). Formal, non-formal, and informal online learning settings are constantly evolving and are predominantly accessible through a range of technologies, including those provided by educational institutions (e.g., learning management systems and laptops) and students' own mobile devices (e.g., smartphones). This suggests that, to succeed in their learning, students need to be able to navigate effectively across a range of environments using both institutionally provided and private mobile devices. It is argued that the effective use of SRL would be beneficial in these contexts, although this can be difficult for
M. Khalil (B) Centre for the Science of Learning & Technology, University of Bergen, Bergen, Norway e-mail: [email protected] © The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd. 2022 P. Prinsloo et al. (eds.), Learning Analytics in Open and Distributed Learning, SpringerBriefs in Open and Distance Education, https://doi.org/10.1007/978-981-19-0786-9_5
both learners and educators, particularly when students are learning online and/or independently (Lodge et al., 2018). Studies have shown that educators are challenged in helping students to develop the strategies and skills needed to regulate their learning (Lodge et al., 2018; Yen et al., 2018). Likewise, many students possess poor SRL strategies, including a limited ability to accurately calibrate their own learning processes (Dunlosky & Lipko, 2007). Moreover, without adequate instructional support and effective learning design, students may overestimate their understanding of learning materials, which can then negatively impact the remainder of their learning (Gyllen et al., 2019). Given that increasing numbers of students are spending significant time learning independently in online flexible learning settings, "there is a growing need for understanding [i.e., measuring] and intervening [i.e., supporting] in these environments towards the development of SRL" (Lodge et al., 2018, p. 1). All this suggests that additional learner support is required for SRL development across multifaceted online learning contexts. This chapter introduces a Mobile Multimodal Learning Analytics approach (MOLAM). I argue that the development of student SRL would benefit from the adoption of this approach, which would allow continuous measurement and in-time support of student SRL in online learning contexts.
Background
Although considerable theoretical and conceptual progress has been made with respect to student self-regulation in learning, there has been "little progress in developing methods to make the primary invisible mental regulation processes […] visible and thus measurable and ultimately interpretable" (Noroozi et al., 2019, p. 2). Existing progress largely relates to recent learning analytics approaches that have focused on measuring various aspects of student SRL, frequently based on the availability of learner digital data accessed from learning management systems. However, learners today move continuously across different learning contexts (formal, informal, and non-formal), where they extensively employ their mobile devices (often smartphones) for different purposes, including learning. This type of learning is recognized as mobile learning (m-learning). M-learning "draws on the attributes of enhanced mobility and flexibility that are enabled by portable devices and cloud-based networks" (Palalas & Hoven, 2016, p. 51). The term m-learning has been used inconsistently and with different meanings; it has been criticized for focusing on examining 'things' (i.e., the use of computing devices) rather than on educational problems that would improve learning and achieve learning goals (Grant, 2019, p. 362). In fact, m-learning studies rarely, if ever, report underlying pedagogical and/or theoretical frameworks (Grant, 2019). This chapter aims to fill this gap by suggesting how mobile learning analytics, underpinned by the theoretical lens of SRL (Zimmerman, 1990), might be designed and used to support students' learning in online learning settings.
Definitions of m-learning tend to fall into four, often overlapping, categories. These concern: (1) its relationship to distance education and eLearning, (2) exploitation of devices and technologies, (3) mediation with technology, and (4) the nomadic nature of learners and learning (Grant, 2019). Some scholars argue that such definitions are unhelpful, and that researchers should instead ground their research efforts in the following design characteristics: (1) the learner is mobile; (2) the device is mobile; (3) data services are persistent; (4) content is mobile; (5) the tutor is accessible; (6) physical and network cultures and contexts impact learning or the learner; and (7) the learner is engaged (Grant, 2019, p. 370). Given the importance of supporting learners in their regulation of learning activities across multifaceted online learning environments, an improved understanding of, and follow-up support for, their SRL activities become critical. Despite broad acceptance of the various benefits of learning analytics within open, distance, and distributed educational systems to support improved retention rates and educational practices (see e.g., Khalil & Ebner, 2016b), there are few studies on learning analytics frameworks designed for use with ubiquitous mobile devices and SRL. Earlier research has shown that mobile technologies in education can be advantageous and that mobile apps can enhance students' abilities to self-regulate their learning (Broadbent et al., 2020). Yet the impact of mobile technology on student learning has often been measured through individually stated perceptions rather than the direct use of technology, with some exceptions (e.g., Molenaar et al., 2019; Tabuenca et al., 2015). This is a serious limitation for understanding the transformative nature of learning as a continuous process, and for providing opportunities to give relevant feedback and intervene in real time (i.e., to optimize student learning by improving learner support). The mobile learning analytics field is challenging and promising for practitioners and researchers because of the distinctive features offered by mobile devices. For example, there is a large amount of contextual and temporal learner data that cannot be obtained from web-based systems but can be mined from mobile technologies in use, which in turn affects the data types collected (see e.g., Toninelli & Revilla, 2020). M-learning provides opportunities to collect localized data and information from various learning activities (Tabuenca et al., 2015) continuously taking place across formal and informal learning settings. Mobile learning analytics is then defined as "the collection, analysis and reporting of the data of mobile learners, which can be collected from the mobile interactions between learners, mobile devices and available learning materials" (Aljohani & Davis, 2012, p. 71). This chapter presents a holistic approach known as the Mobile Multimodal Learning Analytics Approach, encompassing learning analytics, m-learning, and SRL, grounded in Zimmerman's (1990) theory and Aljohani and Davis' (2012) framework. It targets three types of stakeholders, namely learners, teachers and researchers, and aims to provide additional insight into designing, implementing and evaluating m-learning scenarios to foster students' SRL strategies, skills and knowledge in online and open learning contexts.
Mobile Multimodal Learning Analytics Approach
The Mobile Multimodal Learning Analytics Approach (MOLAM; Fig. 5.1) is informed by learning design, a "methodology for enabling teachers/designers to make more informed decisions in how they go about designing learning activities and interventions" (Conole, 2012), and by learning analytics, which has shown that learning design can impact students' learning behaviour, satisfaction and outcomes (Holmes et al., 2019). MOLAM can be understood and employed through the lenses of multidisciplinary and multichannel (i.e., data originating from different sources) SRL data research approaches, which provide support for fostering students' SRL across formal, non-formal, and informal online learning contexts. A learner's ability to employ SRL strategies is not static; it alters throughout the learning process (Sedrakyan et al., 2018). Therefore, MOLAM focuses on: (1) the examination of the actual, continuous uses of SRL meta-strategies, strategies and tactics, and (2) the further support accessible through learners' mobile technologies-in-use. Empirical research suggests that there is limited existing learning analytics-based support given to students to foster their SRL in m-learning settings (Matcha et al., 2019). In particular, the authors conclude that most existing SRL support is limited to web-based student-centred learning dashboards (i.e., rich visualizations). This is seen as a critical limitation in terms of learners' access to and use of such web-based tools across formal, non-formal and informal learning contexts, in which learners are constantly moving, and in which they frequently employ various technologies, including their own mobile devices. The design of MOLAM was influenced by Khalil and Ebner's Learning Analytics Lifecycle model (2015) and Park's Pedagogical Framework for Mobile Learning (2011), and it builds on Zimmerman's (1990) SRL model and Winne's (2017) grain-size account of learning analytics for SRL. The Learning Analytics Lifecycle (Khalil & Ebner, 2015) was adapted because it refines three of the early learning analytics models (Chatti et al., 2013; Clow, 2012; Greller & Drachsler, 2012) and provides a solid foundation for employing learning analytics with SRL. Park's (2011) model for m-learning delivers a sound theoretical framework for building mobile applications for the purpose of 'learning-on-the-go'. It is based on Transactional Distance theory, which defines distance not only as geographic separation but, more importantly, as a pedagogical concept (Moore, 1997). Transactional distance is understood as the "interplay of teachers and learners in environments that have the special characteristics of their being spatially separate from one another" (Moore, 2007, p. 91). Park's model comprises individual and social aspects of learning, which fits well with Zimmerman's theoretical SRL model (1990), grounded in a socio-cognitive view of SRL that includes personal, behavioural, and critical environmental classes of influence on self-regulated behaviour (Panadero, 2017). Further development of MOLAM is expected to benefit from the ongoing deployment of new digital tools that are accessible through learners' mobile technologies. The use
of new technologies could: (1) contribute to learner academic success (Broadbent, 2017; Wong, Khalil et al., 2019), and (2) provide access to a new type of multichannel behavioural learner data that might reveal to researchers and practitioners how learners employ different SRL strategies and develop relevant SRL skills and knowledge over time (i.e., a process-oriented view of SRL compared to an accepted static view of SRL). MOLAM, shown in Fig. 5.1, consists of four key mutually constituting parts: (1) learning settings; (2) data; (3) analytics and measurement; and (4) action-support. These are explored further below.
Fig. 5.1 Mobile Multimodal Learning Analytics Approach (MOLAM)
Learning Settings
MOLAM targets three main learning settings: formal, non-formal and informal learning contexts:
• Formal learning typically implies a learning process which happens within an organised and structured context (e.g., school, college and university). It can often lead to some kind of accredited recognition (e.g., diploma, certificate).
• Non-formal learning usually refers to a learning process embedded in planned activities which do not include formal learning elements such as a syllabus, certification and/or accreditation. Examples of non-formal learning include short courses, workshops, and professional development sessions.
• Informal learning is defined as learning resulting from daily activities related to work, family, or leisure. It is sometimes referred to as experiential learning. It tends not to be structured in terms of learning objectives, learning time and/or learning support. Typically, it does not lead to certification. Informal learning may also be intentional (for example, when the learner takes the initiative to participate in a MOOC to learn about a certain topic) (Colardyn & Bjornavold, 2004).
Learners continuously 'move' and produce multifaceted and continuous data, including data produced through use of their own mobile technologies, as well as data related to uses of technologies for learning purposes. Smart mobile devices are extensively used both in non-formal contexts (Barbosa et al., 2016), where the educational process has a flexible curriculum, and in informal learning contexts by enthusiasts (Clough et al., 2008); learners "use them in ways that correspond to the collaborative, contextual and constructivist mobile learning philosophies […] The mobile device enthusiasts had already successfully adopted their devices and had integrated them into their daily lives" (p. 370).
Learner Data MOLAM seeks to gain insight into student learning in a mobile context along with multichannel learner data in a variety of ways, depending on the research questions and the studied context(s). In particular, it may be argued that MOLAM is grounded in a mixed-methods methodology enriched by data from multiple data sources. Sources include: (1) learner activity data, derived from the use of mobile technologies, (2) quantitative ethnography (Shaffer, 2017), and (3) multimodal data collection, the analysis of which has earlier been found beneficial for the understanding of students’ SRL processes (Järvelä et al., 2019). First, it is suggested that specially developed or adapted software/apps - easily accessible through mobile devices (e.g., smartphones and/or tablets)—aimed at explaining and practicing SRL in selected learning settings should be used by learners as a support tool alongside institutionalised studies. Such software needs to integrate specially designed learning analytics module/s that target
different dimensions (e.g., cognitive and affective SRL processes) and phases of SRL (i.e., planning, monitoring and self-evaluation) separately or in combination. This would provide stakeholders with process-oriented, continuous learner activity data, the analysis of which will allow mobile educational technology developers, learning designers, educators and researchers to better understand students' SRL processes over time. Second, MOLAM would benefit from the use of quantitative ethnography, an emerging methodology that integrates quantitative and qualitative methods to assess learning and human meaning-making (Shaffer, 2017). Ethnography underlines the importance of having data that is grounded; ethnographers focus on understanding "what data means to the people who are being studied" (p. 110). According to Shaffer, "culture matters, because while computers can mine in a mountain of data, human beings swim in a sea of significance" (p. 20). This suggests that to be able to adequately interpret statistical SRL learner data (e.g., derived from the analysis of log data and/or more established methods, e.g., surveys), ethnographic qualitative methods of data collection should also be employed to understand the student educational culture, and the culture of their use of mobile technologies. Third, considering that learning practices vary considerably across learning contexts, MOLAM might also involve methods for collecting multimodal learner data (e.g., spatial and proximity data, physiological measurements such as eye movement, electrodermal activity, etc.) that may be accessible from mobile technologies-in-use. Researchers collecting and analysing multimodal (ethnographic) data are concerned with "accounts of cultural and social practices through prolonged fieldwork in a particular setting" (Jewitt et al., 2016, p. 118). For the analysis of complex multimodal datasets, recent computer-assisted tools, such as Qualitative Data Analysis Software (Antoniadou, 2017), could be employed to provide valuable and practical support for complex and time-consuming qualitative research processes. For obtaining relevant multimodal process-oriented or temporal data, mechanisms for collecting multimodal data could be integrated through the use of students' own mobile technologies, allowing researchers to better understand the continuous nature of their SRL processes occurring across learning settings and to suggest related in-time actions aimed at improving learner support and/or teaching SRL. Learning analytics for SRL could contribute to the development of a student-facing or teacher-facing learning dashboard: a digital instrument that can be used to visualise students' SRL processes, based on a multichannel data stream (including student log activity data from the adapted SRL software use and multimodal data), with the overall goal of facilitating the development of students' self-regulation. Making SRL processes continuously visible to learners, teachers and researchers should improve learners' ability to self-regulate their learning.
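To make the idea of multichannel learner data more concrete, the sketch below shows one possible (and deliberately simplified) record structure combining mobile trace events, multimodal samples, and self-report scores. It is an illustrative assumption rather than a prescribed MOLAM data model; the field names, phase labels, and channel names are invented for the example.

```python
# Illustrative only: a possible shape for multichannel SRL learner data.
# Field names, phase labels, and channel names are assumptions for this sketch.
from dataclasses import dataclass, field
from typing import Dict, List


@dataclass
class TraceEvent:
    """A single logged interaction from a mobile SRL support app."""
    learner_id: str          # pseudonymised identifier
    timestamp: str           # ISO 8601, e.g. "2021-03-04T09:12:31Z"
    setting: str             # "formal" | "non-formal" | "informal"
    srl_phase: str           # "planning" | "monitoring" | "self-evaluation"
    action: str              # e.g. "set_goal", "check_progress", "rate_confidence"
    device: str = "smartphone"


@dataclass
class MultimodalSample:
    """A sensor-derived measurement aligned to the same timeline."""
    learner_id: str
    timestamp: str
    channel: str             # e.g. "electrodermal_activity", "eye_tracking"
    value: float


@dataclass
class LearnerRecord:
    """One learner's multichannel record used for later analysis."""
    learner_id: str
    trace_events: List[TraceEvent] = field(default_factory=list)
    multimodal: List[MultimodalSample] = field(default_factory=list)
    self_reports: Dict[str, float] = field(default_factory=dict)  # e.g. survey scales
```

Keeping trace events and multimodal samples on a shared timeline, under a common pseudonymised learner identifier, is what would later allow process-oriented analyses to align the different channels.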
Analytics and Measurement
Data used in MOLAM can vary. Mixed-methods analysis based on SRL theory and process-oriented behavioural data will provide a better understanding of the
complex nature of student SRL processes (Panadero, 2017). There are many quantitative learning analytics methods that can be used to support SRL with mobile-generated data, such as process mining and sequential pattern analysis (e.g., Shabaninejad et al., 2020; Wong, Khalil et al., 2019). Applying other data mining techniques, such as decision trees and neural networks, could help in classifying student types. Other types of analytics, such as location-aware analysis and moving micro-clusters for identifying moving objects (Lu & Tseng, 2009), may also fit the analysis of location data, supporting the provision of effective learning support for self-regulation (Yamada et al., 2017). In the MOLAM proposal, I argue that the analysis of relevant learning activity data (i.e., log data from students' use of specially adapted or developed apps aimed at fostering SRL) and other multimodal data, combined with more established and validated ethnographic methods (e.g., surveys, self-reported data, and observations), offers considerable potential to facilitate new insights into SRL (i.e., to understand it), to visualise it (i.e., to support it), and to use such data in different learning activities to inform students (Noroozi et al., 2019), teachers, and researchers. Thus, the MOLAM approach offers improved support not only for measuring student SRL but also for fostering it, i.e., for optimizing learning and the environments in which it occurs, thereby meeting the ultimate goal of learning analytics (Siemens & Long, 2011).
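As a minimal illustration of the sequential pattern analysis mentioned above, the following sketch counts frequent pairs of consecutive SRL-related actions in mobile log data. The event shape and action names are assumptions for the example, not taken from any cited study.

```python
# A minimal sketch of sequential pattern counting over mobile trace data.
# Event tuples and action names are illustrative assumptions.
from collections import Counter
from itertools import groupby


def frequent_action_bigrams(events, top_n=5):
    """Count consecutive pairs of SRL actions per learner, across all learners.

    `events` is an iterable of (learner_id, timestamp, action) tuples.
    """
    bigrams = Counter()
    # Keep each learner's actions together and in time order.
    ordered = sorted(events, key=lambda e: (e[0], e[1]))
    for _, learner_events in groupby(ordered, key=lambda e: e[0]):
        actions = [action for _, _, action in learner_events]
        bigrams.update(zip(actions, actions[1:]))
    return bigrams.most_common(top_n)


# Example with made-up log data:
log = [
    ("s1", "09:00", "set_goal"), ("s1", "09:05", "study"), ("s1", "09:40", "self_check"),
    ("s2", "10:00", "study"), ("s2", "10:30", "self_check"), ("s2", "10:31", "set_goal"),
]
print(frequent_action_bigrams(log))  # e.g. [(("study", "self_check"), 2), ...]
```

More elaborate sequence or process mining would follow the same logic of ordering each learner's trace and looking for recurring patterns, but with richer models than simple bigram counts.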
Action and Support
The proposed approach targets three types of stakeholders (learners, teachers and researchers) in order to optimize student learning and the environments in which it occurs, as well as to improve support for developing student SRL. Learners, on the one hand, can be supported by specially developed or adapted software aimed at practicing SRL activities. The resulting mobile learning analytics could then aid teachers through the development of a teacher-facing learning dashboard that visualises students' SRL processes at an individual and/or group level. Such a tool could assist teachers not only in understanding students' SRL processes but also in designing and practicing relevant teaching activities aimed at further fostering students' SRL in educational settings and providing adequate support. Finally, to support researchers in tracing and interpreting students' SRL activities through a process-oriented approach, a graphical user interface facilitating data visualisation and processing should be developed, offering new opportunities for researchers to explore learner data. This would contribute to a deeper understanding of the underexplored role of self-regulation in the m-learning research field and to further theoretical development of the SRL research area, supported by the still emerging field of learning analytics.
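A hypothetical aggregation step behind such a teacher-facing dashboard might look like the sketch below, which summarises SRL phase counts at both the individual and the group level; the event shape and phase labels are again assumptions for illustration.

```python
# Hypothetical aggregation behind a teacher-facing SRL dashboard:
# per-learner and group-level counts of SRL phases from trace events.
from collections import Counter, defaultdict


def srl_phase_summary(events):
    """Return per-learner phase counts and a group-level total.

    `events` is an iterable of dicts with 'learner_id' and 'srl_phase' keys.
    """
    per_learner = defaultdict(Counter)
    group_total = Counter()
    for event in events:
        per_learner[event["learner_id"]][event["srl_phase"]] += 1
        group_total[event["srl_phase"]] += 1
    return dict(per_learner), group_total


events = [
    {"learner_id": "s1", "srl_phase": "planning"},
    {"learner_id": "s1", "srl_phase": "monitoring"},
    {"learner_id": "s2", "srl_phase": "monitoring"},
]
per_learner, group = srl_phase_summary(events)
print(group)  # Counter({'monitoring': 2, 'planning': 1})
```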
Privacy Principles and MOLAM
Several privacy and ethical concerns might emerge. According to Khalil and Ebner (2015), the use of learning analytics introduces several issues, namely transparency, accessibility, security, ownership, policy, accuracy, and privacy. In addition, mobile applications run on personal devices on which users store private information. The processing of personal data through mobile apps can pose significant risks to users' security and privacy. For instance, privacy can be breached through the variety of sensors held in smart mobile devices, such as location data (i.e., GPS and GLONASS), accelerometer data, the microphone, the camera, and Wifi; educational apps drawing on these sensors can create unexpected privacy impacts. In the context of MOLAM, privacy, confidentiality, and anonymity remain paramount. Learning analytics may reveal personal information and attitudes, as well as learner activities, which could lead to the identification of individuals by unwanted stakeholders (Khalil & Ebner, 2016a). It should be stressed, then, that the development of mobile applications using MOLAM (and other approaches) should follow national and international frameworks such as the General Data Protection Regulation (GDPR) in the European Union, and the Family Educational Rights and Privacy Act (FERPA) and the Student Privacy Compass in the US. Handling of empirical data, whether generated by prototypes or final products of learning analytics modules, should stipulate rules of transparency and consent, and ensure data protection by design and by default, as well as security of personal data processing (Castelluccia et al., 2017). Learner consent in educational contexts, such as educational mobile apps that fall under MOLAM, should follow the general GDPR framework or its derivatives, such as the blueprint by Muravyeva et al. (2020) and the JISC framework (Sclater, 2017).
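Two of the data-protection-by-design steps discussed above, keyed pseudonymisation of learner identifiers and filtering records by recorded consent, could be sketched as follows. This is an illustration under simplified assumptions, not a compliance recipe, and the record and register structures are invented for the example.

```python
# Illustration of pseudonymisation and consent filtering before analysis.
# The record and consent-register shapes are assumptions for this sketch.
import hashlib
import hmac

SECRET_KEY = b"institution-held secret"  # kept separately from the analytics data


def pseudonymise(learner_id: str) -> str:
    """Replace a learner identifier with a keyed, non-reversible token."""
    digest = hmac.new(SECRET_KEY, learner_id.encode("utf-8"), hashlib.sha256)
    return digest.hexdigest()[:16]


def consented_records(records, consent_register):
    """Keep only records from learners with recorded consent, pseudonymising IDs."""
    return [
        {**record, "learner_id": pseudonymise(record["learner_id"])}
        for record in records
        if consent_register.get(record["learner_id"]) is True
    ]


records = [{"learner_id": "alice", "action": "set_goal"},
           {"learner_id": "bob", "action": "study"}]
print(consented_records(records, {"alice": True, "bob": False}))
```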
Conclusions
SRL refers to how learners steer their own learning (Wong, Khalil et al., 2019). Fostering SRL for students in distance education can be particularly challenging (Andrade & Bunker, 2011). Increasing demands to understand and support learners' SRL activities in open, distance, and distributed systems (including m-learning environments) require further employment of innovative teaching and learning practices. This chapter introduces MOLAM, a model aimed at guiding learners, teachers and researchers who want to develop, successfully employ and/or evaluate learning analytics approaches for mobile learning activities, with the aim of measuring and fostering student SRL in diverse online learning environments. MOLAM is especially valuable for continuous measurement and interventions, thus fostering students'
transferable SRL skills, strategies and knowledge across formal, informal and non-formal online learning settings. This would benefit not only learners' academic success but also their development as successful lifelong learners. In conclusion, it is suggested that:
1. There is a need to measure and support SRL in digital and distance learning to enable learners to take greater control over their own learning, underpinned by sound theoretical models and frameworks. The field of mobile learning analytics has the potential to support this given its powerful multidisciplinary nature.
2. It is anticipated that learning analytics methods will deal with multimodal multichannel data from various dimensions associated with SRL. Although the collection, use and analysis of multimodal multichannel data continues to evolve within learning analytics, researchers should address challenges resulting from instrumentation errors, reliability of measures, experimental designs, and inferences about process data (Azevedo & Gašević, 2019).
3. The application of mobile multimodal learning analytics should be performed with careful integration of relevant support mechanisms and frameworks to protect student privacy and ensure learner agency in online learning settings.
References Aljohani, N., & Davis, H. (2012). Significance of learning analytics in enhancing the mobile and pervasive learning environments. In 2012 Sixth International Conference on Next Generation Mobile Applications, Services and Technologies (pp. 70–74). IEEE. Andrade, M. S., & Bunker, E. L. (2011). The role of SRL and TELEs in distance education: Narrowing the gap. In Fostering self-regulated learning through ICT (pp. 105–121). IGI Global. Antoniadou, V. (2017). Collecting, organizing and analyzing multimodal data sets: The contributions of CAQDAS. In E. Moore & M. Dooly (Eds.), Qualitative approaches to research on plurilingual education (pp. 435–450). Research-publishing.net. https://doi.org/10.14705/rpnet.2017.emmd20 16.640 Azevedo, R., & Gaševi´c, D. (2019). Analyzing multimodal multichannel data about self-regulated learning with advanced learning technologies: Issues and challenges. Computers in Human Behavior, 96, 207–210. Barbosa, D. N., Bassani, P. B., Martins, R. L., Mossmann, J. B., & Barbosa, J. L. (2016). Using mobile learning in formal and non-formal educational settings. In International Conference on Learning and Collaboration Technologies (pp. 269–280). Springer. Broadbent, J. (2017). Comparing online and blended learner’s self-regulated learning strategies and academic performance. Internet and Higher Education, 33, 24–32. Broadbent, J., Panadero, E., & Fuller-Tyszkiewicz, M. (2020). Effects of mobile-app learning diaries vs online training on specific self-regulated learning components. Educational Technology Research and Development. https://doi.org/10.1007/s11423-020-09781-6 Castelluccia, C., Guerses, S., Hansen, M., Hoepman, J. H., van Hoboken, J., & Vieira, B. (2017). Privacy and data protection in mobile applications: A study on the app development ecosystem and the technical implementation of GDPR. https://www.enisa.europa.eu/publications/privacyand-data-protection-in-mobile-applications/at_download/fullReport
Chatti, M. A., Dyckhoff, A. L., Schroeder, U., & Thüs, H. (2013). A reference model for learning analytics. International Journal of Technology Enhanced Learning, 4(5–6), 318–331. Clow, D. (2012). The learning analytics cycle: Closing the loop effectively. In Proceedings of the 2nd International Conference on Learning Analytics and Knowledge (pp. 134–138). ACM. https:// doi.org/10.1145/2330601.2330636 Clough, G., Jones, A. C., McAndrew, P., & Scanlon, E. (2008). Informal learning with PDAs and smartphones. Journal of Computer Assisted Learning, 24(5), 359–371. Colardyn, D., & Bjornavold, J. (2004). Validation of formal, non-formal and informal learning: Policy and practices in EU member states. European Journal of Education, 39(1), 69–89. Conole, G. (2012). Designing for learning in an open world (Vol. 4). Springer Science & Business Media. Dunlosky, J., & Lipko, A. R. (2007). Metacomprehension a brief history and how to improve its accuracy. Current Directions in Psychological Science, 16, 228–232. Grant, M. M. (2019). Difficulties in defining mobile learning: Analysis, design characteristics, and implications. Educational Technology Research and Development, 67(2), 361–388. Greller, W., & Drachsler, H. (2012). Translating learning into numbers: A generic framework for learning analytics. Educational Technology & Society, 15(3), 42–57. Gyllen, J. G., Stahovich, T. F., Mayer, R. E., Darvishzadeh, A., & Entezari, N. (2019). Accuracy in judgments of study time predicts academic success in an engineering course. Metacognition and Learning, 14(2), 215–228. Holmes, W., Nguyen, Q., Zhang, J., Mavrikis, M., & Rienties, B. (2019). Learning analytics for learning design in online distance learning. Distance Education, 40(3), 309–329. Järvelä, S., Malmberg, J., Haataja, E., Sobosincki, M., & Kirschner, P. (2019). What multimodal data can tell us about the self-regulated learning process? Learning and Instruction. https://doi. org/10.1016/j.learninstruc.2019.04.004 Jewitt, C., Bezemer, J., & O’Halloran, K. (2016). Introducing multimodality. Routledge. Khalil, M., & Ebner, M. (2015). Learning analytics: Principles and constraints. In EdMedia+ Innovate learning (pp. 1789–1799). Association for the Advancement of Computing in Education (AACE). Khalil, M., & Ebner, M. (2016a). De-identification in learning analytics. Journal of Learning Analytics, 3(1), 129–138. Khalil, M., & Ebner, M. (2016b). What massive open online course (MOOC) stakeholders can learn from learning analytics? Learning, design, and technology: An international compendium of theory, research, practice, and policy (pp. 1–30). https://doi.org/10.1007/978-3-319-177274_3-1 Lodge, J., Panadero, E., Broadbent, J., & de Barba, P. (2018). Supporting self-regulated learning with learning analytics. Learning analytics in the classroom: Translating learning analytics for teachers. https://doi.org/10.4324/9781351113038-4 Lu, E. H. C., & Tseng, V. S. (2009). Mining cluster-based mobile sequential patterns in locationbased service environments. In 2009 Tenth International Conference on Mobile Data Management: Systems, Services and Middleware (pp. 273–278). IEEE. https://doi.org/10.1109/MDM. 2009.40 Matcha, W., Uzir, N. A., Gaševi´c, D., & Pardo, A. (2019). A systematic review of empirical studies on learning analytics dashboards: A self-regulated learning perspective. In IEEE Transactions on Learning Technologies, 13(2), 226–245. https://doi.org/10.1109/TLT.2019.2916802 Molenaar, I., Horvers, A., & Baker, R. (2019). 
What can moment-by-moment learning curves tell about students’ self-regulated learning? Learning and Instruction. https://doi.org/10.1016/j.lea rninstruc.2019.05.003 Moore, M. (1997). Theory of transactional distance. In D. Keegan (Ed.), Theoretical principles of distance education (pp. 22–38). Routledge Studies in Distance Education. Moore, M.(2007). The theory of transactional distance. In M. G. Moore (Ed.), Handbook of distance education (pp. 89–105). Lawrence Erlbaum Associates.
Muravyeva, E., Janssen, J., Specht, M., & Custers, B. (2020). Exploring solutions to the privacy paradox in the context of e-assessment: Informed consent revisited. Ethics and Information Technology, 1–16. https://doi.org/10.1007/s10676-020-09531-5 Noroozi, O., Alikhani, I., Järvelä, S., Kirschner, P., Seppänen, T., & Juuso, I. (2019). Multimodal data to design visual learning analytics for understanding regulation of learning. Computers in Human Behavior. https://doi.org/10.1016/j.chb.2018.12.019 Palalas, A., & Hoven, D. (2016). Emerging pedagogies for MALL. In P. Agnieszka & A. Mohamed (Eds.), The international handbook of mobile-assisted language learning (pp. 44–85). China Central Radio & TV University Press. Panadero, E. (2017). A review of self-regulated learning: Six models and four directions for research. Frontiers in Psychology, 8(422), 1–28. Papamitsiou, Z., & Economides, A. (2019). Exploring autonomous learning capacity from a self-regulated learning perspective using learning analytics. British Journal of Educational Technology, 50(6), 3138–3155. Park, Y. (2011). A pedagogical framework for mobile learning: Categorizing educational applications of mobile technologies into four types. The International Review of Research in Open and Distributed Learning, 12(2), 78–102. Sclater, N. (2017). Consent for learning analytics: Some practical guidance for institutions [Blog post]. Jisc. Retrieved from https://analytics.jiscinvolve.org/wp/2017/02/16/consent-for-learninganalytics-somepractical-guidance-for-institutions/ Sedrakyan, G., Malmberg, J., Verbert, K., Järvelä, S., & Kirschner, P. (2018). Linking learning behavior learning analytics and learning science concepts: Designing a learning analytics dashboard for feedback to support learning regulation. Computers in Human Behavior. https://doi. org/10.1016/j.chb.2018.05.004 Shabaninejad, S., Khosravi, H., Leemans, S. J., Sadiq, S., & Indulska, M. (2020). Recommending insightful drill-downs based on learning processes for learning analytics dashboards. In International Conference on Artificial Intelligence in Education (pp. 486–499). Springer. Shaffer, D. W. (2017). Quantitative ethnography. Cathcart Press. Siemens, G., & Long, P. (2011). Penetrating the fog: Analytics in learning and education. EDUCAUSE Review, 46(5). Retrieved from https://er.educause.edu/-/media/files/article-downlo ads/erm1151.pdf Tabuenca, B., Kalz, M., Drachsler, H., & Specht, M. (2015). Time will tell: The role of mobile learning analytics in self-regulated learning. Computers & Education, 89, 53–74. Toninelli, D., & Revilla, M. (2020). How mobile device screen size affects data collected in web surveys. Advances in questionnaire design, development, evaluation and testing (pp. 349–373). Wiley. Winne, P. H. (2017). Learning analytics for self-regulated learning. Handbook of learning analytics (pp. 241–249). https://doi.org/10.18608/hla17.021 Wong, J., Baars, M., de Koning, B., van der Zee, T., Davis, D., Khalil, M., Houben, G-J., & Paas, F. (2019). Educational theories and learning analytics: From data to knowledge. In Utilizing learning analytics to support study success (pp. 3–25). Springer. Wong, J., Khalil, M., Baars, M., de Koning, B., & Paas, F. (2019). Exploring sequences of learner activities in relation to self-regulated learning in a massive open online course. Computers & Education, 140, 103595. https://doi.org/10.1016/j.compedu.2019.103595 Yamada, M., Shimada, A., Okubo, F., Oi, M., Kojima, K., & Ogata, H. (2017). 
Learning analytics of the relationships among self-regulated learning, learning behaviors, and learning performance. Research and Practice in Technology Enhanced Learning, 12(1), 13. Yen, M. H., Chen, S., Wang, C. Y., Chen, H. L., Hsu, Y. S., & Liu, T. C. (2018). A framework for self-regulated digital learning (SRDL). Journal of Computer Assisted Learning, 34(5), 580–589. Zhao, H., Chen, L., & Panda, S. (2014). Self-regulated learning ability of Chinese distance learners. British Journal of Educational Technology, 45(5), 941–958. Zimmerman, B. J. (1990). Self-regulated learning and academic achievement: An overview. Educational Psychologist, 25(1), 3–17.
Mohammad Khalil is a senior researcher and lecturer in learning analytics at the Centre for the Science of Learning & Technology (SLATE) at the Faculty of Psychology, University of Bergen, Norway. Mohammad has a master's degree in information security and digital criminology and a Ph.D. from Graz University of Technology on learning analytics in Massive Open Online Courses (MOOCs). Khalil has rich international experience, having worked in four different countries since 2015. He has published over 50 articles on learning analytics in high-standard and well-recognized journals and academic conferences, focusing on understanding and improving student behavior and engagement in digital learning platforms using data science. His current research focuses on learning analytics in Open and Distance Learning (ODL), self-regulated learning, mobile learning, visualizations and gamification, as well as privacy and ethics. His personal website is: http://mohdkhalil.wordpress.com.
Chapter 6
Designing a Social Learning Analytics Tool for Open Annotation and Collaborative Learning
Jeremiah H. Kalir
Writing on writing is both literally and metaphorically an important part of the way meaning is negotiated. —Brown and Duguid (1996)
Introduction Imagine this book annotated. Picture probable marginalia, marks of readers’ attention, perhaps even the stain of a coffee cup. This chapter examines why the social and technical practice of annotation—and, specifically, annotation that accompanies digital and openly accessible texts—is relevant to the development of learning analytics in open, flexible, and distance learning (OFDL; Zhang et al., 2019). The probable social future of this book, as an academic text written for specialists and novices, includes its likely annotation in coursework, by research teams, and because the coupling of annotation and citation propels forward the production of scholarship (Fajkovic & Björneborn, 2014; Marshall, 1997). Annotation is the addition of a note to a text. Whether written by hand as marginalia or composed using digital technology, the social life of this book will be recursively authored by students, scholars, designers, and educators who may share both this book and their annotation in conversation with one another, enabling collaborative forms of reading, writing, and meaning-making (Liu, 2006; Reid, 2019). When annotated, the text of this book will become a context for discussion, analysis, and shared inquiry. Should multimodal notes be added to a digital or online version of this book, this annotation may encourage social participation and discourse. Social annotation is a genre of learning technology that enables information sharing, peer interaction, collaboration, and the production of new knowledge J. H. Kalir (B) University of Colorado Denver, Denver, CO, USA e-mail: [email protected] © The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd. 2022 P. Prinsloo et al. (eds.), Learning Analytics in Open and Distributed Learning, SpringerBriefs in Open and Distance Education, https://doi.org/10.1007/978-981-19-0786-9_6
(Novak et al., 2012). Moreover, should interactive notes be directly tethered to an online version of this text, social annotation will create the conditions for this book and its chapters to function as an anchored environment (Gao et al., 2013) whereby learners can collectively develop familiarity with new ideas and construct new knowledge (e.g., Chen, 2019). Annotation has long been associated with education (Adler, 1940; Brown & Smiley, 1978; Wolfe, 2002). Today, advances in social annotation technology and pedagogical experimentation among anchored discursive environments have inspired new forms of collaborative reading, sense-making, and open learning (Kalir & Garcia, 2019; Mirra, 2018; Sprouse, 2018). Collaborative activity mediated by social annotation—and whether anchored to this book, or to texts like ebooks, PDFs, blog posts, and open textbooks—will generate digital information that may be gathered, analyzed, reported, and interpreted as discourse data (e.g., Sun & Gao, 2017). Notably, educational researchers interested in online discussion and collaborative learning are developing new designs and methods that have proven useful in establishing connections among social annotation, discourse data, and learning analytics. For example, Chen’s (2019) study of “networked discourse” examined social annotation data to describe the ways in which “students were engaged in sensemaking, sensegiving, problem-solving, artifact creation, and deepening discussion around knowledge artifacts” (p. 199). Plevinski and colleagues’ (2017) investigation of undergraduate students’ social annotation combined descriptive statistics with content analysis and found that elaboration and interpretation were common knowledge construction activities. Studies of educators’ social annotation have utilized methods like social network analysis (Kalir & Perez, 2019) to detail discourse patterns in openly-networked and interestdriven professional learning. When texts like this book anchor social annotation, it is feasible for researchers to derive insight about group activity, meaning-making, and collaboration through the analysis of discourse data. The annotation of this book is, of course, a probable though hypothetical scenario. Nevertheless, this scenario is pertinent to the relationship between learning analytics and OFDL for it raises a number of broad questions. First, how can advances in open technology shape the ways in which people read, write, and make meaning together through their engagement with shared texts? Second, might open discourse data, interpreted and made actionable as social learning analytics (Buckingham Shum & Ferguson, 2012), help to describe and encourage collaborative activity? Third, how can the public reporting of social learning analytics—in partnership with learners and other stakeholders—encourage sustained collaboration? Fourth, how might data systems reporting social learning analytics strengthen OFDL infrastructures? These questions are raised not to guide a novel empirical study but are shared as generative queries that have influenced the design and research of an interdisciplinary team working at the intersection of social annotation, learning analytics, and OFDL. 
Since 2017, our team—a group with expertise in the learning sciences, data sciences, technology design, and teacher education—has collaborated with multiple stakeholders to iterate a public dashboard that visualizes openly accessible discourse data generated via social annotation as social learning analytics (e.g., Kalir, 2020). The creation of any particular learning analytics tool is not without concern, and
there is a need to critically question the purpose and development of such technologies (Perrotta & Williamson, 2018; Selwyn, 2019), advance designs grounded in learning theory (Matcha et al., 2019), and describe "complex narratives" because "we [learning analytics scholars] are not aware of intimate, first-hand accounts of the building experience of educational data infrastructures" (Johanes & Thille, 2019, p. 2970). In response, this chapter recounts one approach to ethically co-designing a public dashboard that reports social learning analytics and encourages learners' collaborative annotation across open texts and contexts. As a contribution to "designerly work" in the learning sciences (Svihla & Reeve, 2016), this chapter is a reflective, first-hand account organized around three related objectives:
1. Naming the theoretical stances toward open and social learning that informed design and research;
2. Describing key decisions and trade-offs pertinent to four iterations of a social learning analytics dashboard; and
3. Considering epistemological, technological, and infrastructural implications for the development and use of social learning analytics in OFDL.
Open and Social: Activity, Annotation, and Analytics The design and research efforts detailed in this chapter reflect insight from learning theory about the relationships among open and social activity, annotation, and analytics. Amidst myriad interpretations of what counts as “open” in education and related professional and political contexts (Conrad & Prinsloo, 2020; Pomerantz & Peek, 2016), OFDL may be theoretically grounded in a broader conception of learning as participation in social activity (George, 1995; Lave & Wenger, 1991). People learn through participation in authentic and cultural practices across a range of everyday and academic settings (National Academies of Sciences, Engineering, and Medicine, 2018). Learning processes, like collaboration and meaning-making, may be understood as group-level processes (Stahl, 2017) rather than as an indicator of individual accomplishment. Moreover, because learning is situated in both everyday and designed contexts, groups rely upon their use of tools, shared resources, and distributed cognition to engage in joint activity (e.g., Slakmon & Schwarz, 2017). From this perspective, OFDL is conceived of as access to and participation in authentic social activity among and across networks comprised of people, shared technologies, and material and ideational resources that enable negotiation, meaningmaking, and knowledge construction (e.g., Bali & Caines, 2018; West-Puckett et al., 2018). As noted, social annotation is one example of open and collaborative activity that demonstrates group-level learning both within and beyond the boundaries of formal educational institutions (e.g., Siemens et al., 2017). In one respect, the open and social
qualities of annotation resemble how scholars have described open educational practices as accessible, networked, transparent, and relevant to collaborative academic activities like teaching and publication (Cronin & MacLaren, 2018). Alternatively, the open and social qualities of annotation echo critical literacy practices (Ávila & Pandya, 2013) including, specifically, the expansive possibilities of writing: “Writing serves many purposes: to exchange ideas, explain positions, critique perspectives, question values, establish points of view, and reflect on beliefs that may contradict other people’s beliefs” (Kinloch, 2010, p. 44). Yet again, it is also useful to perceive the open and social attributes of annotation as reflecting connected learning (WestPuckett et al., 2018) whereby a group that adds notes to a common text evidences affinity among networked opportunities, supportive relationships, and shared interests (e.g., Kalir & Garcia, 2019). Together, these perspectives contribute to a more robust theoretical understanding of how annotation enables open and social learning. Tracing linkages from activity to annotation to analytics, it is important to note that the frequent association of learning analytics with students and schooling has elicited skepticism and critique (e.g., Broughan & Prinsloo, 2019; Rubel & Jones, 2016). Nonetheless, acknowledging open and collaborative activity beyond institutional boundaries can motivate an appreciation for the promise of social learning analytics (SLA). Buckingham Shum and Ferguson (2012) define SLA as a “subset” of learning analytics relevant to participatory online cultures and resonant with the previously described socio-cultural perspectives on learning. Specifically, SLA are intended to help describe and make actionable group processes like collaboration: “The focus of social learning analytics is on processes in which learners are not solitary, and are not necessarily doing work to be marked, but are engaged in social activity, either interacting directly with others… or using platforms in which their activity traces will be experienced by others” (p. 5). Research by Chen and colleagues (2018), for example, detailed opportunities and challenges when implementing a SLA toolkit intended to foster undergraduate students’ social interaction and conceptual engagement. Outside a formal course context, Gruzd and colleagues (2016) have demonstrated methods for collecting, analyzing, and visualizing social forms of learning analytics in order to “open up possibilities for understanding designed and emergent online learning practices as supported through social media” (p. 65). A value of SLA is the creation of tools and methods that make both accessible and actionable group-level activity which occurs in participatory online cultures, whether associated with social media or social annotation, in order to encourage ongoing joint activity. Together, a complementary stance toward open and social activity, annotation, and analytics suggests the addition of notes to a text—like this book, and whether in print or digital form—can be understood as authentic activity relevant to group-level discourse, interpretation, negotiation of meaning, and knowledge construction. The open and social annotation of a digital text is also reflective of group-level educational and literacy practices that are collaborative, critical, and connected. 
Furthermore, participatory patterns and discourse data associated with a digital text’s open and social annotation can be reported as SLA to better understand group processes, make
accessible activity patterns, and encourage learners’ ongoing interaction, meaningmaking, and the production of new knowledge.
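As one hedged illustration of how open annotation discourse data might be turned into simple SLA, the sketch below builds a directed reply network from annotation records and counts each participant's contributions. The record fields (id, author, references) are assumptions for the example, not a specification of any particular annotation platform.

```python
# Sketch: turning annotation discourse data into simple social learning analytics.
# The annotation record shape is an illustrative assumption.
from collections import Counter
import networkx as nx


def annotation_reply_network(annotations):
    """Build a directed reply graph: an edge A -> B means A replied to B's note."""
    by_id = {a["id"]: a for a in annotations}
    graph = nx.DiGraph()
    participation = Counter(a["author"] for a in annotations)
    for a in annotations:
        graph.add_node(a["author"])
        for parent_id in a.get("references", []):
            parent = by_id.get(parent_id)
            if parent is not None:
                graph.add_edge(a["author"], parent["author"])
    return graph, participation


annotations = [
    {"id": "n1", "author": "educator_a", "references": []},
    {"id": "n2", "author": "educator_b", "references": ["n1"]},
    {"id": "n3", "author": "educator_c", "references": ["n1"]},
]
graph, counts = annotation_reply_network(annotations)
print(nx.degree_centrality(graph))  # who is most connected in the conversation
```

Group-level measures of this kind, such as degree centrality or reply counts, can then be visualised to make participation patterns accessible to learners and facilitators.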
Reporting Crowd Annotation, Encouraging Discourse Layers
The remainder of this chapter synthesizes commitments from learning theory to provide a first-hand account of the development of a public SLA dashboard that reports group-level social annotation and encourages open learners' discourse and collaboration. Social annotation does enable group activity in formal course contexts that may exhibit open qualities (e.g., Chen, 2019). However, our design and research efforts have been motivated by, and remain responsive to, a public, openly-networked, and interest-driven learning initiative facilitated online and made possible by contributions from educational stakeholders. Rather than retrace a linear narrative, this chapter highlights key opportunities, complexities, and trade-offs that characterize how we sought to productively support intersections among social annotation, open learning, and learning analytics.
Design Context

First-hand accounts can be valuable because they showcase designers navigating technical decisions while also describing how “builders are aware of and engage with the epistemological, methodological and ethical aspects of infrastructures” (Johanes & Thille, 2019, p. 2960). This account draws from and combines design documentation, stakeholder feedback cycles during prototyping, focus groups with active communities, ongoing issue and feature tracking, and public responses elicited from blog posts and social media. Since 2016, the Marginal Syllabus has sparked and sustained openly accessible conversation about educational equity through collaborative technologies and partnerships (Kalir, 2020). Project partners—including the National Writing Project, the National Council of Teachers of English, and the web annotation organization Hypothesis—have created a professional learning initiative that invites K-12 and post-secondary educators to: (a) read and discuss academic scholarship featuring perspectives that are marginal to dominant education norms; (b) engage publicly with one another among marginal digital spaces as mediated by social annotation; and (c) document professionally-relevant meaning and knowledge using an open-source social annotation tool (Hypothesis) that is marginal to commercial educational technology. Figure 6.1 shows a representative scholarly text functioning as a discursive context, with Hypothesis anchoring educators’ social annotation during Marginal Syllabus activity.
Fig. 6.1 Open access text annotated using Hypothesis social annotation tool
Akin to an online book club or study group, the Marginal Syllabus “promote[s] transformative learning as dialogue” (Bali & Caines, 2018, p. 14) and demonstrates what Greenhow and colleagues (2019) describe as the “social scholarship of teaching,” or an opportunity for educators to share their knowledge and produce new meaning about teaching through social and collaborative media practices. The Marginal Syllabus’ distinctive approach to public engagement with open access scholarship for the purposes of collaborative learning has been aptly captured by one partner author whose research was annotated in 2017:

Through this initiative… I was given the opportunity to imagine a different paradigm for conducting, consuming, and responding to research—one in which study findings become the start rather than the end of dialogue and in which diverse forms of expertise extend, refute, and re-mix the knowledge production process for the common purpose of making education more equitable and culturally sustaining. (Mirra, 2018, p. 30)
To date, the Marginal Syllabus has facilitated over four dozen public conversations which have elicited contributions from over 650 educators who have authored nearly 8,000 open annotations (these descriptive statistics do not include participation and annotation counts associated with closed groups; anecdotal evidence suggests Marginal Syllabus texts are regularly annotated privately in the context of teacher education courses). As a result, public activity from educators’ social annotation authored during Marginal Syllabus activity comprises a sizable corpus of growing discourse data. During the 2017–2018 academic year, a need emerged to develop additional context-sensitive technical and social supports that could encourage open and ongoing participation in the Marginal Syllabus’ annotation conversations. Further,
shared interest in open (educational) principles and technologies (e.g., Pomerantz & Peek, 2016), as well as methods to report and visualize emergent social learning (e.g., Gruzd et al., 2016), motivated the design of a learning analytics tool responsive both to the open scholarship annotated during Marginal Syllabus activity and to any document annotated using Hypothesis anywhere on the web. Similarly, we endeavored to design a tool that could easily be incorporated into open learning environments, like the Marginal Syllabus, as well as more formal course contexts given the use of social annotation in school. Stakeholder feedback indicated that it would be beneficial to design a tool that simultaneously encouraged educators’ participation in their own open learning while also modeling how discourse data could be visualized and made actionable should these same educators incorporate social annotation activities into their courses and curricula.
Dashboard Traits and Design Trade-offs

The first public iteration of a tool for Capturing and Reporting Open Web Data for Learning Analytics, Annotation, and Education Researchers (CROWDLAAERS; pronounced “crowd layers”; crowdlaaers.org) was released in early 2018. CROWDLAAERS is a SLA dashboard that visualizes group-level (or “crowd”) activity with the intention of encouraging additional social annotation (or “layers”). At its core, the technical capability of CROWDLAAERS to capture and report open web data is made possible by querying the Hypothesis API; as a standards-based and interoperable technology, Hypothesis makes available open data and metadata associated with social annotation. Figure 6.2 is an image of the current CROWDLAAERS dashboard highlighting the interactive threads visualization (similar data tables for annotations, participants, documents, days, and tags are collapsed or not pictured), and Fig. 6.3 is an image of closed-group activity across multiple documents.
Fig. 6.2 CROWDLAAERS visualizing open and social annotation threads
Fig. 6.3 CROWDLAAERS visualizing closed-group social annotation across multiple documents
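Although the dashboard code itself is not reproduced in this chapter, the query-on-demand pattern just described can be sketched briefly. The example below, written in Python purely for illustration, requests public annotations for a single document from the Hypothesis search API and summarizes a few group-level measures in memory without writing anything to disk. The endpoint and parameters reflect the publicly documented Hypothesis API; the example document URL and helper names are hypothetical rather than drawn from CROWDLAAERS.

```python
import requests

HYPOTHESIS_SEARCH = "https://api.hypothes.is/api/search"

def fetch_public_annotations(document_url, limit=200):
    """Request public Hypothesis annotations anchored to one document.

    Nothing is persisted; records are held in memory only, mirroring a
    no-storage, query-at-view-time design.
    """
    params = {"uri": document_url, "limit": limit}
    response = requests.get(HYPOTHESIS_SEARCH, params=params, timeout=30)
    response.raise_for_status()
    return response.json().get("rows", [])

def summarize_crowd_layers(annotations):
    """Aggregate simple group-level ('crowd') measures from annotation metadata."""
    participants = {a.get("user") for a in annotations}
    active_days = {a.get("created", "")[:10] for a in annotations if a.get("created")}
    threads = [a for a in annotations if not a.get("references")]   # top-level notes
    replies = [a for a in annotations if a.get("references")]       # threaded replies
    return {
        "annotations": len(annotations),
        "participants": len(participants),
        "active days": len(active_days),
        "threads": len(threads),
        "replies": len(replies),
    }

if __name__ == "__main__":
    rows = fetch_public_annotations("https://example.com/annotated-article")
    print(summarize_crowd_layers(rows))
```

Because each returned record also carries a link back to the annotation in context, a summary like this can preserve the re-entry affordance described below, pointing users from a dashboard view back into the annotated document itself.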
Each of four CROWDLAAERS iterations has sought to more effectively summarize, visualize, and present elements of open data and metadata as SLA relevant to understanding group annotation and encouraging subsequent discourse. Prior to the second iteration, for example, stakeholder recommendations informed two decisions: first, to feature an interactive table of all annotation content associated with a given document; and second, to access via this table each annotation in context, thereby providing a direct connection back to the annotated text and lowering barriers of reentry for ongoing social activity (this latter feature is possible because each Hypothesis annotation is a distinct URL, provisioning a linkage between dashboard and document). Feedback from project stakeholders motivated a third iteration focused on searching and sharing that broadened the dashboard’s public accessibility and relevance, consequently shifting the focus from texts annotated during Marginal Syllabus activity to any text annotated with Hypothesis. By visualizing open and social annotation anywhere on the web, CROWDLAAERS was adopted by both open learning projects and K-12 and post-secondary educators using Hypothesis in their courses. Feature requests from this broader user community prompted a fourth iteration and two improvements: access to closed-group activity given that Hypothesis social annotation in course contexts is frequently private; and multi-document visualization as both open learning and formal course activities often include learners reading and collaboratively annotating multiple texts. Throughout these iterations, dashboard improvements have not dismissed design tensions associated with learning analytics and ethics (Slade & Prinsloo, 2013). As noted, CROWDLAAERS gathers open annotation data by querying the Hypothesis API. However, CROWDLAAERS was purposefully designed as a lightweight and server-less tool that does not store any Hypothesis data. By not storing
annotation data, we have intentionally limited the dashboard’s technical capability to complete more advanced analyses of collaborative activity and visualize additional forms of SLA (i.e., social networks and how those networks change over time). Our decision is intended to respect people’s ownership of their annotation data yet also results in a dashboard that only reports real-time SLA. If someone makes their public Hypothesis annotations private or deletes their annotations altogether, then CROWDLAAERS cannot capture or report that data because the data have never been stored anywhere. This trade-off favors people’s data rights over the dashboard’s potential analytic insights. Another convergence of dashboard traits and design trade-offs emerged throughout the 2018–2019 academic year. During this period, CROWDLAAERS was adopted by multiple open learning projects beyond the Marginal Syllabus, including: Equity Unbound, an open curriculum guiding participants to read about and discuss topics like empathy, privacy, and digital wellbeing; a public and collaborative annotation of Augmenting Human Intellect: A Conceptual Framework (Engelbart, 1962) in partnership with the Doug Engelbart Institute; and the Right to Learn Dignity Lab (R2L), a university-based group studying case law to understand how concepts of dignity and equality are fundamental to the right of personhood. In addition to supporting CROWDLAAERS use by these projects, faculty at five different North American institutions were also assisted as they leveraged the dashboard and piloted a multi-document “course collections” prototype to gain insight into their students’ annotation patterns, reading comprehension, and engagement with discipline-specific texts. The R2L use case was a notable collaboration that helped advance how CROWDLAAERS supported group-level annotation across multiple documents. At the time, R2L was studying educational dignity in four landmark American court cases that required members of the group—often working asynchronously and across continents—to read together and collaboratively annotate each case’s complaints, amicus briefs, oral arguments, and final opinions. This corpus of 52 documents totaled over 2,000 pages; accordingly, “Hypothesis was our tool of choice because of its capacity to function as a digital historian of our thinking” (University of Colorado Denver, 2020). CROWDLAAERS enabled R2L to pilot multi-document workflows that would be refined into the current feature allowing access to SLA from closed-group activity across all the texts annotated by a group (Fig. 6.3). Throughout this intensive stage of partnership and iteration, design decisions became responsive—almost exclusively—to the requests of stakeholders and active communities. For example, two of CROWDLAAERS’ interactive tables (documents and threads) came to include a hover-over feature that displays a summary of recent participants and the date of the latest annotation. This change better informed users about what specific SLA to select and examine at a more granular level. All six interactive tables were also designed so that each could be collapsed, expanded, and filtered to minimize information overload and aid focus on a particular set of SLA. An “in thread” feature was added to more easily filter the annotations table in order to highlight a single set of interactions. These features were not the result of creative technical innovation but, more importantly, honored emergent stakeholder needs.
Incorporating these changes into CROWDLAAERS underscored how stakeholders’ knowledge about tool use in actual OFDL environments was as important a form of expertise as designers’ technical know-how. Creating new dashboard attributes accompanied a shift in role and responsibility—from sharing a public-facing tool to facilitating co-design processes supportive of specific educational communities. As another indicator of commitments to open learning processes and groups, the CROWDLAAERS code was shared on GitHub, making the dashboard open-source software and a contribution to the open web.
Conclusion

When this book is read and annotated by a group to aid collaborative meaning-making and learning, it is feasible for certain types of learning analytics methods and tools to productively enable open and social activity. The creation of CROWDLAAERS reflects intertwined epistemological and technical commitments relevant to future design and research should readers—as annotators of this book and other texts—work at the convergence of learning analytics and open learning. Such future efforts can draw upon and extend three key findings. First, as one response to opportunities identified in the literature (e.g., Johanes & Thille, 2019), this chapter has modeled a theoretically-grounded, first-hand account of a learning analytics dashboard developed for use in OFDL contexts. Design decisions underscore the importance of creating technology that focuses on and makes actionable SLA for ongoing group activity in context (e.g., Chen et al., 2018), rather than analytics tools that entrench tracking, measurement, and evaluation of individual performance irrespective of group interaction or sociopolitical concern (Perrotta & Williamson, 2018; Selwyn, 2019). Furthermore, and in light of the connectedness of many open learning initiatives, this chapter has also made an argument regarding the benefits of co-designing tools—like a public SLA dashboard—alongside stakeholders whose partnership informs ethical stances, guides technical decisions, and motivates iterative improvements that meet actual learning needs. Second, advances in learning analytics need not be constrained by or exclusively concerned with conceptions of knowing that privilege schooling, individual students, or solitary cognition. On the contrary, receptiveness to open learning opportunities should motivate tool development and research agendas that honor how knowledge is socially situated and openly-networked, how discourse and activity emerge from and occur because of participatory group processes, and how analytic insight should reflect knowledge construction in context (Buckingham Shum & Ferguson, 2012; Gruzd et al., 2016). To usefully document and make actionable insight about the social production of knowledge also means partnering to produce tools which reify activity traces, discourse patterns, and group-level processes—such as those mediated by open and social annotation—that are comprehensible to and in service of stakeholder groups.
Third, this chapter suggests it may be productive to build open technology that supports open learning while also strengthening the infrastructure of the open web. Annotation is an infrastructural element of the open web (Whaley, 2017). The Hypothesis organization helped galvanize the creation of a standardized and interoperable annotation data model while simultaneously making publicly available a tool that enables social annotation anywhere online. As described, Hypothesis annotation has been embraced by learners, educators, and researchers across a range of OFDL environments (Chen, 2019; Sprouse, 2018). In response, an open-source SLA dashboard can take advantage of these emerging socio-technical arrangements and activities. Future SLA tools aligned with this annotation standard can iterate and improve the CROWDLAAERS dashboard, while also extending the ways in which open annotation data function as an accessible and useful form of learning analytics. This first-hand account is one small but concrete contribution to promising efforts intended to encourage learners’ sustained collaboration across open texts and contexts while mutually architecting publicly available and reconfigurable open learning infrastructure.
References Adler, M. (1940). How to mark a book. The Saturday Review, 11–12. Ávila, J., & Pandya, J. (Eds.). (2013). Critical digital literacies as social practice: Intersections and challenges. Peter Lang. Bali, M., & Caines, A. (2018). A call for promoting ownership, equity, and agency in faculty development via connected learning. International Journal of Educational Technology in Higher Education, 15(46), 1–24. Broughan, C., & Prinsloo, P. (2019). (Re)centring students in learning analytics: In conversation with Paulo Freire. Assessment & Evaluation in Higher Education, 45(4), 617–628. Brown, A., & Smiley, S. (1978). The development of strategies for studying texts. Child Development, 49(4), 1076–1088. Brown, J. S., & Duguid, P. (1996). The social life of documents. First Monday, 1(1). Retrieved from https://firstmonday.org/article/view/466/387 Buckingham Shum, S., & Ferguson, R. (2012). Social learning analytics. Journal of Educational Technology & Society, 15(3), 3–26. Chen, B. (2019). Designing for networked collaborative discourse: An UnLMS approach. TechTrends, 63(2), 194–201. Chen, B., Chang, Y. H., Ouyang, F., & Zhou, W. (2018). Fostering student engagement in online discussion through social learning analytics. The Internet and Higher Education, 37, 21–30. Conrad, D., & Prinsloo, P. (Eds.). (2020). Open(ing) education: Theory and practice. Brill USA. Cronin, C., & MacLaren, I. (2018). Conceptualising OEP: A review of theoretical and empirical literature in Open Educational Practices. Open Praxis, 10(2), 127–143. Engelbart, D. (1962). Augmenting human intellect: A conceptual framework. Stanford Research Institute. Fajkovic, M., & Björneborn, L. (2014). Marginalia as message: Affordances for reader-to-reader communication. Journal of Documentation, 70(5), 902–926. Gao, F., Zhang, T., & Franklin, T. (2013). Designing asynchronous online discussion environments: Recent progress and possible future directions. British Journal of Educational Technology, 44(3), 469–483.
George, R. (1995). Open and distance education as social practice. Distance Education, 16(1), 24–42. Greenhow, C., Gleason, B., & Staudt Willet, K. B. (2019). Social scholarship revisited: Changing scholarly practices in the age of social media. British Journal of Educational Technology, 50(3), 987–1004. Gruzd, A., Paulin, D., & Haythornthwaite, C. (2016). Analyzing social media and learning through content and social network analysis: A faceted methodological approach. Journal of Learning Analytics, 3(3), 46–71. Johanes, P., & Thille, C. (2019). The heart of educational data infrastructures = Conscious humanity and scientific responsibility, not infinite data and limitless experimentation. British Journal of Educational Technology, 50(6), 2959–2973. Kalir, J. (2020). Social annotation enabling collaboration for open learning. Distance Education, 41(2), 245–260. Kalir, J., & Garcia, A. (2019). Civic writing on digital walls. Journal of Literacy Research, 51(4), 420–443. Kalir, J., & Perez, F. (2019). The marginal syllabus: Educator learning and web annotation across sociopolitical texts and contexts. In A. Reid (Ed.), Marginalia in modern learning contexts (pp. 17–58). IGI Global. Kinloch, V. (2010). Harlem on our minds: Place, race, and the literacies of urban youth. Teachers College Press. Lave, J., & Wenger, E. (1991). Situated learning: Legitimate peripheral participation. Cambridge University Press. Liu, K. (2006). Annotation as an index to critical writing. Urban Education, 41(2), 192–207. Marshall, C. C. (1997). Annotation: From paper books to the digital library. In Proceedings of the Second ACM International Conference on Digital Libraries (pp. 131–140). Association for Computing Machinery. Matcha, W., Gašević, D., & Pardo, A. (2019). A systematic review of empirical studies on learning analytics dashboards: A self-regulated learning perspective. IEEE Transactions on Learning Technologies. https://doi.org/10.1109/TLT.2019.2916802 Mirra, N. (2018). Pursuing a commitment to public scholarship through the practice of annotation. The Assembly, 1. Retrieved from https://www.colorado.edu/journal/assembly/2018/12/12/pursuing-commitment-public-scholarship-through-practice-annotation National Academies of Sciences, Engineering, and Medicine. (2018). How people learn II: Learners, contexts, and cultures. The National Academies Press. Novak, E., Razzouk, R., & Johnson, T. E. (2012). The educational use of social annotation tools in higher education: A literature review. Internet and Higher Education, 15(1), 39–49. Perrotta, C., & Williamson, B. (2018). The social life of learning analytics: Cluster analysis and the ‘performance’ of algorithmic education. Learning, Media and Technology, 43(1), 3–16. Plevinski, J., Weible, J., & Deschryver, M. (2017). Anchored annotations to support collaborative knowledge construction introduction. In Making a difference: Prioritizing equity and access in CSCL, 12th International Conference on Computer Supported Collaborative Learning (CSCL) 2017 (pp. 111–118). CSCL. Pomerantz, J., & Peek, R. (2016). Fifty shades of open. First Monday, 21(5). Retrieved from https://firstmonday.org/ojs/index.php/fm/article/view/6360/5460 Reid, A. (Ed.). (2019). Marginalia in modern learning contexts. IGI Global. Rubel, A., & Jones, K. M. (2016). Student privacy in learning analytics: An information ethics perspective. The Information Society, 32(2), 143–159. Selwyn, N. (2019). What’s the problem with learning analytics? Journal of Learning Analytics, 6(3), 11–19.
Siemens, R., Arbuckle, A., Seatter, L., El Khatib, R., & El Hajj, T. (2017). The value of plurality in “The network with a thousand entrances.” International Journal of Humanities and Arts Computing, 11(2), 153–173.
Slade, S., & Prinsloo, P. (2013). Learning analytics: Ethical issues and dilemmas. American Behavioral Scientist, 57(10), 1509–1528. Slakmon, B., & Schwarz, B. B. (2017). “Wherever you go, you will be a polis”: Spatial practices and political education in computer-supported collaborative learning discussions. Journal of the Learning Sciences, 26(2), 184–225. Sprouse, M. (2018). Social annotation and layered readings in composition. Computers & Writing Conference (pp. 39–52). Stahl, G. (2017). Group practices: A new way of viewing CSCL. International Journal of Computer-Supported Collaborative Learning, 12(1), 113–126. Sun, Y., & Gao, F. (2017). Comparing the use of a social annotation tool and a threaded discussion forum to support online discussions. Internet and Higher Education, 32, 72–79. Svihla, V., & Reeve, R. (Eds.). (2016). Design as scholarship: Case studies from the learning sciences. Routledge. University of Colorado Denver. (2020, March 6). Manuel Espinoza—Chancellor’s distinguished faculty lecture [video]. YouTube. https://youtu.be/kEtjCZoTPps West-Puckett, S., Smith, A., Cantrill, C., & Zamora, M. (2018). The fallacies of open: Participatory design, infrastructuring, and the pursuit of radical possibility. Contemporary Issues in Technology and Teacher Education, 18(2), 203–232. Whaley, D. (2017). Annotation is now a web standard. Retrieved from https://Hypothes.is/blog/annotation-is-now-a-web-standard/ Wolfe, J. (2002). Marginal pedagogy: How annotated texts affect a writing-from-sources task. Written Communication, 19(2), 297–333. Zhang, J., Burgos, D., & Dawson, S. (2019). Advancing open, flexible and distance learning through learning analytics. Distance Education, 40(3), 303–308.
Jeremiah H. Kalir is an associate professor of Learning Design and Technology at the University of Colorado Denver School of Education and Human Development. Remi studies how social annotation enables collaborative, open, and equitable learning. He was the 2020–2021 Hypothesis Scholar in Residence and has helped advance partner-driven research about social annotation. He is also a cofounder of the Marginal Syllabus, a project that leverages social annotation to spark and sustain public conversation about educational equity. Remi’s research about social annotation has also been supported by a National Science Foundation Data Consortium Fellowship and an OER Research Fellowship from the Open Education Group. Learn more about Remi at http://www.remikalir.com.
Chapter 7
Situating Learning Analytics for Course Design in Online Secondary Contexts Joshua Quick and Rebecca C. Itow
Online and distance learning (ODL) encompasses an array of practices and tools for facilitating learning without co-located support of an instructor (Watson et al., 2004). (The ODL environments we discuss are asynchronous, with no required live meetings; a live instructor does teach each course and interacts with students regularly via gradebook comments, email, and conferences.) Particularly in asynchronous ODL environments, the individual interactions and physical separation between teachers and students provide experiences unique to ODL teaching and learning (Dede, 1995). The COVID-19 pandemic highlighted these experiences, revealing practical and theoretical concerns of teachers’ use of online course data. This chapter explores how applying a situated lens to learning analytics procedures may help educators focus on the activities and transactions between students and teachers that make learning more personalized, useful, and usable (Cavanaugh et al., 2009). Since its conception, learning analytics has aspired to help educators use massive amounts of student data. Learning analytics, however, has historically been used to analyze data in an aggregated, ‘one-size-fits-all’ focus (McPherson et al., 2016), which is useful in identifying, analyzing, and synthesizing understandings of broad behaviors and phenomena across wide data sets. By employing a situated lens of knowing and learning (Greeno et al., 1998), those same learning analytics methods can be used to analyze the nuanced interactions that occur in individual student learning. Framing knowledge as the ability to participate in a community’s practices and learning as the strengthening of those practices (Greeno et al., 1996; Greeno et al., 1998), this situated lens affords the ability to treat individual student learning as an interactive system. Within this system, students’ interactions with the social, technical, and pedagogical designs of educational spaces are constantly shaping and
influencing student understanding. Applying situated learning analytics methods to the analysis of student behaviors and practices can help classroom educators understand the nuanced ways that learning environments shape students’ learning experiences, and quickly adjust their practice and pedagogy appropriately. Specifically, this approach prioritizes the alignment between learning designs and analytics representations, which is necessary for developing impactful tools and resources (Knight et al., 2014; Kitto et al., 2018; Mangaroska & Giannakos, 2018) and helps teachers make real-time decisions and adjustments in online learning environments. Applying a situated learning analytics approach to an ODL environment, this chapter examines the convergence of principled learning designs and the development of situated analytics through a case-based analysis of the interactive systems of individual student learning within an online high school. The intersection between ODL and learning analytics embodies a unique combination of factors through which theoretically informed approaches can reveal insights about and catalyze changes to curricula and pedagogy (Zhang et al., 2019). Our situative lens illuminates unique challenges regarding the intersection of ODL and learning analytics, and reveals how particular assumptions embedded in teaching and learning activities influence what learning analytics resources represent.
Challenges of Online Learning

The development of technology to support learning within networked and digital tools is a fundamental component of online learning design. However, approaches for using these tools tend to focus on development and integration rather than how the underlying assumptions and theoretical approaches of these tools shape teaching and learning in digital contexts (Beetham & Sharpe, 2019). This focus on tools can result in a bricolage of resources and design features that generate tensions for participants; the unfocused curation of tools—however dynamic they may be—can impede engagement and constrain interactions with content and each other. Such tensions also hinder personal, social, and cultural factors that contribute to learners’ experiences in ODL (Hasler Waters et al., 2014). Because students enrolling in online learning programs tend to be among the more vulnerable populations for successful completion of academic degrees (de la Varre et al., 2014), this focus on participant experience is particularly important.

The COVID-19 pandemic increased the number of students participating in online learning opportunities who face a new kind of vulnerability. From March 1, 2020 to February 28, 2021, the online high school in which this research was conducted saw a 15% increase in enrollments and maintained a 70% course completion rate compared to the same timeframe the previous year. While students enrolling in online schools before 2020 often sought learning experiences unconstrained by traditional school routines, only part of that vulnerability involved a lack of preparation for learning online. Many students now participating in ODL are even less equipped for learning online, and all are facing new stresses that intensify challenges unique to
ODL. Students (and instructors) are simultaneously learning content while learning to learn online (Itow, 2020). The development of effective and engaging online learning programs requires that course designers understand students’ reason(s) for enrolling in ODL and their familial circumstances (Bailey & Brown, 2016). Such factors are of particular importance given that student success in (i.e., completion of) an online course or program is largely dependent on support for students to regulate and strategize their learning (Roblyer & Marshall, 2002).
Learning Analytics Challenges

Learning analytics is the process of collecting and analyzing data, and making informed decisions about learners and the contexts in which they interact (Siemens, 2013). Since its emergence, learning analytics has intended to focus less on automated collection and analysis of data using machine learning algorithms, and more on understanding and actionably informing educational practices in a more systematic fashion (Ferguson, 2012). Despite these intentions for systemic and theoretically informed analyses, many efforts have tended to ignore the connection of analytics to frameworks of teaching and learning (Gašević et al., 2015). This has resulted in a tendency for researchers and analysts to focus primarily on the development of analytic techniques and models rather than on how the results produce insights into teaching and learning (Kitto et al., 2018).
Developments Toward Solutions

When analytics and online learning design are pursued in combination, the difficulties described above are magnified. Both learning analytics and online learning design represent complex—and at times contradictory—webs of cultural, economic, political, and social factors that require attention to a variety of areas of expertise (Selwyn, 2014). Each relies on designers and practitioners working together to establish an acceptable solution. This interdisciplinarity also complicates establishing trust, values, social and epistemic norms, and technical functionality across diverse participants. The negotiation of tensions borne out of this interdisciplinarity is essential for improving practice (Andersen, 2013), and can produce substantive results. Recent work has sought to address these tensions. Teasley (2019), for example, has characterized learning analytics as a dialogue between the information and learning sciences. Further, examples seeking to characterize explicit connections between learning theories and analytics have received continued focus (see Knight & Buckingham-Shum, 2017; Wise & Shaffer, 2015). The construction of theoretically framed learning analytics most frequently corresponds with constructivist perspectives and frameworks (see Jonassen, 1994). The unit of analysis of such perspectives is at the individual student level, examining a student’s
construction, regulation, and engagement with resources; this facilitates analysis of dynamic cognitive, motivational, social, and behavioral factors in student learning. Constructivist frameworks, however, tend to ignore relevant social, historical, and institutional elements that contribute to the development of ODL systems and learning analytics, which raises concerns in the co-development of these ODL and learning analytics tools. Sociocultural frameworks provide one approach for addressing these concerns and mediating interdisciplinary tensions unique to online learning and learning analytics. Sociocultural perspectives, in contrast, prioritize the investigation of learning in terms of social and cultural processes that dynamically construct and are constructed by participants through their engagement in practices across contexts (Danish & Gresalfi, 2018; see Case, 1996). Both ODL and learning analytics extend this ongoing discussion.
Situative Theories of Learning in Learning Analytics

Situative frameworks understand learning as an inherently contextualized process that is shaped by interactions with behavioral, cognitive, material, and conceptual elements of the learning environment (Greeno et al., 1998). An inherent synthesis of behavioral, cognitive, and sociocultural perspectives enables a situated lens to de-silo problems of practice and innovation, as well as learning analytics for and of online learning systems. Below are the principles we used in developing situatively informed learning analytics analyses.

Principle 1: Frame analytics as capturing participation in disciplinary practice

Situative learning analytics identify ways to represent and capture learners’ participation within the tools and resources in a particular learning environment. ODL frameworks uniquely document all interactions in ways that learning analytics can leverage to inform these interactions in terms of a designed system.

Principle 2: Recognize participants’ roles within online systems

Situative perspectives enable learners’ agency and hold them accountable in disciplinary engagement (Engle, 2012; Engle & Conant, 2002). Learners problematize content and orient practices to their own experiences and backgrounds. Sensitivity to the needs and developments of individuals’ participation within ODL contexts in relation to their broader experiences and lives is necessary. In ODL environments, situative learning analytics can be used to understand how learners use their own experiences to problematize and contextualize learning.
Principle 3: Require temporal analyses for situative online learning analytics

Principles 1 and 2 attend to developments over time and place. Situative learning analytics examine how sequences of events and dynamic processes of student learning unfold across contexts.

Principle 4: Develop measurements of participation in online disciplinary practices with practitioners

Situative learning analytics are applied in partnership with ODL programs and institutions. Stakeholders work together to iteratively refine measurements and tools within localized contexts, ensuring results are relevant and useful.

The following pages describe the development of situative learning analytics in partnership with a virtual high school. This analysis is part of a preliminary implementation of the above principles and an iterative design process. Situated learning analytics does not intend to produce more automated, generalizable insights on performance; rather, it is employed as a means to better understand learners’ interactions across course designs.
Site Context and Methods

This research was conducted at an online high school embedded within a large US university. Initially a correspondence program, the online high school has, under the direction of the second author (its principal) and her team, maintained its asynchronous learning structure while developing new participatory “cooperative” models of teaching, learning, and assessment for ODL (Itow, 2018). Currently, the school’s courses embody both the correspondence and cooperative models. In this preliminary analysis, a cooperative model 10th grade English Language Arts (E10) course focused on literary interpretation and analytical writing and a correspondence model American Literature (AL) course surveying North American texts were analyzed. AL (correspondence) offers no peer-to-peer interaction, and its lessons are structured such that the student can walk themselves through the content with minimal teacher prompting; grading feedback is often applied to future assignments rather than used to revise already submitted lessons. E10 (cooperative) follows a “participatory learning and assessment” model (Hickey & Rehak, 2013), offering private (instructor-generated) and public-to-the-class (peer-generated) feedback that encourages assignment revision.
Participants and Analysis

Two students, each of whom completed E10 and AL courses in the same order and in the same time period, illustrate the insights situative learning analytics provides for ODL through a comparative case study analysis (Yin, 2003). These students were
selected as part of an analysis of the collaborative and correspondence ODL models conducted in a partnership between the first and second authors. This led to a focus on two of the school’s English Language Arts courses (E10 and AL) because they are taken in sequence and the courses employ different models of online instruction. We examined records of students who had (a) completed both courses since 2016, (b) taken the courses in sequence, and (c) performed highly on all assessments within the course. This resulted in a limited subset of students (n = 4). The two focus students were selected given their relative individual performance across the courses. We analyzed this student data to observe the distinctions between learners’ interactions across course designs and to refine the situative principles above. This resulted in a preliminary framework for more robust designs of situative online learning analytics. We articulate this analysis in terms of a social learning analytics framework (Buckingham-Shum & Ferguson, 2012), a family of practices for understanding processes of learning and teaching as part of cultural, social, and technical systems.
Procedures

Assignment descriptors, activity content, and instructions were used to understand how the course structure impacted student participation and learning. An analysis of these factors in each course revealed that E10 focused on the construction of skills across writing styles while AL was focused on understanding broad themes and literary conventions in texts. Comparisons of students’ word use revealed how course models impacted their participation and assignment completion. Language in student writing was viewed as an extension of identifying patterns of participation and learning in online discourse (Paulus & Wise, 2019). The quantitative analysis of student work across courses revealed how course structure shaped learners’ expectations of themselves and impacted learner participation in the course.

We analyzed document similarity within and between course contexts through hierarchical clustering of word differences and community detection in word co-occurrence networks. These methods can be more easily interpreted through visual means, thereby enabling potential use by practitioners. Documents were analyzed using the quanteda package in R (Benoit et al., 2018) and co-occurrence networks constructed using the igraph package (Csardi & Nepusz, 2006). Analysis proceeded in three stages: (a) removal of stop words and construction of frequency matrices for each document, (b) detection of similarity between each student’s documents within and across courses, and (c) construction of word co-occurrence networks of the top 50 most frequent pairs. Stemming and lemmatization were not used, so that variations in participation could be identified through word variant pairing; accuracy was secondary to representing individual participation within these spaces. Graph modularity, or group structure, was identified using the edge betweenness community detection method (see Girvan & Newman, 2002). This method was used to indicate
the extent to which a student’s interaction with a particular set of assignments, and thus their course activities, could be detected.
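To make the first two analysis stages concrete, the sketch below illustrates stages (a) and (b): a document-frequency matrix with stop words removed, followed by pairwise distances and hierarchical clustering. The study itself used the quanteda package in R; this Python version is only an illustrative analogue, and the folder layout, file-naming scheme, and choice of cosine distance with average linkage are assumptions rather than the authors' exact settings.

```python
# Illustrative analogue of analysis stages (a) and (b); the study used quanteda
# in R. File paths, labels, and distance/linkage choices here are assumptions.
from pathlib import Path

import matplotlib.pyplot as plt
from scipy.cluster.hierarchy import dendrogram, linkage
from scipy.spatial.distance import pdist
from sklearn.feature_extraction.text import CountVectorizer

# (a) Read each assignment as its own document and build a word-frequency
# matrix with English stop words removed.
paths = sorted(Path("assignments").glob("*.txt"))   # e.g. "A_E10_lesson03.txt"
labels = [p.stem for p in paths]
texts = [p.read_text(encoding="utf-8") for p in paths]
vectorizer = CountVectorizer(stop_words="english", lowercase=True)
dfm = vectorizer.fit_transform(texts).toarray()

# (b) Compute pairwise document distances and cluster hierarchically so that
# similarity within and across the two courses can be inspected visually.
distances = pdist(dfm, metric="cosine")
tree = linkage(distances, method="average")

plt.figure(figsize=(8, 4))
dendrogram(tree, labels=labels, leaf_rotation=90)
plt.title("Similarity of student assignments across courses")
plt.tight_layout()
plt.show()
```

A dendrogram produced this way plays the same interpretive role as Fig. 7.1: documents that sit close together in the tree share more of their (non-stop-word) vocabulary, so within-course and between-student similarity can be read off visually rather than from raw frequency matrices.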
Preliminary Developments

Figure 7.1 shows similarities in each student’s assignments as separate documents and compares word occurrence across documents. The graphic demonstrates distances between documents based on learners’ use of particular words. Learners’ word use within the courses tends to be similar when lesson content is the same for all students. For example, learners use similar terms when analyzing the same text. This pattern is evident in the literature (AL) course, though students A and B were further apart in their style of writing as measured through overall document similarity. In contrast, E10 had similarity both within and between students and was also a function of particular assignments. Further, student A’s writings in both courses were generally closer in document structure than student B’s. The general takeaway is that more individual variation occurred within the collaborative course design (E10) than in AL because learners were able to agentically choose their participation. This type of representation illustrates the intersection of learner interactions within particular course structures. While not a complete picture, representations such as these substantively identify differences across those contexts accounted for in the interaction of student and course structures.
Fig. 7.1 Students’ writing similarity
Identifying Types of Participation

While word choice offers granular insight into how students engaged with course activities, Figs. 7.2 and 7.3 describe the differences in word co-occurrence and how student artifacts followed prescribed activities within the E10 and AL courses. Each node relates to one of the top 50 most frequently used words across a student’s writing. Connections describe the co-occurrence of these words within each student’s submission of the same lesson. By examining structural features of the graphs, we can develop insights around the way course structures impact students’ engagement with and enactments of disciplinary practices. These co-occurrence networks describe differences between the students’ interactions in a specific course and across the two-course sequence. The modularity of each of these networks was, overall, fairly weak (where modularity describes the divisibility of a network in terms of particular groups or communities). The divisibility of these graphs into distinct topical areas of participation was not robustly observed. Student A’s course engagement (Fig. 7.2) suggests two different forms of participation, described in terms of the extent to which the graphs are separable. For Student A, three groups with modularity of 0.48 and 13 groups with modularity of 0.13 were detected for the E10 and AL course, respectively. For Student B (Fig. 7.3),
Fig. 7.2 Student A’s writing assignments co-occurrence networks (panels: English 10th Grade, American Literature)
Fig. 7.3 Student B’s writing assignments co-occurrence networks (panels: English 10th Grade, American Literature)
14 groups with a modularity of 0.18 were detected for E10 and 13 groups with a modularity of 0.13 for AL. Student A’s work in E10 is the exception, as the work focused on interest topics in the course modules (e.g., composing an analysis of media and its effects on society through H. G. Wells’ infamous radio broadcast). Therefore, Student A exhibited greater topical problematization within their engagement with the course. The representations of both students’ word use in AL were quite similar due to the topical boundaries of the course and lesson prompts. The highly structured nature of AL activities facilitated students’ writing in similar ways, whereas E10 enabled learners to participate in different manners through their writing. As such, these representations detail the particular interactions of learners with online course designs and illustrate particularities of individual students’ engagements across these contexts. In general, these metrics showcase how students engaged with course topics and the extent to which students are able to connect ideas within their course submissions. However, these analytic representations also describe the particular enactments of students A and B within the respective courses. Both students engaged in the same courses; situative learning analytics offer insights for how student participation was shaped by each course’s design structure, which can be used to inform new design decisions. This can assist in iterating lesson design, guiding student behavior, and facilitating engagement routines that are most appropriate for ODL settings. Further demonstrating the power of ODL environment design, students A and B’s differences in participation across the two courses illustrate the ways in which their participation is constrained by the way the course’s structure sets expectations of the learner (and the learner of themselves). Specifically, our observations reveal two distinct conceptions of what being a student (and being a teacher) means within these course structures. For E10, students are positioned as producers of knowledge while AL positions students as consumers of information. Representations of these cases describe different relationships between the learner and the systems which produced their behavior. Situative learning analytics examines the intersection of the learner and the course designs, and can therefore be used to identify design strategies that help students relate or orient to learning in personally meaningful ways when engaging in ODL environments. These orientative views can help instructors understand where a student may need more assistance or, in the case of larger curricular designs, where ODL activities are (not) producing the types of outcomes sought by teachers and administrators.
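A compact sketch of how co-occurrence networks and modularity scores of this kind can be produced appears below. The authors built their networks with R's igraph package; this Python version using networkx is only an illustrative analogue, and the whitespace tokenization, document-level co-occurrence window, and rule for selecting a partition from the Girvan-Newman sequence are assumptions rather than the study's exact procedure.

```python
# Illustrative analogue of stage (c) and the modularity calculation; the study
# used R's igraph. Tokenization and partition-selection details are assumptions.
import itertools
from collections import Counter

import networkx as nx
from networkx.algorithms import community as nxcom

def cooccurrence_network(documents, top_pairs=50, stop_words=frozenset()):
    """Build a network from the most frequent word pairs across documents."""
    pair_counts = Counter()
    for text in documents:
        tokens = {t for t in text.lower().split() if t.isalpha() and t not in stop_words}
        # Count each unordered pair of distinct words appearing in the same document.
        pair_counts.update(itertools.combinations(sorted(tokens), 2))
    graph = nx.Graph()
    for (word_a, word_b), weight in pair_counts.most_common(top_pairs):
        graph.add_edge(word_a, word_b, weight=weight)
    return graph

def detect_groups(graph):
    """Find communities via edge betweenness (Girvan-Newman) and score modularity."""
    best_partition = max(
        nxcom.girvan_newman(graph),
        key=lambda parts: nxcom.modularity(graph, parts),
    )
    return best_partition, nxcom.modularity(graph, best_partition)

if __name__ == "__main__":
    docs = ["sample submission text for one lesson", "another submission text"]
    network = cooccurrence_network(docs)
    groups, score = detect_groups(network)
    print(f"{len(groups)} groups detected, modularity = {score:.2f}")
```

On a 50-pair network like those in Figs. 7.2 and 7.3, a higher modularity score (closer to the 0.48 reported for Student A's E10 writing than to 0.13) would indicate that a student's word use divides more cleanly into distinct topical groups.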
Concluding Thoughts

We do not suggest that the measures described above provide direct knowledge on the skill of these students as developing writers, though we hope to work towards that design through our partnership. Rather, we suggest that these measures and the principles by which they were constructed reveal insights regarding the intersection between online course design and student activity.
The observations in the above cases are localized to the contexts of the separate high school courses. The more direct assessment approach enacted by the AL course reveals more limited topical orientations of the student as defined and bounded by the course. In contrast, E10 facilitates a different type of participation by letting students elect and problematize their own topics of interest and engagement. Both case students engaged in substantively different behaviors. The representations of these behaviors in the analytics describe not just their learning of content, but also how the social and pedagogical elements embedded within the course designs and the online high school shape that learning. Specifically, the different course designs reflect different institutional histories. The existing AL course uses a siloed correspondence model that was adapted to online learning and restricts opportunities for learners to engage with content in productive and disciplinary ways. This model limits opportunities for connecting content to students’ individual circumstances and needs, with teacher and student engaging in a more transactional relationship around grades. In contrast, the E10 course encourages teachers to engage with and support their learners’ interests and needs. The further development and use of the situative learning analytics principles described above can uncover where ODL designs are (not) meeting students’ larger needs within the socio-technical context of the learning environment. This can subsequently lead to potential revisions of online teaching and learning models. Although the comparative cases presented here preclude wide generalizations, the illustrative power of these cases highlights contrasts of behavior as impacted by ODL design, and thus the embedded information within its representation through computational means.

In this chapter, we explored the integration of situative theoretical frameworks into the exploration and understanding of online learning spaces. While this effort is still in its preliminary development, we believe the following observations are warranted through this investigation. First, the collaborative, small-scale focus described in this framework and analytic approach prioritizes understanding the intersection of ODL designs and learning analytics. Using learning analytics to change practices within a particular context requires attending to the assumptions and values for teaching, learning, and knowing embedded in the activities within an educational environment (Knight et al., 2014, 2020). In addition to prioritizing human-centered decision making with data (see Buckingham-Shum et al., 2019), our situative approach prioritizes revealing the assumptions embedded within the design decisions of ODL environments. Specifically, situative learning analytics allow us to observe particular representations produced by learners’ interactions with activities, which can be lost at larger scales of analysis. Second, we believe the initial, small-scale analysis described in this chapter facilitates considerations of additional designs and changes for both the analytic representations and the teaching and learning activities from which they are developed. Specifically, we believe this approach provides initial resources to create ODL environments that help students orient their learning in personally meaningful ways.
Further refinement and development through repeated prototyping and designs, as well as analysis of how these designs produce particular changes within a class,
will shape the continued iteration of situative learning analytics principles and the resources built from those principles’ implementation. We intend to expand this set of principles through the partnership that developed in this work. Importantly, the aims of supporting novel and experimental teaching and learning designs in the online high school (discussed in the analysis above) provide opportunities for conducting this kind of developmental implementation. While other ODL contexts may not be historically or institutionally structured to support this work in a robust capacity, we believe focusing on individual students’ participation in ODL environments can illuminate design issues and reveal opportunities to change existing teaching and learning activities. However, we must note that conducting these analyses robustly and at scale requires substantive alignment in skills, practices, and knowledge enacted in partnership with various stakeholders (teachers, administrators, researchers/data scientists). Further, scaling up analyses to facilitate continued substantive change and development will be an ongoing design-oriented process. This is not to say, however, that large-scale analyses are antithetical to our approach. Rather, we believe the emphasis on smaller-scale data analyses provides information relevant for identifying larger-scale applications that can be constructed from explorations of more localized teaching and learning contexts. Situative learning analytics is a beneficial approach that can lead to insights into the change of teaching and curricular structures. We offer this work as an emerging example to address the growing need for expanding learning analytics research as a process of joint collaboration between all members of an online learning community.
References Andersen, H. (2013). The second essential tension: On tradition and innovation in interdisciplinary research. Topoi, 32(1), 3–8. Bailey, T. L., & Brown, A. (2016). Online student services: Current practices and recommendations for implementation. Journal of Education Technology Systems, 44(4), 450–462. Beetham, H., & Sharpe, R. (2019). Rethinking pedagogy for a digital Age: Principles and practices of design. Routledge. Benoit, K., Watanabe, K., Wang, H., Nulty, P., Obeng, A., Müller, S., & Matsuo, A. (2018). Quanteda: An R package for the quantitative analysis of textual data. Journal of Open Source Software, 3(30), 774. https://doi.org/10.21105/joss.00774, https://quanteda.io Buckingham-Shum, S., & Ferguson, R. (2012). Social learning analytics. Journal of Educational Technology & Society, 15(3), 3–26. Buckingham-Shum, S., Ferguson, R., & Martinez-Maldonado, R. (2019). Human-centered learning analytics. Journal of Learning Analytics, 6(2), 1–9. Case, R. (1996). Changing views of knowledge and their impact on educational research and practice. In D. Olson & N. Torrence (Eds.), The Handbook of education and human development (pp. 75–99). Blackwell. Cavanaugh, C. S., Barbour, M. K., & Clark, T. (2009). Research and practice in K-12 online learning: A review of open access literature. The International Review of Research in Open and Distributed Learning, 10(1), 1–13. Csardi, G., & Nepusz, T. (2006). The igraph software package for complex network research. InterJournal, Complex Systems, 1695. http://igraph.org
Danish, J. A., & Gresalfi, M. (2018). Cognitive and sociocultural perspectives on learning: Tensions and synergy in the learning sciences. In F. Fischer, C. Hmelo-Silver, S. Goldman, & P. Reimann (Eds.), International handbook of the learning sciences (pp. 34–43). Routledge. Dede, C. (1995). The transformation of distance education to distributed learning. Learning & Leading with Technology, 23(7), 25–30. de la Varre, C., Irvin, M. J., Jordan, A. W., Hannum, W. H., & Farmer, T. W. (2014). Reasons for student dropout in an online course in a rural K–12 setting. Distance Education, 35(3), 324–344. Engle, R. A., & Conant, F. R. (2002). Guiding principles for fostering productive disciplinary engagement: Explaining an emergent argument in a community of learners classroom. Cognition and Instruction, 20(4), 399–483. Engle, R. A. (2012). The productive disciplinary engagement framework: Origins, key concepts, and developments. In D. Yun Dai (Ed.), Design research on learning and thinking in educational settings (pp. 170–209). Routledge. Ferguson, R. (2012). Learning analytics: Drivers, developments and challenges. International Journal of Technology Enhanced Learning, 4(5/6), 304–317. Gašević, D., Dawson, S., & Siemens, G. (2015). Let’s not forget: Learning analytics are about learning. TechTrends, 59(1), 64–71. Girvan, M., & Newman, M. E. (2002). Community structure in social and biological networks. Proceedings of the National Academy of Sciences, 99(12), 7821–7826. Greeno, J. G., Collins, A. M., & Resnick, L. B. (1996). Cognition and learning. In D. C. Berliner & R. C. Calfee (Eds.), Handbook of educational psychology (pp. 15–46). Prentice Hall International. Greeno, J. G., et al. (1998). The situativity of knowing, learning, and research. American Psychologist, 53(1), 5–26. Hasler Waters, L., Barbour, M. K., & Menchaca, M. P. (2014). The nature of online charter schools: Evolution and emerging concerns. Educational Technology & Society, 17(4), 379–389. Hickey, D. T., & Rehak, A. (2013). Wikifolios and participatory assessment for engagement, understanding, and achievement in online courses. Journal of Educational Multimedia and Hypermedia, 22(4), 407–441. Itow, R. C. (2018). Professional development is not a summer job: Designing for teacher learning that is valuable and valued. Indiana University. Itow, R. C. (2020). Reconceptualizing learning and our roles within it: Why online education looks and feels so different. Indianagram, 22(7). Jonassen, D. H. (1994). Thinking technology: Toward a constructivist design model. Educational Technology, 34–37. Kitto, K., Buckingham-Shum, S., & Gibson, A. (2018). Embracing imperfection in learning analytics. In Proceedings of the 8th International Conference on Learning Analytics and Knowledge (pp. 451–460). Knight, S., & Buckingham Shum, S. (2017). Theory and learning analytics. In C. Lang, G. Siemens, A. Wise, & D. Gašević (Eds.), Handbook of learning analytics (pp. 17–22). SoLAR. Knight, S., Buckingham-Shum, S., & Littleton, K. (2014). Epistemology, assessment, pedagogy: Where learning meets analytics in the middle space. Journal of Learning Analytics, 1(2), 23–47. Knight, S., Gibson, A., & Shibani, A. (2020). Implementing learning analytics for learning impact: Taking tools to task. The Internet and Higher Education, 45, 100729. Mangaroska, K., & Giannakos, M. (2018). Learning analytics for learning design: A systematic literature review. IEEE Transactions on Learning Technologies, 12(4), 516–534. McPherson, J., Tong, H. L., Fatt, S.
J., & Liu, D. Y. (2016, April). Student perspectives on data provision and use: Starting to unpack disciplinary differences. In Proceedings of the Sixth International Conference on Learning Analytics & Knowledge (pp. 158–167). Paulus, T. M., & Wise, A. F. (2019). Looking for insight, transformation, and learning in online talk. Routledge. Roblyer, M. D., & Marshall, J. C. (2002). Predicting success of virtual high school students: Preliminary results from an educational success prediction instrument. Journal of Research on Computing in Education, 35(2), 241–255.
Selwyn, N. (2014). Distrusting educational technology: Critical questions for changing times. Routledge.
Siemens, G. (2013). Learning analytics: The emergence of a discipline. American Behavioral Scientist, 57(10), 1380–1400.
Teasley, S. D. (2019). Learning analytics: Where information science and the learning sciences meet. Information and Learning Sciences, 120(1/2), 59–73.
Watson, J. F., Winograd, K., & Kalmon, S. (2004). Keeping pace with K-12 online learning: A snapshot of state-level policy and practice. Learning Point Associates.
Wise, A. F., & Shaffer, D. W. (2015). Why theory matters more than ever in the age of big data. Journal of Learning Analytics, 2(2), 5–13.
Yin, R. K. (2003). Case study research: Design and methods (3rd ed., Vol. 5). Sage.
Zhang, J., Burgos, D., & Dawson, S. (2019). Advancing open, flexible and distance learning through learning analytics. Distance Education, 40(3), 303–308.
Joshua Quick is the Principal Learning Data Analyst of the eLearning Research & Practice Lab at Indiana University. In addition to supporting rigorous research with data from digital teaching and learning technology, Joshua's own research interest centers on the alignment of theory and practice in the development of human-centered learning analytics for practical and impactful use by administrators, instructors, and students. He is also interested in developing learning analytics based on the insights generated from sociocultural perspectives of teaching and learning.

Rebecca C. Itow is Principal of IU High School Online. She earned her Ph.D. in Learning Sciences from Indiana University, and was a public high school teacher. Driven to facilitate safe spaces for learners to navigate their academic journeys in valuable and valued ways, Rebecca researches, designs, and implements Responsive Online Pedagogical practices in digital learning environments. Her work guides current university and community partnerships that innovate online teaching, learning, and design.
Chapter 8
Ethical Considerations of Artificial Intelligence in Learning Analytics in Distance Education Contexts

Leona Ungerer and Sharon Slade
Introduction

Learning analytics as research and practice has evolved substantially over the past decade or so. Its focus now encompasses a wide range of issues, from student retention to teaching and learning design and social network analysis to personalized learning amongst many others. As in other technologically driven fields, stakeholder expectations are increasingly high. As Beer (2019) suggests, analytics systems are expected to be "speedy, accessible, revealing, panoramic, prophetic and smart" (p. 22). Realising these expectations might be considered almost impossible without optimising the role and potential of Artificial Intelligence (AI) in learning analytics. As higher education increasingly faces change, educational institutions' expectations of learning analytics also increase—with the prospect that using AI is no longer an option, but a necessity. In response to the changing environment, higher education often looks for speedy solutions that offer almost instant insight with 'at the touch of a button' analyses. Whereas institutions once relied on dedicated departments and staff for analysis and explanation, AI now offers the potential for fresh stakeholder insight with relatively little understanding of or experience in data analysis. As AI-driven learning analytics systems utilise large datasets, they provide panoramic perspectives, allowing broader, previously unconsidered views. AI-informed learning analytics as prophecy mirrors the move from simply understanding what has happened (and perhaps why) to exploiting that understanding to make predictions about what might happen in the future. And finally, AI allows learning analytics to be 'smart'—enabling systems
that can learn autonomously without the need for (significant) human intervention. Whilst predictive analytics is increasingly prevalent and learner dashboards seek to roll out insight for both institutional staff and students, human agency and intervention still play a large part in learning analytics approaches. The increasing use of AI in learning analytics will change this. In the context of this chapter, while AI is one of many 'tools' used in the broader field of learning analytics, it is increasingly impossible to think of learning analytics without AI. As such, AI is seen by some as the "future engine of education" with its impact felt across the ambit of higher education (Schroeder, 2019). According to UNESCO Director-General Audrey Azoulay, AI will significantly transform education and drastically alter teaching tools, learning approaches, access to knowledge and teacher training (UNESCO, 2019). Although AI holds considerable promise in an educational context, there are several concerns surrounding its use (Sharma et al., 2019), for example, that AI learns from existing data from a variety of sources, without necessarily accounting for any inherent biases in the training data sets. It is therefore not safe to assume that outputs and predictions are never flawed, or that they do not perpetuate inequalities and injustice (Broussard, 2018; Crawford, 2021; Eubanks, 2017). This chapter addresses ethical issues emerging from the use of AI in learning analytics. First, however, it is important to place AI and learning analytics in the broader context of analytical approaches in education.
AI and Learning Analytics in Context

AI involves "computers which perform cognitive tasks, usually associated with human minds, particularly learning and problem-solving" (Baker & Smith, 2019, p. 10). The software development company Serokell (2020) defines AI as the development of intelligent programmes and machines capable of creative problem-solving, a capability previously regarded as uniquely human. Artificial Intelligence in education (AIED) combines AI and the learning sciences to support, inter alia, the development of tools for adaptive learning environments (Luckin et al., 2016). Although AIED aims to be computationally precise, objective and a true reflection of events, it is based necessarily on data available to the educational institution—and data are socially, politically, economically, legally, and technologically entangled. Many critical data scholars will agree with Watters (2017) that AIED is not merely a type of technology; it is, by definition, ideological. Like all educational technology, AIED should be considered as "a knot of social, political, economic and cultural agendas that is riddled with complications, contradictions and conflicts" (Selwyn, 2014, p. 6). The boundaries between analytical approaches which extract meaning from educational data (and then act on that meaning) can be blurred. For example, Renz et al. (2020) distinguish between the fields of educational data mining, learning analytics and AIED in the following way: Educational data mining provides automated decisions and predictions based on machine learning algorithms, while
learning analytics visualises data to enhance understanding of students’ learning experience and assists in optimising the learning environment. AIED adds to learning analytics by linking to various AI applications to further support learning and teaching. AIED is described as “the measurement and acquisition of digital teaching and learning behaviour based on learning analytics” (Renz et al., 2020, p. 24; italics added). These authors foresee that AI tools such as student support chatbots, intelligent tutoring systems and assessment tools will significantly expand the capabilities of learning analytics. Rienties et al. (2020) make clear that each field—AIED, educational data mining, and learning analytics—focuses on comprehending learning and teaching by means of technology and each is guided by distinct theoretical frameworks, methods, and ontologies. Rienties et al. (2020) also propose that these fields overlap and that there is a need to “break down some of the artificial barriers between the respective communities, and jointly work together as one interdisciplinary research field”. This conceptual chapter primarily focuses on mapping current understandings and practices emerging from AIED before addressing ethical concerns.
Definition(s) and Current Uses of AI in Higher Education

AI in educational contexts can be categorised as learning for AI, learning about AI, or learning with AI (Kukulska-Hulme et al., 2020). This chapter focuses on learning with AI. In this context, educational AI applications, like those within broader learning analytics, can be broadly categorised into system-facing, learner-facing, or teacher-facing applications, although there may be overlap (Kukulska-Hulme et al., 2020). Baker and Smith (2019) define these as follows:

• system-facing tools assist administrators and managers at an institutional level in generating information, for instance, about attrition patterns across units;
• learner-facing tools provide software that students use when learning subject matter, namely adaptive or personalised learning management systems or intelligent tutoring systems (ITS);
• teacher-facing systems automate tasks such as administration, student feedback, assessment and uncovering plagiarism, thereby reducing workload. AIED tools can also summarise student progress, enabling educators to provide them with relevant support when necessary.

Although not currently widely adopted, Zawacki-Richter et al. (2019) identify four broad educational areas likely to be impacted by AI, namely profiling and prediction, intelligent tutoring systems, assessment and evaluation, and adaptive systems and personalisation. AI applications are thought to be suited to smart tutoring systems, automated essay grading and the timely screening of at-risk students (RAND Corporation, as cited in Schroeder, 2019). As such, AIED may be particularly useful in open and distributed learning contexts, given the large numbers of diverse, geographically dispersed students, the prominent role of technology, and the relatively high dropout
and delayed graduation rates (Musingafi et al., 2015). The large student databases available to train AIED applications within open and distributed learning institutions also support such an approach (Zawacki-Richter et al., 2019). Students at open and distributed learning institutions furthermore often report a lack of effective student support as a particular challenge (Van Niekerk & Schmidt, 2016). Following Zawacki-Richter et al. (2019), Table 8.1 summarises the potential uses of AI in learning analytics. These are further explored below.

Table 8.1 AI applications in open and distributed learning

Profiling and prediction: rapid admission decisions and course scheduling; responsive approaches for drop-out and retention issues; student models and academic achievement.
Intelligent tutoring systems: increasing the efficiencies in teaching course content; diagnosing strengths and automated feedback; curating learning materials; facilitating collaboration; extending the teacher's perspective.
Assessment and evaluation: automated grading; rapid feedback; evaluation of student understanding; engagement and academic integrity; evaluation of teaching.
Adaptive systems and personalisation: teaching course content; recommending personalised content; supporting teachers and learning design; using academic data to monitor and guide students; representation of knowledge in concept maps.
Profiling and Prediction

In the light of persisting concerns pertaining to student retention in higher education in general, and specifically in open and distributed learning (Herodotou et al., 2020; Subotzky & Prinsloo, 2011; Tight, 2020), the ability to accurately predict
students' academic outcomes might be highly beneficial. Predictive analytics solutions focusing on student retention involve the development of indicators to identify at-risk students and to predict dropout. AI applications may further generate support models by, for instance, student profiling and modelling learning behaviour in response to observed interventions (Herodotou et al., 2020). AIED may also especially serve to enhance student support in open and distributed learning contexts. Prinsloo (2019) explains that higher education institutions are responsible for ensuring quality and effective teaching and learning, as well as intervening (within limits) when students' behaviours indicate that they experience a sense of anomie or (un)belonging. "Anomie refers to feelings of un-belonging when relations between an individual and his/her community break down" (Prinsloo, 2019, n.p.). Student disengagement and anomie are important considerations in student drop-out and failure, but the indicators of these experiences and feelings of unbelonging vary according to the teaching context. In open and distributed learning environments anomie and 'unbelonging' may, for instance, be apparent only when students do not submit assignments or tasks, or do not take part in online discussion forums. Since open and distributed learning institutions often have large numbers of students, registered for a wide range of qualifications, such indicators may be missed, noted but ignored, or acted on but not dealt with sufficiently. In this respect, AI-informed student support can assist in profiling students who show signs of experiencing 'unbelonging' or anomie, and provide scalable, first-level support to these students cost-effectively and at scale (Prinsloo, 2019).
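To make the mechanics of such profiling more concrete, the sketch below shows one minimal way an at-risk indicator of this kind could be assembled from routinely logged engagement data. It is purely illustrative: the engagement features (weekly logins, assignments submitted, forum posts), the synthetic data and the flagging threshold are assumptions made for this example, not a description of any institution's actual model.

```python
# Minimal sketch: flagging students at risk of dropout from engagement data.
# Synthetic data, feature names and thresholds are illustrative assumptions.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(seed=42)
n_students = 500

# Hypothetical engagement features: logins per week, assignments submitted, forum posts
X = np.column_stack([
    rng.poisson(5, n_students),
    rng.integers(0, 10, n_students),
    rng.poisson(2, n_students),
])
# Simulate dropout: lower engagement -> higher probability of dropping out
logit = 1.5 - 0.2 * X[:, 0] - 0.3 * X[:, 1] - 0.1 * X[:, 2]
y = rng.binomial(1, 1 / (1 + np.exp(-logit)))  # 1 = dropped out

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)
model = LogisticRegression(max_iter=1000).fit(X_train, y_train)

risk = model.predict_proba(X_test)[:, 1]   # predicted probability of dropout
print("AUC:", round(roc_auc_score(y_test, risk), 3))
flagged = np.where(risk > 0.5)[0]          # students referred for first-level support
print(f"{len(flagged)} of {len(risk)} students flagged for follow-up")
```

In practice the same pattern would be applied to whatever behavioural indicators an institution already records, and the resulting risk scores would feed a human-mediated support workflow rather than an automated decision.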
Intelligent Tutoring Systems (ITS)

Existing ITS functions are mainly linked to teaching course content, diagnosing strengths or gaps in students' knowledge and providing automated feedback, curating learning materials based on students' needs and facilitating collaboration between learners (Humble & Mozelius, 2019; Mousavinasab et al., 2021). In providing course content to students, ITS may simultaneously provide students with adaptive feedback and suggestions for addressing course-related questions, and identify those who appear to be struggling with course content and tasks. When compiling learning materials based on student needs, an ITS typically 'observes' students' online behaviours, generates an individual profile and provides personalised assistance, such as recommending reading material and exercises. It is not hard to see that this could be incredibly useful in open and distributed learning contexts where student numbers are often very high and any kind of personalised (teaching) support is simply not feasible. Modern AI is especially suited to support continuous, personalised mastery learning—potentially providing every student with a personal tutor, a feature especially valuable in open and distributed learning. AI-enabled mass personalisation and
smart tutoring systems support learning at scale (Roll et al., 2018), which may also appeal in open and distributed learning contexts with their large numbers of students.
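The student modelling that underpins such tutoring can be illustrated compactly. The sketch below shows a Bayesian Knowledge Tracing update, one common family of ITS student models, which revises an estimate of skill mastery after each observed attempt; the parameter values and the mastery threshold are illustrative assumptions rather than values from any deployed system.

```python
# Illustrative Bayesian Knowledge Tracing (BKT) update, a common ITS student model.
# Parameter values are assumptions chosen for the example.
def bkt_update(p_mastery, correct, p_learn=0.15, p_slip=0.1, p_guess=0.2):
    """Return the updated probability that the student has mastered the skill."""
    if correct:
        evidence = p_mastery * (1 - p_slip)
        posterior = evidence / (evidence + (1 - p_mastery) * p_guess)
    else:
        evidence = p_mastery * p_slip
        posterior = evidence / (evidence + (1 - p_mastery) * (1 - p_guess))
    # Allow for learning between attempts
    return posterior + (1 - posterior) * p_learn

# One student's observed sequence of attempts on a single skill (True = correct answer)
attempts = [False, False, True, True, True]
p = 0.3  # prior probability of mastery
for i, correct in enumerate(attempts, start=1):
    p = bkt_update(p, correct)
    action = "move on" if p > 0.95 else "recommend more practice"
    print(f"attempt {i}: correct={correct}, mastery={p:.2f} -> {action}")
```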
Assessment and Evaluation

AIED tools in higher education are currently used largely for assessment and evaluation, with particular focus on automated grading; feedback; evaluation of student understanding, engagement and academic integrity; and evaluation of teaching (Zhang & Aslan, 2021). Educators across various disciplines employ Automated Essay Scoring (AES) systems, although mostly at undergraduate levels. Studies suggest a significant correspondence between the marking of AES systems and that of human markers, implying that automated grading could diminish the time and cost involved in appointing assessors for large-scale marking (Barker, 2011). For feedback provision, student-facing tools such as intelligent agents can guide and prompt students when experiencing difficulties with their studies (Zawacki-Richter et al., 2019). Such tools could also support student understanding, engagement and academic integrity by evaluating students' conceptual knowledge and providing learners with personalised assistance. Machine learning algorithms may serve to check academic integrity, for instance, by reviewing the correspondence between students' submitted and earlier work (or that of other students). Data mining algorithms may also guide the evaluation of lecturers' performance in course evaluations (Zawacki-Richter et al., 2019). AI applications therefore appear capable of handling assessment and evaluation tasks in a highly accurate and effective manner. Luckin et al. (2016) argue that progress in AIED, together with access to large volumes of student data and learning analytics, will especially enhance assessment in higher education. Embedding AIED in learning activities enables the continual analysis of students' progress without interrupting their learning. Luckin et al. (2016) further suggest that data collected during digitally-enabled teaching and learning can provide insights that are not available during conventional assessments. Through suitable data analysis, it could be possible to determine the processes followed by students to reach an answer and not merely record whether the correct option was chosen. This type of analysis could guide an enhanced understanding of the key cognitive processes involved in learning. AIED analysis could also contribute to determining student experiences such as being bored, frustrated or confused, helping educators to address these situations. Such analyses would enable a level of 'personal attention' not currently feasible in open and distributed learning contexts.
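As a concrete illustration of the kind of integrity check described above, the sketch below compares submissions using TF-IDF vectors and cosine similarity, and refers unusually similar pairs for human review. The texts, submission labels and the 0.8 threshold are invented for the example; production services use far more robust matching and, crucially, leave the final judgement to a human marker.

```python
# Minimal sketch: flagging unusually similar submissions with TF-IDF + cosine similarity.
# The texts and the 0.8 threshold are illustrative assumptions.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

submissions = {
    "student_a_2021": "Learning analytics uses student data to improve teaching and support.",
    "student_b_2021": "Distance education relies on well designed learner support structures.",
    "student_c_2022": "Learning analytics uses student data to improve teaching and support students.",
}

names = list(submissions)
vectors = TfidfVectorizer().fit_transform(submissions.values())
similarity = cosine_similarity(vectors)

THRESHOLD = 0.8  # pairs above this similarity are referred for human review
for i in range(len(names)):
    for j in range(i + 1, len(names)):
        if similarity[i, j] > THRESHOLD:
            print(f"Review: {names[i]} vs {names[j]} (similarity {similarity[i, j]:.2f})")
```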
Adaptive Systems and Personalisation

Adaptive systems comprise five sub-categories: those for teaching course content; recommending personalised content; supporting educators in learning and teaching design; using academic data to monitor and guide students; and supporting the representation of knowledge through concept maps (Bozkurt et al., 2021). Personalised learning can take the form of the delivery of content, materials and exercises relevant to each student and based on students' previous behaviours and performance to ensure that required subject learning outcomes are met. Recommendation systems can guide online students in choosing their next courses. According to Zawacki-Richter et al. (2019), as in ITS, most published studies are limited to a description of the system or pilot study, and there is a noticeable absence of longer-term results. AIED tools are not yet used extensively to support educators in teaching and learning design, although recommendation systems are in place to assist educators in outlining their teaching strategies, by considering the background of a particular class. Enabling intelligent agents to manage repetitive tasks in online education has potential also to free up educators' time to focus on more creative activities (Zawacki-Richter et al., 2019). Downes (2016) suggests that personalisation can be expressed in terms of:

• Pedagogy—for instance, whether instruction should be differentiated based on student variables;
• Curriculum—for instance, whether all students should firstly study foundational subjects, followed by the same subjects in the same order, or whether this order may be adapted for different students;
• Learning environments—for instance, whether students should work in groups or learn individually and in different contexts.

Under curriculum, for instance, students might receive AI-tailored course content, based on various criteria and their existing knowledge or performance in learning tasks. AI systems can also provide recommendations to students, based on analyses of historical data sets of large groups of students from comparable learning environments involved in similar types of learning. According to Downes (2016), it is doubtful whether such methods enhance learning. His stated view is that an educational system should not be responsible for learning decisions given that these are also influenced by factors such as individual preference, the accessibility of resources, and goals for future employment. Downes (2016) therefore suggests personal learning as an alternative to personalised learning, whereby the educational system supports learning but does not provide it. Individual learners make decisions about what, how and where to learn. Personal learning focuses on achieving an objective, and not on the learning itself, with curricula and pedagogy chosen for practical reasons.
Adaptive systems also provide educators with diagnostic information based on students’ academic data, allowing the provision of timely and appropriate personal guidance when needed. Additionally, they might be incorporated at the institutional or administrative level in undergraduate academic advising, or applied to support career services (Zawacki-Richter et al., 2019; also see Southgate, 2020; Tan, 2020).
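To ground the idea, raised earlier in this section, of recommendations derived from the historical data of comparable students, the sketch below uses a toy user-based collaborative filtering approach: a new student's partial activity profile is compared with past students' profiles, and an unseen resource that similar students engaged with successfully is suggested. The resource names, scores and weighting scheme are assumptions made for the example, not a description of any particular adaptive system.

```python
# Toy sketch of personalised content recommendation from historical student data.
# Resource names, scores and the similarity weighting are illustrative assumptions.
import numpy as np

resources = ["intro_video", "reading_1", "quiz_1", "case_study", "revision_quiz"]

# Rows: past students; columns: how well each resource worked for them (0 = not used)
history = np.array([
    [5, 4, 0, 3, 4],
    [4, 0, 5, 4, 5],
    [1, 2, 4, 0, 2],
    [5, 5, 4, 4, 0],
], dtype=float)

# New student: has only engaged with the first two resources so far
current = np.array([5, 4, 0, 0, 0], dtype=float)

def cosine(u, v):
    return float(u @ v) / (np.linalg.norm(u) * np.linalg.norm(v) + 1e-9)

# Similarity to each past student, computed only over the resources the new student has used
used = current > 0
weights = np.array([cosine(current[used], row[used]) for row in history])

# Predicted usefulness of unseen resources = similarity-weighted average of past scores
scores = weights @ history / (weights.sum() + 1e-9)
unseen = [i for i, v in enumerate(current) if v == 0]
best = max(unseen, key=lambda i: scores[i])
print("Recommended next resource:", resources[best])
```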
Concerns and Ethical Issues in AIED

So, it is fair to say that many believe that AIED has significant potential to facilitate timely and personalised learning support to large groups of students, particularly in open and distributed learning environments. As with predictive analytics, the data that AIED systems draw on mostly reflect previous students' behaviour (Berendt et al., 2020), which raises a number of concerns regarding the ethical implications of using historical data for current students. Berendt et al. (2020) therefore warn against risks embedded in AIED, resulting from unintended consequences of using these systems. Existing inequalities and unfairness in education may be maintained because algorithms support existing processes and are trained on data that contain, for instance, existing gender or ethnicity biases. This not only limits change in educational approaches and systems, but also perpetuates and even creates new forms of inequality. The continual monitoring of students may develop into forceful tracking, and unintended negative consequences may result from the decisions that AI systems make, for instance when academic performance serves to predict job or study applications, severely restricting students' future prospects (also see Holstein & Doroudi, 2021; Williamson & Eynon, 2020). Although it is believed that AI systems would support innovation, they may unintentionally also replicate traditional education approaches. Despite concerns that traditional assessment does not efficiently measure student learning and progress, assessment data are often used in predictive analytics systems. Educational institutions may consequently be kept from incorporating more reliable, authentic types of assessment (Berendt et al., 2020). Selwyn (2018a) expresses concern that certain benefits offered by human educators may be inadvertently lost when focusing on automating teaching. Educators' knowledge is often based on their own learning; they connect with students at both a cognitive and social level; they use natural speech; they use their bodies during teaching; and they are able to improvise (Selwyn, 2018a). Many of the human attributes relevant to effective teaching, namely creativity, innovativeness and spontaneity, are absent in computer systems (Selwyn, 2018a). Some of the main challenges involved in preparing AI to adapt to real-world scenarios involve teaching computers to grasp a particular context and to have the ability to act intuitively (Lambert, 2018). It seems that expert human educators can support learning in ways that technology cannot yet simulate perfectly (Selwyn, 2018a). There are also concerns involving ethics, morals and values when machines make education decisions (Selwyn, 2018b). The algorithms guiding these technologies are
based on human decisions in coding the instructions and the protocols they follow, and so cannot be considered neutral or value-free. When relying on AI-driven teaching systems, it is essential to consider to whom the task of encoding teaching is entrusted, including the implicit choices and decisions that are incorporated in teaching. It is not clear how automated systems decide which students are first highlighted—i.e., struggling students or those who excel. Human teachers draw upon their empathy, knowledge and experience. It may be virtually impossible to automate decisions incorporating human conscience. Satell and Sutton (2019) therefore recommend approaching AI from an augmentation rather than an automation perspective. They suggest that the function of AIED should not be to replace human educators and reduce expenses at all costs, but to increase educational effectiveness and to add value (also see Smuha, 2020; Southgate, 2020; Zhang & Aslan, 2021).
Human Rights Concerns

Although AI and big data may increase effective real-time monitoring of education systems, their implications for the fundamental human rights (Dubber et al., 2020; Humble & Altun, 2020) and freedoms of teachers and students should be considered (Weinberg, 2020). Students at higher education institutions participate in learning activities and assessments that educators believe are beneficial. Data are routinely gathered throughout such activities. Even if the data gathering process is beneficial to students, their autonomy, choice and basic human rights may be affected if their decision to take part was not made freely and in an informed manner. It is essential to balance concerns around fundamental human rights, data quality and possible benefits to students in the implementation of AIED (Berendt et al., 2020). Surveillance is a further concern in all-encompassing data collection and analysis (Berendt et al., 2020). Typically, the literature distinguishes between surveillance and monitoring (Greener, 2019). During monitoring, data are collected automatically, whereas surveillance is more focused and for specific purposes. When educators collect data on students, they are often monitoring, but if they then use that data to control students' behaviour or outcomes, they perform surveillance. The distinction between the concepts therefore lies in the purpose for which the data are used. Although universities regularly use student data to monitor and predict student performance, the views of students and staff about this process, social and ethical issues, ethical guidance and policy development are often not considered (Braunack-Mayer et al., 2020). Some of the ethical issues involved in using student data include transparency, consent, and the right to seek redress (Slade & Prinsloo, 2013). Chaudhari et al. (2019) highlight further concerns around data ownership, privacy, and digital exclusion resulting from algorithmic biases as the major challenges associated with AIED that should be addressed.
Data Ownership

The matter of who owns the large volumes of big data available to institutions is a contentious one (Nielsen, as cited in Chaudhari et al., 2019). According to Berendt et al. (2020), concerns about context and agency are associated with data ownership. Students should have control over their own data, for instance, in deciding whether it should reflect achievements throughout their lives. When students or consumers use online platforms or services, they must accept associated Terms and Conditions (TACs) and in doing so, often 'sign away' their rights and their ownership. An option may be to grant students the right to decide what to do with their data, although they may not completely understand the complexity of the data that is captured nor have the required skills to make sensible decisions about its ownership. Chaudhari et al. (2019) advise that these ethical concerns should be considered in future policies guiding the ownership of educational data. The benefits and risks involved should be considered during the development, marketing and deployment of AIED tools. Finally, it is not always clear who decides about data retention or deletion. If data collected about students are kept throughout their lives, who maintains that data, how is it stored, and who makes decisions regarding its disposal (Berendt et al., 2020)?
Data Privacy and Consent

Renz et al. (2020) identify privacy concerns as one of the main challenges in using AI-based intelligent agents in educational environments, because their use involves private and personal information. Students may not realise the full implications of their actions when they provide consent for their data to be captured, especially when the accompanying TACs are presented in a complex, legal manner, and they consequently lose control of their data (Chaudhari et al., 2019). Prinsloo and Slade (2016) concede that TACs have limitations and advise that higher education institutions' TACs should be formulated very clearly in terms of the types of data collected, what they will be used for, with whom the data will be shared, and why this would be necessary. Data control and ownership involve power relationships. Student data may be comprehensive and located on various administrative systems and commercial platforms, complicating the issue of who owns and controls that data. Learning management systems may store data in third party systems, while educators may focus predominantly on their teaching functions, overlooking data storage and security functionalities. Incorporating AI systems in education may therefore aggravate power imbalances and generate new inequalities (Berendt et al., 2020). When institutions do collect personal data, there are a number of legal frameworks and (inter)national legislation that regulate how this may be done—e.g., the Protection of Personal Information Act (2013, SA), the Family Educational Rights and Privacy Act (FERPA) in the context of the USA and the General Data Protection
Regulation (GDPR) in the context of the European Union. The GDPR acknowledges that all people should have control over the use of their personal data, including in AI-based decision making.
Digital Exclusion Due to Algorithmic Biases

Irrespective of whether procedures for ensuring data privacy and transparency are in place, complete transparency is almost certainly unattainable because of the obscure nature of the machine learning algorithms supporting AIED (Burrell, as cited in Chaudhari et al., 2019). Even if they are open to scrutiny, algorithms may be difficult to interpret (Berendt et al., 2020). Willis et al. (2016) caution that the ideas and thinking underlying algorithmic decision-making tend to be covert and rarely open to scrutiny. It is also perhaps fair to say that there is a lack of accountability associated with systems built around algorithmic models. O'Neil (2016) suggests that such models conceal a number of assumptions by means of mathematics and often are unverified and undisputed. Those affected negatively by their outcomes, however, do not have access to real redress. Only when automated systems fail in alarming, consistent ways are programmers alerted to revisit the algorithms. Without meaningful feedback these systems will continue to deliver mistaken, harmful analyses and remain unable to learn from their mistakes. Furthermore, the issue of AI bias should be taken seriously. When humans are responsible for tagging data, input bias may be created (Satell & Sutton, 2019). Not all people are represented, resulting in their digital exclusion. In an educational context, particularly that of open and distributed learning in which relationships and judgements are largely formed from a digital profile, recommendations are built on accessible (online) data, and valuable offline data may be overlooked. When education systems do not incorporate data on all sections of society, digital exclusion occurs because the needs and expectations of the overlooked sections of society are not accounted for. Educational policies should ensure the digital inclusion of students from all societal sectors (Chaudhari et al., 2019). Chaudhari et al. (2019) advise that students should vigilantly review uses of their data, although they may not completely comprehend the processes involved. Models may also be developed based on the most easily accessible data, or a model applicable to a particular sub-set of cases may be applied in a broader context than appropriate (Satell & Sutton, 2019). In essence, there is an urgent need for open, transparent implementation frameworks for AIED (Luckin et al., 2016). The effects of bias might be diminished by making AI systems more transparent, understandable and auditable (Satell & Sutton, 2019). Those building AI systems should be able to sufficiently explain to education stakeholders the types of models used, their value, and the systems that are developed (Underwood & Luckin, 2011).
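One modest, practical step towards the auditability called for above is to routinely report how a model behaves across student groups. The sketch below computes, for a hypothetical at-risk flag, the flag rate and the false negative rate per group; the predictions, outcomes, group labels and choice of metrics are illustrative assumptions rather than a complete fairness framework.

```python
# Illustrative audit: compare an at-risk flag across student groups.
# Predictions, outcomes and group labels are invented for the example.
import numpy as np

# y_true: 1 = student actually needed support; y_flag: 1 = model flagged the student
y_true = np.array([1, 0, 1, 1, 0, 0, 1, 0, 1, 1, 0, 1])
y_flag = np.array([1, 0, 0, 1, 0, 1, 1, 0, 0, 1, 0, 0])
group  = np.array(["A", "A", "A", "A", "A", "A", "B", "B", "B", "B", "B", "B"])

def audit(y_true, y_flag, group):
    report = {}
    for g in np.unique(group):
        mask = group == g
        flagged = y_flag[mask]
        needed = y_true[mask]
        missed = np.sum((needed == 1) & (flagged == 0))  # students needing support but not flagged
        report[g] = {
            "flag_rate": flagged.mean(),
            "false_negative_rate": missed / max(needed.sum(), 1),
        }
    return report

for g, metrics in audit(y_true, y_flag, group).items():
    print(g, {k: round(float(v), 2) for k, v in metrics.items()})
```

Marked disparities between groups in such a report would not prove discrimination on their own, but they give stakeholders something concrete to interrogate, which is precisely what opaque systems otherwise lack.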
Finally, the data sources that AI systems are trained on should be transparent and available for audit. In the European Union, legal frameworks such as the GDPR embed these requirements to some extent, but such requirements do not yet apply globally.
Student and Staff Views

Even if issues around privacy, data security, and informed consent are resolved, staff members should use the collected data in an ethical manner (Chang & Gibson, as cited in Braunack-Mayer et al., 2020). Braunack-Mayer et al. (2020) advise that for higher education institutions to ethically use big data, data governance measures, supportive systems and organisational structures, and clear policy guidelines are necessary. However, the input of students and staff rarely features in data governance and policy matters. Their views about the uses of big data and learning analytics, including issues such as consent, are unclear. Higher education institutions run the risk of losing trust when students' data and learning analytics are used without their knowledge and harm is caused. Acknowledging student and staff views about these matters may enhance the design and use of learning analytics. Braunack-Mayer et al. (2020) investigated tertiary education staff and students' views about the use of student data in analytics. Some of the general findings were that students and staff had limited awareness and understanding of the nature and extent of the data collected, of data analytics, and of how learning analytics are used. Both groups could identify potential benefits resulting from the use of data analytics, such as helping students learn more effectively and identifying at-risk students. They also expressed concern about data being misinterpreted, continual surveillance, poor transparency, and a lack of support. In all, limited awareness of ethical issues seemed to feature across higher education institutions. Braunack-Mayer et al. (2020) conclude that the consideration of ethical issues has not kept up with the adoption of learning analytics in higher education.
Conclusion

Increasingly, learning analytics is unthinkable without AI, and in line with broader developments, learning analytics will become not only progressively AI-informed, but AI-dependent. Despite numerous challenges, it is anticipated that AIED has the potential to further extend the observed benefits of learning analytics and to address some of the challenges faced by open and distance education students, such as struggling to organise their studies, requiring efficient interaction and experiencing a sense of isolation (Markova et al., 2017). It holds further potential to fulfil an important role in designing and delivering teaching material and including relevant course content, while providing efficient student support, opportunities for interaction and suitable
assessment opportunities, all essential requirements for effective distance education (Markova et al., 2017). Outside of ethical concerns pertaining to the increasing role of AI in education, there are also other concerns, such as AI systems replacing human educators (Selwyn, 2019). It is perhaps unlikely that AI systems will fully replace educators, even in open and distributed learning contexts. Rather, AIED may transform their role (Humble & Mozelius, 2019), with 'cobots' (co-working robots) assisting educators with routine tasks and customising the learning experience based on the needs of individual students (Goksel & Bozkurt, 2019). It may also reduce the need for educators to possess all of the relevant knowledge and information that students require (Roll & Wylie, as cited in Humble & Mozelius, 2019). The promise of AI in personalised learning mainly lies in increasing instructional efficiency. Despite the numerous possibilities for revolutionising higher education, and perhaps, in particular, for open and distributed learning, this chapter also provides a broad overview of ethical concerns regarding the use of AI in education. It is crucial that we understand AI, and AI in education, not as neutral, but as suggested here, as entangled in ideological, political, social, economic and technological assumptions and aspirations. Of specific concern is that using historical data to train AI may not only perpetuate bias and inequalities, but may also exacerbate these inequalities and create new forms of inequity. Ethical concerns include the impact of AI in education on students' human rights, their right to data ownership and privacy, the role of consent, and algorithmic exclusion in AIED. The chapter furthermore briefly discussed student and staff views pertaining to the use of student data in learning analytics, with implications for AIED. In conclusion, it is essential that AI systems be made easier to explain, and more auditable and transparent, ensuring their fairness and efficiency. New technologies such as AIED further illustrate the urgent need to establish an ethos of digital awareness in educational institutions. When AIED, big data, and learning analytics guide personalised educational experiences, various complex concomitant ethical issues, such as those surrounding informed consent, should be explicitly considered in institutional policies. Students should be educated about data and privacy to enable them to make informed decisions. Finally, higher education institutions should consider the limitations of AI solutions, as well as their potential for doing harm, until formal ethical frameworks for their implementation are well established.
References

Baker, T., & Smith, L. (2019). Educ-AI-tion rebooted? Exploring the future of artificial intelligence in schools and colleges. Retrieved from Nesta Foundation website: https://media.nesta.org.uk/documents/Future_of_AI_and_education_v5_WEB.pdf
Barker, T. (2011). An automated individual feedback and marking system: An empirical study. Electronic Journal of E-Learning, 9(1), 1–14.
Beer, D. (2019). The data gaze. Sage.
Berendt, B., Littlejohn, A., & Blakemore, M. (2020). AI in education: Learner choice and fundamental rights. Learning, Media and Technology, 45(3), 312–324.
Bozkurt, A., Karadeniz, A., Baneres, D., Guerrero-Roldán, A. E., & Rodríguez, M. E. (2021). Artificial intelligence and reflections from educational landscape: A review of AI studies in half a century. Sustainability, 13(2), 800. https://doi.org/10.3390/su13020800
Braunack-Mayer, A. J., Street, J. M., Tooher, R., Feng, X., & Scharling-Gamba, K. (2020). Student and staff perspectives on the use of big data in the tertiary education sector: A scoping review and reflection on the ethical issues. Review of Educational Research. https://doi.org/10.3102/0034654320960213
Broussard, M. (2018). Artificial unintelligence. MIT Press.
Chaudhari, V., Murphy, V., & Littlejohn, A. (2019). The educational intelligent economy—Lifelong learning: A vision for the future. In The educational intelligent economy: Big data, artificial intelligence, machine learning and the Internet of Things in education. Emerald Publishing Limited.
Crawford, K. (2021). Atlas of AI. Yale University Press.
Downes, S. (2016). Personal and personalized learning. https://www.downes.ca/post/65065
Dubber, M. D., Pasquale, F., & Das, S. (Eds.). (2020). The Oxford handbook of ethics of AI. Oxford Handbooks.
Eubanks, V. (2017). Automating inequality. St Martins.
Goksel, N., & Bozkurt, A. (2019). Artificial intelligence in education: Current insights and future perspectives. In S. Sisman-Ugur & G. Kurubacak (Eds.), Handbook of research on learning in the age of transhumanism (pp. 224–236). IGI Global.
Greener, S. (2019). Supervision or surveillance: The tension of learning analytics. Interactive Learning Environments, 27(2), 135–136. https://doi.org/10.1080/10494820.2019.1575631
Herodotou, C., Naydenova, G., Boroowa, A., Gilmour, A., & Rienties, B. (2020). How can predictive learning analytics and motivational interventions increase student retention and enhance administrative support in distance education? Journal of Learning Analytics, 7(2), 72–83.
Holstein, K., & Doroudi, S. (2021). Equity and artificial intelligence in education: Will "AIEd" amplify or alleviate inequities in education? arXiv preprint arXiv:2104.12920.
Humble, K. P., & Altun, D. (2020). Artificial intelligence and the threat to human rights. Journal of Internet Law, 24(3), 1–19.
Humble, N., & Mozelius, P. (2019, October 31–November 1). Artificial intelligence in education—A promise, a threat or a hype? In Proceedings of the European Conference on the Impact of Artificial Intelligence and Robotics (pp. 149–156). https://doi.org/10.34190
Kukulska-Hulme, A., Beirne, E., Conole, G., Costello, E., Coughlan, T., Ferguson, R., FitzGerald, E., Gaved, M., Herodotou, C., Holmes, W., Mac Lochlainn, C., Mhichíl, M. N. G., Rienties, B., Sargent, J., Scanlon, E., Sharples, M., & Whitelock, D. (2020). Innovating pedagogy 2020: Open University innovation report 8. The Open University.
Lambert, K. (2018). Could artificial intelligence replace our teachers? Retrieved from https://www.educationworld.com/could-artificial-intelligence-replace-our-teachers
Luckin, R., Holmes, W., Griffiths, M., & Forcier, L. B. (2016). Intelligence unleashed: An argument for AI in education. Pearson Education.
Markova, T., Glazkova, I., & Zaborova, E. (2017). Quality issues of online distance learning. Procedia—Social and Behavioral Sciences, 237, 685–691.
Mousavinasab, E., Zarifsanaiey, N., Niakan Kalhori, S. R., Rakhshan, M., Keikha, L., & Ghazi Saeedi, M. (2021). Intelligent tutoring systems: A systematic review of characteristics, applications, and evaluation methods. Interactive Learning Environments, 29(1), 142–163.
Musingafi, M. C., Mapuranga, B., Chiwanza, K., & Zebron, S. (2015). Challenges for open and distance learning (ODL) students: Experiences from students of the Zimbabwe Open University. Journal of Education and Practice, 6(18), 59–66.
O'Neil, C. (2016). Weapons of math destruction: How big data increases inequality and threatens democracy. Broadway Books.
Prinsloo, P. (2019). Tracking (un)belonging: At the intersections of human-algorithmic student support. http://dspace.col.org/handle/11599/3373
Prinsloo, P., & Slade, S. (2016). Student vulnerability, agency, and learning analytics: An exploration. Journal of Learning Analytics, 3(1), 159–182.
Renz, A., Krishnaraja, S., & Gronau, E. (2020). Demystification of artificial intelligence in education: How much AI is really in educational technology? International Journal of Learning Analytics and Artificial Intelligence for Education (iJAI), 2(1).
Rienties, B., Køhler Simonsen, H., & Herodotou, C. (2020). Defining the boundaries between artificial intelligence in education, computer-supported collaborative learning, educational data mining, and learning analytics: A need for coherence. Frontiers in Education, 5. https://doi.org/10.3389/feduc.2020.00128
Roll, I., Russell, D. M., & Gašević, D. (2018). Learning at scale. International Journal of Artificial Intelligence in Education, 28(4), 471–477.
Satell, G., & Sutton, J. (2019). We need AI that is explainable, auditable, and transparent. Retrieved from https://hbr.org/2019/10/we-need-ai-that-is-explainable-auditable-and-transparent
Schroeder, R. (2019, June 19). Emerging roles of AI in education. Inside Higher Ed [web log]. https://www.insidehighered.com/digital-learning/blogs/online-trending-now/emerging-roles-ai-education
Selwyn, N. (2014). Distrusting educational technology: Critical questions for changing times. Routledge.
Selwyn, N. (2018a). Six reasons artificial intelligence technology will never take over from human teachers. www.aare.edu.au/blog/?p=2948
Selwyn, N. (2018b). The rise of automated teaching technologies: We need to talk about robots. https://lit.blogg.gu.se/2018/06/12/the-ethical-dilemma-of-the-robot-teacher/
Selwyn, N. (2019). Should robots replace teachers? Polity Press.
Serokell. (2020). Artificial intelligence vs. machine learning vs. deep learning: What's the difference? https://medium.com/ai-in-plain-english/artificial-intelligence-vs-machine-learning-vs-deep-learning-whats-the-difference-dccce18efe7f
Sharma, R. C., Kawachi, P., & Bozkurt, A. (2019). The landscape of artificial intelligence in open, online and distance education: Promises and concerns. Asian Journal of Distance Education, 14(2), 1–2.
Slade, S., & Prinsloo, P. (2013). Learning analytics: Ethical issues and dilemmas. American Behavioral Scientist, 57(10), 1509–1528.
Smuha, N. A. (2020, December). Trustworthy artificial intelligence in education: Pitfalls and pathways. Available at SSRN: https://ssrn.com/abstract=3742421
Southgate, E. (2020). Artificial intelligence, ethics, equity and higher education: A 'beginning-of-the-discussion' paper. National Centre for Student Equity in Higher Education, Curtin University, and the University of Newcastle.
Subotzky, G., & Prinsloo, P. (2011). Turning the tide: A socio-critical model and framework for improving student success in open distance learning at the University of South Africa. Distance Education, 32(2), 177–193.
Tan, S. (2020). Artificial intelligence in education: Rise of the machines. Journal of Applied Learning and Teaching, 3(1), 129–133.
Tight, M. (2020). Student retention and engagement in higher education. Journal of Further and Higher Education, 44(5), 689–704.
Underwood, J., & Luckin, R. (2011). What is AIED and why does education need it? Report for the Teaching and Learning Research Programme: Technology Enhanced Learning—Artificial Intelligence in Education. United Kingdom. http://tel.ioe.ac.uk/wp-content/uploads/2011/06/telaied_whyaied.pdf
UNESCO. (2019). How can artificial intelligence enhance education? https://en.unesco.org/news/how-can-artificial-intelligence-enhance-education
Van Niekerk, M. P., & Schmidt, L. (2016). The ecology of distance learning: Bridging the gap between university and student. South African Journal of Higher Education, 30(5), 196–214.
Watters, A. (2017, November 1). AI is ideological. New Internationalist, 507. https://newint.org/features/2017/11/01/audrey-watters-ai
Weinberg, L. (2020). Feminist research ethics and student privacy in the age of AI. Catalyst: Feminism, Theory, Technoscience, 6(2), 1–10.
Williamson, B., & Eynon, R. (2020). Historical threads, missing links, and future directions in AI in education. Learning, Media and Technology, 45(3), 223–235. https://doi.org/10.1080/17439884.2020.1798995
Willis, J. E., Slade, S., & Prinsloo, P. (2016). Ethical oversight of student data in learning analytics: A typology derived from a cross-continental, cross-institutional perspective. Educational Technology Research and Development, 64(5), 881–901.
Zawacki-Richter, O., Marín, V. I., Bond, M., & Gouverneur, F. (2019). Systematic review of research on artificial intelligence applications in higher education—Where are the educators? International Journal of Educational Technology in Higher Education, 16(1), 39.
Zhang, K., & Aslan, A. B. (2021). AI technologies for education: Recent research & future directions. Computers and Education: Artificial Intelligence, 100025. https://doi.org/10.1016/j.caeai.2021.100025
Leona Ungerer is an associate professor in the Department of Industrial and Organisational Psychology at Unisa. Her Ph.D. focused on the relationship between South African consumers' personal values and their life satisfaction and its implications for market segmentation. Her areas of interest are consumer psychology and technology-mediated teaching and learning and she has published in both fields. She has extensive teaching experience in an OD(e)L context and curates a site involving posts about technology-mediated teaching and learning, https://www.scoop.it/topic/creative-teaching-and-learning.

Sharon Slade has worked in open, distance education for almost 20 years, as a senior lecturer at the Open University in the UK. She led the work on development of the University's Policy on the ethical use of student data for learning analytics, arguably the first of its kind in Higher Education worldwide. She has since contributed to further framework developments, notably with Jisc, Stanford University and Ithaka S+R, New America and the International Council for Open and Distance Education. Sharon was an academic lead for learning analytics projects within the Open University, leading work around ethical uses of student data, operationalisation of predictive analytics and approaches aiming to improve retention and progression. Keynotes and publications include papers around student consent, the obligation to act on what is known, examining the concept of educational triage and broader issues around an ethics of care. She now works on data insight at Earth Trust, an educational and environmental charity near Oxford.
Chapter 9
Conclusion

Paul Prinsloo, Sharon Slade, and Mohammad Khalil
It is with a sense of irony that we offer a conclusion to this book. As we acknowledged in the introductory chapter, when we invited authors to submit proposals for a book on exploring the potential and challenges of learning analytics for open, distance and distributed learning institutions and forms of delivery, no one would have imagined how the world, and in particular the education sector would be disrupted by the Covid-19 pandemic. It is therefore ironic that despite the fact that the chapters were authored before the pandemic, their content may be even more relevant now than before! As the pandemic unfolded and, to a certain extent, continues to impact on educational institutional processes, teaching, learning and student support, the use of student data, and more specifically student learning data became even more important than before. With all the uncertainties emerging from the pandemic and its aftermath, institutions, educators and students came to rely more on data and feedback, as all of them navigated their way through the 'new normal'. As editors, we are convinced that the chapters in the book provide key insights for the whole education sector, and specifically, for open, distance and distributed learning institutions and forms of delivery. From the realities of introducing learning analytics at scale in the Global South in Chap. 2, to increasing uses of mobile technologies in Chap. 4, each set of authors has sought to offer relevant insight based on their own experiences and contexts.
As we conclude both this chapter and the book, it may be useful to point to unfolding issues and concerns which may continue to impact on the adoption and operationalisation of learning analytics in open and distributed learning contexts. The following pointers are not comprehensive, but are influenced by our own experiences and understanding of how learning analytics may unfold.

1. #AgainstSurveillance
Despite the immense promise of the collection and analysis of student data, there remain concerns about the ways in which aspects of students' learning, behaviour and progress are monitored, analysed and used to make evaluative judgments about their ability, progress and needs. While the ethics of collecting, analysing and using student data has been part and parcel of research into learning analytics for many years (Prinsloo & Slade, 2016; Slade & Prinsloo, 2013; Willis et al., 2016), new constellations of open and distributed learning in, for example, MOOCs begin to raise fresh concerns (Khalil et al., 2018; Prinsloo et al., 2019). Recently, as educational institutions have responded to the global Covid-19 pandemic, and have rapidly moved traditional teaching and learning online, the use of proctoring services has raised additional concerns regarding the surveillance of students (Flaherty, 2020; Harwell, 2020; Watters, 2020; Young, 2020). As the pushback against student surveillance in online environments increases, it seems clear that surveillance and care for the well-being of students and staff are seen to be incommensurable. One of the earliest positionings of learning analytics as surveillance was that of Knox (2010) with a provocative conference presentation entitled "Spies in the house of learning". While it is possible to defend the collection, analysis and use of student data based on contractual, fiduciary and moral grounds (Slade & Prinsloo, 2013), we must not ignore the concerns about the colonisation of students' life and learning worlds by ed-tech companies (Kwet & Prinsloo, 2020; Prinsloo, 2020; Watters, 2020). This is of particular concern for open and distributed learning institutions who increasingly rely on the exploitation of student data to improve teaching and learning.

2. The perils and promises of Artificial Intelligence (AI)
Learning analytics has evolved considerably over the past decade from a field which sought to understand more about learners through simply monitoring their engagement and educational outcomes to one which predicts possible outcomes, provides personalized learning and pathways, and provides improved understanding of social networks. We are fortunate to have included a chapter focusing on another dimension of learning analytics, artificial intelligence (AI). Ungerer and Slade provide a broad overview of the issues pertaining to the ethics of AI in the context of learning analytics in open and distributed learning contexts. There is ample evidence that AI shows much potential, but as the authors point out, there are several ethical issues related to its operationalisation. Despite these concerns, we hold the view that AI may assist open and distributed learning contexts to specifically address scale, access, quality of
student support and cost (Prinsloo, 2017). Considering the huge student numbers faced by many open and distributed learning providers (see, for example, the examples of the OU and Unisa), and the careful balance between cost, quality and access, we should not ignore the potential found at the intersections of humans and algorithmic-decision making systems (Prinsloo, 2019a). While issues of cost, access and quality are germane to all educational systems, these three elements of the iron triangle, as discussed above, necessitate that we take developments in AI seriously. While it is unlikely that algorithmic decision-making systems will ‘take over’ the role of educators, student support and administration (see for example, Selwyn, 2019), AI certainly has the potential to optimise the resources required for many of the routine and administrative functions of teaching and student support (Prinsloo, 2019a, b). Without duplicating the main line of argumentation of Ungerer and Slade (earlier), we agree with Crosslin (2019) that we should consider the following: • The design and operationalisation of algorithmic decision-making systems should be guided by an understanding of “the history of educational research, learning theory, educational psychology, learning science, and curriculum and instruction.” • We should not ignore the history and continued reality of “structural inequalities and the role of tech and algorithms in creating and enforcing those inequalities.” • We should “be honest about the limitations and bias” inherent in learning analytics (also see Kitto et al., 2018). The design and use of algorithmic decision-making systems necessitate an interdisciplinary team approach. • Algorithmic transparency and accountability are key.
[Also see the "Toronto Declaration: Protecting the right to equality and non-discrimination in machine learning systems" (Amnesty International, 2018; Zaidi et al., 2018).]

3. Personalisation and adaptive learning

Closely connected to developments in AI is the growing potential of personalisation and adaptive learning. Without AI, any meaningful personalisation (of learning, student support interventions and data-informed pedagogy and assessment), as well as adaptation of students' learning experiences according to their needs (cognitive, affective and administrative), would simply not be attainable. As emphasised throughout this book and in this concluding chapter, responding to students' needs, risks and potential in open and distributed learning environments requires a careful balance between the institution's fiduciary and contractual duty of care, and preventing the collection, analysis and use of student data from sliding from justified and ethical purposes into surveillance. Prinsloo (2017, 2019a, b) provides ample examples of how AI can be utilised to respond more effectively, appropriately and ethically.

4. Multimodal Learning Analytics (MLA)

While much of the focus of early learning analytics research has related to the digital traces within learning management systems and MOOCs (Khalil & Ebner, 2016a), there is increasing interest in capturing and analysing students' data from real-world learning contexts, such as gaze, postures, motions and gestures inside classrooms and face-to-face sessions. The collection of such records of student behaviour in the classroom is, however, often constrained within learning analytics research due to ethics, privacy and security concerns (Khalil & Ebner, 2016b). Nevertheless, multimodal analytics goes beyond the tracking of students through direct surveillance. The chapter on Multimodal Learning Analytics from Khalil takes a slightly different angle, combining mobile technologies with learning analytics. Data gathered from uses of mobile technologies may provide a rich multimodal data source which could be further unlocked in learning analytics research to improve learning. Such research is exciting, though it is likely still to trigger discomfort for many learning analytics stakeholders, including but not limited to learners, teachers, and leaders at both schools and higher education institutions. Ochoa (2017) notes that while privacy concerns strongly restrain applications of multimodal learning analytics, there are other issues within MLA which remain little researched: the impact of MLA on learning, the integration of MLA into physical sessions, and the impact of interrupted recordings (as a result of, for example, glitches in cameras, microphones and sensors) that distort measures of how learning is happening.

5. The role of data in uncertain, complicated and chaotic times

The Covid-19 pandemic has significantly impacted staff and students, processes, and the facilitation of learning and student support, regardless of the form of educational delivery. As students' behavioural patterns have changed in response to the evolving situation, our beliefs about engagement and learning progress have needed further scrutiny, forcing fresh consideration of our categories, our predictive models, and our understanding of how students learn. The pandemic has offered us an opportunity to rethink the data that we already collect and what data we need to collect, as well as our understanding of the uses of that data.
When the Call for Proposals was circulated and the proposals first received, we could not have imagined that the final result would include such a wide range of interesting approaches to the collection, analysis and use of student data as presented in the eight chapters here. In concluding this book, we recognise that, although these chapters provide glimpses of how different forms of learning analytics function in open and distributed learning environments, they may also raise further questions. It is our hope that readers will engage with these chapters and go on to research the potential and reality of learning analytics in open and distributed learning environments.
References
Amnesty International. (2018). The Toronto Declaration: Protecting the right to equality and non-discrimination in machine learning systems. https://www.amnesty.org/download/Documents/POL3084472018ENGLISH.PDF
Crosslin, M. (2019, June 14). So, what do you want from learning analytics? [Web log post]. https://www.edugeekjournal.com/2019/06/14/so-what-do-you-want-from-learning-analytics/?fbclid=IwAR1LuAMF4xXn4HLOXjlzO7t18756YZdYvh_BlBu1nuFFQnJ65LWwKDtz5vU
Flaherty, C. (2020, May 11). Big proctor. Inside Higher Ed. https://www.insidehighered.com/news/2020/05/11/online-proctoring-surging-during-covid-19
Harwell, D. (2020, November 12). Cheating-detection companies made millions during the pandemic. Now students are fighting back. Washington Post. https://www.washingtonpost.com/technology/2020/11/12/test-monitoring-student-revolt/
Khalil, M., & Ebner, M. (2016a). De-identification in learning analytics. Journal of Learning Analytics, 3(1), 129–138.
Khalil, M., & Ebner, M. (2016b). When learning analytics meets MOOCs: A review on iMooX case studies. In International Conference on Innovations for Community Services (pp. 3–19). Springer.
Khalil, M., Prinsloo, P., & Slade, S. (2018). User consent in MOOCs: Micro, meso, and macro perspectives. International Review of Research in Open and Distributed Learning, 19(5).
Kitto, K., Buckingham Shum, S., & Gibson, A. (2018, March 5–9). Embracing imperfection in learning analytics. In Proceedings of LAK18: International Conference on Learning Analytics and Knowledge (pp. 451–460). ACM. https://doi.org/10.1145/3170358.3170413
Knox, D. (2010). Spies in the house of learning: A typology of surveillance in online learning environments. Paper presented at Edge 2010, Newfoundland, Canada.
Kwet, M., & Prinsloo, P. (2020). The 'smart' classroom: A new frontier in the age of the smart university. Teaching in Higher Education. https://doi.org/10.1080/13562517.2020.1734922
Ochoa, X. (2017). Multimodal learning analytics. The Handbook of Learning Analytics, 1, 129–141.
Prinsloo, P. (2017). Fleeing from Frankenstein's monster and meeting Kafka on the way: Algorithmic decision-making in higher education. E-Learning and Digital Media, 14(3), 138–163.
Prinsloo, P. (2019a, September 12–14). Tracking (un)belonging: At the intersections of human-algorithmic student support. In Conference Proceedings, Pan-Commonwealth of Learning Conference. http://oasis.col.org/handle/11599/3373
Prinsloo, P. (2019b, November 14). Teaching in the dark: The perils, potential and praxis of using student data to inform teaching and learning. Invited presentation, University of South Africa (Unisa). https://doi.org/10.13140/RG.2.2.35786.52163
Prinsloo, P. (2020). Data frontiers and frontiers of power in higher education: A view of the Global South. Teaching in Higher Education. https://doi.org/10.1080/13562517.2020.1723537
Prinsloo, P., & Slade, S. (2016). Student vulnerability, agency, and learning analytics: An exploration. Journal of Learning Analytics, 3(1), 159–182.
Prinsloo, P., Slade, S., & Khalil, M. (2019). Student data privacy in MOOCs: A sentiment analysis. Distance Education, 40(3), 395–413.
Selwyn, N. (2019). What's the problem with learning analytics? Journal of Learning Analytics, 6(3), 11–19.
Slade, S., & Prinsloo, P. (2013). Learning analytics: Ethical issues and dilemmas. American Behavioral Scientist, 57(10), 1510–1529.
Watters, A. (2020, July 20). Building anti-surveillance ed-tech. http://hackeducation.com/2020/07/20/surveillance
Willis, J. E., Slade, S., & Prinsloo, P. (2016). Ethical oversight of student data in learning analytics: A typology derived from a cross-continental, cross-institutional perspective. Educational Technology Research and Development, 64(5), 881–901.
Young, J. R. (2020, November 13). Pushback is growing against automated proctoring services. But so is their use. EdSurge. https://www.edsurge.com/news/2020-11-13-pushback-is-growing-against-automated-proctoring-services-but-so-is-their-use
Zaidi, A., Beadle, S., & Hannah, A. (2018). Review of the online learning and artificial intelligence education market. Government of the United Kingdom. https://www.gov.uk/government/publications/review-of-the-online-learning-and-artificial-intelligence-education-market
Paul Prinsloo is a Research Professor in Open and Distance Learning (ODL) in the Department of Business Management, College of Economic and Management Sciences, University of South Africa (Unisa). Since 2015, he has also been a Visiting Professor at the Carl von Ossietzky University of Oldenburg, Germany. In 2019, the National Research Foundation (NRF) in South Africa awarded Paul a B3 rating, confirming his considerable international reputation for the high quality and impact of his research outputs. He is also a Fellow of the European Distance and E-Learning Network (EDEN) and serves on several editorial boards. His academic background includes fields as diverse as theology, art history, business management, online learning, and religious studies. Paul is an internationally recognised speaker, scholar and researcher and has published numerous articles in the fields of teaching and learning, student success in distance education contexts, learning analytics, and curriculum development. His current research focuses on the collection, analysis and use of student data in learning analytics, graduate supervision and digital identity.

Sharon Slade has worked in open and distance education for almost 20 years as a senior lecturer at the Open University in the UK. She led the work on the development of the University's policy on the ethical use of student data for learning analytics, arguably the first of its kind in higher education worldwide. She has since contributed to further framework developments, notably with Jisc, Stanford University and Ithaka S+R, New America and the International Council for Open and Distance Education. Sharon was an academic lead for learning analytics projects within the Open University, leading work around ethical uses of student data, the operationalisation of predictive analytics and approaches aiming to improve retention and progression. Keynotes and publications include papers around student consent, the obligation to act on what is known, the concept of educational triage and broader issues around an ethics of care. She now works on data insight at Earth Trust, an educational and environmental charity near Oxford.

Dr. Mohammad Khalil is a senior researcher and lecturer in learning analytics at the Centre for the Science of Learning & Technology (SLATE) at the Faculty of Psychology, University of Bergen, Norway. Mohammad has a master's degree in information security and digital criminology and a Ph.D. from Graz University of Technology on learning analytics in Massive Open Online Courses (MOOCs). Khalil has rich international experience, having worked in four different countries since 2015. He has published over 50 articles on learning analytics in high-quality and well-recognised journals and academic conferences, focusing on understanding and improving student behaviour and engagement in digital learning platforms using data science. His current research focuses on learning analytics in Open and Distance Learning (ODL), self-regulated learning, mobile, visualisations and gamification, as well as privacy and ethics. His personal website is http://mohdkhalil.wordpress.com.
Correction to: A Global South Perspective on Learning Analytics in an Open Distance E-learning (ODeL) Institution Angelo Fynn, Jaroslaw Adamiak, and Kelly Young
Correction to: Chapter 3 in: P. Prinsloo et al. (eds.), Learning Analytics in Open and Distributed Learning, SpringerBriefs in Open and Distance Education, https://doi.org/10.1007/978-981-19-0786-9_3

The original version of Chapter 3 was inadvertently published with text missing from "The Resulting (Learning) Analytics" section. The missing text has now been included and the chapter has been updated with the changes.
The updated version of this chapter can be found at https://doi.org/10.1007/978-981-19-0786-9_3
© The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd. 2022 P. Prinsloo et al. (eds.), Learning Analytics in Open and Distributed Learning, SpringerBriefs in Open and Distance Education, https://doi.org/10.1007/978-981-19-0786-9_10