Handbook of Research on Innovative Pedagogies and Technologies for Online Learning in Higher Education Phu Vu University of Nebraska at Kearney, USA Scott Fredrickson University of Nebraska at Kearney, USA Carl Moore University of the District of Columbia, USA
A volume in the Advances in Higher Education and Professional Development (AHEPD) Book Series
Published in the United States of America by IGI Global Information Science Reference (an imprint of IGI Global) 701 E. Chocolate Avenue Hershey PA, USA 17033 Tel: 717-533-8845 Fax: 717-533-8661 E-mail: [email protected] Web site: http://www.igi-global.com Copyright © 2017 by IGI Global. All rights reserved. No part of this publication may be reproduced, stored or distributed in any form or by any means, electronic or mechanical, including photocopying, without written permission from the publisher. Product or company names used in this set are for identification purposes only. Inclusion of the names of the products or companies does not indicate a claim of ownership by IGI Global of the trademark or registered trademark. Library of Congress Cataloging-in-Publication Data ISBN: 978-1-5225-1851-8 eISBN: 978-1-5225-1852-5 This book is published in the IGI Global book series Advances in Higher Education and Professional Development (AHEPD) (ISSN: 2327-6983; eISSN: 2327-6991)
British Cataloguing in Publication Data A Cataloguing in Publication record for this book is available from the British Library. All work contributed to this book is new, previously-unpublished material. The views expressed in this book are those of the authors, but not necessarily of the publisher. For electronic access to this publication, please contact: [email protected].
Advances in Higher Education and Professional Development (AHEPD) Book Series Jared Keengwe University of North Dakota, USA
Mission
ISSN:2327-6983 EISSN:2327-6991
As world economies continue to shift and change in response to global financial situations, job markets have begun to demand a more highly-skilled workforce. In many industries a college degree is the minimum requirement and further educational development is expected to advance. With these current trends in mind, the Advances in Higher Education & Professional Development (AHEPD) Book Series provides an outlet for researchers and academics to publish their research in these areas and to distribute these works to practitioners and other researchers. AHEPD encompasses all research dealing with higher education pedagogy, development, and curriculum design, as well as all areas of professional development, regardless of focus.
Coverage
• Adult Education • Assessment in Higher Education • Career Training • Coaching and Mentoring • Continuing Professional Development • Governance in Higher Education • Higher Education Policy • Pedagogy of Teaching Higher Education • Vocational Education
IGI Global is currently accepting manuscripts for publication within this series. To submit a proposal for a volume in this series, please contact our Acquisition Editors at [email protected] or visit: http://www.igi-global.com/publish/.
The Advances in Higher Education and Professional Development (AHEPD) Book Series (ISSN 2327-6983) is published by IGI Global, 701 E. Chocolate Avenue, Hershey, PA 17033-1240, USA, www.igi-global.com. This series is composed of titles available for purchase individually; each title is edited to be contextually exclusive from any other title within the series. For pricing and ordering information please visit http://www.igi-global.com/book-series/advances-higher-education-professional-development/73681. Postmaster: Send all address changes to above address. Copyright © 2017 IGI Global. All rights, including translation in other languages reserved by the publisher. No part of this series may be reproduced or used in any form or by any means – graphics, electronic, or mechanical, including photocopying, recording, taping, or information and retrieval systems – without written permission from the publisher, except for non commercial, educational use, including classroom teaching purposes. The views expressed in this series are those of the authors, but not necessarily of IGI Global.
Titles in this Series
For a list of additional titles in this series, please visit: www.igi-global.com
Handbook of Research on Efficacy and Implementation of Study Abroad Programs for P-12 Teachers Heejung An (William Paterson University of New Jersey, USA) Information Science Reference • copyright 2017 • 474pp • H/C (ISBN: 9781522510574) • US $285.00 (our price) Promoting Intercultural Communication Competencies in Higher Education Grisel María García-Pérez (University of British Columbia, Okanagan Campus, Canada) and Constanza Rojas-Primus (Kwantlen Polytechnic University, Canada) Information Science Reference • copyright 2017 • 361pp • H/C (ISBN: 9781522517320) • US $175.00 (our price) Adult Education and Vocational Training in the Digital Age Victor C.X. Wang (Florida Atlantic University, USA) Information Science Reference • copyright 2017 • 295pp • H/C (ISBN: 9781522509295) • US $190.00 (our price) Advancing Next-Generation Teacher Education through Digital Tools and Applications Mary Grassetti (Framingham State University, USA) and Silvy Brookby (Framingham State University, USA) Information Science Reference • copyright 2017 • 324pp • H/C (ISBN: 9781522509653) • US $180.00 (our price) Handbook of Research on Academic Misconduct in Higher Education Donna M. Velliaris (Eynesbury Institute of Business and Technology, Australia) Information Science Reference • copyright 2017 • 440pp • H/C (ISBN: 9781522516101) • US $225.00 (our price) Preparing Pre-Service Teachers for the Inclusive Classroom Patricia Dickenson (National University, USA) Penelope Keough (National University, USA) and Jennifer Courduff (Azusa Pacific University, USA) Information Science Reference • copyright 2017 • 323pp • H/C (ISBN: 9781522517535) • US $180.00 (our price) Teacher Education for Ethical Professional Practice in the 21st Century Oliver Dreon (Millersville University, USA) and Drew Polly (University of North Carolina at Charlotte, USA) Information Science Reference • copyright 2017 • 430pp • H/C (ISBN: 9781522516682) • US $190.00 (our price) Handbook of Research on Competency-Based Education in University Settings Karen Rasmussen (University of West Florida, USA) Pamela Northrup (University of West Florida, USA) and Robin Colson (University of West Florida, USA) Information Science Reference • copyright 2017 • 454pp • H/C (ISBN: 9781522509325) • US $275.00 (our price)
701 E. Chocolate Ave., Hershey, PA 17033 Order online at www.igi-global.com or call 717-533-8845 x100 To place a standing order for titles released in this series, contact: [email protected] Mon-Fri 8:00 am - 5:00 pm (est) or fax 24 hours a day 717-533-8661
List of Contributors
Altowairiki, Noha / University of Calgary, Canada.......................................... 151 Altowairiki, Noha F. / University of Calgary, Canada...................................... 218 Antonova, Albena / EI Centre, Bulgaria............................................ 127 Atun, Handan / Necmettin Erbakan University, Turkey........................................ 1 Barreto, Daisyane / UNCW, USA...................................... 106 Benson, Susan N. Kushner / The University of Akron, USA............................. 263 Bouchrika, Imed / University of Souk Ahras, Algeria....................................... 427 Brautlacht, Regina / Bonn-Rhein-Sieg University of Applied Sciences BRSU, Germany................ 393 Conklin, Sheri Anderson / UNCW, USA........................................... 106 da Rosa dos Santos, Luciano / University of Calgary, Canada....................................... 218 Ducrocq, Csilla / Paris-Sud University, Faculty of Sciences, France and Paris Graduate School of Economics, Statistics and Finance, France............................................ 393 Harrati, Nouzha / University of Souk Ahras, Algeria........................................ 427 Harris, Rachelle / Milestone's Educational Consulting, USA........................................... 304 Hill, S. Laurie / St. Mary's University, Canada................................................. 218 Honeycutt, Barbi / FLIP It Consulting, USA.................................... 449 Hristova, Plama / EI Centre, Bulgaria............................................... 127 Johnson, Carol / University of Calgary, Canada....................................... 151,218 Kasemsap, Kijpokin / Suan Sunandha Rajabhat University, Thailand............................................ 367 King-Berry, Arlene / The University of the District of Columbia, USA........................................... 304 Korucu, Agah Tugrul / Necmettin Erbakan University, Turkey............................................ 1 Koskey, Kristin L. K. / The University of Akron, USA....................................... 263 Ladjailia, Ammar / University of Souk Ahras, Algeria..................................... 427 Lim, Jieun / Purdue University, USA................................................ 19 Lock, Jennifer / University of Calgary, Canada............................................... 218 Mahfouf, Zohra / University of Souk Ahras, Algeria........................................ 427 Martins, Maria Lurdes / School of Technology and Management, Polytechnic Institute of Viseu, Portugal............................................ 393
Nuninger, Walter / University of Lille, France.................................................. 331 Ogden, Lori / West Virginia University, USA.................................... 281 Olesova, Larisa / George Mason University, USA.............................................. 19 Ostrowski, Christopher P. / University of Calgary, Canada............................................. 218 Oyarzun, Beth Allred / University of North Carolina Wilmington, USA.......................................... 106 Poppi, Franca / University of Modena and Reggio Emilia, Italy...................................... 393 Qian, Yufeng / Northeastern University, USA................................................... 236
Romero-Hall, Enilda / University of Tampa, USA.............................................. 85 Scott, JoAnne Dalton / Instructional Design Practitioner/Researcher, USA...................................... 40 Shambaugh, Neal / West Virginia University, USA........................................... 281 Stirtz, Geraldine E / University of Nebraska Kearney, USA............................................... 60 Thomas, Morris / University of the District of Columbia, USA........................................ 304 Tobin, Thomas J. / The Pennsylvania State University, USA............................................. 449 Tsvetkova, Nikolina / EI Centre, Bulgaria......................................... 127 Vicentini, Cristiane Rocha / University of Tampa, USA...................................... 85 Vu, Lan / Southern Illinois University at Carbondale, USA.............................................. 178
Table of Contents
Foreword............................................................................................................................................. xvii Chapter 1 Use of Social Media in Online Learning................................................................................................. 1 Agah Tugrul Korucu, Necmettin Erbakan University, Turkey Handan Atun, Necmettin Erbakan University, Turkey Chapter 2 The Impact of Role Assignment on Cognitive Presence in Asynchronous Online Discussion............. 19 Larisa Olesova, George Mason University, USA Jieun Lim, Purdue University, USA Chapter 3 Promoting Learner Interaction and Personalized Learning Experiences with a Google+ Social Media Model: How to Replace the Traditional Discussion Forum....................................................... 40 JoAnne Dalton Scott, Instructional Design Practitioner/Researcher, USA Chapter 4 Online Professional Development in Academic Service-Learning: Promoting Community Engagement in Public Education........................................................................................................... 60 Geraldine E Stirtz, University of Nebraska Kearney, USA Chapter 5 Multimodal Interactive Tools for Online Discussions and Assessment................................................ 85 Enilda Romero-Hall, University of Tampa, USA Cristiane Rocha Vicentini, University of Tampa, USA Chapter 6 Instructor Presence............................................................................................................................... 106 Beth Allred Oyarzun, University of North Carolina Wilmington, USA Sheri Anderson Conklin, UNCW, USA Daisyane Barreto, UNCW, USA
Chapter 7 Developing a Pedagogical Framework for Simulated Practice Learning: How to Improve Simulated Training of Social Workers who Interact with Vulnerable People..................................... 127 Nikolina Tsvetkova, EI Centre, Bulgaria Albena Antonova, EI Centre, Bulgaria Plama Hristova, EI Centre, Bulgaria Chapter 8 Developing Teaching Presence in Online Learning Through Shared Stakeholder Responsibility...... 151 Carol Johnson, University of Calgary, Canada Noha Altowairiki, University of Calgary, Canada Chapter 9 A Case Study of Peer Assessment in a Composition MOOC: Students' Perceptions and Peer-grading Scores versus Instructor-grading Scores................................................................................. 178 Lan Vu, Southern Illinois University at Carbondale, USA Chapter 10 A Journey Through the Development of Online Environments: Putting UDL Theory into Practice.. 218 Christopher P. Ostrowski, University of Calgary, Canada Jennifer Lock, University of Calgary, Canada S. Laurie Hill, St. Mary's University, Canada Luciano da Rosa dos Santos, University of Calgary, Canada Noha F. Altowairiki, University of Calgary, Canada Carol Johnson, University of Calgary, Canada Chapter 11 Computer Simulation in Higher Education: Affordances, Opportunities, and Outcomes................... 236 Yufeng Qian, Northeastern University, USA Chapter 12 A Review of Literature and a Model for Scaffolding Asynchronous Student-Student Interaction in Online Discussion Forums................................................................................................................... 263 Kristin L. K. Koskey, The University of Akron, USA Susan N. Kushner Benson, The University of Akron, USA Chapter 13 Best Teaching and Technology Practices for the Hybrid Flipped College Classroom........................ 281 Lori Ogden, West Virginia University, USA Neal Shambaugh, West Virginia University, USA Chapter 14 Creating Inclusive Online Learning Environments That Build Community and Enhance Learning.. 304 Morris Thomas, University of the District of Columbia, USA Rachelle Harris, Milestone's Educational Consulting, USA Arlene King-Berry, The University of the District of Columbia, USA
Chapter 15 Common Scenario for an Efficient Use of Online Learning: Some Guidelines for Pedagogical Digital Device Development................................................................................................................ 331 Walter Nuninger, University of Lille, France Chapter 16 Electronic Learning: Theory and Applications.................................................................................... 367 Kijpokin Kasemsap, Suan Sunandha Rajabhat University, Thailand Chapter 17 European Dialogue Project: Collaborating to Improve on the Quality of Learning Environments..... 393 Regina Brautlacht, Bonn-Rhein-Sieg University of Applied Sciences BRSU, Germany Franca Poppi, University of Modena and Reggio Emilia, Italy Maria Lurdes Martins, School of Technology and Management, Polytechnic Institute of Viseu, Portugal Csilla Ducrocq, Paris-Sud University, Faculty of Sciences, France and Paris Graduate School of Economics, Statistics and Finance, France Chapter 18 Evaluation Methods for E-Learning Applications in Terms of User Satisfaction and Interface Usability............................................................................................................................................... 427 Nouzha Harrati, University of Souk Ahras, Algeria Imed Bouchrika, University of Souk Ahras, Algeria Zohra Mahfouf, University of Souk Ahras, Algeria Ammar Ladjailia, University of Souk Ahras, Algeria Chapter 19 Improve the Flipped Classroom with Universal Design for Learning................................................. 449 Thomas J. Tobin, The Pennsylvania State University, USA Barbi Honeycutt, FLIP It Consulting, USA Compilation of References................................................................................................................ 472 About the Contributors..................................................................................................................... 537 Index.................................................................................................................................................... 545
Detailed Table of Contents
Foreword............................................................................................................................................. xvii Chapter 1 Use of Social Media in Online Learning................................................................................................. 1 Agah Tugrul Korucu, Necmettin Erbakan University, Turkey Handan Atun, Necmettin Erbakan University, Turkey Social media tools are especially used to visualize resources. However, the range of content created in social media is limited; instructors and students tend to use shared materials rather than edit existing material or create new material. This does not change the position of social media in education, however; it has been shown that social media improves the teaching and learning process. Therefore, researchers have stressed that institutions should consider supporting academic staff with technical and pedagogical guidance, as academicians do not meet the requirements of digital-native students; their web self-efficacy and digital competencies should be improved. Chapter 2 The Impact of Role Assignment on Cognitive Presence in Asynchronous Online Discussion............. 19 Larisa Olesova, George Mason University, USA Jieun Lim, Purdue University, USA This study examined the impact of role assignment on cognitive presence when students participated in asynchronous online threaded discussions. A mixed methods design was used to investigate changes in the levels of cognitive presence while the students participated in an online introductory nutrition course. This study found evidence that scripted role assignment can be an effective instructional strategy when the approach is implemented in asynchronous online discussions. Implications for instructors and designers of asynchronous online learning environments are discussed. Chapter 3 Promoting Learner Interaction and Personalized Learning Experiences with a Google+ Social Media Model: How to Replace the Traditional Discussion Forum....................................................... 40 JoAnne Dalton Scott, Instructional Design Practitioner/Researcher, USA This chapter presents the Directed Google+ Community model (DG+) as an alternative to the traditional discussion board forum. Social media platforms exhibit characteristics that can be leveraged in course design to promote positive learner experiences. Specifically, the chapter will define the DG+ model; examine how it promotes learner interaction, discussion, collaboration and peer review; discuss how it
supports course topics, course assignments and creates a searchable knowledge management system; and explain how it complements the use of a learning management system for grade reporting purposes. Both the instructor and the students experience benefits from this design tool. The chapter will also discuss ways to overcome potential obstacles to implementing the model. Chapter 4 Online Professional Development in Academic Service-Learning: Promoting Community Engagement in Public Education........................................................................................................... 60 Geraldine E Stirtz, University of Nebraska Kearney, USA The overall purpose of this qualitative study was to describe how a 3-credit-hour, web-based, graduate-level course in service-learning pedagogy supports the theory that service-learning as a pedagogy can be taught effectively in an online format. Service-learning integrates community service with academic study to enrich learning, teach civic responsibility and strengthen communities. Content analysis of the selected case studies and evaluation of the students' reflections concludes that the students enrolled in the online class at a Midwestern university were, in fact, able to learn this teaching strategy and then effectively implement this strategy with their classrooms of students in their local communities. The Literature Review discusses numerous research articles supporting the value of this teaching strategy of collaboration with community partners in citizenship training for youth, children and young adults. Chapter 5 Multimodal Interactive Tools for Online Discussions and Assessment................................................ 85 Enilda Romero-Hall, University of Tampa, USA Cristiane Rocha Vicentini, University of Tampa, USA The purpose of this chapter is to discuss the enhancement of asynchronous online discussions and assessment using multimodal interactive tools that allow text, video, and audio posts. The integration of these multimodal interactive tools as well as their affordances could lead to powerful changes in the learning experience of students interacting in asynchronous online environments. Along with providing an overview of asynchronous online discussions, the chapter will include a review of how multimodal interactive tools are used to engage learners in online discussions using text, audio, and video. Additionally, the chapter will describe both the benefits and challenges of asynchronous online discussions with text, audio, and video posting. Furthermore, the chapter will describe how the same multimodal interactive tools can also serve as an assessment method in asynchronous online learning of specialized subject areas. Chapter 6 Instructor Presence............................................................................................................................... 106 Beth Allred Oyarzun, University of North Carolina Wilmington, USA Sheri Anderson Conklin, UNCW, USA Daisyane Barreto, UNCW, USA Student isolation and retention rates are persistent issues in online learning. Research has shown that an important component of student performance and satisfaction is instructor presence. Instructor presence includes three elements: 1) Teaching presence, 2) Instructor immediacy, and 3) Social presence. This chapter will use this definition of instructor presence to outline best pedagogical practices with concrete examples to increase instructor presence in asynchronous online courses.
Each section will begin with a definition and research on that construct followed by best practices with concrete examples.
Chapter 7 Developing a Pedagogical Framework for Simulated Practice Learning: How to Improve Simulated Training of Social Workers who Interact with Vulnerable People..................................... 127 Nikolina Tsvetkova, EI Centre, Bulgaria Albena Antonova, EI Centre, Bulgaria Plama Hristova, EI Centre, Bulgaria While simulated learning is becoming an attractive learning method for learners and educators, it is the pedagogical framework behind the technology design that makes the learning efficient. Thus the context and the subject domain, along with learning theories, largely influence its impact. Working with vulnerable people has become part of the specifics of many jobs. Therefore, the main goal of the chapter is to present the pedagogical framework for simulated practice learning for social workers who interact with vulnerable people. It takes into consideration both the theories of learning and the features of games-based learning. It also outlines the relations between the broader social context, the particular educational setting and the learner, the trainer and the vulnerable person. The focus of the presented simulated learning is on teacher training for child-care professionals who work with 3- to 7-year-old children. The Pedagogical Framework is developed under the Simulated Practice for Skills Development in Social Services and Healthcare - Digital Bridges project (2014-1-UK01-KA200-001805). Chapter 8 Developing Teaching Presence in Online Learning Through Shared Stakeholder Responsibility...... 151 Carol Johnson, University of Calgary, Canada Noha Altowairiki, University of Calgary, Canada Transitioning from a face-to-face teaching environment to online teaching requires a shift in paradigm by the stakeholders involved (i.e., instructors and students). This chapter provides an extensive literature review to help novice online instructors understand the nature of online teaching presence to help position their students towards more active participation. Premised on the Community of Inquiry framework (Garrison, Anderson & Archer, 2000) and constructivism, we highlight a conceptual framework of four iterative processes for developing online teaching presence: preparations for facilitation, designing the facilitation, implementing the facilitation, and assessing the facilitation. Based on this framework, strategies are articulated for overcoming the challenges of online learning through shared stakeholder responsibility. Chapter 9 A Case Study of Peer Assessment in a Composition MOOC: Students' Perceptions and Peer-grading Scores versus Instructor-grading Scores................................................................................. 178 Lan Vu, Southern Illinois University at Carbondale, USA The large enrollments of multiple thousands of students in MOOCs seem to exceed the assessment capacity of instructors; therefore, the inability of instructors to grade so many papers is likely responsible for MOOCs turning to peer assessment. However, there has been little empirical research about peer assessment in MOOCs, especially composition MOOCs. This study aimed to address issues in peer assessment in a composition MOOC, particularly the students' perceptions and the peer-grading scores versus instructor-grading scores. The findings provided evidence that peer assessment was well received by the majority of students, although many students also expressed negative feelings about this activity.
Statistical analysis shows that there were significant differences between the grades given by students and those given by the instructors, which means the grades the students awarded to their peers tended to be higher in comparison to the instructor-assigned grades. Based on the results, this study concludes with implications for peer assessment in a composition MOOC context.
Chapter 10 A Journey Through the Development of Online Environments: Putting UDL Theory into Practice.. 218 Christopher P. Ostrowski, University of Calgary, Canada Jennifer Lock, University of Calgary, Canada S. Laurie Hill, St. Mary's University, Canada Luciano da Rosa dos Santos, University of Calgary, Canada Noha F. Altowairiki, University of Calgary, Canada Carol Johnson, University of Calgary, Canada As higher education institutions move toward offering more online courses, they need to carefully consider how the principles of Universal Design for Learning (UDL) should be integrated into the design and development of the online environments so as to better meet the needs of all learners. An example of how this can occur is illustrated in the chapter with a design project that used principles of UDL in the creation of online environments for field experience courses at one Canadian university. The design team shares the journey of developing their understanding of UDL and applying these principles when creating online environments for both students and instructors. The provision of educational developmental opportunities for instructors using various strategies is also highlighted. The chapter concludes with three recommendations for future research. Chapter 11 Computer Simulation in Higher Education: Affordances, Opportunities, and Outcomes................... 236 Yufeng Qian, Northeastern University, USA Computer simulation as both an instructional strategy and technology holds great potential to transform teaching and learning. However, there have been terminological ambiguity and typological inconsistency when the term computer simulation is used in the education setting. This chapter identifies three core components of computer simulation, and develops a learning outcome-based categorization, linking together computer simulation's technical affordances, learning opportunities, and learning outcomes. Exemplary computer simulations in higher education are identified to illustrate the unique affordances, opportunities, and outcomes of each type of computer simulation. Chapter 12 A Review of Literature and a Model for Scaffolding Asynchronous Student-Student Interaction in Online Discussion Forums................................................................................................................... 263 Kristin L. K. Koskey, The University of Akron, USA Susan N. Kushner Benson, The University of Akron, USA The purpose of this chapter is to overview types of asynchronous student-student interactions with a focus on designed interaction in an online discussion forum context, as well as to illustrate pedagogical approaches to scaffolding interactions. Student-student interaction in asynchronous online discussion is the emphasis of this chapter. The chapter focuses on a review of the literature on the roles of the instructor, student, and learning task in the online teaching and learning process. Ways in which these roles interact are then discussed, including an overview of types of interactions. The chapter then focuses on contextual and designed interactions, including conditions documented in research as to how to effectively use designed interaction to scaffold student-student interaction. Next, a guiding model is presented for how to plan for asynchronous interaction. Finally, challenges faced when designing or implementing synchronous discussions are discussed, as well as potential recommendations for overcoming these challenges.
Chapter 13 Best Teaching and Technology Practices for the Hybrid Flipped College Classroom........................ 281 Lori Ogden, West Virginia University, USA Neal Shambaugh, West Virginia University, USA Two cases of the flipped classroom approach, one an undergraduate course and one a graduate course, are used to demonstrate the different ways that flipping instruction can occur in both F2F and online courses, thus extending the notion of hybrid and flipped teaching decisions with F2F and virtual classrooms. Both cases are summarized in terms of instructional design decisions, the models of teaching framework, and research conducted on the courses. Findings from research conducted on both courses indicate that a flipped classroom approach can enhance the teaching of both F2F and online courses as it provides instructors an opportunity to adapt instruction to meet the individual needs of students. Recommendations, based on this course development work, are provided for undergraduate and graduate courses in terms of access, meaningful activities, and feedback. Chapter 14 Creating Inclusive Online Learning Environments That Build Community and Enhance Learning.. 304 Morris Thomas, University of the District of Columbia, USA Rachelle Harris, Milestone's Educational Consulting, USA Arlene King-Berry, The University of the District of Columbia, USA The current status of today's society is driven by and involves technology. Many people cannot function without their cell phones, social media, gadgets, tablets, and other forms of technology through which people interact. Many of these technologies depend upon and are utilized within an online context. However, as it pertains to online learning environments, many faculty struggle with developing and implementing opportunities that build a sense of community for their learners. This chapter: 1) Discusses key factors that impact student engagement, 2) Addresses factors that facilitate continued engagement for diverse online learners, 3) Provides evidence-based practices for creating and sustaining online learner engagement, and 4) Offers real-world suggestions from the online teaching experience of the chapter's authors. Chapter 15 Common Scenario for an Efficient Use of Online Learning: Some Guidelines for Pedagogical Digital Device Development................................................................................................................ 331 Walter Nuninger, University of Lille, France Training efficiency required for Higher Education (quality, accessibility, bigger groups with heterogeneous prior experience, funding, competition…) encourages providers to find new ways to facilitate access to knowledge and enhance skills. In this scope, the use of digital pedagogical devices has increased with innovative solutions; ones based on an LMS to support a blended course or MOOCs designed for self-education. This evolution has impacted teaching practices, learning and organizations, leading to a new paradigm for trainers and a new business model to be found for online and distance learning. The innovation mostly relies on the use of learner-centered digital learning solutions in a comprehensive way for the commitment of more active and independent learners and their skills recognition. Based on a 3-year experiment (hybridized course for CVT) and continuous improvement in the WIL, a common scenario is proposed to address the issue for distance training.
Chapter 16 Electronic Learning: Theory and Applications.................................................................................... 367 Kijpokin Kasemsap, Suan Sunandha Rajabhat University, Thailand This chapter aims to present an overview of electronic learning (e-learning); the emerging trends in e-learning; the important factors of e-learning; the relationships among e-learning quality, learning satisfaction, and learning motivation; the implementation of e-learning; the approaches and barriers to e-learning utilization; e-learning for medical and nursing education; and the significance of e-learning in modern education. When compared to the traditional mode of classroom learning, there is clear evidence that e-learning brings faster delivery, lower costs, more effective learning, and lower environmental impact in modern learning environments. E-learning allows each individual to tackle the subject at their own pace, with interactive tasks being set in place to ensure a thorough understanding throughout each module. The chapter argues that utilizing e-learning has the potential to increase educational performance and reach strategic goals in modern education. Chapter 17 European Dialogue Project: Collaborating to Improve on the Quality of Learning Environments..... 393 Regina Brautlacht, Bonn-Rhein-Sieg University of Applied Sciences BRSU, Germany Franca Poppi, University of Modena and Reggio Emilia, Italy Maria Lurdes Martins, School of Technology and Management, Polytechnic Institute of Viseu, Portugal Csilla Ducrocq, Paris-Sud University, Faculty of Sciences, France and Paris Graduate School of Economics, Statistics and Finance, France Telecollaborating and communicating in online contexts using English as a Lingua Franca (ELF) requires students to develop multiple literacies in addition to foreign language skills and intercultural communicative competence. This chapter looks at the intersection of technology and teaching ELF, examining mutual contributions of technologies, more specifically Web 2.0, and ELF to each other, and the challenges in designing and implementing collaboration projects across cultures. Moreover, it looks at how the development of digital competencies in ELF (DELF) can be enhanced through the implementation of Web 2.0 mediated intercultural dialogues. The details of the research design, including the internet tools used, participants, and tasks, are also discussed. Data analysis points to a positive attitude towards telecollaboration, also providing confirmation of some of the problems identified in the theoretical framework, such as different levels of personal engagement.
Chapter 18 Evaluation Methods for E-Learning Applications in Terms of User Satisfaction and Interface Usability............................................................................................................................................... 427 Nouzha Harrati, University of Souk Ahras, Algeria Imed Bouchrika, University of Souk Ahras, Algeria Zohra Mahfouf, University of Souk Ahras, Algeria Ammar Ladjailia, University of Souk Ahras, Algeria The use of online technology has become a ubiquitous and integral part of our daily life, from education to entertainment. Because of the ubiquity of e-learning and its vital influence on the educational process, it is no surprise that many research studies are conducted to explore different aspects covering the use of e-learning in higher education. The assessment and evaluation aspects are arguably the most influential part of measuring the success and effectiveness of the e-learning experience. As more and more universities worldwide have opted to use online technology for their course delivery, research in e-learning systems has attracted considerable interest in order to understand how effective and usable e-learning systems are in terms of principles related to human-computer interaction. Chapter 19 Improve the Flipped Classroom with Universal Design for Learning................................................. 449 Thomas J. Tobin, The Pennsylvania State University, USA Barbi Honeycutt, FLIP It Consulting, USA The flipped-classroom approach has been adopted widely across higher education. Some faculty members have moved away from it because of the perceived workload required in order to implement a full course "flip." Faculty members can adopt the three principles of Universal Design for Learning (UDL) in order to reduce their own workload and make their flipped-classroom content and interactions more engaging, meaningful, and accessible for students. Adopting both the classroom flip and UDL provides benefits to learners and instructors that go beyond adopting either separately. Compilation of References................................................................................................................ 472 About the Contributors..................................................................................................................... 537 Index.................................................................................................................................................... 545
Foreword
Mapping the journey for online learning is not simply an ideological endeavor. The attention to the details of instructional design and instructional delivery methodology requires urgency no different from that given to the nuanced attention to detail that educators employ during course preparation. The solutions we create within our hybrid, online, and mixed-modal approaches need to be scalable, fluid in scope, and implemented as if we had laser focus on a GPS model, that is, a Guided Pathways Sequence. Our online approaches have begged for mapping, where students can literally see the architect's hyper-information highway toward employment and/or graduation, with increased retention and engagement that lead to student fulfillment. The mindfulness associated with student success cannot be overemphasized, as online learning provides a conduit toward completion. We are in an age of accessible content. Often, our professoriate dialogues about the continued increase of Open Educational Resources (OER), Massive Open Online Courses (MOOCs), or Mode Neutral delivery, though none guarantees success or completion. In a society with an ever-quickening pace and increasing use of mobile devices, the interest in on-demand, self-paced learning calls for our points of connection to increase. Our lengthened strides toward online learning leap the gateway forward as learners move closer to the close of the first 20 years of a new millennium. Our access to learning has increased infinitely because of IoT and IoE. For those connected communities, learning has only one boundary now: the limitations of her or him who seeks information. The information you are privy to here acknowledges the importance of online learning, its scope, and its possibilities. I posit that success is now partially based on the spheres of access, connectivity, and competencies in the 21st century. Be it the connection between faculty and student, student and student, community and student, or technological platform and student, at the crux of many conversations is the learner. The Learner represents ALL of US. Online learning provides an opportunity to decrease the separation created by the tired categorization of research steeped in age and ethnicity, and to move instead toward cultural currency and the impact of environment. Are we not glocal citizens now?
An accessible education is in our pockets, in our laps, and cast upon the VR and AR screens of our newly acquired headsets, cardboard, and glasses, viewed through the lenses of Futurists and emerging Virtists (a neologism coined to mean someone totally immersed in virtual reality). We must beg the question: are we as learners at another precipice? I know I have jumped and happily free-fall toward the open and infinite concept of possibility, plausibility, and playful perspectives. These bubbly and cherubic whispers are expansive, so listen well to the surveyors of the moment, our coders and code breakers, our sculptors of tomorrow. All we ask here is for your wide-eyed imagination: anything is possible!

Michael Torrence
Volunteer State Community College, USA
Michael Torrence, Ph.D., is the Assistant Vice President of Academic Affairs at Volunteer State Community College, where he leads the development of curriculum, teacher and student development, online learning, and innovation for over 10,000 students annually. He also serves as the Co-Chair and member of the Tennessee Board of Regents (TBR) Emerging Technology and Mobilization group.
Chapter 1
Use of Social Media in Online Learning Agah Tugrul Korucu Necmettin Erbakan University, Turkey Handan Atun Necmettin Erbakan University, Turkey
ABSTRACT

Social media tools are especially used to visualize resources. However, the range of content created in social media is limited; instructors and students tend to use shared materials rather than edit existing material or create new material. This does not change the position of social media in education, however; it has been shown that social media improves the teaching and learning process. Therefore, researchers have stressed that institutions should consider supporting academic staff with technical and pedagogical guidance, as academicians do not meet the requirements of digital-native students; their web self-efficacy and digital competencies should be improved (Manca & Ranieri, 2016a; Manca & Ranieri, 2016b).
CONSTRUCTIVE LEARNING THEORY

Constructivism is a learner-centered educational approach which emphasizes that the learner constructs his or her own knowledge by connecting new information to existing knowledge (Henson, 2003). Information is not passively transferred from the environment; rather, it is actively formed in the individual's mind (Duffy & Jonassen, 1991). That is, information does not exist on the outside; it is constructed by the human brain. According to constructive learning theory, learning is defined as the creation of relationships between new knowledge and prior experience rather than the simple storage of knowledge transferred by the teacher (Balım, İnel & Evrekli, 2007). Students meet their learning needs, reach their learning goals, and solve problems with the guidance of the teacher and a variety of resources and tools, supporting each other as a group (Wilson, 1996).
DOI: 10.4018/978-1-5225-1851-8.ch001
Copyright © 2017, IGI Global. Copying or distributing in print or electronic forms without written permission of IGI Global is prohibited.
Hence, active participation is required for the construction of knowledge in the constructive learning model. Instead of passively acquiring messages from the environment, students construct meanings by correlating, comparing, and interpreting their perceptions, and these correlations, comparisons, and interpretations are constantly affected by prior knowledge (Mayer, 1999). As stated in the constructivist approach, learning is a collaborative effort spent solving problems related to authentic design, taking on independent, non-repetitive tasks in the problem-solving process, and creating a social environment by forming groups. What matters is not how much information is learned but how it is learned, so the development of learning, teaching, and thinking strategies is crucial. Rather than memorizing information, constructing their own knowledge is the desired outcome for individuals (Moussiaux & Norman, 2003). The learner's prior knowledge should be activated by the teacher before new knowledge and concepts are taught, because new concepts and information are retained longer when they are correlated with old ones (Persall, Skipper & Mintzes, 1997). As stated in constructivist theory, information does not exist apart from the individual, and the human brain cannot be regarded as a blank sheet at any stage of learning. Instead, information is handled by the human mind and is built up through reconstruction, by comparing old knowledge with new. Constructive learning theory is examined under two main branches: cognitive constructivism, pioneered by Piaget and Bruner, and social constructivism, pioneered by Vygotsky (Özden, 2003).
Social Constructivism

According to Vygotsky (1987), social constructivism means constructing knowledge in a social environment through sharing, interaction, discussion, and active participation among individuals, which is where effective learning occurs. Information is formed by the mutual decisions of the individuals in a social group. When constructing knowledge in a social environment, individuals are not only affected by others but also affect others' opinions by sharing ideas (Fer & Cırık, 2007). Learning takes place in such interactive social environments through interaction with others and cooperation with peers. Students' problem-solving skills bring them to a certain point on their own, which is called the level of actual development. Their level of potential development, however, can take them further, since students' potential emerges with the guidance of the teacher and collaboration with peers. The difference between the actual development level and the potential development level forms the student's zone of proximal development (Vygotsky, 1978). A student's group mates can be role models, encourage the student, help in problem solving, and guide the student with explanations about the task (Henson, 2003). Social constructivism is a developmental process in which the individual interacts with the cultural and social environment; in fact, the individual and his or her socio-cultural environment complete each other (Hickey & McCaslin, 2001). Human cognitive development is directed from the socio-cultural environment in which the individual lives toward the individual himself or herself. The socio-cultural environment is created through language, the most important element providing interaction between the environment and the human. Information is constructed subjectively; that is, every individual holds his or her own information, different from others'. Students build on their existing cognitive and social knowledge with the experiences gained in the process. Social interaction increases the permanence of constructed knowledge.
Students develop their interactive learning by building projects with active participation. According to Von Glasersfeld (1995), other individuals are important evaluation components who challenge a person's perspective and lead to learning.
Constructivist Learning Environment (CLE)

• All learning activities should be performed as part of a larger task or a problem-based project.
• The learning environment should be designed so that the learner can be active in it.
• Students should be able to discuss with each other, comparing different points of view and opinions.
• Students' ownership of the whole problem or task should be supported.
• Learning activities should be supported by authentic problems or a project.
• A product consisting of complex media should be developed by students at the end of the project.
• The student should be part of a realistically designed problem or a defined task.
• The student should take the leading role in the work-plan process.
• Students should interpret alternative opinions and argue their own opinions against opposing views.
• The learning content and the work-plan process should be reflected upon, and opportunities for reflection should be given to students.
• A learning environment should be designed that challenges and supports the student's thinking (Savery & Duffy, 1995).
Seven Principles of Chickering and Gamson

The seven principles for good practice in education developed by Chickering and Gamson (1987) can be adopted for CLEs, since they guide teachers and students and increase the efficiency of education:

1. Encourages contact between students and faculty.
2. Develops reciprocity and cooperation among students.
3. Encourages active learning.
4. Gives prompt feedback.
5. Emphasizes time on task.
6. Communicates high expectations.
7. Respects diverse talents and ways of learning.
Chickering and Gamson (1987) suggest that effective communication and interaction in education have a positive impact on students' academic achievement and commitment to the course. Effective communication and interaction, defined as 21st-century competencies, are important for achievement, engagement, attitude, and problem-solving skills, which are essential educational outcomes in today's learning environments. Moreover, this type of education benefits students' higher-level thinking skills, such as critical thinking.
Feedback

Recognizing feedback as essential to a learning environment, constructive theory stresses that learning emerges from reflection on feedback. According to constructivists, supporting learners leads them to construct and reconstruct knowledge, which develops students' comprehension on the basis of feedback (Swan, 2005). In online learning environments such as distance education, immediate feedback is crucial. Feedback fosters interaction between student and instructor and contributes to the student's development and retention of knowledge. Effective feedback on performance should be rapid, detailed, and regular (Bryan & Clegg, 2006). Based on research conducted by Nicol and Macfarlane-Dick (2004), there are seven principles for giving good feedback:

1. Helping clarify what good performance is (goals, criteria, expected standards);
2. Facilitating the development of self-assessment (reflection) in learning;
3. Delivering high-quality information to students about their learning;
4. Encouraging teacher and peer dialogue around learning;
5. Encouraging positive motivational beliefs and self-esteem;
6. Providing opportunities to close the gap between current and desired performance;
7. Providing information to teachers that can be used to help shape the teaching.
Another study, conducted by Riccomini (2002), suggests that feedback delivered by web-based material is weaker than the instructor's own feedback. As will be discussed later, social media's benefit to education compared with non-interactive learning environments is important, since social media supports interaction and communication with the instructor himself or herself.
Interaction

Interaction is directed in three ways:

1. Learner-learner
2. Learner-teacher
3. Learner-environment (technology)

Individual differences have great importance in student-teacher-school and student-student interaction and communication. Students' participation in extracurricular activities, student-environment-teacher interaction outside the classroom, and the quality of relationships among students in a group, developed through planned instructional strategies, increase students' attendance in class, their commitment to the study plan, and their attainment of educational outcomes. It is stated that students' interactions with other students, instructional strategies that provide opportunity and support, and a well-developed environment that enables students to overcome the difficulties of learning tasks are beneficial for boosting students' active participation in the learning process. Students' active participation in the learning process makes them aware of their performance in academic subjects, which has a significant positive impact on their academic achievement (Ullah & Wilson, 2007).
Moreover, students' interaction with one another and staying in academic touch with the teacher, in environments developed both in and out of school, enhance academic achievement in a significant manner. The communication among student, teacher, and environment, and the academic engagement built on this relationship, give rise to an increase in students' academic achievement and improve personal development and 21st-century competencies (Chickering & Gamson, 1987).
Collaborative Learning

According to Piaget, collaborative learning is crucial for constructing cognitive development. Peer learning is more effective than learning with parents or teachers because learners can communicate and interact with each other more easily (Kumar, 1996). According to Vygotsky, cognitive development is related to cooperative study; in fact, the developmental capabilities of individuals are more likely to emerge in group study than in working alone (Stahl, 2006). Researchers suggest that collaborative learning is the most important application of constructivism, a student-centered learning model (Duffy & Cunningham, 1996; Duffy & Jonassen, 1991; Johnson, Johnson & Holubec, 1991; Panitz, 1999). Collaborative learning is a teaching-learning process that supports interaction and cooperative study in small groups. Learners study in small groups toward a targeted goal in the collaborative learning process, which develops cooperation and mutual help among students (Jacobsen, Egen & Donald, 2002). Unlike in other group learning methods, group members should engage academically in order to bring both their own and other members' learning to the highest level within the specified time. Awareness that no learner can achieve without every learner's achievement leads to a working plan that allows members to aid other group members. At the end of the process, academic achievement is the result of all the academic engagement of every group member working together. Collaborative learning is not independent study composed of different tasks for each student. In other words, each student in a group completing a particular section of a given task is not sufficient for collaborative learning. On the contrary, students are responsible for each other's learning; they should cooperate and interact with each other. In the collaborative learning model, students at every level study as a group and are able to work cooperatively in order to achieve a common targeted goal. Not all teamwork forms a collaborative learning environment. According to several academic studies (Johnson & Johnson, 1990; Johnson, Johnson & Smith, 1991; Johnson, 1999; Johnson, Johnson & Holubec, 1994; Slavin, 1996), teamwork must have a number of features to qualify as collaborative learning. The features that emerge from these studies are:

• Positive commitment
• Supportive face-to-face interaction
• Individual evaluation and responsibility
• Interpersonal interaction, or the ability to use social skills
• Evaluation of the group process
• Having an equal chance of success
• Group awards
Computer-Supported Collaborative Learning

People increasingly use computer networks as an alternative way to learn; in fact, computers facilitate collaborative learning. Computer-supported learning refers to a learning method in which learners cooperate through information technology. This technology guides learners toward goals and provides them with social support and individualized instruction (Zheng & Huang, 2016). During CSCL, the teacher can monitor all students' activity and interact simultaneously with multiple groups (Van Leeuwen, Janssen, Erkens & Brekelmans, 2015). Computer-supported collaborative learning has emerged as a way of describing how people learn with the aid of computers. Web 2.0 technologies, part of the dynamic web technology applications, allow users to produce dynamic content that empowers learning with others in collaborative environments, develops creativity and enables knowledge to be reconstructed. Jonassen and Kwon (2001) stated that students participate more actively in class and are more likely to communicate and interact with each other in computer-supported learning environments than in traditional environments where communication and interaction occur face to face. In computer-supported online learning environments, students who try to solve problems cooperatively gain high-level 21st century competencies such as problem solving, effective communication, critical thinking and creative thinking.
THE INTEGRATION OF TECHNOLOGY IN EDUCATION

The integration of IT in education is the use of IT tools in the teaching-learning process so that students achieve learning goals and develop their learning (Cartwright & Hammond, 2003). It also creates opportunities to collect scientific data and interact with resources via new technologies (Gillespie, 2006). Using IT in education encourages students to communicate and to study cooperatively. Accordingly, using hardware such as personal computers, digital cameras, scanners, printers and projectors, and software such as word processing, database management and graphic design in the teaching-learning process can promote students' learning. The process of IT integration involves several stakeholders, including students, teachers, coordinators and administrators. Teachers are the core element who use IT directly in lessons, so they must be trained first (World Links, 2007). IT integration in education comprises students, teachers, parents, administration, policy, technological infrastructure, software and the interactions among them. The effectiveness of every element in the IT integration process depends on:

• A common vision
• Policies
• Standards and program support
• Professional development
• Access to hardware, software and other resources
• Suitable teaching and assessment approaches
• Technical support (ISTE, 2002; UNESCO, 2002)
The integration of IT involves continuous change and development, because it is a multidimensional process with the several elements mentioned above, and there are several IT integration models in the literature. These include the five-stage computer technology integration model (based on the 5E instructional model), the e-capacity and effectiveness system model (ECMASM), the systematic planning model (SPM), technological pedagogical content knowledge (TPCK), the generic model of pedagogy, social interaction and technology (GMPSIT), the technology integration planning model (SIPM), the general model (GM), and models built around resources, infrastructure, manpower, policies, learning, assessment and support. Researchers have highlighted several issues for integrating IT into education effectively:

• Obtaining teachers' opinions about IT integration applications
• Support from the school administration
• An active role for IT teachers in IT integration
• Meeting teachers' IT needs and arranging professional training related to IT
• Carrying out IT-related applications that improve interaction among teachers, and providing access to technological environments and resources
• Providing up-to-date software and hardware for students and teachers, and inter-agency cooperation (Jung, 2005; Lim & Ching, 2004; Pelgrum, 2001; Richards, 2006; Sang, Valcke, Braak & Tondeur, 2010; Semenov, 2005; World Links, 2007)
The Features of Technology Supported Collaborative Learning Environments

Today, students are expected to possess 21st century competencies. These competencies are:

• Critical thinking
• Creativity / innovation
• Information literacy
• Problem solving
• Decision making
• Flexibility and adaptability; learning to learn
• Research and investigation
• Communication and interaction
• Initiative and self-direction
• Efficiency
• Leadership and responsibility
• Cooperation
• Information technology operations and concepts
• Digital citizenship
• Media literacy
• Communication in the native language
• Communication in foreign languages
• Mathematical competence and basic competences in science and technology
• Digital competence
• Learning to learn competence
• Social and communal competence
• Sense of creativity and entrepreneurship
• Cultural awareness and self-expression (Otten & Ohana, 2009; Finegold & Notabartolo, 2010)
Nelson's collaborative problem-solving stages are crucial for gaining these 21st century skills. Applied to real-life problems, they not only encourage students to think critically and creatively and to solve complicated problems, but also help students socialize with each other. According to Nelson (2009), the nine steps of collaborative problem solving are as follows:

1. Prepare for the process
2. Form groups and establish group identity
3. Identify and define the problem
4. Define the ground rules for the work
5. Give groups time to tackle the problem-solving process, helping each other solve the problems given to them
6. Give groups time to finalize their solutions to the problems or their projects
7. Have teachers and students synthesize and reflect on the learning experience
8. Evaluate the products and the application process
9. Bring the application process to a close
Collaborative learning environments should be designed according to these nine steps. The most important characteristic of a collaborative learning environment designed on this basis is its student-centered process. In these environments, students participate actively in the learning process and learn how to solve problems by studying with others. Another benefit is that learning is associated with real life and cooperative group mates encourage and support each other's learning (Neo, 2003). When designing technology-supported learning environments, problem-solving skills should be considered first; that is, technology-supported collaborative learning environments should be built around problem solving. This type of environment contributes to learning: it increases students' motivation and interest, makes knowledge easier to reach, supports active participation, provides an effective and manageable environment, facilitates constructing knowledge with new strategies, allows mistakes to be detected and corrected easily, supports managing complexity, and helps produce high-quality products at the end of the process (Blumenfeld et al., 1991).
Online Learning

Types of Online Learning

• Asynchronous Online Courses: There is no fixed meeting time. The lecture content is delivered through computer-based material, and assessed homework must be completed within the given time. Interaction can be provided through blogs, discussion boards and wikis.
• Synchronous Online Courses: The instructor and learners attend the class at the same time and can interact with each other instantly. In this environment, learners can participate in the classroom from a distance.
• Hybrid Courses (Blended Courses): This type of learning environment combines asynchronous and synchronous online courses.
Use of Technology in Online Learning

Technology is used in online learning to deliver the course. Printed material can include e-texts, textbooks and e-zines; video material can include streaming video, videotape and satellite transmission; audio material can include streaming audio and audiotape; asynchronous communication can use email, listservs, weblogs and forums; and synchronous communication can use chat, videoconferencing and teleconferencing.
Collaborative Online Learning

Online collaborative learning environments created with dynamic web technologies, and the cooperative technologies used to construct knowledge in line with the constructivist approach, are enormously important. Collaborative technologies that support cooperative study allow collaborative work on computers and over the internet, eliminating the need for students to meet in the same physical space (Shihab, 2008; Tambouris et al., 2012). Technology-supported cooperative work has produced many new tools and technologies that make such work more efficient and deliver it over the internet. This new technology is called "dynamic web technologies (Web 2.0)". Dynamic web technologies, which allow students to work together over the internet using voice and text, can be defined as internet-based systems that facilitate video communication and interaction.
Education with Web 2.0

Dynamic web technologies, which are used very frequently in daily life and have proven their usability in education, allow users to create dynamic content. They can be divided into five categories:

• Wikis: Wikis are websites built on a shared template that collect pages contributed by many users. The first wiki was WikiWikiWeb; the term "wiki" was chosen because it means "fast" in Hawaiian. Wikis let users create content and work together, which makes them a suitable tool for collaborative learning.
• Blogs: Blogs are personal diaries written by users in a sub-world of the internet called the blogosphere. They can be thought of as the successors of the personal pages of Web 1.0, with some differences: they support continuous writing, they are not limited to a single page, they can store more data, they are hosted in the blogosphere, and their content is saved under a special name, the blogger.
• RSS: RSS (Really Simple Syndication) links a piece of written content with related content. While the reader views content, RSS filters it according to topics of interest, so users see only what they want to see.
• Tagging: Tagging enables users to create links between various media; in effect, they share everything via those tags. Thanks to Web 2.0, tagging is personal and everyone can tag any content. Unlike other web tools this is not a taxonomy; it is called a folksonomy (see the sketch following this list). Flickr, which is navigated by tags, is an example.
• Social Networking: Social networking is a well-known term referring to applications that bring multiple users together on the same website. Every user can invite his or her own colleagues to join, and so social networks grow (Levy, 2009).
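As a purely illustrative sketch of the folksonomy idea and the RSS-style filtering described above (the item titles, tags, and interest lists are hypothetical and not drawn from any real platform), tagged content can be indexed and filtered roughly as follows:

```python
from collections import defaultdict

# Hypothetical shared items, each tagged freely by users; the tags form a
# folksonomy because they are not drawn from a predefined taxonomy.
items = [
    {"title": "Group wiki page on photosynthesis", "tags": {"biology", "wiki", "week3"}},
    {"title": "Lecture podcast: intro to statistics", "tags": {"statistics", "audio"}},
    {"title": "Student blog post on peer feedback", "tags": {"blog", "assessment", "week3"}},
]

# Build a tag index: every tag points to the titles that carry it.
tag_index = defaultdict(list)
for item in items:
    for tag in item["tags"]:
        tag_index[tag].append(item["title"])

def filter_by_interest(items, interests):
    """RSS-reader-style filter: keep only items matching the reader's interests."""
    return [i["title"] for i in items if i["tags"] & set(interests)]

print(tag_index["week3"])                      # everything any user tagged "week3"
print(filter_by_interest(items, ["biology"]))  # what a reader following "biology" sees
```

The point of the sketch is only that the tag vocabulary grows bottom-up from users rather than top-down from a designer, which is what distinguishes a folksonomy from a taxonomy.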
The Benefits of Web 2.0 Technologies in the Learning Process

The benefits of using Web 2.0 technologies include fostering participation, overcoming challenges, and facilitating collaborative work and assessment (Churcher, Downs & Tewksbury, 2014). With Web 2.0, students become involved in the course and its discussions immediately. Students can help design course content and structure, which creates a sense of ownership. In addition, the profile pictures and biographical information associated with postings strengthen the relationships among users and give a sense of real life, which leads to longer-lasting participation in courses. Second, the time and space flexibility that Web 2.0 tools offer allows instructors to move the synchronous meetings required by a traditional course into a learning environment where students can engage in asynchronous learning as often and whenever they want. Thus, Web 2.0 tools not only motivate students but also create active and communal learning spaces. Integrating Web 2.0 into the classroom has social benefits: it encourages moving away from traditional pedagogical techniques, and students develop through shared experiences, discussion and self-disclosure. Comprehending the knowledge on their own, students can learn subjects at their own pace; that is, an interpersonal learning environment is converted into an intrapersonal one. Instructors can monitor students' progress closely and guide them in certain directions thanks to their administrator position as evaluators of student work. Some courses, particularly those where topics may be controversial, lend themselves to the increased discussion environment that these tools afford. Web 2.0 tools also provide a suitable space for large group discussions and are helpful when students must come to a consensus on an idea. Web 2.0 tools give every user access to open forums, web pages, videos and discussion points, which empowers students to think beyond the classroom.
The Use of Web 2.0 Technologies in Social Media and Education

Web 2.0 is the technological infrastructure that allows social media content to be created and distributed. It can be considered a combination of technological innovations in hardware and software that enables economical content creation, interaction and interoperability. Web 2.0 technologies are designed to be user-centered, and users can collaborate through the World Wide Web (Berthon, Pitt, Plangger, & Shapiro, 2012). Web 2.0 applications took hold in 2006, when social networking became a cultural phenomenon. Since then, social networks have affected how educators, students, youth, enterprises, media and governments communicate, cooperate, collaborate, negotiate and create. The unprecedented growth of MySpace, YouTube and Wikipedia, all examples of Web 2.0 technologies, shows that they meet the requirements of users. Online social groups persist because of the need for community and social connection (Lai & Turban, 2008).
According to Minocha and Roberts (2008), Web 2.0 tools and applications support education with content that students create and improve collaboratively. Students can share and access that content easily and freely. Dynamic web technologies are therefore powerful tools for organizing, distributing and presenting knowledge and for creating collaborative environments. Given the different features and functions of dynamic web technologies, learning experiences can differ among individuals. Whatever the quality and variety of the learning process, the results of using dynamic web technologies in education are the following: dynamic web technologies are effective in creating highly interactive, multifaceted training; blending a suitable learning environment created with dynamic web technologies with face-to-face education produces a strong and effective blended learning environment; and, beyond the course aims and objectives, educational activities carried out with dynamic web technologies educate qualified individuals who can use and process information effectively. Alongside these benefits, some precautions should be considered when using dynamic web technologies in education: the activities are conducted more effectively with computer-literate individuals, and using dynamic web technologies for irrelevant purposes should be banned (Deperlioğlu & Köse, 2010).
Types of Social Media

According to Kaplan and Haenlein (2010), social media can be divided into six categories:

• Collaborative Projects: Collaborative projects are web-based content created by multiple users. Wikis and social bookmarking applications are the subtypes of collaborative projects. Wikis let users add, remove and change text-based content, while social bookmarking sites allow users to rate internet links and build collections. Wikipedia, the online encyclopedia, is the most widely used wiki, and Delicious is a widely known social bookmarking site.
• Blogs: Blogs are content-based websites that present entries in chronological order with the newest first. Blogs are usually administered by one person but enable interaction via comments. Their content can consist not only of text but also of other media formats such as video, images and audio.
• Content Communities: Content communities enable users to share different media types such as texts (BookCrossing), photos (Flickr), videos (YouTube) and PowerPoint presentations (SlideShare). In content communities, users are not expected to create profile pages, but they can interact through comments and likes.
• Social Networking Sites: Social networking sites, which many people know as the only type of social media, provide connection, communication and interaction between users through profile pages that include personal information. Users can make friends and communicate with each other via instant messaging. A profile page can contain various media types such as photos, video, audio files and blogs. Facebook and MySpace are the most widely used social networking sites. Because social networking sites are used especially heavily by the young, they have given rise to a new problem called "Facebook addiction"; at the same time, this makes them extremely important for learning, since the addiction can be turned into an advantage by creating educational social networking sites.
• Virtual Game Worlds: Virtual worlds are platforms that create a three-dimensional environment in which users take part in the form of avatars and interact with each other as in real life, with the highest levels of social presence and media richness, which makes them the ultimate kind of social media. Virtual game worlds, known as MMORPGs (Massively Multiplayer Online Role Playing Games), let users play in a context governed by certain rules. World of Warcraft is the most played MMORPG in the world. Game rules usually involve accomplishing tasks in order to become the most powerful warrior, wizard, dragon and so forth. Virtual game worlds are also appropriate for learning, as their target audience is younger and task accomplishment is an effective way to reach educational goals.
• Virtual Social Worlds: The other type of virtual world is the virtual social world, which enables users to live in a virtual world much as in real life. Users appear in the form of avatars and, unlike in virtual game worlds, there are no rules. Second Life is the most used virtual social world; in it, users can speak with each other, create content (furniture, clothes, etc.) and sell this content.

The Benefits of Using Social Media in Education

With the recent immense shift in technology, more information can be reached through a wide variety of methods, and social media has become one of them. Social media is used not only to find everyday information but also to find academic information (Kim & Sin, 2016). In other words, social media has profoundly altered students' academic lives and their communication.
In fact, social media offers three main benefits for education. First, social media can facilitate the development of community, support program advancement and foster networking and collaboration. Thanks to social media, students can reach information and people regardless of where they live, which connects learners across diverse geographic areas. Facebook, the most used social media platform, has emerged as a place where all the required connections occur, which means it can create a learning space beyond the classroom. In addition, social media can minimize the effort needed to build a program and even make the program more dynamic. Because it is so widely used, social media can be an essential tool for reaching all students easily. Moreover, and most relevant to our topic, social media can improve collaboration among users (Rosenberg, Terry, Bell, Hiltz & Russo, 2016). Several studies report that social media can encourage participation, engagement, reflective thinking and collaborative learning, and can expand both informal and formal learning (Manca & Ranieri, 2016). A study analyzing the effect of learner-created YouTube videos on education concluded that students who participated in the project and viewed the videos acquired higher levels of cross-curricular competencies and achieved higher academic performance than non-participants. Participation also had a positive effect on subjective learning and satisfaction with the course. Video creation strengthens students' social, ICT and academic skills, and it provides students with 21st century competencies that are difficult to develop through traditional teaching methods (Orús, Barlés, Belanche, Casaló, Fraj & Gurrea, 2016). According to Asterhan and Rosenberg (2015), teachers who use Facebook as a tool for building and maintaining contact with their students have various purposes: an academic-instructional purpose, to expand learning beyond the classroom and to manage and organize school-related activities; a psycho-pedagogical purpose, to monitor the virtual sphere and detect personal distress; and a social-relational purpose, to lower the threshold for initiating and deepening relationships and to expand the range of the teacher-student relationship.
SOCIAL MEDIA'S IMPACT ON EDUCATIONAL OUTCOMES

Social media provides students with an environment that allows users to communicate, co-create, collaborate, share, socialize, buy and sell. These actions have both positive and negative impacts on students. Student engagement, increased participation and integration, and enhanced communication are positive impacts, while addiction, copyright and privacy issues, and academic dishonesty are negative ones (Hernandez, 2015). Indeed, according to sb5, students are willing to use social media in their courses and see it as an effective learning tool because of its interactive and motivating presence. According to sb7, however, the use of social media in educational settings remains low. Younger instructors use social media more than older ones: while LinkedIn is commonly used by full professors, podcasts, blogs, wikis, YouTube, Facebook and Twitter are used more by assistant professors. Using social media in education, especially Facebook and Twitter, can motivate students. Other tools such as YouTube and Vimeo, blogs and wikis, podcasts, SlideShare and ResearchGate improve the quality of teaching and are used to share educational content more frequently than other platforms. Twenty-first century students are digital natives, which makes them quick adopters of new technology, and they are more familiar with social media, especially Facebook, than anyone else. Academicians therefore prepare their lectures via social media and support collaborative work among students, particularly with Facebook. Social media tools are used above all to visualize resources. However, the range of content created in social media is limited; instructors and students tend to use shared materials rather than edit existing material or create new material. This does not change the position of social media in education: it has been shown that social media improves the teaching and learning process. Researchers therefore stress that institutions should consider supporting academic staff with technical and pedagogical guidance, since academicians do not yet meet the requirements of digital native students and their web self-efficacy and digital competencies should be improved (Manca & Ranieri, 2016a; Manca & Ranieri, 2016b). By providing interaction, social media encourages students to communicate, collaborate, participate and construct knowledge; traditional learning, unlike learning with social media, does not create sufficient opportunities for personal reflection and collaborative learning. It is crucial to recognize the power of ICTs in today's learning and in the student-centered learning methods of the new era, such as social learning, where the educator's role is that of a guide. Technological investments to integrate ICTs and social media into teaching and learning are therefore critical for digital native students. When social media tools are managed appropriately, they may have a positive impact on the three key learning outcomes: knowledge, skills and attitude (Orús, Barlés, Belanche, Casaló, Fraj & Gurrea, 2016).
REFERENCES

Asterhan, C. S., & Rosenberg, H. (2015). The promise, reality and dilemmas of secondary school teacher–student interactions in Facebook: The teacher perspective. Computers & Education, 85, 134–148. doi:10.1016/j.compedu.2015.02.003
Balım, A. G., İnel, D., & Evrekli, E. (2007). Probleme dayalı öğrenme (PTÖ) yönteminin kavram karikatürleriyle birlikte kullanımı: Fen ve teknoloji dersi etkinliği. Turkish Republic of Northern Cyprus. Proceedings of the International Educational Technologies Conference, Famagusta, Cyprus.

Berthon, P. R., Pitt, L. F., Plangger, K., & Shapiro, D. (2012). Marketing meets Web 2.0, social media, and creative consumers: Implications for international marketing strategy. Business Horizons, 55(3), 261–271. doi:10.1016/j.bushor.2012.01.007

Blumenfeld, P., Soloway, E., Marx, R., Krajcik, J. S., Guzdial, M., & Palincsar, A. (1991). Motivating project-based learning. Educational Psychologist, 26(3-4), 369–398. doi:10.1080/00461520.1991.9653139

Cartwright, V., & Hammond, M. (2003). The integration and embedding of ICT into the school curriculum: More questions than answers. Paper presented at the ITTE 2003 Annual Conference of the Association of Information Technology for Teacher Education, Trinity and All Saints College, Leeds.

Chickering, A. W., & Gamson, Z. F. (1987). Seven Principles for Good Practice. AAHE Bulletin, 39(7), 3–7.

Churcher, K. M., Downs, E., & Tewksbury, D. (2014). Friending Vygotsky: A social constructivist pedagogy of knowledge building through classroom social media use. The Journal of Effective Teaching, 14(1), 33–50.

Deperlioğlu, Ö., & Köse, U. (2010, 10-12 Şubat). Web 2.0 teknolojilerinin eğitim üzerindeki etkileri ve örnek bir öğrenme yaşantısı. Akademik Bilişim'10 - XII. Akademik Bilişim Konferansı Bildirileri, Muğla Üniversitesi, Muğla.

Duffy, T. M., & Cunningham, D. J. (1996). Constructivism: Implications for the design and delivery of instruction. In D. H. Jonassen (Ed.), Handbook of Research for Educational Communications and Technology (pp. 170–197). New York: Simon and Schuster Macmillan Publishing.

Duffy, T. M., & Jonassen, D. H. (1991). Constructivism: New implications for instructional technology? Educational Technology, 31(5), 7–12.

Fer, S., & Cırık, İ. (2007). Yapılandırmacı öğrenme: Kuramdan uygulamaya. İstanbul: Morpa Kültür Yayınları.

Finegold, D., & Notabartolo, A. S. (2010). 21st-Century Competencies and Their Impact: An Interdisciplinary Literature Review. Retrieved from http://www.hewlett.org/library/grantee-publication/21st-century-competencies-and-their-impact-interdisciplinary-literature-review

Gillespie, H. (2006). Unlocking teaching and learning with ICT: Identifying and overcoming barriers. London: David Fulton.

Henson, K. T. (2003). Foundations for learner-centered education: A knowledge base. Computers & Education, 124(1), 5–16.

Hernandez, M. D. (2015). Academic Dishonesty Using Social Media: A Comparative Study of College Students from Canada and China. SAM Advanced Management Journal, 80(4), 45.
Hickey, D., & McCaslin, M. (2001). Educational psychology, social constructivism, and educational practice: A case of emergent identity. Educational Psychologist, 36(2), 133–140. doi:10.1207/S15326985EP3602_8

ISTE. (2002). National Educational Technology Standards and Performance Indicators for All Teachers. Retrieved from http://cnets.iste.org/teachers/t_stands.html

Jacobsen, A. D., Egen, P., & Donald, K. (2002). Methods for teaching: Promoting student learning (6th ed.). Ohio: Merrill Prentice Hall.

Johnson, D. W., & Johnson, R. T. (1990). Cooperative learning. Blackwell Publishing Ltd. Retrieved from http://onlinelibrary.wiley.com/doi/10.1002/9780470672532.wbepp066/abstract;jsessionid=08D8977240813DB2D09E28FF56D79C96.f02t02?deniedAccessCustomisedMessage=&userIsAuthenticated=false

Johnson, D. W., Johnson, R. T., & Holubec, E. J. (1991). Cooperation in the classroom. Edina, MN: Interaction Book Company.

Johnson, D. W., Johnson, R. T., & Smith, K. A. (1991). ASHE-ERIC Higher Education Report: Vol. 4. Cooperative learning: Increasing college faculty instructional productivity. Washington, DC: The George Washington University, School of Education and Human Development.

Johnson, P. A. (1999). Problem-based, cooperative learning in the engineering classroom. Journal of Professional Issues in Engineering Education and Practice, 125(1), 8–11. doi:10.1061/(ASCE)1052-3928(1999)125:1(8)

Jung, I. (2005). ICT-Pedagogy Integration in Teacher Training: Application Cases Worldwide. Journal of Educational Technology & Society, 8(2), 94–101.

Kaplan, A. M., & Haenlein, M. (2010). Users of the world, unite! The challenges and opportunities of Social Media. Business Horizons, 53(1), 59–68. doi:10.1016/j.bushor.2009.09.003

Kim, K. S., & Sin, S. C. J. (2016). Use and Evaluation of Information from Social Media in the Academic Context: Analysis of Gap Between Students and Librarians. Journal of Academic Librarianship, 42(1), 74–82. doi:10.1016/j.acalib.2015.11.001

Kumar, V. S. (1996). Computer-supported collaborative learning: Issues for research. Proceedings of the Eighth Annual Graduate Symposium on Computer Science, University of Saskatchewan.

Lai, L. S., & Turban, E. (2008). Groups formation and operations in the Web 2.0 environment and social networks. Group Decision and Negotiation, 17(5), 387–402. doi:10.1007/s10726-008-9113-2

Levy, M. (2009). WEB 2.0 implications on knowledge management. Journal of Knowledge Management, 13(1), 120–134. doi:10.1108/13673270910931215

Lim, C. P., & Ching, C. S. (2004). An activity-theoretical approach to research of ICT integration in Singapore schools: Orienting activities and learner autonomy. Computers & Education, 43(3), 215–236. doi:10.1016/j.compedu.2003.10.005

Manca, S., & Ranieri, M. (2016a). Facebook and the others. Potentials and obstacles of Social Media for teaching in higher education. Computers & Education, 95, 216–230. doi:10.1016/j.compedu.2016.01.012
Manca, S., & Ranieri, M. (2016b). Yes for sharing, no for teaching!: Social Media in academic practices. The Internet and Higher Education, 29, 63–74. doi:10.1016/j.iheduc.2015.12.004

Mayer, C. L. (2004). An analysis of the dimensions of a Web-delivered problem based learning environment [Ph.D. dissertation]. University of Missouri, Columbia.

Minocha, S., & Roberts, D. (2008). Social, usability, and pedagogical factors influencing students' learning experiences with wikis and blogs. Pragmatics & Cognition, 16(2), 272–306. doi:10.1075/p&c.16.2.05min

Moussiaux, S. J., & Norman, J. T. (2003). Constructivist teaching practices: Perceptions of teachers and students. Retrieved from http://www.ed.psu.edu

Nelson, L. M. (2009). Collaborative problem solving. In C. M. Reigeluth (Ed.), Instructional-design theories and models: A new paradigm of instructional theory (Vol. 2, pp. 241–269). New York: Lawrence Erlbaum Associates.

Neo, M. (2003). Developing a collaborative learning environment using a web based design. Journal of Computer Assisted Learning, 19(4), 462–473. doi:10.1046/j.0266-4909.2003.00050.x

Nicol, D. J., & Macfarlane-Dick, D. (2004). Rethinking Formative Assessment in HE: a theoretical model and seven principles of good feedback practice (2006). Online instantaneous and targeted feedback for remote learners. In Bryan, C., & Clegg, K. (Eds.), S. Ross, S. Jordan, & P. Butcher (Authors), Innovative assessment in higher education (pp. 123–125). London: Routledge.

Orús, C., Barlés, M. J., Belanche, D., Casaló, L., Fraj, E., & Gurrea, R. (2016). The effects of learner-generated videos for YouTube on learning outcomes and satisfaction. Computers & Education, 95, 254–269. doi:10.1016/j.compedu.2016.01.007

Otten, H., & Ohana, Y. (2009). The eight key competencies for lifelong learning: An appropriate framework within which to develop the competence of trainers in the field of European youth work or just plain politics? Retrieved from http://www.ikab.de/reports/Otten_Ohana_8keycompetence_study_2009.pdf

Özden, Y. (2003). Öğrenme ve öğretme. Ankara: Pegem Akademik.

Panitz, T. (1999). Collaborative versus cooperative learning: A comparison of the two concepts which will help us understand the underlying nature of interactive learning. ERIC Clearinghouse.

Pelgrum, W. J. (2001). Obstacles to the integration of ICT in education: Results from a worldwide educational assessment. Computers & Education, 37(2), 163–178. doi:10.1016/S0360-1315(01)00045-8

Persall, N. R., Skipper, J. E. J., & Mintzes, J. J. (1997). Knowledge restructuring in the life sciences: A longitudinal study of conceptual change in biology. Science Education, 81(2), 193–215. doi:10.1002/(SICI)1098-237X(199704)81:23.0.CO;2-A

Riccomini, P. (2002). The comparative effectiveness of two forms of feedback: Web-based model comparison and instructor delivered corrective feedback. Journal of Educational Computing Research, 27(3), 213–228. doi:10.2190/HF6D-AXQX-AXFF-M760
Richards, C. (2006). Towards an integrated framework for designing effective ICT-supported learning environments: The challenge to better link technology and pedagogy. Technology, Pedagogy and Education, 15(2), 239–255. doi:10.1080/14759390600769771

Rosenberg, J. M., Terry, C. A., Bell, J., Hiltz, V., & Russo, T. E. (2016). Design Guidelines for Graduate Program Social Media Use. TechTrends.

Sang, G., Valcke, M., Braak, J. V., & Tondeur, J. (2010). Student teachers' thinking processes and ICT integration: Predictors of prospective teaching behaviors with educational technology. Computers & Education, 54(1), 103–112. doi:10.1016/j.compedu.2009.07.010

Savery, J. R., & Duffy, T. M. (1995). Problem based learning: An instructional model and its constructivist framework. Educational Technology, 35(5), 31–38.

Semenov, A. (2005). Information and communication technologies in schools: A handbook for teachers or How ICT can create new, open learning environments. UNESCO. Retrieved from http://unesdoc.unesco.org/images/0013/001390/139028e.pdf

Shihab, M. M. (2008). Web 2.0 tools improve teaching and collaboration in high school English language classes [Unpublished doctoral dissertation]. Nova Southeastern University Graduate School of Computer and Information Sciences, USA.

Slavin, R. (1996). Research on cooperative learning and achievement: What we know, what we need to know. Contemporary Educational Psychology, 21(2), 43–69. doi:10.1006/ceps.1996.0004

Stahl, G., Koschmann, T., & Suthers, D. (2006). Computer-supported collaborative learning: An historical perspective. In Cambridge handbook of the learning sciences (pp. 409–426).

Swan, K. (2005). A constructivist model for thinking about learning online. In Elements of quality online education: Engaging communities (Vol. 6, pp. 13–31).

Tambouris, E., Panopoulou, E., Tarabanis, K., Ryberg, T., Buus, L., Peristeras, V., & Porwol, L. et al. (2012). Enabling problem based learning through web 2.0 technologies: PBL 2.0. Journal of Educational Technology & Society, 15(4), 238–251.

Ullah, H., & Wilson, M. A. (2007). Students' academic success and its association to student involvement with learning and relationships with faculty and peers. College Student Journal, 41(4), 1192–1202.

UNESCO. (2002). ICT (Information and Communication Technologies) in teacher education, a planning guide. Retrieved from http://unesdoc.unesco.org/images/0012/001295/129533e.pdf

Van Leeuwen, A., Janssen, J., Erkens, G., & Brekelmans, M. (2015). Teacher regulation of multiple computer-supported collaborating groups. Computers in Human Behavior, 52, 233–242. doi:10.1016/j.chb.2015.05.058

Von Glasersfeld, E. (1995). Radical Constructivism: A Way of Knowing and Learning, SMES (Vol. 6). Taylor & Francis Inc. doi:10.4324/9780203454220
Vygotsky, L. S. (1978). Educational implications. In M. Cole, V. John-Steiner, S. Scribner, & E. Souberman (Eds.), Mind in society: The development of higher psychological processes (pp. 79–153). Cambridge: Harvard University Press.

Vygotsky, L. S. (1987). Thinking and speech. In L. S. Vygotsky (Ed.), Collected Works (pp. 39–285). New York, NY: Plenum Press.

Wilson, B. G. (Ed.). (1996). Constructivist learning environments: Case studies in instructional design. USA: Educational Technology Publications. Retrieved from http://www.google.com.tr/books?hl=tr&lr=&id=mpsHa5f712wC&oi=fnd&pg=PR5&dq=Constructivist+learning+environments:+Case+studies+in+instructional+design&ots=sXhbBjbSOk&sig=cBifRbQXFXjX7wsOrwLISyDdPak&redir_esc=y#v=onepage&q=Constructivist%20learning%20environments%3A%20Case%20studies%20in%20instructional%20design&f=false

World Links. (2007). Final report on the Asian policy forum on ICT integration into education. Retrieved from http://cache-www.intel.com/cd/00/00/38/07/380769_380769.pdf

Zheng, L., & Huang, R. (2016). The effects of sentiments and co-regulation on group performance in computer supported collaborative learning. The Internet and Higher Education, 28, 59–67. doi:10.1016/j.iheduc.2015.10.001
Chapter 2
The Impact of Role Assignment on Cognitive Presence in Asynchronous Online Discussion

Larisa Olesova
George Mason University, USA

Jieun Lim
Purdue University, USA
ABSTRACT

This study examined the impact of role assignment on cognitive presence when students participated in asynchronous online threaded discussions. A mixed methods design was used to investigate changes in the levels of cognitive presence while the students participated in an online introductory nutrition course. This study found evidence that scripted role assignment can be an effective instructional strategy when the approach is implemented into asynchronous online discussions. Implications for instructors and designers of asynchronous online learning environments are discussed.
INTRODUCTION

Computer-mediated communication (CMC) has become an integral part of educational communication to facilitate collaborative argumentation. In the CMC environment, an asynchronous online discussion board is a significant learning space where students construct concepts and knowledge about a topic through the process of sharing and arguing ideas with other students. As the use of asynchronous online discussion boards has become commonplace, researchers have been investigating robust ways to assess the quality of students' critical thinking and to promote the cognitive abilities needed to construct more meaningful, higher order learning.
DOI: 10.4018/978-1-5225-1851-8.ch002
In particular, researchers have looked to the construct of cognitive presence to assess the systematic progression of knowledge acquisition represented by the higher-order levels of Bloom's taxonomy (Garrison, Anderson, & Archer, 2001; Lobry de Bruyn, 2004). Cognitive presence, a core construct of the Community of Inquiry (CoI) framework, is associated with "the extent to which learners are able to construct and confirm meaning through sustained reflection and discourse in a critical community of inquiry" (Garrison et al., 2001, p. 11). Previous research has identified the importance of cognitive presence. For example, researchers have found that cognitive presence is a significant predictor of learner satisfaction and persistence (Joo, Kim, & Park, 2009; Joo, Lim, & Kim, 2011). Other researchers found that using asynchronous online threaded discussion itself cannot guarantee the development of students' cognitive presence (Richardson & Ice, 2010), nor can cognitive presence be attained by interaction alone (Garrison & Cleveland-Innes, 2005). Garrison and Cleveland-Innes (2005) identified that intended and structured design and facilitation of critical discourse are needed to make students shift to a deep approach to learning. In addition, a fair number of earlier studies suggested that asynchronous online discussion may not provide for the development of cognitive presence at higher levels and that online discussions usually remain at the lower levels of recognizing the problem (Garrison et al., 2001; Kanuka, Rourke, & Laflamme, 2007). The research implies that instructors or instructional designers need to implement purposeful and appropriate instructional strategies to improve students' higher levels of cognitive thinking.

Role assignment is gaining popularity as an approach for facilitating and evaluating computer-supported collaborative learning in higher education (Strijbos & Weinberger, 2010). Researchers have discussed the benefits of using role assignment to promote knowledge construction, more rapid engagement, and more consistent levels of interaction among group members (Pawan, Paulus, Yalcin, & Chang, 2003; Schellens, Van Keer, & Valcke, 2005). Role assignment is one of the ways of creating a clear task structure to foster cognitive processing and academic performance (Schellens et al., 2005). Using role assignment is a common practice to foster cognitive presence because it involves students in an active decision-making process and in knowledge construction based on cognitive thinking abilities (De Wever, Van Keer, Schellens, & Valcke, 2010; Wise, Saghafian, & Padmanabhan, 2012). The potential benefits of role assignment are supported by research emphasizing the importance of student-led discussions and student-student interaction (Bernard et al., 2009; Gasevic, Adesope, Joksimovic, & Kovanovic, 2015; Johnson, 1981; Schrire, 2006). However, similar to research on cognitive presence and asynchronous threaded discussions, studies also suggest that simply assigning roles may not promote knowledge construction, engagement or interaction (Gasevic et al., 2015). Considering that CMC presents advantages for active learning, it is important to conduct an in-depth investigation of how cognitive presence is expressed in relation to role assignment. The purpose of this study is to fill this gap by examining how cognitive presence is influenced by role assignment when students participate in asynchronous online threaded discussions.
In this study, we define roles as stated functions and/or responsibilities that guide students' behavior and group interaction (Strijbos & Weinberger, 2010, p. 491). Moreover, we examined scripted product-oriented roles which were assigned by an instructor and involved a single weekly task, as they were originally developed to structure the asynchronous online discussions (Strijbos & De Laat, 2010). Each student was assigned a specific task as a starter, skeptic, or wrapper. A starter was responsible for posting a preliminary response to get the discussion started about a particular reading. A wrapper summarized the points that were made during the discussion with regard to a particular reading. A skeptic challenged points made by other students.
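As a purely illustrative aside, a rotation table of the kind the roles above imply (starter, skeptic, and wrapper rotating across discussions within a small group) could be generated programmatically. The sketch below is not the study's actual assignment procedure; the group members, group size, and rotation scheme are assumptions made only for the example.

```python
import random

ROLES = ["starter", "skeptic", "wrapper"]

def build_rotation(students, n_discussions=4, seed=0):
    """Build an illustrative role table: for each discussion, the first three
    students in a rotated ordering take the starter, skeptic and wrapper roles;
    any remaining group members participate without an assigned role."""
    rng = random.Random(seed)
    order = list(students)
    rng.shuffle(order)                      # random initial ordering
    table = {}
    for week in range(1, n_discussions + 1):
        shift = (week - 1) % len(order)     # rotate by one position per week
        rotated = order[shift:] + order[:shift]
        table[week] = dict(zip(ROLES, rotated))
    return table

# Hypothetical five-student group across four discussions.
for week, roles in build_rotation(["Ana", "Ben", "Chai", "Dee", "Eli"]).items():
    print(week, roles)
```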
BACKGROUND

Role Assignment in Online Discussions

Computer-mediated communication based on asynchronous forms of communication, such as asynchronous online discussions, can take place among students across different geographic locations and among those who are unable to participate in a discussion at a specific time (Gunawardena & McIsaac, 2004). By providing time to read and respond to a message, asynchronous online discussions can enhance interactions between students and support collaborative learning (Darabi, Arrastia, Nelson, Cornille, & Liang, 2011). In addition, asynchronous online discussions can provide greater flexibility for students to share ideas and experiences by removing transactional distance when teaching and learning occur at any time and in any place (Moore, 2007). Moreover, students' engagement in asynchronous online discussions, specifically in pre-structured discussions with the use of role assignment, has been viewed as fundamental to building and promoting students' cognitive presence (De Wever et al., 2010; Gasevic et al., 2015; Xie, Yu, & Bradshaw, 2014).

Students' role assignment in discussion boards can be distinguished into two different types, scripted roles and emergent roles, based on who assigns the roles. Emergent roles usually develop naturally while students interact and collaborate with other students; they are formed by the students themselves, spontaneously. Educational researchers have emphasized the importance of investigating students' emergent roles in online learning activities, suggesting that analyzing students' emergent roles in a group can help identify each student's contribution and interaction with the group members (De Laat & Lally, 2004; Strijbos & Weinberger, 2010). In particular, De Laat and Lally (2004) identified three emergent roles, facilitator, completer/finisher, and ideas contributor, by analyzing students' activities in group discussion. They also found that the facilitator was a group-focused role while completer/finisher and contributor were more task-focused roles. They showed that students' emergent roles in group activities can be important indicators of "individual accountability," "positive interdependency," and "a varied awareness of the need for management" in collaborative activities (p. 171).

In contrast to emergent roles are scripted roles, the focus of this study, which are suggested to students by the instructor (Strijbos & De Laat, 2010). Scripted roles are usually designed to intentionally improve both learning processes and outcomes (Strijbos & Weinberger, 2010). Types of scripted or assigned roles have varied in the literature. For example, Schellens et al. (2005) used four scripted roles for role assignment:

1. Moderator, who monitors the discussion and stimulates active participation by providing praise, advice, and questions;
2. Theoretician, who helps other students consider all related theories when they work on a task;
3. Summarizer, who provides the summary in the discussion group; and
4. Source searcher, who provides additional sources to support further thinking.

De Wever et al. (2010) also mentioned the "starter" role, which is responsible for starting off the discussion and leading other students to think about a topic from a new perspective.
Researchers have studied the positive effects of role assignment (Gasevic et al., 2015; Xie et al., 2014), and studies have focused on the impact of a certain role on the quantity and diversity of students' participation.
Xie et al. (2014) found that students showed increased quantity and diversity of participation when they were assigned to the moderator role. Studies have also investigated how the types of scripted roles can impact students' level of knowledge construction (De Wever et al., 2010). De Wever et al. (2010) found that role assignment had a positive effect on students' social knowledge construction, that is, on co-constructing knowledge with other learners and instructors through interaction. Specifically, De Wever et al. (2010) found role assignment was most effective when the instructor assigned roles at the beginning of the discussion. Gasevic et al. (2015) determined that role assignment had positive effects on facilitating students' higher levels of cognitive presence. Specifically, they assigned students two different roles, expert researcher and practicing researcher. While the practicing researcher role included the source searcher, theoretician, and summarizer roles, the expert researcher role added two additional aspects, moderator and topic leader. They reported that students were likely to stay at higher levels of cognitive presence when serving in an expert researcher role rather than a practicing researcher role. Other studies have also found positive effects of using role assignment as an effective instructional strategy to facilitate collaborative learning among students. In particular, role assignment can serve as a collaborative activity in itself when students are engaged in asynchronous online discussions (Gasevic et al., 2015). Yet Schellens et al. (2005) did not find any significant differences in students' levels of knowledge construction between treatment and control groups for role assignment. However, they did find significant differences in the levels of knowledge construction when students were assigned a summarizer role; student summarizers showed higher levels of knowledge construction compared with other roles. They assumed that role assignment could become an effective strategy depending on the specific assigned task, such as a summarizer role. When students are provided a specific and clear task structure, it can help foster cognitive processing and academic performance (Schellens et al., 2005).
Cognitive Presence in Online Discussion

The spread of discussion-based online courses is closely related to social constructivist theory. The basic assumption of social constructivism is that knowledge is co-constructed through interaction and social negotiation with others in cognitive development (Tam, 2000). Social constructivists emphasize the importance of collaboration and discourse in learning (Schrire, 2006). Researchers and practitioners alike have focused on the importance of "discourse" in online learning and have been interested in the effective use of discussion boards as a main space for discourse in asynchronous online learning. In due course, the lens of social constructivism on online discussions has resulted in learning objectives focused on higher levels of cognitive thinking abilities, for example, analysis, synthesis, and evaluation based on Bloom's taxonomy. This has led to students constructing knowledge in discussion boards through a variety of cognitive thinking processes in accordance with Bloom's six levels of cognitive processing: knowledge, comprehension, application, analysis, synthesis, and evaluation. Although there have been various instructional strategies and theories for designing and exploring effective online learning environments, the Community of Inquiry (CoI) conceptual framework developed by Garrison et al. (2001) is one of the most widely used, and this study also uses the CoI framework as a lens. The framework, which is used in the development and practice of asynchronous online learning environments in higher education, consists of three overlapping core elements: social, cognitive, and teaching presence. The framework is based on the practical inquiry model, which is grounded in the phenomenon of critical thinking.
Table 1. The levels of cognitive presence by indicators and sociocognitive processes

Triggering events
Indicators: Recognizing the problem; sense of puzzlement.
Sociocognitive processes: Presenting background information that culminates in a question; asking questions; messages that take discussion in a new direction.

Exploration
Indicators: Divergence within the online community; divergence within a single message; information exchange; suggestions for consideration; brainstorming; leaps to conclusions.
Sociocognitive processes: Unsubstantiated contradiction of previous ideas; many different ideas/themes presented in one message; personal narrative/descriptions/facts (not used as evidence to support a conclusion); author explicitly characterizes message as exploration (e.g., "Does that seem about right?" or "Am I way off the mark?"); adds to established points but does not systematically defend/justify/develop the addition; offers unsupported opinions.

Integration
Indicators: Convergence among group members; convergence within a single message; connecting ideas, synthesis; creating solutions.
Sociocognitive processes: Reference to a previous message followed by substantiated agreement (e.g., "I agree because…"); building on, adding to others' ideas; justified, developed, defensible, yet tentative hypotheses; integrating information from various sources (textbook, articles, personal experience); explicit characterization of a message as a solution by the participant.

Resolution
Indicators: Vicarious application to the real world; testing solutions; defending solutions.
Sociocognitive processes: None coded.

(Garrison et al., 2001; reprinted with permission of the authors)
Given that the CoI framework was established to facilitate students' critical thinking abilities, which require higher levels of cognitive thinking through interaction, and that it focuses on the dynamic process of students' cognitive and critical thinking activities in asynchronous online courses (Garrison et al., 2001; Swan, Garrison, & Richardson, 2009), it is sensible to choose the CoI framework to analyze the level of students' cognitive thinking abilities according to the role assignment. In particular, the model of cognitive presence guides this study in assessing levels of cognitive thinking processes in asynchronous online discussions (Garrison et al., 2001). Specifically, the CoI framework suggests four levels of cognitive presence: triggering event, exploration, integration, and resolution (Garrison, Anderson, & Archer, 1999). Table 1 above presents the four levels of cognitive presence, including indicators and sociocognitive processes for each level. A triggering event, the initial level of cognitive presence, indicates "a state of dissonance" or "feeling of unease resulting from an experience" (Garrison et al., 1999, p. 98). The exploration level focuses on searching for new information, knowledge, and alternatives to address a problem. The integration level pushes students to integrate and combine the information learned. Finally, the resolution level focuses on implementation of the proposed solutions and application of newly created knowledge (Garrison et al., 1999). These different phases of cognitive presence can be used to diagnose and analyze the state of students' cognitive thinking abilities in the dynamics of the inquiry process and to implement relevant instructional strategies and scaffolding according to the cognitive thinking levels (Swan et al., 2009).

Previous studies have found that cognitive presence has a significant impact on learning outcomes (Akyol & Garrison, 2008; Joo et al., 2011). Studies found that when online courses are designed with various tasks and strategies, they not only improve the level of cognitive presence but also have a positive impact on learner satisfaction (Richardson & Ice, 2010).
Adding to the research on measuring the overall level of cognitive presence, some researchers have explored the distribution of each category of cognitive presence in an asynchronous online discussion environment. Studies have reported that exploration was the most frequently coded level, while integration and resolution were coded relatively rarely (Garrison et al., 2001; Kanuka et al., 2007). These earlier studies would indicate that students are not likely to reach higher levels of cognitive presence in online discussions (Kanuka et al., 2007). However, Swan et al. (2009) proposed that the integration and resolution stages might actually be more likely to appear in capstone or final assignments rather than in online discussions, as students are demonstrating their learning and knowledge construction from the entire course. Moreover, when relevant instructional directions for online discussions are provided, students might progress to higher levels of cognitive presence. For example, Akyol and Garrison (2011) found that integration was the most widely coded level in students' online discussions in both online and blended courses. Similarly, several years earlier, they found that integration was the most coded indicator in online discussions facilitated by students and when the course instructor modeled how to facilitate discussions in an effective way (Akyol & Garrison, 2008). The increase of posts at the integration level in Akyol and Garrison's (2008) study is very likely due to the context of the course "Blended Learning," where the discussion topics and discussion questions focused on students' final assignment to design a blended learning course. Students integrated and supported their ideas with various resources due to the nature of the discussion assignments (Akyol & Garrison, 2008; Swan et al., 2009). Summarizing the research, we find inconsistency in the distribution of categories of cognitive presence, which may be due in part to other factors. Those factors could include differences in instructional strategies, discussion topic choices, the purpose of the discussion boards (to introduce topics or demonstrate knowledge), and course design. In this research we focused on the distribution of categories of cognitive presence according to the scripted role assignments (starter, skeptic, and wrapper).
Research Questions

The research questions for this study were:
1. What is the impact of role assignment on the level of cognitive presence?
2. How is cognitive presence expressed across different assigned roles when students participate in asynchronous online discussion?
RESEARCH METHODS

A mixed-methods research design employing both quantitative content analysis and statistical analysis was used to investigate the impact of role assignment on the level of students' cognitive presence in asynchronous online discussions (Creswell & Plano Clark, 2011). Specifically, all participants in the online course were required to play an assigned role (starter, skeptic, or wrapper) in four asynchronous online threaded discussion boards. Students rotated the roles throughout the course. This strategy provided students with an opportunity to take on a role and benefit from that role when contributing to the weekly discussions. Furthermore, there were weeks when some students were not assigned a role but the discussions were ongoing. We labeled the weeks when a student was assigned a role as "with the role assignment" and when the same student was not assigned a role as "without a role assignment." All participants' discussion postings were coded using the cognitive presence indicators (triggering event, exploration, integration, and resolution); the qualitative data were then transformed into quantitative data (frequencies). The frequencies of discussion postings generated
when students were assigned a specific role were compared to those generated when the same students were not assigned a role but still participated in online discussions. The quantified data were analyzed through statistical approaches to identify any associations between students' level of cognitive presence and the type of assigned role (or no role) and the number of online discussions.
Context and Participants

Through purposeful sampling, all participants (n = 76) in this research were undergraduate students at a large eastern public university. The participants were drawn from two course sections, section one (n = 39) and section two (n = 37), in summer 2014. All students were assigned to smaller groups of five to six students each. Of the 76 students enrolled in the course, 72 participated in all four required online discussions; this study reports the findings generated by those 72 students. The online course "Introduction to Nutrition" is a required, sophomore-level, three-credit fundamental course for a Nutrition and Food Studies program. It is accepted toward the Natural Science course requirements within the University Core, which sets standards and learning outcomes for all undergraduates seeking a baccalaureate degree at this large eastern public university. The course attracts a broad variety of students across the university, including several candidates for the Master of Science in Nutrition program. However, a concentration of pre-health undergraduate majors is noted, and a predominance of female students is typical of enrollment in nutrition courses. The course focuses on introducing students to nutrition as a scientific discipline, providing a working knowledge of basic nutrition, including the sources and functions of the nutrients, the components of a healthy diet, and the relationship between diet and overall health. The online course was six weeks in length and was delivered through Blackboard. In this course, all students were required to participate in four group discussions. In particular, students were given an assigned role of starter, skeptic, or wrapper in threaded asynchronous discussion boards. Specific tasks were associated with each role:
1. The starter was responsible for posting a preliminary response to get the discussion started in response to the discussion prompt. Starters also suggested a structure to the discussion ("Will everyone please find one example of XYZ and post it by Friday?");
2. The skeptic challenged points made by other students and kept the discussion balanced by bringing up other points of view; and
3. The wrapper summarized the points that were made during the discussion with regard to a particular reading.
To facilitate online discussions, all three required roles were assigned randomly at the beginning of the semester, via a table provided to students. Discussion prompts/questions were created by the course instructor and provided at the beginning of each discussion week, which proceeded asynchronously on the discussion board from Wednesday through the following Tuesday. Each discussion required a minimum of one response addressing the key question and a minimum of two additional comments on responses made by classmates, regardless of the assigned role: one post by Saturday and two subsequent posts by Tuesday. Participation in online discussions counted for 25% of the total grade in this course. The course instructor did not
actively facilitate online discussions, except for posting reminders on a specific day. However, for the first discussion, the course instructor served as the starter for two main reasons:
1. The discussions happened early in the semester, and the instructor could not give the students adequate advance notice that they would need to make a post within the 24-hour window on the "start day," which was the first day of the second week; and
2. Teaching by modeling, to provide a very real example for students to follow in future discussions.
In addition, in this specific course, students were provided with a grading rubric, as well as examples of discussion posts of varying quality in each role. The guidelines for participation in discussions, including the questions and examples, are available in Appendix A.
Data Source and Analysis

Qualitative data were collected from weekly discussion postings, including students' responses and comments (n = 1271). First, students' weekly discussion postings were coded and categorized according to Garrison's model of cognitive presence (Garrison et al., 2001). The weekly discussion posting was used as the unit of analysis, corresponding to what one student posted in one thread of the online discussions. The discussion postings were marked in each thread for coding analysis; the length and the content of each discussion posting were decided by the students. More specifically, discussion postings presenting a sense of puzzlement and recognizing the problem were categorized as "triggering events"; discussion postings demonstrating information exchange and suggestions for consideration were labeled as "exploration"; discussion postings creating solutions, connecting ideas, and/or synthesizing were labeled as "integration"; and discussion postings at the testing and defending solution level were labeled at the highest level of "resolution" (see Table 1). Discussion postings in which students discussed organization of the online discussion, clarified the guidelines of participation, or posted appreciation or confirmation were coded as not substantive and were not used in the analysis. Then, the collected qualitative data were transformed into quantitative data (frequencies) to run descriptive and non-parametric statistical analyses. Descriptive statistics were used to analyze the frequencies of codes with role assignment and without role assignment by the four levels of cognitive presence across all four discussions in the course. Next, a non-parametric chi-square test for independence was applied to analyze associations between each role assignment and the levels of cognitive presence. The Statistical Package for the Social Sciences (SPSS) was used to run all quantitative analyses.
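The chapter reports that these tests were run in SPSS. As an illustration only (not the authors' procedure), the sketch below shows how a comparable chi-square test of independence could be run in Python on a levels-by-condition frequency table such as Table 2; the exact statistic obtained may differ slightly from the published value depending on how the sparse resolution cells were handled.

```python
# Illustrative sketch: chi-square test of independence on a frequency table
# of cognitive presence codes (rows = levels, columns = with / without role
# assignment). The counts below are taken from Table 2 of this chapter.
import numpy as np
from scipy.stats import chi2_contingency

observed = np.array([
    [39,  2],    # Triggering event
    [225, 61],   # Exploration
    [542, 166],  # Integration
    [0,   1],    # Resolution
])

chi2, p, dof, expected = chi2_contingency(observed)
print(f"chi2 = {chi2:.2f}, df = {dof}, p = {p:.4f}")
```

Rows with very small expected counts (here, the resolution level) can make the chi-square approximation unreliable, so analysts sometimes collapse or exclude such categories before testing.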
Reliability and Validity

To establish reliability and validity, there were two coders for each online discussion. An inter-rater reliability of 95% was reached. The inter-rater reliability was computed using the Miles and Huberman (1994) formula:

Reliability = Number of agreements / Total number of agreements and disagreements
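As a small worked illustration of this formula (not part of the original study), percent agreement can be computed directly from two coders' aligned code lists; the labels below are hypothetical.

```python
# Minimal sketch of the Miles and Huberman (1994) agreement formula:
# reliability = number of agreements / total agreements and disagreements.
def percent_agreement(codes_a, codes_b):
    assert len(codes_a) == len(codes_b), "both coders must code the same postings"
    agreements = sum(a == b for a, b in zip(codes_a, codes_b))
    return agreements / len(codes_a)

# Hypothetical example: two coders labelling five postings
coder_1 = ["exploration", "integration", "integration", "triggering", "integration"]
coder_2 = ["exploration", "integration", "exploration", "triggering", "integration"]
print(f"Agreement: {percent_agreement(coder_1, coder_2):.0%}")  # 80%
```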
In the content analysis approach, the training of the coders is important for establishing reliability (De Wever, Schellens, Valcke, & Van Keer, 2006). For this step, the researchers were provided with the categories to code the postings and the tables with indicators and descriptions of socio-cognitive processes (see Table 1). Next, a brief coding training meeting was held to discuss the coding procedures before starting the individual coding. Each coder then independently coded students' postings to allow triangulation of the coding. After each coder finished their own coding, the coded and categorized students' weekly discussion comments were compared, with specific attention to the four levels of cognitive presence outlined by Garrison et al. (2001). Initially, the researchers created a comparative table arranging the codes according to the levels of cognitive presence when the students played each role, in order to find evidence related to the research questions. Following this, each researcher examined the data to determine differences across the roles. Later, the researchers discussed the data analysis results to clarify individual interpretations and come to consensus. Patterns were identified for each level of cognitive presence to compare results across the roles.
RESULTS

Research Question One

To answer the first research question, concerning the impact of role assignment on the level of cognitive presence, the percentages and frequencies of the codes generated when students were assigned a role (with the role assignment) and when the same students were not assigned a role (without the role assignment) were calculated. Table 2 presents the distribution of percentages and frequencies of these coding results. The analysis revealed that the most frequently coded category of discussion postings across both conditions was the integration level (67.25% with role assignment and 72.17% without role assignment). No postings were coded at the resolution level with role assignment, and the frequency at the resolution level was very low (0.43%) for students who were not assigned a specific role during some online discussions but still participated. The results of a chi-square test for independence indicated a significant association between the role assignment and the levels of cognitive presence (χ2 (1, n = 1036) = 13.38, p = .004).

Table 2. The levels of cognitive presence by students' discussion posts generated with and without role assignment (n = 1036)

Cognitive Presence    With Role Assignment     Without Role Assignment
                      %        n               %        n
Triggering Event      4.84     39              0.87     2
Exploration           27.92    225             26.52    61
Integration           67.25    542             72.17    166
Resolution            0        0               0.43     1
Total                 100      806             100      230
Table 3. Comparison of coding results for cognitive presence by different types of roles (n = 806)

Cognitive Presence    Starter           Skeptic           Wrapper
                      %        n        %        n        %        n
Triggering            4.58     12       4.22     10       5.54     17
Exploration           33.59    88       26.16    62       24.43    75
Integration           61.83    162      69.62    165      70.03    215
Resolution            0        0        0        0        0        0
Total                 100      262      100      237      100      307
Further, to continue exploring the impact of each role assignment on each level of cognitive presence, the percentages and frequencies of the codes generated by each role (starter, skeptic, and wrapper) were calculated. Table 3 presents the distribution of percentages and frequencies of these codes. In total, 806 codes were generated with role assignment, and the percentages and frequencies were very similar across the three roles. Again, the majority of the discussion postings occurred at the integration level across all three roles (61.83% for starter, 69.62% for skeptic, and 70.03% for wrapper), while triggering was the least frequent level (4.58% for starter, 4.22% for skeptic, and 5.54% for wrapper). No postings were coded at the resolution level for any of the roles. However, the results of the chi-square test for independence did not indicate any significant association between the type of role and the level of cognitive presence (χ2 (1, n = 806) = 6.75, p = .15).
Research Question Two

To answer the second research question of how cognitive presence is expressed across the different assigned roles, percentages and frequencies generated by each role assignment across the four online discussions were computed; note that the starter role in online discussion #1 was fulfilled by the instructor. Figure 1 presents the detailed breakdown of the percentages. To examine whether there was an association between the role assignments and the online discussions at each level of cognitive presence, a chi-square test for independence was run. The results indicated a significant association between the role assignment and the four online discussions at the triggering event level (χ2 (1, n = 39) = 22.29, p = .001, V = .53), the exploration level (χ2 (1, n = 225) = 66.93, p = .000, V = .34), and the integration level (χ2 (1, n = 542) = 146.55, p = .000, V = .33). The effect sizes (.53 at the triggering event level, .34 at the exploration level, and .33 at the integration level) indicate a strong association between the role assignments across all the online discussions.
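The V values reported above are Cramér's V effect sizes. The chapter does not state the dimensions of the contingency tables used at each level, so the following sketch only illustrates how V is derived from a chi-square statistic; the 3 (roles) × 4 (discussions) table shape is an assumption.

```python
# Minimal sketch of Cramer's V, the effect size reported alongside the
# chi-square statistics above. The table dimensions are assumed, not
# stated in the chapter.
from math import sqrt

def cramers_v(chi2, n, n_rows, n_cols):
    """Cramer's V for an r x c contingency table with n observations."""
    return sqrt(chi2 / (n * (min(n_rows, n_cols) - 1)))

# Triggering-event level, using the reported chi2 = 22.29 and n = 39,
# under an assumed 3 (roles) x 4 (discussions) table.
print(round(cramers_v(22.29, 39, 3, 4), 2))  # approximately 0.53
```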
DISCUSSION

This study supports the findings of previous research that scripted role assignment can be an effective instructional strategy to improve both learning processes and outcomes (Strijbos & Weinberger, 2010). All three types of scripted role assignments in this study (starter, skeptic, and wrapper) had an impact on the level of cognitive presence when students were involved in inquiry-based learning, which aims to foster higher levels of cognitive thinking abilities, including critical thinking abilities.

Figure 1. The distribution of percentages by roles and levels of cognitive presence across online discussions

The findings in this study have also revealed that while the scripted role assignments provide engaging opportunities for students, they can also lead to a higher level of social knowledge construction and collaborative learning. When students were assigned a specific role in this study, they not only shared their knowledge, but they also extended this knowledge to a higher level of critical thinking by integrating information from various sources – the textbook, articles, and their own personal experience. The majority of the discussion postings across all three role assignments stayed at the integration level. The indicators of the discussion postings at this level were convergence among group members, connecting ideas, synthesis, and creating solutions. This may be explained by the type of questions asked in all four online discussions. The questions in each online discussion required collaborative discussion to reach a decision as a team. For example, in the third online discussion, which had the highest percentages across all three roles (average 74%), the discussion questions encouraged students to find examples of marketing techniques as a team. Then, as a team, they were required to identify each of the eleven marketing techniques on the collected nutritional products' websites. It seems that having both the role assignments' requirements and the directed discussion questions in each online discussion created discussion postings of a higher quality. For example, the skeptics in this study tried to challenge group members, which helped increase convergence of information among group members or within a single message (indicators of the integration level).
Starter The majority of contributions to the online discussions for the starter role were at the integration level. Contributions at the integration level more than double the proportion of contributions at the exploration level across all discussions. Even though the triggering level was the least frequent category generated by this role, the frequencies increased from 3% to 14% in the last online discussion. The starter role generated the highest percentages in the last online discussion in comparison with the wrapper role (from 4% to 12%) and the skeptic role (from 3% to 9%) (Figure 1). The starter role assignment in this study was required to suggest a structure to the weekly discussion in addition to the responsibility to post a preliminary response to get the discussion started. The most frequently coded indicators for the starters’ discussion posts were connecting ideas, synthesis, and creating solution. Below is the example of a starter’s discussion posting in online discussion #3: Hydroxycut claims that dieters consuming this could lose more weight than dieting alone … As I searched on PubMed, I could not find any of these aforementioned studies, and hence could not verify the validity of the reports … Personally, I got very frustrated when surfing on the website of hydroxycut.com. The website listed all the products as well as success stories, but are lacking in the ingredients list, medical research, and possibly adverse effects of the products … I have spent my best effort but I could not locate the actual studies. In short, I don’t think there is any proof that this product has been studied … Hydroxycut is not a safe product even after the reformulation in 2009. As I log on to PubMed and did a search on “hydroxycut,” … newly published literature indicates that this product is unsafe, and had caused liver toxicity.
Skeptic The distribution of the percentages and frequencies for the skeptic role across all of the online discussions reveals a slightly different trend than for the starter role. The highest percentage for this role also stayed at
the higher level of cognitive presence – integration (69%) across the discussions (Table 3). The students assigned to the skeptic role had a lower number of percentages at the exploration level (average 21%) except for the second online discussion (42%) (Figure 1). The skeptic role had the highest percentages at the higher level - integration (72%) in the last online discussion #4, while the starter role generated the highest percentages at the lower level of cognitive presence - triggering event (14%) in the same online discussion (Figure 1). Below is the example of a skeptic’s discussion posting in online discussion #3: As a recap of the posts thus far, Cyndi searched Sensa and gave great example of how this business used false information to promote their product … As a consumer, I understand that it is easy to trust a product that seems great and potentially helpful to my health. However, as a skeptic, I wonder how so many people can completely trust these products when it is so easy to find “before and after” pictures on the internet and act as though their product was responsible for these incredible weight losses.
Wrapper Similar to the previous roles, when students were assigned to the wrapper role they stayed at a higher level of cognitive presence – integration (average 70%) across all discussions (Table 3). The results for the wrapper role also show more consistent patterns of the percentages at the exploration level (average 25%) during all four online discussions than the skeptic role (the range from 18% to 42%) (Figure 1). The wrapper role assignment in this study summarized the points that were made during the online discussion with regard to a particular reading. To compare with previously reviewed roles of starter or skeptic, the findings on the quality of discussion postings for the wrapper role was not a surprise. The role assignment itself required integrating and synthesizing ideas created by a team into a weekly discussion summary. Below is an example of a wrapper’s role discussion posting in online discussion #3: This week’s discussion was in regard to the various nutritional products that exist and the falls claims each one promotes along with their product. Cyndi chose for us to look at the products Sensa and Hydroxycut. Mary chose Cortislim and Viemma. Each of these products cleans have amazing weight loss results with their products alone … Sarah brought up that many of these products come with FDA warnings and are warning customers to stay away. A lot of awful side effects and even death came as a result of Hydroxycut … Crystal found Sensas own web page claims … Cyndi found some great research on the use chromium in Sensa and how dangerous it could be … Crystal showed us all the companies use testimonials as their number one selling point … In conclusion, we all wish consumers would do more research … of their product choices, and also realize that when it sounds too good to be true … it is because it always is.
SOLUTIONS AND RECOMMENDATIONS

Similar to previous studies on the use of asynchronous online discussions to facilitate higher-order learning and critical thinking (Duphorne & Gunawardena, 2005; Newman, Webb, & Cochrane, 1995), this study contributed to the field by revealing the effectiveness of role assignments in asynchronous threaded discussions. To fill the gap in the literature, this study examined role assignment as an instructional strategy to foster higher-order learning skills (Duphorne & Gunawardena, 2005). The authors'
anticipation that role assignments can foster critical thinking and cognitive presence was evidenced by this study, and those findings were supported and justified by two coders. Results indicated a significant difference between the with-role-assignment and without-role-assignment conditions in the levels of cognitive presence, with the majority of students' posts remaining at the integration level in both conditions. This can be explained by the fact that all the students in this study had an opportunity to play each of the assigned roles at least once during the online discussions. This finding can also be explained by the type of questions or the design of online discussions used in this study. This section reviews practical solutions and recommendations for those who are interested in designing and teaching with effective asynchronous online discussions. Even though previous studies have reported that the exploration level was the most coded level of cognitive presence (Kanuka et al., 2007), this study reports the opposite finding, echoing the results reported by Akyol and Garrison in 2008 and 2011. Like Akyol and Garrison's studies (2008, 2011), the great majority of the students' discussion postings in this study stayed at the higher level of cognitive presence – integration. The findings showed that students were able to achieve higher-level learning outcomes during the online discussions when role assignments were integrated. Therefore, the authors of this study highly recommend the use of scripted role assignment to increase students' cognitive presence. Instructional designers and other practitioners who design asynchronous online discussions are encouraged to incorporate role assignments with scripts that require students to create convergence among group members or within a single message. Such convergence can be achieved when scripts require collaborative effort, similar to the role assignment scripts in this study. When the role assignment scripts require collaborative effort, students usually work as a team on integrating various resources, including personal experiences. Moreover, when the role script requires summarizing, such as the wrapper role assignment in this study, students generally include the varying perspectives of the team members. Contrary to previous studies (De Wever et al., 2010; Schellens et al., 2005), this research did not find any differences in the percentage and frequency distribution for the wrapper role assignment: the majority of the discussion postings generated by students assigned the wrapper role also stayed at the integration and exploration levels. These findings lead to another practical recommendation: instructional designers and other practitioners should consider using specific types of instructional strategies to match intended outcomes. For example, one of the factors that may have influenced the higher level of cognitive presence in this study might be the question type. The last online discussion, where debates were used, demonstrated an increase of posts at the triggering level. Specifically, the question asked students to find a scientific reference and a non-scientific reference, present the information to peers, and ask questions of the peers who were assigned an opposing topic.
FUTURE RESEARCH DIRECTIONS

More research is needed to explore the effectiveness of using role assignments as an instructional strategy in asynchronous online discussions. Future research will include additional course sections for a more robust statistical analysis, perhaps applying multilevel linear modeling following the research of De Wever et al. (2010) and Gašević et al. (2015). The data analysis will include three levels:
1. Individual discussion postings;
2. Students; and
3. Courses or sections (fall 2013; spring 2014; and summer 2014).
Additionally, future research should compare courses where role assignment is used as a treatment with a control group, to reveal whether scripted role assignment can impact the level of cognitive presence. Research across different disciplines and different role assignments is also needed.
CONCLUSION

This study found evidence that scripted role assignment can be an effective instructional strategy when the approach is implemented in asynchronous online discussions. This study investigated the integration of the role assignment approach into an online course for undergraduate students. Findings revealed that all the assigned roles stayed at a higher level of cognitive presence – integration. Unlike findings from previous studies claiming that the majority of students' online postings usually remain at a lower level of cognitive presence – exploration – this study's findings are promising. However, more research is needed to explore the relationships between role assignment and the types of questions. For example, the questions in one of the online discussions required students to debate and ask questions. The analysis revealed that students' discussion postings increased at the lowest level of cognitive presence – triggering event – where students recognize the problem.
ACKNOWLEDGMENT

The authors thank Dr. Jennifer C. Richardson of Purdue University for her guidance with the coding process and critical feedback regarding revisions in this paper.
REFERENCES Akyol, Z., & Garrison, D. R. (2008). The development of a community of inquiry over time in an online course: Understanding the progression and integration of social, cognitive and teaching presence. Journal of Asynchronous Learning Networks, 12(3-4), 3–22. Akyol, Z., & Garrison, D. R. (2011). Understanding cognitive presence in an online and blended community of inquiry: Assessing outcomes and processes for deep approaches to learning. British Journal of Educational Technology, 42(2), 233–250. doi:10.1111/j.1467-8535.2009.01029.x Bernard, R. M., Abrami, P. C., Borokhovski, E., Wade, C. A., Tamim, R. M., Surkes, M. A., & Bethel, E. C. (2009). A meta-analysis of three types of interaction treatments in distance education. Review of Educational Research, 79(3), 1243–1289. doi:10.3102/0034654309333844
Creswell, J. W., & Plano Clark, V. L. (2011). Designing and conducting mixed methods research (2nd ed.). Thousand Oaks, CA, USA: SAGE Publications, Inc. Darabi, A., Arrastia, M. C., Nelson, D. W., Cornille, T., & Liang, X. (2011). Cognitive presence in asynchronous online learning: A comparison of four discussion strategies. Journal of Computer Assisted Learning, 27(3), 216–227. doi:10.1111/j.1365-2729.2010.00392.x De Laat, M., & Lally, V. (2004). Its not so easy: Researching the complexity of emergent participant roles and awareness in asynchronous networked learning discussions. Journal of Computer Assisted Learning, 20(3), 165–171. doi:10.1111/j.1365-2729.2004.00085.x De Wever, B., Schellens, T., Valcke, M., & Van Keer, H. (2006). Content analysis schemes to analyze transcripts of online asynchronous discussion groups: A review. Computers & Education, 46(1), 6–28. doi:10.1016/j.compedu.2005.04.005 De Wever, B., Van Keer, H., Schellens, T., & Valcke, M. (2010). Roles as a structuring tool in online discussion groups: The differential impact of different roles on social knowledge construction. Computers in Human Behavior, 26(4), 516–523. doi:10.1016/j.chb.2009.08.008 Duphorne, P. L., & Gunawardena, C. N. (2005). The effect of three computer conferencing designs on critical thinking skills of nursing students. American Journal of Distance Education, 19(1), 37–50. doi:10.1207/s15389286ajde1901_4 Garrison, D. R., Anderson, T., & Archer, W. (1999). Critical inquiry in a text-based environment: Computer conferencing in higher education. The Internet and Higher Education, 2(2), 87–105. doi:10.1016/ S1096-7516(00)00016-6 Garrison, D. R., Anderson, T., & Archer, W. (2001). Critical thinking, cognitive presence, and computer conferencing in distance education. American Journal of Distance Education, 15(1), 7–23. doi:10.1080/08923640109527071 Garrison, D. R., & Cleveland-Innes, M. (2005). Facilitating cognitive presence in online learning: Interaction is not enough. American Journal of Distance Education, 19(3), 133–148. doi:10.1207/ s15389286ajde1903_2 Gašević, D., Adesope, O., Joksimović, S., & Kovanović, V. (2015). Externally-facilitated regulation scaffolding and role assignment to develop cognitive presence in asynchronous online discussions. The Internet and Higher Education, 24, 53–65. doi:10.1016/j.iheduc.2014.09.006 Gunawardena, C. N., & McIsaac, M. S. (2004). Distance education. In D. H. Jonassen (Ed.), Handbook of research on educational communications and technology (pp. 355–395). Mahwah, NJ: Lawrence Erlbaum. Johnson, D. W. (1981). Student-student interaction: The neglected variable in education. Educational Researcher, 10(1), 5–10. doi:10.3102/0013189X010001005 Joo, Y. J., Kim, E. K., & Park, S. Y. (2009). The structural relationship among cognitive presence, flow and learning outcome in corporate cyber education. The Journal of Educational Information and Media, 15(3), 21–38.
Joo, Y. J., Lim, K. Y., & Kim, E. K. (2011). Online university students satisfaction and persistence: Examining perceived level of presence, usefulness and ease of use as predictors in a structural model. Computers & Education, 57(2), 1654–1664. doi:10.1016/j.compedu.2011.02.008 Kanuka, H., Rourke, L., & Laflamme, E. (2007). The influence of instructional methods on the quality of online discussion. British Journal of Educational Technology, 38(2), 260–271. doi:10.1111/j.14678535.2006.00620.x Lobry de Bruyn, L. (2004). Monitoring online communication: Can the development of convergence and social presence indicate an interactive learning environment? Distance Education, 25(1), 67–81. doi:10.1080/0158791042000212468 Miles, M. B., & Huberman, A. M. (1994). Qualitative data analysis (2nd ed.). Thousand Oaks, CA: Sage Publications. Moore, M. G. (2007). The Theory of Transactional Distance. In M. G. Moore (Ed.), The Handbook of Distance Education (2nd ed., pp. 89–108). Mahwah, NJ: Lawrence Erlbaum. Newman, D., Webb, B., & Cochrane, C. (1995). A content analysis method to measure critical thinking in face-to-face and computer supported group learning. Interpersonal Computer and Technology: An Electronic Journal for the 21st Century, 3(2), 56-77. Pawan, F., Paulus, T. M., Yalcin, S., & Chang, C. F. (2003). Online learning: Patterns of engagement and interaction among in-service teachers. Language Learning & Technology, 7(3), 119–140. Richardson, J. C., & Ice, P. (2010). Investigating students level of critical thinking across instructional strategies in online discussions. The Internet and Higher Education, 13(1), 52–59. doi:10.1016/j.iheduc.2009.10.009 Schellens, T., Van Keer, H., & Valcke, M. (2005). The impact of role assignment on knowledge construction in asynchronous discussion groups a multilevel analysis. Small Group Research, 36(6), 704–745. doi:10.1177/1046496405281771 Schrire, S. (2006). Knowledge building in asynchronous discussion groups: Going beyond quantitative analysis. Computers & Education, 46(1), 49–70. doi:10.1016/j.compedu.2005.04.006 Strijbos, J. W., & De Laat, M. F. (2010). Developing the role concept for computer-supported collaborative learning: An explorative synthesis. Computers in Human Behavior, 26(4), 495–505. doi:10.1016/j. chb.2009.08.014 Strijbos, J. W., & Weinberger, A. (2010). Emerging and scripted roles in computer-supported collaborative learning. Computers in Human Behavior, 26(4), 491–494. doi:10.1016/j.chb.2009.08.006 Swan, K., Garrison, D. R., & Richardson, J. (2009). A constructivist approach to online learning: the Community of Inquiry framework. In C. R. Payne (Ed.), Information technology and constructivism in higher education: Progressive learning frameworks (pp. 43–57). Hershey, PA, USA: IGI Global. doi:10.4018/978-1-60566-654-9.ch004 Tam, M. (2000). Constructivism, instructional design, and technology: Implications for transforming distance learning. Journal of Educational Technology & Society, 3(2), 50–60.
Wise, A., Saghafian, M., & Padmanabhan, P. (2012). Towards more precise design guidance: Specifying and testing the functions of assigned student roles in online discussions. Educational Technology Research and Development, 60(1), 55–82. doi:10.1007/s11423-011-9212-7 Xie, K., Yu, C., & Bradshaw, A. C. (2014). Impacts of role assignment and participation in asynchronous discussions in college-level online classes. The Internet and Higher Education, 20, 10–19. doi:10.1016/j. iheduc.2013.09.003
APPENDIX

Online Discussion #3 Questions

Description: There is an exceptional amount of nutrition information available on the internet and in the media. Unfortunately, much of it is unscientific, dishonest or outright incorrect. Some companies use this nutrition misinformation to convince you to purchase their products. This discussion encourages you and your group to find examples of these marketing techniques as practice for recognizing valid nutrition science information and misinformation in popular culture.

Prior to discussion:
• Read pages 24-30 of the textbook (12th edition: p. 23 – 28).
• Also review the Earmarks of Nutrition Quackery list here: NUTR 295_OD3_Earmarks of Nutrition Quackery.docx

During Discussion:
• Search the internet for nutrition products' websites which you might consider to contain nutrition misinformation.
  ◦ Weight loss products, dietary supplements, nutritional "cleanses", diet plans and their associated books may be a good place to start.
  ◦ It is best to find the product's direct website, and not a third-party vendor. (For example, if you want to learn about the AwesomeDietPill product, you would go directly to AwesomeDietPill.com. Don't use the product page for AwesomeDietPill at Amazon.com.)
• Your team's task for the week is to identify examples of *each* of the 11 Earmarks of Nutrition Quackery on nutritional products' websites. For each earmark, you must:
  ◦ List at least one example and provide a link to the website.
  ◦ Provide a description of where the earmark appears on the website.
  ◦ (You may use a single website as an example for multiple earmarks.)
• If you have completed the task, and still need fodder for discussion, consider these questions:
  ◦ Which earmarks seem to be most commonly used in online advertising? Why do you think that is the case?
  ◦ Which earmarks are hardest to identify in online marketing, even when you know to look for them? More importantly, why?
  ◦ Are there any other "red flags" that aren't included in this list of earmarks?
• Don't forget the general discussion posting deadlines: each student must make at least one post by Saturday midnight, and a total of three posts by Tuesday at midnight. (These can include your 'special role' posts.)
Examples of Quality of Graded Discussion Posts High Quality Skeptic: I agree that school based interventions would be very beneficial to school aged children. Currently, only 3 states require mandatory recess and only 6 require mandatory PE classes-‐ none of said states requiring both simultaneously (TIME Magazine Online, “Childhood Obesity”). The obesity crisis is a horrible epidemic that is leading to the startling statistic that this generation will be the first to have a shorter life expectancy than their own parents (HBO, “The Weight of the Nation”). Aside from being beneficial to the children’s health, the physical activity during the long school day would aid with concentration and restlessness and subsequently lead to better behaviors-‐ both cognitively and physically-‐ during the school day. Aside from school, I think a family system intervention would be just as beneficial (similar to what Andrea said). It is difficult to tell a child to be healthy when those around them are not acting in the same way or eating the same foods. The entire family needs to have the same goal (of being healthy, losing excess body fat, living longer etc.) in order to be successful, especially in families that are overweight due to genetics because within those families it is more likely that even if a child is not overweight yet—that they will be in the future. According to this article (http://www.iom.edu/~/media/Files/ Activity%20Files/PublicHealth/ObesFramework/LynnSilverIOM11420091.pdf) out of New York, another type of intervention that would be beneficial would a “Changing the Context” intervention which is similar to the topic we debated in the last assignment. It discusses calorie labeling and how that should be the norm in all restaurants and eating establishments in order for people to be more aware of what they’re consuming. Wrapper: For my post as wrapper I’ll hit the main points of our discussion. (I also tweaked the scientific experiment a bit in this summary.) We selected the age group ‘school age children’ from the given list in the discussion instructions. Promoting physical activity to people at a younger age may make it more likely that they will establish healthy habits and (hopefully) continue them into their future. Three interventions that were discussed were: 1. School based Intervention: This could be helpful in reaching out to the most amounts of children possible. The programs could be standardized across multiple schools and monitored easily and accurately to maximize the number of children affected by the intervention. Physical activity during the school day would aid with concentration and restlessness and subsequently lead to better behaviors—both cognitively and physically—during the school day. 2. Home-based Intervention: This would hopefully get more than just the school-age children involved in the process. The entire family could get involved in improving their physical fitness, therefore making the adjustment easier for the children by providing them continuous support from their family members. 3. Community Intervention: This could hopefully get children to be more physically active close to home in their own neighborhood. Club sports could be established if they are not already, and parks could be constructed or simply preserved to create a safe place for children to get out and play. Club sports are also useful in allowing children to learn team-building skills. 
We decided to stick with school based intervention, which can be more easily standardized and monitored. In order to scientifically test our intervention type, we have decided to conduct a ran-
domized, controlled study on 4th grade students at a specific elementary school. We will randomly select 100 4th grade students to put in the intervention group. These students will be given physical activity and exercising advice in the classroom and an increased amount of physical activity at school throughout the year. Their physical health will be tested at the beginning, midpoint, and end of the school year. There will also be a control group of 100 randomly selected 4th grade students. These students will not be given any extra advice on exercising or an increased amount of physical activity during the school day. Their physical health will be tested the same as the experimental group. Before the school year, BMI readings will be taken for all participants and current exercise habits will be noted. At the midpoint and end of the school year these facts will be compared to the initial readings/information provided to note any significant changes to their physical fitness and BMI readings. A doctor will visit to take these readings and notes due to their invasiveness, and additionally to ensure student privacy. Even if the children in the intervention group have not had a reduction in their BMI, it could be beneficial to observe any changed physical activity habits due to a more physically active school environment. Skeptic: So I found this dietary supplement that hits a lot of the major earmarks. (http://miraclegarciniacambogia.com/article/gs/) It is called Miracle Garcinia Cambogia, and supposedly is a weight loss supplement that boasts easy and fast weight loss, without having to change anything in the diet or exercise. This hits the too good to be true claim. The very second paragraph on the site says that this new supplement is creating a major media buzz and has “doctors and scientists around the world very excited with its ability to help people lose weight without doing anything different.” This is authority not cited. There is also a bogus claim here. There is a scientific study that is put on the website done by a physician at Georgetown medical center. Upon further inspection of the article, I found that the researchers actually found this supplement can be effective only in conjunction with other supplements and modifications to the diet. This was an example of an unpublished study. Furthermore, after I did a pubmed search on this extract, I found several publications that stated that this supplement does NOT cause any significant changes in weight. Interesting, right? These articles I found are found in journals and are not trying to sell anything. Also, this website is clearly trying to sell these supplements, and is an advertisement. At the top of the website, it even says advertorial. The website is trying to sell the product, and there is also motive - personal gain. At the bottom, there are comments which all happen to be testimonials of people claiming to use the product and interestingly, each person had nothing but praises for these supplements. There are also some bogus scientific claims for the reasons this supplement works. There are several scientific reasons mentioned but one in particular that stood out to me as complete was “Eating is affiliated with emotion. Miracle Garcinia Cambogia increases your serotonin levels, which leads to better mood and sleep.” This seems a little farfetched to me, and even if it is true, there is absolutely nothing to back up to prove it. This is an example of logic without proof.
Chapter 3
Promoting Learner Interaction and Personalized Learning Experiences with a Google+ Social Media Model: How to Replace the Traditional Discussion Forum

JoAnne Dalton Scott
Instructional Design Practitioner/Researcher, USA
ABSTRACT

This chapter presents the Directed Google+ Community model (DG+) as an alternative to the traditional discussion board forum. Social media platforms exhibit characteristics that can be leveraged in course design to promote positive learner experiences. Specifically, the chapter will define the DG+ model; examine how it promotes learner interaction, discussion, collaboration and peer review; discuss how it supports course topics and course assignments and creates a searchable knowledge management system; and explain how it complements the use of a learning management system for grade reporting purposes. Both the instructor and the students experience benefits from this design tool. The chapter will also discuss ways to overcome potential obstacles to implementing the model.
INTRODUCTION

Quality communication between students and the instructor is a significant dynamic affecting course outcomes. Any instructor who has taught the same course more than once knows each course group has a different disposition even though the content and activities of the course remain relatively static. There is a kaleidoscope of reasons for this, the discussion of which is outside the purview of this chapter, but what is relevant is the way in which certain aspects of learner behavior can be leveraged to promote desired learning outcomes.

DOI: 10.4018/978-1-5225-1851-8.ch003
Class discussions contribute to peer collaboration, sense of community and learner perceptions whether they occur in face-to-face classroom settings, online via a threaded discussion board or asynchronously using some other bulletin board type tool. The use of threaded discussion boards in higher education courses is common, but the design of this tool does not adequately support students’ needs with respect to constructive communication scenarios. The purpose of this chapter is to provide instructional designers and instructors with a practical tool that maximizes the potential of asynchronous communications. More than that, the chapter is intended to be a workshop experience guiding the reader through the process of applying the Directed Google+ Community model (DG+) to instructional practice.
Background

Discussion forums are an often-used learning activity in both face-to-face and online courses, whether the course is offered synchronously, asynchronously or as a hybrid of these options. In this context, discussion forums refer to threaded discussion boards, which are typically housed within a learning management system (LMS). Participation is limited to students enrolled in the course and their instructor(s). Organizations utilizing the discussion forum include brick-and-mortar universities as well as online learning platforms that host massive open online courses (MOOCs). With respect to the effectiveness of discussion forums, research has historically examined learning outcomes (Kay, 2006; Thomas, 2002), the influence of instructor participation (Mazzolini & Maddison, 2003, 2007), and the optimal design of discussion board environments (Levine, 2007), including the ways in which structured and unstructured forums promote learner engagement (Salter & Conneely, 2015). Instructor social presence has been identified as a contributing factor to learner participation, development of a sense of community (Aragon, 2010), learner perception of satisfaction (Gunawardena & Zittle, 1997; Swan & Shih, 2005), and learner perception of personal achievement (Shin, 2002). More recently, massive open online courses (MOOCs) have provided rich sources of data for scrutinizing the ways in which discussion forums affect learners. Similarly, research topics include instructor presence, development of learning communities and peer support (Sharif & Magrill, 2015), as well as the potential impact of active participants to enrich discussions (Wong, Pursel, Divinsky, & Jansen, 2015) and discussion behaviors that correlate with learning gains (Wang, Yang, Wen, Koedinger, & Rosé, 2015). This research reveals two focal topics to consider when designing discussion forums: one addresses the social nature of people while the other addresses how they interpret their experience. Obstacles to learner success and participation in discussion environments include time commitment, misunderstanding of written communications between participants, and site navigation problems (Kay, 2006). Synchronous vs. asynchronous discussion options can also be problematic for learners depending on the content area, as some topics require immediate feedback while others benefit from the opportunity to formulate structured responses (Nandi, Hamilton, & Harland, 2015). Gender may also play a role in modality choice of distance discussions. There is evidence to suggest male learners prefer text-based communication while female learners prefer audio or video communication (Ching & Hsu, 2015). Again, there are two focal topics to consider: one addresses the functionality of how people use the forum and the other addresses their perception of conversation. Social media platforms including Facebook, Twitter and YouTube are increasingly applied to learning activities (Wankel, 2009). Research investigating social media use in educational applications has examined stimuli promoting instructors to utilize social media as well as effects of that use, which include learner satisfaction and learning outcomes (Cao & Hong, 2011). There are indications that social media use may be
restricted to social networking and communication among students, with instructors still unsure as to how they can most effectively incorporate social media into their courses (Tess, 2013). On the other hand, there is evidence of instructors creatively utilizing social media for instructional applications (Njoroge, 2016) as opposed to the distraction of non-instructionally directed social media use, which occurs when learners are not engaged in the instructional activities at hand (Flanigan & Babchuk, 2015). There is also evidence suggesting students perceive social media as a valuable addition to their learning experiences (Neier & Zayer, 2015). When engagement in social media was compared to engagement in corresponding MOOC discussion forums, learners demonstrated higher engagement and propensity to pursue course completion as a result of participation in social media groups (Zheng, Han, Rosson, & Carroll, 2016). As the learner population increasingly becomes saturated with digital natives of the millennial generation, the appeal of social media as a teaching and learning tool will presumably increase. Statistics reveal 88% of millennials turn to Facebook as a news outlet, and 73% of millennials who use social media report that they have at least some motivation to investigate opinions that conflict with their own ("How Millennials use and control social media," 03/16/15). Instagram, Snapchat and Facebook have been identified as the most popular social media platforms among college students, who seem to particularly value the ability to share photographs and videos (Knight-McCord et al., 2016). In addition, statistics indicate more than 60% of faculty in higher education institutions have used some variety of social media in their courses (Moran, Seaman, & Tinti-Kane, 2011). The expansion of social media from existing solely as a form of entertainment to include educational opportunities provides instructional designers with a revolutionary and promising learning tool.
First, by their very nature social media platforms utilize mechanisms that promote self-direction, presence and community. The degree to which the instructor promotes development of social presence will vary by individual, but the platform provides the necessary tools. The sense of community students feel will also depend somewhat on how the instructor facilitates the group, as well as the personalities of the individual students. Second, social media platforms provide ample techniques for influencing students’ perception of their experience in a positive way. Discussions are not limited to a strictly linear, text based conversation but rather a personal conversation oriented from life experience that satisfies the human need to collaborate and leave the conversation with the kind of closure that comes from reconciling knowledge discrepancies and/or demonstrating proficiency. 42
Third, social media platforms are functionally easy and efficient to use. Navigation is not typically an issue for any except the very novice user. The amount of time necessary to scan posts and participate in conversations is realistic. In addition, social media apps promote self-direction because they allow users the choice to participate in a discussion using a variety of devices, from anywhere, at any time, as their needs dictate. And finally, the problem of interpretation. Interpretation problems can result from the lack of visual cues that are present in face-to-face conversation (Kiesler, Siegel, & McGuire, 1984). This is a recurring theme in all written conversations; it is not isolated to social media content. The use of photos, links, emoticons and other conversation enhancers assists in cueing others in the conversation to the poster's intent. Social convention plays a role in maintaining decorum as well. People appear to be more likely to think twice before speaking when the peers in an online community are personally known to them. Research indicates members of a social group tend to behave within the established culture of the group when their identity is known rather than diverge from that culture (Douglas & McGarty, 2001; Shapiro & Anderson, 1985).

In light of the preceding discussion, social media platforms can be leveraged as a discussion forum alternative. The Directed Google+ Community model (DG+) can guide instructional designers and instructors in making the transfer from a traditional discussion forum to a social media platform. It can be used with synchronous face-to-face courses, asynchronous online courses, or hybrid courses. The DG+ model creates a private community that is specific to a particular course and learner group. The community is designed to benefit students by including provisions for promoting learner interaction, discussion, collaboration, and peer review, and it results in a searchable knowledge management system specific to the course and learner group. It benefits the instructor by making discussions easy to manage, integrating the syllabus and course calendar topics, and complementing the use of an LMS for grade reporting purposes. The DG+ model is important for the field of instructional design because it actively integrates plans for maximizing social presence through a prescribed design of discussion and task activities which, as research has shown, influences learner benefit (Swan & Shih, 2005). Furthermore, it contributes to the focus of this book by taking an existing instructional technique (the discussion board) and transforming that technique so it actively complements online learning in a novel and productive way. It is the author's hope that practitioners utilizing the DG+ model will contribute their experiences and insights to the evolution of this new pedagogy by participating in the DG+ community at https://plus.google.com/u/0/communities/113561215080481022498 or the blog at http://dgmodel.weebly.com. The remainder of the chapter will introduce the DG+ model; describe how to implement the model in an existing course and during course design; discuss potential obstacles to implementing the model; and relate both learner and instructor experiences with a course that utilized the model.
The Directed Google+ Community Model

It is important to begin by defining exactly what the Directed Google+ Community (DG+) model is and stating its applicable boundaries. When applied to existing courses, the DG+ model is a five-step, linear process, as shown in Figure 1. It guides the instructional designer or instructor in replacing the discussion board (or adding a comparable activity where none exists) with a more interactive, social learning experience. When applied to the design of new courses, the DG+ model is a series of tasks that are added to the chosen design model. The DG+ model functions alongside instructional design models in much the same way the ARCS Model of Motivational Design does.
Figure 1. Directed Google+ Community model as it applies to existing courses
Multiple and varied functions are included within the boundaries of a DG+ course, and they are visually represented in Figure 2. First, all asynchronous discussions (those that would typically take place in a discussion board forum) are conducted in a Google+ community that is private to the course. This includes both student-to-student discussions and discussions in which the instructor participates. Second, assignments that are intended to be shared with course participants for collaborative or peer learning purposes are submitted for both grading and sharing purposes using the Google+ community. Third, the instructor uses the hashtag index system to collect the assignments. Once assignments are graded, the grades are posted to the learning management system (LMS). For the purposes of peer learning, students use the same hashtag system to locate information within the community so as to collaborate and examine each other’s work. Finally, the community provides a lasting knowledge management system of sorts. Students retain access to course resources after the course has ended. Again, the hashtag system enables sorting and retrieving of information. Ideally students will develop their personal learning network and continue to collaborate and share information even after the course has ended.

Figure 2. Visual representation of functions included within the boundaries of a DG+ course
IMPLEMENTING THE MODEL WITH AN EXISTING COURSE

It is now time to turn this discussion from passive to active. Choose a course to transfer from discussion board format to DG+ format. If you are an instructor, choose a course you have taught more than once so you are adequately acquainted with it. If you are an instructional designer, choose a course you designed personally. For both instructors and designers, a course with few graded assignments and a simple, linear progression is preferable for your first DG+ experience to a course heavy with assignments, projects and complex learning activities. Let’s begin.
Collect

The first step in the DG+ model is collect. Having support documents physically available, as opposed to knowing their content vaguely in your mind, will make the transfer process easier. For the chosen course, collect the syllabus, calendar and any other documents students receive during orientation to the course. These other documents may include a list of assignments, instructions for assignments and a reading list. In addition, gather any handouts, sample assignments and other resources students receive as the course progresses. You can choose whether or not to include these in the hashtag index during the transfer step. Either way, having them available will ensure nothing is inadvertently omitted. A word about working with digital documents as opposed to paper documents: the important thing to remember is that multiple documents are used simultaneously. Dual monitors, and large monitors that allow for displaying multiple documents, facilitate making the collect step paper-free. The syllabus and course calendar will ultimately need to be updated to include information specific to courses that utilize the DG+ model. When working with digital documents, this is done as decisions are made. When working with paper documents, information is handwritten on the document and transferred to the digital copy later. The collect step is quick and results in nothing more than a small stack of papers (or computer files). Next is the transfer step.
Transfer

The next step in the DG+ model is transfer. This step is the most complex and time intensive. It is divided into three tasks: identify, update and create. First, identify assignments and activities for transfer. There are generally three types of activities to consider: graded assignments, projects and discussions. Each of these activities requires the instructor’s attention and results in a grade. Consult the syllabus and calendar to identify each instance where students are required to submit an assignment or project for a grade and each instance where they are required to participate in an asynchronous discussion (discussion board post). Make a list of the identified assignments. Table 1 details the DG+ Blueprint. It is provided to help in organizing the information compiled during the transfer step and will be referenced from this point forward. Write the due date next to each assignment’s name on the blueprint. If possible, list the assignments in date order.
Table 1. DG+ Blueprint

Assignment Title  | Due Date | Sharable Yes/No | Hashtag (if yes) | Grade Weight | calendar | syllabus | assignment instructions | LMS
Sample Assignment | 12/31/99 | yes             | #sample          | 2 pts.       | ✓        | ✓        | ✓                        | ✓
For each assignment, decide whether it should be shared via the course’s Google+ community. Discussions obviously take place via the community; tests and quizzes obviously will not. Essays, papers and projects have the potential to impact peer learning when students review and discuss one another’s work. The issue of cheating, that is, students looking at the work of others before completing the work themselves, is discussed in a later section. On the blueprint, mark each assignment for shareability. Give each shareable assignment a short, one- to two-word identifier. This will become the assignment’s hashtag. When creating hashtags, use a naming convention that is logical and minimizes the incidence of typing mistakes. A shortened version of the assignment’s name that uses the underscore to replace spaces typically works well. For example, the hashtag for an assignment titled “Naming Convention Essay” could be #nc_essay, #ncessay or #nc. The next section on the blueprint is for noting how an assignment is graded. This will vary by assignment and course. Discussions may earn a participation-type grade, i.e., the student is awarded points for posting as noted in the activity’s instructions. Papers and projects may be weighted differently. In this section of the blueprint, note how the particular assignment is graded.
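For readers who prefer to keep the blueprint in a spreadsheet or short script rather than on paper, the sketch below models one blueprint row and one reading of the underscore naming convention just described. It is only an illustration: the field names and the make_hashtag helper are hypothetical conveniences, not part of the published DG+ model.

```python
import re
from dataclasses import dataclass

def make_hashtag(title: str) -> str:
    """One reading of the naming convention described above: abbreviate all
    but the last word to initials, keep the last word, and join with an
    underscore, e.g. 'Naming Convention Essay' -> '#nc_essay'. Instructors
    may of course shorten further (e.g. '#nc')."""
    words = re.findall(r"[a-z0-9]+", title.lower())
    if not words:
        raise ValueError("title contains no usable characters")
    initials = "".join(w[0] for w in words[:-1])
    return "#" + (f"{initials}_{words[-1]}" if initials else words[-1])

@dataclass
class BlueprintRow:
    """One line of the DG+ Blueprint (Table 1); field names are illustrative."""
    title: str
    due_date: str        # e.g. "12/31/99", as in the sample row of Table 1
    shareable: bool      # shared via the Google+ community?
    hashtag: str         # empty if the assignment is not shareable
    grade_weight: str    # e.g. "2 pts." or "participation"

row = BlueprintRow(
    title="Naming Convention Essay",
    due_date="12/31/99",
    shareable=True,
    hashtag=make_hashtag("Naming Convention Essay"),
    grade_weight="2 pts.",
)
print(row.hashtag)  # -> #nc_essay
```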
Update Course Handouts, Activity Instructions and LMS Shell

Certain information, such as due dates, is always updated on the calendar and syllabus for each new iteration of a course. For DG+ courses, once they have been added, the assignment hashtags can remain the same for each new iteration. Add the hashtag for each assignment to every instance of its occurrence on the calendar, in the syllabus, in the assignment instructions, and in the LMS shell. A short statement explaining the use of hashtags should be included in the syllabus and LMS shell; an example is provided in Table 2. This is a good time to consider how much to use the LMS course shell and how much to use the Google+ community. There are three options. The first is to post all instructions and related resources in the LMS. The second is to create a post for each assignment, with instructions and resources, in the Google+ community. The third is to utilize both formats and allow students to choose how they access course information. The third option is initially more work for the instructor or designer, but it is preferable for students because it recognizes adult learners’ need for self-direction and choice. Remember to assign a hashtag to each post and monitor students to ensure they understand what is expected of them and when it is due.
Table 2. Sample explanation of hashtags

Instead of the traditional discussion board format, we will be using a private Google+ community created specifically for this course. When you receive the invitation to join, please do so. Google+ communities are similar to other social media platforms. You can create a text-based post; create a post and attach media such as a document, URL or other file; comment on or respond to the posts of others; and 1+ posts, meaning you like the content. Posts should be marked with hashtags (#) for easy sorting. The #index post provides a list of hashtags specific to our community. You can add to the index by posting a reply with a new tag. Many of your assignments this semester will be submitted via our Google+ community. The reason for this is to develop and maintain your personal learning network. Take time to look at the work of your peers and learn from them. You may or may not be required to reply to or comment upon the work of others, as indicated in the assignment instructions. Always contribute respectful comments that add value to the quality of the project. How to properly post an assignment:
1. Start a new post and attach your assignment as a document, file or whatever is appropriate under the circumstances.
2. Mark the post with the hashtag specific to that assignment. If you do not include the hashtag, you may not receive credit for completing the work. Hashtag designations are provided in the syllabus, with each activity’s instructions and in the #index post. Use care to spell the hashtag exactly as it is shown in the instructions.
If you have any questions about posting your assignments, please ask. Neglecting to ask for clarification if you are confused is not an excuse for late work.
The last four columns on the blueprint are provided to help ensure all resources are updated. For example, if an assignment listed on the blueprint has its date, hashtag and grading information updated in the course calendar, check the box. If they are updated in the syllabus, check the box. Complete the task, check the box. Create the course’s private Google+ community and give it a name. It is advisable to establish a naming convention before creating the community. For instructors who manage more than one community, naming them by the day the course meets, the course title or some other arbitrary description will become confusing. A naming convention that includes a course identifier, the semester and year the course is taught, and a section identifier if there are multiple offerings of the course should suffice. As an example, the DG+ community name for the second section of a Chemistry I course offered in the fall semester of 2016 would be “ChemI_fall16_B.” Yes, the name is very unimaginative, but there is no question as to which course offering the community belongs to. Instructions for creating the community are not included here, as Google offers an extensive library of help topics. For individuals new to Google products, the Help Center (https://support.google.com/plus#topic=) includes topics such as communities as well as connections, collections, settings, sharing and troubleshooting. An overview of Google+ communities can be found on the webpage titled “Getting Started with Communities” (https://support.google.com/plus/answer/6320411?hl=en&ref_topic=6320361). It includes a brief description of the behaviors in which community members can engage and the capabilities community owners have to maintain their communities. This page also discusses the differences between collections and communities. Collections are an interesting and useful tool in their own right. Step-by-step instructions for creating a new Google+ community can be found on the webpage titled “Create and edit a Community” (https://support.google.com/plus/answer/6320395). Additional topics include editing community settings, deleting communities and pinning a post so it appears first when members access the community.
The last topic on this webpage is privacy settings. Be sure to make the course community private. This is also a good time to make the first post, and the first post should be an index of the hashtags created earlier in the transfer step. Consider pinning the #index post. Making the index readily available will prevent a certain number of errors when students post their assignments. Both the instructor and students can add to the index of hashtags as the course progresses by replying to the original #index post. Lastly, decide how to invite students to join the community. Including a link to the community in the LMS gives students the ability to request membership. Another option is to email an invitation to each student.
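The community naming convention illustrated above (“ChemI_fall16_B”) can also be captured in a small helper so that every new offering is labeled the same way. This is a minimal sketch; the function name and parameters are illustrative rather than part of the model.

```python
def community_name(course_id: str, semester: str, year: int, section: str = "") -> str:
    """Assemble a DG+ community name such as 'ChemI_fall16_B' from a course
    identifier, the semester, a four-digit year and an optional section letter.
    A sketch of the convention suggested in the text; adjust the pieces to
    match local catalog codes."""
    parts = [course_id, f"{semester}{year % 100:02d}"]
    if section:
        parts.append(section)
    return "_".join(parts)

print(community_name("ChemI", "fall", 2016, "B"))  # -> ChemI_fall16_B
```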
Verify

The third step in the DG+ model is verify. This step is simple, straightforward, and tempting to skip. Check your work. Better yet, have someone else check the work. Just as in proofreading, the writer often sees what is intended rather than what is actually there. Whoever completes the verify step must compare the blueprint to the actual course documents and LMS shell to ensure all assignments were transferred and the text is correct. The reviewer should also make notes about their experience with usability. If the reviewer has difficulties, the students will most likely have difficulties as well. Conclude the verify step by making revisions.
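Part of this comparison can be automated when the syllabus, calendar and activity instructions exist as plain-text exports. The sketch below only flags blueprint hashtags that are missing from a document; it is no substitute for a human usability review, and the file names and the (title, hashtag) pair format are assumptions, not part of the model.

```python
from pathlib import Path

def missing_hashtags(blueprint_rows, documents):
    """Return a message for every blueprint hashtag that does not appear in a
    course document. blueprint_rows is a list of (title, hashtag) pairs and
    documents a dict mapping a label to a plain-text file path."""
    texts = {label: Path(path).read_text(encoding="utf-8").lower()
             for label, path in documents.items()}
    problems = []
    for title, hashtag in blueprint_rows:
        for label, text in texts.items():
            if hashtag.lower() not in text:
                problems.append(f"{hashtag} ({title}) not found in {label}")
    return problems

# Hypothetical usage with exported course documents:
# missing_hashtags([("Naming Convention Essay", "#nc_essay")],
#                  {"syllabus": "syllabus.txt", "calendar": "calendar.txt"})
```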
Implement

The fourth step in the DG+ model is implement. Implementation begins the day of the first class meeting. Whether that meeting is face-to-face or online, synchronous or asynchronous, the way in which students are oriented to the course must include an explanation of the Google+ community. Students need to know what the community is, how to join, how to participate in discussions, how to submit assignments (including attaching files), how the hashtag index works and what expectations the instructor has for their participation in the community. This is also a good time to review netiquette. Be creative with the orientation platform. A short video explaining the organization of the course can be included in the LMS. Another option is a webinar demonstration utilizing screen sharing. Whatever the format, orientation must be addressed up front to prevent students from becoming frustrated because the course layout is unfamiliar. Frustrated students do not promote a sense of community or exert social presence. Once students have been oriented to the community and its function, they must join. If a link to the community is included in the LMS shell, as referenced in the transfer step above, students can request membership. If a link is not included in the LMS shell, the instructor must invite students to join using their Gmail addresses. If necessary, assist any students who do not have a Gmail account in signing up for an account and joining the community. These students may need mentoring as they learn to navigate Google applications. Once the students have joined the Google+ community, post a welcome statement. This is the instructor’s first opportunity to establish social presence. It is important for the students to understand that the instructor values the community and its potential benefits. Verify that each student registered for the course has joined the community before the first assignment is due.
Monitor

The fifth step in the DG+ model is monitor. All courses must be monitored for effectiveness. Revisions are an essential part of the instructional design process. Sometimes adjustments are necessary to suit the needs of the learner group; sometimes they are necessary to amend the design of the course. Monitor the community as a communications facilitator. Be active in the community. Develop instructor presence by guiding and initiating discussions. Encourage students to develop personal presence by participating in meaningful ways rather than simply satisfying course requirements. As students become more involved in the community, step back, being careful to maintain instructor presence, and let them take the lead. Monitor the community as a traffic cop. Conversations, like traffic, can quickly back up or become precarious. When any discussion in the community takes a wrong turn, gently but firmly direct it back to the proper path. This includes interactions of a personal nature that may be inappropriate or discussions where students are forming incorrect conceptions of course topics. Monitor the community as a project manager. The DG+ course you implement, and any course for that matter, is your project, and it has specific student goals. As the course progresses, take note of things that work as well as things that do not work, and document specifically why. Use this information to modify future iterations of the course.
Directions for the Instructor

There are two instructor functions on which DG+ courses focus. The first, presence, has been discussed. Now it is time to examine the second, grading assignments. The hashtag system enables students to sort and access information, and it serves the same purpose for the instructor. When it is time to grade assignments, simply go to the community and sort the posts by the hashtag specific to the assignment. Each student post containing that hashtag will be shuffled to the top. Posts are dated, so the instructor can be confident assignments were submitted in a timely manner. Some assignments, like discussions, earn participation points. Once the posts are sorted by hashtag, it is easy to see who participated and who did not. Other assignments include a product the student created and attached to the post as a file or link. These must be downloaded or accessed for grading purposes, the same as if they were submitted via an LMS. Once grading is complete, simply input the final grade into the institution’s LMS. Using dual screens, or opening two windows on a large monitor, makes grading and grade reporting quick and easy. Make the most of the opportunity to promote peer learning and personal learning network development by building peer feedback about assignments into discussion activities. An activity that earns participation points for students when they evaluate and provide constructive feedback on an assignment submitted by a classmate is shown in the scenario in Table 3.
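The sorting logic is simple enough to reproduce outside the platform if an instructor keeps, or exports, a list of posts. The sketch below is not the Google+ API; the post records and roster are hypothetical, and the function merely mirrors the “sort by hashtag, see who participated” routine described above.

```python
def participation(posts, hashtag, roster):
    """Split a roster into students who did and did not post with the given
    hashtag. Each post is assumed to be a dict with 'author' and 'text' keys."""
    submitted = {p["author"] for p in posts if hashtag.lower() in p["text"].lower()}
    missing = [name for name in roster if name not in submitted]
    return submitted, missing

posts = [
    {"author": "Student A", "text": "My comparison of models #comparemodels"},
    {"author": "Student B", "text": "A question about this week's reading"},
]
done, not_done = participation(posts, "#comparemodels", ["Student A", "Student B"])
print(done)      # {'Student A'}
print(not_done)  # ['Student B']
```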
Implementing the Model in Conjunction with Course Design

The DG+ model has so far been presented in a format for use with existing courses. Blended and hybrid courses, as well as fully online courses, are becoming more prominent than in past decades, and instructors and instructional designers are increasingly driven to transfer traditional courses to a blended format. The version of the DG+ model detailed above assists in this context.
Table 3. Scenario demonstrating instructions for peer discussion of projects

Assignment: Comparison of Motivational Models
Hashtag: #comparemodels

This is what you will get from the activity: You will compare the ARCS Model (which is the model we examine closely and utilize in this course) with the other motivational model you chose in the first part of this activity to recognize how both models use theoretical concepts to promote motivation in instruction. Keep in mind, you do not have to like or believe in the usefulness of the model you use for comparison. The purpose of choosing a model is to leverage peer learning; teach your classmates what you learned about the model you chose. We talk about the ARCS Model extensively, so do not spend too much time on it other than using it for comparison purposes.

This is how you will do it:
1. Choose a presentation format. It can be visual, such as an infographic, Prezi or chart; audio, like a Voki; or a short written narrative … the choice is yours. However you present the information, be sure to:
   • Include the main points of the motivational model,
   • Note how the model uses motivational theories, and
   • Briefly compare the model to the ARCS Model.
   This is not meant to be a comprehensive presentation; it is meant to be concise. Follow these general guidelines: an essay should not exceed 750 words; recorded visuals or audio should not exceed 3 minutes. Be thorough enough so your peers get the gist, but do not overdo it.
2. Post your comparison to the course Google+ community and mark it #comparemodels. Posts that do not include the hashtag may not receive credit.
3. Read at least two posts for each motivational model and comment on one post for each. Acceptable comments add value to the discussion by highlighting some aspect of the presented information, providing constructive critique of the comparison or asking a question that elicits a more thorough explanation of the topic.
4. Remember this is a “discussion,” meaning there is back-and-forth conversation. Monitor the community periodically to stay involved in the discussion.
When designing a new course, the DG+ model becomes an add-on to the chosen instructional design model rather than a model in itself. Instructional design models include analysis functions, which are not influenced by DG+ topics. Phases that function to design, develop, implement and evaluate are influenced by DG+ topics. Figure 3 demonstrates how the DG+ model becomes a series of tasks added to an instructional design model. The ADDIE framework is used because of its generic nature. Notice the gear shape used to represent each step: the tasks necessary to complete a step work together as a portion of the whole systematic instructional design process, similar to the way gears in a clock work together to track time. When applied to the design of a new course, the collect step is irrelevant; notice it is omitted from Figure 3. The transfer step changes somewhat to become create, and it aligns with design and development in the ADDIE framework. In the context of designing a course, the focus is on creating new materials rather than adding, or transferring, information from the blueprint to existing materials. Specifically, as assignments and activities are designed they should be assigned a hashtag and included in course documents (calendar, syllabus). The blueprint is still a useful tool to help organize information. The course community is created during development, and the method of orienting students to its use is addressed at this time. The verify step is part of the formative evaluation process that takes place during the concluding functions of development. It is a critical component of instructional design and should not be overlooked, even when time is lacking. As noted above, have someone else check the course for usability. Implement and monitor in the DG+ model align with implement in the ADDIE framework; they operate in the same way as in courses that have been updated to follow the DG+ model, and the implement and monitor discussions above continue to apply.
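The alignment described in this paragraph and depicted in Figure 3 can also be restated as a simple lookup. The structure below is only a reading aid that paraphrases the chapter’s wording; it is not an artifact of the DG+ model itself.

```python
# DG+ tasks grouped by the ADDIE phase they accompany, as described above.
DG_PLUS_BY_ADDIE_PHASE = {
    "Analysis": [],  # not influenced by DG+ topics
    "Design": ["create: assign hashtags as activities are designed"],
    "Development": ["create: build the private community and plan orientation",
                    "verify: formative usability check by a second reviewer"],
    "Implementation": ["implement: orient students and enroll them in the community",
                       "monitor: facilitator, traffic cop and project manager roles"],
    "Evaluation": ["formative review during the course and summative reflection after it"],
}
```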
Figure 3. Directed Google+ Community Model as it applies to the design of new courses
Finally, ADDIE includes a step to evaluate the course for effectiveness. DG+ includes evaluation as part of the instructor’s responsibilities as the project manager of the course. As such, evaluation is a formative process which takes place during the course and a summative process as the instructor reflects on learning outcomes once the course has concluded.
Obstacles to Implementing the Model

When any new technology is implemented, there are impediments to overcome. Innovation is fraught with obstacles. The means to address some of the most common, and possibly most emotionally charged, obstacles are discussed here. First and foremost is the issue of privacy. Google has a reputation for mining users’ information that is concerning to some. Every search engine, email provider, social media platform and online retailer mines information; the user’s perception of the way in which Google collects information is the relevant aspect of this obstacle. Google’s detailed privacy policy (https://www.google.com/intl/en/policies/privacy/) explains what information is collected and how it is used. Compare it to Yahoo!’s privacy policy (https://policies.yahoo.com/us/en/yahoo/privacy/index.htm). Email providers are not the only companies that mine data about users and their internet behavior. Online retailers such as Amazon (privacy policy: http://www.amazon.com/gp/help/customer/display.html?nodeId=468496), social media platforms such as Facebook (privacy policy: https://www.facebook.com/policy.php) and, unsurprisingly, the Nielsen Company (privacy policy: http://www.nielsen.com/us/en/privacy-policy/digital-measurement.html) collect large amounts of information. Making wise choices about internet behavior means being informed about how information is used. Consult the privacy policies of various companies to learn exactly what information is collected and how it is used.
Compare one to another. Is Google really collecting more information than Amazon? Is Google’s use of that information substantially different from the way Facebook uses it? The US government has established a website, managed by the Federal Trade Commission in partnership with other federal agencies, to assist internet users in employing responsible and safe internet behaviors (https://www.onguardonline.gov/). Overcome this obstacle by aligning perception with reality and promoting responsible internet behaviors. Just as important as privacy is institutional policy on social media use. Before implementing any type of social media tool, thoroughly investigate the institution’s written policy. Overcome this obstacle by knowing institutional policy and acting within its boundaries. Both instructors and students will have prejudices for or against social media and/or particular social media platforms. There are instructors who believe social media platforms are strictly social in nature and have no place in academia. Refer them to the literature. Research indicates students enjoy using social media and that their use of social media has the potential to influence learning outcomes in a positive way. (The reference list at the end of this chapter provides a selection of useful journal articles.) There are individuals who have not embraced social media or who feel strong prejudices for or against particular social media platforms. For those students who simply have not gotten on the social media bandwagon, provide mentoring. Explain that they are not selling out (for some, refraining from joining the social media revolution is a matter of principle); they are simply participating in a course activity. Assist them in learning how to use the tools Google+ provides. If they are comfortable using the tools, they are more likely to embrace them. Overcome this obstacle by refuting prejudices with information and mentoring. Similarly, fellow instructors may not see the value in incorporating social media into course activities. Once the course ends, students will provide feedback via their institutionally mandated course evaluations. Those evaluations should reflect positive experiences with the course’s Google+ community. For more detailed feedback, devise a survey that asks students specific questions about their participation in the Google+ community and how it affected their course experience. Overcome this obstacle with data collected from course participants. Regarding student behaviors, there is the issue of netiquette. As discussed previously, orientation to the course is a good time to review what is expected of students in regard to their online conversations. In general, people are more likely to behave in socially acceptable ways when the members of the online group are personally known to them (Douglas & McGarty, 2001; Shapiro & Anderson, 1985). Even so, instructors should be aware that the potential for misinterpretation of written communication continues to exist. Be prepared to mediate if warranted, and consider including a written netiquette policy in the syllabus. Overcome this obstacle with a combination of proactive steps to minimize potential issues and a swift reactive response if issues arise. Another student behavior of concern is cheating. In this case, the potential for cheating exists within the sharing utility that is so important for maximizing peer collaboration.
When students submit their assignments via the Google+ community, there is the potential for others to do reconnaissance before completing their own assignments. This behavior is not collaboration; it is cheating. There is no one-size-fits-all way to prevent the behavior, and different types of assignments will require different preventative measures. Set standards for submitting assignments that minimize the occurrence of cheating and make it obvious when students cheat. Consider having students submit their work on a particular day and that day only: no early submissions, no late submissions. This ensures there is not sufficient time for a student to complete an assignment after looking at the work of others. Consider accepting only certain types of assignments for submission through the Google+ community, such as written works that can be checked for plagiarism.
Consider having students work on their assignments at least once during class time, so the instructor can see evidence of work and document some identifying factor about the assignment that will demonstrate the finished product is indeed the student’s own. Consider having students submit a statement detailing the topic of their assignment, how they will complete it, and other identifying information. Consider using free, web-based software that requires the student to create an account; the finished product will then be identified with the student’s username. For example, if a student uses http://piktochart.com/ to create an infographic and submits the URL to the published work, their username will be attached to it. Overcome this obstacle by setting standards for submitting assignments that minimize the opportunity to cheat. For students who do not have a Google account, simply assist them in creating one using Google’s account creation page (https://accounts.google.com/Signup). It may be necessary to modify their attitude toward having yet another online account, and these students may need a little extra encouragement to participate at the beginning of the course. Employ onboarding techniques, such as demonstrating how to log into the account, access the community and participate in it, to gain their support. By using the settings menu, students can set their account to forward emails to the account they prefer, and they can receive emailed updates when someone posts to the community or comments on their post. Overcome this obstacle through mentoring and by making their use of a Google account as non-invasive as possible.
DG+ Course Experiences

A graduate-level instructional design course titled “Principles of Learner Motivation” was the first course to utilize the DG+ model. It was a blended course delivered in the fall of 2014 at a private university in the southeastern US. For half of the class meetings, students met face-to-face; for the other half, students completed asynchronous online activities, which included participation in the course’s Google+ community. Students participated in discussions, shared resources, submitted assignments and provided feedback on the work of their peers. The instructor accessed assignments that had been submitted via the community and participated in discussions. The only aspect of the DG+ model that was not reflected in this course offering was the use of an #index post; students were expected to reference their course syllabus for the hashtags. The instructional design program of which this course was a part was a new program, launched in the fall semester of 2012. For the most part, students in the program were acquainted, having taken other courses together, and the instructor knew the students as well. The first course meeting was delivered asynchronously, requiring students to participate in orientation activities and complete their first assignments via the course materials as presented in the LMS. Figure 4 is a screenshot of the course home page in the LMS shell, with personally identifying information obscured. Students were expected to read the information and view the video tour, which took the place of the traditional first-day, face-to-face orientation. Students adapted to this change well, and the few questions they had were resolved by the instructor via email. Precise course design aimed at anticipating student needs, combined with instructor availability and social presence during onboarding (tactics supported by the literature), assisted students in forming positive perceptions of their initial experience with the course.
Figure 4. Example of the homepage from a DG+ course which shows how students were oriented to the course format
Figure 5 is a screenshot of the Google+ community, with personally identifying information obscured. Students shared their assignments using the appropriate hashtag designation, shared resources and participated in peer review. The literature indicates both structured and unstructured forums affect learner engagement (Salter & Conneely, 2015), and both styles were used in this course: when and what project students contributed to the community was more structured, while their choice of how to complete the project was less structured. The less structured nature of projects leveraged andragogical approaches by allowing learners to self-direct by drawing on their life context, experience and self-perceived needs. Many of the assignments in this course were incremental in that earlier assignments contributed to the content of later assignments, in-class activities and projects. This made peer feedback all the more important because students had the opportunity to revise their work before the next portion of an assignment was due. This peer feedback mechanism leveraged constructivist approaches by prompting students to assemble critiques, knowledge and life experience from their learning community. The instructor was very active in the community, both initiating posts and contributing to the posts of students, which had the all-important effect of creating instructor social presence. In addition, participation in peer feedback activities, which was initially compulsory, became more natural as the course progressed and continued to enhance the students’ sense of belonging to a learning community. Evidence of this can be found in the timing of posts, which were made increasingly far in advance of scheduled due dates. It was through this learning community that students were able to construct a deeper understanding of the concepts taught as they utilized the Google+ community. An interesting dynamic created by the blended nature of this course was the way in which students were afforded the opportunity to converse both synchronously (in class or via Google’s Hangouts and chat features) and asynchronously (via posts to the Google+ community), which Nandi et al. (2015) recognize as benefiting different types of topics. The instructor reported that implementation and maintenance of the course were trouble-free. Sorting assignments using the hashtag system was simple, as was keeping up with the discussion posts. The amount of time necessary to suitably moderate a discussion forum was initially a concern for the instructor; in practice, the instructor found posts within the Google+ community easy to review and monitor.
Figure 5. Example of Google+ community posts from a DG+ course
A blend of text-based posts, photographs, recordings, attachments and documents encouraged students’ efforts to learn and develop their skills; it made participation in the community engaging by satisfying the needs of male and female students who may prefer different types of communication (Ching & Hsu, 2015); and it promoted a positive perception of the learning experience. Upon conclusion of the course, the instructor reported learning gains among the students and noted that the DG+ aspects of the course “achieved the social interaction that we were hoping to get” (personal communication, December 11, 2014). For the following iteration of the course, the instructor used the same design. Students reported satisfaction with the online portion of the course as well. Even though students were previously known to one another, they experienced a marked sense of community, and they continued to be socially engaged during the weeks there were no face-to-face meetings. Student comments on blind course evaluations were positive. A representative few stated, “I really enjoyed the Google+ community; it was a great way to share work/ideas and give each other feedback,” “Google+ community assignments were worthwhile,” and “liked the social media aspect.” The instructor reported most end-of-course projects exceeded expectations. One student went on to submit her mid-term paper to an international conference; happily, it was accepted for a concurrent presentation. In the beginning of this chapter, four focal topics to consider when choosing to use a social media platform instead of a traditional discussion forum were identified: the social nature of people, how people interpret their experience, the functionality of forum uses and the perception of conversations. All four of these considerations are addressed by the DG+ model, and the course example above demonstrates that they can be functionally addressed using a social media platform rather than a discussion forum.
FUTURE DIRECTIONS

The pace of innovation in technology is swift. Innovation in educational pedagogies is somewhat less swift because emerging pedagogies must be vetted; due diligence dictates that educators examine the implications of techniques and technologies for student learning outcomes. Rather than jumping from one newly developed tool to another, it is advisable to grow a pedagogy as technology changes and research becomes available. To that end, the designer of the DG+ model maintains a website and Google+ community where instructional designers, instructors, students, and anyone else who has experience with a DG+ course can collaborate and contribute information. All contributions will be used to inform revisions and add dimension to the model. What is now considered an innovative approach to online education may soon be abandoned for the next seemingly new and better innovation. Educators can repeatedly align themselves with each new pedagogy, or they can contribute to developing the full potential of a few core pedagogies they continually apply to their instructional practice. The DG+ model promotes peer collaboration and will, hopefully, benefit from peer collaboration so that it remains viable as technology, social media and student populations evolve. The same can be said about the pedagogies proposed in other chapters of this book. Educators discussing, collaborating and sharing course experiences support the momentum that inspires discovery: discovery of better ways of teaching, discovery of better ways of using technology, and discovery of broader opportunities for learning.
CONCLUSION

This chapter discussed the traditional discussion board forum, examined factors influencing student satisfaction and success with regard to online communications, and considered the potential of social media platforms as a medium for learning. It then presented the Directed Google+ Community model and described it in a way that provides instructors and instructional designers with a practical tool for course re-design as well as course design. The DG+ model contributes to the field of instructional design and to the focus of this book by using the progression of technology to advance the pedagogy of online instruction. It is the author’s hope that practitioners utilizing the DG+ model will visit https://plus.google.com/u/0/communities/113561215080481022498 or http://dgmodel.weebly.com to contribute their experiences and insights to the evolution of this new pedagogy.
REFERENCES

Aragon, S. R. (2010). Creating social presence in online environments. In New Directions for Adult and Continuing Education (pp. 57-68). San Francisco, CA: Jossey-Bass.

Byrnes, J. P. (2001). Cognitive development and learning in instructional contexts. Allyn & Bacon.

Cao, Y., & Hong, P. (2011). Antecedents and consequences of social media utilization in college teaching: A proposed model with mixed-methods investigation. On the Horizon, 19(4), 297–306. doi:10.1108/10748121111179420
Ching, Y.-H., & Hsu, Y.-C. (2015). Online graduate students’ preferences of discussion modality: Does gender matter? Journal of Online Learning and Teaching, 11(1), 31.

Douglas, K. M., & McGarty, C. (2001). Identifiability and self-presentation: Computer-mediated communication and intergroup interaction. The British Journal of Social Psychology, 40(3), 399–416. doi:10.1348/014466601164894 PMID:11593941

Flanigan, A. E., & Babchuk, W. A. (2015). Social media as academic quicksand: A phenomenological study of student experiences in and out of the classroom. Learning and Individual Differences, 44, 40–45. doi:10.1016/j.lindif.2015.11.003

Gunawardena, C. N., & Zittle, F. J. (1997). Social presence as a predictor of satisfaction within a computer-mediated conferencing environment. American Journal of Distance Education, 11(3), 8–26. doi:10.1080/08923649709526970

How Millennials use and control social media. (2015). American Press Institute. Retrieved from http://www.americanpressinstitute.org/publications/reports/survey-research/millennials-social-media/

Huang, H. M. (2002). Toward constructivism for adult learners in online learning environments. British Journal of Educational Technology, 33(1), 27–37. doi:10.1111/1467-8535.00236

Jonassen, D., Davidson, M., Collins, M., Campbell, J., & Haag, B. B. (1995). Constructivism and computer-mediated communication in distance education. American Journal of Distance Education, 9(2), 7–26. doi:10.1080/08923649509526885

Kay, R. H. (2006). Developing a comprehensive metric for assessing discussion board effectiveness. British Journal of Educational Technology, 37(5), 761–783. doi:10.1111/j.1467-8535.2006.00560.x

Kiesler, S., Siegel, J., & McGuire, T. W. (1984). Social psychological aspects of computer-mediated communication. The American Psychologist, 39(10), 1123–1134. doi:10.1037/0003-066X.39.10.1123

Knight-McCord, J., Cleary, D., Grant, N., Herron, A., Lacey, T., Livingston, T., & Emanuel, R. et al. (2016). What social media sites do college students use most? Journal of Undergraduate Ethnic Minority Psychology, 2, 21.

Knowles, M. S. (1970). The modern practice of adult education (Vol. 41). New York, NY: Association Press.

Levine, S. J. (2007). The online discussion board. New Directions for Adult and Continuing Education, 2007(113), 67–74. doi:10.1002/ace.248

Mazzolini, M., & Maddison, S. (2003). Sage, guide or ghost? The effect of instructor intervention on student participation in online discussion forums. Computers & Education, 40(3), 237–253. doi:10.1016/S0360-1315(02)00129-X

Mazzolini, M., & Maddison, S. (2007). When to jump in: The role of the instructor in online discussion forums. Computers & Education, 49(2), 193–213. doi:10.1016/j.compedu.2005.06.011

Moran, M., Seaman, J., & Tinti-Kane, H. (2011). Teaching, learning, and sharing: How today’s higher education faculty use social media. Babson Survey Research Group.
Nandi, D., Hamilton, M., & Harland, J. (2015). What factors impact student-content interaction in fully online courses. International Journal of Modern Education and Computer Science, 7(7), 28–35. doi:10.5815/ijmecs.2015.07.04

Neier, S., & Zayer, L. T. (2015). Students’ perceptions and experiences of social media in higher education. Journal of Marketing Education, 37(3), 133–143. doi:10.1177/0273475315583748

Njoroge, B. (2016). How college instructors use social media for instruction. West Virginia University.

Salter, N. P., & Conneely, M. R. (2015). Structured and unstructured discussion forums as tools for student engagement. Computers in Human Behavior, 46, 18–25. doi:10.1016/j.chb.2014.12.037

Schrader, D. E. (2015). Constructivism and learning in the age of social media: Changing minds and learning communities. New Directions for Teaching and Learning, 2015(144), 23–35. doi:10.1002/tl.20160

Shapiro, N. Z., & Anderson, R. H. (1985). Toward an ethics and etiquette for electronic mail. ERIC.

Sharif, A., & Magrill, B. (2015). Discussion forums in MOOCs. International Journal of Learning, Teaching and Educational Research, 12(1).

Shin, N. (2002). Beyond interaction: The relational construct of ‘Transactional Presence’. Open Learning: The Journal of Open, Distance and e-Learning, 17(2), 121–137. doi:10.1080/02680510220146887

Swan, K., & Shih, L. F. (2005). On the nature and development of social presence in online course discussions. Journal of Asynchronous Learning Networks, 9(3), 115–136.

Tess, P. A. (2013). The role of social media in higher education classes (real and virtual): A literature review. Computers in Human Behavior, 29(5), A60–A68. doi:10.1016/j.chb.2012.12.032

Thomas, M. J. (2002). Learning within incoherent structures: The space of online discussion forums. Journal of Computer Assisted Learning, 18(3), 351–366. doi:10.1046/j.0266-4909.2002.03800.x

Wang, X., Yang, D., Wen, M., Koedinger, K., & Rosé, C. P. (2015). Investigating how student’s cognitive behavior in MOOC discussion forums affect learning gains. International Educational Data Mining Society.

Wankel, C. (2009). Management education using social media. Organizational Management Journal, 6(4), 251–262. doi:10.1057/omj.2009.34

Wong, J.-S., Pursel, B., Divinsky, A., & Jansen, B. J. (2015). An analysis of MOOC discussion forum interactions from the most active users. Paper presented at the International Conference on Social Computing, Behavioral-Cultural Modeling, and Prediction. doi:10.1007/978-3-319-16268-3_58

Zheng, S., Han, K., Rosson, M. B., & Carroll, J. M. (2016). The role of social media in MOOCs: How to use social media to enhance student retention. Paper presented at the Proceedings of the Third (2016) ACM Conference on Learning @ Scale, Edinburgh, Scotland, UK. doi:10.1145/2876034.2876047
KEY TERMS AND DEFINITIONS

1+: Google+ users can show their approval for a post by clicking the 1+ button. Other social media platforms utilize the ‘like’ feature as a way of showing approval. When users 1+ a post, their profile picture will be included in that post’s 1+ section.

Directed Google+ Community: A private social media group specific to the students enrolled in a particular course.

Directed Google+ Community Model (DG+): The model that guides instructors and instructional designers in incorporating a Directed Google+ Community into either a new or an established course. It is a social media alternative to the discussion board tool included in many learning management systems.

Discussion Board: An online forum that allows individuals to post information and receive replies. Learning management systems include discussion boards for use within courses.

Google+: The social media platform established and maintained by Google. Users gain access via their Google account credentials.

Google+ Community: A group of Google+ users with similar interests. Communities can be public, allowing anyone to join, view or participate at will; or private, with viewing and participation restricted to members who have either been invited to join or requested membership.

Learning Management System (LMS) Shell: The online portion of a course housed within the institution’s learning management system. Instructors populate the course shell with content and resources specific to the course.

Post: A single contribution to an online discussion that can include text, links to websites, audio recordings, video recordings, photos and other files.

Social Media: Digital systems that provide users the opportunity to collaborate, share information and generate virtual communities. Examples include online networking sites, blogs, virtual worlds and wikis.

Thread: A group of discussion posts relating to a single topic. Threads simulate a back-and-forth conversation between people. A discussion board houses many threads.

Threaded Discussion Board: A form of discussion board that utilizes threads to organize information.
Chapter 4
Online Professional Development in Academic Service-Learning:
Promoting Community Engagement in Public Education

Geraldine E. Stirtz
University of Nebraska Kearney, USA

DOI: 10.4018/978-1-5225-1851-8.ch004
ABSTRACT

The overall purpose of this qualitative study was to describe how a 3-credit-hour, web-based, graduate-level course in service-learning pedagogy supports the theory that service-learning as a pedagogy can be taught effectively in an online format. Service-learning integrates community service with academic study to enrich learning, teach civic responsibility and strengthen communities. Content analysis of the selected case studies and evaluation of the students’ reflections concludes that the students enrolled in the online class at a Midwestern university were, in fact, able to learn this teaching strategy and then effectively implement it with their own classrooms of students in their local communities. The literature review discusses numerous research articles supporting the value of this teaching strategy of collaborating with community partners in citizenship training for youth, children and young adults.
INTRODUCTION

The National Commission on Service-Learning conducted a major research project on the use of service-learning as an educational tool for public schools. The published report, Learning in Deed (2004), states that this strategy for school-based learning is:
• A method of teaching that combines community service with curriculum-based learning, linked to academic content and standards.
• About students helping to determine and meet real community needs.
• Reciprocal in nature, benefitting both the community and the student.
• An effective way to encourage and foster active citizenship as part of a public education.
• An approach to teaching and learning that can be used in any curriculum area.
• For all ages, even the youngest of children.
Our world is experiencing dramatic changes, and the diversity of our population calls for major changes in how we see and treat our world, in addition to the role we play as citizens in our country. Wade (1997) quotes Pratt from his work, The Civic Imperative: Examining the Need for Civic Education: civic attitudes taught in schools should affirm both individual rights and the common good. One of the goals of civic education should be to reduce ethnocentrism; citizens should develop tolerance, if not appreciation, for diversity and sincere empathy for others. Pratt (1988) described the development of a civic disposition as “...a willingness to act, in behalf of the public good while being attentive to and considerate of the feelings, needs, and attitudes of others. Civic virtue has an internal landscape reflected in the obligation or duty to be fair to others, to show kindness and tact, and above all to render agreeable service to the community” (p. 12). These trends in our society indicate that we need to provide children, youth and adults with a better understanding of the role each must play in helping to strengthen our communities to sustain our democratic society. The ways individuals mistreat others, show a lack of respect for those in authority, and destroy or damage others’ property indicate a high level of careless abandon and irresponsible interaction toward others. Helping young people learn about “caring for the other” in our society is becoming more critical each year. One of the most effective tools currently being used to promote and encourage active citizenship in public schools is the high-impact practice of academic service-learning. Training higher education faculty and/or PK-12 teachers in strategies for using academic service-learning as pedagogy can be delivered in the classroom or online. Both instructional models require careful planning and a solid structure of course requirements and service-based assignments.
What is Service and Academic Service-Learning?

With the growing emphasis on service-learning, researchers in the field began to discuss the true meaning of “service.” High schools were adding a community service requirement to their graduation requirements and were struggling to define service and differentiate it from doing just any kind of volunteering. It became evident that the terminology needed to be clarified: what is considered service, and what distinguishes academic service-learning from other forms of community engagement? For example, penny wars were, and still are, used in many classrooms and considered service-learning projects; however, the youth involved have little sense of the benefit to the individuals who might receive their collection of penny funds. The funds raised were often contributed to individuals living in another community or even a different country, and students gained little sense of the value of this service relative to their role as citizens in their own communities. Though raising funds for a cause is a form of service, and it can be linked to the curriculum, often there is little or no structured reflection about the community need, and the service performed is far removed from the role the students play in the bigger picture of citizenship and caring about others.
Wade (1997) states: service is more than action. Service is also an attitude, a relationship, and a way of being in the world. We can help through what we do. But at the deepest level we help through who we are (cited by Wade from Dass and Gorman, 1985, p. 227). In fact, we may not be truly serving others if we act without compassion, engagement, and a willingness to be “with” rather than just “for” another (p. 63).
Defining Academic Service-Learning

Academic service-learning is not volunteering or doing just any service in the community; it is not charity or doing good deeds for others; and it is not practicums, internships or other experiential learning activities. It is a highly structured activity, planned and coordinated “with” (not for) a community partner to meet community-identified needs. It is a collaborative effort in which the community partner and educator work together to provide for integration into the curriculum being taught; it must be reciprocal, mutually beneficial to the individual serving as well as to the community being served; and it must always include structured, critical reflection before, during and after the service, which helps students understand the need they are meeting with their service while demonstrating the participants’ learning. Its structure allows for assessment of the students’ learning. Jacoby (2003) provides the following definition: “Service-learning is a form of experiential education in which students engage in activities that address human and community needs together with structured opportunities intentionally designed to promote student learning and development. Reflection and reciprocity are key concepts of service-learning” (p. 4). One of the first steps to understanding the use of this teaching strategy is to look at the background of national efforts and the research being conducted. Much of the developing conversation and literature about this teaching strategy resulted from the formation in 1993 of the Alliance for Service-Learning in Education Reform (ASLER). This group of educators saw the need to develop a set of standards to judge programs being created following the passage in Congress of the National and Community Service Trust Act in 1990. The Trust Act provided for an increase in funds allotted to states for public school implementation of service-learning as a teaching strategy; however, while states were responsible for distribution of the funds, there was no specific statewide training in place to help teachers understand how to implement this strategy (Wade, 1997, pp. 19-20).
Teachers' Response

Teachers in the field jumped at the chance to apply for the funds being offered by the state and attempted to use service in the community as a way to enhance students' learning, though they knew little about the specific components and structure required to build a quality service-learning experience for their students. They did not know how to build links to the curriculum being taught in their classrooms that would meet required state standards and truly enhance students' learning. Teachers were using many different community service activities and considering them to be service-learning. It was evident that they needed additional training to develop better strategies for using service with the community as an effective teaching tool. One of the first questions from administrators, teachers, and students is: why should we involve our students in service in the community? There is supporting evidence for the benefits and value of getting students engaged in their communities. Academic service-learning provides a way to make this work while enhancing students' knowledge and understanding of the curriculum being taught in their classroom, in addition to providing service to the community that makes a difference in the lives of others. Students learn to work together, feel valued as contributors to their community, appreciate cultural diversity, develop communication skills, and build relationships with others. This teaching strategy is a major shift for schools and communities and involves building a relationship of trust and collaboration. James Toole (as cited in Furco and Billig, 2002, p. 57) discusses why trust matters to service-learning implementation: Social trust may be particularly important to service-learning because it is a relationship-rich pedagogy. The very reason that service-learning is a particularly powerful tool to foster civil society—that its very existence depends on community participation and partnership—makes its implementation vulnerable in a way not shared by all other education innovations. At the very core of service-learning are robust notions of youth, teacher, and community collaboration...which involves risks and therefore requires increased levels of trust.
Online Course

As discussions about service-learning developed at the state level, plans took shape for an online class in service-learning pedagogy to help meet this need for training in the public schools. The Department of Teacher Education undergraduate program at this University had been implementing this teaching strategy, with an academic service-learning requirement in place, since 1991. This expertise provided direction for the development of an online graduate class targeted to teachers seeking hours toward a Master's Degree. Creating this course provided an opportunity for specific training in service-learning strategies for public school teachers who wished to enhance their students' education with real-life experiences. The online course would require very specific instructions to lead teachers to a more in-depth understanding of this pedagogy. The definition given above identifies the specific components required for academic service-learning and gives the reader a sense of the critical elements of a quality service-learning experience, whether delivered online or in the classroom. Academic service-learning is a teaching strategy used nationally and internationally in schools and other organizations seeking to help students and other individuals learn about their responsibility as citizens in their community and country. It involves engagement "with" the community and focuses on caring for, and meeting the needs of, "others" in our society. Though numerous definitions of the term are used nationally, they essentially incorporate the same basic components noted above.
Funding for Implementation

At this time, the Corporation for National and Community Service was providing government grant funding for states to distribute to teachers implementing service-learning activities with their students. Teachers in our public schools were becoming aware of this funding and were looking for ways to receive grant funds to involve their students in serving the community. They knew very little, however, about how to structure meaningful learning experiences in the community that went beyond "feel good" community service projects. Many of those projects lacked the structure necessary for a quality service-learning experience: they were seldom linked to the curriculum, had no specific learning goals, did not focus on meeting state standards, and generally were projects the teacher and students "thought would be fun," not necessarily meeting community needs or involving any reflective processing for the students. The need for professional development in academic service-learning was evident. Since expertise in this strategy was already in place at this institution, a graduate-level online course in academic service-learning was developed and taught for teachers in the field, filling this need in the public schools while providing a graduate course for students working toward their Master's Degree. Since service-learning is a proven strategy for rehabilitating school dropouts and emotionally troubled youth, the course also served as a graduate course for other disciplines interested in new strategies for working with youth.
Online Class Requirements

The terminal objectives for students enrolled in this course were to answer the following questions:

• What is service-learning? How does the definition of service-learning agree with or differ from my understanding?
• How does service-learning impact my students with regard to John Goodlad's Four Moral Dimensions? (This was the civic renewal model on which the Teacher Education face-to-face class was based.)
• What are the service needs of my community?
• How do I assess the needs in my community?
• How does service-learning impact the relationship between home, school, and community?
• How do I integrate service-learning as a teaching strategy into my classroom curricula?
Assignments

• Build a collaborative partnership with a community partner to develop a service plan, making certain that benefits are reciprocal.
• Implement meaningful service activities that benefit the students as well as the community partners.
• Structure critical reflection for their classroom students that builds the problem-solving and critical-thinking skills needed to challenge those students.
• Create an electronic public presentation (e.g., a PowerPoint) to present to community members, fellow teachers, and school administrators.
Early assignments consisted of readings from the text as well as referrals to numerous web sites, where students analyzed and discussed related articles to heighten their awareness and knowledge of the field, particularly the difference between service-learning and community service projects. Students were required to complete reflections in Blackboard discussion board entries, which were assessed regularly by the instructor. They were also assigned to analyze, synthesize, and discuss their readings with peers as well as with the instructor. Once students had some background knowledge about the subject, they helped create a community needs assessment survey to identify needs that could be developed into potential service-learning projects for their own classrooms.
Determining a Community Need

In addition to studying the background of the development of service-learning pedagogy, students worked as a team to develop the survey each would then use to assess their own community's needs. Students used the survey to discuss community needs with at least three different constituencies within their community in order to determine a need that could fit with their curriculum. This assignment helped them get acquainted with and learn more about their own communities. It was indeed a stretch for some of the students, who stated: "I don't know any of the community members here; I live in another town and commute to teach." Not knowing anyone in the community outside the school personnel was one of the hurdles, and it became an even better reason to push these students out of their comfort zone and get them involved with the people who support the school. Many of the communities in which the teachers were working had very limited populations. Suggested connections for gathering survey information included city officials, law enforcement, churches, individuals attending ballgames, those frequenting the local restaurant or bar, and so on. The students found this assignment a great learning experience in its own right. They began to see the link between the community and the school and to recognize that their teaching positions were supported by real people who cared about the school, the teachers, and the students. The interviews were to result in a community partner willing to work with the teacher to plan and implement a mutually beneficial project that fit the classroom curriculum while helping to meet a need in the community. This became a significant assignment, as the teacher needed to understand this teaching strategy well before he or she could discuss potential project ideas with a community partner.
LITERATURE REVIEW

The following literature review provides support for the value of service-learning pedagogy in the public schools and the need for training of teachers and faculty to understand the structure for using this teaching strategy effectively.
History of Service in America

Historically, service has been a vital force in the development of American life, and Wade (1997) provides a rich history of its development in this country. In the early years of the Nation, communities relied on the dedication and action of fellow citizens to raise a neighbor's roof, provide food for community members, care for the sick and elderly, and assist other individuals who needed help managing their lives. Starting in the 1930s, the government began funding and developing various programs to help those in need: the Civilian Conservation Corps; the GI Bill, which encouraged veterans to seek educational opportunities after World War II; and later the Peace Corps, various AmeriCorps programs, and numerous other corps developed to help meet community needs. During the 1970s the term "service-learning" began to surface, and in the 1980s strong service-learning programs began to emerge in schools and on campuses across the nation, encouraged and made possible by the government funding provided through the National and Community Service Act of 1990. This major funding supported the work of the Corporation for National and Community Service and helped to encourage service-learning programming across the nation. Congress pulled this funding in 2012, and other sources of funding are being sought (Wade, 1997, pp. 23-25).
Service-Learning as Pedagogy in the Public Schools

As previously stated, the government grant funding provided to the states to promote service-learning activities in the public schools triggered numerous efforts to develop community-based projects. Kahne and Westheimer (1996) stated: "With the current interest in and allocation of resources to service-learning comes a growing need to clarify the ideological perspectives that underlie service-learning programs." Without professional development in service-learning pedagogy for the teachers applying for these funds, teachers developed many different community service projects that involved their students in "good deed" activities in the community. Since they had little understanding of the structure of this teaching strategy, public school teachers started food collections, penny wars, clothing drives, community clean-up days, and numerous other "feel good" or charitable service projects with little understanding of why students should participate or how these projects could truly benefit the community. There was virtually no link to the classroom curriculum, nor was reflection structured to enhance student learning. These projects were, and still are, often conducted as competitions between classes or grades, and the purpose becomes self-gratification for the students, who compete to see who gets the prize pizza party following the drive. Kaye (2010, p. 28) responded to the question, "Is competition appropriate as an incentive for service-learning?" Her answer was, "In a word, no: A companion to every winner is a loser." She proceeds to say that engaging kids through knowing the underlying causes of childhood hunger has much more power to stimulate participation than winning a pizza for bringing in the most food. Kids are compassionate. They want to solve problems and improve how we live.
Wade (1997) indicates that community service in schools tends to consist of extracurricular club activities, winter clothing and canned food collections, or one-time visits to a nursing home or soup kitchen. While schools have often included these types of activities through which students help others, most have not made service a part of the curriculum. Service-learning is not an extracurricular activity; it is a pedagogical method in which service projects form the basis of learning opportunities. Furthermore, service-learning is a means for students to develop real-world skills and knowledge they can apply both inside and outside of the classroom (p. 20).
Democracy and Diversity

Students who are exposed to the community and learn to build new relationships learn to accept others, just as those they serve learn to accept people from other cultures. Refer to Case Study #2 for an example from this class, and note the change in dynamics within the nursing home when Hispanic children built relationships with Caucasian nursing home residents. Service can provide intergenerational as well as intercultural experiences for students. Battistoni (1997) states: In addition to assisting students' learning about community, service-learning programs in elementary and secondary schools can be effective teachers about diversity. One of the greatest concerns in our democracy today lies in the increasing divisions between our people—divisions based on race, class, gender, age, and culture. While not a panacea, service-learning projects where students work together in teams and reflect seriously on their service work can be opportunities for students from different backgrounds to join in common causes with adults and/or other young people in the larger community, themselves reflecting a diversity of cultures and interests. Where the school/classroom itself reflects a diversity among the students, community service integrated into the curriculum can be an effective device for understanding one's own identity in relation to community and for engaging with other students from diverse perspectives (p. 151).
Service-Learning and Reciprocity

Reciprocity, one of the critical components of academic service-learning, is defined by Bringle (2004) as "a characteristic of service-learning relationships between the community and the campus in which both invest and benefit, and both serve as teacher and learner" (p. 216). Service-learning experiences are built to meet a community need in partnership with a community agency or program. The activity is not to be developed by the teacher, faculty, or students and taken to the community as if we surely know what the community needs. Service-learning requires that teachers, faculty, and students go to the community and collaboratively determine activities that will help to meet the community-identified need while enhancing student learning. Caution must be taken to avoid doing things "for" or "to" the community rather than "with" it. The community is not to be "used" as a learning laboratory but engaged as a partner in planning and organizing the service activities that will help to meet the determined need. In the early practice of service-learning, in the 1980s and 1990s, the community voice was often overlooked. As a personal example, a number of years ago a college credited with completing a successful community-based research project was invited to a campus to share information about the study with interested faculty. The project was described in detail, and during the question-and-answer session that followed the presentation, one participant asked the research team what changes they might consider for the project in the future. In response, the presenter stated: "the community partners, though pleased with some of the survey results, indicated that they would have preferred to be a part of the planning for the research project from the beginning of the project." Apparently the designers of the research study decided they knew what the community partner needed and were "using" the community as a learning laboratory rather than working in collaboration with the community, an approach that can be offensive to the community. This way of working in the community is inappropriate, yet it is often found in both academic service-learning activities and community-based research. Saltmarsh, Hartley, and Clayton (2009) observe in their research that academic knowledge is valued more than community-based knowledge, and knowledge flows in one direction, from inside the boundaries of the university outward to its place of need and application in the community (p. 8). The community must be an equal partner, involved in determining the need and planning the activities, and should become an integral player in the complete project, benefitting equally with the students. Academic service-learning and community-based research must be reciprocal. Jacoby (2015) refers to Robert Sigmon (1996, p. 4), one of the early leaders of service-learning, who emphasized that "each participant is server and served, caregiver and care acquirer, contributor and contributed to. Learning and teaching in a service-learning arrangement is also a task for each of the partners in the relationship...each of the parties views the other as contributor and beneficiary."
Service-Learning and Critical Reflection

Another of the most significant components of a quality service-learning experience is critical reflection. Faculty and teachers must include critical reflection as part of the total experience, and it must be planned as an integral part of the structure of the experience from its inception. In online courses, critical reflection is as important as in face-to-face classes, and a mechanism must be in place that captures students' thoughts and ideas so the instructor can assess their understanding and learning. The planning and development of the course syllabus must include the goals, objectives, and learning outcomes intended for the students. If students are truly going to learn from a service experience, critical reflection components must be carefully structured into the service-learning course or activity. Critical reflection is the element that produces the learning gained from any service activity and also provides for the assessment of the student's learning.
Ash and Clayton (2009) state that the course objectives, learning goals, and assessment tools "all should be developed in parallel during the design of the reflection activities. Trying to assess a learning goal that has not been articulated as an assessable objective is usually an exercise in frustration." The authors continue: "Critical thinking standards can be used as both a formative guide to improve student reasoning and a summative tool to evaluate its quality in the end. Making visible such integration of reflection and assessment is key in helping students become increasingly aware of and responsible for their own learning processes" (p. 40). Eyler and Giles (1999) discuss critical service-learning in Where's the Learning in Service-Learning? Critical service-learning is about pushing students to explore the assumptions that underlie their own perceptions and the way society is organized. In a critically reflective classroom, students will discuss not only effective ways to provide emergency aid for the poor but also ask, "Why do we need soup kitchens?" Critical reflection is the process that may lead to transformational learning (changes in how students understand the social order) and to action to right social wrongs (p. 198). Online students were involved in continuous reflection as they studied the term and its meaning, explored their community, and considered how to incorporate service-learning into their curriculum. This was a critical component of the course as they reflected on how service could fit with the curriculum. It was evident that if this strategy is to be an instructional model, it must be integrated into the classroom curriculum and meet state standards. One of the reflection assignments required each online student to list at least three state standards that would be met by their academic service-learning project. Students learned how to access the standards on the State Department of Education's web site and match them to their service-learning projects. Additionally, students were referred to the Wisconsin Department of Public Instruction's volume Learning from Experience: A Collection of Service-Learning Projects Linking Academic Standards to Curriculum (2000), which presents teacher-described summaries of outstanding service-learning projects. The reader may refer to the Results section of this article for one online student's reflection, from her final Project Report, on how her class's project was integrated into the curriculum and met numerous Nebraska state standards.
Distinguishing the Terminology

A number of schools are now requiring service hours for graduation, which is not the same as service-learning that has been integrated into the curriculum. Frequently, faculty and teachers are challenged by students asking why a class carries a service-learning requirement. Eyler and Giles (1999) state that simply requiring service hours has a tenuous link to student outcomes, but community service that is well integrated with an academic course of study contributes to personal and interpersonal development, learning and application of knowledge, critical thinking ability, and perspective transformation, all of which are relevant to citizenship participation as well as leadership. Service-learning is often better than academic learning and thus a legitimate requirement in an academic program (p. 182).
Government funding has also led many faculty and teachers at institutions to use the service-learning term for any volunteer or community service project, internship, and the like, without understanding what must be involved and the steps needed to build a quality service-learning experience. They understand little about how to use service in the community as an academic tool for enhancing their students' learning. Looking back at the definition of academic service-learning, this teaching strategy has very specific components that distinguish it from other forms of community engagement. It must meet a community need; it must be linked to academic standards and learning goals tied to the curriculum being taught in the classroom; it must be reciprocal, providing equal benefit to the community and to the students' learning; it must always include guided reflection to help students make the connections between the service and the curriculum; and it actively involves the students in their learning. Since public school teachers needed to understand the structure required for building meaningful academic service-learning experiences for their students, an online course on service-learning pedagogy was created to provide this professional development. The training would strengthen their understanding and provide the structure necessary for building a meaningful experience that includes the critical components of the definition of academic service-learning.
Educational Challenges

In 1999, Thomas Ehrlich, senior scholar at the Carnegie Foundation for the Advancement of Teaching and president emeritus of Indiana University, and Elizabeth Hollander, then executive director of Campus Compact, with input from presidents, chancellors, and other administrators from a number of prestigious universities and colleges on the East Coast, drafted the President's Declaration on the Civic Responsibility of Higher Education. The Campus Compact organization has worked to encourage and increase the practice of academic service-learning and community engagement nationwide, and the Compact continues to grow in institutional membership. This University is currently an active member of the Campus Compact organization. In her book Building Partnerships for Service-Learning, Jacoby (2003, p. 318) discusses the Declaration's challenge (noted below) to higher education institutions, suggesting that "…service-learning proponents would do well to establish service-learning in the engaged campus framework." As presidents of colleges and universities, private and public, large and small, two-year and four-year, we challenge higher education to re-examine its public purposes and its commitments to the democratic ideal. We also challenge higher education to become engaged, through actions and teaching, with its communities. We have a fundamental task to renew our role as agents of our democracy. We believe that the challenge of the next millennium is the renewal of our own democratic life and reassertion of social stewardship. In celebrating the birth of our democracy, we can think of no nobler task than committing ourselves to helping catalyze and lead a national movement to reinvigorate the public purposes and civic mission of higher education. We believe that now and through the next century, our institutions must be vital agents and architects of a flourishing democracy (Campus Compact). Zlotkowski, Longo, and Williams, in Students as Colleagues (2006), discuss the actions of students enrolled in higher education programs, stating:
Students have been demanding that higher education take seriously its public mission to support student civic engagement and not simply focus on professional skills and workforce preparation. The authors refer to students in Oklahoma who in 2003 drafted a Civic Resolution stating: We declare that it is our responsibility to become an engaged generation with the support of our political leaders, educational institutions and society…The mission of our state higher education institutions should be to educate future citizens about their civic as well as professional duties. We urge our institutions to prioritize and implement civic education in the classrooms, in research, and in service to the community (p. 2).
National Efforts

National documents required for reading and reflection in this online course included the Kellogg Foundation's research study Learning in Deed: The Power of Service-Learning for American Schools, a report from the National Commission on Service-Learning, chaired by former astronaut John Glenn and published in 2002. The online students were excited about the findings of this report; however, in their online reflection discussions they kept asking why our State Department of Education was not promoting and providing professional development in this teaching strategy. In this study, students were able to see the very positive results of numerous efforts and the value of this pedagogy. Indeed, it is very challenging to get an educational system to accept new strategies. The movement to use service-learning as a teaching strategy started years ago; it is utilized extensively on the coasts and in Minnesota and Michigan, but it is still slowly inching its way into the Central Midwest states. Shelley Billig (in Casey, et al., 2005) built her research on the Kellogg Foundation's Learning in Deed study and on the Corporation for National and Community Service's effort to promote and encourage the use of this teaching strategy in the public schools. Her study provides a comprehensive listing of the benefits students derive from academically integrated service-learning activities, similar to the Principles of Good Practice (pp. 191-192). To help the online students understand the powerful effects of this teaching strategy, the Principles of Good Practice for Combining Service and Learning were included in a discussion assignment. Jacoby (2003, p. 9) refers to these guiding principles in her book, Building Partnerships. The following listing of the principles was retrieved from the web. The combination of service and learning is powerful. It creates potential benefits beyond what either service or learning can offer separately. The frequent results of the effective interplay of service and learning are that participants:
• Develop a habit of critical reflection on their experiences, enabling them to learn more throughout life,
• Are more curious and motivated to learn,
• Are able to perform better service,
Online Professional Development in Academic Service-Learning
• • • • • • •
Strengthen their ethic of social and civic response, Feel more committed to addressing the underlying problems behind social issues, Understand problems in a more complex way and can imagine alternative solutions, Demonstrate more sensitivity to how decisions are made and how institutional decisions affect people’s lives, Respect other cultures more and are better able to learn about cultural differences, Learn how to work more collaboratively with other people on real problems, Realize that their lives can make a difference.
The emphasis on learning does not mean these Principles are limited in any way to programs connected to schools. They relate to programs and policies based in all settings – community organizations, K-12 schools, colleges and universities, corporations, government agencies, and research and policy organizations. They relate to people of all ages in all walks of life. This collaboration to define needs will insure that service by participants will: 1. not take jobs from the local community, and 2. involve tasks that will otherwise go undone. Retrieved from http://www.johnsonfdn.org/principles. html Changes and new strategies in our educational system are slow to be implemented and materialize in our schools and educational institutions. The Learning in Deed document discusses the paradox of youth engagement. Numerous studies show that large numbers of American students are not fully engaged-intellectually or otherwise-in school. Disengagement also extends to activities, such as voting and keeping up with current events, which are fundamental to our democratic society. Yet there is a paradox here. At the same time that academic and civic disengagement is rampant, primary and secondary school students volunteer in record numbers for community service activities, from tutoring children who need help with schoolwork to working on environmental problems. The volunteer spirit that students express in their spare time represents a valuable resource for transforming education (Learning in Deed, p. 6).
Changing Teaching Styles

Training faculty and teachers to develop courses or curricula that include a service-learning component requires a major shift in current teaching styles and practices. Fink (2003, p. 12) discusses the challenges of the institutional change required and the critical need for administrative support when faculty or teachers consider incorporating service-learning as a course requirement. Though this online course was written for college-level students learning to use this strategy with youth, the components, strategies, outcomes, and challenges are similar to those which college and university faculty would use or experience in higher education.
Fink (2003) discusses the question of whether online learning can be as good as learning in a live classroom. He explains that the "holistic model" of active learning discussed in his work provides a conceptual framework for answering that question, and he identifies the three main components of active learning that would be required in the development of an online course: Access to Information and Ideas, Reflection, and Experiences. He states, "...if each of these can be fulfilled in a satisfactory manner with online learning, we will have a meaningful basis for deciding whether this form of learning has the potential to be a high quality form of learning" (p. 122). He suggests that the Experiences component could be the weak link in online delivery; however, as the course discussed in this article shows, experience can be built into an online course with careful structure that helps students explore and find ways to serve their community. Fink (2003, p. 198) also suggests that faculty in every institution face six critical conditions that must be addressed to provide PK-16 faculty the support needed to change their teaching strategies to include service-learning and to allow students to become somewhat responsible for their own learning. These are:
• Awareness: Of own need to change.
• Encouragement: Others value their expertise.
• Time: To make changes and adjustments to curriculum.
• Resources: Consulting, support groups, workshops, conferences.
• Cooperative Students: Mindful of good teaching strategies.
• Recognition and Reward: Formal recognition & rewards for effective teaching.
In an effort to promote this teaching strategy in higher education institutions, the last of Fink's conditions is a very important consideration. Zlotkowski (1998, p. 11) states: "there remains one final key to success that even many of these programs [discussed in this volume] have not yet managed to make fully their own: formal recognition of service-learning within the promotion and tenure process." Providing students of all ages an opportunity to engage in service that helps meet real community needs will strengthen communities and help students understand the role they must play as citizens if our democratic society is going to thrive and survive. Academic service-learning could serve as a tool with the potential to dramatically change our educational system.
Methodology

At this time, public schools were conducting many community service projects, and it was evident that the students in this online class would need considerable background information about this strategy to help them shift their understanding and distinguish between community service and academic service-learning. They needed to see the close link between service and learning and to understand that, taken separately, these two words do not constitute the teaching strategy called service-learning; without reflection, the service can be meaningless for those involved. The term is hyphenated to depict the two words not as separate but as joined together and closely linked as one concept. Jacoby (2015) states that "the hyphen in service-learning symbolizes reflection and depicts the symbiotic relationship between service and learning" (p. 50).
The purpose of the study was to determine the impact of online professional development training on public school teachers' understanding of, and ability to implement, academic service-learning as a classroom teaching strategy that increases the civic and citizenship skills of their students while helping those students see relevant learning and knowledge construction applied to real-world problems and solutions. The research questions were:

1. Can participants in online training effectively learn how to implement quality academic service-learning projects?
2. Will teachers be able to demonstrate a correct understanding of this strategy by building meaningful relationships with community partners?
DATA COLLECTION, CASE STUDIES

This research project used the case study approach to collect the data. A case study is a form of descriptive research that looks intensively at a small number of accounts. Researchers do not focus on a generalized truth; the emphasis is placed on exploration and description. Case studies are primarily qualitative analyses of an event or process. The induction method of analysis was used; induction is the generalization of a broad conclusion or theoretical proposition from specific facts or findings (Bringle, et al., 2004, p. 14). According to the Texas State Auditor's Office Methodology Manual, the various types of cases include best case, worst case, representative case, and areas of special interest. The design selected is determined by the objective of the case study and how it fits into an overall process or program evaluation. When the case study is designed to illustrate the optimum workings of a process or program, best case events are selected…Case studies can add depth and realism to an audit/evaluation analysis by making it more "real life." They can also demonstrate the impact of processes, policies, or programs in human terms. They complement other methods well. Illustrative case studies gather data to describe and add realism and/or in-depth examples about a program or policy.
Participants

This online class was taught over the course of five years, across several semesters, with an average enrollment of 18-20 students per semester. A purposeful sampling of case studies was selected to provide examples of the best cases for this paper.
Case Studies

These cases were taken from the students' detailed Project Reports and are used to illustrate the accomplishments of four students enrolled in this online class. Academic service-learning fits all ages and all disciplines or subject areas, and these case studies illustrate different levels of academic service-learning projects that resulted from participation in this online course for professional development training. Each case discusses the procedures used in determining the community need; the collaborative efforts with the community partner; the process for engaging classroom students in the project; the implementation of the students' service activities; ongoing reflection; and the outcomes and results of the project.
Case #1 Community Collaboration: Example of HS Industrial Arts Class Project

Mr. Smith, a high school (HS) teacher in a small rural town who enrolled in this online training, was teaching a drafting class. As part of the community needs assessment assignment required for the online class, he discussed the project with his class and then arranged for the class to attend a City Planning Commission meeting to discuss existing community needs. He explained the curriculum he was teaching and the learning goals for his students. Among other possible projects, the city planners mentioned that parking around the soccer fields had gotten out of hand: autos were parking wherever they could with little organization, and accidents were becoming quite routine in the area. After discussing the various needs of their city, the students engaged in classroom discussion and did some problem solving. The high school students decided they would work with the city to help lay out a plan to structure a parking area in the lot, providing more appropriately identified parking spaces. The class divided into teams, with each team responsible for a different role in the development of the total project. One team worked on publicity, writing articles for the local paper to create interest and talk about the student project. Another team was responsible for soliciting the community for the funds and supplies that would be needed to complete the project. Another team worked collaboratively with the City Planners, presenting plans for approval and updating them on the progress of the project. The classroom assignments were to draft the plans and lay out all of the details needed to put the plan into action. Once the students had completed drafting the plans and had collected materials, they organized a citywide work day and encouraged townspeople to join in the actual work of building the parking area in the lot next to the soccer field. The townspeople were very supportive and came to assist the students in completing the project. Following the completion of the parking area, a celebration event was held to thank all of those who collaborated on the project. Students benefited by learning much about the community and how it functions; through their collaboration with the Planning Commission they learned about the management and administration of the city. They learned about publicity and the need for and effects of promoting school projects. They came to understand more about fundraising for specific needs and realized how beneficial their efforts could be in making a difference in their community. The reciprocity of this collaborative experience between the school and the community is evident: the students enhanced their learning of the classroom curriculum while, in return, the community benefited from the students' contributions of service. A win/win for both constituencies.
Case #2 Learning about Diversity: Example of Young Children Changing Lives

Mrs. Jones, a 2nd grade teacher in a small rural town, began her search for needs not being met in her community. After interviewing three different constituencies in the community, she determined that the best fit for her students and her curriculum was to collaborate and build a partnership with one of the nursing homes.
To prepare the students, prior to introducing the children to the nursing home residents, the teacher shared a number of stories with the children to help them learn what to expect when they went to the home and were introduced to their new "friends." Various pieces of literature discussing specific characteristics of this population helped the children gain a sense of what they might experience in a nursing home. The children were given a tour of the home, introduced to the residents, and then paired up with the resident who would be their new "friend" and with whom they would connect on later visits. Early classroom activities involved art projects in which each child drew a picture to take to their new friend in the home. On a special holiday, children helped their new friends decorate cookies. One specific incident during the cookie decorating involved a resident who had not interacted personally with anyone since he was admitted, which concerned the staff. When he and his new buddy from school were decorating their cookies, the two began to laugh, and he started to tease the little boy decorating cookies with him. They each ended up with frosting on their noses and were laughing and enjoying some special time together. Children can help bring new life to others, and the nursing home staff were excited that this gentleman was breaking out of his shell. For a science project, the children planted seeds in little cups and cared for them until they sprouted and grew. When the little plants flowered, the children took them to present to their new friends in the home. They then helped the residents plant the little flowers in the nursing home's flower garden and were able to watch them grow over the following weeks when they returned to the home for other service activities. One weekend, one of the children took his parents to get acquainted with his new friend in the home. The family became frequent visitors to this lonely individual, who had no family close by who could visit, while the child welcomed this relationship and the new "grandparent" in his life. Another activity had the little ones practicing their reading with their new friends in the home. As the children were leaving the home one day to return to their school, one of the residents in a wheelchair motioned the teacher closer and said, "even those little brown children can read." This was a major change in attitude for this resident of a home with all Caucasian residents, located in a community that had recently had a major influx of Hispanic individuals. She was adjusting to people from another culture moving into her community and the schools. This service-learning project built a bridge between those who had been residents for years and resented the changes happening in their community and the newcomers who were changing their community's culture. The children learned about building new relationships, practiced their communication skills, learned about sharing with others in need, saw the appreciation of those they were serving, and enjoyed the difference they could make in the lives of others. The residents were able to see their community from a different perspective and begin to see these children as similar to their own grandchildren and family members.
Case #3 Community Graffiti Issue Identified: Practicing Stewardship in the School and the Community

Mrs. Clark, a teacher of an upper elementary class in a rural Colorado community, interviewed various segments of the community and determined that one of its major concerns was young people's disrespect for others' property. Young people were painting and writing graffiti on public and private buildings throughout the community.
As the class discussed the problem and did some problem solving about ways they could help to eliminate or diffuse it, they came up with several ideas, with a specific emphasis on raising awareness of the effects of graffiti on the community. The class divided into teams and first surveyed the community to identify locations and evidence of graffiti. A plan was made for the students to be involved in removing the graffiti at different sites. The teacher talked with other teachers and the administration about the project, and it became an "across the curriculum" project. Students launched an awareness campaign for the school and the community. In art classes the students made posters to hang in the school and around the community. In English class, students wrote letters to the editors of the local newspapers and learned about publishing their ideas and sharing their voices with the community. In speech, they prepared speeches to share with other classes and gave presentations to community groups. Students became so excited about the progress they were making and the difference it meant to the community that on a day off from school, they gathered at the school and removed gum from the bottoms of desks and chairs. This was a great example of stewardship in action, as students began to see the value of respecting and caring for their environment and others' property and saw that they were making a difference in their community. Students began to see the problem that they and their peers were creating for the community and experienced how challenging it was to rectify the difficulties they had caused. They began to take pride in doing things that helped to improve their school and community and found they enjoyed the feeling of making a difference in the lives of others. These collaborative actions with peers became part of their involvement as citizens in their community.
Case #4 Service-Learning in Early Childhood: Children in a Preschool Build Relationships

Mrs. Brown was a preschool teacher. Initially she struggled through the process of determining a community need that might work for her very young preschool children. Then she began to think about the fact that the community did not know much about the purposes of a preschool and the benefit it provides to young children. She looked at goals she might have for the children's learning and decided that her children needed to learn to work as a team, to build relationships with others, and to use appropriate manners and behaviors when interacting with others. These were all goals she felt she could accomplish with an Art Show project the children could host at their preschool site for the parents and the community. The teacher began the project by reading the children literature and stories especially targeted to manners, building relationships with others, and how to behave when meeting new people. She then discussed the idea for the project with the students and the parents, shared the plan she had in mind, and encouraged discussions about what they might include in the Art Show and whom they might invite to attend. She involved the children as much as possible in planning how the event could be the best it could be. They talked about dressing appropriately for the Show, decorating the space, and setting up for the Art Show. The little ones practiced introducing themselves to each other, learned how to shake hands appropriately, and practiced the art of stepping up to people, welcoming them, extending their hand, and introducing themselves. This learning would help prepare them to serve as hosts on the day of the event. They helped create hand-drawn invitations for parents and grandparents and shared an open invitation to the community through the area newspaper.
They then started creating their special pieces of art for the Art Show. The teacher had set aside a weekend afternoon to host the guests and others in the community attending the children's Art Show in her backyard. When the day arrived, the children came dressed in their nice clothes and posted their pieces of art on the side of the garage. As people came to the Show, the children served as hosts: they welcomed guests, extended their hands and introduced themselves, offered the guests punch and cookies, helped to serve them, and then invited them to view the art pieces, explaining a bit about each work, who created it, and what it depicted. The Show was a very successful event, with a lovely day, excited children, and little ones who shared the experience with pride in what they had accomplished. The community enjoyed this opportunity to get acquainted with the children and to learn more about the learning children experience in a preschool setting. They were most impressed by the manners the little ones displayed as they greeted and hosted their guests at the Art Show.
Results

The teachers enrolled in the online classes over the course of several semesters were mostly very satisfied with the outcomes of their projects and received encouragement from their community partners to continue with future projects. As in any class, there were students who bought into this pedagogy, truly built solid relationships with their community partners, and were inspired to look at many different ways to incorporate academic service-learning with the community in their future teaching. Others found a way to do minimal work and get by with a simple community service project, such as a canned food drive, that did not include all of the components required for a quality project to enhance student learning. Nevertheless, this online class can be very successful with dedicated students; the strategy fits any age of student, works across the curriculum, and can be very effective as an interdisciplinary project. The following reflections from students in the classes are included to give the reader some personal thoughts about these students' successes and their plans for using this strategy in future collaborative teaching with their communities. The first reflection, for example, points to fulfilling specific assignments, integrating the project into the classroom curriculum, and aligning state standards with the service activities; this student also emphasizes the importance of reflection before, during, and after the service experiences.
Students' Reflections from Final Project Reports

Student #1

"The benefits were well worth the time and effort it took to plan and implement the service-learning project. My students gained many things such as confidence, a time to let their creativity run free, a place to use what they had learned in the classroom in a practical situation, and best of all they had a fun time being a part of this community collaboration project. The community of nursing home residents gave just as much as they received. Their smiles, laughter, and creative skills helped my students have the optimal learning experience during their kindergarten year at school. As a teacher and citizen I gained much more than I ever expected to. I knew in my mind starting out that this project would add to what I teach my children, but I really didn't think about how it would affect me.
I have grown during the year in my awareness of the importance of teaching math, reading, writing, science and social studies, but it is also my job as a teacher to teach them how to be good citizens. Children are not born knowing how to do this. I was not aware of how I was lacking in the area of service in my own life. This project showed me just how important service is to our community, state, country and world. One person really does make a difference. It is exciting to think that I had a small part in planting the seed of service in the lives of my seventeen students, and now they will carry on that experience and make a difference in our world. Some background information my students needed to carry out the service-learning activities included general information about the elderly, a tour of the nursing home, books with information about the elderly, and the nursing home activity director's puppet show about people in the nursing home and what to expect on a first visit. Math standards were met with this project: standards 1.1.2, 1.1.4, 1.2.1, 1.3.2, 1.4.1, 1.3.4, 1.1.3, 1.2.2, 1.3.1, and 1.3.2 were all met. Reading standards 1.1.3, 1.1.4, 1.1.5, and 1.1.1 and science standards 1.1.2, 1.1.3, 1.2.1, 1.4.2, and 1.4.1 were also met. When the learning goals and the service project were connected there were wonderful results academically in each of the children's education. The students were able to count and make number connections to how many projects we had to do and then subtracted to find out how many we had remaining. Students learned how to apply problem solving skills and basic math vocabulary to solve basic and simple addition problems. The students were able to compare and estimate groups when we made cookies and holiday ornaments for the nursing home. In science, students learned to identify plants and how they grow. They also learned what things a plant needs to live. In reading, the students made predictions based on the title, cover and illustrations of stories about senior citizens and applied the knowledge of organized print when they drew and wrote in their reflection journals. Students also retold stories to the nursing home residents in sequence. Being able to see the growth in each child was amazing. It wasn't based on who was better than the other; it was based on how each individual child grew socially and academically during the year. Service-learning allowed me to observe skills and knowledge acquisition outside the classroom, and in a different setting where some children perform better. Reflection played a major role in our service-learning project. I learned that without reflection, this process would be greatly hurt and it would then be a community service project, not a service-learning project. We reflected before we started the visits and then after each visit. When the books were near completion we shared them with our nursing home friends. Reflection was used in different methods to reach the learning styles of my children. Discussions allowed students with verbal strengths to excel."
Student #2 “In closing, I will admit that my view of service-learning was not accurate before I took this class. It seemed like something else to add on to what I was already expected to do. Now I see how it fits in and the fact that I am already using some service-learning principles such as using real-world problems and teaching civic responsibility. This teaching method now makes sense to me and can be integrated into my curriculum. Some ideas that I am working on would include the “first impression” organization for the school, a mentoring program for incoming freshmen, student involvement in documenting events for future historical consideration, a program to assist in the registration of voters in the county, assistance on election day, or helping the elderly get to the polling booths or absentee voting. There are others that I also have
observed on the various web sites throughout the year. I am hooked on service-learning; even though my experience and knowledge of the practice is limited, the benefit is beyond the costs.”
Student #3 “Next year I want to expand service-learning so that my main curriculum goals all relate to a real-world application. As I said earlier, the writing activities should be realistic and deal with current events or citizenship if possible. I still want to locate an address to write to service people stationed overseas. This would focus on the audience for whom it is written and an understanding that someone besides me will read it. I believe that kids really don’t understand that they can be judged by their handwriting or written words; they are so used to “doing an assignment” instead of creating an impression. They also need to learn tolerance for differences of opinion and that a society’s success is dependent on the actions of individuals.”
Student #4 “As I reflect on the semester spent learning about service learning and trying to put it into practice in my classroom I have a variety of thoughts. I have to start with how my perceptions of what service learning is and is not have changed. My original thoughts would be more aptly named charitable acts. I can now see how doing things with others for the betterment of community can have positive results. I see the potential from service learning in the school and in the community. When doing this again I will start earlier and plan to make better utilization of the knowledge and talents of my colleagues. As we neared the end of the school year and academics and standards came down to crunch time, some of the ways to expand the learning from this project had to be sacrificed. A service learning project would make an awesome interdisciplinary unit and more academic learning could be occurring with the acts of service.”
Student #5 “The project was rewarding for most involved. It was a small project so for a first time service learning project, it was manageable for the instructor, students, and community members. However, a bigger project would give students more responsibility. They would gain experience with funding, working with more charitable organizations, and possibly learning about legal matters to consider. The project was a success in the eyes of the students. They felt good about their service and felt a genuine contribution to others. They learned the value of others outside of their own peer group. They also learned how to seek out information from community members. On an instructional level, the project was successful to a point. More contrast in the project would have provided for better closure to the activity. In the future a larger project that is well thought out and planned would have even more benefits and meet more curriculum goals and standards.”
Student #6 “Not only are students getting to work collaboratively (which they always love), but they are getting to share their ideas and opinions in order to help their community. Many of them have been quite motivated.
The timeline to get a bulk of the project completed for the online course has caused me a bit more stress than if I was attempting to complete the same project without the added pressure of a graduate requirement. Still, I feel that I have come up with a strong project for my first attempt. I tried to keep it simple but make it meaningful and interesting to the students. The service-learning course and this project, along with the idea brought forth by other classmates, has inspired me to attempt another project next year with my Social Studies class. I have also found myself trying to “sell” the service-learning philosophy to other staff members in my building. Our school has been and will continue to go through a great deal of change and from what I have seen and heard these past few months, service-learning might be something that could fit into our middle school philosophy nicely. I am even volunteering to give a presentation on service-learning as one of our staff development options for next fall. I guess that PowerPoint presentation will come in handy.”
SUMMARY Looking at students’ reflections, one can observe the key concepts that provide evidence of the impact the training in this teaching/learning strategy had on the students enrolled in the course, as well as on the children in their classrooms who were taught and involved in the community to enhance learning. It is rewarding to note that one of the students indicated she is trying to sell colleagues on the philosophy of service-learning. Nothing is more effective in promoting a concept than the encouragement and thoughts of a peer who truly buys into the concept and can speak firsthand of the value and learning. There is a need for Higher Education faculty and PK-12 educators to partner in this effort to provide training and promote service-learning integration in classroom academics, encouraging young people to be engaged citizens as they serve to help meet needs in their communities. It is very important to approach the community with an attitude of building long-lasting partnerships, each learning from the other. The community can be a wonderful partner in welcoming students into their work and helping to teach the children about relationships, responsibilities, and caring about the others in their lives and communities. Caution must be taken to carefully structure service-learning activities to include all of the critical components required for a meaningful learning experience. All elements of a quality service-learning experience need to be in place to ensure that a community-identified need is being met with the service; benefits to the community, as well as to the students, are evident; learning goals connected to the service are integrated into the classroom curriculum; structured reflection is planned and implemented throughout the project; and assessment strategies are determined from the beginning to document the learning. Reviewing the case studies, it is evident that online professional development training can indeed be an effective way to teach service-learning to teachers and provide them the structure required for a quality experience that enhances student learning. Individuals of all ages can truly benefit from real-world application of the academic curriculum being taught in their classrooms. Results of the course were very positive, with quality project outcomes for many of the small rural communities in which teachers were teaching (refer to Narrative Case Studies 1-4). Though oftentimes small communities and their schools are closely linked, service-learning provides an even more powerful impact on those relationships, with the youth taking real pride in developing new ways to make a difference in the lives of others. The writer advocates for academically linked service, which has the potential to change the lives of all of those involved.
The teachers enrolled in the classes were mostly very satisfied with the outcomes of their projects and received encouragement from their community partners to continue with future projects. A few reflections from the online students in the class were included, which provided personal thoughts about the students’ successes and their plans for use of this strategy in their future teaching. One student even suggests that he is “hooked on service-learning,” a powerful statement derived from studying this pedagogy.
IMPLICATIONS FOR THE FUTURE Academic service-learning is a teaching strategy that represents a shift from standard classroom practices and involves considerable effort on the part of the teacher or faculty member to give up some former strategies and allow the students some responsibility for their own learning. Students involved in service-learning are provided opportunities to make links between their academic curriculum and real-world experience. Due to this major shift in teaching style, it takes considerable background information and research to help the online students understand the benefits and value of this pedagogy as it enhances their students’ learning and makes real-world application of their curriculum. In that respect, online delivery of this curriculum may work more effectively than a face-to-face class, as students are able to work more independently with their individual communities without worrying about what their classmates are doing or being challenged on their ideas. The initial online course was created with a plan to have the students complete the course collaboratively with community partners, developing meaningful service in which the students could be involved to meet community needs. In reviewing the rigor of this class, it became apparent that consideration must be given to dividing the class into two semesters of work in future iterations of this course. In a model extended over two semesters, it may be best to use the fall semester to study the history and structure of this pedagogy along with the community survey of need, all critical parts of the preparation phase. The second semester would cover the implementation of the actual service-learning activity in collaboration with their students and the community partnership. Additionally, students could then be better informed to produce a quality PowerPoint presentation on the service-learning strategy to share with peers and administrators in an attempt to promote this teaching strategy in their school district. This would provide more time for the implementation phase of the project and an opportunity for higher levels of success, in addition to future planning for more partnerships with the community. When this was taught as a one-semester class, many of the students were given an incomplete grade after the first semester to allow more time to further develop and implement the actual project with their community partners. This became an issue with a student planning to graduate after the fall semester. Developing a plan to expand the class to cover two semesters would provide for a more in-depth understanding of the pedagogy, resulting in higher-quality and more effective projects in partnership with the community. Another challenge of the course was the graduate students’ misunderstanding of the rigor of an online class. A number of students enrolled in the course thinking an online course was similar to a correspondence course that they could complete in their own time frame, even though the course description indicated that assignments would be provided weekly, with an expectation of completion and online dialogue with the instructor as well as regular responses to other students in the class.
This course was developed and first taught at a time when technology was new enough that some of the teachers had little or no experience in its use. Many of the students were from small rural school districts where technology was almost non-existent within their school. One of the assignments was to create a PowerPoint project about service-learning that was to be shared with their peers and school administration. This assignment was a stretch for these graduate students who, as previously noted, had only basic knowledge of technology and were often looking to their own students to help them create their PowerPoint presentations. Another consideration is the lack of recognition for the practice of service-learning or other community engagement pedagogy in the Tenure and Promotion of faculty at all levels, particularly in higher education. Though nationally there are a number of institutions that do recognize this pedagogy in efforts toward Tenure and Promotion, they are limited. In addition, there is a significant misunderstanding about this pedagogy in schools as well as higher education institutions, which resulted in a challenge in marketing the course.
REFERENCES
Battistoni, R. (1997). Service learning and democratic citizenship. Theory into Practice, 36(3), 150–156. doi:10.1080/00405849709543761
Bringle, R. G., Phillips, M. A., & Hudson, M. (2001). The measure of service-learning: Research scales to assess student experiences. Washington, DC: American Psychological Association.
Case Study Description. (n.d.). Writing@CSU. Retrieved from http://writing.colostate.edu/guides/guide.cfm?guideid=60
Casey, K. M., Davidson, G., Billig, S. H., & Springer, N. C. (2005). Service-learning: Research to transform the field. Greenwich, CT: Information Age Publishing.
Ehrlich, T. (2000). Civic responsibility and higher education. New York, NY: Oryx Press.
Eyler, J., & Giles, D. (1999). Where’s the learning in service-learning? San Francisco, CA: Jossey-Bass.
Fink, L. D. (2003). Creating significant learning experiences: An integrated approach to designing college courses. San Francisco, CA: Jossey-Bass.
Furco, A., & Billig, S. H. (Eds.). (2002). Service-learning: The essence of the pedagogy. Greenwich, CT: Information Age Publishing.
Jacoby, B. (2015). Service-learning essentials: Questions, answers, and lessons learned. San Francisco, CA: Jossey-Bass.
Jacoby, B., & Brown, N. C. (2009). Preparing students for global civic engagement. In B. Jacoby (Ed.), Civic engagement in higher education: Concepts and practices. San Francisco, CA: Jossey-Bass.
Kahne, J., & Westheimer, J. (1996). In the service of what? The politics of service learning. Phi Delta Kappan, 77(9), 592–599.
Kaye, C. (2004). The complete guide to service-learning: Proven, practical ways to engage students in civic responsibility, academic curriculum, and social action. Minneapolis, MN: Free Spirit.
Learning from Experience: A collection of service-learning projects linking academic standards to curriculum. (2000). Madison, WI: Wisconsin Department of Public Instruction.
Learning in Deed. (2002). Battle Creek, MI: W.K. Kellogg Foundation.
Methodology Manual. (n.d.). Texas State Auditor’s Office. Retrieved from http://www.preciousheart.net/chaplaincy/Auditor_Manual/13casesd.pdf
Saltmarsh, J., Hartley, M., & Clayton, P. (2009). Democratic engagement white paper. New England Resource Center for Higher Education.
Wade, R. (Ed.). (1997). Community service-learning: A guide to including service in the public school curriculum. Albany, NY: State University of New York Press.
Zlotkowski, E. (Ed.). (1998). Successful service-learning programs. Bolton, MA: Anker.
Zlotkowski, E., Longo, N., & Williams, J. (2006). Students as colleagues: Expanding the circle of service-learning leadership. Providence, RI: Campus Compact.
Chapter 5
Multimodal Interactive Tools for Online Discussions and Assessment Enilda Romero-Hall University of Tampa, USA Cristiane Rocha Vicentini University of Tampa, USA
ABSTRACT The purpose of this chapter is to discuss the enhancement of asynchronous online discussions and assessment using multimodal interactive tools that allow text, video, and audio posts. The integration of these multimodal interactive tools as well as their affordances could lead to powerful changes in the learning experience of students interacting in asynchronous online environments. Along with providing an overview on asynchronous online discussions, the chapter will include a review of how multimodal interactive tools are used to engage learners in online discussions using text, audio, and video. Additionally, the chapter will describe both the benefits and challenges of asynchronous online discussions with text, audio, and video posting. Furthermore, the chapter will describe how the same multimodal interactive tools can also serve as an assessment method in asynchronous online learning of specialized subject areas.
INTRODUCTION When collaborating in online environments, interaction among users is a key element for positive attitudes towards learning, as well as the success of a course (Ching & Hsu, 2013; Durrington, Berryhill, & Swafford, 2006; Gikandi, Morrow, & Davis, 2011; Smith & Winking-Diaz, 2004). As the popularity of online courses continues to grow in higher education, the asynchronous online discussion forum has become a basic and important part of these courses (De Oliveira & Olesova, 2013; Gao, 2014; Hung & Chou, 2014). An asynchronous online discussion environment may be defined as a human-to-human interaction using text within a learning platform via computer networks, for participants to interact with one another to exchange ideas, insights, and/or personal experiences (Hung & Chou, 2014). DOI: 10.4018/978-1-5225-1851-8.ch005
Unfortunately, asynchronous online discussions are not always able to live up to expectations (Andresen, 2009; Clark, Strudler, & Grove, 2015; Darabi, Arrastia, Nelson, Cornille, & Liang, 2011; Gao, 2014; Gao, Zhang, & Franklin, 2013). Learners in online courses often dread having to comply with a certain number of posts per week and comment on their classmates’ posts (Clark et al., 2015). Also, family and work obligations are among the main limitations on the amount of time online students can devote to their coursework. Additionally, many students encounter feelings of isolation and lack of engagement when participating in asynchronous online discussion (Clark et al., 2015; Darabi et al., 2011). Therefore, it is important that online instructors employ methods that will help students engage in the online learning experience and discussions. In order to help create social presence and engaging asynchronous learning experiences in computer-supported learning environments, it is important to leverage and integrate interactive technology that enhances communication between the learners (Ozyurt & Ozyurt, 2011). This chapter aims to discuss the enhancement of asynchronous online discussions and assessment using multimodal interactive tools that allow text, video, and audio posts. The integration of these multimodal interactive tools as well as their affordances could lead to powerful changes in the learning experience of students interacting in asynchronous online environments. Along with providing an overview on asynchronous online discussions, the chapter will include a review of how multimodal interactive tools are used to engage learners in online discussions using text, audio, and video. Additionally, the chapter will describe both the benefits and challenges of asynchronous online discussions with text, audio, and video posting. Furthermore, the chapter will describe how the same multimodal interactive tools can also serve as an assessment method in asynchronous online learning of specialized subject areas.
ASYNCHRONOUS ONLINE DISCUSSIONS: OVERVIEW Asynchronous online discussion boards both enable students to explicitly express their thoughts in writing and promote communication among teachers and students. In addition, researchers agree that asynchronous online discussion settings support collaborative knowledge construction, critical thinking, and greater realism and motivation to learn (Bassett, 2011; De Oliveira & Olesova, 2013; deNoyelles, Mannheimer Zydney, & Baiyun, 2014; Gao et al., 2013). Participation in asynchronous online discussions is based on willingness; therefore, contributors may be expected to be individuals who are self-motivating and goal-oriented, who acquire from experiences, read, and evaluate other messages in relation to the discussed topic, and who think about the topic (Ozyurt & Ozyurt, 2011). However, limited participation and interaction in asynchronous online discussions appears to be a persistent and widespread problem. To solve this issue, peer facilitation has been proposed as a means to encourage a greater degree of interaction, as well as a model for productive online discussion (Ng, Cheung, & Hew, 2012). Other factors identified for a successful asynchronous discussion are: presence, threaded posts, quality posts, discussion style, conversational style, feedback, and the use of questions (Fear & Erikson-Brown, 2014). Based on the assumption that active participation in asynchronous online discussions is important for learning, Gao et al. (2013) proposed the Productive Online Discussion Model. In this model Gao et al. (2013) suggest that it is essential for participants to embrace the following four dispositions:
1. Discuss to comprehend
2. Discuss to critique
3. Discuss to construct knowledge
4. Discuss to share
These four dispositions address different but interrelated perspectives in learning. The disposition to discuss to comprehend encourages learners to actively engage in cognitive processes such as interpretation, elaboration, and making connections to prior knowledge. The second disposition, discuss to critique, encourages learners to carefully examine other people’s views, and be sensitive and analytical to conflicting views. The third disposition, discuss to construct knowledge, incites learners to actively negotiate meanings, and be ready to reconsider, refine, and sometimes revise their thinking. Last, the disposition to discuss to share incites learners to actively encourage and support each other’s thinking and to share improved understanding based on previous discussions. The Productive Online Discussion Model and its four dispositions serve to address critical perspectives in online discussion.
Benefits One of the main advantages of asynchronous online discussion is the convenience and flexibility of time and place. Students can participate in online discussion anytime and anywhere. Researchers believe that it frees the learners from time and space constraints, providing ample possibilities for communication (Gao et al., 2013; Xie, 2013). In addition, some argue that online discussion potentially allows for more in-depth discussions and more thoughtful learning than is possible in traditional face-to-face settings, as students in face-to-face discussions may not have sufficient time to think thoroughly before they respond (Clark et al., 2015; Gao et al., 2013; Wise, Speer, Marbouti, & Hsiao, 2013). Asynchronous communications allow students nearly unlimited time to compose and send messages and to respond to others. Moreover, asynchronous online discussions are considered an extension of instructional practices that promote dialog, reflection, knowledge construction, and self-assessment (Gao et al., 2013).
Challenges Despite the potential benefits, in practice many students do not meet expectations for participation, and discussion often suffers from low student involvement (Wise et al., 2013). Additionally, the comments that are made frequently do not respond to or build on each other (Wise et al., 2013). This results in shallow and incoherent discussions. Overall, many students interpret discussion participation as being more about “making posts” than engaging in dialog. Typically, threaded discussion forums are the most common and widely used form of asynchronous online discussion in the educational setting (Gao et al., 2013). Despite their popularity, threaded forums might not be the best technology to support the interactive and collaborative processes essential to a conversational model of learning (Gao et al., 2013). First, it is difficult to maintain a focused threaded discussion because participants are more likely to respond to recent posts and less likely to revisit older posts. Additionally, there is a lack of emotional cues and timely feedback in threaded forums (Clark et al., 2015). To enhance the quality of asynchronous online discussions, researchers have investigated how to provide appropriate instructions and guidelines to provoke good conversations (Darabi et al., 2011; deNoyelles et al., 2014), how to enhance participants’ discussion skills, how to enhance moderators’
skills (Darabi et al., 2011; De Oliveira & Olesova, 2013; deNoyelles et al., 2014), and how to enhance the design of threaded discussions (Gao, 2014; Gao et al., 2013). An alternative way is to design new discussion environments to encourage particular learning processes (Clark et al., 2015). One of the many potential opportunities is the use of multimodal interactive tools.
MULTIMODAL ASYNCHRONOUS ONLINE DISCUSSIONS As technology advances, so does the rise of instructional-based technologies that support the use of asynchronous online discussions using multimodal interactive tools. The integration of these multimodal interactive tools provides variety and novelty, as well as the creation of a supportive community of online learners (Borup, West, & Graham, 2013; Clark et al., 2015). Multimodal interactive tools allow online learners to enjoy asynchronous discussions in different formats (text, audio, and video). Many educational developers have attempted to bring the face-to-face interaction benefits into the online world through audio-video technologies. Live streaming video is one of the audio-visual technologies that have been an expanding area of exploration. However, there are still several issues in the implementation of live streaming video in education. Primarily, the idea of live video removes one of the largest benefits of online education, which is the flexibility of the learner (M. Griffiths & Graham, 2010). Additionally, many technical issues can prevent all learners from having a high quality experience. There can be a variety of problems, including difficulty connecting to the Internet, issues with personal computer hardware and software, as well as set-up problems, all of which combined can cause different learners at different times to have a poor learning experience (M. Griffiths & Graham, 2010). Multimodal interactive environments that allow for video, audio, and text communication take advantage of the same Internet infrastructure and personal computer availability as live stream video, but do not suffer from the same problems. Video and audio messages can be recorded before they are posted. Since the video and audio messages are recorded, the time flexibility benefit of online learning is retained as a student or instructor can record a video or audio message at any time (M. Griffiths & Graham, 2010). Similarly, the receiver of the message can view or hear it at any time, as many times as they wish. Additionally, the multimodal interactive discussion also helps convey many of the verbal and non-verbal cues associated with human face-to-face conversation. Overall, multimodal interactive discussions can boost the learning skills and motivation of the learners as they work with multimedia to explore subject areas, express their ideas, and share information, all at their own pace and learning level.
Benefits Asynchronous audio dialogue and feedback can allow for more engagement in discussions, as it helps users better understand the feedback being given through subtle cues present in spoken comments, as opposed to threaded text comments (Ching & Hsu, 2013; Girasoli & Hannafin, 2008). Also, research suggests that it is possible to establish a positive instructor-student relationship using asynchronous video messages to convey immediacy. Additionally, the video messages/posts provide a means of accurately observing student knowledge and student motivation levels (M. Griffiths & Graham, 2010). Therefore, instructors are better able to respond to the individual needs of students. Research has found that multimodal interactive tools used for asynchronous online discussions are more effective at helping create social and teaching presence when compared with text-based discussion platforms. These multimodal
tools make collaboration much easier and more productive as it is possible to know your class members (Cicconi, 2014). Additionally, students have the opportunity to share their voice, literally, and express opinions regarding their knowledge and ability. As learners participate in this collaborative environment, they learn how to interact, communicate, and express themselves (Brunvand & Byrd, 2011). Furthermore, asynchronous online discussions afford learners more chances for reflection than in typical face-to-face environments, as well as increased opportunities for more individualized feedback from instructors and between learners (Russel, Elton, Swinglehurst, & Greenhalgh, 2006). Current trends in education emphasize the importance of meeting the needs of all learners. Multimodal interactive tools can be used to promote student engagement, motivation, and quality of the learning experience for students with disabilities (Brunvand & Byrd, 2011). Important ingredients for learning success in school include the ability to engage and sustain attention, participate actively, maintain high levels of motivation, and complete assigned tasks. Yet many at-risk students and students with disabilities experience difficulties in these areas. Multimodal interactive tools offer a number of affordances that help address some of the learning needs of students with learning disabilities. The interfaces and features offered by these tools are well suited for promoting student engagement and motivation, as well as for helping students develop as independent learners (Brunvand & Byrd, 2011). Multimodal interactive tools enable instructors to capitalize on student learning strengths and preferred learning modalities by encouraging participation in the learning process. Multimedia elements such as video, audio, and images can be used to stimulate engagement and promote meaningful exploration of content.
Challenges The integration of these types of tools can serve to engage learners; however, it can also present certain challenges. Learners might find it difficult to interact with a new tool that breaks away from the traditional textual asynchronous online discussion format. Possible explanations for their difficulty include the learning curve involved in understanding the parameters of the tool, or the novelty of the tool, which removes learners from their comfort zone. Additionally, some learners may not feel comfortable participating in multimodal formats such as audio or video (M. Griffiths & Graham, 2010). Moreover, it may take learners considerable time to record a message that is deemed adequate for the online discussion. Video and/or audio recording can at times be a time-consuming task, even if it is a short video and/or audio message.
Prior Research Several researchers have investigated the use of multimodal interactive tools for online discussions (Beach & O’Brien, 2015; Borup, West, & Graham, 2012; Borup et al., 2013; Clark et al., 2015; M. Griffiths & Graham, 2010; M. E. Griffiths & Graham, 2009; Hew & Cheung, 2012a, 2012b). Some have focused specifically on the use of asynchronous voice discussions. Hew and Cheung (2012b) investigated the use of a multimodal tool, Wimba Voice Board, to support asynchronous voice discussions. In this investigation, one group used text-based discussions while the other group used asynchronous voice discussions. The results suggested that there was no significant difference in the students’ degree of participation in the two classes. However, based on the students’ reflection data, Hew and Cheung (2012b) suggest that the asynchronous voice discussion had several advantages, including: a better understanding of the
messages posted, a choice for those students who preferred speaking to writing, promotion of originality, and fostering a sense of community. Similarly, Hew and Cheung (2012a) conducted two case studies to examine the use of audio versus text-based asynchronous online discussions. The results for the first case study reveal that audio-based discussions have six perceived affordances compared to their text-based asynchronous counterpart. These affordances are as follows: they are useful for students with poor typing skills or audio learners, they are a better tool to assess how speech is delivered, their spontaneity ensures originality of ideas, they help confirm the identity of the student, they allow for more realistic and lively participation and also for more expression of emotion. However, regardless of the affordances, more than half of the students reported that they preferred to use text-based discussions. In the second case study, the students also expressed that they preferred the text-based asynchronous online discussions. The reasoning behind their preference towards text-based discussions included: an increase in the time allowance to structure/organize responses, the convenience/ease of use with text-based discussion, typed content/words facilitating better learning/understanding, and issues with self-consciousness with how one sounds (accents, pronunciation, etc.). Another research study related to asynchronous voice discussions was conducted by Beach and O’Brien (2015), who utilized the VoiceThread app with sixth graders to support their use of the disciplinary literacy of science inquiry. As part of this investigation, Beach and O’Brien (2015) analyzed the annotations of students as they engaged in causal reasoning regarding the relationships between photosynthesis and carbon dioxide emissions. Based on their analysis, the researchers found several affordances of the multimodal tool: collaboration, interactivity, and connectivity. These affordances served to support the learners as they completed their assignments. Additionally, the results indicate that the students enjoyed making annotations in both audio and text format. Beach and O’Brien (2015) suggested that instructors should take advantage of these affordances to assess students’ ability to exploit the multimodality of the application.
However, the students’ social presence was not as high as instructors’ social presence due to limitations with the communication tools and the lack of threaded video conversations. Using a more qualitative approach, Borup et al. (2013) conducted a case study in which four students were interviewed. Each student was viewed as a separate case and analyzed independently of the others. The four students were characterized as follows: the extrovert, the introvert, the English language learner, and the low self-regulation learner. The results showed that each student perceived benefits and challenges to the use of asynchronous online video discussions differently. Based on the data collected, the student identified as an extrovert valued the experience and enjoyed making her own videos. However, this student saw little value in viewing the video comments of peers. The student identified as an introvert valued the flexibility of the asynchronous video discussions. However, the student did spend
a significant amount of time perfecting her videos for sharing. The English language learner valued the video comments made by her classmates, but was unable to fully take advantage of the experience because of a lack of technology skills and communication issues. Lastly, the low self-regulation learner was motivated by the increased social presence through video of the instructor of the course. Overall, the findings from this qualitative research support the notion that multimodal tools can provide different experiences for students with various needs. Another investigation on the use of asynchronous video discussions was conducted by Clark et al. (2015). In this investigation, the researchers examined whether asynchronous video posts would create a higher level of teaching and social presence within an online course compared with the university’s current text-based discussion platforms. An analysis of the survey administered and interviews with the students confirmed that the perception of social and teaching presence had increased significantly when the students used the video-enabled discussion site. The participants discussed important factors such as connectedness and collaboration as benefits of using the video-enabled discussion. Even though the use of multimodal interactive tools for asynchronous online discussions is not a new, state-of-the-art technology, it is a current technique that is being introduced to the educational world. By examining the many contributions and affordances of these Web 2.0 tools, these research efforts help provide a comprehensive view of the benefits and challenges that multimodal interactive tools present when implemented in the online classroom.
MULTIMODAL ASYNCHRONOUS DISCUSSION TOOLS VoiceThread VoiceThread is considered the pioneer of free, collaborative multimodal interactive tools (Cicconi, 2014; Dail & Giles, 2012). VoiceThread allows learners to collaborate asynchronously using voice, text, and/or video (Ball, 2012; Wood, Stover, & Kissel, 2013). As explained by Lee (2014), this Web 2.0 tool can also be used to create “digital stories” and share them with other users who, in turn, can provide feedback using their choice of media (voice, text, video, or all of them combined). VoiceThread is offered as a free online tool, but if the user would like to create more than three documents, called ‘VoiceThreads,’ payment for a Pro account is required (Burden & Atkinson, 2008). Since this tool is cloud-based, it allows for easy access on platforms such as computers, tablets, and cell phones (Ball, 2012). VoiceThread offers users the ability to interact with others in a more authentic environment than just text-based discussion forums (Koricich, 2013). Users can post their material online and receive comments in formats such as texts, video or audio messages, or texts along with audio messages (see Figure 1). The material created can be made public or private, and the latter makes the VoiceThread only available to users who have been invited to collaborate in that particular document. The affordances provided by VoiceThread enhance communication between instructor and students, as well as the student-student exchange of information (Nakagawa, 2001). Using VoiceThread can benefit students taking online classes at a distance by allowing them to post their work and/or comments in the format of their choice, which not only gives them autonomy in their learning process, but also generates opportunities for engagement with the material (Brunvand & Byrd, 2011; Wood et al., 2013). An entire asynchronous discussion containing voice, video, and text can take place under the same thread, which makes for easy and immediate access to information. Users also have the ability to doodle on a
Figure 1. VoiceThread Discussion
video to highlight important information and annotate the slides on a presentation while commenting on it. These specific features enable a number of pedagogical affordances, such as the opportunity for formative feedback on material prior to formal assessment by the instructor, and the ability for comment moderation, which enables the instructor to decide what comments to make available for learners to see (Burden & Atkinson, 2008). This feature ensures that the material posted follows netiquette and contains relevant information. VoiceThread has been used in the math, science, social studies, and language arts (Dail & Giles, 2012) curriculum in K-12 in a variety of tasks, which vary from explaining the steps involving a word problem in mathematics, to analyzing historical documents in history, to predicting the outcome of an experiment in a science class (Wood et al., 2013). A study by Ching and Hsu (2013) revealed that the majority of students (85%) asked to provide peer feedback using VoiceThread really enjoyed the process and felt that the voice element made it closer to reality. Through the use of audio and video comments, users felt more connected to each other when compared to most courses being offered online. The use of VoiceThread as a tool for foreign language instruction has also been proven effective. Lee (2014) conducted a study in which students majoring in Spanish at a large public university in the United States were asked to use VoiceThread to submit a variety of assignments, including videos in which they would use Spanish to share news stories of their choice. As part of the requirements, students posted weekly entries and commented on the shared current events added by their classmates. Results revealed that VoiceThread allowed learners to improve their proficiency and pronunciation of Spanish due to the frequent use of voice recordings and comments created to fulfill the requirements of the assignments. In addition, students reported enjoying the experience and being involved in multimodal discussions. As explained by Lee (2014), the collaboration and exchange of ideas promoted a “stimulating learning environment” which empowered students and allowed for an active and engaging learning experience (p. 346).
Blackboard Threaded Discussion Blackboard is a web-based course development platform used by many universities in North America. It offers a secure environment in which to post information, documents, assignments, and announcements (Servonsky, Daniels, & Davis, 2005). It allows for asynchronous delayed activities such as discussion boards and a digital drop box that can be used for student-to-instructor and instructor-to-student document transfer (Ajayi, 2009; Servonsky et al., 2005). The threaded discussion is one of Blackboard’s main features. Normally, the threaded discussion is in a text format. In these instances, the learners are asked to post text responses to an initial question and/or comment posted by the instructor. The asynchronous online discussion continues as all the members of a course post their responses and comment on their classmates’ posts. A threaded discussion can have hundreds of posts. Ideally, an instructor and/or moderator serve to guide the discussion in an asynchronous format. With Blackboard, just like with other tools that support asynchronous group discussions, it is not difficult to set up the learning environment for the students. What is more difficult is to use the learning environment in a way that will enhance the learning experience of the students (Kear, 2004). Instructors may be tempted to simply open a discussion forum and hope learning occurs (Kear, 2004). Currently, the Blackboard Learning Management System has been updated to allow users the ability to share text, videos, and/or audio messages as content in the threaded discussion. Learners are no longer limited to text as the type of content that can be shared via Blackboard. Instead, they have the possibility of using other modalities to share their discussion messages with classmates. The format of the discussion is still using threads; however, the affordances provided by the new multimodal asynchronous discussion board allow learners to use the technology in the media format of their choice, maximizing the opportunities for engagement with the material being posted online. To share audio files, learners can simply record the audio in a different application, save it as an audio file, and upload it to the threaded discussion. Similarly, to share video files, learners can create the video in a different application and upload the file to the Blackboard Threaded Discussion. Alternatively, learners can record the video directly with their webcams in the Blackboard environment and share the recording once it is complete. Although current versions of the Blackboard Threaded Discussion provide these alternatives in their learning environment, the researchers were unable to find investigations discussing the benefits, challenges, and affordances of using the Blackboard threaded discussion in a multimodal format.
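Where an instructor or learner wants to prepare such a recording outside the LMS before uploading it, a small script can handle the format conversion. The following is only an illustrative sketch, not part of Blackboard’s own tooling: it assumes the third-party pydub library (which requires FFmpeg) and uses hypothetical file names.

# Illustrative sketch: convert a raw recording exported from a recording
# application into a smaller MP3 before uploading it to the threaded discussion.
# Assumes pydub is installed and FFmpeg is available on the system PATH;
# the file names are hypothetical.
from pydub import AudioSegment

recording = AudioSegment.from_file("week3_reflection.wav")  # load the raw recording
recording = recording.set_channels(1)                       # mono keeps the upload small
recording.export("week3_reflection.mp3", format="mp3", bitrate="96k")
print("Saved week3_reflection.mp3 for upload to the discussion board.")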
Wimba Voice Board Wimba Voice is a web-based voice solution that facilitates and promotes vocal instruction, collaboration, coaching, and assessment (Wimba, 2009). Wimba Voice incorporates the use of threaded voice boards, voice-enabled email, and embedded voice within course pages (Wimba, 2009). In the Wimba Voice Board, users are allowed to record and post audio messages. Similar to text-based discussion tools, this audio-based asynchronous discussion is independent of time and geographical location (Hew & Cheung, 2012a, 2012b). In Wimba Voice Board, students can simply speak a question or comment into a microphone and record it as an audio clip on a computer. The audio clips with messages are archived. The audio messages are arranged in threads, similar to threaded text discussion forums. Students have the option of typing a text description appended to the audio clip (Hew & Cheung, 2012a, 2012b). The text description can be entered in a small text box located at the bottom of the Wimba Voice Board
screen. The discussion posts can be exported and downloaded in various audio formats such as MP3, WAV, and Speed audio.
MULTIMODAL ASYNCHRONOUS ONLINE ASSESSMENT Assessment can facilitate the acquisition of knowledge when carried out in an online environment due to its ability to provide opportunities for collaboration and to facilitate feedback from instructors to learners (Russel et al., 2006). As explained by McLoughlin and Luca (2001), online assessment can afford diversity in evaluation methods, where the instructor can gauge learners’ progress through the use of real-world situations in which they interact and collaborate with others through written and oral channels of communication. When it comes to effective learning and assessment, the structure of online discussions is a key element in ensuring that users respond effectively to threads and avoid redundancy. In their research, Vonderwell, Liang, and Alderman (2007) reported that students believed online discussions should contain diverse assessment methods instead of repetitive threads. Online formative assessment needs to be carried out in such a way that provides equitable feedback to students’ learning progress, assessing students with different learning styles and needs, in order to improve their overall academic performance (Gaytan & McEwen, 2007; Gikandi et al., 2011). The opportunity to learn from other students is another important affordance of online discussions. The asynchronous aspect allows for all participants to post their responses and explain their choices (Vonderwell et al., 2007), which creates more opportunities for instructor feedback and overall learning. Robles and Braathen (2002) explain that the main goal of online assessment must be to ascertain the degree to which learning objectives are achieved, as well as how much learners make use of the material being taught. Moreover, students need to be aware of how the material they have been studying can be utilized in a variety of real-world situations; therefore, instructors need to make sure to choose effective methods to allow for authentic online assessment. According to Sadler (1989), the quality of online feedback is an essential part of the assessment. While providing feedback, the instructor will not only report the learners’ grades, but also give them enough information to understand how the material was evaluated, based on a given standard, and what needs to be done from that point on. As explained by Graff (2003), online formative and summative assessment allows learners the ability to decide when and where to take their tests. Additionally, after taking their tests, learners can have access to instant feedback on their performance, which allows them to better understand the material. In their research, McLoughlin and Luca (2001) had learners interact with each other to solve problems presented to them. Through problem solving, learners were able to perform specific tasks, which measured their ability for critical thinking, as well as applying what they learned to the existing situations. This collaborative environment resulted in increased opportunities for learner interaction, which added learner engagement and participation in the assessment process. Self-assessment is another key element of the overall online assessment. Robles and Braathen (2002) suggest that instructors make use of self-assessment to allow learners to measure how much knowledge they have acquired throughout the course. Such form of assessment should take place at different times during the course, so that the instructors and learners can know how much learning has occurred. 
Another important component of online formative assessment is peer feedback. Not only is it the assessment method in which learners collaborate the most, it also affords them the opportunity to expand their understanding of
the subject during the evaluation of their classmates’ work. Moreover, it leads to significant learning improvement (Black & Wiliam, 1998; Sadler, 1989). The same multimodal interactive learning tools that allow asynchronous online discussion using text, audio, and video can also be used for asynchronous online assessment. In this type of asynchronous online assessment, learners are given tasks in which their response is required in text, audio, or video (Lee, 2014; Swan, 2014). Using these multimodal innovative tools as assessment can boost the learning skills and motivation of these students as they explore subject areas, express their ideas, and share information, while working at their own pace and learning level. Both formative and summative asynchronous assessment can be carried out with the use of multimodal interactive learning tools. Online assessment tasks can help learners reflect on their progress to understand their thought processes and fine tune their work over the duration of a course (Black & Wiliam, 1998; Sadler, 1989). Research by Olofsson, Lindberg, and Stödberg (2011) revealed that tools that allow sharing video and blogging such as VoiceThread created opportunities for “meaning-making processes,” which opened doors for higher quality of communication and improved reflection (p. 51).
Benefits Student reports demonstrate that the feedback received from instructors in video messages/post via multimodal interactive tools is motivating and helps to build close and trusting relationships. Unlike the traditional face-to-face settings, online learning environments allow for learners to have access to all the discussions that took place throughout the course, and assess their areas of weakness as well as their strengths regarding the instructional content (Vonderwell et al., 2007). The use of online tools can greatly facilitate the quality of feedback, since it can directly impact how fast an assessment is provided to students. Even though asynchronous video feedback does not provide the instantaneous response that can be obtained in the face-to-face environment, its ability to show non-verbal cues makes it more appealing than asynchronous text communication (Borup, West, & Thomas, 2015). According to Stover, Kissel, Wood, and Putman (2015), tools such as VoiceThread help create a ‘dialogic experience between writer and viewer’, and give users the ability to provide an additional level of significance to their posts. The opportunity to explain information related to a particular slide by incorporating text, audio, and/or video comments to a presentation allows learners to communicate in a more authentic asynchronous online environment than regular threaded discussion forums. By using tools such as VoiceThread in K-12 education, Stover et al. reported great benefits, including the intensification of student engagement, higher student confidence, participation of authentic audiences, and overall improved experiences both for teachers and learners.
Challenges Stover et al. (2015) discuss barriers experienced by VoiceThread users, which include the unfamiliarity with the tool and the time needed for its set-up and implementation. Additionally, the evaluation of VoiceThread posts can be time-consuming, since it involves audio, video, and text components (Koricich, 2013). Even though video feedback can be provided in a more timely manner, research by Borup et al. (2015) revealed that students favor observations given through text, due to the ease of access to the information, as well as its more succinct content. Students also expressed a preference for text feedback because
of the possibility of skimming through the suggestions given by the instructor without having to wait for the information on the video. Additionally, text feedback also afforded learners the opportunity to review the materials in various locations, which can only occur with video feedback when headphones are available.
VOICETHREAD AS A MULTIMODAL ASYNCHRONOUS ASSESSMENT TOOL VoiceThread is a great tool for formative and summative asynchronous assessment. Learners can use this platform to present their work, including PowerPoint presentations, video recordings (using their webcams), video clips, Word, Excel, and PDF files, to which the instructor and classmates can give feedback. Through Learning Tools Interoperability (LTI) developed by the IMS Global Learning Consortium (“How Does IMS Enable Better Learning Experiences?,” 2016), VoiceThread can be integrated into most Learning Management Systems (LMS), such as Blackboard, Canvas, and Moodle, among others. Users can access VoiceThread directly from the LMS and the information will be transferred to VoiceThread through LTI. The use of VoiceThread as a multimodal interactive tool for asynchronous online assessment not only allows learners to self-assess, by analyzing whether they can carry out the tasks presented to them, but also boosts the learning skills and motivation of students, as it gives them a choice of format to post their tasks online and interact with other learners. Students can choose to create their individual VoiceThreads by using a variety of images, documents, and/or videos combined and arranged into a slideshow, which will be posted online for public or private viewing, and receive comments from the instructor and classmates. The format of the comments is also optional. There can be voice comments, which are recorded with the use of a microphone and/or a webcam, or text-only comments, or a combination of both. Users can choose to upload their pictures when creating their VoiceThread profiles, which allows other learners with access to that VoiceThread to see who is making the comments, creating an environment similar to one of a face-to-face classroom. An advantage presented by the use of VoiceThread is the option users have to choose the exact location in which to add comments on the online presentations (Ching & Hsu, 2013). The use of VoiceThread allows users time to reflect upon the tasks assigned to them, and craft their responses using their preferred means of communication (images, videos, voice, text, or a combination of them). This Web 2.0 tool affords a collaborative, engaging environment in which activities are centered on learners (Pecot-Hebert, 2012). In their research, Olofsson et al. (2011) used VoiceThread for video-blogging to promote formative peer assessment and encourage reflective thinking. Participants in the study were asked to create video blogs, upload them to VoiceThread, and provide peer feedback on their peers’ posts. Findings revealed that the more participants felt comfortable with the Web 2.0 tool and this particular type of interaction with their peers, the more they were willing to be open to peer feedback and work actively to provide others with reflective formative assessment. One way VoiceThread can be implemented is in the assessment of English to Speakers of Other Languages (ESOL) learners’ oral and written production. As a tool for oral assessment, the instructor can post an assignment containing pictures and questions, to which learners can respond by posting their comments as responses to the original stimulus. As learners post their responses, the instructor can assess their speaking ability (including grammar accuracy and pronunciation) and provide feedback to each submission.
One of the benefits of using VoiceThread is that it allows instructors to provide oral feedback, either via a voice or a video recording. This form of feedback increases the interaction between instructor and learners, as subtle cues in intonation and emphasis can be perceived much more easily through audio and/or video than if the feedback had been given in written form. On the other hand, the instructor could decide to focus solely on the assessment of ESOL learners' written production. In doing so, the instructor would evaluate the learners' ability to express their thoughts in written form and use VoiceThread as a means to provide specific feedback on written work via a variety of formats, from using the doodling tool to circle or underline specific parts of the learners' work that contain errors, to using the microphone to comment on the learners' progress or ask specific clarification questions. Overall, the use of VoiceThread can benefit both ESOL instructors and learners, owing to its ability to bring online asynchronous assessment closer to what typically happens during face-to-face instruction.
EVALUATION CASE

This evaluation effort investigates the use of a multimodal interactive asynchronous online discussion tool to provide variety and novelty as well as to support the creation of a supportive community of online learners.
Participants

Participants in this evaluation were graduate students at a private, midsized university in the southern United States. All students were 18 years or older. The participants were full-time and part-time graduate students in a master's-level instructional design and technology program. The graduate students in the program have different educational and career backgrounds (i.e., K-12, corporate, military, science, programming, higher education, etc.). The graduate students who participated in this investigation were a diverse population that included several international students and multiple racial groups. Similarly, the alumni who participated in the evaluation included both international and domestic individuals. Additionally, there was an equal representation of genders as well as a range of age groups. There were a total of 23 participants in the evaluation, and there were no special participant inclusion or exclusion criteria. For this evaluation, the participation of the learners consisted of two different types of asynchronous online activities (reflection statements and scenario responses) completed throughout an academic semester in two different courses.
Setting

In two courses of the instructional design and technology program, the instructor incorporated a multimodal asynchronous discussion tool to replace the text-based threaded discussions. Within the multimodal tool, the instructor created two groups (one group for each course). Then, after the groups were created, login information was emailed to all the students in the two courses. Lastly, information about the assignments the graduate students needed to complete within the multimodal environment was included in the syllabus and shared with the learners on the first day of class.
Reflection Statements

In course A, the students completed reflection statements related to the course readings during specific weeks throughout the semester. In total, learners were required to post four reflection statements for the semester.
Scenario Responses

In course B, the students posted scenario responses. The scenarios related to the program evaluation process as part of instructional design practice. The responses were posted during specific weeks throughout the semester. In total, the learners were asked to post responses to four scenarios.

The specific dates on which the students had to post the reflection statements or the scenario responses were stated in the calendar for both courses. To complete the assignment, learners were asked to post their asynchronous online assignment to their respective groups in the multimodal interactive asynchronous discussion tool (VoiceThread). Learners were also given specific instructions on the length and format of their posts. Moreover, learners were told to limit their responses to two pages, to include peer-reviewed citations to support their statements, and to upload their work as a PDF file. Lastly, learners were asked to comment on two of their classmates' original posts. Learners could comment using the audio, video, or text options available in the multimodal interactive discussion tool (see Figure 2).
Evaluation Procedures

Students in two different instructional design courses (i.e., Trends and Issues in Instructional Design and Technology and Introduction to Program Evaluation) were invited to participate in this evaluation via an electronic questionnaire sent by email. The researcher sent three email invitations to the students after the conclusion of the semester. The electronic questionnaire was sent after the conclusion of the semester in order to get honest responses from the students and to prevent their feedback from affecting their grades for the courses. Students consented to participate in the electronic questionnaire by clicking on a link. The electronic questionnaire included several open-ended questions related to the students' learning experience and the implementation and use of the multimodal tool for asynchronous online discussion throughout the semester. The participants were asked to share any positive or negative comments about their multimodal asynchronous online experience while completing the assignment (either the reflection statements or the scenario responses). These questions helped collect qualitative data related to the experience of the participants. The survey was anonymous; however, after completing the survey, the participants were given the option to submit their contact information to participate in the drawing of a gift card.

Figure 2. Multimodal Discussion in VoiceThread
Evaluation Results

The comments provided by the learners about the learning experience using the multimodal tool yielded very interesting results. Here are the main themes and narratives that emerged from the qualitative data:
Benefit of the Learning Experience

Learners shared that one of the main benefits of the learning experience was the affordances provided by having asynchronous online discussions to debate the topics or case studies covered in the courses, compared to face-to-face classroom discussion. One student mentioned: "I [the student] participated more in the classroom discussion that occurred online rather than in the class sessions." Similarly, another student highlighted the benefits of the asynchronous online discussions: "overall, I thought it was extremely beneficial, especially since I don't feel as anxious about participating on VoiceThread as I do in a traditional classroom setting." One additional student referred to the extra time that asynchronous online discussion provided for reflection on the topics and/or case studies: "I loved the questions and responses students would ask or answer because it made me think of things from a different perspective."

An additional benefit of the learning experience was the exposure to the VoiceThread application as a new educational technology tool. Several students mentioned that "the VoiceThread assignment was relevant to the ID&T students because these are the sort of tools that we [the students] need to be familiar with in the future." Another participant mentioned, "If it was not for the instructor facilitating the assignments via VT [VoiceThread], I [the student] would not have known that it existed." Lastly, one student mentioned that they "would keep using VoiceThread as it was used in the class as I [the student] do not think that there is anything else comparable out there that would work any better."
Confusion and Difficulty

Another theme identified in the qualitative data was feedback from the students related to the difficulty of using VoiceThread. For example, at least three students discussed how it was difficult to navigate and scroll through the posts made by their classmates. One student mentioned in his comments that "the most difficult part of using VoiceThread was figuring out how to navigate through the posts." Some of the students did not think the tool was user friendly. As explained by another student, "the post would scroll the wrong way unless you had your mouse positioned just right on the post."
Other areas of confusion with the VoiceThread application included the user's ability to copy and paste, difficulty zooming in and out of a post, difficult navigation without a mouse, and constantly having to click on or off to read posts and comments. One student mentioned that, due to the navigation issues, he/she wished VoiceThread would incorporate "the tree-list functionality of the backboard [Blackboard] discussion boards which make it easier to follow discussions and know who is talking to whom."
Suggestions for Improvement

The last theme identified in the qualitative data consists of the learners' suggestions for improving the learning experience. Some of the suggestions related to the implementation of VoiceThread. One student recommended that "the professor should give the students some guidance on VoiceThread instead of using [sharing] the tutorial provided by the application." Another student recommended that the initial post [made by the learners] as part of their assignment be a video post instead of a PDF file, as this would help make the discussion more interesting. Lastly, two learners suggested eliminating the implementation of VoiceThread completely and keeping the Blackboard threaded discussion format. As explained by one of the learners, "the Blackboard discussion worked better than the VoiceThread mechanism."
Evaluation Discussion

The investigation aimed to develop a better understanding of different learners' experiences while participating in asynchronous online discussions using a multimodal interactive (audio/video) discussion tool. The node analysis and coding identified three distinct themes in the qualitative data, reflecting different incidents during the learners' participation in the asynchronous online discussions: [1] benefits of the learning experience, [2] confusion and difficulties with the audio/video, and [3] suggestions for improvement of the learning experience.

Based on the qualitative analysis, it can be concluded from this implementation and evaluation case that learners enjoy participating in asynchronous online discussions to complement their face-to-face class discussions because these discussions provide an outlet for students who require additional thinking and preparation time as well as for those who are shy or anxious when it comes to class participation. This learning experience is similar to what other researchers on asynchronous online discussions have described in the literature. For example, Xie (2013) and Loncar, Barrett, and Liu (2014) described how the online discussion environment expands learning opportunities beyond the classroom, without limits of time and space, while also promoting social and cultural connections among the learners. Additionally, the use of the interactive (audio/video) discussion tool for asynchronous online discussions helped the students understand the importance of learning about educational technology tools that can make the online classroom experience more engaging and interesting. As technology has advanced, so has the rise of instruction-based technology (Loncar et al., 2014). This has led to a profound paradigm shift, with the emergence of distance and blended learning environments in which students can discuss and interact with peers and with course materials (Loncar et al., 2014).

Another finding from this investigation is that the learning experience of students is connected to the experience that they have with the tools used to complete their assignments. In this case, the asynchronous online discussions were completed using an interactive (audio/video) discussion tool [VoiceThread], which was completely new to the learners. As they completed their required assignments, they also became acquainted with the discussion tool. In some cases, the learners encountered roadblocks that made it difficult for them to have a positive learning experience.

Although limited in terms of the number of students, this implementation and evaluation case revealed important aspects of asynchronous online discussion using a multimodal interactive (audio/video) discussion tool. The main point that emerged from this implementation and evaluation is the importance of using different instructional techniques and tools to make the discussion engaging while also making it thought-provoking. As instructors, we should provide our students with rich learning environments and a variety of learning possibilities for effective teaching (Kalelioglu & Gülbahar, 2014).
CONCLUSION

Today, online learning has become ubiquitous, and an increasing number of people choose this format to obtain their education. However, asynchronous online courses pose challenges to learners, especially when courses are not carefully designed to meet the needs of students, a shortcoming that does not align with current trends in education. With the advent of Web 2.0, a variety of multimodal tools offer users the ability to research, record, take notes, compare, and collaborate asynchronously. This chapter discussed the benefits, challenges, and affordances of multimodal innovative tools and how they can be used to enhance asynchronous discussions and assessment taking place in online learning environments. By integrating these multimodal interactive tools, changes can occur in the learning experience of students who decide to pursue their education online. These changes include, but are not limited to, peer social presence, teaching presence, expression of emotion, collaboration, cooperation, constructive peer evaluation, increased self-regulation, and other benefits. The chapter also addressed the challenges that are faced when integrating multimodal innovative tools for both asynchronous online discussions and assessment.

Throughout the chapter, VoiceThread, Blackboard Threaded Discussions, and Wimba Voice Board were described, including their many benefits for enhancing asynchronous online discussions. Among these three, Blackboard Threaded Discussions is probably the most well-known. However, research has shown that the use of the VoiceThread multimodal Web 2.0 tool can greatly contribute to learners' overall interest, engagement with the material, and level of participation in an asynchronous environment. Even though there are challenges faced by VoiceThread users, mostly because of the learning curve involved in understanding the parameters of the tool, researchers have found that the benefits far outweigh the difficulties and that the overall experience of users has proven to be positive.

The chapter also discussed the use of VoiceThread as an asynchronous tool that can be used for formative and summative assessment. The tool allows learners to present their work, using the format of their choice, as well as to provide peer feedback, which is extremely beneficial in boosting the learning experience. VoiceThread can also be used for the asynchronous online assessment of English to Speakers of Other Languages (ESOL) learners' oral and written production, which can enhance the second language learning experience.

Finally, the chapter provided an overview of an evaluation case, which investigated the use of VoiceThread to create a supportive community of online learners. The qualitative narratives provided by the graduate students who participated in the evaluation offer a clear perspective on how a multimodal tool can be both beneficial and challenging to use during the learning experience. The results from the evaluation also help identify issues to consider before implementing a multimodal tool for asynchronous online discussion.
As mentioned by Gao et al. (2013), effective learning environments for interactions and discussion vary when the educational purpose differs. Consequently, it is important to identify, develop, and implement new types of environments that best support other purposes of learning. Multimodal innovative tools for asynchronous discussions and assessment address some of the constraints of the more traditional asynchronous discussion and assessment formats, which focus primarily on text-based interactions. Additionally, some multimodal tools continue to offer the asynchronous flexibility needed for online learning and instruction. There is still more to be learned from the implementation of multimodal tools in asynchronous online instruction; however, it is clear that there are a number of positive factors that support their use as an alternative to traditional ways of online instruction.
REFERENCES

Ajayi, L. (2009). An Exploration of Pre-Service Teachers' Perceptions of Learning to Teach while Using Asynchronous Discussion Board. Journal of Educational Technology & Society, 12(2), 86.
Andresen, M. A. (2009). Asynchronous discussion forums: success factors, outcomes, assessments, and limitations. Journal of Educational Technology & Society, 12(1), 249.
Ball, M. (2012). Using VoiceThread in a PK-2 Classroom. Learning and Leading with Technology, 40(3), 34–35.
Bassett, P. (2011). How Do Students View Asynchronous Online Discussions As A Learning Experience? Interdisciplinary Journal of E-Learning and Learning Objects, 7, 69–79.
Beach, R., & O'Brien, D. (2015). Fostering Students' Science Inquiry Through App Affordances of Multimodality, Collaboration, Interactivity, and Connectivity. Reading & Writing Quarterly, 31(2), 119–134. doi:10.1080/10573569.2014.962200
Black, P., & Wiliam, D. (1998). Assessment and Classroom Learning. Assessment in Education: Principles, Policy & Practice, 5(1), 7–74. doi:10.1080/0969595980050102
Borup, J., West, R. E., & Graham, C. R. (2012). Improving online social presence through asynchronous video. The Internet and Higher Education, 15(3), 195–203. doi:10.1016/j.iheduc.2011.11.001
Borup, J., West, R. E., & Graham, C. R. (2013). The influence of asynchronous video communication on learner social presence: A narrative analysis of four cases. Distance Education, 34(1), 48–63. doi:10.1080/01587919.2013.770427
Borup, J., West, R. E., & Thomas, R. (2015). The impact of text versus video communication on instructor feedback in blended courses. Educational Technology Research and Development, 63(2), 161–184. doi:10.1007/s11423-015-9367-8
Brunvand, S., & Byrd, S. (2011). Using VoiceThread to Promote Learning Engagement and Success for All Students. Teaching Exceptional Children, 43(4), 28–37. doi:10.1177/004005991104300403
Burden, K., & Atkinson, S. (2008). Evaluating pedagogical affordances of media sharing Web 2.0 technologies: A case study. Paper presented at the proceedings of ASCILITE, Melbourne, Australia.
Ching, Y. H., & Hsu, Y. C. (2013). Collaborative learning using VoiceThread in an online graduate course. Knowledge Management & E-Learning: An International Journal, 5(3), 298–314.
Cicconi, M. (2014). Vygotsky Meets Technology: A Reinvention of Collaboration in the Early Childhood Mathematics Classroom. Early Childhood Education Journal, 42(1), 57–65. doi:10.1007/s10643-013-0582-9
Clark, C., Strudler, N., & Grove, K. (2015). Comparing Asynchronous and Synchronous Video vs. Text Based Discussions in an Online Teacher Education Course. Online Learning, 19(3), 48–69.
Dail, J., & Giles, T. (2012). The Hunger Games and Little Brother come to life on VoiceThread. The ALAN Review, Summer, 6–11.
Darabi, A., Arrastia, M. C., Nelson, D. W., Cornille, T., & Liang, X. (2011). Cognitive presence in asynchronous online learning: A comparison of four discussion strategies. Journal of Computer Assisted Learning, 27(3), 216–227. doi:10.1111/j.1365-2729.2010.00392.x
De Oliveira, L. C., & Olesova, L. (2013). Learning about the Literacy Development of English Language Learners in Asynchronous Online Discussions. Journal of Education, 193(2), 15–23.
deNoyelles, A., Mannheimer Zydney, J., & Baiyun, C. (2014). Strategies for Creating a Community of Inquiry through Online Asynchronous Discussions. Journal of Online Learning & Teaching, 10(1), 153–165.
Durrington, V. A., Berryhill, A., & Swafford, J. (2006). Strategies for Enhancing Student Interactivity in an Online Environment. College Teaching, 54(1), 190–193. doi:10.3200/CTCH.54.1.190-193
Fear, W. J., & Erikson-Brown, A. (2014). Good quality discussion is necessary but not sufficient in asynchronous tuition: a brief narrative review of the literature. Journal of Asynchronous Learning Networks, 18(2), 21–28.
Gao, F. (2014). Exploring the Use of Discussion Strategies and Labels in Asynchronous Online Discussion. Online Learning, 18(3), 1–19.
Gao, F., Zhang, T., & Franklin, T. (2013). Designing asynchronous online discussion environments: Recent progress and possible future directions. British Journal of Educational Technology, 44(3), 469–483. doi:10.1111/j.1467-8535.2012.01330.x
Gaytan, J., & McEwen, B. C. (2007). Effective Online Instructional and Assessment Strategies. American Journal of Distance Education, 21(3), 117–132. doi:10.1080/08923640701341653
Gikandi, J. W., Morrow, D., & Davis, N. E. (2011). Online formative assessment in higher education: A review of the literature. Computers & Education, 57(4), 2333–2351. doi:10.1016/j.compedu.2011.06.004
Girasoli, A. J., & Hannafin, R. D. (2008). Using asynchronous AV communication tools to increase academic self-efficacy. Computers & Education, 51(4), 1676–1682. doi:10.1016/j.compedu.2008.04.005
Graff, M. (2003). Cognitive style and attitudes towards using online learning and assessment methods. Electronic Journal of E-Learning, 1(1), 21–28.
Griffiths, M., & Graham, C. R. (2010). Using Asynchronous Video to Achieve Instructor Immediacy and Closeness in Online Classes: Experiences from Three Cases. International Journal on E-Learning, 9(3), 325–340.
Griffiths, M. E., & Graham, C. R. (2009). The Potential of Asynchronous Video in Online Education. Distance Learning, 6(2), 13–22.
Hew, K. F., & Cheung, W. S. (2012a). Audio-based versus text-based asynchronous online discussion: Two case studies. Instructional Science, 41(2), 365–380. doi:10.1007/s11251-012-9232-7
Hew, K. F., & Cheung, W. S. (2012b). Students' use of Asynchronous Voice Discussion in a Blended-Learning Environment: A study of two undergraduate classes. Electronic Journal of E-Learning, 10(4), 360–367.
How Does IMS Enable Better Learning Experiences? (2016). Retrieved from https://www.imsglobal.org/
Hung, M.-L., & Chou, C. (2014). The Development, Validity, and Reliability of Communication Satisfaction in an Online Asynchronous Discussion Scale. Asia-Pacific Education Researcher, 23(2), 165–177. doi:10.1007/s40299-013-0094-9
Kalelioglu, F., & Gülbahar, Y. (2014). The Effect of Instructional Techniques on Critical Thinking and Critical Thinking Dispositions in Online Discussion. Journal of Educational Technology & Society, 17(1), 248–258.
Kear, K. (2004). Peer learning using asynchronous discussion systems in distance education. Open Learning: The Journal of Open, Distance and e-Learning, 19(2), 151–164. doi:10.1080/0268051042000224752
Koricich, A. (2013). Technology Review: Multimedia Discussions Through VoiceThread. Community College Enterprise, 19(1), 76–79.
Lee, L. (2014). Digital news stories: Building language learners' content knowledge and speaking skills. Foreign Language Annals, 47(2), 338–356. doi:10.1111/flan.12084
Loncar, M., Barrett, N. E., & Liu, G.-Z. (2014). Towards the refinement of forum and asynchronous online discussion in educational contexts worldwide: Trends and investigative approaches within a dominant research paradigm. Computers & Education, 73, 93–110. doi:10.1016/j.compedu.2013.12.007
McLoughlin, C., & Luca, J. (2001). Quality in online delivery: What does it mean for assessment in e-learning environments. Paper presented at the Annual Conference of the Australasian Society for Computers in Learning in Tertiary Education.
Nakagawa, A. S. (2001). Using VoiceThread for professional development: Probeware training for science teachers. Paper presented at the Annual Technology, Colleges, and Community Worldwide Online Conference.
Ng, C. S. L., Cheung, W. S., & Hew, K. F. (2012). Interaction in asynchronous discussion forums: Peer facilitation techniques. Journal of Computer Assisted Learning, 28(3), 280–294. doi:10.1111/j.1365-2729.2011.00454.x
Olofsson, A., Lindberg, J. O., & Stödberg, U. (2011). Shared video media and blogging online: Educational technologies for enhancing formative e-assessment? Campus-Wide Information Systems, 28(1), 41–55. doi:10.1108/10650741111097287
Ozyurt, O., & Ozyurt, H. (2011). Investigating the effects of asynchronous discussions on students' learning and understanding of mathematics subjects. Turkish Online Journal of Distance Education, 12(4), 17–33.
Pecot-Hebert, L. (2012). To hybrid or not to hybrid, that is the question! Incorporating VoiceThread Technology into a traditional communication course. Communication Teacher, 26(3), 129–134. doi:10.1080/17404622.2011.650703
Robles, M., & Braathen, S. (2002). Online assessment techniques. Delta Pi Epsilon Journal, 44(1), 39–49.
Russel, J., Elton, L., Swinglehurst, D., & Greenhalgh, T. (2006). Using the online environment assessment for learning: A case study of a web-based course in primary care. Assessment & Evaluation in Higher Education, 31(4), 465–478. doi:10.1080/02602930600679209
Sadler, D. R. (1989). Formative assessment and the design of instructional systems. Instructional Science, 18(2), 119–144. doi:10.1007/BF00117714
Servonsky, E. J., Daniels, W. L., & Davis, B. L. (2005). Evaluation of Blackboard(TM) as a Platform for Distance Education Delivery. The ABNF Journal, 16(6), 132–135. PMID:16382797
Smith, M., & Winking-Diaz, A. (2004). Increasing students' interactivity in an online course. Journal of Interactive Online Learning, 2(3), 1–25.
Stover, K., Kissel, B., Wood, K., & Putman, M. (2015). Examining Literacy Teachers' Perceptions of the Use of VoiceThread in an Elementary, Middle School, and a High School Classroom for Enhancing Instructional Goals. Literacy Research and Instruction, 54(4), 341–362. doi:10.1080/19388071.2015.1059911
Swan, C. (2014). Tech tools for assessing the "soft" skills. Tech & Learning, 34(8), 38–40.
Vonderwell, S., Liang, X., & Alderman, K. (2007). Asynchronous discussions and assessment in online learning. Journal of Research on Technology in Education, 39(3), 309–328. doi:10.1080/15391523.2007.10782485
Wimba. (2009). Wimba voice for higher education. Retrieved from http://www.wimba.com/solutions/higher-education/wimba_voice_for_higher_education
Wise, A., Speer, J., Marbouti, F., & Hsiao, Y.-T. (2013). Broadening the notion of participation in online discussions: Examining patterns in learners' online listening behaviors. Instructional Science, 41(2), 323–343. doi:10.1007/s11251-012-9230-9
Wood, K. D., Stover, K., & Kissel, B. (2013). Using digital VoiceThreads to promote 21st century learning. Middle School Journal, 44(4), 58–64. doi:10.1080/00940771.2013.11461865
Xie, K. (2013). What do the numbers say? The influence of motivation and peer feedback on students' behaviour in online discussions. British Journal of Educational Technology, 44(2), 288–301. doi:10.1111/j.1467-8535.2012.01291.x
Chapter 6
Instructor Presence

Beth Allred Oyarzun, University of North Carolina Wilmington, USA
Sheri Anderson Conklin, UNCW, USA
Daisyane Barreto, UNCW, USA
ABSTRACT

Student isolation and retention rates are persistent issues in online learning. Research has shown that an important component of student performance and satisfaction is instructor presence (Picciano, 2002). Instructor presence includes three elements: 1) Teaching presence, 2) Instructor immediacy, and 3) Social presence (Mandernach, Gonzales, & Garrett, 2006). This chapter will use this definition of instructor presence to outline best pedagogical practices, with concrete examples, to increase instructor presence in asynchronous online courses. Each section will begin with a definition of and research on that construct, followed by best practices with concrete examples.
INTRODUCTION

Online education is an area that continues to grow, especially in higher education settings. The number of students taking at least one online course has grown to a total of 6.7 million (Allen & Seaman, 2013). Still, student isolation and retention rates are persistent issues in online learning. Research has shown that an important component of student performance and satisfaction is instructor presence (Borup, West, & Graham, 2012; Griffiths & Graham, 2009; Picciano, 2002). Instructor presence includes three elements: 1. Teaching presence, 2. Instructor immediacy, and 3. Social presence (Mandernach, Gonzales, & Garrett, 2006).

DOI: 10.4018/978-1-5225-1851-8.ch006
This chapter will use these elements of instructor presence to outline research-based best pedagogical practices with concrete examples to increase instructor presence in asynchronous online courses. Teaching and social presence are both constructs of the popular Community of Inquiry (CoI) framework developed by Garrison, Anderson, and Archer (2010). This framework is designed to assist instructors in designing online courses to ensure that transactional distance, which refers to the separation of teacher and learner in online educational programs (Moore, 2007), is lessened and that deep, meaningful learning occurs. There are three interdependent elements associated with the CoI framework that foster deep and meaningful learning experiences: social presence, cognitive presence, and teaching presence. Even though a considerable number of studies investigating the CoI framework have been conducted, few studies reported any objective measures of learning to support the claims that application of the CoI framework leads to deeper levels of learning (Rourke & Kanuka, 2009). Still, the research does connect the CoI framework with increased learner satisfaction and perceived learning (Akyol, Garrison, & Ozden, 2009; Richardson & Swan, 2003; Swan & Shih, 2005).

Learner satisfaction and motivation are often influenced by the interactions that happen in the online environment, which leads us to the topic of instructor immediacy. Instructor immediacy is a concept defined from two communication theories: Moore's (1973) transactional distance theory and Mehrabian's (1971) communication immediacy theory. Transactional distance is defined as the psychological distance between online learners and instructors. Moore's theory proposes that more interaction between instructors and learners lessens transactional distance. Communication immediacy refers to the physical and verbal behaviors that reduce psychological and physical distance between individuals. The verbal behaviors can be translated into online learning. In this case, instructor immediacy involves communication strategies that reduce social and psychological distance between learners and instructors in online learning (Arbaugh, 2001). A degree of instructor immediacy has been shown to increase retention and achievement (Bodie & Bober-Michel, 2014).

This chapter will discuss research and provide examples of best practices for teaching presence, instructor immediacy, and social presence. Each section will begin with the definition of and research on the concept, followed by discussion and best practices with concrete examples.
TEACHING PRESENCE

The concept of teaching presence evolved from social presence research. Garrison and colleagues (2000) differentiated teaching presence from social presence as part of the Community of Inquiry (CoI) framework. While social presence is the ability to project or perceive others as real, teaching presence is conceptualized as the design and facilitation of cognitive and social processes for the purpose of realizing intended outcomes. Teaching presence begins at the design phase with course design and organization and continues into the implementation of the course through facilitation of discourse. The central focus of teaching presence is to increase social presence and student learning (Lowenthal & Parscal, 2008). According to the CoI framework, teaching presence is directly related to both social and cognitive presence through the following categories: 1. Design and organization, 2. Materials and learning activities, and 3. Facilitation and encouragement.
Through these three categories, students can reach meaningful educational outcomes (Anderson, Rourke, Garrison, & Archer, 2001). Design and organization refers to the macro-level structure of the online course. Materials and activities pertains to the content and assessments used within the structure of the course. Both the design and the delivery are interrelated and require effective responsiveness to developing needs and events. Facilitating reflection and discourse develops cognitive understanding in a positive environment and involves pedagogical, interpersonal, and organizational skills. Through reflection, students construct personal meaning of the content and confirm a mutual understanding. Direct instruction contradicts being a "guide on the side," but it is needed to diagnose misconceptions and to bring expertise to the class (Garrison & Akyol, 2013). Teaching presence is important for perceived learning and satisfaction (Akyol, Garrison, & Ozden, 2009; Richardson & Swan, 2003; Swan & Shih, 2005) and the development of a community (Brook & Oliver, 2007; Ice, Curtis, Phillips, & Wells, 2007; Shea, Li, & Pickett, 2006). Teaching presence is the force that combines all the aforementioned factors (Garrison & Akyol, 2013). The following sections will present the research in each category of teaching presence.

Design and organization refers to the planning and design of the course structure, processes, and interactions (Anderson et al., 2001). During this process, the instructor establishes the course goals, provides clear instructions for participation behaviors and course activities, sets deadlines and timeframes, and defines boundaries for student and instructor interaction (Arbaugh & Hwang, 2006; Shea, Li, & Pickett, 2006). This planning for interactions and online classroom management is essential to allow students the ability to meet course goals and learning objectives. Without this type of planning and direction, students may be lost, and the ability to seek immediate assistance is not always available (Easton, 2003). Course structure for asynchronous online courses is critical, as online learners are often frustrated when they are unable to find needed material or feel lost in their courses (Swan, 2001). It is essential that online faculty and instructional designers create a consistent and sequenced course structure. For example, Fredericksen and colleagues (2000) developed a course design process to create a 'solid' course structure. They advised faculty to follow these steps:

1. Get started by reflecting on and conceptualizing the course,
2. Create an orientation,
3. Chunk course content,
4. Create learning activities,
5. Walk through the course,
6. Get ready to teach, and
7. Evaluate and revise.
The combination of a consistent course structure and engaged instructors who create dynamic interactions has been found to be the most consistent predictor of successful online courses (Swan, 2003). Typically, the course structure is developed prior to course implementation, yet adjustments can be made throughout the implementation process.

Materials and learning activities refers to course content presentation and assessments. Course organization should be consistent and enable learners to find all necessary course materials. Course materials should also be "chunked," following segmenting principles (Clark & Mayer, 2011), and presented in modules. Modules are the fundamental method for delivering online course content (Draves, 2007) and are often the most time consuming to design. Modules should consist of three basic elements: overview, instructional content, and assessment. The overview should provide an introduction to the module, containing specific instructions and learning outcomes for learners. It is also a means to orient students to the other elements of the course content. The instructional content can include video lectures, scholarly articles, textbook readings, multimedia, and/or internet resources. If video lectures are used, the length of each video should be short. Additionally, informal language should be used when presenting the video content (Guo, Kim, & Rubin, 2014), which follows the personalization principle (Clark & Mayer, 2011) to improve student learning and promote motivation. This content can and should vary from module to module in order to retain student engagement. Presenting content in multiple formats, such as text, audio, and video, is a concept supported by Universal Design for Learning (UDL). This also assists with Americans with Disabilities Act (ADA) compliance as well as providing students access to content in a form that matches their learning preferences (Chickering & Gamson, 1987). For example, provide a video lecture along with an audio podcast and a text script of the content. This allows the student to choose a method of reviewing the content, and that method may change depending on circumstances.

The assessment part of a course can include an assignment, project, quiz, group activity, and/or discussion. Assessments should be directly aligned to course goals. Provide a chart in the syllabus that shows how assessments align to course goals. Again, the assessments should vary and happen frequently in order to retain student engagement (Orlando, 2011). Assessments drive students into the content. Students will access content based upon the perceived degree to which it will positively influence better outcomes and assessments (Murray, Perez, Geist, & Hedrick, 2012). Low-stakes grading or assessments, such as quizzes or small assignments, should occur in each learning module to ensure students are retaining the content (Warnock, 2013). Low-stakes grading creates grade transparency for students and also allows for a steady flow of information. Warnock (2013) stated that low-stakes grading has several advantages, such as creating dialogue between the student and instructor, building confidence in the students through multiple opportunities to succeed, and increasing motivation. High-stakes grading or assessments should occur in the form of authentic assessments or applied learning activities in which the students apply the information learned to a new situation. High-stakes assessments should span multiple modules and have multiple products that are assessed. Applied learning activities can take on many forms, including service learning, professional development activities, and other activities that apply classroom concepts to real-world situations (Kolb & Kolb, 2005). Students who were involved with applied learning activities have more positive course evaluations (Markus, Howard, & King, 1993). For example, in an online physical education course, learners might be required to attend several specific types of exercise classes and blog about their experience. The blog instructions and rubric would require them to connect their experiences to course content such as cardiovascular or strength training. Learners could visit classmates' blogs to comment or get ideas for types of classes to attend.

Facilitating reflection and discourse is the interaction students engage in to develop cognitive understanding in a positive environment; it involves pedagogical, interpersonal, and organizational skills.
Indicators of the facilitation of discourse include identifying areas of agreement and disagreement, seeking to reach consensus, drawing in participants, prompting discussion, and assessing the efficacy of the process (Shea et al., 2006). Akyol and Garrison (2014) found facilitation of discourse was high at the beginning of the semester and would drop after about three weeks. A tentative explanation for the drop in discourse was that students needed more facilitation at the beginning of a course to understand the instructor's expectations. Research on facilitation and encouragement has typically focused on discussion boards, but Shea and colleagues (2010) found that 80-90% of teaching presence occurs outside discussion forums in the forms of emails and private feedback.

Facilitation and encouragement can happen both publicly, such as in a discussion board, or privately (e.g., feedback on an assignment). In the public atmosphere, the instructor should guide the students to delve deeper into the content through scaffolding. This can include probing questions, referencing other students' thoughts, and direct instruction. This would occur during a discussion or group activity. Other public feedback can occur after a discussion or group activity has ended in the form of an announcement, a summary in a discussion, or a class email. The encouragement and feedback should be personalized to highlight specific students and their achievements or thoughts as well as to provide constructive feedback for future activities. This encourages others to strive and participate and provides continued encouragement for those students highlighted. Instructors also can provide facilitation and encouragement privately through individual email and assessment feedback. Through private facilitation, the student receives encouragement and feedback specific to their needs. For example, if a student posts a response to a discussion board but did not cite properly, the instructor can copy and paste the prompt into an email and give the student specific feedback (see Table 1). Studies have shown an increase in motivation and perceived learning when there is deliberate facilitation and encouragement (Williams, 2000).

The combination of course structure, materials and learning activities, and facilitation of discourse can lead to high teaching presence (Wisneski, Ozogul, & Bichelmeyer, 2015). Simply presenting the material in a fashion where the students can engage with the materials and learning activities is not enough. Instructors who engage students in a communicative process of learning, combined with solid course structure and materials and learning activities, achieve higher teaching presence (Wisneski et al., 2015).
DISCUSSION AND BEST PRACTICES

As stated previously, course design and organization should happen prior to delivery. The organization is an outline of how students will access course materials. There are two main ways of organizing materials: 1. by material type, such as articles, assignments, etc.; and 2. by time period, such as unit, week, or chapter. The latter option is preferable for online classes because it allows students to access all material for a given time period in one location, rather than having to click around to find the various materials for an assignment. An example outline is provided below.

I. Start Here
   A. Welcome video and letter
   B. Instructor Introduction and Contact Information
   C. Technical Support
   D. Introduce yourself
II. Syllabus
III. Schedule
IV. Learning Modules
   A. Module 1 - Orientation
      1. Overview
      2. To-Do List
      3. Content
      4. Assessment
   B. Module 2
      1. Overview
      2. To-Do List
      3. Content
      4. Assessment
   C. Etc.
V. Grades

However the course is organized, instructors are advised to create a student orientation so that students become familiar with the course, the instructor, and course-related material. The orientation should include the following:

1. Welcome from instructor
2. Contact information
3. Course overview and objectives
4. Readings and materials
5. Course learning activities
6. Assessments
7. Instructor expectations
8. Course schedule
9. Next steps
The welcome introduces the instructor to the course and students. This can be done via text or through the use of audio and video. If using text, an appropriate image of the instructor should be included (Conrad, 2002). Contact information describes specific details about how to contact the instructor. Contact information should be detailed and include the expected turnaround time for each type of communication and assignment (Fisher, 2010), for example, "I will respond to emails within 24 hours." It is important to give specific information, such as "I answer emails until 5pm EST." By following these practices, the instructor sets the expectations for the course and for how to communicate with the instructor.

Once the course design has been established, the instructor needs to establish and implement facilitation and discourse. Although the course structure should remain consistent, materials and learning activities should vary. For example, learning modules or units for a course may be organized in a consistent structure of 1. overview, 2. to-do list, 3. content presentation, and 4. assessment, but the materials and learning activities might vary.
Table 1. Feedback communication plan example

Assignment Type | Estimated Feedback Timeframe
Discussion boards | Discussions will be graded 48 hours after due date
Case Studies | Feedback and grade will be provided within 72 hours of due date
Research paper | Feedback and grade will be provided one business week after due date
The material presentation for one module might be a series of research articles on a particular topic, with the learning activity being a synthesis paper on the research presented, whereas another module may have a series of videos for the content presentation, with a cooperative learning assignment as the learning activity. The selection of the materials and learning activities should be in alignment with the goals and objectives of the course, in addition to offering a variety of materials. As discussed earlier, facilitation of discourse can happen privately, publicly, or both. Whether the communication happens publicly or privately, studies have shown there is a correlation between teaching presence and instructor timeliness with communication and feedback on assignments (Sheridan & Kelly, 2010; Skramstad, Schlosser, & Orellana, 2012). Sheridan and Kelly (2010) found students expected responses within 24 hours. Therefore, providing a communication plan regarding feedback for assignments and discussions is important. An example is provided in Table 1. Providing an instructor timeliness plan informs students of the instructor's expectations. This information can also be seen as instructor immediacy, which is discussed in further detail in the following section.
Instructor Immediacy

Instructor immediacy is a term used to describe instructor communication behaviors used to reduce the transactional distance between learners and the instructor (Anderson & Anderson, 1982). Interaction has been shown to influence student motivation, participation among learners, and achievement of learning outcomes (Du, Harvard, & Li, 2005; Sargeant, Curran, Allen, Jarvis-Selinger, & Ho, 2006; Tu, 2005). Communication immediacy is defined as the extent to which communicative behaviors enhance physical or psychological closeness (Mehrabian, 1971; Richmond, 2002), which could be achieved by reducing the distance between instructors and students (Ni & Aust, 2008), especially in online spaces. Immediacy can take the form of both verbal and nonverbal communication. Nonverbal cues include maintaining eye contact or leaning forward. Verbal communication includes asking questions, using humor, and addressing students by name (Baker, 2004). Garrison and colleagues (2000) stated that it is possible to establish instructor immediacy in online courses, but a lack of nonverbal cues can be a barrier to creating a strong social presence. Through careful course design, instructor immediacy can be established. For example, in a study examining teacher immediacy in online courses, Ni and Aust (2008) found that teacher verbal immediacy was positively correlated with satisfaction, perceived learning, and posting frequency in online courses. However, further data analysis indicated that teacher verbal immediacy was not a significant predictor of satisfaction or perceived learning. A potential explanation for this issue might be a discrepancy between the intended audience for the original teacher immediacy scale and the intended audience for this study (i.e., adults). According to the authors, adult learners might focus more on the content than on verbal immediacy behaviors. Indeed, these findings seem to corroborate another study conducted in a massive and fully online critical thinking course.
In this particular study, Campbell (2014) found that personalized messages to individual students did not influence course activity, learning outcomes, or course dropout rate. According to the author, cognitive learning seems to be connected with the time students spend with the course materials and assessments rather than with social contacts, including the instructor, within the online course. Thus, a thorough and careful course design approach for instructor immediacy needs to be established.

Designing instructor immediacy in an online course can range from simple strategies, such as using inclusive pronouns, which create a sense of belonging and provide personalization, to more complex ones, such as the use of multimedia. With the use of text, verbally immediate behaviors can be established through discussion boards. If discussion boards are a part of the course design, the instructor can initiate questions, address students by name, respond frequently to students, and offer praise (O'Sullivan, Hunt, & Lippert, 2004). Gorham (1988) stated that sharing personal stories with the students or adding humor to text can improve instructor immediacy. This can be established with an instructor introduction using text and images on a course homepage. Multimedia content, such as video introductions, can facilitate instructor immediacy more effectively since it provides the opportunity for verbal and nonverbal cues.

A communication plan is a detailed overview of how the instructor intends to communicate with the students and how students are expected to communicate with one another. Communication with the instructor includes one-to-one communication through email, phone, or appointments as well as personalized feedback on assignments. Communication with each other may include cooperative learning assignments, discussions, or team activities. The communication plan should be included in the syllabus or at the beginning of the course so students are aware of how to communicate with their instructor. Sheridan and Kelly (2010) indicated that clear course requirements and being responsive to student needs are the indicators of instructor presence that are most important to students. Multimedia supports both verbal and nonverbal cues and therefore provides strong instructor immediacy and social presence (Borup, Graham, & Velasquez, 2011).

Feedback is an example of providing instructor presence and immediacy. Timely and quality feedback can be provided in various ways using text, graphics, and/or multimedia. Some online students report that they receive limited, mechanical, or impersonal feedback in online classes (Jennings & McCuller, 2004). Instructors report that providing feedback in online courses can be more time consuming than providing feedback in face-to-face courses (Herrmann & Popyack, 2003). With advancements in technology, providing feedback can be less time consuming and more personal (Jones, Georghiades, & Gunson, 2012). Instructors can mark up written assignments using tablets and stylus pens or create audio/video feedback instead of typing out comments. Both of these options add instructor presence to feedback by letting students see the instructor's handwriting, hear the instructor's voice, or see the instructor.
Discussion and Best Practices

Multiple modes of communication should be included, along with hours of availability and any other pertinent information, such as what should be included in the subject line of email communication. The communication plan may also direct students to use the discussion board or other public forums for content-related questions. Estimated response times for each communication method and estimated feedback turnaround times for each type of assignment should also be detailed in the communication plan (Fisher, 2010). Overall, communication plans should be specific and detailed to ensure students understand instructor expectations and are able to follow them. A chart could be an effective way to communicate this information. An example instructor communication plan chart is provided in Table 2.
Table 2. Instructor Communication Plan Example Chart

Communication Mode | Estimated Response Time
E-mail | 24-48 hours
Phone Call | Immediate if available / less than 24 hours if message left
Instant Message | Immediate when online; may not receive messages when offline
Text Message | Less than 1 hour

Assignment | Estimated Feedback Return
Written Projects | 1 week
Quizzes | Immediate - self grading
Discussions | Intermittent responses to various students except for the Instructor Question Board. Response will be posted within 24 hours of a question being posted.
Table 3 shows an example of a communication expectations chart that details instructor and student expectations.

The use of multimedia for providing feedback is burgeoning due to improvements in technology that make it more accessible and easier to use. One example of providing dynamic feedback is through the use of screen capture tools such as Camtasia, Jing, or Screencast-o-matic. For example, when grading written work, the instructor typically writes many comments and explanations, which can be time consuming. Using screen capture tools, the instructor could write fewer comments and then use screen capture to record an explanation of the comments for the student, providing examples or further instructions such as, "apply this to the rest of your paper." Students found screen capture feedback to include more detail, to provide clarification through intonation and avoid misunderstandings, and they felt the feedback overall was more efficient (Jones et al., 2012). This method of providing feedback is more personal than providing text only, yet one has to remember that screen capture technology is only a means of delivering the feedback; it is still up to the instructor to provide quality feedback.

Tablet computers and stylus pens allow instructors to mark up assignments as they would with paper assignments. One caveat to this method is that submitted files must be in PDF format. Color and shape can be used as cues for feedback in addition to handwritten comments. There are several mobile applications available at low cost that enhance annotating on tablet computers. Instructors need the ability to access the student files from a tablet computer, along with a stylus and a mobile application that enables annotation, such as iAnnotate. If the learning management system used does not allow for mobile access to student submissions, then the use of cloud storage could be helpful. There are several cloud storage options, such as Dropbox and Google Drive.
Social Presence

Social presence, a term initially coined by Short and colleagues (1976), is defined as the degree of salience or awareness between two or more communicators through a communication medium. Short and colleagues (1976) first conceptualized social presence as the quality of the communication medium. Later researchers, such as Gunawardena (1995), re-conceptualized social presence as the way people utilize communication mediums.
Table 3. Communication Expectations Example

Instructor expectations:
• Respond to questions and concerns in a timely manner (outlined in the syllabus)
• Clarify questions
• Clarify assignments
• Guide students through researching topics (note that I state "guide," not "tell" or "direct")
• Participate in or summarize discussions (depending on the task)
• Provide timely feedback on assignments (within 3-5 days, depending on the scope of the assignment)

Student expectations:
• Review materials and assignments within 1-2 days of the module start date
• Manage your time appropriately
• Ask questions when you are confused (remember, this is part of the learning process)
• Collaborate with your peers
• Engage with the material, colleagues, and instructor
• Ask questions or for examples when needed

*Note that many of the items above involve answering your questions. I am not able to read your mind, so it is important to ask questions. I will not seek you out to be sure you understand the material, as no news is good news to me. You must seek me or another classmate out when you are confused.*
In other words, in a learning environment social presence refers to the degree to which a learner feels personally connected with other students and/or instructors. Later, Garrison and colleagues (2000) developed the Community of Inquiry (CoI) framework, which incorporated social, teaching, and cognitive presence. Specifically, the authors argued that teaching presence is utilized to foster social presence and, in turn, create cognitive presence. Two components of social presence have emerged over the years: intimacy and immediacy. Intimacy was introduced by Argyle and Dean (1965) and refers to nonverbal communication factors such as physical distance, eye contact, smiling, facial expressions, and personal topics of conversation. Immediacy was introduced by Wiener and Mehrabian (1968) and refers to the psychological distance, which includes verbal and nonverbal cues, between a communicator and the recipient of the communication. Research has shown that instructors with a high degree of social presence in online learning environments are viewed by learners as more positive and effective (Gunawardena & Zittle, 1997; Shin, 2002). Likewise, direct engagement between students and their instructors has a significant effect on students' learning engagement (Chickering & Gamson, 1987; Kuh, 2009). According to Aragon (2003), methods for creating social presence include strategies in three categories: course design, instructors, and participation. Table 4 details the strategies recommended for each category. The instructor and participant strategies are helpful when implementing an online course, and many of them relate to personalizing content and feedback; the next section further explores these strategies. The personalization principle is one of the multimedia principles defined by Clark and Mayer (2011) that addresses presentation style. The principle includes three elements:

1. Conversational style
2. On-screen coaches
3. Instructor visibility

For the purpose of this chapter, only conversational style and instructor visibility are discussed, since the instructor has control over the integration of these elements.
Table 4. Strategies to establish social presence

Course design:
• Develop welcome messages
• Include student profiles
• Incorporate audio
• Limit class size
• Structure collaborative learning activities

Instructors:
• Contribute to discussions
• Promptly answer email
• Provide frequent feedback
• Strike up a conversation
• Share personal experiences and stories
• Use humor
• Use emoticons
• Address students by name
• Allow students options for addressing the instructor

Participants:
• Contribute to discussions
• Promptly answer emails
• Strike up a conversation
• Share personal stories and experiences
• Use humor
• Use emoticons
• Use appropriate titles
On-screen coaches are typically created and produced by various publishers such as McGraw-Hill or Pearson, and therefore this element is not addressed here. Conversational style language should be used to help learners retain more information. Conversational style means using the first or second person (e.g., "you" for students and "I" for the instructor) instead of formal academic language. This allows the learner to engage with the screen as a social conversational partner rather than a machine. Kurt (2011) examined the differences in achievement and cognitive load between students receiving conversational and formal style instruction and found significant differences between the two groups in cognitive load. Conversational style can be used in the text, audio, and video components of course material and feedback to create a more user-friendly tone. Instructor visibility requires the instructor to speak directly to the learner to increase motivation and inject personal style. This can be done through a personal, conversational style in text, video, and audio, in addition to adding images and video of the instructor to content throughout the course. Video provides students with both verbal and non-verbal cues that may otherwise be missing from text-based feedback. The ability to view the instructor also allows students to gain insight into the instructor's personality. Evidence has shown that instructors' emotional expression is greater when video is used (Borup et al., 2011; Borup et al., 2012; Griffiths & Graham, 2009). One limitation of using video for instructor visibility is the need for effective asynchronous video communication pedagogy: in order to leverage the benefits of video, instructors need to explore, or gain assistance from those experienced in, the various software tools and pedagogy for asynchronous video integration (Borup et al., 2012). Research has confirmed that learner-centered principles and practices produce high-quality instruction regardless of context (McCombs, 2015). Learner-centered approaches focus on student learning: students become more active participants in their own learning, and instructors become facilitators of the instructional materials (Weimer, 2002). Online courses provide the opportunity for learning to become more personalized or individualized and for students to reflect and take an active part in their learning process. Immediate and direct feedback on performance can be offered (Khan, 1997), and students are able to take a more active role in their own learning (Lambert & McCombs, 2000). Table 5 details the differences between an instructor-centered and a learner-centered approach to designing instruction (Garrett, 2008).
Table 5. Instructor vs. Learner Centered Approaches

Instructor-Centered Approach | Learner-Centered Approach
Instructor is sole leader | Instructor is facilitator; leadership is shared
Learners work individually | Learners work together
Instructor chooses topics | Students have some choice of topics
Focus is on the instructor | Focus is on learners and instructors
Rewards are mostly extrinsic | Rewards are mostly intrinsic
Knowledge is disseminated | Knowledge is constructed through gathering and synthesizing information
Teaching and assessing are separate | Teaching and assessing are intertwined
Assessment is used to monitor learning | Assessment is used to promote and diagnose learning
Emphasis on the right answer | Emphasis on generating better questions and learning from errors
Culture is competitive and individualistic | Culture is cooperative, collaborative, and supportive
Some learner-centered strategies, such as problem-based learning, cooperative learning, and active learning, focus on creating a community of learners working together to achieve goals or learning outcomes. Community has several definitions in the educational research literature. However, community can be synthesized from these varying definitions as a sense of belonging and trust experienced by learners engaging in meaningful discourse in the learning environment (Gunawardena & Zittle, 1997; Picciano, 2002). Designing a learning experience that incorporates these ideas can be accomplished by creating a collaborative or cooperative assignment or activity. A qualitative analysis by Conrad (2002) of instructors with face-to-face university teaching experience transitioning to online teaching for the first time revealed that novice online instructors have little awareness of collaborative learning, social presence, or the role community plays in online learning. These novice online instructors' reflections showed that they viewed themselves as deliverers of content. The following section describes techniques for designing such experiences within online courses. Two strategies that aid in community development are collaborative learning and cooperative learning (Palloff & Pratt, 2010). Although these terms are often used interchangeably or simply referred to as group work, for the purpose of this chapter cooperative learning is defined as a set of instructional methods in which learners are required to complete academic assignments together as a whole class or in small groups, while collaborative learning is defined as the social interaction and engagement among groups of learners who complete academic assignments by choice (Panitz, 1996). Hence, cooperative learning is designed by the instructor and collaborative learning is student generated. With this difference in mind, the remainder of this chapter focuses on designing cooperative learning activities in online courses. According to Johnson and Johnson (1991), cooperative learning comprises five elements:

1. Positive interdependence
2. Promotive interaction
3. Individual and group accountability
4. Social skills
5. Group processing
Positive interdependence refers to group members having a role and believing that they are responsible for both their own learning and the learning of the group. Promotive interaction is the concept that learners perceive that they need the group to be successful. Individual and group accountability indicates that multiple products should be produced for grading in order to ensure each group member is held accountable at both the individual and the group level. Social skills refer to the communication and interpersonal skills that should be demonstrated, particularly online. Group processing requires that group members depend on one another for explanations or assistance. Koh, Hill, and Barbour (2010) offered instructional design and group work process strategies, noting that group work is becoming more popular in online learning. The instructional design strategies are:

1. Providing multiple communication methods
2. Providing an overall plan for the class
3. Preparing for technology
4. Building virtual team skills

The group work process strategies are:

1. Assisting group formation
2. Building a sense of connection
3. Being involved in group processes
4. Evaluating group processes
These echo some of the cooperative learning recommendations and add other helpful tips. The cooperative learning strategies can be incorporated into the overall plan for the class, building virtual team skills, assisting group formation, and evaluating group processes. The other recommendations can assist in building the community and instructor presence.
Discussion and Best Practices

Content and feedback can be personalized using a variety of formats and techniques, from text to asynchronous video. A sample text statement or direction in a course might read: "read chapter 1 of the text and submit assignment 1." Using the personalization principle, this could be rewritten as "read chapter 1 of your textbook and submit your first assignment" (emphasis added). The same principle can be applied to feedback given to students on assignments. Personalization can also be integrated using asynchronous video. Students have stated that the use of video "humanized" the instructor and that they felt there was more of a student-instructor relationship (Borup et al., 2011; Griffiths & Graham, 2009). This is due to the verbal and visual cues that video can provide; it is also an example of instructor visibility, where students are able to see the instructor, the non-verbal cues, and the instructor's mannerisms. Students have also stated that some self-disclosure helped them gain insight into the instructor's personality and made the instructor more "real" (Borup et al., 2012). Whether using text or asynchronous video, personalizing content and feedback can shape students' perception of the instructor and increase social presence and course satisfaction (Borup et al., 2011; Borup et al., 2012; Griffiths & Graham, 2009).
Many learner-centered practices that are applied in a face-to-face classroom can be translated to an online environment. For example, students need to be involved with course tasks to develop their own learning. This can be accomplished through collaboration with other students in a discussion forum or group project (Saxena, 2013). Group work covers a variety of learner-centered strategies such as problem-based learning, collaborative learning, cooperative learning, and small group learning. There are many factors to address when implementing group work in the online environment, such as communication tools, student grouping, and setting clear expectations. Students find online group work difficult due to time zones, distance, lack of visual cues, and hidden identities (McConnell, 2000; Smith, 2005; Straus & McGrath, 1994). Although students can be resistant to group work, when it is implemented properly it can optimize student learning (MacNeill, Telner, Sparaggis-Agaliotis, & Hanna, 2014). When creating group work assignments, provide students with instructions on various collaboration tools. These could be tools integrated into the LMS or freely available tools such as Google Docs and Google Hangouts, which allow students to hear and see each other and compensate for the lack of verbal and non-verbal cues (Chang & Kang, 2016). Also, give students the option to group themselves, which gives them group ownership and autonomy (Brindley, Walti, & Blaschke, 2009). This is particularly important for graduate programs in which students are working adults who may group themselves with others sharing similar work schedules. Finally, providing clear goals for the group project, including timelines and expectations, as well as feedback throughout the group process on learning content, tasks, and participation, can help students build strong relationships (Coll, Rochera, de Gispert, & Diaz-Barriga, 2013). Applying these recommendations can be challenging for novice online instructors, so the following example is intended to give a more concrete sense of how to apply these concepts in online course design and teaching practice. The example is an assignment or project designed for an online mathematics course. Traditionally, mathematics instruction consists of demonstration, practice, and individual assessment of problem-solving skills. Face-to-face assessments typically involve quizzes and tests to ensure students are able to apply problem-solving methods. A cooperative online project in algebra might begin by requiring students to search the internet for algebraic problems and solutions that relate to or interest them in some way. The instructor then asks students to share their findings with course participants and to react to others' findings; this could be done through a discussion forum. The instructor could use the information posted to group students by interest. For example, if several students find an algebraic card trick, they would be grouped together. Once students are grouped, they are asked to generate an instructor-approved algebraic word problem based upon their theme. Once the instructor approves the problem via e-mail, the group shares it with the class and asks their peers to solve it. This can also be done through the discussion forum, or the instructor could compile the problems and create an assignment. Depending on the number of groups, there will be four to eight original problems to solve.
The instructor could increase motivation by incentivizing the students to be the first to solve one of the problems correctly. Solutions could be submitted publicly via discussion or privately via the assignment.
CONCLUSION

Instructor presence has more opportunities to grow organically in face-to-face courses, but it needs to be more deliberately planned in online courses due to the transactional distance between instructors and students. Faculty teaching online for the first time might assume that uploading lectures and other instructional materials to the online environment is enough for teaching in this delivery format. Assumptions like this one need to be debunked, which was one of the goals of this chapter in presenting best practices for instructors. The numerous practices discussed in this chapter could be incorporated into professional development for new and even veteran faculty teaching online. The evidence in the literature for the value of instructor social presence is strong. For instance, in a large-scale study investigating the importance of social presence in online courses, Lear and colleagues (2009) found that the instructor elements with the most impact were instructional designs for interaction and evidence of instructor engagement. Pollard and colleagues (2014) also suggested that the Community of Inquiry framework be revised to include instructor social presence as a fourth construct. Beyond these studies, books designed for online teaching have also emphasized the importance of instructor social presence in online courses (e.g., Miller, 2014; Palloff & Pratt, 2011). Instructors are encouraged to establish a social presence not only by being present in the online environment, but also by developing a social connection with their learners. Strategies discussed in this chapter to establish instructor social presence include, but are not limited to, contributing to the discussion board, promptly answering students' emails, providing and eliciting feedback, and addressing students by name. A major limitation for instructors teaching in asynchronous online environments is the lack of non-verbal cues: communication is often mediated by text and prone to misinterpretation. This limitation might affect instructor social presence in an online course. Therefore, further research should be conducted to examine instructor social presence. Researchers and scholars could also focus on identifying practical strategies to assist online instructors in effectively creating instructor presence through course development and implementation. This chapter outlined course development strategies such as:

1. Creating a consistent course structure with varied presentation of materials and learning activities
2. Providing detailed and clear instructions and expectations for all assessments, clearly aligned to goals and materials and written in personal language
3. Designing opportunities that allow students to apply concepts learned in a personal context

It also outlined implementation strategies such as providing prompt, quality, personal feedback; communicating frequently; and becoming a facilitator of instruction as opposed to a lecturer or deliverer of content. Applying such strategies may allow a learning community to develop in which students are more comfortable and motivated to participate. Implementing the suggested strategies effectively in an asynchronous online course may help increase student motivation, satisfaction, and perceived learning. This chapter's contribution to the field focused on strategies to be incorporated in online environments. Still, the list of strategies presented in the chapter is by no means comprehensive.
Further research needs to be conducted to verify the effectiveness and outcomes of such strategies as well as other emerging techniques in online environments to establish and develop social presence.
REFERENCES

Akyol, Z., & Garrison, D. R. (2014). The development of a community of inquiry over time in an online course: Understanding the progression and integration of social, cognitive and teaching presence. Journal of Asynchronous Learning Networks, 12, 2–3.
Akyol, Z., Garrison, D. R., & Ozden, M. Y. (2009). Online and blended communities of inquiry: Exploring the developmental and perceptual differences. International Review of Research in Open and Distance Learning, 10(6), 65–83.
Allen, E. I., & Seaman, J. (2013). Changing course: Ten years of tracking online education in the United States. Quahog Research Group, LLC and Babson Survey Research Group. Retrieved from http://www.onlinelearningsurvey.com/reports/changingcourse.pdf
Andersen, P., & Andersen, J. (1982). Nonverbal immediacy in instruction. In L. Barker (Ed.), Communication in the classroom (pp. 98–120). Englewood Cliffs, NJ: Prentice-Hall.
Anderson, T., Rourke, L., Garrison, D. R., & Archer, W. (2001). Assessing teacher presence in a computer conferencing context. Journal of Asynchronous Learning Networks, 5(2), 1–7.
Aragon, S. (2003). Creating social presence in online environments. New Directions for Adult and Continuing Education, 2003(100), 57–68. doi:10.1002/ace.119
Arbaugh, J. B. (2001). How instructor immediacy behaviors affect student satisfaction and learning in web courses. Business Communication Quarterly, 64(4), 42–54. doi:10.1177/108056990106400405
Arbaugh, J. B., & Hwang, A. (2006). Does teaching presence exist in online MBA courses? The Internet and Higher Education, 9(1), 9–21. doi:10.1016/j.iheduc.2005.12.001
Argyle, M., & Dean, J. (1965). Eye-contact, distance and affiliation. Sociometry, 28(3), 289–304. doi:10.2307/2786027 PMID:14341239
Baker, J. D. (2004). An investigation of relationships among instructor immediacy and affective and cognitive learning in the online classroom. The Internet and Higher Education, 7(1), 1–13. doi:10.1016/j.iheduc.2003.11.006
Bodie, L. W., & Bober-Michel, M. (2014). An experimental study of instructor immediacy and cognitive learning in an online classroom. Proceedings of the 2014 International Conference on Intelligent Environments, Shanghai, China. IEEE. doi:10.1109/IE.2014.50
Borup, J., Graham, C. R., & Velasquez, A. (2011). The use of asynchronous video communication to improve instructor immediacy and social presence in a blended learning environment. In A. Kitchenham (Ed.), Blended learning across disciplines: Models for implementation (pp. 38–57). Hershey, PA: IGI Global. doi:10.4018/978-1-60960-479-0.ch003
Borup, J., West, R. E., & Graham, C. R. (2012). Improving online social interaction through asynchronous video. The Internet and Higher Education, 15(3), 195–203. doi:10.1016/j.iheduc.2011.11.001
Brindley, J. E., Walti, C., & Blaschke, L. M. (2009). Creating effective collaborative learning groups in an online environment. International Review of Research in Open and Distance Learning, 10(3), 1–18.
Brook, C., & Oliver, R. (2007). Exploring the influence of instructor actions on community development in online settings. In N. Lambropoulos & P. Zaphiris (Eds.), User-centered design of online learning communities (pp. 341–364). Hershey, PA: IGI Global. doi:10.4018/978-1-59904-358-6.ch015
Campbell, D. (2014). The influence of teacher immediacy behaviors on student performance in an online course (and the problem of method variance). Teaching of Psychology, 41(2), 163–166. doi:10.1177/0098628314530351
Chang, B., & Kang, H. (2016). Challenges facing group work online. Distance Education, 37(1), 73–88. doi:10.1080/01587919.2016.1154781
Chickering, A. W., & Gamson, Z. F. (1987). Seven principles for good practice in undergraduate education. AAHE Bulletin, 39(7), 3–7.
Clark, R. C., & Mayer, R. E. (2011). E-learning and the science of instruction: Proven guidelines for consumers and designers of multimedia learning. John Wiley & Sons. doi:10.1002/9781118255971
Coll, C., Rochera, M. J., de Gispert, I., & Diaz-Barriga, F. (2013). Distribution of feedback among teacher and students in online collaborative learning in small groups. Digital Education Review, 23, 27–45.
Conrad, D. (2002). Engagement, excitement, anxiety and fear: Learners' experience of starting an online course. American Journal of Distance Education, 16(4), 205–226. doi:10.1207/S15389286AJDE1604_2
Draves, W. A. (2007). Advanced teaching online (3rd ed.). River Falls, WI: LERN Books.
Du, J., Havard, B., & Li, H. (2005). Dynamic online discussion: Task-oriented interaction for deep learning. Educational Media International, 42(3), 207–218. doi:10.1080/09523980500161221
Easton, S. S. (2003). Clarifying the instructor's role in online distance learning. Communication Education, 52(2), 87–105. doi:10.1080/03634520302470
Fisher, C. (2010). Discussion, participation and feedback in online courses. Proceedings of the Information Systems Educators Conference. Retrieved from http://www.westga.edu/~distance/ojdla/winter114/hixon114.html
Fredericksen, E., Pickett, A., Shea, P., Pelz, W., & Swan, K. (2000). Student satisfaction and perceived learning with on-line courses: Principles and examples from the SUNY learning network. Journal of Asynchronous Learning Networks, 4(2), 7–41.
Garrett, T. (2008). Student-centered and teacher-centered classroom management: A case study of three elementary teachers. Journal of Classroom Interaction, 43(1), 34–47.
Garrison, D. R., & Akyol, Z. (2013). The community of inquiry theoretical framework. In M. G. Moore (Ed.), Handbook of distance education (3rd ed.). New York, NY: Routledge. doi:10.4324/9780203803738.ch7
Garrison, D. R., Anderson, T., & Archer, W. (2000). Critical inquiry in a text-based environment: Computer conferencing in higher education. The Internet and Higher Education, 2(2/3), 87–105.
Garrison, D. R., Anderson, T., & Archer, W. (2010). The first decade of the community of inquiry framework: A retrospective. The Internet and Higher Education, 13(1-2), 5–9. doi:10.1016/j.iheduc.2009.10.003
Gorham, J. (1988). The relationship between verbal teacher immediacy behaviors and student learning. Communication Education, 37(1), 40–53. doi:10.1080/03634528809378702
Griffiths, M. E., & Graham, C. R. (2009). Using asynchronous video in online classes: Results from a pilot study. Instructional Technology & Distance Learning, 6(3), 65–76.
Gunawardena, C., & Zittle, F. (1997). Social presence as a predictor of satisfaction within a computer mediated conferencing environment. American Journal of Distance Education, 11(3), 8–26. doi:10.1080/08923649709526970
Guo, P. J., Kim, J., & Rubin, R. (2014). How video production affects student engagement: An empirical study of MOOC videos. Proceedings of the first ACM Learning@Scale conference (pp. 41–50). ACM.
Herrmann, N., & Popyack, J. L. (2003). Electronic grading: When the tablet is mightier than the pen. Syllabus: Technology for Higher Education.
Ice, P., Curtis, R., Phillips, P., & Wells, J. (2007). Using asynchronous audio feedback to enhance teaching presence and students' sense of community. Journal of Asynchronous Learning Networks, 11(2), 3–25.
Jennings, S. E., & McCuller, M. Z. (2004). Meeting the challenges of grading online business communication assignments. Paper presented at the 69th Annual Convention, Association for Business Communication, Cambridge, MA.
Johnson, D. W., & Johnson, F. P. (1991). Joining together: Group theory and group skills. Prentice-Hall, Inc.
Jones, N., Georghiades, P., & Gunson, J. (2012). Student feedback via screen capture digital video: Stimulating students' modified action. Higher Education, 64(5), 593–607. doi:10.1007/s10734-012-9514-7
Khan, B. H. (Ed.). (1997). Web-based instruction. Educational Technology.
Koh, M. H., Hill, J. R., & Barbour, M. K. (2010). Strategies for instructors on how to improve online groupwork. Journal of Educational Computing Research, 43(2), 183–205. doi:10.2190/EC.43.2.c
Kolb, A. Y., & Kolb, D. A. (2005). Learning styles and learning spaces: Enhancing experiential learning in higher education. Academy of Management Learning & Education, 4(2), 193–212. doi:10.5465/AMLE.2005.17268566
Kuh, G. D. (2009). What student affairs professionals need to know about student engagement. Journal of College Student Development, 50(6), 683–706. doi:10.1353/csd.0.0099
Kurt, A. A. (2011). Personalization principle in multimedia learning: Conversational versus formal style in written word. The Turkish Journal of Educational Technology, 10(3), 185–192.
Lambert, N., & McCombs, B. (2000). Introduction: Learner-centered schools and classrooms as a direction for school reform. In N. Lambert & B. McCombs (Eds.), How students learn (pp. 1–15). Washington, DC: American Psychological Association. doi:10.4000/books.pur.16454
Lear, J. L., Isernhagen, J. C., LaCost, B. A., & King, J. W. (2009). Instructor presence for web-based classes. Delta Pi Epsilon Journal, 51(2), 86–98. Retrieved from http://search.proquest.com.liblink.uncw.edu/docview/195592989?accountid=14606
Lowenthal, P., & Parscal, T. (2008). Teaching presence. The Learning Curve, 3(4), 1–2.
MacNeill, H., Telner, D., Sparaggis-Agaliotis, A., & Hanna, E. (2014). All for one and one for all: Understanding health professionals' experience in individual versus collaborative online learning. Journal of Continuing Education in the Health Professions, 34, 102–111. doi:10.1002/chp.21226
Mandernach, B. J., Gonzales, R. M., & Garrett, A. L. (2006). An examination of online instructor presence via threaded discussion participation. Journal of Online Learning and Teaching, 2(4), 248–260.
Markus, G. B., Howard, J. P., & King, D. C. (1993). Notes: Integrating community service and classroom instruction enhances learning: Results from an experiment. Educational Evaluation and Policy Analysis, 15(4), 410–419.
McCombs, B. (2015). Learner-centered online instruction. New Directions for Teaching and Learning, 2015(144), 57–71. doi:10.1002/tl.20163
McConnell, D. (2000). Implementing computer supported cooperative learning (2nd ed.). London: Kogan Page.
Mehrabian, A. (1971). Silent messages. Belmont, CA: Wadsworth.
Miller, M. (2014). Minds online: Teaching effectively with technology. Cambridge, MA: Harvard University Press. doi:10.4159/harvard.9780674735996
Moore, M. (1973). Toward a theory of independent learning and teaching. The Journal of Higher Education, 44(12), 661–679. doi:10.2307/1980599
Moore, M. G. (2007). The theory of transactional distance. In M. G. Moore (Ed.), Handbook of distance education (pp. 89–105). Mahwah, NJ: Lawrence Erlbaum Associates.
Murray, M., Perez, J., Geist, D., & Hedrick, A. (2012). Student interaction with online course content: Build it and they might come. Journal of Information Technology Education, 11, 125–139.
Ni, S., & Aust, R. (2008). Examining teacher verbal immediacy and sense of classroom community in online classes. International Journal on E-Learning, 7(3), 477–498. Retrieved from http://search.proquest.com.liblink.uncw.edu/docview/210333926?accountid=14606
O'Sullivan, P. B., Hunt, S., Lippert, L., Owens, S., & Whyte, A. (2001, November). Mediated immediacy: Affiliation at distance in educational contexts. Paper presented at the annual meeting of the National Communication Association, Atlanta, GA.
Orlando, J. (2011). How to effectively assess online learning (White Paper). Retrieved from http://www.stjohns.edu/sites/default/files/documents/ir/f63bd49dcf56481e9dbd6975cce6c792.pdf
Palloff, R. M., & Pratt, K. (2010). Collaborating online: Learning together in community (Vol. 32). John Wiley & Sons.
Palloff, R. M., & Pratt, K. (2011). The excellent online instructor: Strategies for professional development. San Francisco, CA: Jossey-Bass.
Panitz, T. (1996). A definition of collaborative vs. cooperative learning. Retrieved from http://www.londonmet.ac.uk/deliberations/collaborative-learning/panitz-paper.cfm
Picciano, A. G. (2002). Beyond student perceptions: Issues of interaction, presence, and performance in an online course. Journal of Asynchronous Learning Networks, 6(1), 21–38.
Pollard, H., Minor, M., & Swanson, A. (2014). Instructor social presence within the Community of Inquiry framework and its impact on classroom community and the learning environment. Online Journal of Distance Learning Administration, 17(2), n2.
Richardson, J., & Swan, K. (2003). Examining social presence in online courses in relation to students' perceived learning and satisfaction. Journal of Asynchronous Learning Networks, 7(1), 68–88.
Richmond, V. P. (2002). Teacher nonverbal immediacy: Uses and outcomes. In Communication for teachers.
Rourke, L., & Kanuka, H. (2009). Learning in communities of inquiry: A review of the literature. Journal of Distance Education, 23(1), 19–48.
Sargeant, J., Curran, V., Allen, M., Jarvis-Selinger, S., & Ho, K. (2006). Facilitating interpersonal interaction and learning online: Linking theory and practice. The Journal of Continuing Education in the Health Professions, 26(2), 128–136. doi:10.1002/chp.61 PMID:16802307
Saxena, S. (2013). Best classroom practices for student-centric teaching. EdTechReview. Retrieved from http://edtechreview.in/news/news/trends-insights/insights/775-best-classroom-practices-for-studentcentric-teaching?goback=%2Egde_5092459_member_5810429914867843076#%21
Shea, P., Hayes, S., & Vickers, J. (2010). Online instructional effort measured through the lens of teaching presence in the community of inquiry framework: A re-examination of measure and approach. International Review of Research in Open and Distance Learning, 11(3), 127–154.
Shea, P., Li, C., & Pickett, A. (2006). A study of teaching presence and student sense of learning community in fully online and web-enhanced college courses. The Internet and Higher Education, 9(3), 175–190. doi:10.1016/j.iheduc.2006.06.005
Sheridan, K., & Kelly, M. A. (2010). The indicators of instructor presence that are important to students in online courses. Journal of Online Learning and Teaching, 6(4), 767. Retrieved from http://search.proquest.com/docview/1497198590?accountid=14606
Shin, N. (2002). Beyond interaction: The relational construct of transactional presence. Open Learning, 17(2), 121–137. doi:10.1080/02680510220146887
Short, J. A., Williams, E., & Christie, B. (1976). The social psychology of telecommunications. London: Wiley.
Skramstad, E., Schlosser, C., & Orellana, A. (2012). Teaching presence and communication timeliness in asynchronous online courses. Quarterly Review of Distance Education, 13(3), 183.
Smith, R. O. (2005). Working with difference in online collaborative groups. Adult Education Quarterly, 55, 182–199. doi:10.1177/0741713605274627
Straus, S. G., & McGrath, J. E. (1994). Does the medium matter? The interaction of task type and technology on group performance and member reactions. Journal of Applied Psychology, 79, 87–97. doi:10.1037/0021-9010.79.1.87
Sue, B. S., Sarah, C. W., & Warren, S. H. (2006). Reaching through the screen: Using a tablet PC to provide feedback in online classes. Rural Special Education Quarterly, 25(2), 8–12. Retrieved from http://search.proquest.com.liblink.uncw.edu/docview/227213654?accountid=14606
Swan, K. (2001). Virtual interaction: Design factors affecting student satisfaction and perceived learning in asynchronous online courses. Distance Education, 22(2), 306–331. doi:10.1080/0158791010220208
Swan, K. (2003). Learning effectiveness online: What the research tells us. Elements of Quality Online Education: Practice and Direction, 4, 13–47.
Swan, K., & Shih, L. F. (2005). On the nature and development of social presence in online course discussions. Journal of Asynchronous Learning Networks, 9(3), 115–136.
Tu, C. H. (2005). From presentation to interaction: New goals for online learning. Educational Media International, 42(3), 189–206. doi:10.1080/09523980500161072
Vaughan, N., & Garrison, D. R. (2005). Creating cognitive presence in a blended faculty development community. The Internet and Higher Education, 8(1), 1–12. doi:10.1016/j.iheduc.2004.11.001
Warnock, S. (2013, April 18). Frequent, low-stakes grading: Assessment for communication confidence. Retrieved from http://www.facultyfocus.com/articles/educational-assessment/frequent-low-stakes-grading-assessment-for-communication-confidence/
Wiener, M., & Mehrabian, A. (1968). Language within language: Immediacy, a channel in verbal communication. New York: Appleton-Century-Crofts.
Williams, P. E. (2000). Defining distance education roles and competencies for higher education institutions: A computer-mediated Delphi study [Doctoral dissertation].
Wisneski, J. E., Ozogul, G., & Bichelmeyer, B. A. (2015). Does teaching presence transfer between MBA teaching environments? A comparative investigation of instructional design practices associated with teaching presence. The Internet and Higher Education, 25, 18–27. doi:10.1016/j.iheduc.2014.11.001
Chapter 7
Developing a Pedagogical Framework for Simulated Practice Learning:
How to Improve Simulated Training of Social Workers who Interact with Vulnerable People

Nikolina Tsvetkova, EI Centre, Bulgaria; Albena Antonova, EI Centre, Bulgaria; Plama Hristova, EI Centre, Bulgaria
ABSTRACT

While simulated learning is becoming an attractive learning method for learners and educators, it is the pedagogical framework behind the technology design that makes the learning efficient. Thus the context and the subject domain, along with learning theories, largely influence its impact. Working with vulnerable people is becoming part of the specifics of many jobs. Therefore, the main goal of the chapter is to present a pedagogical framework for simulated practice learning for social workers who interact with vulnerable people. It takes into consideration both theories of learning and the features of games-based learning. It also outlines the relations between the broader social context, the particular educational setting, and the learner, the trainer, and the vulnerable person. The focus of the presented simulated learning is on teacher training for child-care professionals who work with 3- to 7-year-old children. The pedagogical framework was developed under the Simulated Practice for Skills Development in Social Services and Healthcare - Digital Bridges project (2014-1-UK01-KA200-001805).
DOI: 10.4018/978-1-5225-1851-8.ch007
INTRODUCTION

Healthcare and social services have gained significant attention over recent years and will become even more important in view of the aging of society and the increasing problems of economic downturns, unemployment, migration, and social exclusion. Thus the healthcare and social systems in all EU Member States face similar objective requirements to adjust to demographic problems such as ageing, societal and economic changes, migration, and social pressure. Moreover, these pressures frequently demand that professionals adapt quickly to, and expertly handle, various real-life situations involving interaction with vulnerable people. Accordingly, many social workers need to enhance their competences for working with such vulnerable groups by attending specially designed programs and on-the-job training. Therefore, investigating computer-based learning, simulated learning, and game-based learning models in the context of working with vulnerable people can bring many benefits, such as improving access to learning, learning adaptability, and problem-oriented learning. Social and pedagogical work, viewed in a professional context, is a complex phenomenon which combines a number of activities, functions, and professional roles generally aimed at supporting people in coping with difficult life situations and striving for successful functioning in society. Its main aim is to satisfy the socially guaranteed as well as the personal interests and needs of people from different social strata. In this sense it can be discussed as an integrative activity targeting society as a whole. On the other hand, it can be seen as an activity which targets a smaller social group or an individual in a difficult situation (Pavlenko, 2010). Often, people who are subject to social and pedagogical work are referred to as a "vulnerable population." The notion of vulnerability is researched in depth and explained in detail from a healthcare point of view in relation to susceptibility to different health conditions, access to health care, or as "vulnerability by virtue of status" (see de Chesnay & Anderson, 2016). In an EU context, special emphasis is placed on sustainable and adequate reforms of social protection systems, active inclusion strategies, and well-designed universal and targeted benefits, and most often migrants, ethnic minorities, people with disabilities, the homeless, and similar groups are seen as vulnerable (European Commission, 2011). Dealing effectively with such social groups, as well as improving life circumstances and equal access to services and protection, is viewed as a major factor in achieving social cohesion, hence the need to provide up-to-date initial and in-service training for social work and pedagogy professionals. Simulation learning has gained in popularity and is being used more frequently in a number of health disciplines, including social care (Wiseman, Haynes, & Hodge, 2013). According to Cosman et al. (2002), simulations offer a number of important advantages, such as being available regardless of time and being usable throughout the development from novice to expert. They also lend themselves to being rehearsed before being assessed and allow for risk-free training. Furthermore, when computers are used, a record of previous performances of a procedure can be compared to future attempts, and thus the trainee can obtain effective feedback.
In addition, games-based learning has also grown in popularity and has become recognized as a potentially engaging (motivating and rewarding) and novel (innovative and more interactive) form of learning. It has been applied in a number of different areas such as physics (Anderson & Barnett, 2013), health and well-being (Farrel et al., 2011), multiculturalism, tolerance, and solidarity (Furió et al., 2013), promotion of social skills and bullying prevention (Rubin-Vaughan et al., 2011), nutrition (Baños et al., 2013; Yien et al., 2011), music (Çoban & Tuncer, 2008), mathematics (Bakker et al., 2012), science (Wang, 2008), and language learning (Yang, Chen, & Jeng, 2010; Connolly et al., 2010).
Some specific game examples also include successful results in changing attitudes, for example in dealing with xenophobia and prejudices, which can be very important for professionals dealing with vulnerable people. The main goal of the present chapter is to discuss the process of developing a pedagogical framework for simulated practice learning for social workers who interact with vulnerable people, developed under the ERASMUS+ Simulated Practice for Skills Development in Social Services and Healthcare - Digital Bridges project. The chapter objectives are to investigate how certain professional competences and skills can be further developed and extended through simulated practice in a game-based learning setting and, more specifically, to identify the key constraints for developing a pedagogical framework for a simulated practice learning environment. The framework takes into consideration both theories of learning and the features of games-based learning and serious games. It also outlines the relations between the broader social context, the particular educational setting, and the learner, the trainer, and the vulnerable person. A particular immersive game is presented as an example of a simulated practice learning environment for child-care professionals dealing with children (3 to 7 years old) in a day child-care center. The results of piloting the afore-mentioned simulation environment with roughly 150 trainer and trainee participants in Bulgaria are presented and discussed. The first part of the chapter contains a short overview of the theoretical concepts behind computer-based learning and simulations, including theories of learning, game-based learning, and the features of serious games. It also outlines some of the specific problems in the area of working with vulnerable people. The second part provides more details about the developed Pedagogical Framework model, its main components, and its bounding elements. Finally, some of the outcomes of a real-life implementation of the pedagogical framework for simulated learning, and highlights from the pre- and post-pilot feedback received during the e-learning training, are presented. In the discussion part some of the key findings about the learning process are outlined, and in the conclusion section some further research problems are raised.
BACKGROUND

Naturally, people associate learning with formal education processes and practices, without acknowledging that mass educational institutions and schools were designed back in the 18th and 19th centuries, long before the complex process of how people learn was understood (Sawyer, 2006). Thus, in traditional classroom practice, knowledge is considered a collection of facts and procedures, and the goal of the educational system is simply to transmit these facts and procedures to the learner. Nowadays, research on learning and knowledge acquisition is progressing and proposes many sophisticated approaches focusing on how to conceptualize and organize learning in order to make it more efficient, effective, and user-oriented. Consequently, in the context of present-day e-learning and technology-enhanced learning systems, the following psychology-based learning theories are widely discussed and implemented (Hammond et al., 2001): behaviorism, cognitivism, constructivism, and social constructivism (Moedtricher, 2006; Hung, 2001; McLeod, 2003). Other researchers focus on the theory of "connectivism" and even explore the idea of "eclecticism" to argue that modern learning theories are actually converging. Other studies, for example De Jong (2013), investigate new approaches to learning such as "Learning by Design," combining principles of inquiry and collaborative learning, including learning about the domain knowledge, learning about the inquiry process, and learning about cultural aspects and cultural differences. Despite these emerging views, the basic learning theories best illustrate how the process of learning and knowledge acquisition can be decomposed and understood as a complex and context-related matter.
Furthermore, learning theories have been considered not as competing but as complementary strategies, covering various specific learning situations and contexts. Visualizing this, Illeris (2003) proposes a comprehensive framework that combines the variety of learning theories appropriate for adult learning. He sets out a framework based on three main learning dimensions: cognition, emotion, and environment, and demonstrates that the whole of the learning process is situated in a social context. In his framework, the (adult) learner is the one who is in control of the learning situation. While learning processes enable people to acquire knowledge and skills in order to cope with challenges and to make rational decisions, they also have a substantial role in forming attitudes, perspectives, insights, and shared understandings. They enable the learner to perform desired functions with proficiency, i.e., demonstrating a certain acquired level of competence. Therefore, first-hand personal experience and active learning methods of trial and error represent a substantial component of learning and knowledge acquisition. Thus, the most powerful learning comes from direct experience (Senge, 2006), and proactive, experience-gaining learning based on reflective internalization of the experience provides a better basis than reactive and passive learning. Waterworth & Waterworth (1999) outline some differences between perceptual and conceptual learning, pointing out that conceptual learning is theoretical and generalized learning about "there and then," while perceptual learning concerns "here and now." Perceptual learning is closer to the learner and more quickly transforms knowledge from conscious to unconscious (and thus to tacit knowledge and expertise). Recognizing the advantages of active and experience-based learning, a number of researchers and practitioners in the field of e-learning and game-based learning point to Kolb's learning cycle as an appropriate learning model, focusing on learning-by-doing and further reflection on and evaluation of the learning experiences. This model of learning most directly puts reflection in a central position. Kolb's experiential learning cycle (1984) includes four phases (Figure 1): gaining experience, review of experience, reflection on outcomes, and conceptualizing feedback (Kolb, 1984).

Figure 1. Kolb's learning cycle (1984)
Kolb's model corresponds well to the fast-changing environment and the lifelong learning approach, where learning is problem-oriented, situational, and experiential. Consequently, using Kolb's learning model can be seen as beneficial for training social and healthcare workers who interact with vulnerable people. Kolb's learning cycle is likely to promote a reflective approach to learning and experience-gaining and, in general, to lead to the development of the desired professional competences.
PEDAGOGICAL FRAMEWORK FOR SIMULATED PRACTICE LEARNING

Learning Theories, Observational Learning and Reflection

When discussing learning theories and knowledge acquisition with interactive media such as computer simulations and games, some of the models of implicit knowledge construction and reflective thinking should be reconsidered. A pedagogical framework for professionals interacting with vulnerable people should therefore identify how social and healthcare workers can combine simulation and game-play to enhance practice-oriented learning. Reviewing Bandura's Social Cognitive Theory (as cited in Bryant & Zillman, 2002), we can identify the social mechanisms of cognition and the social origins of thought and reflection as the mechanisms through which social factors influence knowledge acquisition. First, observational learning, or models of vicarious and social verification, enables people to implicitly (unintentionally) acquire new knowledge without having direct experience or applying direct cognitive rules. Thus, observational learning influences the adoption of rules that can be used to judge reality or to generate new instances of behavior. Consequently, implicit knowledge construction through observational learning can make games effective in developing new ways of thinking and new learning behaviors. Acquisition of generative rules from modeled information involves three main processes: extracting generic features from various social exemplars, integrating the extracted information into composite rules, and, finally, using the rules to produce new instances of behavior. Through abstract modeling, people acquire standards for categorizing and judging events, linguistic rules of communication, thinking skills, and personal standards for regulating motivation and conduct. Second, reflection is identified as an important condition for learning from experience. Reflection during a learning process helps students monitor themselves, change the objectives and structure of their activities, communicate and cooperate with other people (clients, colleagues) and sources of information, and analyze the achievements and limitations of their activities or learning (Jucevičienė, 2007). Johns (2009) describes reflection as a careful, detailed, and realistic look into the mirror, where a practitioner/learner can see himself/herself in a certain situation with all his/her actions and feelings relevant to that situation. A need for reflection occurs when a practitioner/learner faces a new, complex situation which requires a solution. New decisions are made and new insights are acquired when the learner/practitioner has to look at this complex situation again from another perspective. It is important that the practitioner/learner be able to reflect not only on new situations and on how he/she has managed to deal with them; reflection on any professional action should also involve all professional knowledge, including theoretical models, already acquired skills, and personal attitudes. Eventually, such introspection becomes a professional skill of everyday practice and encourages individuals to be more self-aware and confident.
Reflection is a process which helps the learner/practitioner grow in his/her professional field. According to the International Association of Schools of Social Work and the International Federation of Social Workers (2004), a significant part of study programs must be allocated to students' practical preparation and to the development of reflection skills.
Game-Based Learning and Serious Games

Games have been used in educational processes from an early age, as they allow learners to acquire knowledge and skills in a more natural, context-related environment, in a playful and less stressful manner. They are often associated with fun, providing not only different kinds of knowledge, competence, and skill acquisition but also allowing people to interact, to socialize, and to explore and test both reality and others. Computer simulations and computer games allow people to be actively involved in educational processes. Thus the learner is not only an observer in the learning process (as compared to using other educational media), but can take part in a number of activities and decision-making, learning from his or her own experience and participation in the process. Computer games-based learning has been defined as the use of a computer games-based approach to deliver, support, and enhance teaching, learning, assessment, and evaluation (Connolly & Stansfield, 2007). Further definitions of games-based learning often overlap with or extend the terms e-learning, "edutainment" (from education and entertainment), serious games, video games, and games-based learning (Susi et al., 2007; De Freitas, 2008). Serious games are commonly described as (digital) games used for purposes other than mere entertainment or fun. They usually refer to games used for training, advertising, simulation, or education that are designed to run on personal computers or video game consoles. With the wide implementation of sophisticated mobile applications, even more dynamic mobile gaming models are emerging. A substantial difference between games and computer simulations should be noted. Computer simulations may provide less involvement of the end-users (learners) in comparison to computer games, but they have strong educational value for observational learning and for illustrating complex relationships. However, the level of end-user interaction in computer simulations is much less important for knowledge acquisition processes. The logic of serious games (SGs) and training simulations is to develop complex scenarios in which learners can develop skills by coping with a number of challenging situations. In serious games, Kolb's learning cycle is adopted (Antonova & Todorova, 2010), with learning developed through a number of trial-and-error situations. Building successful serious games requires the synchronization of multiple elements (game mechanics, an appealing graphic environment, engaging scenarios), and therefore achieving a good mix of learning elements can prove very difficult. Moreover, expert knowledge should be incorporated in good quality and form within the game scenarios and game elements in order to form a learning path. Expert knowledge is thus crucial in making the learning simulation useful and meaningful to learners and in putting them in situations where they can substantially build new skills. On the other hand, the design of a serious game should fulfill several objectives, namely to transfer knowledge and to develop skills and desired attitudes, while at the same time remaining enjoyable and engaging the personality of the player. Serious games are playful, engaging, and interactive alternatives to more passive media. They are context-related and involve learners in the educational content, leading to a successful and rich learning experience. Below are some important characteristics of serious games (further elaborated on in Antonova & Martinov, 2010):
• Learners become creators and have control over the play.
• Learners get more responsibilities and develop social skills.
• Learners fully engage with their role in the game.
• SGs provide tools for self-expression and improve computer literacy.
• SGs enable learners with physical and communicative needs and make passive learners more active.
• SGs remove barriers for learners with difficulties in traditional learning.
• SGs allow collaboration and sharing of knowledge.
• SGs better illustrate some concepts of complex processes.
• SGs implement specific contexts and relationships not achievable in a traditional learning setting.
• SGs provide a high degree of entertainment.
Investigating the different learning processes occurring in serious games, the following learning models can be identified:

• Learning by Exploring: Learning through exploration is a type of experiential learning facilitating both knowledge acquisition and the synthesis of contextual and situational knowledge.
• Learning by Collaborating: SGs support learning in collaboration and team-work, or learning collaboratively as opposed to learning competitively.
• Learning by Being: SGs enforce learning by exploration of self and of identity. Assuming different identities can enlarge an individual's perception of a situation, leading to improved cognitive and communication skills.
• Learning by Expressing: SGs can facilitate the development of expression competences, externalizing experiences from games and simulations to the real world.
The observed learning models are usually adopted in combination, meaning that learners get complex experiences while exploring, communicating and role-playing within specific learning scenarios. This shows that SGs mainly support active learning, requiring learners to be involved in the educational activity rather than to receive information in an explicit form. In SGs, learners play the leading role in the learning process as they are involved in complex, open-ended cognitive activities. The trainer's role is transformed from that of knowledge provider to that of adviser and facilitator, guiding and helping students.
Serious Games as Simulated Learning Platforms

Serious games can be applied in different domain contexts. The main elements of SGs include: back story (plot/story line), game mechanics (specific physical functions and actions within games), rules (constraints in the game play), immersive graphic environment (including 2D/3D graphics, sound, and animation), interactivity (impact of the player's actions on the game), and challenge/competition (the "heart" of any game: competition against the game, against oneself, or against other players). SGs have a prewritten set of actions (scenario) that a player must accomplish in order to get some positive result, following predefined rules and constraints. Usually, players can get help and instructions and are "assisted" while playing the game. Many SGs provide feedback and analyses of game results, so that players can better understand their performance and improve it. SGs are often displayed in an appealing graphical environment and propose an amusing back story. This makes SGs suitable for learning purposes, especially in active learning (focusing on the learner), for testing competences and skills, developing critical thinking, testing scenarios, profiling, self-learning and others. As a downside, SGs are considered unable to easily integrate theoretical knowledge. Since serious games provide a stimulating environment which impacts on knowledge and skills acquisition as well as on the adoption of desired behaviors and attitudes, it seems self-evident that serious games provide a natural environment in which to learn the necessary skills for complex work. Combining a complex simulation environment with a specially designed serious game can therefore be considered extremely well suited to practice learning, especially when the latter is connected to experiencing situations that are impossible or difficult to achieve in the real world for reasons of safety, cost, or time (Corti, 2006; Hainey et al., 2014).
Defining the Pedagogical Framework

From a methodological point of view, a pedagogical framework is a basic document which sets the instructional design of a particular learning process and identifies the main roles and concepts behind the learning scenarios. With the implementation of new technologies, pedagogical frameworks need to become more flexible and open in order to effectively support individual learning approaches and learning contexts (Dimitrova et al., 2004). Moreover, the effectiveness of such frameworks has to be tested in different educational contexts in order to understand how they reflect specific subject matter and learning contexts. Specialists often seek to establish a generic model of pedagogic interventions which can then be applied in various concrete settings (i.e., adapted according to professional, national, local, cultural and other demands). A very successful example in this respect is the Web 2.0 Pedagogical Framework presented in detail by Baxter et al. (2011). That framework provides guidance in setting up, reflecting on and evaluating learning experiences of any format and duration, from teacher and trainer training to activities designed for regular classroom use, and has also fed into the model of simulated practice learning discussed here. The objectives of the Pedagogical Framework for Simulated Practice Learning are, in the first place, to investigate the underlying theories of learning with an emphasis on games-based learning and simulations. Secondly, it underlines the professional competences which should be developed in the specific subject domain of social workers interacting with vulnerable people. Thirdly, the Pedagogical Framework should investigate the roles of the learner(s), the roles of the trainer(s) and the roles of the learning material. Among the key features of the Pedagogical Framework is that it has to provide the tools to enhance virtual learning environments in personally relevant and stimulating ways that promote autonomous and self-directed learning. Computer-assisted learning environments should also foster learner-to-learner and learner-to-tutor interaction and collaboration (Dimitrova et al., 2004; Baxter et al., 2011).
The Pedagogical Framework for Simulated Practice for Social Workers Dealing with Vulnerable People

The pedagogical framework (or the Framework) for simulated practice for social workers dealing with vulnerable people has been developed taking into consideration the theories of learning discussed above and the features of games-based learning and serious games. It also accounts for the relationships between the broader social context, the particular educational setting and the learner, the trainer and the vulnerable person. As demonstrated in Figure 2, the Framework also displays features such as a focus on the learner, flexibility, transparency and consistency. The different areas of the pedagogic framework form complex relationships of dynamic mutual dependences and are presented briefly below.

Figure 2. Pedagogical Framework for Simulated Practice for social workers dealing with vulnerable people
Social Context

The outermost layer of the framework refers to the social context, which defines in practice what part of society should be considered in a vulnerable position and regulates how this part of society should be treated, taken care of, interacted with, etc.
Educational Setting

The next area of the framework refers to the educational setting: a vocational education and training, higher education or further education institution or training provider which deals with initial and/or in-service instruction and development of specialists working with vulnerable groups.
Simulated Practice

The simulated practice is contained in the educational setting and corresponds to an institution's established modes of training and instruction delivery as well as to the set learning and assessment targets.
Acquired Competences

The acquired competences of the practitioner dealing with vulnerable people are an equally important part of the second area of the framework. On the one hand, they should be seen as the result of the established rules and patterns of teaching and learning at the particular educational institution. On the other, they are closely related to the set learning and assessment targets (all of which are informed by the broader social context). Last but not least, the acquired competences allow a practitioner working with vulnerable people not only to complete their practice training successfully but also to do their job of interacting with vulnerable people in the broader social context.
Domain Field Competences and Domain Context

Professional competence in the context of social and pedagogical work is the realization of a complex mix of theoretical knowledge, practical skills and personal qualities and attitudes. Practicing social work requires solid theoretical and methodological knowledge about social realities; the methods, means and forms of social and pedagogical work; personal development; the features of different social groups; the legal basis for carrying out social work, etc. Another important aspect is the presence of specific practical skills to ensure the acquired knowledge is put into practice. These skills are also complex and diverse: to understand the reasons for an existing situation or demonstrated behavior; to make an accurate and objective diagnosis or judgment; to offer adequate support and help; to create relationships based on mutual trust and respect; to work in a team; to administer and manage the processes, etc. The effectiveness and success of a professional is closely related to their personal qualities. Qualities necessary for functioning as a social worker or a social pedagogue include psychological and emotional stability, analytical and critical thinking, creativity, initiative-taking, responsibility, honesty, openness, resourcefulness, discretion, etc. No less important are the attitudes of the individual, central among which are humaneness, a sense of belonging to society, etc. (Sapundzhieva, 2011). Closely related to the Pedagogical Framework is the model curriculum for the Childhood Practitioner, as an example of a specialist working with vulnerable people, which has also been developed under the Digital Bridges project. In the context of the Curriculum, a childhood practitioner is a worker whose key function is to work in a nursery setting, providing and developing play-based learning opportunities for children and supporting children's social and emotional well-being and physical and cognitive development. The document outlines the aim of training, performance expectations and expected learning outcomes, the latter organized around three main areas: Knowledge and understanding, Application of practical and professional skills through reflective practice, and Transferable skills. The curriculum facilitates the practical implementation of the framework also in that it helps define the best tools to achieve the set learning targets for the simulated practice learning. Thus it feeds directly into the 3D learning environment created by the Tiny Oaks simulation game, which is discussed below. Any change in the social context leads to a change in the educational environment, the simulated practice and the acquired competences. The acquired competences which should be aimed at are defined in view of all the above and in close relation to the "end-participant" in this complex set of relationships, which in our case is a vulnerable person. Thus there is no invariable set of competences; they change according to the social context (including the current legal constraints), the educational institution and the concrete profile of the vulnerable group. For instance, in cases where the vulnerable person is a child (as is indeed the case with the Digital Bridges project), the core set of competences for childcare professionals has been defined as belonging to one of the three groups below:

1. Communication
   a) Know how to adapt the way you communicate
   b) Understand ways in which children may use play to communicate
   c) Know how to support children to cope with their feelings
2. Well-Being and Resilience
   a) Understand ways to encourage emotional well-being, confidence and resilience
   b) Understand ways of encouraging children to make choices, whilst at the same time making them aware of how their actions can affect others
   c) Know how to adapt your practice to ensure that all children can take part equally
3. Health and Safety
   a) Understand different kinds of incidents and emergencies that might arise in a childcare setting
   b) Understand how to support children during an emergency
   c) Understand how to summon assistance appropriate to the emergency
Pedagogy and Learning Theories

The relevant pedagogies behind the game scenarios make it clear how the immersive game and the simulated practice should be planned and carried out so as to achieve outcomes pertinent to the particular social context.
3D Immersive Game "Tiny Oaks"

Under the Framework, in the context of the simulated practice learning, the focal point of interaction between a Learner, a Trainer and a Vulnerable Person is a particular 3D immersive game, which is constructed according to the specifics of games-based learning outlined above. In the practical implementation of the Framework reported on below, the simulation game is entitled "Tiny Oaks". It is aimed at pre-service and in-service practice training for childhood practitioners. Most notably, this game is emotionally engaging, allowing the learners to experience real-life situations involving children (who represent vulnerable people in this case) and to take decisions about their own actions under such circumstances.
Figure 3. Snapshots from the 3D Game “Tiny Oaks”
According to the cognitive task analysis done under the Digital Bridges project, the game is also visually appealing and not technically challenging, thus making it non-dependent on the users' (learners') technical skills. The "Tiny Oaks" game is based on 10 game scenarios which in turn are based on real-life situations involving interaction with vulnerable people (Figure 3). They aim at developing specific competences so that the learner is able to:

• Recognize specific situations while interacting with vulnerable people
• Identify a specific set of activities: what to do and what not to do
• Develop skills and attitudes for interacting with vulnerable people in specific situations (risk settings)
• Train competences to cope with critical incidents
The non-player characters (NPCs) take part in the game and interact with the player, as part of the scenario, in order to:

• Provoke specific actions from the player
• Provide information and give instructions
• Give feedback
• Give advice
• Provide help
• Simulate different stakeholders
The knowledge objects represent game elements that are available to the learner throughout the game. These can include instruction guides available during the game, reference web sites, instructional videos, a help button or panic button, etc.
An important feature of the Game is the inclusion of quizzes, puzzles and games-in-the-game, such as:

• Pre-tests and post-tests
• Gaining points for the game
• Passing from level to level, etc.

The Game also confronts learners with critical incidents, which usually are rare events but:

• Need a professional attitude
• Aim to provide training and prepare the learners for action in specific cases
All of the above are included in the Tiny Oaks scenarios and work together in a way that ensures the achievement of the learning goals, the acquisition of particular professional competences, etc., predefined in the Model Curriculum. An important asset of the Game is that it is multilingual (for the time being it can be run in Bulgarian, English, Finnish, Italian and Lithuanian, with subtitling in German and French). The introductory scenario allows learners to get to know the game's non-player characters (NPCs), the nursery setting and the game controls, as well as the learning diary and the built-in self-assessment tools. The next scenario is devoted to practicing knowledge and skills related to ensuring a physically safe environment for children in a nursery setting and does not contain elements of interaction with NPCs. The third scenario involves observing two kids in a situation of low-level aggression and requires players (learners) to resolve it in a professional manner. The fourth scenario involves a child suffering an injury due to safety issues with the nursery's physical environment. It requires taking care of the injured kid and of any other kids who may have witnessed the accident, and after that taking action to report on what has happened according to the professional requirements. The fifth and sixth scenarios bring in the question of what role kids' families play in a nursery and of how to communicate with both kids and their parents. The seventh scenario is again connected to a conflict situation, this time involving a group of children, and to resolving it. The eighth scenario poses a situation involving a child with a specific health problem, which requires a childhood practitioner both to find a way to tend to the child's health problem and to consider his self-respect. The ninth scenario requires the player (learner) to act in a stressful situation for the child and to reassure and comfort him. The last one deals with children's real-world play and asks the player (learner) to organize a meaningful game for the kids in the nursery.
The Learner

The learner is any person involved in simulated practice learning at a pre-service, in-service or continuous professional development level, delivered by an HE or VET institution or an institution providing informal education for social workers and practitioners dealing with vulnerable people. The learner is actively involved in the learning process and, through the immersive game, is able to experience different situations of interacting with vulnerable people, requiring him or her to evaluate the situations, the possible consequences of all participants' actions, the impact on the vulnerable people involved and the effect on society as a whole. In addition to playing the Game, the learner keeps a reflective journal in which he/she records thoughts, ideas and questions about the game scenarios and the actions taken during the game. Learners also participate in further pre- or post-play game-driven and reflective activities, set up by the Trainer in accordance with the concrete learning aims and the social context as a whole.
The Trainer

The trainer interacts with the learners through the 3D immersive game and plays the roles of an observer, a resource, a provider of feedback and an assessor. The trainer can decide to intervene during the game play or to remain silent and offer advice and support only if asked by the learner. He/she can also guide the learning process by asking probing questions before or after the game play or by following the reflective journals kept by the learners and giving feedback on them.
The Vulnerable Person

The vulnerable person (a child in the case of the Digital Bridges project) is one of the NPCs (non-player characters) in the game. The learner gains knowledge of real-world situations, and develops his/her skills and professional competences in them, by interacting with the vulnerable people represented in the Game. Ultimately, the aim of playing the 3D immersive game is to prepare learners to interact successfully with vulnerable people from a professional point of view. The learning cycle supported by the developed Pedagogical Framework is organized around the four main phases of Kolb's model, which can be supported both in the game and in offline class or group settings (Figure 4). The stages of the learning cycle under the proposed pedagogical framework are thus observe, interact, reflect and apply.
Figure 4. The learning cycle – learning process experience within the simulated practice
Observe

Within the immersive 3D game, learners have the opportunity to observe the relations between the settings, the vulnerable person, the actions of the players and the NPCs. There are built-in tools for recording these observations.
Interact

Learners are also immersed in communicating verbally and non-verbally with players and the NPCs, applying what they have acquired while taking in (observing) the environment. They may also interact with their trainers and ask them for help or consider their feedback in order to achieve better results in the game.
Reflect

The experiences of observing and interacting within and outside the game scenarios are then reflected on, and conclusions are drawn regarding one's own performance as well as acceptable/unacceptable behaviors and successful/unsuccessful or desirable/undesirable interaction with vulnerable people. Reflections are registered through the in-game tools (the provided reflective journal space) or another medium and can be shared with the trainer(s). These comments can then be collected and further class discussions can be organized, using additional materials, case studies, best-practice guidelines and videos.
Apply

During the last phase, the user can replay the learning scenario, putting into practice the newly acquired knowledge and competences.
EMPIRICAL RESEARCH INTO THE POTENTIAL OF SIMULATED PRACTICE LEARNING IN A BULGARIAN CONTEXT

The pedagogical framework was tested successfully in a number of learning situations in all partner countries. In Bulgaria, several blended learning and entirely e-learning courses, based on the Digital Bridges platform, the simulated learning game and supportive learning materials, were organized from February 2016 to April 2016. During several train-the-trainer sessions delivered in a blended format in December 2015, the learning model and learning materials were widely discussed and presented to trainers and end-users. The immersive educational environment was explored individually by the learners. The face-to-face meetings thus showed that the proposed pedagogical framework was well adapted to the requirements of a simulated practice learning experience for pedagogy students and child practitioners. The study presented here was conducted as part of professional training at both pre-service and in-service levels and aims at verifying the potential of simulated practice learning in the training of child-care specialists. The training in Bulgaria took place between February and April 2016. It involved university students and practicing teachers from pre-primary education establishments in the capital of Bulgaria, Sofia, and beyond. At the heart of the trainings was the Tiny Oaks 3D game, supported by additional learning activities, part of which were aimed at introducing the game, its characters and the game scenarios, while the rest were mainly aimed at reflection and the further development of participants' sector-specific skills and attitudes. The training followed the learning cycle proposed in the Framework outlined in the previous section of this chapter. Based on the previously presented theoretical analyses and empirical research, the authors aim at studying the potential of the simulated practice for preparing professionals who work with vulnerable people. Two hypotheses have been raised with regard to the conducted study, which employed simulated practice learning in the training of Bulgarian child-care specialists. First, there is a significant change in participants' professional knowledge and skills following participation in the conducted simulated practice training. Second, participants' gaming experience may influence their opinion on the usability of the simulated environment for learning.
Participants

The research sample consists of 131 subjects who completed the Digital Bridges project trainings in Bulgaria in 2016. The sample is dominated by women (124 women and 7 men), which is consistent with the profile of professionals working with vulnerable people. Thirty-seven percent of the participants live and work in Sofia (the capital city) and the remaining 63 percent are from other cities and towns in Bulgaria. The average professional experience is 18 years, ranging from 1 to 44 years. Most of the respondents are employed (85 percent) and the rest are students (15 percent). A further question captured the subjects' experience in playing computer games. Based on their answers, they fall into the following groups: individuals who do not play computer games (47 percent), occasional players who play less than 1 hour per week (29 percent), medium players who play from 1 to 7 hours per week (19 percent) and frequent players who play more than 7 hours per week (only 5 percent).
Data-Collection Tools

The pilot study was conducted by employing a questionnaire designed specifically for this project in two versions, a pre-training and a post-training one. The pre-training form of the instrument contains demographic questions followed by statements distributed across the three areas of professional knowledge and skills discussed previously in this chapter: knowledge and understanding (nine items), application of practical and professional skills through reflective practice (four items), and transferrable skills (three items). They comprise statements concerning what the respondents think they know and are able to do in relation to their work prior to the training. The post-training form of the questionnaire contains the same three areas of statements and a fourth, additional one dealing with the usability of the simulated environment (ten items). These statements are directed at what the respondents know and are able to do after they have attended the training and whether they are willing to use the simulated environment in the course of their work. The response categories for all items were a five-point Likert-type scale where 1 is "Strongly agree" and 5 is "Strongly disagree". The reliability of the subscales of the pre-training and post-training forms of the questionnaire is very good (see Tables 1 and 2 for descriptive statistics and the reliability of each subscale in both forms of the questionnaire).
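Since subscale reliability is central to the instrument, a brief illustration may help readers less familiar with the statistic. The sketch below shows how Cronbach's alpha for a subscale of Likert items could be computed; it is a minimal Python example using the standard alpha formula, and the simulated response matrix (and the common latent factor built into it) is purely hypothetical rather than the project's actual data or analysis code.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, n_items) matrix of item scores."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]                               # number of items in the subscale
    item_variances = items.var(axis=0, ddof=1)       # variance of each single item
    total_variance = items.sum(axis=1).var(ddof=1)   # variance of the summed subscale score
    return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

# Hypothetical data: 131 respondents, 9 items on a 1-5 Likert scale, generated so
# that all items share a common latent factor (which is what drives alpha upward).
rng = np.random.default_rng(42)
latent = rng.normal(0.0, 1.0, size=(131, 1))
noise = rng.normal(0.0, 0.7, size=(131, 9))
responses = np.clip(np.rint(3 + latent + noise), 1, 5)

print(round(cronbach_alpha(responses), 2))  # roughly .9 for data simulated this way
```

Applied to the real pre- and post-training responses, a computation of this kind underlies the subscale reliabilities reported in Appendix 1 and Appendix 2 (for example, .93 and .91 for knowledge and understanding).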
Procedure

The test procedure followed a pre-test/post-test experimental design. The participants were asked to fill in the pre-training form of the questionnaire before playing the game, and then they were introduced to the underlying pedagogy and the aims of the simulated practice training. After that they became acquainted with some characteristics of the game, which led to the next logical step: playing the Risk-Assessment mini-game and then interacting with all the game scenarios, which involved reflection on the choices made during the game. Finally, the participants were asked to fill in the post-training form.
Results

For the purposes of hypothesis testing, two types of statistical analyses were employed: a paired-samples t-test and a one-way analysis of variance (ANOVA). The results of these analyses are presented below. A paired-samples t-test was conducted to evaluate the impact of the simulated environment training on trainees' scores on the three subscales of the pre- and post-training versions of the questionnaire (Knowledge and understanding, Application of practical and professional skills through reflective practice, and Transferrable skills). For the first subscale, reflecting knowledge and understanding, there was a statistically significant decrease in the scores from Time 1 (M = 14.92, SD = 4.53) to Time 2 (M = 12.81, SD = 3.56), t(130) = 4.10, p = .000 (two-tailed). The mean decrease in the subscale scores was 2.11 with a 95% confidence interval ranging from 1.09 to 3.13. The eta squared statistic (.11) indicated a moderate effect size. The first subscale is designed to measure the subjects' opinion of their own professional knowledge and understanding of children's safety, well-being, needs and development, of effective communication with children and parents, and of the enrichment of their own professional experience through reflection and training. Taking into consideration that the sample consists mainly of subjects employed in the childcare sector (85 percent), it can be assumed the sample is already highly qualified. This specific demographic characteristic potentially explains the higher score prior to playing the game but, still, it does not explain the decrease in the score after attending the simulated environment training. An important factor in the interpretation of this result could be the training focus. Since reflexive learning is the core of the training, and the participants were actively engaged in reflecting on both the separate situations and choices within the scenarios and on the game as a whole, this may account for the observed decrease in the participants' certainty and confidence in self-evaluation after the training. For the second subscale, measuring the application of practical and professional skills through reflexive practice, there was a statistically significant decrease in the scores from Time 1 (M = 7.50, SD = 2.03) to Time 2 (M = 6.40, SD = 1.70), t(130) = 4.48, p = .000 (two-tailed). The mean decrease in the subscale scores was 1.09 with a 95% confidence interval ranging from .61 to 1.58. The eta squared statistic (.13) indicated a moderate effect size. This subscale expresses the successful linking of child-care practitioners' activities with their everyday work with children, as well as constant thinking about and systematic observation of their practices with children. As indicated above, the sample consists of highly qualified respondents, and it may seem surprising that the trainees' opinions on the application of practical and professional skills through reflexive practice showed a decrease after the training. However, this should be interpreted as a result of the game provoking participants to reflect more profoundly on their everyday work with children and parents, thus leading to more critical self-evaluation.
The third subscale, measuring transferrable skills, again revealed a statistically significant decrease in the scores from Time 1 (M = 6.03, SD = 1.79) to Time 2 (M = 4.99, SD = 1.38), t(130) = 5.34, p = .000 (two-tailed). The mean decrease in the subscale scores was 1.04 with a 95% confidence interval ranging from .65 to 1.42. The eta squared statistic (.18) indicated a large effect size. This subscale has items measuring the participants' skills in demonstrating their ideas and in objectively evaluating their effectiveness in work with children and parents. Most probably, the highly experienced study group demonstrated its high professional self-esteem in the scores before the training; after playing the computer game the trainees became more critical in their self-evaluations. The second hypothesis proposes that participants' gaming experience may influence their opinion on the usability of the simulated environment for learning. To test it, a one-way analysis of variance (ANOVA) was conducted. It showed no statistically significant influence of the frequency of computer game playing on the trainees' opinion of the simulated environment for learning.
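As a quick arithmetic check on the effect sizes reported above, eta squared for a paired-samples t-test can be recovered from the t value and the degrees of freedom alone, using the standard relation eta squared = t^2 / (t^2 + df). The short Python sketch below simply transcribes the reported t values (it is an illustration, not the project's analysis script) and reproduces the .11, .13 and .18 figures.

```python
# Eta squared for a paired-samples t-test: eta^2 = t^2 / (t^2 + df)
df = 130  # 131 participants in a paired (pre/post) design

reported_t = {
    "Knowledge and understanding": 4.10,
    "Application of skills through reflective practice": 4.48,
    "Transferrable skills": 5.34,
}

for subscale, t in reported_t.items():
    eta_squared = t ** 2 / (t ** 2 + df)
    print(f"{subscale}: eta squared = {eta_squared:.2f}")

# Prints 0.11, 0.13 and 0.18, matching the moderate, moderate and large
# effect sizes reported for the three subscales.
```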
Discussion

The results of the survey conducted on a pre- and post-training basis during the pilot with Bulgarian higher education students and child practitioners already in service are very interesting and provoke further research questions. The sample at this piloting stage was relatively small, encouraging replication work with larger samples consisting of more higher education students. The paired-samples t-test yielded clear confirmation of the first hypothesis, showing that there is a significant change in participants' professional knowledge and skills scores following participation in the simulated practice pilot training. However, for the three subscales of the pre-training and post-training versions of the questionnaire there was a small but statistically significant decrease in the scores. For the first subscale, reflecting knowledge and understanding, the decrease could be caused by the reflexive learning focus of the training, which leads to a reconsideration of the participants' high self-evaluation before the training. The second subscale, measuring the application of practical and professional skills through reflexive practice, again demonstrated a small decrease between the pre- and post-training scores, thus lending support to the important role that reflexive learning plays in the process of learning through the simulated environment. Most probably the game provoked the participants to reconsider what they encounter in their everyday work with children and parents, thus leading to more critical self-evaluation. This is also supported by the informal feedback collected during the training: the majority of the piloting practicing teachers expressed their satisfaction with having been made to question what they had started to take for granted in their job, some of them even explaining that the simulation had introduced them to aspects of working with children they had had no proper training in. The last subscale, measuring transferrable skills, also reveals a small decrease in the score after the training. Most probably the highly experienced respondents evaluated themselves highly before the simulated practice and became more critical in their self-evaluations after it. Around one-third of the participants were teachers who piloted the game within a stage of their professional upgrading that is very specific to the country and is aimed precisely at developing a marked reflexive approach to their own practice. Along these lines, the role of the "Tiny Oaks" game could be interpreted as an effective agent in this process, which is made evident by the responses given at the post-test stage. In this sense, the results of the piloting give additional country-specific information about the potential role
of a simulated practice training within the system of continuous professional development in Bulgaria, which deserves to be further investigated. On the whole, the authors consider that these results show that, although the majority of trainees are highly qualified and already have rich professional experience, they benefitted from participation in the simulated practice training by acquiring improved professional self-awareness. The second hypothesis, stating that participants' gaming experience may influence their opinion on the usability of the simulated environment for learning, was rejected. This suggests that experience in playing computer games does not affect simulated practice learning in the case of the studied population. In other words, the value of such training does not depend on the participants' level of digital skills or on whether they are used to gaming, which is a very encouraging result supporting the relevance of the Framework.
SOLUTIONS AND RECOMMENDATIONS

Although the results from this pilot did not demonstrate a statistically significant influence of participants' digital skills on their opinion of the simulated learning environment, the feedback gathered from the trainees suggests that more effort should be put, on a regular basis, into mainstreaming different digital solutions during initial and on-the-job vocational and higher education training. In this way learners could make the most effective use of the simulated environment for their professional training and development.
FUTURE RESEARCH DIRECTIONS

The simulated practice learning experience is in itself a very complex process and lends itself to a number of research directions aimed at further clarifying the strengths and weaknesses of setting up and implementing such a computer-assisted simulation. In this sense it will be very beneficial to establish what factors contribute to realizing its strong points so that its effectiveness is enhanced. Another area to look into is applying the pedagogical framework in other contexts, such as vocational secondary schools with a social work and pedagogical specialization, which exist in some countries (e.g., Bulgaria), and testing it with learner groups who are considered to be digital natives and more used to gaming in their everyday lives, in order to compare to what extent the level of digital skills influences the learning experience for different learner groups. Researching further the constraints of using built-in tools for shared online activities and discussions among players would help in enhancing the Web 2.0 features which supplement the simulation game during the practice training itself. Last but not least, the Framework has been tested with a full set of pedagogical tools and learning and assessment materials with only one segment of the professional field, and there is room to investigate and experiment with how this could be executed in different contexts, for instance with a focus on migrant populations, disabilities, school students at risk of early school leaving, etc.
CONCLUSION

The area of social and pedagogical work, especially when it comes to working with vulnerable people, calls for experimenting with different approaches to the preparation of professionals in the field. The proposed Pedagogical Framework for Simulated Practice Learning for Skills Development in Social Services and Healthcare has been developed under the EU-funded Digital Bridges project to respond to the real societal needs outlined in relevant EU documents, and has been tested in five countries: Scotland (UK), Bulgaria, Italy, Finland and Lithuania. The present chapter describes the process of developing the pedagogical framework, highlighting the review of relevant learning theories, the essence of game-based learning and serious games in particular, and the constraints and features which feed into the framework. The final discussion is based on the experience of implementing it in a Bulgarian context only, as the comparison and summary of the results of the experimental stage in all partner countries has not yet been finalized. While new emerging technologies allow educators to conceive and implement innovative experience-based simulation models, a sound understanding of the learning methods should guide the educational process. The proposed pedagogical framework therefore aims to provide a starting point for thinking about simulated practice learning experiences that can be implemented in a number of sensitive areas which social workers' training should address to ensure the development of adequate vocational skills for working with vulnerable people.
REFERENCES

Anderson, J. L., & Barnett, M. (2013). Learning physics with digital game simulations in middle school science. Journal of Science Education and Technology, 22(1), 914–926. doi:10.1007/s10956-013-9438-8
Antonova, A., & Martinov, M. (2010). Serious Games and Virtual Worlds in education and business. Paper presented at the SAI conference, Sofia.
Antonova, A., & Todorova, K. (2010). Serious Games and Virtual Worlds for high-level learning experiences. Proceedings of the S3T Conference, Varna.
Bakker, M., van den Heuvel-Panhuizen, M., van Borkulo, S., & Robitzsch, A. (2012). Effects of mini-games for enhancing multiplicative abilities: A first exploration. Serious games: The challenge. Communications in Computer and Information Science, 280, 53–57. doi:10.1007/978-3-642-33814-4_7
Baños, R. M., Cebolla, A., Oliver, E., Alcañiz, M., & Botella, C. (2013). Efficacy and acceptability of an Internet platform to improve the learning of nutritional knowledge in children: The ETIOBE mates. Health Education Research, 28(2), 234–248. doi:10.1093/her/cys044 PMID:22498924
Bennet, A., & Bennet, D. (2006). Organizational survival in the new world. Burlington, UK: Elsevier.
Bryant, J., & Zillman, D. (Eds.). (2002). Media effects: Advances in theory and research (2nd ed.). Hillsdale, NJ: Lawrence Erlbaum.
Çoban, S., & Tuncer, I. (2008). An experimental study of game-based music education of primary school children. Proceedings of the 2nd European Conference on Games-Based Learning (EC-GBL), Barcelona, Spain.
Connolly, T., Stansfield, M., Josephson, J., Lazaro, N., Rubio, G., & Ortz, C. R., … Tsvetanova, S. (2008). Using alternate reality games to support language learning. Proceedings of the 2nd European Conference on Games Based Learning. Reading, UK: Academic Conferences and Publishing International.
Connolly, T. M., & Stansfield, M. H. (2007). From eLearning to games-based eLearning: Using interactive technologies in teaching Information Systems. International Journal of Information Technology Management, 6(2), 188–208. doi:10.1504/IJITM.2007.014000
Connolly, T. M., Stansfield, M. H., Gould, C., Tsvetkova, N., Kusheva, R., & Stoimenova, B., … Dimitrova, N. (2011). Understanding the pedagogy Web 2.0 supports: The presentation of a Web 2.0 pedagogical model. Proceedings of the International Conference on European Transnational Education (ICEUTE), Salamanca, Spain.
Corti, K. (2006). Games-based learning: A serious business application. PIXELearning Limited.
Cosman, P. H., Cregan, P. C., Martin, C. J., & Cartmill, J. A. (2001). Virtual reality simulators: Current status in acquisition and assessment of surgical skills. ANZ Journal of Surgery, 72(1), 30–34. doi:10.1046/j.1445-2197.2002.02293.x PMID:11906421
Crookall, D., Oxford, R., & Saunders, D. (1987). Towards a reconceptualization of simulation: From representation to reality. Simulation/Games for Learning, 17(4), 147–171.
De Chesnay, M., & Anderson, B. (Eds.). (2016). Caring for the vulnerable: Perspectives in nursing theory, practice and research. Burlington, MA: Jones and Bartlett Learning.
De Freitas, S. (2008). Serious virtual worlds. JISC.
De Jong, T. (2013). Learning by design. In Wild, Lefrere, & Scott (Eds.), TEL2020: Technology and knowledge in the future, a roadmap.
Dimitrova, M., Mimirinis, M., & Murphy, M. (2004). Evaluating the flexibility of a pedagogical framework for e-learning. Proceedings of the International Conference on Computer Systems and Applications (AICCSA-05), Cairo, Egypt. IEEE. doi:10.1109/ICALT.2004.1357422
European Commission, Directorate General for Employment, Social Affairs and Inclusion. (2011). The Social Dimension of the Europe 2020 Strategy: A Report of the Social Protection Committee. Luxembourg: Publications Office of the European Union.
Farrel, D., Kostkova, P., Weinberg, J., Lazareck, L., Weerasinghe, D., Lecky, D. M., & McNulty, C. A. M. (2011). Computer games to teach hygiene: An evaluation of the e-Bug junior game. The Journal of Antimicrobial Chemotherapy, 66(Suppl. 5), 39–44. doi:10.1093/jac/dkr122 PMID:21680586
Furió, D., González-Gancedo, S., Juan, M. C., Segui, I., & Rando, N. (2013). Evaluation of learning outcomes using an educational iPhone game vs. traditional game. Computers & Education, 64, 1–23. doi:10.1016/j.compedu.2012.12.001
Hammond, L., Austin, K., Orcutt, S., & Rosso, J. (2001). How people learn: Introduction to learning theories. Stanford University School of Education, Stanford University.
Illeris, K. (2003). Towards a contemporary and comprehensive theory of learning. International Journal of Lifelong Education, 22(4), 396–406. doi:10.1080/02601370304837
International Association of Schools of Social Work, & International Federation of Social Workers. (2004). Global Standards for Social Work Education and Training of the Social Work Profession. Adelaide, Australia.
Kirriemuir, J., & McFarlane, A. (2004). Literature Review in Games and Learning. Bristol: NESTA Futurelab.
Kolb, D. A., Rubin, I. M., & McIntyre, J. M. (1994). Organizational psychology: An experiential approach to organizational behavior (4th ed.). London: Prentice Hall.
Leyland, B. (1996). How can computer games offer deep learning and still be fun. Proceedings of ASCILITE Conference, Adelaide, Australia.
McLeod, G. (2003). Learning theory and instructional design. Learning Matters, 2, 35–43.
Moedtricher, F. (2006). E-learning theories in practice: A comparison of three methods. Journal of Universal Science and Technology of Learning, 0(0), 3–18.
Pavlenko, P. D. (2010). Theory, history and methodology of social work (Handbook). Moscow: Dashkov & Co. (In Russian)
Rubin-Vaughan, A., Pepler, D., Brown, S., & Craig, W. (2011). Quest for the golden rule: An effective social skills promotion and bullying prevention program. Computers & Education, 56(1), 166–175. doi:10.1016/j.compedu.2010.08.009
Sapundzhieva, K. (2011). The academic status of social pedagogy in the context of challenges of the modern socialization situation and social practice (In Bulgarian). Pedagogy, 1, 11–20.
Sawyer, R. K. (2006). The Cambridge handbook of the learning sciences. Cambridge University Press.
Wang, T. (2008). Web-based quiz-game-like formative assessment: Development and evaluation. Computers & Education, 51(3), 1247–1263. doi:10.1016/j.compedu.2007.11.011
Waterworth, J., & Waterworth, E. (1999). Education as exploration: Being, going and changing the world. Proceedings of Didactics of Informatics and Mathematics in Education, Crete.
Wiseman, A., Haynes, C., & Hodge, S. (2013). Implementing professional integrity and simulation-based learning in health and social care: An ethical and legal maze or a professional requirement for high-quality simulated practice learning? Clinical Simulation in Nursing, 9(10), 437–443. doi:10.1016/j.ecns.2012.12.004
Yang, J. C., Chen, C. H., & Jeng, M. C. (2010). Integrating video-capture virtual reality technology into a physically interactive learning environment for English learning. Computers & Education, 55(3), 1346–1356. doi:10.1016/j.compedu.2010.06.005
Yien, J. M., Hung, C. M., Hwang, G. J., & Lin, Y. C. (2011). A games‐based learning approach to improving students’ learning achievements in a nutrition course. Proceedings of the Turkish Online Journal of Educational Technology, 10(2).
KEY TERMS AND DEFINITIONS

Apply: One of the stages in the learning cycle of the proposed pedagogical framework which refers to the process of applying the newly acquired knowledge and skills in new game scenarios, etc., finally leading to acquiring the desired professional competences.

Game-Based Learning: Learning facilitated by or happening with a specially designed educational game.

Interact: One of the stages in the learning cycle of the proposed pedagogical framework which refers to the process of interacting with different NPCs, trainers and peers in the game and in class.

Observe: One of the stages in the learning cycle of the proposed pedagogical framework which refers to the process of observing the relations between the setting, the vulnerable person, the actions of the players and the NPCs in the game.

Play-Test-Get-Feedback: One of the stages in the learning cycle of the proposed pedagogical framework which refers to the process of playing the 3D game, testing different problem-solving strategies and obtaining feedback in the game and in class about one's own actions and knowledge and skills acquisition.

Reflect: One of the stages in the learning cycle of the proposed pedagogical framework which refers to the process of reflecting within and outside the game scenarios and drawing conclusions regarding one's own performance as well as acceptable/unacceptable behaviors and successful/unsuccessful or desirable/undesirable interaction with vulnerable people.

Serious Games: Specially designed digital (computer or internet-based) games which are used in educational, training and similar settings.

Social Work: A professional area dealing with the well-being of society's members, which often includes taking care of and protecting the interests of different kinds of vulnerable populations.
APPENDIX 1

The table below shows the descriptive statistics and reliability of each subscale obtained through processing the data from the pre-piloting questionnaire.

Table 1. Descriptive statistics and reliability scores of the subscales in the pre-training questionnaire

Knowledge and understanding: M = 14.92, SD = 4.53, Cronbach's Alpha = .93
Application of practical and professional skills through reflective practice: M = 7.50, SD = 2.03, Cronbach's Alpha = .70
Transferrable Skills: M = 6.03, SD = 1.79, Cronbach's Alpha = .75
APPENDIX 2

The table below shows the descriptive statistics and reliability of each subscale obtained through processing the data from the post-piloting questionnaire.

Table 2. Descriptive statistics and reliability scores of the subscales in the post-training questionnaire

Knowledge and understanding: M = 12.81, SD = 3.56, Cronbach's Alpha = .91
Application of practical and professional skills through reflective practice: M = 6.40, SD = 1.70, Cronbach's Alpha = .70
Transferrable Skills: M = 4.99, SD = 1.38, Cronbach's Alpha = .70
Usability: M = 24.06, SD = 5.17, Cronbach's Alpha = .78
Chapter 8
Developing Teaching Presence in Online Learning Through Shared Stakeholder Responsibility

Carol Johnson, University of Calgary, Canada
Noha Altowairiki, University of Calgary, Canada
ABSTRACT

Transitioning from a face-to-face teaching environment to online teaching requires a paradigm shift by the stakeholders involved (i.e., instructors and students). This chapter provides an extensive literature review to help novice online instructors understand the nature of online teaching presence and position their students towards more active participation. Premised on the Community of Inquiry framework (Garrison, Anderson & Archer, 2000) and constructivism, we highlight a conceptual framework of four iterative processes for developing online teaching presence: preparation for facilitation, designing the facilitation, implementing the facilitation, and assessing the facilitation. Based on this framework, strategies are articulated for overcoming the challenges of online learning through shared stakeholder responsibility.
INTRODUCTION

Online learning is a contemporary learning environment common in post-secondary education (Allen & Seaman, 2013). It requires that at least 80 percent of a course is transacted through an online medium such as a learning management system (Allen & Seaman, 2008). Various factors contribute to a successful online learning environment. From the perspective of the Community of Inquiry framework (Garrison, Anderson, & Archer, 2000), three important components are required to develop effective and constructivist online learning environments: social presence,
cognitive presence, and teaching presence. Of the three presences, the teaching presence component is described as "thoughtful, focused and attentive" (Garrison, Cleveland-Innes, & Fung, 2010, p. 32), and integral in interlinking student learning needs through the development and maintenance of social and cognitive processes (Garrison, 2011). Garrison (2011) notes that "[t]eaching presence represents perhaps a greater challenge in an e-learning environment" (p. 25) because of its importance for learning outcomes. As we focus on understanding the elements that comprise teaching presence, it is important to identify the stakeholders involved: in addition to the instructor, online students also have a role in teaching presence (Redmond & Lock, 2006). Through the lens of social constructivism, this chapter aims to outline how instructors and students collaborate effectively to build and maintain online teaching presence. The chapter identifies four distinct iterative processes involved in the development of online teaching presence: preparation for facilitation, designing the facilitation, implementing the facilitation, and assessing the facilitation. These components promote meaningful collaboration between instructors and students and, taken together, describe the innovative learning practices necessary to develop and maintain teaching presence in the online learning environment. The chapter concludes with a proposed conceptual framework to better articulate the functions that occur within teaching presence.
BACKGROUND

As in the traditional classroom, a variety of online instructional methods exist for today's online instructor. Design options range from a student focus to a teacher focus and from a subject focus to a teaching focus, with many benefits available from these varieties of choice. To reach the desired learning outcomes, however, the online instructor must initially identify the main objectives of his or her course and then locate an appropriate design approach that will be effective in connecting the student to meaningful learning opportunities. The constructivist approach, as developed by Vygotsky (1978) and Bandura (1981, 1993), and elaborated further for educational technology by Jonassen (1992, 1999), is the lens through which learning will be viewed in this chapter. This implies that with the incorporation of digital technologies, students benefit by experiencing their learning first-hand through action. They continue to build upon their active mental, physical, and emotional constructs by seeking resolution or solution through a project or problem inquiry. The inclusion of constructivism in the teaching approach focuses on the aspect of "authenticity" (Jonassen, Davidson, Collins, Campbell, & Haag, 1995, p. 21) and personalized learning for the student; when combined with educational technology, constructivism focuses attention on the building of knowledge by way of students interacting with their learning. Jonassen et al. suggested further that the inclusion of such technology integration decreased teacher involvement from 80 percent to 10–15 percent as students are able to spend more time interacting with their own learning. The increased time spent on their own learning signals a paradigm shift, giving the learner more opportunity to focus on constructing their knowledge more independently. Accordingly, effective online instructional design is necessary. In an online course framework, a constructivist paradigm posits that students should have the opportunity to create, analyze, and apply their learning in a manner that connects to the higher-level learning constructs described in the revised Bloom's taxonomy for learning (Anderson & Krathwohl, 2001).
Given the diverse choice of learning activities available for students to engage with various multimedia applications, constructivism provides a wide palette for students to actively explore learning in a more personalized manner. While the breadth of technology tools and online activities is exciting and constantly growing, the implementation of such tools and activities can be intimidating for the online instructor. With this in mind, it is essential that the instructor understand the implications and limitations of online system configurations, the various approaches to online course design, and the choice of tools available for active online learning. Given this need, the next part of the chapter introduces the online learning environment and the relationship of the Community of Inquiry framework (Garrison, Anderson & Archer, 2000) to online learning.
Online Learning

Online learning consists of having “at least 80 percent of the course content” (Allen & Seaman, 2008, p. 4) delivered in the online environment. However, online learning is more than accessing learning materials. It involves multiple types of interaction (student-student, student-instructor, and student-content) to gain and develop knowledge with meaningful support during the learning process (Ally, 2008). The online environment itself may consist of an online course housed in a learning management system. Learning management systems such as Blackboard (http://www.blackboard.com), Moodle (http://www.moodle.org), or Desire2Learn (http://www.desire2learn.com) may be used for course content storage and display. In addition, the learning management system may contain applications (e.g., via plug-ins) that enable asynchronous and synchronous tools for further learning and assessment. The transformation of teaching to an online format is complicated by both technological learning curves and the range of online learning approaches. For example, the use of a learning management system permits display of course content. However, there are seemingly endless possibilities for the inclusion of video and audio resources, text- and graphic-based documentation, calendar organizers, checklists, glossaries, and more. The bigger question of how to integrate these possibilities through a well-framed and intuitively guided course area rests on one’s approach to teaching online. The intricate task of transitioning an entire course to an online format can challenge even the most talented faculty member.
Community of Inquiry Framework

One framework used to guide instructors in the understanding of teaching in the online environment is the Community of Inquiry (CoI) framework (Garrison, Anderson & Archer, 2000). The philosophical foundation of the CoI framework is based on the work of Dewey (1938) and the focus of collaborative constructivism (Garrison et al., 2000). In this framework, the “goal is to create a community of inquiry where students are fully engaged in collaboratively constructing meaningful and worthwhile knowledge” (Garrison, 2006, p. 25). This means that students are generating new knowledge through a transactional approach derived from critical thinking and discourse. According to Shea and Bidjerano (2010), the framework’s theorized knowledge building (cognitive presence) in an online environment develops as “a result of collaborative work among active participants in learning communities characterized by instructional orchestration appropriate to the online environments (teaching presence) and a supportive collegial online setting (social presence)” (p. 1722). Together, the framework’s three core elements—social presence, cognitive presence, and teaching presence—are
visualized as three circles with unique areas unto themselves as well as intersections that create a nexus. That is, social presence, cognitive presence, and teaching presence have both dependent and independent functions in the online teaching and learning environment. Social presence is “the ability of participants to identify with the group or course of study, communicate purposefully in a trusting environment, and develop personal and affective relationships progressively by way of projecting their individual personalities” (Garrison, 2009, p. 15). Through engagement and collaboration, students develop an academic climate and personal relationships naturally over time (Garrison, 2011, p. 34). Garrison explained that “the goal of establishing social presence is to create a climate of trust and belonging that will support interaction and a questioning predisposition” (2006, p. 26). Furthermore, social presence is an important factor for collaboration and critical discourse. Garrison and Anderson identified the role of social presence as critical, and as “an important antecedent to collaboration and critical discourse because it facilitates achieving cognitive objectives by instigating, sustaining, and supporting critical thinking in a community of learners” (2003, p. 67). Social presence has been studied extensively in the educational setting for both face-to-face and online learning environments (Garrison & Arbaugh, 2007). Swan and Shih (2005) examined the effect of different levels of social presence on student satisfaction. They found that students who perceived a high level of social presence believed that online discussions were more interactive and that they learned more than students who perceived a low level of social presence. Also, students in a learning environment with a high level of social presence reported benefiting from the multiple perspectives presented by their classmates, leading them to look at things from different angles. In the academic context, social presence suggests the creation of a learning climate that encourages and supports probing questions, skepticism, and the contribution of explanatory ideas (Garrison, 2011, p. 32). Garrison, Anderson, and Archer (1999) suggested that social presence is a critical factor in developing a successful community of inquiry, and that students need to feel a sense of belonging and trust in order to recognize collaboration among themselves as a valuable learning experience. So and Brush (2008) further examined the relationship between social presence and collaboration. In a study of 97 survey participants, So and Brush found that students with high perceptions of collaborative learning also perceived high levels of social presence. Weinel, Bannert, Zumbach, Hoppe, and Malzahn (2011) explained that while social presence does not cause collaboration, it could affect the attitude of participants towards collaborating on a particular task. However, Picciano (2002) posited that social presence is less important if the learning activities are information acquisition and there are no collaborative tasks where students learn with other perspectives. Overall, the research on this issue provides evidence of a relationship between social presence, satisfaction, and perceived learning (Garrison & Arbaugh, 2007; Garrison, Cleveland-Innes, & Fung, 2010; Richardson & Swan, 2003). 
Cognitive presence is “the extent to which learners are able to construct and confirm meaning through sustained reflection and discourse in a critical community of inquiry” (Garrison, Anderson, & Archer, 2001, p. 5). While cognitive presence cannot be separated from its interaction with social and teaching presence, it focuses on the way meaning is constructed and how students reflect on their learning. It further refers to the conditions and processes that enable learners to build and apply knowledge through a collaborative and constructivist approach to learning (Salloum, 2012). Garrison et al. (2000) operationalized cognitive presence through a Practical Inquiry model that describes a four-phase process:
1. Triggering event: This phase initiates the inquiry process through activities designed to engage and involve students. The importance of this phase lies in assessing the state of knowledge and generating ideas.
2. Exploration: This phase focuses on understanding the nature of the issue first, and then searching for relevant information and possible explanations.
3. Integration: This phase moves to build meaning from the ideas developed during the exploration phase. Decisions are made about the integration of ideas and how order can be created.
4. Resolution: This phase is the resolution of the problem through applying the newly gained knowledge to educational contexts.

The Practical Inquiry model was developed by Garrison et al. (2000) to assess cognitive presence. The purpose was to provide a practical means to evaluate the nature and quality of critical reflection and discourse in a community of inquiry (Garrison, 2011, p. 51). Garrison further noted that “the challenge for educators is to move the discussion and cognitive development through each of the phases of practical inquiry” (p. 53). Given the online learning environment and available tools, the phases require scaffolding that leads the student to resolution. He identified the learning task itself and the teaching presence component as key factors in moving inquiry from the triggering event phase to the resolution phase. From this point of view, the moderator (e.g., instructor or student) assesses the discourse taking place and guides the discussion through a “critical thinking cycle” (Garrison, 2011, p. 53). Research on cognitive presence in a community of inquiry associates it with high levels of perceived learning and satisfaction (Akyol & Garrison, 2008, 2011; Garrison & Cleveland-Innes, 2005). Rovai (2002) suggested further that students “who have stronger sense of community and perceive greater cognitive learning should feel less isolated and have greater satisfaction with their academic programs” (p. 328). Akyol and Garrison (2008) also found that cognitive presence and teaching presence are important factors in influencing student learning. They noted that students who perceived higher levels of teaching presence also perceived higher levels of cognitive presence, learning, and satisfaction. In a later study, Akyol and Garrison (2011) suggested that “collaborative development of cognitive presence in online discussions and students’ perception of cognitive presence is associated with high perceptions of learning and actual learning outcomes in terms of grades” (p. 245). These findings confirmed that structured collaborative activities lead to deeper and more meaningful learning, as earlier indicated by Garrison and Cleveland-Innes (2005). The third presence in the CoI framework is teaching presence. Teaching presence is “the design, facilitation, and direction of cognitive and social processes for the purpose of realizing personally meaningful and educationally worthwhile learning outcomes” (Anderson, Rourke, Garrison, & Archer, 2001, p. 5). Teaching presence plays a critical role in online learning through designing experiences that facilitate reflection and discourse while maintaining a dynamic learning environment (Garrison, 2011). The centrality of teaching presence focuses on creating and providing a learning environment conducive to collaboration and reflection. Accordingly, developing an ideal online learning environment requires planning, organization, and management skills.
As Tu and Corry (2003) suggested, “To insure a good learning experience, an ideal interactive online learning environment requires a fully integrated design rather than a sparse collection of unrelated, disconnected, and fragmented learning activities scattered throughout the course” (p. 54).
Furthermore, the interactive discourse among stakeholders in a community of inquiry, as defined by Garrison (2011), suggests that collaboration is a key tool. Interaction and discourse play key roles in higher-order learning; this requires structure specific to design and leadership specific to facilitation and direction (Garrison & Arbaugh, 2007). Consequently, a strong teaching presence is essential in an online environment in order to establish and maintain the online course area as a collaborative learning space (Garrison, 2006). Shea, Li, and Pickett (2006) explained that “a strong and active presence on the part of the instructor—one in which she or he actively guides and orchestrates the discourse—is related both to students’ sense of connectedness [community] and learning” (p. 185). Instructors’ roles as social relationship facilitators are particularly important during the development of group projects, especially in situations where it becomes necessary to deal with conflict and differences of opinion (Tinoca, Oliveira, & Pereira, 2010). Teaching presence has an effect on student satisfaction, student perceptions of learning, and the group sense of community (Akyol & Garrison, 2008; Arbaugh, Cleveland-Innes, Diaz, Garrison, Ice, Richardson, & Swan, 2008; Shea, Li, Swan, & Pickett, 2005). This effect suggests that students must feel they are contributing members and are gaining accomplishment through their active online participation (Garrison, 2006). As mentioned earlier, all participants have a role to play in teaching presence; teaching presence is not the sole responsibility of the instructor (Redmond & Lock, 2006). Shea et al. (2006) found a link between teaching presence and establishing a sense of learning community. Their findings suggested that students who reported high levels of teaching presence indicators, such as instructional design and organization, also reported high levels of satisfaction and learning. This further affirms the centrality of teaching presence to the development and maintenance of social and cognitive presences. Together, the three presences create a purposeful nexus for teaching in the online environment. In a study of the causal relationships between social presence, teaching presence, and cognitive presence, Garrison, Cleveland-Innes, and Fung (2010) emphasized that teaching presence plays a large role in influencing students’ perceptions of social presence and cognitive presence: “Perceptions of social presence also significantly predict perceptions of cognitive presence. Therefore, social presence must be seen as a mediating variable between teaching and cognitive presence” (p. 35). The presences are interdependent, however, and influence each other in how students experience learning in the online environment.
MAIN FOCUS OF THE CHAPTER

The Community of Inquiry (CoI) framework for online learning (Arbaugh et al., 2008; Garrison et al., 2001; Garrison, 2011) identified teaching presence as having both a formative and supportive role for developing and maintaining the nexus of cognitive and social presences. Teaching presence is multilayered, as it is “related both to students’ sense of connectedness and learning” (Shea, Li, Swan, & Pickett, 2006, p. 185). As the previously cited literature suggested, teaching presence is central to supporting learning outcomes in the online course environment. Much of the research has focused on the interplay of the three CoI components, an interplay that requires active dialogue in order to support online student learning. More specifically, research indicated that supportive online learning environments are interactive environments that encourage both instructors and students to actively participate in the area of teaching presence (Bowman, 2014; Garrison, 2011). Therefore, we posit that the use of a shared stakeholder responsibility that purposely involves students in interactive learning will assist in the development of meta-cognitive awareness and deep learning.
A synthesis of the literature from various sources (Branch & Kopcha, 2014; Palloff & Pratt, 2007; Szeto, 2015) indicates that the component of teaching presence is derived from four discrete iterative processes: preparing the facilitation, designing the facilitation, implementing the facilitation, and assessing the facilitation. Such iterative processes can assist in fostering effective teaching presence through sharing some of the teaching presence responsibilities with students (i.e., shared stakeholder responsibility), thereby furthering student learning outcomes, student motivation and perceptions, and overall community participation in online learning environments. The remainder of this chapter presents the critical components for developing online teaching presence and their implications for the higher education context. Accordingly, the purpose of this section is to respond to the following critical research questions:

1. What key elements are required to prepare and support online teaching presence?
2. What are some indicators that assist in the practical assessment of online teaching presence?
3. What practical implications could inform the development of online teaching presence in higher education?

Given the centrality of teaching presence to the development of social presence and cognitive presence, and the need for shared stakeholder responsibility, it is important to establish the elements that contribute to teaching presence. More specifically, the establishment of such elements in a framework will assist in the development of practical strategies for instructors and support their paradigmatic shift to online instruction.
Issues, Controversies, Problems

The transformation of teaching from a face-to-face environment to the online environment is not without its challenges. Three challenges face stakeholders in online learning: the evidenced increase in instructor front-end workload to create online course materials (Johnson, 2016), the technological skills and abilities instructors require to utilize the online course environment and develop course resources (Palloff & Pratt, 2011), and the need for educational development funding to provide instructors with training to teach in their online environments (Hoyt & Oviatt, 2013). However, research evidence suggests that blended and online environments can provide equal or better student learning outcomes than a face-to-face environment (Guiller, Durndell, & Ross, 2008; Osguthorpe & Graham, 2003). The front-end workload required to develop an online course area often involves more time and resource allocation than a traditional face-to-face course. Moore and Kearsley (2005) suggest that instructors may find the creation of audio and video artifacts problematic due to the time and finances required to produce quality products. While the costs of video application software (e.g., iMovie) and tablet apps (e.g., Videopix) may be relatively low, the development of speaking scripts, slide graphics for insertion, and videography may be a more current challenge for instructors. As found throughout higher education professional development courses, the collaboration of educators in reciprocal assistance may be a helpful resource for addressing development issues (Kelly, 2009). Technological skills are also a necessary requirement for online instructors. Park and Bonk (2007) provide ample suggestions on ways for instructors to effectively prepare for active and meaningful engagement in synchronous sessions: clarify required technology in advance of the meeting time; define student outcomes and provide a guideline for the synchronous activity; practice using the synchronous tool and activities prior to the actual event; and incorporate flexibility into each session to accommodate
real-time learning (p. 10). While preparation is key to addressing many of the technical issues of synchronous tools, Moore and Kearsley (2005) and Palloff and Pratt (2011) identify facilitator skills as relevant challenges for both asynchronous and synchronous online learning engagement. With the known deficiencies in facilitator technical skills and understanding, it becomes important that online instructors become familiar with established online course technology tools. Synchronous tools have been identified as having a positive impact on overall student satisfaction when group members implement the use of verbal and nonverbal interactions such as video, audio, text chat, and emoticons (Park & Bonk, 2007). While technological challenges remain for faculty to overcome (Pratt, 2008), those due to a lack of faculty skills, intermittent broadband connections, and technology tool limitations will decrease with appropriate professional development and technology improvements. The ubiquity of technology available in the 21st century provides opportunities to extend student learning through personalized learning. A report by Taylor, Morin, Cohn, Kochlar, and Clark (2008) noted that 66 percent of Americans had Internet access in their homes. Mobile devices, such as smartphones, are now used by 58 percent of American adults (Pew, 2014). This has relevance for the extension of learning. Kozma (2003, p. 13) posited that when teachers go beyond these basic practices and use technology to also plan and prepare instruction and collaborate with outside actors, and when students also use technology to conduct research projects, analyze data, solve problems, design products, and assess their own work, students are more likely to develop new ICT, problem-solving, information management, collaboration and communication skills. The online learning environment holds much potential for supporting student learning. Linking the importance of collaboration and interaction in the online learning environment, Harasim (2012) explained that Online Collaborative Learning (OCL) is an outgrowth of our technologically networked society. Placed within the context of online courses, “learners work together online to identify and advance ideas of understanding, and to apply their new understanding and analytical terms and tools to solving problems, constructing plans or developing explanations for phenomena” (Harasim, 2012, p. 88). Productive learning dialogues can take place between the various stakeholders within the online course environment and consequently provide students with helpful learning supports, which should be the goal of all learning environments.
SOLUTIONS AND RECOMMENDATIONS

A gap still exists in the development of models and frameworks pertaining to how such environments are formed. Therefore, with the established importance of teaching presence in the online learning environment, coupled with the understanding that the stakeholders in an online course include both students and instructors, the following conceptual framework has been developed to provide practical guidance as to how instructors should design their environments and prepare their students to take responsibility for supporting their own learning in the online environment. The framework consists of the four iterative processes previously described (preparation for facilitation, designing the facilitation, implementing the facilitation, and assessing the facilitation) and seeks to assist instructors and designers with practical
strategies to build, support, and maintain teaching presence through the lens of shared stakeholder responsibility. The conceptual framework is displayed in Figure 1.
Preparation for Facilitation

Meaningful OCL requires effective preparation. This preparation for facilitation involves:

• Building a supportive and trusting environment (Altowairiki, 2013)
• Developing a climate for social interactions (Szeto, 2015)
• Explaining/highlighting the importance of shared teaching responsibilities (Lazonder, 2014)
• Providing clear guidelines and expectations (Shea, Vickers, & Hayes, 2010)
• Planning affective learner support (Bates, 2015; Garrison, 2011)
• Considering Universal Design for Learning (UDL) principles (Rose & Meyer, 2002)
• Selecting appropriate technological tools (Altowairiki, 2013; Bates, 2015)
Building and maintaining a supportive and trusting environment is considered a fundamental step for effective online collaboration (Altowairiki, 2013). This is accomplished as learners feel comfortable presenting their ideas, negotiating their perspectives, challenging each other in a constructive way, and providing formative peer feedback. A trust-centered learning environment supports students socially and cognitively throughout their learning process (deNoyelles, Zydney, & Chen, 2014). Moreover, a number of scholars highly recommend fostering a sense of community in the online learning environment to reduce learners’ feelings of isolation, increase persistence in courses, improve learners’ attitudes towards the course and content, and increase student retention (Wilson, Ludwig-Hardman, Thornam, & Dunlap, 2004). In addition, a strong sense of community increases “the flow of information among all learners, availability of support, commitment to group goals, cooperation among members, and satisfaction with group efforts” (Rovai, 2002, p. 3). An effective online learning community has several observable indicators, as suggested, for example, by Palloff and Pratt (2007):

• Active interactions with both course content and other participants
• Collaborative learning evidenced by students’ interaction with each other
• Socially constructed knowledge through agreement or questioning of issues
• Exchange of resources among students
• Expression of support and encouragement between students
• Willingness to critically evaluate the work of others
It is important to highlight that “community is not a product or entity that can be built. Rather, it is a process that is organic in nature… it depends on relationships and building relationships” (Lock, 2003, p. 12). Thus, social presence, represented by social relationships, is key to a community (Tu, 2004). Garrison (2006) explained that social learning activities should be designed carefully to provide opportunities for learners to interact formally and informally; this could occur through introductory activities, creating a chat room for informal communication such as a “Course Café”, holding a group discussion on desired expectations and requirements, and offering online office hours. Additionally, modeling and instruction in online etiquette assist in developing social presence (Garrison, 2011; Redmond & Lock, 2006).
Establishing a climate for social interaction further contributes to cultivating online learning experiences through collaboration, peer feedback, and peer reflection (Szeto, 2015). Second, providing clear guidelines and specific expectations is an essential step for online learning success (Shea, Vickers, & Hayes, 2010). Online learners are diverse in their backgrounds, expectations, and attitudes towards collaborative learning. According to Barkley, Cross, and Major (2005), some learners choose online learning so that they may do their work individually. Therefore, it is essential to provide students with information regarding the online learning approach to be used in the course, and the learning expectations and requirements specific to tasks such as collaborative learning activities (Capdeferro & Romero, 2012). It is highly recommended to educate learners about the nature of online learning and its learner-centered environment. This means that instructors take time to explain “the rationale behind student-to-student interaction in negotiating shared meaning through discourse” (Shea, Vickers, & Hayes, 2010, p. 142). Further, the role of online instructors is described in terms of “to what extent and in what capacity they will participate in course discussions” (p. 142). This type of negotiation was highlighted in case study research by Altowairiki (2013). In this study, participants reported that having clear expectations at the beginning of the online course was very helpful for their success. As one student explained, online learners need “a clear outline of what the course expectations are, which includes a specific description of what active collaboration means to the instructor so that learners understand what is expected of them” (p. 75). Another student claimed that having clear guidelines and expectations influenced her level of participation as she compared her experience in two online courses. This student found herself more motivated and participated more actively in an online course where the instructor provided clear expectations for active participation. Conversely, the student found it difficult to motivate herself to participate actively in an online course where the instructor did not have specific requirements for active participation. Third, considering learners’ variability (e.g., learning preferences, prior knowledge, cultural background, abilities), flexibility needs to be built into the design and facilitation of online learning. Scholars (e.g., Rao, 2012; Rudestam, 2010) have termed online learners “non-traditional” due to their distance from institutions, their abilities, their decision to return to their studies after an extended time away, and/or their desire to take courses while working full time (Rao, 2012). Therefore, it is a meaningful step to rethink the design of online learning and whether it allows for inclusivity and responds to learners’ variability. One of the educational frameworks that acknowledges learner variability and guides instructors to design and facilitate an inclusive learning experience is UDL. UDL researchers (Rose & Meyer, 2002) rejected the idea of a one-size-fits-all learning environment and encouraged educators to use multiple means of engagement, representation, and expression. For example, using different formats to represent content and express knowledge (e.g., written, audio, and video) would assist in addressing variable learning preferences (e.g., read/write, auditory, visual, and kinesthetic).
The UDL framework has been used successfully in online higher education contexts. For example, Rao and Tanners (2011) designed and delivered an online course based on UDL principles. Generally, they used different formats to represent content (written, audio, and video), gave students the option to select their preferred method to express their knowledge (writing a research paper or creating a multimedia project), and used synchronous and asynchronous discussion forums (e.g., VoiceThread, Elluminate Live!) to facilitate students’ engagement. To assess students’ perceptions of the course design, questionnaires were sent to all students in the course (n = 25). Findings from the study revealed that 76
percent of the students purchased print versions of the book, while 24 percent of students purchased the digital one. For the assigned articles, 52 percent of students read only the text, 32 percent of students read and listened to the articles concurrently, and 16 percent read some and listened to some. Most of the students appreciated the design of the course based on UDL, specifically in having options to represent and express knowledge. Fourth, selecting appropriate technological tools is a thoughtful, pedagogical step (Altowairiki, 2013; Bates, 2015) to ensure that tools are suitable for a particular audience and purpose (Januszewski & Molenda, 2008). Using both synchronous and asynchronous discussion forums is recommended by several scholars because each format helps to overcome the weaknesses of the other (Palloff & Pratt, 2007; Rockinson-Szapkiw, Baker, Neukrug, & Hanes, 2010). For example, the use of synchronous discussion may influence social presence (e.g., by promoting a sense of belonging; Altowairiki, 2013) and thus influence students’ level of collaboration. According to Weinel, Bannert, Zumbach, Hoppe, and Malzahn (2011), social presence is not a single catalyst for collaboration; instead, it is a factor that affects participants’ attitudes towards collaborating on a particular task. For a valuable learning experience, online learners need to feel a sense of belonging and trust in order to recognize the collaboration that is occurring among them (Garrison et al., 1999). The selection of technological tools (e.g., communication forums) depends on the goal and design of online learning. Rockinson-Szapkiw, Baker, Neukrug, and Hanes (2010) conducted a comparative study of the influence of synchronous and asynchronous communication forums on cognitive presence, social presence, teaching presence, and perceived learning. They found that students who used both synchronous and asynchronous communication forums had a higher level of social presence than students who used only asynchronous communication forums. However, they did not find any difference between the two groups in terms of cognitive presence, teaching presence, and perceived learning. Together, these four strategies (i.e., building and maintaining a trusting and respectful environment, providing clear guidelines and specific expectations, considering learners’ variability, and selecting appropriate technological tools) frame the preparation component of developing teaching presence in an online course. Once the strategies in this component are considered, the next phase to consider is the design component.
Designing the Facilitation

Designing the facilitation of the online learning experience requires a focus on the strategies that go into creating course structures. Such structures are not only organized for the learning process but also provide students with different types of instruction, allowing multiple approaches to be used (e.g., direct instruction and collaborative tasks). Specifically, this phase entails structure and organizational design (Garrison, 2011) so that the online learning experience is designed through a planning, organization, and course design process (Johnson, 2016), including the use of technology and content to accommodate varied student learning levels (Johnson, 2016; Szeto, 2015). This phase also requires that a variety of learning tasks be included for students to demonstrate their knowledge of content (Altowairiki, 2013; Garrison, 2011).
Structure and Organizational Design

The course design process, as identified in a research study by Johnson (2016), involves the foundational teaching philosophy of each instructor. The choices made for how a course is presented can be related to the beliefs and ideas about learning that are held by the instructor; as Branch and Kopcha (2014) explained, “philosophical orientation and theoretical perspective frame the concepts upon which instructional design models are constructed” (p. 80). From this it can be understood that pedagogy is bridged to the tangible aspects of course design by way of the choices made through the approach to instructional design. Furthermore, they note, instructional design is “an iterative process of planning outcomes, selecting effective strategies for teaching and learning, choosing relevant technologies, identifying educational media, and measuring performance” (p. 77). When developing a design for an online course, there are many choices for faculty to make as they consider the need for course flexibility, organized structure, and context and learning tasks. Some instructors use the ADDIE instructional design model (Amaral, Shank, Shibley, & Shibley, 2011), which consists of five building blocks: Analysis (i.e., identifying the course objectives and timelines); Design (i.e., developing strategies for how to present the course content); Development (i.e., assembling, gathering, and displaying course content); Implementation (i.e., presenting the course to students); and Evaluation (i.e., assessing the challenges and opportunities of the course and completing necessary re-designs). The ADDIE model provides a linear, step-by-step structure for the creation of courses. However, the ADDIE structure does not include specific procedures or processes for integrating technology for teaching and learning. Within design for facilitation, the focus is placed on the specific organization of how and what learning will be available for the student to encounter. Planning for course design involves the realization of how the content and learning tasks within a course will be outlined. This process can involve both backwards design and scaffolding of learning. Wiggins and McTighe (2005) described backwards design as a guided method that looks at the necessary end product, or result, and then moves backwards in sequence to parse out the components that contribute to the final result. Backwards design is a method closely linked with an earlier approach to planning: scaffolding of learning (Bruner, 1960; Vygotsky, 1978). Scaffolding of learning considers the building of learning and its content in a fashion that offers graduated order, or scaffolds, that allow for sequential understanding. The work of Herrington, Reeves, and Oliver (2014) affirms this important design approach in learning. They stated, “Learning is best facilitated by the inclusion of deliberate coaching and scaffolding supports provided by the teacher” (p. 404). Furthermore, the use of scaffolding, graduated learning activities, and coordination of coaching with scaffolding (e.g., cognitive apprenticeship) help deepen constructivist student learning over the course duration (Collins, Brown, & Holum, 1991; Gagné & Driscoll, 1988; Park & Bonk, 2007). Consequently, planning is closely linked to student motivation; as described by Garrison (2011), the design outcome results in courses that have more predictable structure and scaffolded learning content, which gradually move students through their learning.
Designing Technology and Content Integration

The organization of learning also takes into account that students need access to all content information and learning task directions in an aligned pedagogical manner by way of the technology available (Moore & Kearsley, 2005). The Technology, Pedagogy, and Content Knowledge (TPACK) model (Koehler, Mishra, Kereluik, Shin, & Graham, 2014) is a tool used by instructors to
assist in the focused integration of technology within a teaching framework. When using the TPACK model, instructors impose their own procedural structures to ensure that they are not overwhelming students with technology while still providing opportunities for students to utilize technologies to assist their learning. Both the TPACK model and the ADDIE model mentioned earlier provide strategies to help instructors work through the overarching course design process required for the transition to teaching in the online environment. In practice, the use of a module or folder system within an online course can aid the organization of course lectures, readings, and assignments. Each module can gradually guide the student to a higher course content knowledge level and build cumulative content and learning tasks. Providing students with this guidance further assists them through a sense of predictability in content organization. The process of course design includes both creating and re-designing in order to reach a particular course design outcome (Molenda, 2003, 2015). However, the planning and organization of course design is not a one-time action but one that may be iterative (Palloff & Pratt, 2011; Reigeluth & Carr-Chellman, 2009). The degree to which re-design is necessary is determined by individual faculty members. Yet course design is found to flow naturally from a planning stage through a creation or development stage, and to be a cyclical process. That is, the cyclical nature of the process is found in the notable need for adjustment or re-design. Each portion of the process brings the learning content into a more focused view for the betterment of student learning outcomes.
Designing for Learning Tasks

Online course design consists of a variety of components that are integral to successful teaching and learning processes. The pre-determined activities that students will use to demonstrate their knowledge and understanding, in conjunction with the inclusion of synchronous and asynchronous tools, play an important role in online learning. These components allow students not only to retrieve, explore, and analyze learning information but also to display, present, and demonstrate it. Together, these components are the student’s essential elements for acquiring, processing, and validating learning. As students are better able to concentrate on learning activities rather than distracting technology, the learning activity becomes a platform for students to fully engage with their learning through exploration, play, and identification (Kim & Reeves, 2007). Finally, having opportunities for students to engage and interact with peers and their learning content by way of technology is part of a social-constructivist learning exchange (Jonassen, 1992). The importance of communication in an online course becomes relevant for both student motivation (Jones, 2010; Kanuka, Rourke, & LaFlemme, 2006) and decreased isolation due to student distance (Picciano, 2002): “Computer conferencing provides the two-way communication necessary for intellectually constructive interactions” (McDonald 1997, in Shi, Mishra, Bonk, Tan, & Zhao, 2006, p. 1). These interactions are able to take place through the use of synchronous and asynchronous communications in an online course context. Guiller, Durndell, and Ross (2008) found that when comparing online discussion with face-to-face discussion, 50 percent of the students in the study preferred online discussion and 37 percent preferred face-to-face discussion. Thirteen percent of the students were neutral, with no strong preference for one form over another. In addition, this study demonstrated that students took part in more critical thinking in the online discussion than in the face-to-face discussions. This finding is important because it demonstrates
that students not only prefer the online discussion area but also that learning outcomes (e.g., critical thinking) are achievable in online discussions. Synchronous meetings in online courses can also help students feel part of the larger group. Park and Bonk’s (2007) case study surveyed 19 online students and found that the use of a synchronous interactive exchange regarding a course project was helpful to their learning; additionally, it “helped me [student] feel connected” (p. 312). McGrath (1992) suggested that groups should be limited in the number of membership changes in order to address possible issues with member dynamics and group project performance. Adapting the understanding of groups to online learning, Smith et al. (2011) posited that students prefer group work in a face-to-face course over an online course because of the assumed individuality of online course structures. Given this preference, online learning activities using group work need to address student communication needs and group member expectations, and clarify activity outcomes (McGrath, Arrow, Gruenfeld, Hollingshead, & O’Connor, 1993). As we have identified in this section, design is a process that involves the development and re-design of organizational structures as well as technology and content integration. Additionally, design includes specific learning activities and tasks for students. While each of these elements has unique aspects, they relate directly to the instructor’s teaching philosophy and beliefs.
Implementation of the Facilitation

Thus far, we have identified the preparation and design components of the conceptual model for teaching presence that occur prior to the start of a course. The strategies identified in the implementation component are ones that stakeholders use during the actual course semester. These strategies include:

• Regular modeling of desired outcomes and behaviors (Altowairiki, 2013; Bates, 2015)
• Guiding students through their participation process
• Forming groups and/or assigning individuals based on their availability and interests (Altowairiki, 2013)
• Providing flexibility for stakeholder participation (e.g., time)
• Maintaining open communication (Bates, 2015; Fink, 2013)
• Providing active opportunities for learners to explore and demonstrate prior and new knowledge (Redmond & Lock, 2006)
Taken together, these strategies highlight the importance of being an active stakeholder in the online course area and being open to adapting and sharing new knowledge.
Active Stakeholder Roles

In the transition to the online environment, many nuanced details may not surface in one’s online teaching until further into a course. Given this challenge, having all stakeholders take an active role in the online environment becomes critical. For the instructor, an active role in an online course means timely responses to emails and taking an active role in online discussion forums. Multiple case study research by Johnson (2016) revealed that there are different ways that instructors take active roles in online discussion areas.
One case study identified an instructor who responded to each of her students’ posts with individual feedback. Within a few weeks, the instructor found herself unable to keep up with the overwhelming response pace and thereby adjusted her facilitation to a more manageable pace, limiting her individual responses. Through the process of transition, the instructor continued to guide the discussions through purposeful communication with students. Her initial zeal for responding to each individual post was likely due to the desire not only to have an active presence in her online course but also to better guide her students through regular modeling of her posting expectations. A balance is needed for being an active stakeholder in an online course. Kirschner, Sweller, and Clark (2006) observed that “not only is unguided instruction normally less effective; there is also evidence that it may have negative results when students acquire misconceptions or incomplete or disorganized knowledge” (p. 84). From the opposite view, we must also acknowledge that too much guidance can be problematic, as evidenced by the earlier example of instructor overload. A balance of active instructor participation that encourages students to take ownership of their learning tasks can be developed through strategic modeling and demonstration of expectations. Bates (2015) and Fink (2013) further address the importance of creating open communication between instructors and students when developing effective learning environments. Case One and Case Two by Johnson (2016) revealed that students were motivated to take active roles in their online course after the instructor established course expectations and provided examples of desired learning outcomes. For example, AA13 was an online instructor who set up each online course to support students as active stakeholders in their collaborative learning. His initial student learning task involved students identifying multiple ways to communicate with each other (e.g., email, chat areas, phone, and video communication). Another instructor, AB104, used his initial learning task to develop student interaction by way of collaborative group learning. Instructors identified their course expectations and reaffirmed these expectations through their learning tasks and course interactions as a way to demonstrate active participation in their online courses.
Open to Adapting and Sharing New Knowledge

Student learning “emerges from the sum of the encounters and from the relations established by the student within the knowledge domain” (Lowyck, 2014, p. 8). As we continue to seek strategies for developing teaching presence in the online learning environment, providing ways for students to share their new knowledge becomes an integral teaching tool. Identifying ways to adapt and be flexible within the online teaching space becomes an important strategy to implement. Where and how to integrate appropriate technologies into teaching has been informed by Koehler et al.’s (2014) TPACK model, which can further assist instructors in their day-to-day exchanges with students. Based on the earlier learning approach of Shulman (1986), TPACK identifies “the complex relations among technology, pedagogy, and content that enable teachers to develop appropriate and context-specific teaching strategies” (p. 102). The instructor who understands the TPACK model’s framework for isolating and integrating the components of online technology, pedagogy, and content is provided with a range of possible adaptations to meet student learning needs. Since the model can promote and be “responsive to complex teaching and learning situations” (Branch & Kopcha, 2014, p. 78), it can be an effective tool for thinking through the challenges and opportunities afforded by an online course area. All learners can further explore and demonstrate the construction of knowledge (Redmond & Lock, 2006) and share that knowledge with others.
Implementing adaptation during a course may involve swapping out a particular multimedia resource for something that is more familiar to the students in a particular discipline. Various types of multimedia can be used within the content modules, including videos (both instructor-created and non-instructor-created videos or video links), handouts, PowerPoint movies with voice-overs, and audio podcasts. If instructors need to develop more student interactivity during the course, allowing students to incorporate technologies that they find more familiar can be advantageous for their learning. For example, additional interactivity can be sought with specific software applications, discussion forums, and assignment links to websites that are relevant for the specific content (e.g., VoiceThread.com, TedEd.com, and YouTube.com).
Assessing the Facilitation

Assessment is the heart of a meaningful learning experience. Two types of assessment could be embedded in the design of a learning experience: formative and summative. Formative assessment refers to “the iterative processes of establishing what, how much and how well students are learning in relation to the learning goals and expected outcomes” (Gikandi, Morrow, & Davis, 2011, p. 2337) to enhance performance and support further learning. Giving formative feedback is the main key to online student success (Kupczynski, Ice, Wiesenmayer, & McCluskey, 2010). It gives learners the opportunity to identify their strengths and weaknesses and thus improve their performance. Instructor interactions and peer feedback are examples of this type of assessment for online students (Eddy & Lawrence, 2013; Mao & Peck, 2013). Eddy and Lawrence (2013) suggested that wikis can provide helpful formative assessment through peer interaction. Furthermore, Mao and Peck (2013) suggested that students are able to learn through the scaffolding of self-assessment. When designed through scaffolding, wikis and blogs can serve students as both a form of self-assessment and peer-assessment. Summative assessment refers to the measurement process that usually occurs at the end of an instructional activity, unit, or course to award “a grade or other forms of accreditation” (Gikandi, Morrow, & Davis, 2011, p. 2336). Summative assessments provide the opportunity for students to demonstrate their overall course knowledge and application within a particular assessment (Tomlinson & Moon, 2013). While there is controversy about the reliability of summative assessment (Knight, 2002), the incorporation of learner scaffolding and constructivist design can help overcome challenges associated with summative assessments. In this way, a summative assessment is interconnected with its formative assessments (Taras, 2005, 2010). Supportive learning for summative assignments is found in studies by Guiller, Durndell, and Ross (2008) and Bruns and Humphreys (2005). Both studies found that students in the online environment exercised critical thinking skills and knowledge co-construction as they dialogued about their learning. Their findings reiterated the importance of peer review (Keast, 2009). With attention to the setup of summative assignments through scaffolding of content and social constructivism, students in online courses can be given opportunities to successfully prepare for summative assessments. Gaytan and McEwen (2007) conducted research to investigate online students’ and instructors’ perceptions of effective online instruction and assessment strategies. In this study, 332 students and 29 instructors responded to an online survey. Findings from the study indicated that effective assessment starts with providing a detailed explanation of each learning task and an associated well-designed rubric. Formative feedback is also important for online students, and needs to be meaningful, timely, and aligned
with the rubrics. Using a variety of instructional methods to reach all learning needs is another critical factor for effective assessment. The last finding was that having open communication and group work helped students build a strong learning community. To assess group work or collaborative learning projects, scholars (e.g., McConnell, 2006; Garrison, 2011) highly recommended assessing both the process of collaboration and the learning outcomes (e.g., the final course project). Assessing the collaboration process may overcome some online collaboration issues, such as unequal student contributions, which may influence students’ satisfaction with the course. Capdeferro and Romero (2012) identified several factors leading to online students’ frustration, including unequal quality of individual contributions that caused unfair grading. Providing formative assessment/feedback on the collaboration process helps instructors to determine students’ participation level, ascertain needs for additional resources, monitor group learning progress, and acknowledge individual student contributions (Palloff & Pratt, 2007). Garrison (2011) recommended two techniques to assess the online collaboration process:

1. Ask collaborative groups to submit a completed collaborative assignment along with each student’s contribution to it, or
2. Use multiple types of assessment such as self-assessment, peer-assessment, and group-assessment.

The use of peer-assessment has a twofold advantage: giving feedback provides students with a better idea of the criteria for the product and develops a sense of confidence as they see how peers are performing. Moreover, receiving feedback assists students in improving their products (van der Pol, van den Berg, Admiraal, & Simons, 2008). Palloff and Pratt (2010) recommended several points for online collaboration assessment:

• Collaborative work should be assessed collaboratively.
• Instructors need to provide clear guidelines for assessment of collaborative work.
• The use of rubrics helps to make the assessment task easier and more objective.
• Students need to understand what is expected of them in the assessment function. Providing an area online in which they can clarify this role or ask questions can be helpful.
• When assessment aligns with learning objectives and collaborative activities, the task of assessment becomes less cumbersome and student satisfaction with the learning process increases (p. 53).
In addition to student learning assessment, teaching presence assessment is another activity that could be conducted to enhance teaching performance. Such assessment could be conducted on two occasions: the middle and the end of the course. Feedback from students provides an opportunity for instructors to assess their teaching strengths and weaknesses and thus enhance their performance. Assessing teaching presence should involve the main indicators proposed by Anderson and his colleagues (2001): instructional design and organization, facilitating discourse, and direct instruction. Instructional design and organization involve “setting the curriculum, establishing time parameters, utilizing the medium effectively, establishing netiquette, and designing instructional methods effectively” (Shea, Fredericksen, Pickett, & Pelz, 2003, p. 69). Facilitating discourse involves “identifying areas of agreement and disagreement in online discussions, seeking to reach consensus, reinforcing student contributions, setting the climate for learning, drawing in other participants and prompting discussion, and
assessing the efficacy of the process" (Shea et al., 2003, p. 70). Direct instruction includes "presenting content and questions, focusing the discussion on specific issues, confirming understanding, diagnosing misconceptions, and injecting knowledge from diverse sources" (Shea et al., 2003, p. 71).
A Conceptual Framework for Teaching Presence
The four components of the conceptual framework are displayed in an iterative process, as outlined in Figure 1. Each of these components highlights strategies for enabling a strong teaching presence to be formed in an online course.
FUTURE RESEARCH DIRECTIONS
As we consider future research directions for the conceptual framework for developing teaching presence in online learning through shared stakeholder responsibility, two major areas for research become apparent. The first is the empirical exploration of the framework itself. This involves asking questions focused on the relative necessity of each component and how each component interacts with the framework as a whole.
Figure 1. A conceptual framework for developing teaching presence in online learning through shared stakeholder responsibility. Used by permission, Johnson and Altowairiki (2015)
The second area for research considers how instructors make their transition from the face-to-face classroom to the online environment using this framework. The exploration of these future research directions will further fine-tune this model and aid in the development of appropriately linked educational development resources.
CONCLUSION
As this chapter outlines, stakeholders need to be prepared for the transition to teaching and learning in the online environment for effective teaching and learning to take place. Through careful parsing of the Community of Inquiry's teaching presence component (Arbaugh et al., 2008), we can see that this component has strong ties to various supporting elements: the strategic preparation, design, implementation, and assessment processes involved in a teaching and learning environment. These processes contribute to the opportunities and challenges faced by online instructors. By understanding their intricacies and interrelationships, online courses can be better designed and scaffolded to support online instructors and students.
REFERENCES
Akyol, Z., & Garrison, D. R. (2008). The development of a Community of Inquiry over time in an online course: Understanding the progression and integration of social, cognitive and teaching presence. Journal of Asynchronous Learning Networks, 12(3-4), 3–22. Akyol, Z., & Garrison, D. R. (2011). Understanding cognitive presence in an online and blended community of inquiry: Assessing outcomes and processes for deep approaches to learning. British Journal of Educational Technology, 42(2), 233–250. Allen, E., & Seaman, J. (2008). Staying the course: Online education in the United States, 2008. The Sloan Consortium. Retrieved from http://sloanconsortium.org/publications/survey/staying_course Allen, E. I., & Seaman, J. (2013). Changing course: Ten years of tracking online education in the United States. Babson Survey Research Group. Retrieved from http://www.onlinelearningsurvey.com/reports/changingcourse.pdf Allen, E. I., & Seaman, J. (2014). Grade change: Tracking online education in the United States. Babson Survey Research Group. Retrieved from http://www.onlinelearningsurvey.com/reports/gradechange.pdf Ally, M. (2008). Foundations of educational theory for online learning. In T. Anderson (Ed.), The theory and practice of online learning (2nd ed.). Edmonton: AU Press. Altowairiki, N. (2013). Instructors' and students' experiences with online collaborative learning in higher education [Master's thesis]. University of Calgary. Amaral, K. E., Shank, J. D., Shibley, I., & Shibley, L. R. (2011). Designing a blended course: Using ADDIE to guide instructional design. Journal of College Science Teaching, 40(6), 80.
Anderson, L. W., & Krathwohl, D. R. (Eds.). (2001). A taxonomy for learning, teaching, and assessing: A revision of Bloom’s taxonomy of educational objectives. New York, NY: Longman. Anderson, T., Rourke, L., Garrison, D. R., & Archer, W. (2001). Assessing Teacher Presence in a computer conferencing context. Journal of Asynchronous Learning Networks, 5(2). Retrieved from http:// auspace.athabascau.ca/handle/2149/725 Arbaugh, J. B., Cleveland-Innes, M., Diaz, S. R., Garrison, D. R., Ice, P., Richardson, K., & Swan, K. P. (2008). Developing a Community of Inquiry instrument: Testing a measure of the Community of Inquiry framework using a multi-institutional sample. The Internet and Higher Education, 11(3), 133–136. doi:10.1016/j.iheduc.2008.06.003 Bandura, A. (1981). Self-referent thought: A developmental analysis of self-efficacy. In J. H. Flavell & L. D. Ross (Eds.), Social cognitive development: Frontiers and possible futures. Cambridge, England: Cambridge University Press. Bandura, A. (1993). Perceived self-efficacy in cognitive development and functioning. Educational Psychologist, 28(2), 117–149. doi:10.1207/s15326985ep2802_3 Barkley, E., Cross, K. P., & Mayor, C. H. (2005). Collaborative learning techniques. San Francisco: Jossey-Bass Publishers. Bates, A. W. (2015). Teaching in a digital age: Guidelines for designing teaching and learning. Anthony William Bates. Retrieved from http:open.bccampus.ca Bowman, J. (2014). Online learning in music: Foundations, frameworks, and practices. Oxford, New York: Oxford University Press. doi:10.1093/acprof:oso/9780199988174.001.0001 Branch, R. M., & Kopcha, T. J. (2014). Instructional design models. In J. M. Spector, M. D. Merrill, & J. Elen (Eds.), Handbook of research on educational communications and technology (4th ed., pp. 77–87). New York, NY: Springer. doi:10.1007/978-1-4614-3185-5_7 Bruner, J. S. (1960). The process of education. Cambridge, MA: Harvard University Press. Bruns, A., & Humphreys, S. (2005, October 16-18). Wikis in teaching and assessment: The M/ Cyclopedia project. Proceedings of theInternational Wiki Symposium, San Diego, CA, USA. doi:10.1145/1104973.1104976 Capdeferro, N., & Romero, M. (2012). Are online learners frustrated with collaborative learning experiences? International Review of Research in Open and Distance Learning, 13(2), 26–44. Collins, A., Brown, J. S., & Holum, A. (1991). Cognitive apprenticeship: Making thinking visible. American Educator, 6, 38–46. deNoyelles, A., Zydney, J., & Chen, B. (2014). Strategies for creating a community of inquiry through online asynchronous discussions. Journal of Online Learning and Teaching., 10(1), 153–165. Dewey, J. (1938). Experience & education. New York, NY: Simon & Schuster. Eddy, P. L., & Lawrence, A. (2013). Wikis as platforms for authentic assessment. Innovative Higher Education, 38(4), 253–265. doi:10.1007/s10755-012-9239-7
Fink, L. D. (2013). Creating significant learning experiences: An integrated approach to designing college courses (2nd ed.). San Francisco: Jossey-Bass. Gagné, R. (1985). The Conditions of Learning (4th ed.). New York, NY: Holt, Rinehart & Winston. Gagné, R., & Driscoll, M. (1988). Essentials of learning for instruction (2nd ed.). Englewood Cliffs, NJ: Prentice-Hall. Garrison, D. (2011). E- learning 21st century: A framework for research and practice (2nd ed.). NY: RoutledgeFalmer. Garrison, D. R. (2006). Online collaboration principles. Journal of Asynchronous Learning Networks, 10(1), 25–34. Garrison, D. R. (2009). Implications of online learning for the conceptual development and practice of distance education. Journal of Distance Education, 23(2), 93–104. Garrison, D. R., & Anderson, T. (2003). E-learning in the 21st Century: A framework for research and practice. NY: Routledge Falmer. Garrison, D. R., Anderson, T., & Archer, W. (1999). Critical inquiry in a text-based environment: Computer conferencing in higher education. The Internet and Higher Education, 2(2-3), 87–105. doi:10.1016/ S1096-7516(00)00016-6 Garrison, D. R., Anderson, T., & Archer, W. (2000). Critical inquiry in a text-based environment: Computer conferencing in higher education. The Internet and Higher Education, 11(2), 1–14. Garrison, D. R., Anderson, T., & Archer, W. (2001). Critical thinking and computer conferencing: A model and tool to assess cognitive presence. American Journal of Distance Education, 15(1), 7–23. doi:10.1080/08923640109527071 Garrison, D. R., & Arbaugh, J. B. (2007). Researching the community of inquiry framework: Review, issues, and future directions. The Internet and Higher Education, 10(3), 157–172. doi:10.1016/j.iheduc.2007.04.001 Garrison, D. R., & Cleveland-Innes, M. (2005). Facilitating cognitive presence in online learning: Interaction is not enough. American Journal of Distance Education, 19(3), 133–148. doi:10.1207/ s15389286ajde1903_2 Garrison, D. R., Cleveland-Innes, M., & Fung, T. S. (2010). Exploring causal relationships among teaching, cognitive and social presence: Student perceptions of the community of inquiry framework. The Internet and Higher Education, 13(1-2), 31–36. doi:10.1016/j.iheduc.2009.10.002 Gaytan, J., & McEwen, B. C. (2007). Effective online instructional and assessment strategies. The American Journal of Distance, 21(3), 117–132. doi:10.1080/08923640701341653 Gikandi, J. W., Morrow, D., & Davis, N. E. (2011). Online formative assessment in higher education: A review of the literature. Computers & Education, 57(4), 2333–2351. doi:10.1016/j.compedu.2011.06.004 Guiller, J., Durndell, A., & Ross, A. (2008). Peer interaction and critical thinking: Face-to-face or online discussion? Learning and Instruction, 18(2), 187–200. doi:10.1016/j.learninstruc.2007.03.001
Harasim, L. (2012). Learning theory and online technologies. New York, NY: Routledge. Herrington, J., Reeves, T. C., & Oliver, R. (2014). Authentic learning environments. In J. M. Spector, M. D. Merrill, J. Elen, & M. J. Bishop (Eds.), Handbook of Research on Educational Communications and Technology (4th ed., pp. 401–412). New York, NY: Springer. doi:10.1007/978-1-4614-3185-5_32 Hoyt, J., & Oviatt, D. (2013). Governance, faculty incentives, and course ownership in online education at doctorate-granting universities. American Journal of Distance Education, 27(3), 165–178. doi:10.1 080/08923647.2013.805554 Januszewski, A., & Molenda, M. (2008). Educational technology: a definition with commentary. New York: Lawrence Erlbaum Associates. Johnson, C. (2016). Developing a teaching framework for online music course [Unpublished Ph.D. dissertation]. University of Calgary, Calgary, Alberta, Canada. Johnson, C., & Altowairiki, N. (2015, May 12-13). Building and maintaining online teaching presence: A practical starting point. Presentation at2015 University of Calgary Conference on Postsecondary Learning and Teaching, Calgary, Alberta, Canada. Jonassen, D. (1999). Designing constructivist learning environments. In C. M. Reigeluth (Ed.), Instructional design theories and model: A new paradigm of instructional theory (Vol. 3, pp. 215–241). Hillsdale, NJ: Lawrence Erlbaum Associates. Jonassen, D., Davidson, M., Collins, C., Campbell, J., & Haag, B. B. (1995). Constructivism and Computer-Mediated Communication in Distance Education. American Journal of Distance Education, 9(2), 7–26. doi:10.1080/08923649509526885 Jonassen, D. H. (1992). Evaluating constructivistic learning. In T. M. Duffy & D. H. Jonassen (Eds.), Constructivism and the Technology of Instruction: A Conversation. Florence, KY: Routledge. Jones, B. D. (2010). An examination of motivation model components in face-to-face and online instruction. Electronic Journal of Research in Educational Psychology, 8(3), 915–944. Kanuka, H., Rourke, L., & Laflamme, E. (2006). The influence of instructional methods. on the quality of online discussion. British Journal of Educational Technology, 38(2), 260–271. doi:10.1111/j.14678535.2006.00620.x Keast, D. A. (2009). A constructivist application for online learning in music. Research Issues in Music Education, 7(1), 1–8. Kelly, R. (2009). Jump start program prepares faculty to teach online. Faculty Focus Special Report: 12 Tips for Improving Your Faculty Development Plan. Retrieved from http://www.facultyfocus.com/ free-reports/12-tips-for-improving-your-faculty-development-plan/ Kim, B., & Reeves, T. (2007). Reframing research on learning with technology: In search of the meaning of cognitive tools. Instructional Science, 35(1), 207–256. doi:10.1007/s11251-006-9005-2
Kirschner, P., Sweller, J., & Clark, R. E. (2006). Why minimal guidance during instruction does not work: An analysis of the failure of constructivist, discovery, problem-based, experiential, and inquiry-based teaching. Educational Psychologist, 41(2), 75–86. doi:10.1207/s15326985ep4102_1 Knight, P. (2002). Summative assessment in Higher Education: Practices in disarray. Studies in Higher Education, 27(3), 276–286. doi:10.1111/1468-2273.00218 Koehler, M., Mishra, P., Kereluik, K., Shin, T., & Graham, C. (2014). The technological pedagogical content knowledge framework. In J. M. Spector, M. D. Merrill, J. Elen, & M. J. Bishop (Eds.), Handbook of Research on Educational Communications and Technology (4th ed., pp. 101–111). New York, NY: Springer. doi:10.1007/978-1-4614-3185-5_9 Kupczynski, L., Ice, P., Weisenmayer, R., & McCluskey, F. (2010). Student perceptions of the relationship between indicators of teaching presence and success in online courses. Journal of Interactive Online Learning, 9(1), 23–43. Lazonder, A. (2014). Inquiry learning. In J. M. Spector, M. D. Merrill, J. Elen, & M. J. Bishop (Eds.), Handbook of Research on Educational Communications and Technology (4th ed., pp. 453–464). New York, NY: Springer. doi:10.1007/978-1-4614-3185-5_36 Lock, J. V. (2003). Building and sustaining virtual communities [Doctoral dissertation]. University of Calgary, Calgary, Alberta, Canada. Lowyck, J. (2014). Bridging learning theories and technology-enhanced environments: A critical appraisal of its history. In J. M. Spector, M. D. Merrill, J. Elen, & M. J. Bishop (Eds.), Handbook of Research on Educational Communications and Technology (4th ed., pp. 3–20). New York: Springer. doi:10.1007/978-1-4614-3185-5_1 Mao, J., & Peck, K. (2013). Assessment strategies, self-regulated learning skills and perceptions of assessment in online learning. Quarterly Review of Distance Education, 14(2), 75–95. McConnell, D. (2006). E-learning groups and communities. Berkshire: Open University Press. McGrath, J. E. (1992). Time, interaction, and performance (TIP): A theory of groups. Small Group Research, 22(2), 147–174. doi:10.1177/1046496491222001 McGrath, J. E., Arrow, H., Gruenfeld, D. H., Hollingshead, A. B., & O'Connor, K. M. (1993). Groups, tasks, and technology: The effects of experience and change. Small Group Research, 24(3), 406–420. doi:10.1177/1046496493243007 Molenda, M. (2003). In search of the elusive ADDIE model. Performance Improvement, 42(5), 34–36. doi:10.1002/pfi.4930420508 Molenda, M. (2015). In search of the elusive ADDIE model. Performance Improvement, 54(2), 40–42. Retrieved from http://doi.org/10.1002/pfi.21461 Moore, M., & Kearsley, G. (2005). Distance education: A systems view. Toronto, Canada: Nelson.
Osguthorpe, R. T., & Graham, C. R. (2003). Blended learning environments: Definitions and directions. Quarterly Review of Distance Education, 4(3), 227–233. Palloff, R., & Pratt, K. (2007). Building online learning communities (2nd ed.). San Francisco: Jossey-Bass. Palloff, R. M., & Pratt, K. (2010). Collaborating online: Learning together in community. New York: Wiley & Sons. Palloff, R. M., & Pratt, K. (2011). The excellent online instructor: Strategies for professional development. San Francisco: Jossey-Bass. Park, Y. J., & Bonk, C. (2007). Is online life a breeze? A case study for promoting synchronous learning in a blended graduate course. Journal of Online Learning and Teaching, 3(3), 1–14. Pew Internet Project. (2014). Mobile technology fact sheet. Pew Research Center's Internet & American Life Project. Retrieved from http://www.pewinternet.org/fact-sheets/mobile-technology-fact-sheet/ Picciano, A. G. (2002). Beyond student perceptions: Issues of interaction, presence, and performance in an online course. Journal of Asynchronous Learning Networks, 6(1), 21–40. Retrieved from http://www.sloanconsortium.org Pratt, N. (2008). Multi-point e-conferencing with initial teacher training students in England: Pitfalls and potential. Teaching and Teacher Education, 24(6), 1476–1486. doi:10.1016/j.tate.2008.02.018 Rao, K. (2012). Universal design for online courses: Addressing the needs of non-traditional learners. Proceedings of the 2012 IEEE International Conference on Technology Enhanced Education (ICTEE). doi:10.1109/ICTEE.2012.6208664 Rao, K., & Tanners, A. (2011). Curb cuts in cyberspace: Universal Instructional Design for online courses. Journal of Postsecondary Education and Disability, 24(3), 211–229. Redmond, P., & Lock, J. V. (2006). A flexible framework for online collaborative learning. The Internet and Higher Education, 9(4), 267–276. doi:10.1016/j.iheduc.2006.08.003 Reigeluth, C. M., & Carr-Chellman, A. A. (2009). Understanding instructional theory. In C. M. Reigeluth & A. A. Carr-Chellman (Eds.), Instructional design theories and models (Vol. III, pp. 3–26). New York, NY: Taylor & Francis. Richardson, J. C., & Swan, K. (2003). Examining social presence in online courses in relation to students' perceived learning and satisfaction. Journal of Asynchronous Learning Networks, 7(1). Rockinson-Szapkiw, A. J., Baker, J. D., Neukrug, E., & Hanes, J. (2010). The efficacy of computer mediated communication technologies to augment and support effective online helping profession education. Journal of Technology in Human Services, 28(3), 161–177. doi:10.1080/15228835.2010.508363 Rose, D. H., & Meyer, A. (2002). Teaching every student in the digital age: Universal Design for Learning. Alexandria, VA: Association for Supervision & Curriculum Development.
Rovai, A. P. (2002). Sense of community, perceived cognitive learning, and persistence in asynchronous learning networks. The Internet and Higher Education, 5(4), 319–332. doi:10.1016/S1096-7516(02)001306 Rudestam, K. J., & Schoenholtz-Read, J. (2010). The flourishing of adult online education. In K. J. Rudestam & J. Schoenholtz-Read (Eds.), Handbook of Online Learning (2nd ed.). California: SAGE. Salloum, S. (2012). Student perceptions of computer-mediated communication tools in online learning| Helpfulness and effects on teaching, social, and cognitive presence. doi:10.1007/s10726-011-9234-x Shea, P., & Bidjerano, T. (2010). Learning presence: Towards a theory of self-efficacy, self-regulation, and the development of a communities of inquiry in online and blended learning environments. Computers & Education, 55(4), 1721–1731. doi:10.1016/j.compedu.2010.07.017 Shea, P., Li, C., Swan, K., & Pickett, A. (2005). Developing learning community in online asynchronous college courses: The role of teaching presence. Journal of Asynchronous Learning Networks, 9(4), 59–82. Shea, P., Li, C., Swan, K., & Pickett, A. (2006). A study of teaching presence and student sense of learning community in fully online and web-enhanced college courses. The Internet and Higher Education, 9(3), 175–190. doi:10.1016/j.iheduc.2006.06.005 Shea, P., Vickers, J., & Hayes, S. (2010). Online instructional effort measured through the lens of teaching presence in the Community of Inquiry framework: A re-examination of measures and approach. International Review of Research in Open and Distance Learning, 11(3), 1–29. Shea, P. J., Pickett, A. M., & Pelz, W. E. (2003). A follow-up investigation of “teaching presence” in the SUNY Learning Network. Journal of Asynchronous Learning Networks, 7(2), 61–80. Shi, S., Mishra, P., Bonk, C., Tan, S., & Zhao, Y. (2006). Thread theory: A framework applied to content analyses of synchronous computer mediated communication data. International Journal of Instructional Technology & Distance Learning, 3(3). Retrieved from http://www.itdl.org/journal/mar_06/index.htm Shulman, L. S. (1986). Those who understand: Knowledge growth in teaching. Educational Researcher, 15(2), 4–14. doi:10.3102/0013189X015002004 Smith, G. G., Sorensen, C., Gump, A., Heindel, A. J., Caris, M., & Martinez, C. D. (2011). Overcoming student resistance to group work: Online versus face-to-face. The Internet and Higher Education, 14(2), 121–128. doi:10.1016/j.iheduc.2010.09.005 So, H.-J., & Brush, T. A. (2008). Student perceptions of collaborative learning, social presence and satisfaction in a blended learning environment: Relationships and critical factors. Computers & Education, 51(1), 318–336. doi:10.1016/j.compedu.2007.05.009 Swan, K. (2005). A constructivist model for thinking about learning online. In J. Bourne & J. C. Moore (Eds.), Elements of Quality Online Education: Engaging Communities. Needham, MA: Sloan. Retrieved from http://www.kent.edu/rcet/Publications/upload/constructivist%20theory.pdf Swan, K., & Shih, L. F. (2005). On the nature and development of social presence in online course discussions. Journal of Asynchronous Learning Networks, 9(3), 115–136.
Szeto, E. (2015). Community of Inquiry as an instructional approach: What effects of teaching, social and cognitive presences are there in blended synchronous learning and teaching? Computers & Education, 81, 191–201. doi:10.1016/j.compedu.2014.10.015 Taras, M. (2005). Assessment – Summative and Formative – Some theoretical reflections. British Journal of Educational Studies, 53(4), 466–478. doi:10.1111/j.1467-8527.2005.00307.x Taras, M. (2010). Back to basics: Definitions and processes of assessments. Práxis Educativa, 5(2), 123–130. doi:10.5212/PraxEduc.v.5i1.123130 Taylor, P., Morin, R., Cohn, D., Kochhar, R., & Clark, A. (2008). Inside the Middle Class: Bad Times Hit the Good Life. Washington: PewResearch Center. Retrieved from http://www.pewsocialtrends. org/2008/04/09/inside-the-middle-class-bad-times-hit-the-good-life/ Tinoca, L., Oliveira, I., & Pereira, A. (2010). Online group work patterns: how to promote a successful collaboration.Proceedings of the Seventh International Conference on Networked Learning (pp. 429–438). Tomlinson, C. A., & Moon, T. R. (2013). Assessment and student success in a differentiated classroom. Alexandria, VA, USA: Association for Supervision & Curriculum Development. Tu, C. (2004). Online collaborative learning communities: Twenty-one design to building an online collaborative learning communities. Libraries Unlimited. Tu, C.-H., & Corry, M. (2003). Building active online interaction via a collaborative learning community. Computers in the Schools, 20(3), 51–59. doi:10.1300/J025v20n03_07 van der Pol, J., van den Berg, B. A. M., Admiraal, W. F., & Simons, P. R. J. (2008). The nature, reception, and use of online peer feedback in higher education. Computers & Education, 51(4), 1804–1817. doi:10.1016/j.compedu.2008.06.001 Vygotsky, L. S. (1978). Mind in society: The development of higher psychological processes. In M. Cole, V. John-Steiner, S. Scribner, & E. Souberman (Eds.), Mind in society: The development of higher psychological processes. Cambridge, MA, USA: Harvard Press. Weinel, M., Bannert, M., Zumbach, J., Hoppe, H., & Malzahn, N. (2011). A closer look on social presence as a causing factor in computer-mediated collaboration. Computers in Human Behavior, 27(1), 513–521. doi:10.1016/j.chb.2010.09.020 Wiggins, G., & McTighe, J. (2005). Understanding by Design. Ontario, Canada: ASCD. Wilson, B., Ludwig-Hardman, S., Thornam, C., & Dunlap, J. (2004). Bounded community: Designing and facilitating learning communities in formal courses. The International Review of Research In Open And Distributed Learning, 5(3), 1–22. Retrieved from http://www.irrodl.org/index.php/irrodl/article/ view/204 doi:10.19173/irrodl.v5i3.204
KEY TERMS AND DEFINITIONS
Assessment: Tool(s) used to identify the strengths and weaknesses of students' development of course objectives (Taras, 2015).
Collaborative Learning: "Is essentially a social activity, that meaning is constructed through communication, collaborative activity, and interactions with others" (Swan, 2005, p. 5).
Community of Inquiry framework: A three-fold model developed by Garrison, Anderson, and Archer (2001) comprising cognitive presence, teaching presence, and social presence.
Online Learning: A form of e-learning that presents 80–100 percent of learning content and learning tasks through the Internet medium (Allen & Seaman, 2008; Garrison, 2011).
Online Teaching: The transactions of presenting ideas and knowledge content to students through the Internet medium (Garrison, 2011).
Teaching Presence: "The design, facilitation, and direction of cognitive and social presences for the purpose of realizing personally meaningful and educationally worthwhile learning outcomes" (Anderson, Rourke, Garrison, & Archer, 2001, p. 5).
Chapter 9
A Case Study of Peer Assessment in a Composition MOOC:
Students' Perceptions and Peer-Grading Scores versus Instructor-Grading Scores
Lan Vu
Southern Illinois University at Carbondale, USA
DOI: 10.4018/978-1-5225-1851-8.ch009
ABSTRACT
The large enrollments of many thousands of students in MOOCs seem to exceed the assessment capacity of instructors; the inability of instructors to grade so many papers is likely responsible for MOOCs turning to peer assessment. However, there has been little empirical research about peer assessment in MOOCs, especially composition MOOCs. This study aimed to address issues in peer assessment in a composition MOOC, particularly students' perceptions and peer-grading scores versus instructor-grading scores. The findings provided evidence that peer assessment was well received by the majority of students, although many students also expressed negative feelings about the activity. Statistical analysis showed significant differences between the grades given by students and those given by the instructors, with the grades students awarded to their peers tending to be higher than the instructor-assigned grades. Based on the results, this study concludes with implications for peer assessment in a composition MOOC context.
INTRODUCTION
The evolution from traditional online learning, or online learning 1.0, to online learning 2.0 has created both opportunities and challenges for higher education (Sloan C, 2013; Grosseck, 2009; McLoughlin & Lee, 2007). In traditional online learning, online courses are quite similar to traditional face-to-face courses in terms of the ratio of students to instructors. However, in online learning 2.0, of which massive open online courses (MOOCs), including MOOCs in composition, are a typical representative, an
online instructor can have up to several thousand students in his or her course. Grading in such massive open online courses becomes a burden, or a mission impossible, even for the most dedicated professors with an army of equally dedicated teaching assistants. Since not all assignments can be designed in auto-graded formats, and artificial intelligence grading programs are not well regarded by educators and researchers (Condon, 2013; Deane, 2013; Bridgeman, Trapani, & Yigal, 2012; Byrne, Tang, Truduc, & Tang, 2010; Chen & Cheng, 2008; Cindy, 2007; Benett, 2006; Cheville, 2004; Chodorow & Burstein, 2004), online peer grading is utilized, especially for composition and other courses in the humanities. This online peer grading practice shifts the traditional grading authority from the instructor to the learners and poses many unanswered questions about the reliability and validity of online peer-reviewed grades in an open online learning setting. In the literature, findings from a few studies on peer grading (e.g., Cho et al., 2006; Sadler & Good, 2006; Bouzidi & Jaillet, 2009) show a high consistency among grades assigned by peers and a high correlation between peer grading and teacher grading, which indicates that peer grading has been found to be a reliable and valid assessment tool. However, these findings are generally based on the context of college courses with small or moderate enrollments. By the time this study was conducted, there had been only one empirical study on peer grading in a MOOC context. Lou, Robinson, and Park (2014) examined peer grading assignments from a Coursera MOOC and found that grading scores given by peer students were fairly consistent and highly similar to the scores given by instructors. Nevertheless, their results actually refer to a Coursera MOOC named Maps and the Geospatial Revolution, not a composition MOOC. Given that research on peer assessment in MOOCs is limited, the present study looks into certain issues of peer assessment in composition in a MOOC context. For this study, I collected surveys, conducted interviews, and accumulated statistical data on students' and instructors' grades and comments from a seven-week MOOC-based composition course with the purpose of examining aspects of peer assessment in this context, particularly the students' perceptions and the grades given by the students and the instructors.
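To illustrate the kind of comparison this chapter later reports, peer-assigned versus instructor-assigned grades on the same papers, the short sketch below runs a paired t-test and a Pearson correlation on invented scores. The numbers, the 100-point scale, and the choice of tests are assumptions for illustration only; they are not the study's actual data or analysis procedure.

```python
# Hypothetical illustration of comparing peer-assigned and instructor-assigned
# grades for the same set of papers; all scores below are invented.
from scipy import stats

peer_medians = [85, 78, 92, 74, 88, 81, 95, 70, 83, 90]  # median of peer ratings per paper
instructor   = [80, 72, 90, 70, 82, 78, 91, 65, 79, 86]  # instructor's score for the same papers

# Paired t-test: do the two sets of grades differ systematically?
t_stat, p_value = stats.ttest_rel(peer_medians, instructor)

# Pearson correlation: do peers and the instructor rank the papers similarly?
r, r_p = stats.pearsonr(peer_medians, instructor)

mean_diff = sum(p - i for p, i in zip(peer_medians, instructor)) / len(instructor)
print(f"mean difference (peer - instructor): {mean_diff:.1f}")   # 4.3 for this sample data
print(f"paired t-test: t = {t_stat:.2f}, p = {p_value:.4f}")
print(f"correlation:   r = {r:.2f}, p = {r_p:.4f}")
```

If the score distributions were clearly non-normal, a Wilcoxon signed-rank test (scipy.stats.wilcoxon) would be a reasonable non-parametric alternative for the paired comparison.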
LITERATURE REVIEW
MOOCs in Composition
MOOCs (Massive Open Online Courses) hit the popular press in 2012 and were considered one of the most important emerging developments in educational technology in 2013 (New Media Horizon, 2013). In 2012, the Bill & Melinda Gates Foundation awarded 12 grants, totaling three million dollars, to organizations and universities to develop MOOCs for a variety of courses, from developmental math to English composition. Innovative institutions and faculty receiving Gates Foundation grants for developing MOOCs for composition include Denise K. Comer, Director of First-Year Writing, Duke University; Rebecca Burnett, Director of the Writing and Communication Program, and Karen Head, Director of the Communication Center, Georgia Institute of Technology; and Kay Halasek, Director of Second-Year Writing, The Ohio State University. Comer's course, English Composition I: Achieving Expertise, opened with over 8,000 enrollments in late December 2012. This course aimed to provide an introduction to and foundation for the academic reading and writing characteristic of college. For Head's MOOC in May 2013, the First-Year Composition 2.0 course was designed to help learners develop a better process and gain confidence in written, visual, and oral communication as well as create and critique college-level documents and presentations. In Head's course, learners would draft and revise the following assignments:
a personal essay, an image, and an oral presentation. Halasek’s Writing II: Rhetorical Composing course introduced a variety of rhetorical concepts, ideas and techniques to inform and persuade audiences, which help learners become more effective consumers and producers of written, visual, and multimodal texts. In Halasek’s course, learners can exchange words, ideas, talents, and support. All of these composition MOOCs were created through the MOOC platform Coursera. After these MOOCs in composition were completed, questions as to whether the courses were a success were raised. Posts by Karen Head (2013) on the freshman-composition MOOC that she taught with Gates Foundation funding appeared in The Chronicle. The stats were disappointing. Head wrote: If we define success by the raw numbers, then I would probably say No, the course was not a success. Of course, the data are problematic: Many people have observed that MOOCs often have terrible retention rates, but is retention an accurate measure of success? We had 21,934 students enrolled, 14,771 of whom were active in the course. Our 26 lecture videos were viewed 95,631 times. Students submitted work for evaluation 2,942 times and completed 19,571 peer assessments (the means by which their writing was evaluated). However, only 238 students received a completion certificate—meaning that they completed all assignments and received satisfactory scores. Head (2013) and her team had some hypotheses for the reasons there were so few students completing the course. One of the hypotheses is students not completing all three major assignments could not pass the course. Plus, many students failed to complete the course because they had technical problems and cultural issues, or because they joined the course late. In spite of the terrible retention rates, Head (2013) claimed that the course was a success if the success is defined by lessons learned in the course design and presentation. She stated that “…if we define success by lessons learned in designing and presenting the course, I would say Yes, it was a success. From a pedagogical perspective, nobody on our team will ever approach course design in the same way. We are especially interested in integrating new technologies into our traditional classes for a more hybrid approach.” In regard to technology integration, Head suggested not rushing to teach another MOOC soon, especially when the technology is lacking for courses in subject areas like writing, which have such strong qualitative evaluation requirements. “Too often we found our pedagogical choices hindered by the course-delivery platform we were required to use, when we felt that the platform should serve the pedagogical requirements. Too many decisions about platform functionality seem to be arbitrary, or made by people who may be excellent programmers but, I suspect, have never been teachers” (Head, 2013). In general, from what Head (2013) pointed out, technical issues seem to have certain effects on the MOOCs’ working, for example retention rate and course delivery. Similar to Head’s MOOC, the MOOC in composition offered by The Ohio State University seemed not to be successful in terms of the low retention rate. According to the final report on the MOOC Writing II: Rhetorical Composing, posted on Educause (2013b), among 32,765 total enrolled students, there were 55% engaging in the course at least once and 444 receiving the completion statements. 
However, what was considered a success was the inaugural use of the WExMOOC platform for students to review peers' writing and to receive analytics, although training students to produce strong reviews was challenging. As the report stated, approximately one fourth of the course's students used the platform to do peer review and to receive analytics that helped them compare their own skills with the skills of others (Educause, 2013b). Another area of the course considered successful was the training of students to make critical reviews of peers' work, although this training proved challenging (Educause, 2013b).
For the other composition MOOC offered by Duke University, although there are no official results or evaluations published, many learners have shared their experiences and comments on personal websites or blogs. Steve Krause (2013), a professor at Eastern Michigan University, posted on his blog under the title “The end of the Duke Composition MOOC: again, what did we learn here?”: “If the point of the Duke Composition MOOC was to see if it could provide an educational experience that could compete with a traditional first year writing course taken in order to earn the credential of a college degree, then the answer is clearly no.” Krause pointed out problems of this specific MOOC, for example the way the course was organized and the method of evaluating students’ work. Krause wrote: I thought there was almost no connection between Comer’s video lectures/teaching, the online discussion, and the assignments themselves, and every week there seemed to be something added to the Coursera web site to further confuse things… A multiple choice test for a writing course wouldn’t work, machine grading of writing (as it currently exists, at least) doesn’t work, and peer evaluation as the only means of evaluation for a credit-bearing course doesn’t work. So logistically, it seems to me that the way we teach first year writing right now is probably the most efficient/effective way to do it at large universities. Apart from the first three composition MOOCs mentioned above, there have been a great number of other MOOCs in composition recently offered by different universities. In general, the birth of MOOCs, including MOOCs in composition, has created a trend in teaching and learning – a configuration for delivering learning content online to an unlimited number of learners who want to take the course. MOOCs have been found to have certain success, for example in the use of WExMOOC platform for students’ peer assessment and the training students to produce strong reviews (Educause, 2013b) although there’s a lack of information and research on how these assessments were arrived at. However, with the nature of MOOCs like massiveness and scalability, MOOCs presents certain challenges, one of which is assessment.
Overview of Peer Assessment
Peer assessment, which has been researched for over 30 years, is a common educational practice in a wide range of subject domains, including writing, in both classrooms and traditional online classes (with small or moderate enrollments). Baird and Northfield (1992) stated that peer assessment refers to "specific judgments of ratings made by pupils about their achievement, often in relation to teacher-designed categories" (p. 12). Topping et al. (2000) defined peer assessment as "an arrangement for peers to consider the level, value, worth, quality or successfulness of the products or outcomes of learning of others of similar status" (p. 150). Topping (2009) stated that peer assessment activities can be carried out in different areas or subjects and can involve a wide variety of products, for example, oral presentations, test performance, portfolios, and writing. Since peer assessment involves numerous applications and a wide range of educational methods, educators and scholars have employed varied terminology to describe this practice. Frequently used terms include "peer assessment" (Falchikov & Goldfinch, 2000; Falchikov, 1995; Topping et al., 2000; Cho, Schunn, & Wilson, 2006), "peer review" and "peer feedback" (Connors, 1997; Gielen et al., 2010), "peer response" (DiPardo & Freedman, 1988), "peer marking" (Fry, 1990), "peer criticism" (Holt, 1992), "peer rating" (Magin, 1993), and "peer grading" (Sadler & Good, 2006). Despite the varied terminology, all of these denote a common practice: students are guided through teaching materials (e.g., checklists, rubrics, questionnaires) to evaluate peers' work
on a number of points, to provide feedback to other students on the quality of their work, and in some instances to assign a grade. In this study, the term peer assessment is used as an umbrella concept to capture the diversity of definitions and involves any activity in which students make only comments on peers’ work, or students only assign a grade to peers’ work, or students both make comments on peers’ work and assign a grade. Practices of peer assessment especially in traditional classroom instruction have been found to be effective in a large body of literature. By gathering different sources, Sadler & Good (2006) sorted the potential advantages of peer assessment over teacher grading into four perspectives: logistical, pedagogical, metacognitive, and affective. Logistically, it saves teacher time and results in immediate feedback; in addition, peers offer more detailed feedback than the teacher can (Sadler & Good, 2006). Pedagogically, students have opportunities to understand the topic better, develop skills through reading another’s answers. Metacognitively, peer assessment helps students become more aware of their strengths, weakness, and progress and develop a capacity to use higher order thinking skills to make judgments. Finally, affective changes can produce the dynamic in classrooms, helping students work more productively and cooperatively (Sadler & Good, 2006). In regard to benefits of peer assessment, Topping (2009) emphasized gains that either assessors or assessees have from peer assessment, among which are cognitive and metacognitive gains, improvements in writing, and improvements in peer/or group work. For the work online, Bouzidi and Jaillet (2009) stated that peer assessment “develop learning at high cognitive levels” which “involve the student in the revision, assessment, and feedback process of work online” (p. 257). The reliability and validity of peer assessment have been supported in the context of face-to-face and online education by some researchers. From their study on student grading compared to teacher grading in four secondary school science classrooms, Sadler & Good (2006) pointed out very high correlations between grades assigned by students and those by the teacher. This study also shows that “the high levels of inter-rater agreement are possible between students and teacher when students grade their own or others’ papers,” given that “students should be trained to grade accurately” (Sadler & Good, 2006, p. 27). Bouzidi & Jaillet (2009), carrying out an experiment of online peer assessment in three different courses with the total of 242 students, concluded that peer assessment (peer grading) is equivalent to the assessment by the professor in certain exams of computer science and electrical engineering. In another study on peer assessment of 708 students across 16 different courses from 4 universities, Cho et. al (2006) investigated the validity and reliability of peer-generated writing grades in a scaffolded reviewing context. These 16 courses included Psychology Research Methods, Health Psychology, Cognitive Science, Education, Rehabilitation Sciences, Leisure Studies, History, and the Honors College, in which the writing task that was assigned to students varied, depending on different disciplines. The assigned paper genres consisted of the introduction of a research paper, a proposal for a research study, a proposal for an application of a research finding, and a critique of a research paper. 
The results show that "the aggregate ratings of at least 4 peers on a piece of writing are both highly reliable and as valid as instructor ratings while (paradoxically) producing very low estimates of reliability and validity from the student perspective" (p. 891). The results also suggest that "instructor concerns about peer evaluation reliability and validity should not be a barrier to implementing peer evaluations, at least with appropriate scaffolds" (p. 891). In general, findings in a number of studies on peer assessment (e.g., Cho et al., 2006; Sadler & Good, 2006; Bouzidi & Jaillet, 2009) show a high consistency among grades assigned by peers and a high correlation between peer grading and teacher grading, which indicates that within certain contexts, like traditional college classrooms and online courses, peer assessment has been found to be a reliable and
valid assessment tool. However, these findings are generally based on the context of college courses with small or moderate enrollments (i.e. Sadler & Good’s (2006) study on 4 secondary school classes; Cho et al.’s (2006) study on 16 classes with 708 students). In the MOOC contexts, according to Educause (2013a), peer assessment is a widely applicable approach to MOOCs of different forms, contents, and products. From a pedagogical perspective, that “students in MOOCs grade each other’s work according to the professor’s specifications” seems to be “a promising practice” because “it can extend students’ learning experience” and their sharing knowledge through interactions (Boston & Helm, 2012; Kuh, 2012, as cited in Boston & Helm, 2012). In their study on peer grading in a MOOC, Lou, Robinson and Park (2014) examined 1,825 peer grading assignments from a Coursera MOOC and found that grading scores given by peer students were fairly consistent and highly similar to the scores given by instructors. However, in a class with several thousands of students such as in the context of MOOCs, this practice of peer grading can be problematic. Watters (2012) pointed out problems associated with peer assessment grading such as the anonymity of feedback (students do not know who they are assessing and who is assessed), the lack of feedback on peer feedback (there’s no way for students to give feedback on that feedback) the lack of community in the course (students don’t know much about their peers), and the variability of feedback (many students are not well prepared to give solid feedback to peers and there are issues with English as a second or foreign language). In a nutshell, the practice of peer assessment in MOOCs is still early in implementation as well as still in the process of being assessed by researchers and educators. In addition, since empirical research on this topic is very limited, whether or not peer assessments in MOOCs in general and MOOCs in composition in particular is reliable and valid in the configurations currently used is in need of further research.
Peer Assessment in Composition MOOCs
Since many students have been found to lack writing skills or to display inadequate abilities in writing (Baker, Gersten, & Graham, 2003; Swales & Feak, 2004; Leal, 2012), researchers, educators, and teachers have explored various instructional strategies and techniques used in the practice of teaching and learning writing. Among these strategies is peer assessment (also called peer review or peer response), based on theories that emphasize the social nature of language, thought, and learning, prime among them Vygotsky's (1962, 1978) view that learning is a cognitive activity that happens through social interactions and requires sharing, exchanging, and negotiation. In the field of teaching and learning writing, peer assessment is well established as an important theoretical stage of the writing process (Bruffee, 1984; Elbow, 1981), and the activity of having students critique peers' writing has become common in composition classrooms and composition textbooks (both first language (L1) and second language (L2)). Theoretically, peer assessment that involves peer interactions and collaborations will help students construct knowledge and learn effectively (Liu et al., 2002; Lee & Lim, 2012). Also, research on peer assessment supports the capacity of peers for helping each other when they work together, and shows the benefits of peer assessment in helping students revise their papers and gain a better understanding of their writing (Nelson & Murphy, 1993; Villamil & Guerrero, 1996; Topping et al., 2000; Wen & Tsai, 2006). Thompson (1981), an English teacher, presented some promising advantages of peer assessment, especially when students graded peers' work in composition classes. Thompson stated that "trained students not only grade papers competently and reliably but also write better as a result of this training" (p. 172) and that "trained students can be trusted as graders in composition research" (p. 172).
As peer assessment has been an important tool in teachers' strategies for teaching composition in general, it is not surprising that the practice of peer assessment is being carried out in the context of teaching and learning writing in a massive open online setting: MOOCs in composition. Although there is a lack of evidence for the effectiveness of peer assessment in composition MOOCs, the fact that peer assessment has been used or suggested in other settings (i.e., face-to-face or traditional online courses) likely accounts for the decision to apply it in the MOOC setting. Also, the large enrollments of many thousands of students in composition MOOCs seem to exceed the assessment capacity (i.e., evaluating and grading) of instructors; therefore, the inability of teachers to grade so many papers is likely responsible for MOOCs turning to peer assessment. Coursera's introduction of a peer assessment system to MOOCs in composition seems to help (a) MOOC students learn how to write and write better and (b) instructors minimize their grading workload, as a few researchers have suggested (Haaga, 1993; Rushton, Ramsey, & Rada, 1993; Rada et al., 1994). However, Falchikov (2001, 2005) cautioned that establishing good-quality peer assessment requires more work in organizing peer assessment activities and in training students how to provide constructive feedback. Although this caution was directed at more traditional settings, it may well hold in other settings, including composition MOOCs. Not taking other aspects into account (i.e., the need for greater training of students (Falchikov, 2001, 2005), the anonymity of feedback, the lack of feedback on feedback, and the lack of community (Watters, 2012)), Coursera (2012) claims that this peer assessment in composition MOOCs, which guides students in using instructor-constructed rubrics to evaluate and provide feedback on peer work, "offers an effective learning tool by giving students the opportunity to play the role of both 'student' and 'teacher'" and allows students to "sharpen their critical thinking skills through the active comparison" of peers' answers to the instructor's rubric (Coursera, 2012). However, it is crucial to have research that verifies the effectiveness of peer assessment in composition MOOCs. As stated on Coursera's website, the process of peer assessment includes several phases, outlined below (Table 1). Upon completion of these phases, the grades are calculated as the median of all the peer grades. In its most basic form, common rubrics designed for each of the assignments are used by students to assess peers' papers. Students' submitted papers are randomly distributed to a handful of peer raters (usually from three to five). Written peer comments are provided along with an assigned score. Finally, students receive the peer comments and the median of the peer ratings.
Table 1. Phases of peer assessment in a composition MOOC (Coursera – Peer Assessment)
Submission phase: During this phase, the assignment is open, and you have unlimited chances to submit your assignment (without penalty), up until the deadline. In submitting the assignment, you agree to abide by the Honor Code.
Evaluation phase: This phase begins shortly after the submission deadline. It consists of several components:
• Training (not required for most classes): A small number of classes may require you to practice grading sample submissions of the assignment. If this page shows up for you, you will have to pass the exercise before moving on to evaluate your peers.
• Peer evaluation: In every peer assessment, you will be required to grade a predetermined number of your peers' submissions in accordance with the rubric. These submissions will be randomly selected from the class pool. Failure to complete the requisite number of evaluations will result in a grade penalty.
• Self evaluation: In some classes, you will also be required to assess your own submission against the instructor's rubric. If this page shows up for you, and you fail to complete a self-evaluation, you may incur a grade penalty.
Results phase: Shortly after the evaluation deadline, you will be able to see your final grade and grade breakdown, as determined by your peers' evaluations of your work.
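The grading flow Coursera describes, random routing of each submission to a few peer raters followed by a final grade equal to the median of the scores received, can be sketched in a few lines. This is only an illustration of that description, not Coursera's implementation; the choice of four raters per paper, the function names, and the data structures are assumptions.

```python
# Illustrative sketch of the peer-grading flow described above: each submission
# is randomly assigned to a few peer raters, and the final grade is the median
# of the scores received. Rater counts and data structures are assumptions.
import random
from statistics import median

def assign_raters(submissions, students, raters_per_paper=4):
    """Randomly pick peer raters for each submission, excluding its author."""
    assignments = {}
    for author in submissions:
        eligible = [s for s in students if s != author]
        assignments[author] = random.sample(eligible, raters_per_paper)
    return assignments

def final_grades(scores_received):
    """scores_received maps author -> list of peer scores; the grade is the median."""
    return {author: median(scores) for author, scores in scores_received.items()}

# Example with hypothetical students and scores.
students = [f"student_{i}" for i in range(1, 9)]
submissions = {s: f"{s}_essay.txt" for s in students}
print(assign_raters(submissions, students))
print(final_grades({"student_1": [70, 85, 80, 90], "student_2": [60, 75, 65, 72]}))
# student_1 -> 82.5 (median of 70, 80, 85, 90); student_2 -> 68.5
```

Because the median is used rather than the mean, a single unusually harsh or generous rater has limited influence on the final grade.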
Although there have been studies showing peer assessment of writing to be beneficial in some contexts of teaching and learning writing, no study had been conducted on the issue of peer assessment specifically in a composition MOOC, at least by the time this study was carried out. Likewise, there had been no study of peer assessment in composition MOOCs examining reliability and/or validity, or other issues such as students' perceptions. The present study was intended to address issues in peer assessment in a MOOC in composition by looking into the practice of peer assessment in the context of MOOCs in composition, particularly the students' perceptions and the grades given by the students and the instructors.
Students' Perceptions of Peer Assessment
A few empirical studies in different disciplines have looked into what students themselves think about peer assessment and their participation in that activity. In a study of 250 students' perceptions of online peer assessment for undergraduate and graduate writing across the disciplines in ten courses, Kaufman and Schunn (2011) found that students sometimes thought peer assessment unfair and believed they were not qualified to grade peers' work. Kaufman and Schunn added that "student's perceptions about the fairness of peer assessment drop significantly following students' experience in doing peer assessment" (p. 387). In contrast to Kaufman and Schunn (2011), Simkin and Ramarapu (1997) examined student perceptions of peer-reviewed grading in which computer science undergraduates assigned final grades to each other's term papers and found that students trusted peers and felt comfortable with the practice of peer grading. For ESL composition, L2 researchers such as Leki (1990), Krashen (1982), Nelson and Murphy (1992), Stanley (1992), and Connor and Asenavage (1984) pointed out that students are positive about peer feedback and that peer feedback helps them increase confidence in writing, develop critical reading skills, and improve their writing in areas such as organization, structure, and grammar. However, Mangelsdorf's (1992) study of peer assessment in the ESL composition classroom revealed that students had negative reactions: students did not trust their peers' abilities to critique the papers. Students sometimes question the efficacy of peer responses, their own ability to give constructive feedback, or their peers' competency to evaluate written work (Leki, 1990; Zhang, 1995). In the scholarship on English composition (L1 composition), studies measuring students' perceptions of peer assessment are less common than in L2 contexts or in mixed groups of native English-speaking and ESL students (Brammer and Rees, 2007). Brammer and Rees explained this lack of knowledge of student perceptions of peer assessment as follows: "Perhaps because peer response is practically instinctive to those of us who teach writing, few have felt the need to study the student perspective. Instead, studies have focused on the quality of peer comments, their effect on the revision process, and the best methods for conducting peer review" (p. 274). In their study on peer review from the students' perspective, Brammer and Rees (2007) stated that peer assessment was used in most first-year writing classrooms, but most students found the activity not very helpful and many did not trust their peers to respond to their papers. Most students "expressed concerns about classmates' dedication and ability to peer review" (Brammer and Rees, 2007, p. 283). However, Brammer and Rees also emphasized that students did express positive impressions of peer assessment: "students who were prepared to carry out peer review through two or more teaching methods (e.g., handout, lecture, and paper demonstration) were more likely to find peer review helpful" and "more confident in their ability to peer review" (p. 280). In a study of both L1 and L2 students' perceptions of and attitudes toward peer assessment, Murau (1993) found that both groups appreciate the value of peer assessment because it can
In a study of both L1 and L2 students' perceptions of and attitudes towards peer assessment, Murau (1993) found that both groups appreciated the value of peer assessment because it can help them with, in the students' words, "grammar" (L2 writer), "vocabulary" (L2 writer), "surface errors" (L1 writer), "overall organization" (L2 writer), and "ideas" (L1 writer) (pp. 74-75). While students recognized the benefits of peer assessment, 20% would prefer to review their papers with a teacher or tutor "because they respect [the teacher's] knowledge of a language better than with a peer (L1 writer) or because 'he can explain me in technical grounds' (L2 writer)"; 20% would not review their writing with anyone because they felt nervous and uncomfortable (Murau, 1993, pp. 75-76). In regard to the perceived effects of peer assessment in a Coursera MOOC, Lou et al. (2014) concluded that approximately 63% of students believed that the peer assessment activity was helpful in developing their thinking competences. The researchers also found that about 62% of the students thought the grades given by peers were fair and the feedback was useful. Overall, much research in different disciplines has been done regarding students' perceptions of and attitudes towards peer assessment. In composition scholarship, students' perceptions have been explored at the level of peer assessment as part of the writing process. In the MOOC context, the study by Lou et al. (2014) is one of the very few empirical studies in the recent literature to investigate students' perceptions of peer assessment, and its results refer to a Coursera MOOC named Maps and the Geospatial Revolution, not a composition MOOC. Since empirical research on students' perceptions of peer assessment in composition MOOCs is very limited, this present study was intended to look into the practice of peer assessment in the context of composition MOOCs, particularly the students' perceptions, in addition to the grades given by the students and the instructors.
OVERVIEW OF E-CENTER AND E-CENTER'S MOOC-BASED COURSE - ESL/EFL WRITING FOR ACADEMIC AND PROFESSIONAL SUCCESS
The present study examined aspects of peer assessment in a composition MOOC named ESL/EFL Writing for Academic and Professional Success, provided by E-Center for Professional Development. For a clear presentation of the setting in which the present study was conducted, this section provides an overview of E-Center and its composition MOOC.
E-Center for Professional Development (E-Center)
In 2011, through a grant from the U.S. Embassy, four Vietnamese Fulbright alumni, including the researcher of this study, established an open online learning program with a series of six courses for 150 language learners who were in-service teachers. The target learners all taught English as a foreign language in Vietnam, many in rural areas where they had little opportunity to communicate with others in English. The goal of the online program was to offer courses, at no cost, to these in-service English teachers. Since we, as course designers, and the target learners had much more experience with traditional face-to-face, cohort learning than with asynchronous online or correspondence models, we decided to make the online courses as similar as possible to face-to-face courses in terms of pedagogical approach. We attempted to replicate the face-to-face class environment in our online classes, in terms of both the organization and the delivery of online classes. To make sure that our learners had a learning experience similar to a face-to-face course, we included live virtual classroom (LVC) sessions in each course section. LVC, in our use, refers to classroom-type use of the web conferencing or electronic meeting software that is increasingly available and inexpensive.
LVC lets online course providers deliver courses in a format similar to the videoconferencing that has been common in higher education for several decades. Every week, learners in each section attended the LVC, in which the instructors delivered live lectures, much as they would in a face-to-face classroom, on a fixed schedule (usually on weekends, because all of the learners in the course were in-service teachers). Students and instructors could interact with each other in real time through audio or video channels or text chat. The LVCs were also recorded and posted into the course sections so that students who did not attend them could watch the recordings. The difference between an LVC and an LVC recording was that with a recording, students could not ask questions directly of the instructors or have direct real-time interactions with the instructor or their peers. Besides the weekly LVC, students logged into their Moodle-based asynchronous course section to watch the video lectures, PowerPoint presentations, podcasts, and reading materials. Since we attempted to create a learning experience in our online courses as close to the face-to-face setting as we could, the assignments were very diverse, including essays, peer-reviewed assignments, and collaborative assignments. In 2012, E-Center's administrators attempted to embrace an interactive MOOC-style course. The first move in the program restructure was to create one single massive course for all students instead of many small course sections. We called this new model "one-star instructor and satellite co-instructor," meaning that the course featured the most dynamic and dedicated instructor identified during the previous courses. This instructor was in charge of designing the course content, including delivering the video lectures, LVCs, and assignments. The satellite co-instructors were other instructors who were in charge of grading students' papers and facilitating the course's asynchronous discussion forums on the course's Moodle-based LMS. Since 2013, E-Center has offered open online ESL courses, including writing courses, to thousands of students from many parts of the world. For the open online writing courses, we realized that one of the biggest challenges was how to grade students' writing assignments. The instructors, limited in number, could not complete the grading on their own, so we created an online peer assessment system in this course to solve the grading problem. With a standard grading rubric for each writing assignment and some instruction on peer assessment (called training on peer assessment), three students graded each assignment, using the same standard grading rubric. We set up a grading mechanism in which only students who submitted the assignment could do peer grading. Because peer assessment seems to be one of the most controversial issues in composition MOOC pedagogy, and feedback given by peers might be thought to be of poor quality, as a partial solution we had instructors provide students with detailed rubrics and guidelines during the training on peer assessment. However, many students did not do peer grading; some students' assignments were graded by only one or two peers, not three peers as expected. This practice may make the students' feedback on writing assignments unreliable and invalid.
Moreover, the practice of using online peer assessment was still quite new in the online learning field, and certainly to us, so we were not really sure how valid and reliable the grades offered by the fellow students were.
ESL/EFL Writing for Academic and Professional Success MOOC-Based Course
Course Objectives and Course Work
The ESL/EFL Writing for Academic and Professional Success, a seven-week course running from December 1, 2014, to January 18, 2015, was the fifth MOOC-based course in writing offered by E-Center.
This is the course the researcher studied. The course was designed to help students develop the fluency, focus, and analytical skills needed to become successful writers. In this course, students learned and practiced the strategies and processes that successful writers employ as they work to accomplish specific purposes, for example comprehension, instruction, entertainment, persuasion, investigation, problem resolution, evaluation, explanation, and refutation, all of which would help prepare learners for academic and professional communication. There were three major writing assignments for students to accomplish:
1. Application Essays
2. Summary and Evaluation Essays
3. Argumentative Essays
Students taking this course had two options. If they were interested in receiving a Statement of Accomplishment, they had to score at least 60% in the course (above 80% earned a Statement with distinction) and participate in peer assessment. If students were not trying to receive a Statement of Accomplishment, it was fine for them to audit the course and review only the materials. The course materials remained available for two months after the course ended, so students could also go through the course at a later date or at a slower pace. The course was intended to address a wide range of learning goals. During the course, there were live virtual sessions in which keynote speakers talked about particular topics relevant to the writing assignments students were working on. These sessions were held live at a specific time, and students were provided with the link to attend them. Students' grades were based on participation, including completion of quizzes and material access (20%), three major writing assignments (25% per task), and a multiple-choice final exam (10%). The grading scale was A (90% - 100%), B (80% - 89%), C (70% - 79%), D (60% - 69%), and F (0% - 59%). A discussion forum was available within each writing assignment, so if students had questions about the video lectures or quizzes in a writing task, they could post them right there. The course syllabus is presented in Table 2 below.

Table 2. The course syllabus

Week 1 (Dec 1 – Dec 7, 2014)
Topics / Activities: Introduction to the course; syllabus. An overview of writing and the writing process: pre-test; thinking about audience and purpose; paragraph/essay organization; writing as a process (planning, drafting, revising).
Assignments due: Do the pretest and read all of the materials by Dec 7, 2014.

Week 2 (Dec 8 – 14, 2014)
Topics / Activities: Writing: Application Essay. Lectures on application essays (videos + attachments); writing prompt (attachment); first draft of Application Essay; peer assessment and self-assessment (rubric, guidelines of assessment). Live Virtual Session: Questions & Answers (scheduled time to be announced).
Assignments due: First Draft of Application Essay (due by midnight Wednesday, Dec 10, 2014); peer and self-assessment of Application Essay (due by Saturday, Dec 13, 2014).

Week 3 (Dec 15 – 21, 2014)
Topics / Activities: Writing: Application Essay (contd.). Checklist of Application Essay assignment; second draft of Application Essay (revise the first draft into the polished one); peer assessment of the polished draft of Application Essay. Live Virtual Session, Keynote Speaker: How to be successful in writing application essays? (scheduled time to be announced).
Assignments due: Polished Draft of Application Essay (due by midnight Tuesday, Dec 16, 2014); peer assessment of Application Essay – Polished Draft (due by midnight Saturday, Dec 20, 2014).

Week 4 (Dec 22 – 28, 2014)
Topics / Activities: Writing: Summary and Evaluation Essay. Lectures on Summary/Evaluation Essay (videos + attachments); writing prompt (attachment); first draft of Summary/Evaluation Essay; peer assessment and self-assessment (rubric, guidelines of assessment). Live Virtual Session, Keynote Speaker: Avoiding plagiarism (scheduled time to be announced).
Assignments due: First Draft of Summary/Evaluation Essay (due by midnight Friday, Dec 26, 2014); peer and self-assessment of Summary/Evaluation Essay (due by Sunday, December 28, 2014).

Week 5 (Dec 29, 2014 – Jan 4, 2015)
Topics / Activities: Writing: Summary/Evaluation Essay (contd.). Checklist of Summary/Evaluation Essay assignment; second draft of Summary/Evaluation Essay (revise the first draft into the polished one); peer assessment of the polished draft of Summary/Evaluation Essay. Live Virtual Session, Keynote Speaker: Evaluation Criteria (scheduled time to be announced).
Assignments due: Polished Draft of Summary/Evaluation Essay (due by midnight Tuesday, Dec 30, 2014); peer assessment of Summary/Evaluation Essay – Polished Draft (due by midnight Saturday, Jan 3, 2015).

Week 6 (Jan 5 – 11, 2015)
Topics / Activities: Writing: Argumentative Essay. Lectures on Argumentative Essay (videos + attachments); writing prompt (attachment); first draft of Argumentative Essay; peer assessment and self-assessment (rubric, guidelines of assessment). Live Virtual Session, Keynote Speaker: Ethos, Pathos, Logos: Appeals for Effective Arguments (scheduled time to be announced).
Assignments due: First Draft of Argumentative Essay (due by midnight Wednesday, Jan 7, 2015); peer and self-assessment of Argumentative Essay (due by Saturday, Jan 10, 2015).

Week 7 (Jan 12 – 18, 2015)
Topics / Activities: Writing: Argumentative Essay (contd.). Checklist of Argumentative Essay assignment; second draft of Argumentative Essay (revise the first draft into the polished one); peer assessment of the polished draft of Argumentative Essay. Live Virtual Session, Keynote Speaker: Making Counter Arguments (scheduled time to be announced).
Assignments due: Polished Draft of Argumentative Essay (due by midnight Wednesday, Jan 14, 2015); peer assessment of Argumentative Essay – Polished Draft (due by midnight Saturday, Jan 17, 2015).

Final Exam: Sunday, Jan 18, 2015

Each of the major writing assignments in this course followed the same structure of four phases in sequence (each beginning at a predefined starting time and ending at a predefined deadline). First, students prepared and submitted Draft 1 of the assignment. Second, students reviewed and assessed peers' Draft 1. Third, students submitted the revised version of the assignment (Final Draft) based on the feedback from peers. Last, students assessed peers' Final Drafts. Prior to the peer assessment, students were requested to go through a training process for understanding and using the peer assessment tool within the Moodle platform as well as for evaluating their peers' work. Students were encouraged to self-review their work, but the self-assessment was optional. If students failed to submit the assignment, they would not be able to participate in the peer assessment. If students failed to complete the required peer assessment, they would receive no credit for the associated assignment. The following explains the typical cycle of each major writing assignment:

Writing Assignment 1: Application Essays
1. Prepare and submit Application Essay (Draft 1)
2. Complete peer assessment process (required training phase, required peer assessing, optional self-assessing)
3. Prepare and submit Application Essay (Revised version – Final Draft)
4. Complete peer assessment process (No training phase, required peer assessing, optional self-assessing)
Writing Assignment 2: Summary and Evaluation Essays
1. Prepare and submit Summary and Evaluation Essay (Draft 1)
2. Complete peer assessment process (required training phase, required peer assessing, optional self-assessing)
3. Prepare and submit Summary and Evaluation Essay (Revised version – Final Draft)
4. Complete peer assessment process (No training phase, required peer assessing, optional self-assessing)
Writing Assignment 3: Argumentative Essays
1. Prepare and submit Argumentative Essay (Draft 1)
2. Complete peer assessment process (required training phase, required peer assessing, optional self-assessing)
3. Prepare and submit Argumentative Essay (Revised version – Final Draft)
4. Complete peer assessment process (No training phase, required peer assessing, optional self-assessing)
In regard to the peer assessment presented in the cycles above, for Draft 1, peer reviewers were required both to make comments (marginal and end comments) and to assign a score corresponding to the criteria defined in the rubric. Based on the feedback from peers on Draft 1, students submitted the revised version of the assignment (Final Draft) and then participated in assessing peers' final drafts. For final drafts, students were required to assign a score in accordance with the rubric; making comments was optional at this phase.
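To make the per-phase requirements concrete, the following is a minimal illustrative sketch in Python. It is not E-Center's actual Moodle implementation; the class name, function name, and phase labels are hypothetical and simply restate the rules described above.

from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class PeerReview:
    score: Optional[float]                         # rubric-based score, None if not given
    marginal_comments: List[str] = field(default_factory=list)
    end_comment: str = ""

def review_is_complete(review: PeerReview, phase: str) -> bool:
    # Draft 1 reviews need a score plus marginal and end comments;
    # Final Draft reviews need only a score (comments are optional).
    if review.score is None:
        return False
    if phase == "draft1":
        return bool(review.marginal_comments) and bool(review.end_comment)
    if phase == "final_draft":
        return True
    raise ValueError(f"unknown phase: {phase}")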
Students and Instructors of the Course
The MOOC-based course ESL/EFL Writing for Academic and Professional Success had 4,582 registered students, 4,521 of whom actually accessed the course. As shown in Figure 1 below, the majority of students were from Vietnam (55.18%) and China (21.41%), and the others were from Taiwan (8.18%), Indonesia (5.46%), Malaysia (3.45%), Singapore (2.76%), Japan (1.88%), and South Korea (1.68%). All of the students were non-native English speakers. In this course, there were two instructors (both female) in charge of lecture delivery and three instructors (one male, two female) in charge of monitoring peer assessment, facilitating the course's forum discussions, and answering students' emails. All of the instructors were native English speakers with a Master's degree in TESOL (Teaching English to Speakers of Other Languages) and at least two years of experience in teaching ESL/EFL writing. Moreover, because the instructors had taught E-Center's writing courses before, they were familiar with the online platform, LVC, LMS, etc. used by E-Center. Like many other MOOCs, due to the large enrollment far exceeding the grading capacity of the instructors, instructors in this course did not provide grades and feedback on students' work. Instead, the course utilized an online peer assessment process to grade the learners' papers. At the end of the course, 334 students (out of the initial 4,521) received a completion certificate, which means that they submitted all assignments, did all assigned peer assessments, and received satisfactory scores (C or above). Among these 334 students, 115 had their papers graded by only one or two peers, and 219 got all of their papers graded by three peers as expected.
Figure 1. Students’ Countries of Residence
For the purpose of course and program evaluation at E-Center, as in other MOOCs offered by E-Center, the administrators selected a certain number of students' papers in this composition course and had them graded by the instructors, using the same rubric given to students. E-Center had five instructors grade papers from students who received completion certificates, more specifically from 100 of the 219 students whose papers had been graded by three peers. The papers were randomly selected from this pool by the computer. Similar to what students were required to do, the instructors both made comments on and assigned a score to students' Draft 1, and only assigned a score to students' Final Drafts. Overall, the number of papers the instructors were required to grade was 600 (100 Draft 1 papers and 100 Final Drafts for each of the three writing assignments). The instructors' feedback and grades were not given to the students but were used for course and program evaluation purposes at E-Center.
Peer Assessment Used in the Course
Part of the required work in this course was the peer assessment. Prior to the peer assessment process, there were discussions on how to be a constructive critic of other people's work and how to respond to criticism. Discussions were held both through LVCs and in the forum. Students were also provided with relevant reading materials that helped them understand the qualities of constructive criticism. Within the Moodle platform, students were guided through a training process for understanding and using the peer assessment tool so that they would then be able to use this tool to assess the work of their peers in the course. For each of the major writing assignments, the instructors discussed the criteria defined in the rubric and then reviewed two actual student essays to help students understand the process better. During the training phase, to help students apply the evaluation criteria for a particular writing assignment more fairly and critically, students were asked to evaluate two to three sample essays. Students' scores on the sample essays were compared with those given by the instructors. Students were considered to have passed the training phase if their scores on each of the rubric categories for each major writing assignment were within one point of the instructors' scores.
The passing of the training phase was self-evaluated by the students being trained. This training process was the regular course design for composition MOOCs at E-Center. Through the Moodle platform, students submitted their work in the modules designed for a particular writing assignment. The system then randomly assigned essays to students who had submitted their own work. Each student was supposed to grade three essays from three different peers, assigning a score for each of the categories in the rubric. Information from the assignment rubric was "translated" into easy-to-use bulletin menus. For peer assessment of Draft 1, in addition to assigning a score in accordance with the rubric, it was mandatory that students make both marginal and overall comments on peers' writing; for the peer assessment of the Final Draft, marginal and overall comments were optional. Depending on how quickly students read, the process of peer assessment could take 20 – 30 minutes per essay. Completion of peer assessment was required, with students who did not complete the peer assessment receiving 50% of the final score for that assignment.
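The allocation and training rules described above can be pictured with a short sketch. This is a simplified, hypothetical illustration in Python rather than the actual Moodle workshop configuration: the function names and the simple random draw are assumptions, and a production tool would also balance how many papers each reviewer receives.

import random
from typing import Dict, List

def assign_peer_reviewers(submitters: List[str], reviews_per_paper: int = 3) -> Dict[str, List[str]]:
    # Only students who submitted may review, no one reviews their own paper,
    # and each paper gets up to three different reviewers.
    assignments: Dict[str, List[str]] = {}
    for author in submitters:
        eligible = [s for s in submitters if s != author]
        assignments[author] = random.sample(eligible, min(reviews_per_paper, len(eligible)))
    return assignments

def passes_training(student_scores: Dict[str, float], instructor_scores: Dict[str, float]) -> bool:
    # Training check: every rubric-category score on the sample essay must fall
    # within one point of the instructor's score for that category.
    return all(abs(student_scores[c] - instructor_scores[c]) <= 1.0 for c in instructor_scores)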
Students' Participation in the Course
The composition MOOC ESL/EFL Writing for Academic and Professional Success had 4,582 students enrolled, 4,521 of whom actually accessed the course. The course had a discussion forum where students and instructors could discuss issues related to the course. A total of 236 discussion threads were posted in the forum, most of which were about the peer assessment activity, technical issues, assignment completion, and assignment discussions. The course had a total of 27 lecture videos and 7 recorded live virtual sessions, viewed a total of 24,436 times. In regard to assignment submission, students submitted their writing for evaluation 1,756 times for the Application Essay assignment, 1,050 times for the Argumentative Essay assignment, and 768 times for the Summary and Evaluation Essay assignment. Throughout the course, students completed 9,065 peer assessments. Many students submitted their writing assignments but did not do peer assessment; as a result, some students' assignments were graded by only one or two peers, not three peers as expected. MOOCs are known for very low retention rates, and studies have documented the reasons students drop out (Colman, 2013); ESL/EFL Writing for Academic and Professional Success was not an exception. At the end of the course, 334 students received a completion certificate, which means that they submitted all assignments, did all assigned peer assessments, and received satisfactory scores (C or above). Possible reasons for the low retention rate include (1) students choosing to access the materials but not to do the assignments, (2) students failing to complete all three major assignments and therefore not passing the course, (3) students submitting their assignments but not assessing peers' work and therefore not passing the course, (4) students struggling with the technology for submitting assignments and doing peer assessment online through the Moodle platform, and (5) other factors such as cultural reasons, missed deadlines, and busy personal schedules. Among the 334 students receiving a completion certificate, 115 had their papers graded by one or two peers and 219 got all of their papers graded by three peers as expected.
RESEARCH METHODOLOGY
Research Questions
The purpose of this study was to examine aspects of peer assessment in a composition MOOC. Specifically, the study sought to answer the following questions:
1. What perceptions do the students in a composition MOOC have towards peer assessment of their own and their peers' drafts?
2. Is there a significant difference between overall grades given by students in response to student drafts and those given by the instructors?
Participants
The participants in the study were students who enrolled in the composition MOOC ESL/EFL Writing for Academic and Professional Success, offered by E-Center for Professional Development in December 2014. The selection of participants proceeded as follows. It began with identifying the total number of students who registered for the course, which was 4,582, and obtaining a list of these students with general information such as nationality, email address, and name. These students were then sent an email invitation to participate in the study three times, at two-week intervals. There were 119 incorrect email addresses with failure-notice responses, so the total number of students actually receiving the survey was 4,463. After two and a half months of conducting the survey, the statistics on Google Drive reported 1,290 respondents submitting their answers, accounting for 28.9% of the students to whom the researcher sent the survey. The researcher then eliminated 75 participants who did not answer all the questions in the survey; these incomplete responses were omitted from the study. Eventually, the study used responses from 1,215 participants. Among the 1,215 participants, there were 573 males (47.1%) and 642 females (52.9%). In regard to education levels, 625 participants (51.4%) had high school diplomas; 240 participants (19.8%) held Bachelor's degrees; 252 participants (20.7%) had some college education with no degree; and 98 participants (8.1%) had Master's or Doctoral degrees. Most of the participants were Vietnamese (58.1%), and the others were Chinese (22.2%), Indonesian (5.8%), Taiwanese (5.4%), Singaporean (3.9%), Japanese (1.8%), South Korean (1.5%), and Malaysian (1.3%). Table 3 and Figure 2 below summarize the demographic information about the participants. Among the 1,290 participants who submitted their responses to the survey, 339 agreed to participate in Skype interviews. From this pool, the researcher selected 20 participants through stratified random sampling, so that participants were equally represented across different levels of grade performance in the course. These 20 participants included 5 who did not complete the course, 5 whose final grade was C, 5 whose final grade was B, and 5 whose final grade was A. Each of the interviews, which lasted approximately 25 minutes, was about the participants' perceptions and opinions of the peer assessment used in the ESL/EFL Writing for Academic and Professional Success course.
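The stratified selection of interviewees can be illustrated with a short sketch. This is only an illustration: the 339 volunteers and the rule of five interviewees per grade band come from the description above, while the split of volunteers across the bands and the participant IDs are made up.

import random
from typing import Dict, List

def stratified_interview_sample(volunteers_by_grade: Dict[str, List[str]], per_stratum: int = 5) -> List[str]:
    # Draw an equal number of interviewees at random from each grade band.
    sample: List[str] = []
    for grade, volunteers in volunteers_by_grade.items():
        sample.extend(random.sample(volunteers, per_stratum))
    return sample

# Hypothetical pools of interview volunteers by final-grade band (339 in total).
pools = {
    "A": [f"A{i}" for i in range(80)],
    "B": [f"B{i}" for i in range(90)],
    "C": [f"C{i}" for i in range(80)],
    "did not complete": [f"N{i}" for i in range(89)],
}
interviewees = stratified_interview_sample(pools)   # 20 IDs, 5 per band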
Table 3. Demographics about the participants
Gender: Male 573 (47.1%); Female 642 (52.9%)
Education: High School Diploma 625 (51.4%); Bachelor's Degree 240 (19.8%); Some College, No Degree 252 (20.7%); Master's or Doctoral Degree 98 (8.1%)
Total: 1,215 (100%)
Figure 2. Participants’ Countries of Residence
Research Instruments
To explore the research questions, this study employed two research instruments. The first instrument was an online survey with both Likert scale questions and open-ended questions to collect learners' opinions about the peer assessment utilized in a composition MOOC. The survey was initially sent to 4,582 participants, but the study used survey responses from 1,215 participants due to email delivery failures and incomplete responses. The survey consisted of three parts. The first part collected demographic information such as gender, nationality, educational level, and participation in previous online writing courses and peer assessment. The second part of the survey, as suggested by Dornyei (2007), consisted of 15 questions employing a 4-point Likert scale with the end points (1) "Strongly disagree" and (4) "Strongly agree". The contents of the Likert scale questions were adapted and edited from Kaufman and Schunn (2010) and addressed the following topics:
1. Peer assessment training and the peer assessment tool
2. Participation in peer assessment training
3. Usefulness of own and peers' feedback
4. Nature of own and peers' feedback
5. Validity of peers' feedback
6. Fairness of peers' feedback
7. The use of peer feedback in revision
8. Own and peers' qualification
The second instrument was the course data, including learners' demographic information and the grades and feedback provided by both the instructors and peer students in the course. More specifically, this second instrument helped answer Research Question 2. Based on E-Center's data, five instructors graded papers from students who received completion certificates, more specifically from 100 of the 219 students whose papers had been graded by three peers. The number of papers the instructors were required to grade was 600 from these 100 students (100 Draft 1 papers and 100 Final Drafts for each of the three writing assignments). Because this study tried to balance the number of papers graded by students and those graded by the instructors, the second instrument included the grades and feedback from these 600 papers.
Data Analysis
Survey responses were automatically recorded in Google Drive (the web portal where the researcher created the survey and collected participants' responses). The data were exported into Microsoft Excel and then entered into the Statistical Package for the Social Sciences (SPSS 16.0). The data were analyzed through descriptive statistics in order to identify patterns of agreement and disagreement. Specifically, the percentages of responses for each of the four-point scale questions were calculated. The open-ended questions were analyzed through content analysis for common answers and categorized into themes; these themes were then tallied by frequency and converted into percentages for discussion. For the fifteen Likert scale questions in the second part of the survey, a reliability test was conducted to identify whether the items on the survey had acceptable internal consistency. The fifteen items were tested using Cronbach's alpha. The resulting alpha coefficient was .806, suggesting that the items have relatively high internal consistency. According to George and Mallery (2009), a reliability coefficient of .70 or higher is considered "acceptable", which indicates good internal consistency reliability. In addition, to find out whether there were any significant differences between the grades given by online peer reviewers and those given by the instructors, and where the significant differences were, paired t-tests (also known as dependent t-tests) were carried out.
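Although the reliability analysis was run in SPSS, the calculation behind the reported coefficient can be sketched with the standard Cronbach's alpha formula. The Python code below is an illustration only; the random matrix merely stands in for the real 1,215 x 15 response data.

import numpy as np

def cronbach_alpha(item_scores: np.ndarray) -> float:
    # alpha = k/(k-1) * (1 - sum of item variances / variance of the total scores)
    k = item_scores.shape[1]
    item_vars = item_scores.var(axis=0, ddof=1)
    total_var = item_scores.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Stand-in data: 1,215 respondents x 15 four-point Likert items.
rng = np.random.default_rng(0)
responses = rng.integers(1, 5, size=(1215, 15)).astype(float)
print(round(cronbach_alpha(responses), 3))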
RESULTS
Results for Research Question 1
Research Question 1 looked into the perceptions of students in a composition MOOC towards peer assessment of their own and their peers' drafts. For this purpose, an online survey with both Likert scale questions and open-ended questions was used to collect learners' opinions about peer assessment. The Likert scale questions were particularly about:
1. Peer assessment training and the peer assessment tool
2. Participation in peer assessment training
3. Usefulness of own and peers' feedback
4. Nature of own and peers' feedback
5. Validity of peers' feedback
6. Fairness of peers' feedback
7. The use of peer feedback in revision
8. Own and peers' qualification
The open-ended questions asked about:
1. Difficulties/challenges peer reviewers experienced as commentators and graders
2. Difficulties/challenges peer reviewers experienced as writers being commented on and graded
3. Peer assessment's usefulness in helping improve writing performance
4. Similarities or differences among peers' comments and peers' grades
Responses from the Likert scale questions
Responses from the Likert scale questions were automatically processed and analyzed in Google Docs. Table 5 summarizes the participants' responses to the questions on peer assessment training, the peer assessment tool, and instruction following. As shown in Table 5, 85.6% of the participants took part in peer assessment training and referred to the training instructions and rubric when commenting on and grading their peers' papers. Most participants agreed (64.5%) or strongly agreed (31.7%) that the guidelines for peer assessment were clearly outlined as to how to grade and how to make comments, whereas 3.8% of the participants disagreed or strongly disagreed. In terms of following the guidelines and rubrics during peer assessment, more than 88% of the participants agreed or strongly agreed that, for each of the assignments, they followed the guidelines provided during training for commenting on and grading peers' writing and followed the rubrics for grading peers' writing, compared with approximately 11% of participants who held the opposite opinion. In regard to participants' opinions on the feedback they gave to peers' writing, 62.8% of the participants believed the feedback they gave their peers was useful, whereas 35.9% disagreed and 1.3% strongly disagreed. 86% of the participants disagreed or strongly disagreed when asked whether the feedback they gave their peers was too negative or critical. Likewise, the majority of the participants (76.7%) believed that their feedback on peers' writing was thorough and constructive, while 23.3% held the opposite opinion.
Table 5. Participants' opinions on peer assessment training, the peer assessment tool, and instruction following
I participated in peer assessment training and referred to the training instructions and rubric when commenting on and grading my peers' papers: Strongly disagree 5.1%; Disagree 9.3%; Agree 31.4%; Strongly agree 54.2%
The guidelines for peer assessment were clearly outlined as to how to grade and to make comments: Strongly disagree 1.2%; Disagree 2.6%; Agree 64.5%; Strongly agree 31.7%
For each of the assignments, I followed the guidelines provided during training for commenting on and grading peers' writing: Strongly disagree 2.3%; Disagree 8.1%; Agree 58.1%; Strongly agree 31.4%
For each of the assignments, I followed the rubric for grading peers' writing: Strongly disagree 2.2%; Disagree 8.3%; Agree 33.7%; Strongly agree 55.8%
Table 6 below reveals the participants' opinions on their own feedback given to peers' writing. For the participants' opinions on peers' feedback given to their own writing, as shown in Table 7, only 12.7% of the participants perceived that the feedback they got from peers was negative and critical, while the remaining 87.3% held the opposite opinion. Table 7 also shows that the majority of the participants (80%) believed that the feedback they got from peers connected very clearly to the standards set forth in the assignment rubrics. In regard to the usefulness of the feedback, the participants expressed positive attitudes: almost 96% of the participants felt that the feedback from peers helped improve their writing, and approximately 89% found the feedback from peers helpful and used peers' feedback when they revised their writing (see Table 8). In terms of the validity and fairness of peers' feedback, Table 9 reveals that more participants disagreed (65.8%) than agreed (28.4%) with the statement, "The feedback I got from one peer was similar to the feedback I got from other peers on the same paper". However, most of the participants agreed (54.4%) or strongly agreed (7.1%) with the statement, "Peers gave me fair grades on my writing", whereas 38.5% expressed disagreement. For participants' opinions on their own and peers' qualification to give feedback and grades, only 39.5% of the participants felt that they themselves were qualified to give feedback and grades on peers' writing, while 61.5% expressed the opposite opinion. Despite expressing little confidence in their own qualification, 66.3% of the participants conceded that peers were qualified to give feedback and grades (see Figure 3).
Table 6. Participants' opinions on own feedback given to peers' writing
The feedback I gave my peers on their writing was useful: Strongly disagree 1.3%; Disagree 35.9%; Agree 59.3%; Strongly agree 3.5%
The feedback I gave my peers on their writing was too negative or critical: Strongly disagree 45.3%; Disagree 40.7%; Agree 8.1%; Strongly agree 5.9%
The feedback I gave my peers on their writing was thorough and constructive: Strongly disagree 4.7%; Disagree 18.6%; Agree 70.9%; Strongly agree 5.8%
Table 7. Participants' opinions on peers' feedback given to own writing
The feedback my peers gave on my writing was negative and critical: Strongly disagree 45.4%; Disagree 41.9%; Agree 9.3%; Strongly agree 3.4%
The feedback provided to me by my peers connected very clearly to the standards set forth in the assignment rubrics for the course assignment: Strongly disagree 1.4%; Disagree 18.6%; Agree 77.9%; Strongly agree 2.1%
Table 8. Participants' opinions on the usefulness of peers' feedback
Feedback from peers helped me improve my writing: Strongly disagree 0%; Disagree 3.4%; Agree 34.1%; Strongly agree 62.5%
I found feedback from my peers helpful, and I used their feedback when revising my writing: Strongly disagree 1.6%; Disagree 9.3%; Agree 51.2%; Strongly agree 37.9%
Table 9. Participants' opinions on the validity and fairness of peers' feedback
The feedback I got from one peer was similar to the feedback I got from other peers on the same paper: Strongly disagree 3.5%; Disagree 65.8%; Agree 28.4%; Strongly agree 2.3%
Peers gave me fair grades on my writing: Strongly disagree 6.8%; Disagree 31.7%; Agree 54.4%; Strongly agree 7.1%

Figure 3. Participants' opinions on own and peers' qualification to give feedback and grades

Responses from the open-ended questions
Students' perceptions about peer assessment in a composition MOOC were also reflected in the open-ended responses to the survey. The open-ended questions asked about (1) difficulties/challenges peer reviewers experienced as commentators and graders, (2) difficulties/challenges peer reviewers experienced as writers being commented on and graded, (3) peer assessment's usefulness in helping improve writing performance, and (4) similarities or differences among peers' comments and peers' grades. Major patterns and trends in the responses were identified, labeled, and then categorized into themes. Regarding the difficulties/challenges peer reviewers experienced as commentators and graders, Figure 4 below shows the prevalent themes. About 65% of the respondents spoke about their lack of confidence and lack of experience. More than 72% of the respondents said that they had difficulty writing comments and assigning grades, and approximately 39% felt that they were not qualified enough to do the job. Other prevalent themes included technical issues (27%), the peer assessment tools, i.e., the rubrics (15%), the anonymity of peer assessment (13%), the types of writing assignments (9%), and other difficulties and challenges such as one's English competence (5%), emotional factors (4%), and different rhetorical strategies (3%). The following comments from respondents are some examples that illustrate these points:
Lack of confidence and lack of experience
Although there was some training on peer assessment, I never felt confident when making comments on other people's writing. This was my first time ever to do this so I was not experienced at all. (Female 45, China)
Grading peers' writing was a very new experience to me. At the beginning I did not know what to do. Later I was better aware of the activity but I did not feel confident in what I did. If I had some experience in grading, I would feel more confident. (Male 86, Vietnam)
Figure 4. Difficulties/challenges peer reviewers experienced as commentators and graders
Writing comments and assigning grades
Writing comments was the most challenging to me. I took a lot of time to practice from the training how to make comments. Many times I got stuck at not being able to think of what comments should be made. I found it very hard to write relevant and constructive comments. Plus, translating the comments into a grade was super hard, even when I followed the rubric. (Female 69, Japan)
Lack of qualifications
I just felt I was not qualified to make comments and to give grades to my peers' papers. I was afraid my comments were not comprehensive, constructive, and helpful, and the grades would not reflect the true quality of peers' papers. I was not professionally trained to assess others' writing. I received the training from this course, but I think I need more than that to be more qualified. (Female 220, China)
Technical issues
I was challenged by assessing the papers online. I'm not good at doing things with computer and the internet, so it really took me a lot of time and effort to follow guided steps in order to be able to access the assessment page. Moreover, I almost surrendered when I had to insert my comments on margin in Microsoft Word and upload the graded papers. I appreciate the detailed instructions of doing these stuff, but these really challenged me. (Male 472, South Korea)
Peer assessment tools
It is difficult for me to understand the rubric thoroughly. I read the rubric, and also watched the videos explaining the rubric and important points I should pay attention when doing the peer assessment. However, I did not understand some words and phrases in the rubric, maybe because my English is not good enough. Therefore, it's difficult for me to match the graded papers with the rubric criteria. (Female 325, Vietnam)
Anonymity of peer assessment
I did not know the authors whose papers I graded and vice versa; therefore, it would be difficult to gauge how the authors would react when they got the comments and grades from me. If I knew the authors, it would be better for me to tell the authors about their strengths and weaknesses.
(Female 16, Singapore)
Types of writing assignments
I find it hard to grade a writing assignment when I am not familiar with that type of writing. In this course, the summary and evaluation assignment was something I never knew of before. I graded such papers just because I was required to do so. I don't think I did a good job when assessing something I didn't know of. (Female 198, Taiwan)
English competences
One of the challenges was that my English is not good enough. It's my low proficiency in English that limited my ability in evaluating others' work. I hardly knew if peers had good word choices or used correct grammar, hence I could neither point out peers' errors nor made suggestions. (Male 315, Indonesia)
Emotion
Personal emotion affected my grading. I don't want to hurt my peers' feeling and fail them, so I tended to make positive comments and praises. I acted generous when I gave higher grades the papers actually deserved. (Male 117, Singapore)
Different rhetorical strategies
Sometimes I had difficulty following the ideas presented in peers' papers. The way they organized evidences and provided necessary information seemed different from the way I did, so it was hard to grade such papers. (Female 613, South Korea)
In regard to the difficulties and challenges peer reviewers experienced as writers being commented on and graded (see Figure 5 below), approximately 68% of those surveyed said that they had difficulty interpreting and understanding the comments given by peers. Other prevalent difficulties/challenges raised in the survey responses included comment and grade irrelevance (34%), fairness (23%), peers' qualification (19%), the anonymity of peer assessment (15%), and others (10%) such as trust, conflicting comments, and cultural differences.
• Comment comprehension: Of the 68% of respondents who spoke about their difficulty understanding comments given by peers, more than 80% claimed that peers' comments were sometimes too vague and general, which made it hard to comprehend what the peer reviewers really meant; around 20% found peers' comments not relevant to the writing assignments' requirements.
Figure 5. Difficulties/challenges peer reviewers experienced as a writer being commented on and graded
The following are some examples from the respondents:
For some peers, they gave me very short comments like 'thesis', 'grammar', 'vocabulary'. I completely understand the meaning of such comments but these were too general. I did not know what about 'thesis', 'grammar', or 'vocabulary' my peers really meant. (Female 75, Vietnam)
I got a few comments which were very vague and not specific enough. I also got some comments, which were not relevant to the assignment at all. To me, these kinds of comments were hard to understand and to follow. (Female 115, Indonesia)
• Comment and grade irrelevance: Around 34% of the respondents said that the comments and grades given by peers did not match and were sometimes even contradictory. They explained that the comments were sometimes positive while the grades were low, or vice versa. A participant stated:
In one of my papers, I got most of positive comments about organization, idea development, vocabulary, etc. Nevertheless, my final grade was very low. There should have been a match between comments and grades.
(Male 725, Singapore)
• Fairness: Approximately 23% of the respondents felt that the comments, and especially the grades, given by peers were not fair enough. Among these respondents, many explained that peers tended to over-score the writing and that the grades they got from peers were much lower or higher than their self-assessed grades. Below is one of the participants' statements:
Of course, I like it when I get high grades for all of my papers. I always graded my own papers using the given rubric, and I noticed that more than one of my peers scored much higher than what I though I would get. With self-assessment, I expected a C, but I got an A then. Well… that peer cheered me up, but to be honest I think that was not fair enough. Plus, the high grade might make me not motivated enough to revise my writing. (Female 537, Japan)
• Peers' qualification: Almost 19% of the respondents felt that peers were not qualified enough to make comments on and assign grades to their papers. They explained that peers' comments were sometimes vague, general, negative, or even rude, and that many comments were not comprehensive, constructive, or relevant to the writing assignments. In addition, the majority of these respondents claimed that the grades given by peers did not reflect the actual quality of the papers. The following is an example that illustrates these points:
I had doubts about some peers' ability to do the assessment. They seemed not experienced in assessing others' writing. Their comments were not clear and constructive enough. I had difficulty interpreting their comments and I did not find those comments useful for my revision. (Male 821, Malaysia)
• Anonymity of peer assessment: Approximately 15% of the respondents spoke about the anonymity of peer assessment, explaining that they did not have a chance to discuss feedback with peers or to ask peers for clarification because they did not know who assessed their papers. A respondent stated:
I think the challenge was the blind peer assessment when I didn’t know who graded my papers and the graders didn’t know whose papers they were grading. Not all of the feedback I got from peers were clear to me, so I would love to discuss with the graders about their feedback to clarify certain points. However, I didn’t have any chance. (Male 67, South Korea)
• Other difficulties/challenges: In addition to the difficulties and challenges stated above, some participants (10%) also mentioned others such as conflicting comments, trust in peers, and different cultures. Some participants expressed doubts about the accuracy of peers' comments. Some had difficulty deciding between conflicting comments from different peers. Others talked about differences in feedback because peers came from different cultures. The examples illustrating these points are below:
When reading marginal comments that peers wrote on my papers, I was not so sure if my peers were right and if I should do what my peers suggested for correction. That's my challenge. (Female 2, Vietnam)
I got mixed comments on the same point. It was really hard for me to choose which advice to follow. (Female 105, Singapore)
It is confusing and challenging when I got two different feedbacks on the same paper. One peer gave only positive feedback and scored high, and the other peer had more critical feedback and scored lower. I know that students in this class were from many different countries, so in this case I guess cultures made such differences. (Female 943, Japan)
For responses to the open-ended question about peer assessment's usefulness in helping improve writing performance, the first half of the responses were tallied into three categories, "Yes", "No", and "Somewhat"; the second half of the responses (the reasons and explanations that the participants provided) were categorized into themes. Figure 6 below presents the percentages for the first half of the responses. As shown in Figure 6, the majority of the participants (82%) responded that peer assessment was useful in helping improve their writing performance, whereas 3% of the participants opposed the usefulness of peer assessment; 15% of the participants expressed both positive and negative opinions about its usefulness. It is interesting that the answers to the first half of this open-ended question are quite consistent with the participants' responses to the Likert scale questions. Specifically, in the Likert scale responses, almost 96% of the participants felt that the feedback from peers helped improve their writing while around 4% felt the opposite, and approximately 89% found the feedback from peers helpful and used peers' feedback when they revised their writing. Overall, there was high agreement on the usefulness of peer assessment in both the Likert scale responses and the open-ended responses. Regarding the second half of the responses, approximately 45% of the participants only stated whether or not peer assessment was useful and did not give further explanations (called group A). In this group A, around 80% of the participants stated that peer assessment helped improve their writing; about 5% opposed the usefulness of peer assessment; and 14% expressed both agreement and disagreement.
Figure 6. Participants’ opinions on whether peer assessment helps improve writing performance
The following are examples of the participants' responses:
Peer assessment is definitely helpful. (Female 14, Indonesia)
I think peer assessment does not contribute to the improvement of my writing performance. (Male 162, China)
Not always helpful. Sometimes peer assessment helps, but sometimes it does not. (Male 217, China)
About 55% of the participants both stated whether or not peer assessment was useful and gave further explanations (called group B). Similar to group A, most of the participants in group B (approximately 78%) expressed agreement on the usefulness of peer assessment; 6% expressed disagreement; and 16% expressed both agreement and disagreement. As far as the reasons and explanations for the agreement are concerned, the majority of the participants in group B (approximately 65%) explained that peer assessment helped them revise drafts better and score higher on revised drafts because they received advice from peers. Other participants (about 40%) stated that peer assessment helped them understand what constitutes "good work" through seeing the different approaches other peers took in responding to a writing assignment as well as knowing the assessment criteria. Another common reason (given by about 25% of the participants) was that peer assessment helped promote a deep approach to learning because it gave them a better understanding of the writing assignments and the assessment criteria. Below are some participants' statements:
My peers pointed out my errors and also gave suggestions on how to correct the errors. With their feedback, I got higher grades for my final drafts. My grammar looked better and my essay organization looked better. I found peer assessment very useful… (Male 67, Indonesia)
Peer assessment is very helpful. When I went through the rubric criteria for peer assessment, I was more aware of what I should do with my papers to have good writing products. Especially, after reading my peers' papers, I had better understanding of what is good and what needs improving. I knew what I would do for my revised papers. (Male 912, South Korea)
Regarding the disagreement on the usefulness of peer assessment in improving writing performance, the majority of these participants shared the same reason: poor quality of comments. They explained that feedback from peers was not clear and constructive enough, so they did not use peer feedback in revision. Some others argued that they did not learn anything from other peers when they assessed peers' papers. One participant said:
I found peer assessment not helpful. I did not learn anything from my peers because their papers were badly written. Besides, I could not use any of the comments peers gave me for the revision because their comments did not make sense to me… (Male 811, China)
For responses to the open-ended question about similarities or differences among peers' comments and peers' grades, similar to the way the responses about the usefulness of peer assessment were analyzed, the first half of the responses were tallied into three categories, "Yes", "No", and "Somewhat"; the second half of the responses (the reasons and explanations that the participants provided) were categorized into themes. According to Figure 7 below, approximately 19% of the participants conceded that peers' comments and grades were similar, whereas around 73% of the participants noted differences; 8% of the participants expressed that peers' comments and grades were both similar and different. Below are examples of the participants' statements:
Very different. I got all kinds of different comments from peers – general, specific, positive, negative. The grades were different, too. One peer have me low grade, and another gave me higher grade. (Female 1007, Malaysia)
The grades I got from peers were quite close, so I think peers' grades were similar. I also got similar comments from peers – I mean peers wrote in different words but they pointed out the same errors and had similar tone. (Female 714, China)
Figure 7. Participants’ opinions on similarities and differences in peers’ comments and peers’ grades
Sometime I received similar comments and close grades, and sometimes peer feedback was extremely different. (Male 351, Japan)
Results for Research Question 2
Research Question 2 examined whether there was a significant difference between grades given by student peers in response to student drafts and those given by the instructors. To answer this question, six paired t-tests (also known as dependent t-tests) were run on the grades of the three writing assignments. These six tests compared the means of the overall grades given by student peers and by the instructors on the first drafts and the final drafts of the three writing assignments (application essay, summary and evaluation essay, and argumentative essay). As illustrated in Table 10, for the grades of Writing Assignment 1 – Application Essay, the results of the paired t-tests showed extremely statistically significant differences between grades given by peer students and those given by the instructors for both Draft 1 and Final Draft. The two-tailed p value is less than 0.0001, so by conventional criteria this difference is considered extremely statistically significant. The mean of the grades given by peers tended to be higher than the mean of the grades given by the instructors, and this held true for both drafts. Regarding the grades of Writing Assignment 2 – Summary and Evaluation Essay, the mean of peer grading for both Draft 1 and Final Draft was higher than the mean of instructor grading. The two-tailed p value for Draft 1 was 0.017, showing that the difference is statistically significant, while the p value for Final Draft was less than 0.0001, which means this difference is considered extremely statistically significant (see Table 11 below).
Table 10. Paired t-test analysis for grades of Writing Assignment 1 – Application Essay

Drafts | df | Peers' grading Mean (SD) | Instructors' grading Mean (SD) | Diff. Mean | Diff. SD | t | p | Cohen's d
Draft 1 | 99 | 8.04 (0.90) | 7.57 (0.82) | 0.47 | 0.06 | 7.44 | 0.000* | 0.54
Final Draft | 99 | 8.14 (0.97) | 7.71 (0.81) | 0.43 | 0.05 | 8.76 | 0.000* | 0.48

* Significant at p < .05
Table 11. Paired t-test analysis for grades of Writing Assignment 2 – Summary and Evaluation Essay

Drafts | df | Peers' grading Mean (SD) | Instructors' grading Mean (SD) | Diff. Mean | Diff. SD | t | p | Cohen's d
Draft 1 | 99 | 7.83 (1.01) | 7.66 (0.75) | 0.17 | 0.07 | 2.43 | 0.017 | 0.19
Final Draft | 99 | 8.18 (0.69) | 7.80 (0.07) | 0.38 | 0.05 | 7.88 | 0.000* | 0.77

* Significant at p < .05
Table 12. Paired t-test analysis for grades of Writing Assignment 3 – Argumentative Essay

Drafts | df | Peers' grading Mean (SD) | Instructors' grading Mean (SD) | Diff. Mean | Diff. SD | t | p | Cohen's d
Draft 1 | 99 | 7.78 (0.67) | 7.64 (0.61) | 0.15 | 0.05 | 2.67 | 0.009 | 0.22
Final Draft | 99 | 8.29 (0.58) | 7.85 (0.51) | 0.43 | 0.04 | 10.7 | 0.000* | 0.81

* Significant at p < .05
Similar to the paired t-test results for Writing Assignments 1 and 2, Table 12 shows that for Writing Assignment 3 – Argumentative Essay, a statistically significant difference in grading between student peers and instructors existed for Draft 1 (the two-tailed p value was 0.009) and an extremely statistically significant difference existed for the Final Draft (the two-tailed p value was less than 0.0001). Overall, the paired t-tests showed that there were significant differences between the overall grades given by student peers to student drafts and those given by the instructors. Specifically, for both Draft 1 and the Final Draft of all three writing assignments, students gave higher grades to student drafts than the instructors did, even though both groups used the same rubrics.
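For readers who want to reproduce this kind of analysis, the following is a minimal sketch in Python on hypothetical grade data (it is not the authors' analysis script); it assumes Cohen's d is computed as the mean difference divided by the pooled standard deviation of the two sets of grades, one common convention.

```python
import numpy as np
from scipy import stats

# Hypothetical paired grades for the same 100 papers (placeholder data, not the study's data).
rng = np.random.default_rng(seed=0)
instructor_grades = np.clip(rng.normal(loc=7.6, scale=0.8, size=100), 0, 10)
peer_grades = np.clip(instructor_grades + rng.normal(loc=0.45, scale=0.5, size=100), 0, 10)

# Paired (dependent) t-test on the two sets of grades for the same papers.
t_stat, p_value = stats.ttest_rel(peer_grades, instructor_grades)

# Effect size: mean difference divided by the pooled SD of the two grade sets (assumed convention).
diff = peer_grades - instructor_grades
pooled_sd = np.sqrt((peer_grades.std(ddof=1) ** 2 + instructor_grades.std(ddof=1) ** 2) / 2)
cohens_d = diff.mean() / pooled_sd

print(f"df = {len(diff) - 1}, t = {t_stat:.2f}, p = {p_value:.4f}, Cohen's d = {cohens_d:.2f}")
```

With 100 paired observations, the degrees of freedom are 99, as reported in the tables above.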
CONCLUSIONS AND DISCUSSIONS

Students' Perceptions of Peer Assessment in a Composition MOOC

Peer assessment has been implemented in MOOCs to provide feedback to a massive student population. This study investigated students' perceptions of the peer assessment used in a composition MOOC provided by the E-Center for Professional Development and examined whether there was a significant difference
between grades given by student peers and those given by the instructors. The empirical findings provided evidence that peer assessment was well received by the majority of students in the composition MOOC – ESL/EFL Writing for Academic and Professional Success. Specifically, 85.6% of the participants took part in peer assessment training and referred to the training instructions and rubric when commenting on and grading their peers' papers. 88% of the participants reported that they followed the guidelines provided during training for commenting on and grading peers' writing and followed the rubrics when grading. 62.8% of the participants believed the feedback they gave their peers was useful, and the majority of the participants (76.7%) believed that their feedback on peers' writing was thorough and constructive. Most of the participants admitted that they used peers' feedback when they revised their writing and that peer assessment helped improve their writing performance. Regarding perceptions of peers' qualifications, 66.3% of the participants agreed that peers were qualified to give feedback and grades. Overall, the positive attitudes and perceptions of students towards peer assessment observed in this study confirmed previous studies (e.g., Simkin & Ramarapu, 1997; Leki, 1990; Krashen, 1982; Nelson & Murphy, 1992; Stanley, 1992; Connor & Asenavage, 1994; Murau, 1993; Brammer & Rees, 2007; Luo et al., 2014) in that peer assessment was perceived as helpful in improving students' performance. This study added to the existing literature the finding that students had positive perceptions of online peer assessment in a composition MOOC context.

Despite these positive attitudes and perceptions, students in the composition MOOC – ESL/EFL Writing for Academic and Professional Success also expressed negative feelings about the activity. Approximately 12.7% of the participants perceived the feedback they got from peers as negative and critical. 38.5% of the participants believed that peers did not give fair grades on their writing. 61.5% of the participants felt that they themselves were not qualified to give feedback and grades on peers' writing, and 34.7% felt that peers were not qualified to assess others' work. A small number of students (at least 3%) claimed that peer assessment did not help improve their writing at all, and 73% of the students found peers' comments and grades to be different and inconsistent. Generally, the students' negative attitudes towards peer assessment in this study were similar to what Kaufman and Schunn (2011) found: students sometimes thought peer assessment unfair and believed they were not qualified to grade peers' work. However, since Kaufman and Schunn's study focused on a small number of students' perceptions of online peer assessment while this study examined students' perceptions of online peer assessment in a composition MOOC, a further conclusion can be drawn: students tend to perceive unfairness and a lack of qualification whenever peer assessment is involved, irrespective of the learning context. Plausible explanations are that (1) students are not teachers, (2) students do not receive sufficient training on peer assessment, and (3) students do not get quality practice in peer assessment.
While findings in other studies in the literature supported the usefulness of peer assessment in helping students improve their writing performance (e.g., Simkin & Ramarapu, 1997; Leki, 1990; Krashen, 1982; Nelson & Murphy, 1992; Stanley, 1992; Connor & Asenavage, 1994; Murau, 1993; Sadler & Good, 2006; Brammer & Rees, 2007), one finding in this study is different: a small number of students (approximately 3%) claimed that peer assessment did not help improve their writing at all. A percentage of 3% seems small, but it represents a considerable number of learners when scaled to a MOOC-sized student population. The students also described difficulties and challenges that they had with the online peer assessment used in the composition MOOC – ESL/EFL Writing for Academic and Professional Success. As commentators and graders, the majority of the students spoke about their lack of confidence, lack
of experience, and lack of qualification. Other difficulties and challenges included technical issues (27%), the peer assessment tools, i.e., the rubrics (15%), the anonymity of peer assessment (13%), the types of writing assignments (9%), and other factors such as one's English competence (5%), emotional factors (4%), and different rhetorical strategies (3%). As writers being commented on and graded, 68% of the students said that they had difficulty interpreting and understanding the comments given by peers. Many students pointed to the irrelevance of comments and grades (34%), fairness (23%), peers' qualifications (19%), the anonymity of peer assessment (15%), and other issues (10%) such as trust, conflicting comments, and cultural differences. The difficulties and challenges of students in the composition MOOC – ESL/EFL Writing for Academic and Professional Success were similar to what Watters (2012) noted about peer assessment in MOOCs generally, for example the anonymity of feedback and the lack of feedback on feedback. What this study added to the existing literature is an account of students' difficulties and challenges with peer assessment in a composition MOOC, explored from two perspectives – writers as commentators and graders, and writers as those being commented on and graded.
Students' Grading Compared to Instructors' Grading

Statistical analysis in this study shows that there were significant differences between the grades given by students and those given by the instructors. For both Draft 1 and the Final Draft of all three writing assignments, the descriptive analysis revealed that the means of the peer-grading scores were higher than the means of the instructor-grading scores; that is, the grades the students awarded to their peers tended to be higher than the instructor-assigned grades. While studies by Cho et al. (2006), Sadler and Good (2006), and Bouzidi and Jaillet (2009) show a high consistency among grades assigned by peers and a high correlation between peer grading and teacher grading, the finding of this study is different from, and opposite to, previous studies in the literature. An initial plausible interpretation for this difference is that previous studies' findings are generally based on college courses with small or moderate enrollments, not a large-scale MOOC student population like this study. However, this initial interpretation might not hold when the present study's finding is compared with other studies in a MOOC context. The study by Luo, Robinson, and Park (2014) was the only empirical study to examine peer grading in a MOOC (at least to the best of the researcher's knowledge). Their findings provided evidence that peer-grading scores were fairly consistent and highly similar to instructor-grading scores, which is contrary to the finding of this study. In this study, differences between student-grading scores and instructor-grading scores might result from the following factors:

• Students' Attitudes: Considering that students' participation is one of the cruxes of peer assessment, students' attitudes towards peer assessment can affect grading consistency. From the survey responses, approximately 65% felt that they were not confident in grading their peers' writing. In addition, students tended to inflate their peers' writing by giving peers praise and high scores. This lack of confidence and tendency toward high scoring can affect the quality of peer grading.
• Diversity of MOOC Students: Students in the composition MOOC – ESL/EFL Writing for Academic and Professional Success were diverse, coming from many different countries (mostly non-native English speaking countries), of all ages and backgrounds. More importantly, the students had greatly varied levels of English proficiency. Students with a low level of English proficiency may not be competent enough to evaluate peers' writing, so it may be poor competence in English that leads to inaccuracy in peer assessment.
• Peer Assessment Tools: From the survey responses, approximately 15% of students claimed that one of the challenges for them as commentators and graders was the peer assessment tools, i.e., the rubrics. Many students did not thoroughly understand the standard criteria for peer assessment and the rubrics. This lack of understanding of the rubric and peer assessment criteria may be a source of peer grading errors.
• Students' Qualification in Peer Assessment: Many students (around 61.5%) admitted that they themselves were not qualified to grade peers' writing, giving reasons such as lack of confidence, lack of experience, and low English competence. Before peer assessment, the course provided students with some training, within a limited time of one week, on how to write comments and how to grade the papers based on the assigned rubrics. The majority of students (at least 85.6%) responded that they took part in peer assessment training, followed the guidelines provided during training for commenting on and grading peers' writing, and followed the rubrics for grading peers' writing. However, the fact that students followed the peer assessment instructions and rubrics does not guarantee that they assessed peers' writing accurately, given that they received the training in a short period of time. Students should have quality practice after receiving training on peer assessment because good grading technique is difficult to learn.
• Number of Student Graders: In the composition MOOC – ESL/EFL Writing for Academic and Professional Success, each submitted paper was graded by three student graders based on the same rubric. The final peer-grading score for a paper was calculated as the mean of all the scores given by peers (a minimal sketch of this aggregation appears after this list). With more graders, the consistency of student-grading scores might improve (as found in Cho et al., 2006); for more consistent peer-grading scores, the suggested number of student graders is at least five or seven.
• Using Peer-Grading Scores as Final Scores: Similar to other MOOCs, the ESL/EFL Writing for Academic and Professional Success course utilized peer assessment due to its large enrollment of thousands of students. Using peer-grading scores as the final scores for students' submitted papers, and as a replacement for instructor-grading scores, may cloud the accuracy of peer grading, especially when students tend to inflate peers' writing and award peers high scores.
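To make the aggregation described under "Number of Student Graders" concrete, the following is a minimal sketch in Python with hypothetical paper IDs and scores. The mean of the three peer scores is the aggregation described in the chapter; the median shown alongside is only an illustrative alternative (not something used in the course) that dampens the effect of a single inflated grade.

```python
from statistics import mean, median

# Hypothetical peer scores: three graders per submitted paper, on a 0-10 rubric scale.
peer_scores = {
    "paper_001": [8.0, 7.5, 8.5],
    "paper_002": [6.5, 7.0, 9.5],  # one unusually high grade
}

for paper_id, scores in peer_scores.items():
    final_score = mean(scores)     # aggregation described in the chapter: mean of all peer scores
    robust_score = median(scores)  # illustrative alternative, less sensitive to a single outlier
    print(f"{paper_id}: mean = {final_score:.2f}, median = {robust_score:.2f}")
```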
Implementation of Peer Assessment in Composition MOOCs

The enrollments of many thousands of students in MOOCs seem to exceed the assessment capacity (i.e., evaluating and grading) of instructors; the inability of instructors to grade so many papers is likely responsible for MOOCs turning to peer assessment. One question raised is "Does peer assessment work in composition MOOCs?" The answer is both yes and no. On one hand, peer assessment in a composition MOOC was well received by students. Students had positive perceptions and attitudes towards peer assessment, especially its usefulness – the majority of students used peers' feedback when they revised their writing, and peer feedback helped improve students' writing performance. The improvement in students' writing performance is evidenced by the scores that students received: according to Tables 10, 11, and 12, for all three writing assignments, the mean of students' scores for the Final Draft was higher than the mean for Draft 1, no matter who graded students' writing – peers or instructors.
On the other hand, challenges in composition MOOCs such as students' lack of confidence, experience, and qualification, the diversity of MOOC students, the anonymity of peer assessment, the peer assessment tools, the accuracy of peer assessment, and so forth have greatly affected the effectiveness of peer assessment in composition MOOCs. In spite of these challenges, and because of the large scale of the student population, peer assessment has been utilized in most composition MOOCs and is considered the most feasible assessment method. However, peer assessment does not work by magic. For optimal peer assessment and a reduction of these challenges, providing students with sufficient training on peer assessment and creating high-quality peer assessment tools (i.e., rubrics, peer assessment instructions, and guideline sheets) are of the utmost importance.

In terms of training on peer assessment, instructors' preparation plays an important role in the success of peer assessment. Prior to the peer assessment session, instructors need to determine learning goals and effective peer assessment strategies based on students' writing proficiency, feedback skills, and experience in collaborative work. By doing so, teachers will know what criteria to set for the peer assessment and what strategies can help to maximize the activity. Moreover, since composition MOOCs have a great diversity of students, it is important to include some cultural awareness training so that students begin to appreciate subtle differences in peer responses. Last but not least, practice makes perfect: students need sufficient time to practice in order to produce quality peer feedback.

In addition to sufficient training on peer assessment, a high-quality rubric is also a decisive factor in the success of peer assessment. Peer assessment tools should state a number of specific questions or present a list of areas for readers to focus on when assessing peers' writing. If guided questions and instructions are too general, students might have difficulty identifying what to respond to. Furthermore, composition MOOCs have diverse students, including ESL students; factors such as linguistic background, personal history with writing, and cultural background might influence the organizational structures of ESL texts and how these students perceive peers' writing. Clearly stated words or phrases (i.e., 'thesis', 'thesis statement', 'topic sentence', 'audience', and the like) are important for making ESL students aware of English rhetorical patterns and the structure of writing. Moreover, peer assessment tools should explore the question of what causes writers to revise. According to Sommers (1980), in her study on revision strategies, one of the major reasons that students revise is the "incongruities between intention and execution" (p. 385). In addition, the social interaction perspective on writing (i.e., the relationship between the writer and reader) and the mismatch between a writer's intention and the reader's comprehension offer insights about revision. Nold (1981) added that in order to revise successfully, writers need both to recognize such mismatches and to have the capacity to produce a clearer alternative to the current text.
LIMITATIONS OF THE STUDY

This study examined aspects of peer assessment utilized in a composition MOOC, specifically students' perceptions and the scores given by students and instructors. Several factors may be considered limitations of this study. First, the study used a convenience sample, so the results are, to a certain extent, hard to replicate. Second, because the study was conducted in the context of a small composition MOOC, the findings might not generalize to other composition MOOCs with a massive population of learners. Third, the concept and practice of MOOCs is still new to researchers and educators, and there have been very limited studies in this area. Therefore, the
literature review chapter of the study might lack significant discussion of this scholarship. Finally, the data sources used in this study consisted of a survey and the writing assignment grades assigned by students and instructors. Surveys are self-report measurement techniques designed to question people about themselves, their attitudes, and their behaviors (Creswell, 2003). This type of measurement can be a potential source of unreliable answers because respondents may exaggerate, may be embarrassed to state their true response, or may simply forget the true account.
REFERENCES Baird, J. R., & Northfield, J. R. (1992). Learning from the PEEL experience. Melbourne, Australia: Monash University. Baker, S., Gersten, R., & Graham, S. (2003). Teaching Expressive Writing to Students with Learning Disabilities Research-Based Applications and Examples. Journal of Learning Disabilities, 36(2), 109–123. doi:10.1177/002221940303600204 PMID:15493427 Bennett, R. E. (2006). Technology and writing assessment: Lessons learned from the US National Assessment of Educational Progress. Annual Conference of the International Association for Educational Assessment. Singapore: IAEA. Retrieved from http://www.iaea.info/documents/paper_1162a26d7.pdf Bill & Malinda Gates Foundation. (n. d.). What we do. Retrieved from http://www.gatesfoundation.org/ What-We-Do/US-Program/Postsecondary-Success Boston, W., & Helm, J. S. (2012). Why student learning outcomes assessment is key to the future of MOOCs. National Institute for Learning Outcomes Assessment. Retrieved from http://illinois.edu/blog/ view/915/84723?displayType=month&displayMonth=201212 Bouzidi, L., & Jaillet, A. (2009). Can Online Peer Assessment be Trusted? Journal of Educational Technology & Society, 12(4), 257–268. Brammer, C., & Rees, M. (2007). Peer review from the students’ perspective: Invaluable or invalid? Composition Studies, 35(2), 71. Bridgeman, B., Trapani, C., & Yigal, A. (2012). Comparison of human and machine scoring of essays: Differences by gender, ethnicity, and country. Applied Measurement in Education, 25(1), 27–40. doi:1 0.1080/08957347.2012.635502 Bruffee, K. (1984). Collaborative Leaming and the Conversation of Mankind. College English, 46(7), 635–652. doi:10.2307/376924 Byrne, R., Tang, M., Truduc, J., & Tang, M. (2010). eGrader, a software application that automatically scores student essays: With a postscript on the ethical complexities. Journal of Systemics. Cybernetics & Informatics, 8(6), 30–35. Chen, C.-F. E., & Cheng, W.-Y. E. (2008). Beyond the design of automated writing evaluation: Pedagogical practices and perceived learning effectiveness in EFL writing classes. Language Learning & Technology, 12(2), 94–112.
Cheville, J. (2004). Automated scoring technologies and the rising influence of error. English Journal, 93(4), 47–52. doi:10.2307/4128980 Cho, K., Schunn, C. D., & Wilson, R. W. (2006). Validity and reliability of scaffolded peer assessment of writing from instructor and student perspectives. Journal of Educational Psychology, 98(4), 891–901. doi:10.1037/0022-0663.98.4.891 Chodorow, M., & Burstein, J. (2004). Beyond essay length: Evaluating e-rater’s performance on TOEFL essays (TOEFL research report, No. RR-04-73). Princeton, NJ: Educational Testing Service. Cindy, J. (2007). Validating a computerized scoring system for assessing writing and placing students in composition courses. Assessing Writing, 11(3), 167–178. Condon, W. (2013). Large-scale assessment, locally-developed measures, and automated scoring of essays: Fishing for red herrings? Assessing Writing, 18(1), 100–108. doi:10.1016/j.asw.2012.11.001 Connors, R. (1997). Composition-rhetoric: Backgrounds, theory, and pedagogy. Pittsburg: University of Pittsburg Press. Connor, U., & Asenavage, K. (1994). Peer response groups in ESL writing classes: How much impact on revision? Journal of Second Language Writing, 3(3), 257–276. doi:10.1016/1060-3743(94)90019-1 Coursera. (2012). Peer Assessment. Coursera. Duke University. Retrieved from https://www.coursera. org/course/composition Coursera. (n. d.). Georgia Institute of Technology. Retrieved from https://www.coursera.org/course/gtcomp Coursera. (n. d.). The Ohio State University. Retrieved from https://www.coursera.org/course/writing2 Creswell, J. W. (2003). Research design: Qualitative, quantitative, and mixed methods approaches (2nd ed.). Thousand Oaks, CA: Sage Publications. Creswell, J. W., & Clark, P. V. L. (2007). Designing and conducting mixed methods research. Thousand Oaks, CA: Sage. Creswell, J. W., Plano Clark, V. L., Gutmann, M. L., & Hanson, W. E. (2003). Advanced mixed methods research designs. In A. Tashakkori & C. Teddlie (Eds.), Handbook of mixed methods in social and behavioral research (pp. 209–240). Thousand Oaks, CA, USA: Sage. Deane, P. (2013). On the relationship between automated essay scoring and modern views of the writing construct. Assessing Writing, 18(1), 7–24. doi:10.1016/j.asw.2012.10.002 DiPardo, A., & Freedman, S. (1988). Peer response groups in the writing classroom: Theoretic foundations and new directions. Review of Educational Research, 58(2), 119–150. doi:10.3102/00346543058002119 Educause. (2012). Retrieved from http://net.educause.edu/ir/library/pdf/eli7078.pdf Educause. (2013a). Peer Assessment in MOOCs. Retrieved from https://net.educause.edu/ir/library/pdf/ ELI139_OL14.pdf Elbow, P. (1981). Writing with power: Techniques for mastering the writing process. Oxford University Press.
Educause. (2013b). Writing II: Rhetorical Composing. Retrieved from http://www.educause.edu/sites/ default/files/library/presentations/E13/SESS008/Writing2-Final-Report.pdf Falchikov, N. (1995). Peer feedback marking: Developing peer assessment. Innovations in Education and Training International, 32(2), 175–187. doi:10.1080/1355800950320212 Falchikov, N. (2001). Learning together: Peer tutoring in higher education. London: Routledge Falmer. doi:10.4324/9780203451496 Falchikov, N. (2005). Improving assessment through student involvement. New York: Routledge Falmer. Falchikov, N., & Goldfinch, J. (2000). Student peer assessment in higher education: A metaanalysis comparing peer and teacher marks. Review of Educational Research, 70(3), 287–322. doi:10.3102/00346543070003287 Fry, S. A. (1990). Implementation and evaluation of peer marking in higher education. Assessment & Evaluation in Higher Education, 15(3), 177–189. doi:10.1080/0260293900150301 George, D., & Mallery, P. (2009). SPSS for Windows step by step: a simple guide and reference 16.0 update. Boston, MA: Allyn and Bacon. Gielen, S., Peeters, E., Dochy, F., Onghena, P., & Struyven, K. (2010). Improving the effectiveness of peer feedback for learning. Learning and Instruction, 20(4), 304–315. doi:10.1016/j.learninstruc.2009.08.007 Grosseck, G. (2009). To Use or Not to Use Web 2.0 in Higher Education? Paper presented at the Procedia Social and Behavioral Sciences, World Conference on Educational Science. Haaga, D. A. F. (1993). Peer review of term papers in graduate psychology courses. Teaching of Psychology, 20(1), 28–32. doi:10.1207/s15328023top2001_5 Head, K. (2013). Lessons Learned from a Freshman-Composition MOOC. The Chronicle of Higher Education. Retrieved from http://chronicle.com/blogs/wiredcampus/lessons-learned-from-a-freshmancomposition-mooc/46337 Holt, M. (1992). The value of written peer criticism. College Composition and Communication, 43(2), 384–392. doi:10.2307/358229 Kaufman, J. H., & Schunn, C. D. (2011). Students perceptions about peer assessment for writing: Their origin and impact on revision work. Instructional Science, 39(3), 387–406. doi:10.1007/s11251-010-9133-6 Krashen, S. (1982). Principles and practice in second language learning and acquisition. Oxford: Pergamon. Krause, S. (2013). The end of the Duke Composition MOOC: again, what did we learn here? Retrieved from http://stevendkrause.com/2013/06/21/the-end-of-the-duke-composition-mooc-again-what-did-welearn-here/comment-page-1/ Kuh, G. (2012). What matters to student success (Keynote address).Proceedings of the annual National Symposium on Student Retention, New Orleans, LA, USA.
Leal, F. (2012, September 14). Report: U.S students lack writing skills. Retrieved from http://www.ocregister.com/articles/students-371409-writing-graders.html Leki, I. (1990). Potential problems with peer responding in ESL writing classes. CATESOL Journal, 3, 5–19. Lee, H.-J., & Lim, C. (2012). Peer Evaluation in Blended Team Project-Based Learning: What Do Students Find Important? Journal of Educational Technology & Society, 15(4), 214–224. Liu, E. Z. F., Lin, S. S. J., & Yuan, S. M. (2002). Alternatives to instructor assessment: A case study of comparing self and peer assessment with instructor assessment under a networked innovative assessment procedures. International Journal of Instructional Media, 29(4), 10. Luo, H., Robinson, A. C., & Park, J. Y. (2014). Peer grading in a mooc: Reliability, validity, and perceived effects. Online Learning: Official Journal of the Online Learning Consortium, 18(2). Magin, D. (1993). Should student peer ratings be used as part of summative assessment? Research and Development in Higher Education, 16, 537–542. Mangelsdorf, K. (1992). Peer reviews in the ESL composition classroom: What do the students think? ELT Journal, 46(3), 274–284. doi:10.1093/elt/46.3.274 McLoughlin, C., & Lee, M. J. W. (2007). Social software and participatory learning: Pedagogical choices with technology affordances in the Web 2.0 era. In ICT: Providing choices for learners and learning. Proceedings of the ascilite Singapore. http://www.ascilite.org.au/conferences/singapore07/procs/mcloughlin.pdf Murau, A. M. (1993). Shared Writing: Students’ Perceptions and Attitudes of Peer Review. Working Papers in Educational Linguistics, 9(2), 71-79. Nelson, G. L., & Murphy, J. M. (1992). An L2 writing group: Talk and social dimension. Journal of Second Language Writing, 1(3), 171–193. doi:10.1016/1060-3743(92)90002-7 Nelson, G., & Murphy, J. (1993). Peer response groups: Do L2 writers use peer comments in revising their drafts? Journal of Second Language Writing, 27, 135–142. New Media Horizon and EDUCAUSE. (2013). NMC Horizon Report: 2013 Higher Education Edition. New Media Consortium. Nold, E. (1981). Revising. In C. H. Frederiksen & J. F. Dominic (Eds.), Writing: The Nature, Development, and Teaching of Written Communication (pp. 67-79). Hillsdale, NJ: Erlbaum. Rada, R., Michailidis, A., & Wang, W. (1994). Collaborative hypermedia in a classroom setting. Journal of Educational Multimedia and Hypermedia, 3, 21–36. Rees, J. (2013). The MOOC Racket. Retrieved from http://www.slate.com/articles/technology/future_tense/2013/07/moocs_could_be_disastrous_for_students_and_professors.html
Rushton, C., Ramsey, P., & Rada, R. (1993). Peer assessment in a collaborative hypermedia environment: A case study. Journal of Computer-Based Instruction, 20, 75–80. Sadler, P. M., & Good, E. (2006). The impact of self-and peer-grading on student learning. Educational Assessment, 11(1), 1–31. doi:10.1207/s15326977ea1101_1 Simkin, M. G., & Ramarapu, N. K. (1997). Student perceptions of the peer review process in student writing projects. Journal of Technical Writing and Communication, 27(3), 249–263. Sloan, C. (2013). 10th Annual Survey of Online Learning: Changing Course: Ten Years of Tracking Online Education in the United States. Retrieved from http://sloanconsortium.org/node/384451 Sommers, N. (1980). Revision strategies of student writers and experienced adult writers. College Composition and Communication, 3(4), 378–388. doi:10.2307/356588 Stanley, J. (1992). Coaching student writers to become effective peer evaluators. Journal of Second Language Writing, 1(2), 17–233. Swales, J. M., & Feak, C. B. (2004). Academic writing for graduate students: Essential tasks and skills (Vol. 1). Ann Arbor, MI: University of Michigan Press. Thompson, R. F. (1981). Peer grading: Some promising advantages for composition research and the classroom. Research in the Teaching of English, 15(2), 172–174. Topping, K. J., Smith, E. F., Swanson, I., & Elliot, A. (2000). Formative peer assessment of academic writing between post students. Assessment & Evaluation in Higher Education, 25(2), 151–169. doi:10.1080/713611428 Topping, K. J. (2009). Peer assessment. Theory into Practice, 48(1), 20–27. doi:10.1080/00405840802577569 Villamil, O. S., & Gurrerro, M. C. M. (1996). Peer revision in the L2 classroom: Social cognitive activities, mediating strategies, and aspects of social behaviors. Journal of Second Language Writing, 5(1), 51–75. doi:10.1016/S1060-3743(96)90015-6 Vygotsky, L. S. (1978). Thought and language, 1962. In Mind and Society. Watters, A. (2012, August 27). The problems with peer grading in Coursera. Inside Higher Ed. Retrieved from: http://www.insidehighered.com/blogs/hack-higher-education/problems-peer-grading-coursera Wen, M. L., & Tsai, C.-C. (2006). University students perceptions of and attitudes toward (online) peer assessment. Higher Education, 27(18), 27–44. doi:10.1007/s10734-004-6375-8 Zhang, S. (1995). Reexamining the affective advantage of peer feedback in the ESL writing class. Journal of Second Language Writing, 4(3), 209–222. doi:10.1016/1060-3743(95)90010-1
Chapter 10
A Journey Through the Development of Online Environments:
Putting UDL Theory into Practice Christopher P. Ostrowski University of Calgary, Canada
Luciano da Rosa dos Santos University of Calgary, Canada
Jennifer Lock University of Calgary, Canada
Noha F. Altowairiki University of Calgary, Canada
S. Laurie Hill St. Mary’s University, Canada
Carol Johnson University of Calgary, Canada
ABSTRACT

As higher education institutions move toward offering more online courses, they need to carefully consider how the principles of Universal Design for Learning (UDL) should be integrated into the design and development of online environments so as to better meet the needs of all learners. An example of how this can occur is illustrated in this chapter through a design project that used principles of UDL in the creation of online environments for field experience courses at one Canadian university. The design team shares the journey of developing their understanding of UDL and applying these principles when creating online environments for both students and instructors. The provision of educational development opportunities for instructors using various strategies is also highlighted. The chapter concludes with three recommendations for future research.
INTRODUCTION

Contemporary higher education institutions have an increasingly diverse student population. Instructors need to take into account diverse learning needs when designing courses and learning tasks. Adding to this complexity, more face-to-face courses are being transitioned to blended or online learning environments.
As instructors confront the challenges of transforming their teaching from face-to-face to online, they have additional opportunities to take advantage of new digital technologies and pedagogies to better meet students' learning needs. This chapter shares the journey of a team of instructors and graduate students who studied and implemented Universal Design for Learning (UDL) in the design of online learning environments for field experience (practicum) courses. The project described provided the design team with an opportunity to develop their understanding and to apply the principles of UDL. Further, the way in which UDL guided the design of the online environments and the educational development of the field experience instructors is examined. The chapter concludes with recommendations for future research focused on using UDL for designing online environments and supporting instructors' educational development. The objectives of this chapter are the following:

• To provide an overview of Universal Design for Learning (UDL).
• To describe how a team developed their knowledge and skills to design and develop online environments using principles of UDL.
• To share strategies used by the team to support instructors in the use of the online environments.
• To provide recommendations for future research.
BACKGROUND There is a growing trend in higher education institutions to offer online and blended learning. With the demand for more flexible learning, institutions can take advantage of the affordances of multimedia, social media, interactive websites, and informal online learning opportunities (e.g., YouTube, Lynda, iTunes U) in developing robust learning within technology-enhanced environments (Johnson, Adams Becker, Estrada, & Freeman, 2014). Furthermore, since learning is both an individual and a social process (Oztok, Zingaro, Makos, Brett, & Hewitt, 2015), there is an opportunity to create learning experiences that foster a community of learners as well as nurturing student engagement (He, 2013). With the development of rich online and blended environments, careful planning must be given to both the pedagogical and technological components to meet the learning needs of all students. To meet the diverse needs of students with varying experiences and expertise, abilities, and approaches to learning, care must be taken in the design of learning tasks, courses, and programs to incorporate flexibility (Scott, Mcguire, & Shaw, 2003). One framework that provides a comprehensive approach to designing learning to meet the needs of students using multiple approaches and multimedia is Universal Design for Learning (Meyer, Rose, & Gordon, 2014). Founded on neuroscience and educational research, the Universal Design for Learning (UDL) framework can help instructors to design for diverse learning needs and enhance learning experiences for all students (Mangiatordi & Serenelli, 2013). Appropriate educational development opportunities need to be provided for instructors to develop knowledge and skills with regard to designing learning using the UDL framework. They need to have an understanding of the principles of UDL and what that looks like in practice in support of student
learning. Further, instructors will need to learn how to implement the components of UDL in both the design and facilitation of learning in technology-enhanced learning environments (e.g., online). In the following sections, the authors provide a brief overview of four areas related to online learning and UDL. First, a brief description of current trends in online learning within higher education contexts is provided. Second, a description of UDL and its three principles are shared as an overview of the framework. Third, the manner in which UDL is used in online environments is discussed. Fourth, educational development for teaching online, along with integrating UDL in online learning is examined.
Current Trends in Online Learning Online learning has experienced rapid growth in recent years, as exemplified by the 6.7 million students in the United States enrolled in at least one online course during 2012 (Allen & Seaman, 2013). Incorporating technology into teaching and learning approaches has opened doors for educators to design interactive and engaging online learning environments. Online learners have widespread access to information and communication tools to collaborate with experts, peers, and instructors to foster knowledge acquisition and personal growth. According to Johnson et al. (2014), “Online learning environments can offer different affordances than physical campuses, including opportunities for increased collaboration while equipping students with stronger digital skills” (p. 10). With the integration of synchronous and asynchronous discussion forums, instructors are well positioned to design and facilitate meaningful online learning experiences. For instance, the use of video (e.g., iMovie, Dropcam) and audio creation tools (e.g., VoiceThread, SoundCloud) enables instructors “to capture important human gestures, including voice, eye contact, and body language, which all foster an unspoken connection with learners” (Johnson et al., 2014, p. 18). Online learning is evolving in technological and pedagogical ways, which can create “a rich learning experience for students” (Wilcox & Lock, 2014). Universities have incorporated online learning to provide flexible, accessible, and more dynamic learning environments for a large number of students (Johnson et al., 2014). Online learners have been described as “not traditional learners” because of their differences in abilities, learning preferences, prior knowledge, and non-academic commitments (Rao, 2012; Rudestam & Schoenholtz-Read, 2010). To meet the increasingly diverse needs of learners, the UDL framework can be used to provide a blueprint for educators to design flexible and inclusive learning environments.
Universal Design for Learning (UDL)

Universal Design for Learning (UDL) is “a set of principles for curriculum development that give all individuals equal opportunities to learn” (National Center on Universal Design for Learning, 2014, para. 1). The framework stems from a broad base of neuroscience and educational research. From the former, UDL draws on the “affective, recognition, and strategic learning networks” (Meyer et al., 2014, p. 88). From the latter, the focus is on “optimal techniques for building engagement, knowledge, and skills” (p. 88). The combination of neuroscience and educational research informs the three principles of UDL:

1. Providing multiple means of engagement
2. Providing multiple means of representation
3. Providing multiple means of action and expression (Meyer et al., 2014)
Each of the three principles are based on a neural network and has specific educational implications. First, the affective network monitors “the internal and external environment to set priorities, to motivate, and to engage learning and behavior” (Meyer et al., 2014, p. 54). This network corresponds to the first UDL principle, engagement, which is focused on the “why of learning” that stimulates or motivates a person to learn (CAST, 2015). Three main guidelines are suggested to provide multiple means of engagement and develop affective learning: options for self-regulation; options for sustaining effort and persistence; and options for recruiting interest (CAST, 2014). These guidelines, along with those discussed below for each UDL principle should be considered as general suggestions that instructors can elect to implement. Alternatively, instructors can develop their own appropriate methods of achieving the UDL principles based on the contextual needs of their practice (e.g., course design, learner’s background, curriculum). For example, Smith (2012) used just one of the suggested guidelines, providing options for recruiting interest, to address students’ affective learning in a graduate course. To address this guideline, Smith described four ways of providing multiple means of engagement: first, reading materials were selected based on students’ background and interests; second, students were grouped based on their interests and backgrounds for class discussions; third, students had the option to work collaboratively or individually to complete learning tasks; fourth, feedback was provided for students as a reward to acknowledge their contributions in the learning process. From this implementation, Smith’s students experienced increased affective learning opportunities. Second, the recognition network is responsible for interpreting “information in the environment and transforming it into usable knowledge” (Meyer et al., 2014, p. 54). This network corresponds to the second UDL principle, representation, which focuses on the “what of learning” (CAST, 2015). Gathering facts, identifying words, recognizing patterns, and perceiving information are recognition tasks (Meyer et al., 2014), which can be supported by providing multiple means of representation. Three general guidelines are suggested to support recognition learning: options for comprehension; options for language, mathematical expressions, and symbols; and options for perception (CAST, 2014; Meyer et al., 2014). For instance, Rose, Harbour, Johnston, Daley, and Abarbanell (2006) used three techniques to enhance lectures in a graduate course. First, each lecture was video-recorded and made available on the course website for further review. According to Rose et al. recorded lectures allow anytime access, as well as replay opportunities to help learners fill in comprehension gaps or re-listen to difficult segments. Second, optional (face-to-face or online) small group discussions offered opportunities for deeper knowledge construction. Third, a small number of students from each lecture were assigned to take notes, which were then shared with the class using the course website. Using this technique, students could access multiple versions of lectures notes, which was beneficial because “different students capture and express very different content from the lecture and they represent it in very different ways” (Rose et al., 2006, p. 11). 
Additionally, students often enhanced their notes by “bringing in additional information, commentary, or questions; adding images or drawings; adding multimedia (like video or sound); or preparing the notes in a particularly cogent and clear way” (p. 12). Traditional teaching approaches such as lecturing can thus be improved with supplementary techniques that make them more inclusive and address all learners’ needs. Third, the strategic network is used to “plan, organize, and initiate purposeful actions in the environment” (Meyer et al., 2014, p. 54). This network corresponds to the third UDL principle, expression and action, which is focused on the “how of learning” (CAST, 2015). Organizing ideas, articulating thoughts, and expressing knowledge are strategic learning tasks (Meyer et al., 2014). To support strategic learning, three guidelines are recommended: options for executive functions, options for expressions and communication, and options for physical action (CAST, 2014).
For example, Kumar and Wideman (2014) explained how they provided multiple means of expression in midterm and final exams by offering a variety of questions that students could select from to best represent their knowledge. In addition, students could communicate their answers using their preferred method, such as equations, text, or diagrams. Moreover, the weight of the final exam was reduced based on each student’s progress on short assignments completed throughout the semester. Traditional learning tasks such as exams can be redesigned with increased flexibility to give all students an opportunity to express their knowledge. These three neural networks work together throughout the learning process (Rose et al., 2006). As learners differ in their motivations for learning, comprehension of information, and expression of knowledge, the UDL framework enables educators to design and facilitate inclusive learning experiences for all learners. Incorporating UDL principles in higher education “holds the potential to ameliorate some of higher education’s most pressing issues, including the intractably low rates of persistence, retention, and degree completion evident at most colleges and universities today” (Davies, Schelly, & Spooner, 2013, p. 195).
Universal Design for Learning in Online Environments

The UDL framework can guide the design of online as well as traditional face-to-face learning environments. Supported by modern technology, online environments offer flexible ways of implementing the principles of UDL. Scholars have recommended incorporating UDL into online higher education to address all learners’ needs and preferences, while maintaining or improving the integrity and quality of learning (Rao, 2012; Rao, Edelen-Smith, & Wailehua, 2015; Rao & Tanners, 2011). Rao and Tanners (2011) designed an online course based on UDL and assessed students’ perspectives on which UDL features were valuable to them. To provide multiple means of engagement, Rao and Tanners used synchronous and asynchronous discussion forums (e.g., VoiceThread, Elluminate Live!). To provide multiple options for representation, learning content was offered through text, audio, and video. To provide multiple options for expression and action, students chose from a list of ways to express what they learned (e.g., writing a research paper or creating a multimedia project). Findings from the study revealed that 76% of students purchased print versions of the book, while 24% purchased the digital format. For the assigned articles, 52% of students read only the text, 32% read and listened to the articles concurrently, and 16% read some and listened to some. In terms of designing learning tasks, 92% of students appreciated having short weekly assignments with lower weighting, instead of fewer assignments weighted more heavily. In a survey one student commented, “These mini assignments are a great way to…keep us focused…throughout the semester… [They] also provide various ways for us to show the team’s understanding and knowledge, beyond just a major paper…thus giving more chances for multiple skill sets” (p. 223). Giving feedback on the short weekly assignments provided an opportunity for students to assess their strengths and weaknesses, and improve their performance. Engleman and Schmidt (2007) studied learners’ perceptions of an online graduate course for teacher education that was based on UDL and designed using UDL principles. The goal of the course was to teach students about UDL while providing opportunities for a first-hand experience in a flexible learning environment. Among the research findings, 83% of the students reported that having choices positively impacted their confidence because they had the opportunity to select a method that best aligned with their strengths. For each learning task, students could select from several activities. For example, they could choose to write an essay, answer questions related to a book chapter, and
create a practical plan based on how learned knowledge could be transferred into their workplace. Most students (82%) reported that having a choice gave them opportunities to challenge themselves by trying new approaches. As one student commented, “I tried something I never would have thought of” (Engleman & Schmidt, 2007, p. 118). Finally, the majority of students appreciated experiencing UDL first-hand, as they gained a deeper appreciation and understanding of UDL and a feeling of empowerment by having choices in their learning. Given this experience, the students reported a willingness to use UDL in their own teaching practice. Effectively implementing the principles of UDL (providing multiple options for engagement, representation, and expression) in an online environment requires the careful and intentional selection of supporting technology. To help educators make appropriate design and technology decisions, the University of Arkansas at Little Rock (n. d.) offered ten steps:

1. Create content first, then the design
2. Provide simple and consistent navigation
3. Include an accommodation statement
4. Use color with care
5. Choose fonts carefully
6. Model and teach good discussion forum etiquette
7. Choose learning management system (LMS) tools carefully
8. Provide accessible document formats
9. Convert PowerPoint to HTML
10. If the content is auditory make it also available visually, and vice versa (para. 1)
From this brief discussion of the literature on UDL and the active design of UDL principles into online course environments, we can see that there is supporting evidence for the implementation of UDL in higher education online environments. The variety of approaches used to implement UDL principles in the development of online course design demonstrates not only the flexibility of design available, but the opportunities for successfully incorporating UDL principles across multiple disciplines taught in the online course context. Together, the varied approaches and flexibility available through UDL design can enhance online courses to better support the diverse learning needs of our students.
Educational Development for Teaching Online

A key challenge of effective online teaching and learning is adequate digital literacy among instructors (Johnson et al., 2014). Educational development for teaching online is concerned with how to engage instructors with a range of online teaching experiences (Palloff & Pratt, 2011). Workshops are a common form of professional development, but there is a need to provide ongoing support for instructors through mentorships and peer learning (Johnson et al., 2014; Palloff & Pratt, 2011). Instructor training workshops often address only a sampling of essential functions (e.g., posting documents, entering grades, starting a discussion forum) without providing guidance on pedagogical design and delivery considerations. Workshops also tend to cover a large amount of content over a single session or a handful of sessions, leaving few opportunities for instructors to practice and apply what they have learned (Johnson et al., 2014). In contrast, as found by Vitale (2010), side-by-side mentoring and modelling can offer instructors opportunities to develop both pedagogical (e.g., teaching and assessment strategies) and technical skills. In her
research, Vitale noted that instructors could access exemplars and discuss pedagogical considerations with their mentors as a way of developing their knowledge and skills for teaching online. Educational development can be offered to instructors using face-to-face, blended, or online formats. Online training and support sessions (both synchronous and asynchronous) can be used in situations where instructors have busy schedules or are spread across different geographic locations. For example, mentoring can be conducted at a distance using audio- or video-conferencing. Beyond formal training sessions, resources can be provided in online environments, allowing instructors to engage in learning at their own pace (Cook & Steinert, 2013). Such resource-rich environments can be continually added to and can become a place where instructors meet virtually to support their own and each other's learning. A key finding of McQuiggan's (2012) research is the need for increased and sustained opportunities for discourse between novice and experienced online instructors. Educational development that engages instructors in using synchronous and asynchronous communication also builds their confidence and competency in how they can use these forms of communication in their teaching. There is a need to move beyond educational development for online teaching alone when planning to use UDL. Integrating principles of UDL into online teaching and learning approaches requires more than designing an online environment (e.g., a course shell). This process starts with acknowledging the diversity of learners and recognizing that one size does not fit all with regard to instruction and student learning. Rather, thoughtful consideration must be given to designing learning environments that are more inclusive of all students' learning needs. The challenge for educators is how to take the theory of UDL and implement it in practice in online learning environments in meaningful ways. For this implementation to occur, they may need exemplars and mentoring to help them work through the process of designing and facilitating learning that embraces the richness of UDL within technology-enhanced environments.
MAIN FOCUS OF THE CHAPTER

As more higher education institutions offer online courses and programs, what consideration is being given to the design of online environments using appropriate strategies and structures to meet the learning needs of all students? “The digital environment with its connectivity, multimedia, just-in-time communications, distributed authoring, wisdom of the crowd, and many other qualities, has opened the door to a broad palette of communication skills and options” (Meyer et al., 2014, p. 88). When designing the online environment for all learners, the following three principles apply:

1. “Provide multiple means of engagement (the ‘why’ of learning)”
2. “Provide multiple means of representation (the ‘what’ of learning)”
3. “Provide multiple means of action and expression (the ‘how’ of learning)” (p. 89)

Based on a two-year study, the authors share their experiences and insights in designing online environments for field experience (practicum) courses using the principles of UDL. As part of this work, they had to develop a sound understanding of UDL in order to inform how they designed and developed the online field experience environments. Further, the authors, who were members of the design team, provided instructors with various educational development opportunities informed by UDL to support them in using the UDL online environments. Drawing from this work, the authors put forward four recommendations
for practice, as well as three recommendations for future research with regard to the integration of UDL in online learning environments.
Context of the UDL Design Work

As part of the Bachelor of Education program at a western Canadian university, pre-service teachers are required to complete field experience (practicum) courses. The four field experience courses were supported by an online environment using a learning management system. The field experience curriculum integrated an online component to create a “sense of community, coherence, and shared learning amongst the pre-service teachers in the cohort” (da Rosa dos Santos, Seidel, & Lock, 2013, p. 1929). The design team (i.e., the authors) was tasked with redesigning the online environments using the principles of UDL and supporting instructors in implementing the environments for their pre-service teacher cohorts during the field experience courses. To refine the current online learning environments (i.e., course shells), a group of graduate students and instructors engaged in a two-year design-based research (DBR) study. DBR is focused on creating a solution to a problem and then studying the solution in action through multiple iterations. Such a methodology “strives to positively impact practice, bringing about transformation through the design and use of solutions to real problems” (McKenney & Reeves, 2012, p. 14). Our study involved the design and implementation of the online environments using the principles of UDL (Meyer et al., 2014) for four field experience courses. The goals of the research were threefold:

1. To use evidence-informed decisions in design and implementation
2. To support pre-service teachers in the use of the online environments (e.g., exemplars, how-to resources)
3. To support instructors in further customizing the course environments and facilitating learning

Using an in-depth gap analysis and three cycles of refinement (each consisting of design, implementation, and evaluation), the team developed four online course environments and a companion online instructor course environment. The team also provided support to the instructors through workshops, webinars, and coaching for the purpose of helping them to further personalize their course environments and refine their online teaching capacity. The team consisted of two academic staff members and five graduate students. Each member of the design team came to the project with different levels of understanding, experience, and expertise in UDL. To support moving from theory to the practice of UDL within the field experience context, the team engaged in three forms of educational development. First, the team began by developing a common understanding of UDL and ways of implementing it through an academic book club (da Rosa dos Santos, Altowairiki, Johnson, Liu, Hill, & Lock, 2015). For a short period of time, the team studied Universal Design for Learning: Theory and Practice, by Anne Meyer, David Rose, and David Gordon (2014). Each week, two team members provided a summary and led a dynamic discussion of themes from selected book chapters. Engaging in the book club fostered a sense of community among a group of acquaintances that had otherwise not worked together prior to the project. As an emergent community, team members faced inherent vulnerability when leading the discussions and sharing their perspectives. The interactions pushed members to engage deeply and to develop a greater level of confidence in exploring how elements of UDL could be implemented in the
proposed design work. The book club also helped team members develop skills to negotiate tensions and differing opinions about how the design work should be done. Moreover, it enabled each member to work on self-regulatory strategies (e.g., setting goals, establishing timelines) that were important throughout the project. Second, after the gap analysis, the initial design work for the project began. From this initial work, it became evident that members needed to learn more about specific elements related to UDL and its implementation in the online environment. Each team member self-directed their educational development journey to help inform the design, development, and facilitation of UDL. For example, some members enrolled in a graduate-level UDL course, some attended UDL-focused conferences, and others were involved in UDL professional organizations. Furthermore, the team invited UDL experts to speak to the group and to engage in question-and-answer sessions with a specific focus on the context of the project. This combination of educational approaches allowed members to share their understandings and collaboratively contribute to the development of others. Third, through the design, facilitation, data collection, and analysis processes, a space was created for learning with and from other team members and the project. The team developed an informal community of practice (Wenger, 1998). Through frequent formal (e.g., team meetings) and informal (e.g., hallway conversations, virtual communication) interactions, team members could express their needs, engage in conversation about the work, and provide support to each other as they completed various components of the project. The community of practice informed the tangible elements of the design work and supported the team’s confidence and competence in understanding, implementing, and living out the principles of UDL throughout the design, development, and implementation of the project.
Implementing and Fostering UDL in Online Learning Environments

Through the design, development, and facilitation work related to this project, the team evolved into competent designers using principles of UDL (Meyer et al., 2014) in the online environment. By sharing their experiences and learning from the collected research data, the team tackled the nuances of moving back and forth between UDL theory and practice, including the challenges and successes of implementing UDL in online learning environments. The following sections highlight two key learning moments for the team that occurred during the project:
1. Implementing UDL in online environments
2. Using UDL to guide educational development strategies
Implementing UDL in Online Environments

Based on the results from the study’s gap analysis, consultation with stakeholders, and the application of UDL principles, the team initiated a collaborative design process for creating the four course shells. Key design elements of the online learning environments and the manner in which they fulfilled aspects of UDL, as outlined in Universal Design for Learning Guidelines version 2.0 (CAST, 2011), are discussed. An online environment was advantageous because of the inherent flexibility for users to customize the display of information and learning content using the built-in settings of their personal devices. Beyond the accessibility features and customization (e.g., magnification, color inversion) options offered by Desire2Learn (D2L; the learning management system) and modern web browsers, headings with varying
font sizes, bolding, and graphical indicators were used throughout the online environments and in the supporting resource documentation for both instructors and students. The webinars and video content included in the environments were hosted on YouTube/Vimeo to allow access from different devices and the ability to enable closed-captioning. To access the tutorial content, learners could select from text documents, videos created by the design team, or external websites and multimedia. By using such features, end-users (students and instructors) were able to access course information in various ways (e.g., text, video, audio). Given the intensity of the field experience program, the design team maintained a consistent layout for each course. The online environments were designed in an organized and focused manner to help students navigate content quickly and easily. The intent of using a consistent layout was to foster familiarity and allow students to deepen their engagement (Meyer et al., 2014) as they progressed through the field experience program. After the first field experience course, students could recall information from their previous online environment(s) and were better prepared to engage with the subsequent online content structures. The use of a consistent format saved students from having to spend time and effort reacquainting themselves with the environment. Additionally, one form of assistance to guide students through course content was to organize it into weekly sections, which included relevant documents, links to discussion forums, and checklists. This design element was intended to allow students to plan and manage their time, and to minimize their uncertainty about the course delivery and requirements. To help with navigating the online environment and to assist in reacquainting learners with the online course areas, an introductory video tour was included on the homepage to highlight key features for each course in D2L. Developing a standard structure for the content, duplicated in all four courses, provided students with navigational ease and predictability. The majority of course materials were created using commonly available tools and resources (e.g., Google Docs, GarageBand, YouTube). Using such tools and resources allowed a variety of materials to be quickly created and shared in the online environments. Students could access these materials at any time and at any pace that was convenient for them during the field experience. This particular approach modeled technology use and online course interaction expectations for students. More specifically, it allowed students to see exemplars of how such tools could be used in their own teaching practices in the field experience. Discussion forums were used heavily throughout the courses. They provided opportunities for collaboration and engagement among students and their course instructor. According to Tu and Corry (2003), “asynchronous discussion designs may sound as easy as an instructor posting questions and the students responding, but an effective design, management, and strategy is required for students to internalize [learning]” (p. 304). Developing well-designed questions was critical to invite students into a conversation and engage them in discourses that promoted deep and meaningful learning. As part of the field experience, students within the same course section were placed in different schools.
As a result of this placement variety, many students did not have consistent face-to-face interactions with each other. Subsequently, the course section discussion forum areas provided a space for the students to connect and learn about their peers’ experiences at different schools. Further, students were often required, as part of the course expectations, to post or reply to other students. Within the D2L discussion forums, students could embed multimedia or links to external content, enabling them to communicate and interact with each other in multiple ways beyond text-based interactions. In anticipation of students with limited experience using online discussion forums, the design team provided exemplars and a tutorial to guide students in writing high-quality discussion posts. The exemplars
established a level of technology expectation and provided scaffolding for students to enhance their discussion posts (Palloff & Pratt, 2005). To support student learning in field experience, a variety of resources were provided to them. For example, tutorial resources (e.g., documents on how to create an audio or video post) were created by members of the design team for the purpose of bridging potential gaps or issues with course expectations. In addition to tutorial resources, students were provided templates and exemplars for writing discussion forum posts and creating lesson plans. These types of resources created using multimedia were included to help students manage information, support planning, and develop their teaching practices.
Using UDL to Guide Educational Development Strategies

Providing educational development opportunities for instructors was an integral part of implementing the online learning environments. The team worked with a cohort of 10 to 20 instructors per semester who had a wide range of experience in teaching online; it quickly became apparent that there was a need to provide multiple opportunities to support the instructors’ diverse technology needs and skills. The majority of instructors were experienced K-12 educators with limited online teaching experience. We used a mentoring approach to help instructors become acquainted and comfortable with using the online environments (Palloff & Pratt, 2011). UDL was not only used in the design of the online environments, but it was also used to guide the educational development for instructors. Supporting instructors to work in the online environment was as important as designing a UDL environment for students. Grounding the educational development in UDL principles allowed instructors to experience UDL first-hand and provided active modeling for their online courses. Instructors were offered a combination of group workshops, personalized coaching, text-based documents, videos, and links to external media to support their development in using the online field experience environments. The array of available supports helped foster a common ground of technical skills and knowledge among the instructors. Moreover, the supports elevated the instructors’ confidence and competence in using the more advanced features of the learning management system, such as audio and video recordings (Vitale, 2010). The instructors were initially oriented to the online environments through group training sessions led by the design team. The sessions introduced concepts of teaching and learning online, such as facilitating discussions, promoting collaboration, engaging students, and fostering community (Palloff & Pratt, 2011). To simplify navigation and maximize transfer of knowledge, the instructor resource environment was designed to closely model the student course environments. A companion instructor online environment was created as a separate D2L course shell. As part of this environment, discussion forums were created to facilitate collaboration and engagement among instructors and the design team (Palloff & Pratt, 2011; Wenger, 1998). Through collaboration, instructors could contribute resources and experiences. The instructor resource environment included a repository of video tutorials, a discussion forum area, and links to external websites regarding UDL and online teaching. The tutorial documents covered various functions within D2L, such as uploading content, providing student feedback, managing grades, and using discussion forums. Created by the design team, the documents included procedural instructions supported by scholarly literature, screenshots from the D2L environment, as well as links to external websites, literature, and videos. The intent of hosting the resources within a dedicated instructor environment was twofold: to acquaint instructors with the environment from a student’s perspective, and to showcase exemplars and stimulate ideas of how instructors could customize and deliver their own courses.
Instructors were provided personalized support through technology coaching by members of the design team. The coaching sessions were offered in-person or remotely (e.g., Adobe Connect Meeting, telephone). These sessions assisted instructors in becoming more familiar with the online environments and using various features in D2L. Also, instructors used the coaching sessions to ask questions and receive help designing specific course elements (e.g., setting up discussion forums, producing audio-video recordings, providing student feedback). The support allowed instructors to learn at their own pace and coaches could respond to specific needs (Palloff & Pratt, 2011). One-on-one coaching offered a safe space for instructors to work through insecurities related to designing and teaching in the online environment (Palloff & Pratt, 2011). The coaches gave ‘at-the-elbow’ support and attended to instructors’ individual needs. The coaching sessions helped instructors develop confidence and proficiency in using various tools in D2L. Moreover, the availability of the coaches allowed them to address instructors’ needs and questions before, during, and after the courses went live. Whether through coaching sessions or online resources, the instructors had a range of supports available with ‘anytime-anywhere’ access. This level of support was intended to mitigate anxiety and create a reliable safety-net for the instructors. As the instructors’ technical knowledge and skills advanced, additional coaching sessions provided a forum for the design team and instructors to critically discuss how UDL and online teaching pedagogy could be incorporated into their practices. A part of the educational development strategy was to foster a community of practice (CoP) among instructors. Before the start of each course, instructors had opportunities to meet together as part of an orientation process, as well as share insights from their experiences with previous iterations, and to help each other with the online environments. Members of the design team used these meetings to engage instructors and ensure the educational development offerings aligned with their needs. Through regular interaction with instructors, the design team facilitated a meaningful CoP in terms of content and process, to reach desired outcomes (Cheng & Lee, 2014; Vestal, 2006). The design team’s relationship with instructors was dynamic. As the design progressed through each iteration of the design-based cycle (Barab & Squire, 2004), the team continually refined how they supported instructors. Each coaching session, design meeting, group workshop, and conversation the designers had with instructors, students, and administrators impacted the supports that were offered. For example, the designers learned that providing too many tutorial resources could clutter the environment and make navigation difficult. As a result, the design team refined the availability of resources to meet specific course needs. By providing various educational development and community of practice opportunities, instructors further developed how they designed and customized their online course environments. As instructors became more familiar with the environments, they learned about advanced features of D2L through scaffolded online resources and coaching sessions. For example, in subsequent course iterations, instructors progressed from asking about basic features of D2L to wanting to learn more about complex strategies and goals for improving their courses. 
At the end of each year, all instructors were invited to meet with the design team to discuss improvements, showcase individual course shells, share what they learned through an exchange of ideas, and celebrate their work in the project. Moving forward, the knowledge gained from supporting instructors informed the design iterations and made the design team more aware of instructors’ specific technological and pedagogical needs. Supporting instructors helped the team recognize the need for multiple ways of engaging instructors in fostering their online teaching skills and technological abilities. A multifaceted network of supports
enabled the team to meet the needs of instructors who had varying experiences, backgrounds, and confidence in teaching online.
SOLUTIONS AND RECOMMENDATIONS

From the team’s experiences in conceptualizing the project through to its implementation and evaluation, several factors influenced the design of the online environments and the support provided to instructors. Drawing on the experiences of this design-based research project, four recommendations for practitioners are offered in the following two areas:
1. UDL design
2. Educational development
UDL Design

Designing the online environment with the principles of UDL should occur through an iterative process involving multiple stakeholders (e.g., administrators, students, instructors). First, the team found it beneficial to assume a fluid design that would evolve with each design cycle. The team knew it would not be possible to adequately capture all of the desired elements of UDL in the first iteration. Addressing the needs of end-users directly can provide students with agency and assist in streamlining the design process. Using an iterative design process with opportunities to gain stakeholder feedback can ensure resources and course elements are relevant and meaningful to students. As well, designers and/or instructors need to maintain a clear focus on the learning outcomes rather than focusing on course activities and the technologies available for implementation (McGee & Reis, 2012). Implementing the iterative design process gives stakeholders further opportunities to identify possible problematic areas where there may be a misalignment between the learning outcome and the learning tasks, or with the use of technology for the task. The iterative redesign process, which occurs during and after course completion, allows for changes that foster greater alignment between learning outcomes, instructional strategies, and assessment within the technology-enhanced learning environment. Second, specific strategies need to be identified and implemented to help stakeholders effectively engage in and use the online environment. The design and delivery of UDL-based online courses can also incorporate opportunities to explicitly guide users through the environment and its available features. For example, a video tour can give the user an overview of the online course shell and help instructors become familiar with the environment. Users accustomed to traditional text-based forms of communication could also benefit from being encouraged or required to explore alternative ways of experiencing and expressing their learning (e.g., video creation, audio reflections). Johnson (2013) suggested that the interactive components of online courses “allow students to not only retrieve, explore and analyze learning information, but to display, present and demonstrate” (p. 1183). The importance of multiple representations in learning is highlighted by Davis (2015). She suggests that the multiple integration of text, graphics, audio, and video “reduce the dependency on text, the inflection heard in the voice of the characters provides additional information, and the activity provides a multisensory learning experience” (n.p.). The use of self-chosen formats (e.g., text, audio, or video) permits students to add in elements such as graphics and/or audio. This choice is affirmed in the Read,
Reflect, Display, and Do model, referred to as the R2D2 model (Bonk, 2009; Bonk & Zhang, 2008), which encourages instructors to think about the varied multimedia choices available that can appeal to the various learning styles of students.
Educational Development

Two strategies are recommended to support educational development. First, educational development is an opportunity for instructors and the design team to learn with and from each other. The reciprocal relationship between the designers and the instructors was important in informing the creation of the online environments, as well as how instructors were supported in using them. The discourse between subject matter experts and designers can open a rich space to explore ways in which UDL can be used to enhance or enrich online learning experiences. For this to occur, open communication is needed within a trusting relationship that fosters candid discussion of approaches to improve online teaching practices. This type of educational development was supported through a community of practice (Palloff & Pratt, 2011; Wenger, 1998) among the design team and instructors. Second, the design cycles showed that the design team needed to provide timely and individualized support for instructors. The first and last weeks of the semester held the highest demands for educational development support (e.g., instructors setting up their course shells). Timely interactions that purposefully meet the needs of the instructors are critical for their advancement in using the online environment. Regular synchronous interactions make it possible to build relationships and to receive feedback for adjusting and modifying supports (He, 2014).
FUTURE RESEARCH DIRECTIONS

Drawing on the experiences of this research project using UDL, three recommendations for future research are offered. The first recommendation is for the evaluation of the design of online course environments that incorporate principles of UDL. This may include assessing the purposefulness of the design features and elements required by specific fields and disciplines. Additionally, research that seeks to measure the impact of applying the principles of UDL in the design of the online environment on student learning would provide additional evidence for implementing strategic UDL-based learning supports. The second recommendation for research is focused on the influence of the modeling of a UDL online environment. Students and instructors need examples to illustrate what UDL looks like in practice within various contexts. For example, what do multiple means of representation look like in an online discipline-based course, as compared to a skills-based course? When a pedagogically sound online environment is created using principles of UDL, how can it be used as a framework to learn about UDL? Such models provide an opportunity to deconstruct the environment to examine why and how the principles are used to meet specific learning outcomes. The third recommendation for research addresses the issue of providing sessional instructors with ongoing and consistent educational development. Often field experience courses are taught by sessional instructors who do not have access to regular educational development opportunities. This is especially challenging when there is an expectation that sessional instructors already understand UDL principles and are able to design and facilitate learning using UDL in the online environment. Given this issue,
questions arise: What are some strategies and ways in which sessional instructors can continue to develop their knowledge and skills in using the principles of UDL in online environments, during and beyond their course employment?
CONCLUSION

The project provided the design team with an opportunity to further develop their understanding of UDL and how to apply its principles in designing online environments for field experience courses. It was beneficial to approach UDL as a team to develop understanding together, as well as to work collaboratively with the instructors for whom the online environments were designed. The team realized that living out UDL through both the design of the online environment and educational development opportunities went far beyond creating a repository of resources in multiple formats and became a practical design approach. UDL required the team to provide multiple means of engagement (to support affective learning), representation (to support recognition learning), and action and expression (to support strategic learning). Throughout the design-based research study, the team wrestled with the details of each UDL principle when putting theory into practice, both for designing the online environment and for facilitating the educational development of instructors. As higher education institutions continue to shift towards teaching more courses online, careful consideration must be given to how the principles of UDL are implemented in the design and development of online environments. While it cannot be assumed that designing an environment using the UDL principles means the UDL features will be utilized by all stakeholders, having these features available provides a supportive learning foundation for all. Through the provision of educational development support, instructors will have access to learning opportunities that can increase their understanding and practical development in implementing UDL in their online courses. It is through this multilayered UDL approach, which supports learning for both instructors and students, that the principles of UDL can be moved from theory to practical action. In this way, day-to-day online teaching and learning practices become enhanced learning experiences.
REFERENCES Allen, I. E., & Seaman, J. (2013). Changing course: Ten years of tracking online education in the United States. Retrieved from http://www.onlinelearningsurvey.com/reports/changingcourse.pdf Barab, S., & Squire, K. (2004). Design-based research: Putting a stake in the ground. Journal of the Learning Sciences, 13(1), 1–14. doi:10.1207/s15327809jls1301_1 Bonk, C. J. (2009). R2D2: A model for using technology in education. eCampus News. Retrieved from http://www.ecampusnews.com/top-news/r2d2-a-model-for-using-technology-in-education/ Bonk, C. J., & Zhang, K. (2008). The R2D2 Model: Read, reflect, display, and do. In Empowering Online Learning: 100+ Activities for Reading, Reflecting, Displaying, and Doing. San Francisco, CA, USA: Jossey-Bass. Retrieved from http://www.publicationshare.com/pdfs/Chapter-1-of-R2D2-100-activitiesBook-by-Bonk-and-Zhang.pdf
CAST. (2011). Universal Design for Learning Guidelines version 2.0. Wakefield, MA: Author. CAST. (2014). UDL Guidelines version 2.0. Retrieved from http://www.udlcenter.org/aboutudl/udlguidelines/principle1 CAST. (2015). About Universal Design for Learning. Retrieved from http://www.cast.org/our-work/ about-udl.html#.VutQj7uFN9C Cheng, E., & Lee, J. (2014). Developing strategies for communities of practice. International Journal of Educational Management, 28(6), 751–764. doi:10.1108/IJEM-07-2013-0105 Cook, D. A., & Steinert, Y. (2013). Online learning for faculty development: A review of the literature. Medical Teacher, 35(11), 930–937. doi:10.3109/0142159X.2013.827328 PMID:24006931 da Rosa dos Santos, L. Altowairiki, N., Johnson, C., Liu, Y.F., Hill, L., & Lock, J. (2015). It’s not just a book club: A novel approach to prepare researchers for practice. In P. Preciado Babb, M. Takeuchi, & J. Lock (Eds.). Proceedings of the IDEAS: Designing Responsive Pedagogy Conference (pp. 53-61). da Rosa dos Santos, L., Seidel, J., & Lock, J. (2013). Integrating an LMS into field experience: An insightful experiment. In J. Herrington, A. Couros & V. Irvine (Eds.), Proceedings of EdMedia: World Conference on Educational Media and Technology 2013 (pp. 1927-1931). Davies, P., Schelly, C. L., & Spooner, C. (2013). Measuring the effectiveness of Universal Design for Learning intervention in postsecondary education. Journal of Postsecondary Education and Disability, 36(3), 195–220. Davis, T. (2015). Visual design for online learning. San Francisco, CA: Jossey-Bass. Engleman, M., & Schmidt, M. (2007). Testing an experimental universally designed learning unit in a graduate level online teacher education course. Journal of Online Learning and Teaching, 3(2), 112–132. He, W. (2013). Examining students online interaction in a live video streaming environment using data mining and text mining. Computers in Human Behavior, 29(1), 90–102. doi:10.1016/j.chb.2012.07.020 He, Y. (2014). Universal design for learning in an online teacher education course: Enhancing learners’ confidence to teach online. MERLOT Journal of Online Learning and Teaching, 10(2), 283–298. Johnson, C. (2013). Exploring effective online course design components. In T. Bastiaens & G. Marks (Eds.), Proceedings of World Conference on E-Learning in Corporate, Government, Healthcare, and Higher Education 2013 (pp. 1183-1188). Chesapeake, VA: AACE. Johnson, L., Adams Becker, S., Estrada, V., & Freeman, A. (2014). NMC Horizon Report: 2014 Higher Education Edition. Austin, TX: The New Media Consortium. Retrieved from http://cdn.nmc.org/ media/2014-nmc-horizon- report-he-EN-SC.pdf Knight, P. T., & Wilcox, S. (1998). Effectiveness and ethics in educational development: Changing contexts, changing notions. The International Journal for Academic Development, 3(2), 97–106. doi:10.1080/1360144980030202 Kumar, K., & Wideman, M. (2014). Accessible by design: Applying UDL principles in a first year undergraduate course. Canadian Journal of Higher Education, 44(1), 125–147.
Mangiatordi, A., & Serenelli, F. (2013). Universal design for learning: A meta-analytic review of 80 abstracts from peer reviewed journals. Research on Education and Media, 5(1), 109–113. McGee, P., & Reis, A. (2012). Blended course design: A synthesis of best practices. Journal of Asynchronous Learning Networks, 16(4), 7–22. McKenney, S., & Reeves, T. C. (2012). Conducting educational design research. New York, NY: Routledge. McQuiggan, C. A. (2012). Faculty development for online teaching as a catalyst for change. Journal of Asynchronous Learning Networks, 16(2), 27–61. Means, B., Bakia, M., & Murphy, R. (2014). Learning online: What research tells us about whether, when and how. New York, NY: Routledge. Meyer, A., Rose, D. H., & Gordon, D. (2014). Universal design for learning: Theory and practice. Wakefield, MA: CAST. National Center on Universal Design for Learning. (2014). About UDL. Retrieved from http://www.udlcenter.org/aboutudl/whatisudl Oztok, M., Zingaro, D., Makos, A., Brett, C., & Hewitt, J. (2015). Capitalizing on social presence: The relationship between social capital and social presence. The Internet and Higher Education, 26, 19–24. doi:10.1016/j.iheduc.2015.04.002 Palloff, R. M., & Pratt, K. (2005). Collaborating online: Learning together in community (Vol. 2). San Francisco, CA: Jossey-Bass. Palloff, R. M., & Pratt, K. (2011). The excellent online instructor: Strategies for professional development. San Francisco, CA: Jossey-Bass. Rao, K. (2012). Universal design for online courses: Addressing the needs of non-traditional learners. Proceedings of the 2012 IEEE International Conference on Technology Enhanced Education (ICTEE) (pp. 1-8). doi:10.1109/ICTEE.2012.6208664 Rao, K., Edelen-Smith, P., & Wailehua, C. U. (2015). Universal design for online courses: Applying principles to pedagogy. Open Learning: The Journal of Open, Distance and e-Learning, 30(1), 35–52. Rao, K., & Tanners, A. (2011). Curb cuts in cyberspace: Universal instructional design for online courses. Journal of Postsecondary Education and Disability, 24(3), 211–229. Rose, D., Harbour, W., Johnston, C., Daley, S., & Abarbanell, L. (2006). Universal Design for Learning in postsecondary education: Reflections on principles and their application. Journal of Postsecondary Education and Disability, 19(2), 17. Rudestam, K. J., & Schoenholtz-Read, J. (2010). The flourishing of adult online education: An overview. In K. J. Rudestam & J. Schoenholtz-Read (Eds.), Handbook of Online Learning (2nd ed.). Thousand Oaks, CA: Sage Publications. Scott, S. S., McGuire, J. M., & Shaw, S. F. (2003). Universal design for instruction: A new paradigm for adult instruction in postsecondary education. Remedial and Special Education, 24(6), 369–379. doi:10.1177/07419325030240060801
Smith, F. (2012). Analyzing a college course that adheres to the Universal Design for Learning (UDL) framework. Journal of the Scholarship of Teaching and Learning, 12(3), 31–61. Tu, C. H., & Corry, M. (2003). Designs, management tactics, and strategies in asynchronous learning discussions. The Quarterly Review of Distance Education, 4(3), 303-315. University of Arkansas at Little Rock. (n. d.). Ten simple steps toward universal design of online courses. Retrieved from http://ualr.edu/pace/tenstepsud/ Vestal, W. (2006). Sustaining communities of practice. KM World, 15(3), 8-40. Retrieved from http://www.kmworld.com/Articles/Editorial/Features/Sustaining-communities-of-practice-15159.aspx Vitale, A. T. (2010). Faculty development and mentorship using selected online asynchronous teaching strategies. Journal of Continuing Education in Nursing, 41(12), 549–556. doi:10.3928/00220124-20100802-02 PMID:20704095 Wenger, E. (1998). Communities of practice: Learning, meaning, and identity. Cambridge, UK: Cambridge University Press. doi:10.1017/CBO9780511803932 Wilcox, G., & Lock, J. (2014, October). Student perceptions of online practicum. In E-Learn: World Conference on E-Learning in Corporate, Government, Healthcare, and Higher Education (pp. 2059–2064).
KEY TERMS AND DEFINITIONS Coaching: A supportive learning experience wherein a coach has technological and pedagogical knowledge that is provided to the learner during a practical one-to-one teaching session. Community of Practice: A group of people who share a common interest or concern and improve their practice through regular interaction (Wenger, 1998). Educational Development: “All the work that is done systematically to help faculty members to do their best to foster student learning” (Knight & Wilcox, 1998, p. 98). Field Experience: Practicum placement in a K-12 school context with a mentor teacher. Online Learning: “[R]efers to a learner’s interaction with content and/or people via the Internet for the purpose of learning. The learning may be part of a formal course or program or simply something learners pursue for their own interests” (Means, Bakia, & Murphy, 2014, p. 6). Universal Design for Learning: “Universal design for learning (UDL) is a framework to improve and optimize teaching and learning for all people based on scientific insights into how humans learn” (CAST, 2015).
Chapter 11
Computer Simulation in Higher Education:
Affordances, Opportunities, and Outcomes Yufeng Qian Northeastern University, USA
ABSTRACT Computer simulation as both an instructional strategy and technology holds great potential to transform teaching and learning. However, there have been terminological ambiguity and typological inconsistency when the term computer simulation is used in the education setting. This chapter identifies three core components of computer simulation, and develops a learning outcome-based categorization, linking together computer simulation’s technical affordances, learning opportunities, and learning outcomes. Exemplary computer simulations in higher education are identified to illustrate the unique affordances, opportunities, and outcomes of each type of computer simulation.
INTRODUCTION Along with gaming, gamification, 3-D virtual reality, and 3-D modeling, computer simulation has recently become a buzzword in higher education and e-learning. In 2012, the New Media Consortium Horizon Report projected that games and simulations are on the two to three-year adoption horizon (New Media Consortium, 2012). Enabled by rapid advances in computer and Internet technologies and coupled with decreasing costs, simulation technology is widely incorporated into the science, technology, engineering, and mathematics curricula in higher education (D’Angelo et al., 2013). Healthcare and medical education, in particular, have made extensive use of computer simulation as routine instructional activities and tools across the curricula (Damassa & Sitko, 2010). There has also been growing interest in social sciences and humanities to use computer simulation to help college students understand and explore the complex systems and processes of social- and human-related disciplines (Gilbert & Troitzsch, 2005; Porter, Riley, & Ruffer, 2004). DOI: 10.4018/978-1-5225-1851-8.ch011
Computer simulation, both as an instructional strategy and a technology, holds great potential to transform teaching and learning. The educational basis of simulation lies in the notions of “real-world problems,” “contextual learning,” “situated learning,” “active learning,” and “deep learning.” A large body of literature condemns one-way lecture-based education as inefficient and ineffective in promoting higher levels of learning and helping learners transfer and apply knowledge and skills in real-world contexts (see for example, Clark, 2004; Jonassen, 1999; Wilson, 1996). In simulation, however, students are situated in real-life scenarios, encounter real-life problems or challenges, and are required to act, think, and solve problems like an expert. Simulation is a viable tool to engage students in “deep learning” that empowers higher-order thinking (such as analysis and creation) as opposed to “surface learning” that requires only memorization and comprehension. Indeed, computer simulation has demonstrated great effectiveness in engaging and motivating learners, in facilitating higher levels of learning that help learners transfer and apply knowledge and skills in real-world contexts (Damassa & Sitko, 2010; Tennyson & Jorczak, 2008), and most notably, in promoting 21st-century skills, including decision-making, critical thinking, problem solving, collaboration, effective communication, persistence, and learning to learn (Binkley et al., 2011). However, there has been terminological ambiguity regarding “game-based learning,” “3-D modeling,” and “computer simulation” in education, causing confusion and difficulty in differentiating one from the other in terms of their distinct educational affordances and instructional benefits. Similarly, categorizations of computer simulations vary significantly across disciplines, and efforts to identify a learning outcome-based typology are scarce. At the same time, there has been limited research exploring the unique learning opportunities enabled by computer simulation and its impact on learning outcomes in the affective, cognitive, and social domains. The purposes of this chapter, therefore, are to differentiate computer simulation from other similar technologies, identify its core attributes, categorize it based on learning outcomes, and explore the distinctive learning opportunities and benefits enabled by computer simulation.
Background

Computer simulation has been used interchangeably with other similar instructional technologies, such as “computer/digital games,” “augmented reality/virtual reality,” and “computer modeling.” These terms are often lumped together under “digital game-based learning” for convenient naming and classification purposes, but with apparent shifts in the distinctive and essential attributes of each. The computer game, used most frequently alongside simulation, differs greatly from simulation in a number of essential attributes. A computer game is characterized by a set of clearly stated goals, rules, and rewards, in which users need to reach higher levels in order to progress. In simulation, on the other hand, learners are more focused on a phenomenon, system, process, or activity, with the ultimate goal to solve a real-world problem (D’Angelo et al., 2013). As Prensky (2001) clarified, “simulations are not, in and of themselves games. In order to become games, they need additional structural elements - fun, play, rules, a goal, winning, competition, etc.” (p. 212). Similarly, augmented reality/virtual reality technology, which is most akin to simulation in its replication of real worlds, is characterized by an immersive and enhanced virtual environment, blended with reality (e.g., real-world spaces) and digital objects (e.g., images, videos, audios) (New Media Consortium, 2016). The distinction between augmented reality/virtual reality and simulation hinges on the underlying model. While an underlying model is not essential to augmented reality/virtual reality, a simulation must be structured to reveal or embed a model of a real-world phenomenon, environment, or activity, such as the biological evolutionary model and an organization-operating model.
Among all these interchangeable terms, modeling is the closest to simulation in terms of core structural elements. Using the same visualization technologies, both modeling and simulation are able to concretize abstract concepts, manifest overt or covert systems or processes, and react accordingly to users’ actions. This shared attribute of concretization and demonstration of an abstract system/process could mostly explain why “modeling and simulation” has been a widely adopted term when referring to both or either. However, going beyond a mere demonstration and manifestation of a concept/system/process, which is central to modeling, simulation is also characterized by replication of the complexity and nuances of the real world, and by opportunities for learners to explore, practice, act, and reflect individually and collaboratively, to solve real-life problems in the simulated environment. The important distinction is that simulation requires learners’ interaction with the underlying model and with the complexity of the simulated setting as well. Otherwise, it is considered modeling rather than a simulation. Closely related to the definitional ambiguity discussed above, views on the core components of simulation vary due to the different purposes and usages of simulation across disciplines. For example, David Gaba, a leader in the field of medical simulation, defined simulation as an instructional technique “to replace or amplify real experiences with guided experiences that evoke or replicate substantial aspects of the real world in a fully interactive manner” (Gaba, 2004, p. i2). He considered high fidelity simulation of the real world and guided real world experiences as the most important elements in health care simulations (Gaba, 2006). In contrast to the emphasis on high realism in replication of the real world and experiences, simulation in social sciences is viewed more from the perspective of modeling and process. Hartman (1996), in explicating the nature of simulation in natural and social sciences, maintained that “the basis of a simulation is a dynamic model that specifies – besides some static properties – assumptions about the time evolution of the considered object or system” (p. 98) and that “the most significant feature of a simulation is that it allows scientists to imitate one process by another process” (p. 77). Similarly, Maier and Grobler (2000) recognized “underlying model” as a major part of computer simulation, along with “the human-computer interface and various functionalities” (p. 140), with functionalities including all features that are not provided by the underlying model and the human-computer interface, such as degrees of transparency of the underlying model, and the time-steps within the simulation. Clear and precise terminology and typology are important conditions for understanding the work in a discipline. In view of the state of terminological ambiguity and typological inconsistency pertaining to computer simulation, it is imperative to identify the defining features of computer simulation along with its affordances. Furthermore, a typology that considers computer simulation’s distinct affordances and is built upon a learning theory will help in clarifying the learning opportunities provided by each type of computer simulation.
In order to address these issues, this chapter will first discuss the core components that are essential to computer simulation; a learning outcome-based categorization will then be presented, followed by illustrations of exemplary computer simulation programs in higher education for each category.
CORE COMPONENTS OF COMPUTER SIMULATION

What exactly constitutes a computer simulation? What are the core structural components of computer simulation that distinguish it from other similar instructional strategies and technologies, such as “computer games,” “computer modeling,” and “augmented/virtual reality?” This section will present
and discuss three core elements that are essential to computer simulation across a broad spectrum of disciplines and fields. First, representation of real life. As discussed previously, computer simulation replicates real life; therefore, what is essential to simulation is the reproduction of the essential features of a real-life situation, event, system, or entity. To be effective in achieving the instructional goals of the simulation strategy, many researchers contend that it is essential to maintain the complexity and nuances of real life in computer simulation (Gaba, 2004). Simulation-based learning is based on the notion of “contextualized learning.” Contextual learning theory assumes that meaningful learning happens when learners process new information and acquire knowledge and skills in a context that is authentic and rich with real-life complexity. In such an environment, learners understand concepts in a context, discover meaningful relationships between concepts and their practical applications through self-discovery, and make decisions and solve problems that are meaningful in a given situation, circumstance, and environment. It is therefore important for computer simulation to incorporate as many aspects of the learning environment as possible – physical, temporal, social, cultural, and psychological – so that learning and learning outcomes are contextualized, and thus meaningful to learners (Clark, 2004). Fidelity as a notion is closely related to reproduction and representation of real life. Fidelity in simulation refers to the degree of realism to which the simulation replicates reality. As an important consideration in simulation, in healthcare and medical education in particular, many computer simulations are categorized as “high,” “moderate,” or “low” fidelity depending upon the degree of realism (i.e., authenticity) (Gaba, 2004; Hardman, 2007). Offering the highest level of realism in both environment and experience, high-fidelity simulation allows user interactions with the simulator on a variety of bases (e.g., physical, verbal), and the simulator responds immediately and accurately to the user’s action and performance (e.g., a full-scale computerized mannequin), just as in a real setting. Moderate-fidelity simulation is a step down in terms of realism and user-computer interaction; it is, however, useful in helping learners develop a deeper understanding of concepts and processes (e.g., a cardiology simulator with pulse, heart, and breathing sounds but without chest or eye movement or the ability to talk). Low-fidelity simulation is static and highly focused on a single skill, and the purpose of such a program is to facilitate the learner’s practice and acquisition of a single psychomotor skill, where full reproduction of a real-life environment is unnecessary given the instructional goal of such programs (e.g., an intravenous insertion arm) (Al-Elq, 2010; Hardman, 2007). Fidelity is an important consideration in designing and choosing computer simulation for a specific learning task. High fidelity, however, is not always superior or necessary because the effect of fidelity on learning depends upon the learning task, the levels of learning, and the learner’s level (Lewis, Strachan, & Smith, 2012).
For example, for a novice healthcare student to acquire introductory concepts and basic psychomotor skills, a simple low-fidelity simulation program, such as a clinical vignette, would work more effectively than a complex training aid with a fully simulated environment (Munshi, Lababidi, & Alyousef, 2015). Therefore, fidelity in simulations is best utilized when it is aligned with instructional goals and learning outcomes. Second, conceptual model. The second structural component, which substantially differentiates simulation from other instructional strategies and technologies, is the formal specification of a conceptual model with which learners interact to learn about relationships between concepts, observe the part-whole relationship, and learn the consequences of their actions (Jacobs & Dempsey, 1993). A conceptual model is typically defined as “a physical, mathematical, or otherwise logical representation of a system, entity, phenomenon, or process” (Department of Defense, 1998). Models, in general, fall into two broad categories – physical versus abstract (de Jong, 2005). A physical model is a concrete representation of
an object, system, which could be small or big, visible or invisible, tangible or intangible. Common examples of physical models include the solar system, an atom, and human anatomy. The main purpose of physical models is for demonstration and illustration. In contrast to physical models, abstract models, as the name suggests, are abstract representation of a system, or an entity, such as an economics model (e.g., Heckscher-Ohlin-Samuelson International Trade Model), a public health model (e.g., PRECEDEPROCEED Model for cost-benefit evaluation). The purpose of abstract models is to reveal and depict the logical relationships between parts and the whole and the interconnection among the components of the system/entity. In addition, some types of abstract models (e.g., mathematical models) enable prediction of the effect of changes to the system. Such executable models allow learners to manipulate adjustable parameters and observe the effects/consequences of their input and action on the system/entity. Similar to the consideration of fidelity in reproduction of real life, the sophistication level of modeling should depend upon the learning task and students’ level. For basic introductory knowledge and entry level skills for novice learners in a discipline, the use of descriptive physical modeling mainly for demonstration and illustration purposes is most appropriate; conversely, an abstract model that is analytical and executable will work most effectively if the task involves higher levels of learning, thinking, such as critical thinking, decision making, and teamwork, in observing a real life system/entity and solving a real world problem. Third, learner action/interaction and immediate feedback. While the conceptual model is essential to any computer simulation programs, without learner action/interaction with the model situated in a real life context/environment, such a program can only be considered computer modeling. Indeed, central to the computer simulated learning environment are learner activities – role-playing, exploring, experimenting, collaborating, problem solving, and decision making – all require engaged, active, and higher level learning and thinking, which should be “critical, logical, reflective, metacognitive, and creative” (King, Goodson, & Rohani, n. d.) In response to learners’ input/action, a simulation program should provide instant feedback so that there is an interactive flow between the learner and the system, with the interaction leading to positive consequences for learners (Csikszentmihalyi, 1990; Evans, Kersh, & Kontanien, 2004). Many researchers maintain that the feedback must be immediate, contextual, and specific (Kernan & Lord, 1990; Norman & Schmidt, 1992; Prensky, 2001). Kirriemuir (2002) argued that immediate feedback could help stimulate learners’ curiosity and motivate their exploration and experimentation of the learning environment; Baer (2005) emphasized the important role of instant and contextual feedback in reducing uncertainty and enhancing learning.
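To make these three components concrete, the following minimal sketch (in Python) pairs a simple underlying model, discrete logistic population growth, with learner-adjustable parameters and immediate feedback on each run. It is an illustrative sketch only: the model, the parameter names, and the feedback format are assumptions of this example and are not drawn from any of the simulation programs cited in this chapter.

```python
# Minimal sketch of a computer simulation's three core components:
# (1) an underlying conceptual model, (2) learner action on adjustable
# parameters, and (3) immediate feedback after each action.
# All names and values here are hypothetical, for illustration only.

def logistic_growth(population, growth_rate, capacity, years):
    """Underlying model: discrete logistic growth of a population."""
    for _ in range(years):
        population += growth_rate * population * (1 - population / capacity)
    return population

def run_trial(prediction, growth_rate):
    """Learner action: pick a growth rate and predict the final population."""
    outcome = logistic_growth(population=50.0, growth_rate=growth_rate,
                              capacity=1000.0, years=20)
    # Immediate feedback: compare the learner's prediction with the model's output.
    print(f"growth_rate={growth_rate}: final population {outcome:.1f} "
          f"(your prediction was off by {abs(prediction - outcome):.1f})")

# A learner experiments with two parameter settings and sees the consequences.
run_trial(prediction=400, growth_rate=0.1)
run_trial(prediction=900, growth_rate=0.4)
```

Read against the components above, the growth function encodes the underlying conceptual model, the adjustable growth rate and the learner's prediction are the points of learner action, and the printed comparison supplies the immediate, contextual feedback that the cited authors emphasize.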
LEARNING OUTCOME-BASED CATEGORIZATION

As more and more computer simulations are being used in higher education, several categorization systems have emerged, which are predominantly discipline dependent. In medical and healthcare education, for example, computer simulations are categorized into high-, moderate-, and low-fidelity based on the level of accuracy and precision with which the simulators or systems actually reproduce reality (Gaba, 2004). As discussed previously, fidelity is a critical component and an important consideration in planning, designing, and choosing computer simulation in healthcare and medical education. In computer science, engineering, and other similar disciplines where systems design plays a crucial role, the categorization of discrete event versus continuous system, based upon times and steps, is most widely adopted. Discrete event simulators, such as simulations of call centers and shipping services, simulate systems that move in
discrete times and steps. In contrast, a continuous simulator reflects a moving and continuously evolving system using continuous equations, such as climate modeling of polar ice cap melting (GoldSim, n. d.; McHaney, 1991). Since the use of computer simulation in social sciences is relatively new (Gilbert & Troitzsch, 2005), efforts to form categorization systems are scarce; therefore, computer simulation in social sciences is often labeled based upon purposes and usages, such as business strategy simulation, role-playing simulation, and agent-based social simulation. Obviously, current categorization systems of computer simulation, varying greatly between disciplines and in purpose, provide diverse explanations and expectations for learning. A categorization system linking cognitive levels, learning outcomes, and distinctive attributes is, therefore, much needed. Such a categorization, which is learning outcome oriented and focused on levels of thinking and learning, will provide educators with a useful roadmap in planning the integration of simulation programs in their curriculum. Such a learning outcome-based taxonomy will help educators better align the educational affordances of a simulation to the desired learning outcomes. This section will present and discuss a taxonomy of computer simulation based on the cognitive levels learners are engaged in. The theoretical basis of the taxonomy draws from Bloom’s Taxonomy of Educational Objectives and later revised work. Exemplary simulation programs in higher education will be used to illustrate this taxonomy in the following section. In 1956, working with a group of educational psychologists, Benjamin Bloom developed a classification of levels of intellectual behavior important in learning. Bloom identified six levels within the cognitive domain, from the simple recall or recognition of facts, the lowest level, through increasingly more complex and abstract mental levels, to the highest order, which is classified as evaluation. In the mid-nineties, Lorin Anderson (a former student of Bloom) and David Krathwohl revised the taxonomy by reversing the original top two cognitive levels (level 5 - synthesis, level 6 - evaluation), considering “creating” (i.e., synthesis) as the highest level of learning, building upon lower levels of skills, including “evaluating” (i.e., evaluation), “analyzing” (i.e., analysis), “applying” (i.e., application), “understanding” (i.e., comprehension), and “remembering” (i.e., knowledge). In addition, building upon Bloom’s original three levels of knowledge – factual, conceptual, and procedural – Anderson et al. (2001) identified and added a fourth level, metacognitive, to the framework. Bloom’s Revised Taxonomy has become a widely used conceptual framework, guiding the design, development, and evaluation of teaching and learning in education and training. Modeling-based simulation. As discussed previously, “modeling” and “simulation” are often used interchangeably, especially in the fields of science, technology, engineering, and math, with both referring to a computer program that displays a static or an animated model that represents a realistic/visible or abstract/invisible entity, system, or phenomenon. Currently, most models are 3D computer graphics made to represent a three-dimensional object through the use of specialized software. Scientists, engineers, and science educators use models to concretize, simplify, and clarify abstract concepts, as well as to develop and explain theories, phenomena, and rules.
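Returning briefly to the discrete-event versus continuous distinction introduced above, the contrast can be sketched in a few lines of Python. Both toy systems below (a single-agent call desk and a cooling object) are hypothetical illustrations, not the call-center or climate simulators referenced in the literature.

```python
import random

# Discrete-event view: the system changes state only at distinct event times,
# here the arrivals and completions of calls at a single-agent call desk.
def discrete_event_calls(n_calls, seed=0):
    rng = random.Random(seed)
    clock, busy_until, waits = 0.0, 0.0, []
    for _ in range(n_calls):
        clock += rng.expovariate(1 / 3.0)               # next arrival: the clock jumps
        start = max(clock, busy_until)                  # callers wait if the agent is busy
        waits.append(start - clock)
        busy_until = start + rng.expovariate(1 / 2.5)   # service time for this call
    return sum(waits) / n_calls

# Continuous view: the system evolves according to a differential equation,
# here Newton's law of cooling integrated with small, uniform time steps.
def continuous_cooling(temp, ambient, k, hours, dt=0.01):
    for _ in range(int(hours / dt)):
        temp += -k * (temp - ambient) * dt              # dT/dt = -k * (T - ambient)
    return temp

print("mean caller wait:", round(discrete_event_calls(1000), 2), "minutes")
print("temperature after 2 hours:", round(continuous_cooling(90.0, 20.0, 0.8, 2.0), 1), "degrees")
```

The first model's clock jumps from one event to the next, while the second advances through uniform time steps of a continuous equation; this structural difference underlies the discrete/continuous categorization used in computer science and engineering.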
The major instructional role of modeling-based simulation is demonstration and illustration, the purpose of which is to facilitate learners' understanding and comprehension of a concept, principle, process, system, or phenomenon in a content area. Remembering and comprehending, the bottom two levels of cognitive activity, lay the foundation for higher levels of thinking and learning. With its focus on demonstration and comprehension, modeling-based simulation is therefore both fundamental and critical to other types of simulation, in which learning tasks demand higher cognitive levels.
An important instructional affordance of modeling simulation is its contribution to the visualization of invisible, abstract, un-replicable, unreachable, complex, and/or difficult concepts, processes, or systems that are otherwise impossible to see and understand (Chan & Black, 2005; Landriscina, 2013). A widely cited example is the use of concrete molecular models to illustrate abstract and invisible concepts and phenomena in chemistry (Peterson, 1970). In addition to visual manifestation of the abstract, computer modeling is dynamic in nature and can display the change in and evolution of an event or phenomenon over time. For example, in social studies, students can observe human development across centuries or even millennia with a modeling program (Abdollahian, Yang, Coan, & Yesilada, 2013); and in earth science, where many geological processes operate over timescales much longer than human lifetimes, modeling simulation can animate how landforms are created and developed (Houlding, 1994). Furthermore, for these and other time-dependent topics, dynamic modeling simulations allow learners to manipulate the display (e.g., compress time, pause, or slow the process), a critical affordance that helps students build correct and complete understandings of the topic through visual access to time-dependent events (Bell, Juersivich, Hammond, & Bell, 2012). Along with the dynamic affordance, the interactive affordance of modeling allows learners to explore the model by changing its variables. In exploring probability, for example, statistics students can use virtual manipulatives (e.g., spinners) to conduct probability experiments and learn to make predictions (Beck & Huse, 2007). The interactive affordance enables learner-centered inquiry, with the added advantage of allowing a personalized problem-solving process (Bell, Juersivich, Hammond, & Bell, 2012). The unique affordances of modeling simulation - visualization, dynamism, and interactivity - effectively facilitate learners' comprehension and enable them to build a correct and complete mental model of an event or phenomenon, precluding possible misconceptions and misunderstandings. Modeling-based simulation is a useful tool in many disciplines, and it is especially essential in science, technology, engineering, and math fields, in which learners' understanding and mastery of a set of core models in a discipline are essential to advanced study.

Task/skill-based simulation. In task/skill-based simulation, learners are required to demonstrate mastery of a set of skills by successfully completing a task. All fields, disciplines, subject areas, and specialties come with a set of core competencies and skills associated with performing specific tasks. For example, students in science classes are often trained to develop the basic science process skills of scientific inquiry - observing, questioning, hypothesizing, predicting, investigating, interpreting, and communicating - each containing more specific sub-skills and techniques (Harlen & Jelly, 1997). In surgical skills education, Reznick and McCrae (2006) maintained that complex skills can be understood, practiced, and fully developed in sequential stages. Using knot tying as an example, they explicated how three-stage training (i.e., cognitive, associative, and autonomous) facilitates trainees' progression in understanding (comprehension), performing (mastery), and integrating (automatic application) the knot-tying skill.
Task/skill-based simulation therefore goes beyond understanding/comprehension of a skill set and instead requires a learner's mastery, proficiency, and automaticity in performing a task. The most compelling instructional affordance of task/skill-based simulation is its ability to provide accessible opportunities and safe environments that enable learners' repeated practice of skill sets. In many disciplines there are situations where the scenarios are rare and the tasks are dangerous, such as performing surgical procedures with rare complications or flying an aircraft through dangerous weather or traffic (McClusky & Smith, 2008). Gaba (1999) suggested that the most significant contribution of computer simulation to healthcare education is the opportunity for students to acquire proficiency, increase fluency, and ultimately achieve automaticity through reusable scenarios and repeated practice,
without the risk of endangering patients. The accessibility and safety features of simulation programs are also recognized in other disciplines, such as aviation, where task/skill-based simulation has been used for many years with great success (Sultan, Corless, & Skelton, 2000). Task/skill-based simulation is most commonly used in high-risk disciplines, such as medicine, healthcare, aviation, and shipping, and has been routinely incorporated into their curricula (Rolfe & Hampson, 2003; Yang, Yang, & He, 2001). As both an instructional strategy and an instructional tool, task/skill-based simulation has proven effective in these disciplines in facilitating students' acquisition and practice of cognitive and psychomotor skills and procedures in a time-independent and risk-free manner.

Problem-solving and decision-making simulation. Situated in a simulated learning environment that represents real-life complexity and uncertainty, learners in problem-solving and decision-making simulation are required to solve a real-life problem and make decisions that demand the highest levels of thinking, such as analysis, evaluation, and synthesis, which sit at the top of Bloom's taxonomy of learning. Encompassing all core components of computer simulation, problem-solving and decision-making simulation provides a learning environment that represents the complexity of real life, includes the underlying model of an entity or a phenomenon, and allows learners to interact (e.g., explore, experiment, reflect) with the environment and with the model in order to work out a solution and make a decision that is most meaningful and viable in a given complex situation. For example, decision making in public policy and management normally involves a considerable number of factors relating to our social systems. The interconnectedness of these systems and the uncertainty in their behaviors add further complexity and challenge to the decision-making process. To teach students about the dynamic complexity of public administration and public policy decision-making, the use of computer models that represent the social systems and their interconnectedness is essential. These dynamic and realistic models, along with the rich information in the simulation, allow students to experience the real world, test their decisions and learn their consequences in a risk-free environment, and gain skills in strategic thinking and best practices in public administration and decision making (Ku, MacDonald, Andersen, Andersen, & Deegan, 2006).

The primary affordance of this type of simulation is its authenticity - the authenticity of the context and of the learning task, which require learners' real-life skills in solving a problem and making a decision. This type of simulation is especially useful when the consequences of a proposed solution or decision simply cannot manifest or be observed immediately (i.e., in time-, space-, or discipline-constrained instructional situations). In business, for example, the impact of a decision usually takes months or years to manifest. Enabled by the technical capabilities of computer simulation, however, learners can get instant feedback and visualization of the results of their decision-making in such learning environments (GoldSim, n. d.). Problem-solving and decision-making are critical core competencies in many professions, such as business management, educational administration, the legal system, and public services.
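To make the idea of compressed time and instant feedback concrete, the short Python sketch below runs a deliberately toy model of a business decision: a price and a marketing budget are projected over several simulated years, and the cumulative result is returned immediately. The function, parameter names, and coefficients are all illustrative assumptions and do not represent any simulation program discussed in this chapter.

```python
# Illustrative sketch only: a toy decision model, not any specific program's
# implementation. It compresses several simulated "years" of consequences of a
# single business decision into an instant projection.

def project_outcome(price, marketing_budget, years=5):
    """Project cumulative profit for a pricing/marketing decision (hypothetical model)."""
    customers = 1000.0          # assumed starting customer base
    cumulative_profit = 0.0
    for _ in range(years):
        # Hypothetical dynamics: higher prices shrink demand, marketing grows it.
        growth = 0.05 + 0.000002 * marketing_budget - 0.004 * (price - 20)
        customers *= (1 + growth)
        revenue = customers * price
        cumulative_profit += revenue - marketing_budget - customers * 12  # 12 = assumed unit cost
    return round(cumulative_profit, 2)

# A learner can "test" two decisions and see multi-year consequences instantly.
print(project_outcome(price=25, marketing_budget=50_000))
print(project_outcome(price=35, marketing_budget=20_000))
```

The point of such a sketch is not the particular numbers but the loop itself: each decision is run against an underlying model, and the delayed real-world consequences are returned to the learner without delay.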
While relatively new, the use of problem-solving and decision-making computer simulation in the social sciences is gaining momentum. The educational affordances enabled in this type of simulation provide an ideal learning environment for students in the social sciences and humanities to practice the problem-solving and decision-making skills required in real-life work settings.
EXEMPLARY COMPUTER SIMULATIONS IN HIGHER EDUCATION

Modeling-based Simulation - PhET Sims

The PhET Interactive Simulations project at the University of Colorado Boulder is among the earliest efforts to promote math and science education using computer simulation. Since its inception in 2002, PhET has created over 130 interactive modeling-based simulations for college and K-12 students on topics in physics, biology, chemistry, earth science, and math, all available online for free (Moore, 2015). The driving goal of PhET is to engage students in an animated, interactive, game-like environment and to encourage science learning through exploration and discovery. PhET simulations are widely used across K-12 and college levels, and are highly regarded for their quality and impact (Moore, Chamberlain, Parson, & Perkins, 2014).
Design Features

Paramount in each PhET simulation is the model, the design of which is interactive, dynamic, and multi-represented. In each model, students can manipulate key parameters and see the resulting change immediately, in both pictorial and symbolic forms. For example, in "Concentration," a simulation program for an introductory undergraduate chemistry course, students can add or remove solute, modify its molarity, and observe the solution's color change immediately in the model. The effect is visualized in both the color change and the equation. The interactivity and immediate feedback of the model effectively visualize the relationships between solute, concentration, and solution color (Carpenter, Moore, & Perkins, 2015).

To take the model's power to a higher level, the most recently developed PhET simulations have incorporated "pedagogically useful actions" - actions that lead to insights that are otherwise impossible to achieve in the real world. "Molecule Polarity" is one such modeling program, addressing bond dipole and molecule polarity topics in undergraduate chemistry courses. The simulation sequences topic complexity through three tabs. The first, "Two Atoms," presents a generic two-atom molecule and allows students to rotate the molecule and change the electronegativity of each atom, and thus to view and draw conclusions regarding the relationship among electronegativity, bond dipole, and dipole representations. In the "Three Atoms" tab, students can not only manipulate a generic three-atom molecule as they can in the "Two Atoms" tab, but also drag atoms to change the bond angle, an action that supports students in exploring and discovering the relationship between bond dipoles and molecule dipoles, a topic that is usually challenging for students to visualize and understand in classroom or lab settings. Furthermore, to connect the concepts and relationships manifested in the previous two tabs to real-world examples, the third "Real Molecule" tab provides a list of 19 real molecules that students can compare and contrast to determine trends in molecule polarity and geometry (Moore, Chamberlain, Parson, & Perkins, 2014).
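As a rough illustration of the parameter-to-feedback mapping described for the "Concentration" example, the following Python sketch maps solute amount and solution volume to a molarity and a color intensity that a display could render. It is a simplified, assumption-based model, not PhET's actual implementation; the saturation limit and the linear color mapping are invented purely for illustration.

```python
# Minimal sketch of a parameter-to-feedback mapping for a concentration model.
# NOT PhET's code; the model, names, and saturation value are assumptions.

SATURATION_MOLARITY = 5.0  # assumed solubility limit, mol/L

def concentration_feedback(moles_solute, liters_solution):
    """Return molarity and a 0-1 'color intensity' a display could render."""
    molarity = moles_solute / liters_solution
    dissolved = min(molarity, SATURATION_MOLARITY)       # excess solute stays solid
    color_intensity = dissolved / SATURATION_MOLARITY    # darker color = higher concentration
    return molarity, color_intensity

# Each change to a parameter immediately yields new symbolic (molarity) and
# pictorial (color intensity) feedback, mirroring the kind of dual
# representation described above.
print(concentration_feedback(moles_solute=0.5, liters_solution=1.0))
print(concentration_feedback(moles_solute=0.5, liters_solution=0.2))
```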
Engaged Exploration is a critical instructional benefit enabled by the interactive modeling in PhET simulations. Engaged exploration is a self-initiated, self-directed exploration and discovery process driven primarily by students' own questioning as they interact with the model. The immediate feedback about the effect
they create when changing the parameters in the model supports students' curiosity and allows them to ask and answer their own questions as scientists do in the real world (Podolefsky, Perkins, & Adams, 2010). To reinforce this open, self-driven exploration and discovery during the learning process, the feedback in PhET simulations is characterized by implicit scaffolding rather than explicit guidance, aiming to support students' ownership of their learning (Paul, Podolefsky, & Perkins, 2012; Perkins & Moore, 2014; Podolefsky, Moore, & Perkins, 2013). Implicit scaffolding is accomplished through careful consideration and use of technological affordances and constraints to focus the scope of the learning goals and tasks, sequencing of concept and model complexity through sequential tabs, and cuing and feedback that keep students' choices on track and ensure productive learning. With minimal explicit guidance, students approach the topic, explore its complexity, and discover the concepts and relationships within a phenomenon much as scientists do science in the real world (Perkins & Moore, 2014).
Effects on Learning Outcomes

The effects and instructional potential of PhET simulations have been examined extensively by PhET's research team over the past decade. Researchers have found overwhelming evidence that modeling-based simulation is a viable learning tool and can provide a powerful learning environment for achieving educational goals in both the affective and cognitive domains.

As discussed earlier, engagement and engaged exploration, one of the underlying design principles of PhET simulations, is consistently evident in interview data from students using various simulations (Adams, 2008; Hensberry, Moore, & Perkins, 2015). In addition, when students are truly engaged emotionally and cognitively, unexpected but inspiring learning outcomes can emerge. For example, McKagan, Handley, Perkins, and Wieman (2008) found that students "start asking many questions that are beyond the anticipated scope of the class, and sometimes even beyond the scope of knowledge of the instructors, including one Nobel laureate" (p. 5). Adams (2008) observed that engagement and engaged exploration happened only when students' interaction with the simulation was driven and directed by their own questioning and curiosity. In looking deeper into the relationship between student engagement and levels of scaffolding, Chamberlain, Lancaster, Parson, and Perkins (2014) revealed that students' engagement with a simulation was significantly affected by the designed level of guidance. The use of "exploratory" features decreased significantly when students were provided more guidance, with students given "light guidance" exploring and attending to their simulation interactions the most. The authors therefore recommended a scaffolding design with "initially light guidance promoting exploration and gradually increasing toward specific learning goals" (p. 637). Furthermore, later topic-specific prompts also need to avoid direct and specific instruction on simulation use; guidance should instead "focus exploration in ways that continue to engage students' investigative abilities with the simulation" (p. 637).

In addition to the affective effects (i.e., engagement and engaged exploration), PhET simulations are also a powerful tool and effective learning environment for achieving cognitive learning outcomes, such as conceptual understanding of the content and scientific inquiry skills. In examining the effect of the "Circuit Construction Kit," a modeling simulation for an undergraduate introductory physics course, Keller, Finkelstein, Perkins, and Pollock (2006) found that students using the simulation demonstrated a better immediate understanding of direct current circuits than those using real laboratory equipment. Similarly, in an interactive simulation on the photoelectric effect, where students are allowed to control light intensity, wavelength, and voltage and see the immediate
results of their changes, approximately 85% of students demonstrated competency in prediction, a significantly higher percentage than in two other classroom-based instructional conditions (McKagan, Handley, Perkins, & Wieman, 2008). PhET simulations, featuring engaged exploration, interactive modeling, and implicit scaffolding, provide an ideal platform for practicing and acquiring the scientific inquiry process and skills that are defined as key competencies in U.S. science education by the National Research Council (1996). In chemistry, for example, balancing chemical equations is a foundational but challenging skill to teach in a traditional classroom setting, where direct and explicit instruction is normally followed by drill-and-practice. The literature has shown that this is not the way experts developed this skill (Carpenter, Moore, & Perkins, 2015). "Balancing Chemical Equations," a simulation designed for an undergraduate general chemistry class, aims to facilitate students' acquisition of this concept and skill through the scientific inquiry process with implicit guidance. The interactive model in the simulation thus plays a key role in supporting students' inquiry into equation balancing. It was designed to display both symbolic and pictorial representations of the atoms and molecules in the chemical equation, as well as to cue students to the interactive relationship between atoms and molecules, with additional molecules appearing or disappearing as coefficients are changed. Novice learners' experimentation with the model, and their subsequent understanding of equation balancing, are thus guided without the students feeling guided, enabling both an engaged inquiry process and productive inquiry-based learning (Carpenter, Moore, & Perkins, 2015).
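The kind of check that underlies an equation-balancing activity can be sketched in a few lines of Python: count the atoms of each element on both sides for a given set of coefficients and compare the totals. The sketch below is not PhET's implementation; chemical formulas are represented as simple element-count dictionaries purely for illustration.

```python
# Illustrative sketch of the check a balancing activity relies on:
# counting atoms of each element on both sides of an equation.

from collections import Counter

def side_atoms(species):
    """Total atom counts for one side: list of (coefficient, element-count dict)."""
    total = Counter()
    for coeff, formula in species:
        for element, count in formula.items():
            total[element] += coeff * count
    return total

def is_balanced(reactants, products):
    return side_atoms(reactants) == side_atoms(products)

# CH4 + 2 O2 -> CO2 + 2 H2O
reactants = [(1, {"C": 1, "H": 4}), (2, {"O": 2})]
products = [(1, {"C": 1, "O": 2}), (2, {"H": 2, "O": 1})]
print(is_balanced(reactants, products))   # True

# With a wrong coefficient, the atom counts no longer match.
print(is_balanced([(1, {"C": 1, "H": 4}), (1, {"O": 2})], products))  # False
```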
Using PhET Simulations in the Classroom

Since its inception in 2002, the PhET Sims Project has delivered over 315 million simulations worldwide (PhET, 2016a). Because PhET simulations are mostly modeling-based with minimal text, they have proven to be a flexible and effective tool in various instructional settings and can be used in many ways (PhET, 2016b). In college lecture settings, simulations have proven to be effective tools for the visual demonstration of objects, concepts, relationships, or systems, whether visible or invisible, static or dynamic. Further, instructors can adjust the simulations to instructional needs, for example by slowing down, speeding up, pausing, or repeating a demonstration. PhET simulations can also be used to facilitate whole-class inquiry, with instructors creating a scenario in the simulation and asking students to make predictions, eliciting "what-if" questions and reasoning, and conducting experiments in the simulation to settle debates (PhET, n.d.). Outside the classroom, PhET simulations can be used for inquiry-style homework - homework that uses open conceptual questions that encourage scientist-like exploration, sense-making and reasoning, and making connections to real-world experiences. Similar to the role of an instructor in a guided inquiry science class, the implicit scaffolding embedded throughout the simulation cues students at various stages of the inquiry process and leads to productive scientific inquiry and learning (PhET, n.d.). In addition to these three general ways of using PhET simulations, many specific ideas for integrating the simulations into the curriculum can be found on the PhET Project's "Tips and Resources" web page, which is constantly updated by the PhET team (PhET, 2016c).
Task/skill-based Simulation - Simulation Centers at U.S. Medical Schools

Task/skill-based computer simulations have been widely and routinely used in high-risk disciplines, including healthcare. The most recent report on U.S. medical school curricula from the Association of American Medical Colleges (2014) revealed that the majority of medical schools (136 of the 140 participating in the 2013-2014 survey) have simulation centers. The most well known include Harvard Medical School's Center for Medical Simulation, Stanford Medical School's Center for Immersive & Simulation-based Learning, Johns Hopkins Medicine's Simulation Center, Washington University School of Medicine's Simulation Centers, and the George Washington School of Medicine and Health Sciences' Clinical Learning & Simulation Skills Center. While there is some variability in how the simulation centers are used, there is great consistency across the centers in the types of simulation they support. The four major simulation types commonly used in healthcare education are standardized patients acting in realistic roles, part-task physical trainers, computer-based simulations, and computer-driven mannequin-based simulations, with the latter two supporting students' task/skill acquisition using computer technology.
Design Features

In healthcare, computer simulations are typically categorized by fidelity - the degree of precision, accuracy, and authenticity with which a simulation replicates reality (Alessi, 2000). Since task/skill-based simulations normally require learners to perform a specific task and to demonstrate the skill set required to perform it successfully, high fidelity is an important consideration in task/skill-based simulations. For example, George Washington's simulation center provides high-fidelity patient simulators that are "programmed with patient characteristics, such as age and anatomy, as well as physiological conditions, such as pulse and other vital signs, and disease processes" (George Washington's CLASS Center, n. d.). To show the effect of students' actions, the simulated patient can respond to various drugs and interventions by displaying specific vital-sign measurements (e.g., pulse rate, respiration rate, blood pressure). Similarly, Washington University's Clinical Simulation Center (n. d.) provides full-scale electromechanical mannequins for students to practice essential surgical skills and procedures needed in the emergency room: "The mannequins are hooked up to a heart monitor and can breathe, sigh, blink and simulate conditions ranging from a collapsed lung or a heart murmur to an acute heart attack." In addition, the computerized mannequin can respond accurately to treatment. The high level of realism in patient symptoms and reactions to students' actions provides real-life situations and risk-free environments for students to practice and acquire essential medical skills, such as the surgical procedures and skills required when working with endoscopy and laparoscopic simulators.
Immediate Feedback (i.e., responses from the simulator to students' actions) is another crucial feature of task/skill-based simulation. As noted above, in high-fidelity computerized mannequin-based simulations, the mannequin is programmed to respond appropriately to treatment, for example, showing the effect of a specific drug dose or the patient's response to "shocking the heart." In Stanford's desktop-based simulations of a patient, students can "meet" the patient and "see" his or her body parts via animation, drawings, or video. Students practice diagnostic or therapeutic procedures by interacting with the patient (e.g.,
asking questions, making comments) and by viewing and analyzing data from labs or x-rays. Students see the effect of their diagnosis and actions on the simulated patient immediately on the computer screen, for example, the patient's condition improving or deteriorating, the patient being stuck with needles, a serious life-threatening condition developing, or the patient dying. As in many high-risk professions, the skills needed to perform a medical task or procedure are best acquired through hands-on practice and experience. High-fidelity simulation of the physical, environmental, and psychological aspects of the task, along with immediate feedback, provides a risk-free environment and an effective approach for enabling students' mastery of essential and critical medical skills.
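A deliberately simplified Python sketch can illustrate the programmed stimulus-response loop described here: a hypothetical patient object updates its vital signs in response to a learner's intervention and reports them immediately. The starting vitals, the fluid-bolus response curve, and the caps are invented for illustration only and do not reflect any real simulator's physiology engine or any clinical guidance.

```python
# Hypothetical, highly simplified vital-signs model, sketched only to
# illustrate immediate, programmed responses to an intervention.

class SimulatedPatient:
    def __init__(self, heart_rate=130, systolic_bp=85):
        self.heart_rate = heart_rate      # beats per minute (tachycardic start, assumed)
        self.systolic_bp = systolic_bp    # mmHg (hypotensive start, assumed)

    def give_fluid_bolus(self, milliliters):
        # Assumed response curve: fluids raise blood pressure (capped) and lower heart rate.
        self.systolic_bp = min(120, self.systolic_bp + milliliters / 50)
        self.heart_rate = max(70, self.heart_rate - milliliters / 100)

    def vitals(self):
        return {"HR": round(self.heart_rate), "SBP": round(self.systolic_bp)}

patient = SimulatedPatient()
print(patient.vitals())          # baseline vitals shown to the learner
patient.give_fluid_bolus(500)    # learner's intervention
print(patient.vitals())          # immediate feedback on its effect
```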
Effects on Learning Outcomes

The effects of task/skill-based computer simulation on student learning outcomes have been well documented across a variety of domains in healthcare education. Numerous individual studies have shown that simulation-based curricula are effective in improving students' confidence and teamwork skills and, most importantly, in achieving proficiency in clinical skills (Luna, Aranhan, & Spite, n. d.). A comprehensive meta-analysis of 609 studies on technology-enhanced simulation for health professional education concluded that simulations are consistently associated with significant effects on learning outcomes in knowledge, skills, and behaviors (Cook et al., 2011). Meta-analyses have also been conducted in specific areas of healthcare. For example, in patient safety, a systematic review of 38 studies revealed that simulation interventions improved the technical and procedural performance of both individual clinicians and teams (Schmidt, Goldhaber-Fiebert, Ho, & McDonald, 2013).

Going beyond mere mastery of the medical skills needed to complete a task, recent research in healthcare education has started to explore the use of computer simulations in achieving automaticity (Poolton et al., 2016; Stefanidis, Scerbo, Montero, Acker, & Smith, 2012). As Stefanidis (2013) observed, while most current simulation curricula, which are proficiency-based, have been effective in helping students achieve the required level of performance, they do not provide a complete picture or an accurate measurement of a student's expert performance. For example, two students may get equal results on time and accuracy measurements but differ significantly in cognitive and physiologic parameters, such as attention demands. A defining characteristic of experts in performing a task is automaticity - the ability to perform a task without significant attention demand, thus freeing sufficient attention to cope with the other complexities that are common in real-world situations. Automaticity is best achieved through repeated practice, and simulator training has proven to be an ideal tool for helping students go beyond the proficiency level to achieve automaticity (Stefanidis, Scerbo, Montero, Acker, & Smith, 2012).

Along with the effects on proficiency and automaticity of technical and clinical skills, teamwork-related knowledge, skills, and attitudes are another prominent learning outcome that has emerged from using computer simulation in healthcare education. In the healthcare profession, a field that frequently requires teamwork and coordinated effort, both individual technical and clinical competencies and nontechnical teamwork competencies play critical roles. Teamwork competencies consist of three categories: teamwork knowledge concerns information about the team's objectives and the roles and responsibilities of individuals; teamwork skills refer to the effective communication and collaboration that lead to successful completion of the team's work; and teamwork attitudes are individuals' mental states that can influence their performance on the team, such as one's perception of teamwork or preference for a teamwork style (Beaubien & Baker, 2004). Teamwork competencies must be trained and practiced, and task/skill-based computer simulation provides
a versatile environment and approach for achieving this purpose. Stanford's Center for Immersive and Simulation-based Learning supports this effort by incorporating team training into its simulation-based curriculum. For example, its immersive simulation sessions allow students to "experience what it is like to be in the role of someone else in the patient care team (such as a physician being the "nurse" in a simulation or vice versa). Such role-reversals can't be executed in real patient care situations but can easily be created using simulation" (Stanford CISBL, n.d., para 5). Weaver et al.'s (2010) systematic literature review of 27 studies on simulation-based team training revealed that simulation-based curricula in healthcare are effective in improving "non-technical teamwork skills underlying effective team communication, cooperation, and coordination such as closed-loop communication, situational awareness, back-up behaviors, as well as necessary supportive structures such as shared mental models."
Using Task/Skill-Based Simulation in the Classroom

High-fidelity computerized simulation-based learning has gained momentum in healthcare education in recent years, given its effectiveness, discussed above, in helping students achieve the clinical proficiency, automaticity, and teamwork competencies required in real medical situations. Not surprisingly, most U.S. medical schools that have a simulation center have adopted computerized simulation as the dominant simulation modality, in desktop-based or computerized mannequin-based formats. Harvard's Center for Medical Simulation, founded in 1993 as one of the world's first healthcare simulation centers, is widely recognized as a leader in innovating healthcare education using simulation technologies. The center disseminates its mission, research, and best practices in computer simulation-based healthcare education through a variety of formats, such as ongoing courses, workshops, conferences, and training programs. Its one-stop website provides updates on the center's research projects, publications, current offerings, and new initiatives. Other simulation centers in healthcare (e.g., Stanford's, Johns Hopkins's) have made similar efforts in using their websites as portals to disseminate and promote their research and best practices in using task/skill-based simulation.

Task/skill-based simulation is widely accepted as both a versatile tool and an effective instructional technique for learning that requires students' competency, proficiency, fluency, and automaticity in motor and psychomotor skills and procedural/process knowledge at the application level. Simulation is especially needed in instructional situations where learning tasks are difficult, rare, or dangerous. Computer simulation provides a safe and easily accessible learning environment where learners have the opportunity to encounter rare cases, make mistakes, and see the outcomes of their mistakes, thereby gaining the experience and insight needed in real life (Aldrich, 2004). These learning opportunities can be repeated as often as needed. Given these educational benefits, task/skill-based simulation has been widely used in other high-risk fields, such as aviation, power plant control, nuclear power production, the military, and law enforcement. It is foreseeable that simulation will also prosper in emerging fields, such as homeland security, counterterrorism, and career programs involving emergency/crisis/disaster response and management.
Problem-solving and Decision-making Simulation - The Columbia Center for New Media Teaching and Learning

The Columbia Center for New Media Teaching and Learning (CCNMTL) is a highly recognized organization supporting faculty's integration of digital technology into teaching and learning. Since its
founding in 1999 with a six-stage design approach (from initial understanding of the curriculum to the final evaluation, reflection, and revision step), CCNMTL has generated numerous technology-enhanced teaching and learning programs, including a number of problem-solving and decision-making simulation programs for undergraduate courses in sustainable development, medicine, humanitarian relief, and environmental science. The goal of these simulation programs is to promote dynamic, discovery-oriented learning by utilizing the unique affordances of simulation.
Design Features

The most prominent feature of the problem-solving and decision-making simulations at CCNMTL is their reproduction of the complexity, uncertainty, and ambiguity of the real world, represented in multiple forms in the simulated learning environment. For example, in "Brownfield Action," a simulation program for an introductory environmental science course, students are presented with an environmental crisis of possible groundwater contamination in a fictional town, a complex issue resulting from the interplay of many aspects, such as the town's infrastructure, historical background, and residential and commercial consumption (CCNMTL, n.d.a). The complexity and ambiguity of the real-world issue are presented in the simulation in a variety of forms, including maps, documents, videos, and a network of scientific data on the suspected contaminated land site. Similarly, the simulated environment in "COUNTRY X," used in a graduate course in international and public affairs at Columbia University, exposes students both to narrow-range information - the unique background and condition of the country in terms of political, economic, social, and security structures - and to deep, extended nuances - pertinent details and variables that muddle the issue, such as violence and economic instability. Both kinds of information constitute the essence of mass atrocity prevention (CCNMTL, n.d.b).

Complex real-world issues are most likely multidisciplinary; CCNMTL's problem-solving and decision-making simulations therefore feature learning tasks that require students to learn and think across disciplines. For example, in the "Brownfield Action" simulation, students must acquire and practice knowledge, skills, and expert practice from both scientific and social disciplines, such as environmental science, engineering, and journalism. Students have the opportunity to use a suite of geological and hydrological testing tools to assess the geological features of the town and the groundwater contamination. In addition, students need to interview residents, business owners, and government officials to collect data and opinions on public health, economics and business management, and civics and law. The "Millennium Village" simulation is probably best known for its integrated approach, in both its interdisciplinary nature and its underlying models. In this simulation, students are challenged to help a family and their village in the developing world survive and prosper by assessing the situation and making decisions regarding the allocation of their resources. Such decisions involve multiple disciplines, including "agronomy, nutrition, economics, epidemiology, public health and development management" (CCNMTL, n.d.c), as well as underlying models relating to "agricultural, logistical growth, climatological, disease, and subsistence/health" (CCNMTL, n.d.c), all of which play a role in sustainable development and interact with each other in the real world.

Closely related to the previous two design features (i.e., replication of real-world complexity and interdisciplinary nature), CCNMTL's problem-solving and decision-making simulations are characterized by role-playing, which requires learners to justify actions from the viewpoint of an assigned role and assesses the viability of their solutions against other perspectives. For instance, in "COUNTRY X," working in a group of four, each student assumes the role of one of four characters, representing four
perspectives: the President, a Western diplomat, a sub-regional representative, and the leadership of the opposition. After exploring the existing conditions of the country, students need to come up with a preventive policy action from within their role and justify the rationale behind their decision (CCNMTL, n.d.b). In contrast, "Brownfield Action" requires team effort in dealing with a large-scale interdisciplinary science problem that demands a semester-long scientific inquiry process. Assuming the role of an environmental consulting company, every team has the freedom to choose from over twenty-five locations in the town to visit and to select from forty-five characters to interview, thus enabling variety in site selection and approaches to inquiry among the investigative teams (CCNMTL, n.d.a).

Along with the representation of complexity, interdisciplinarity, and role-playing, CCNMTL's problem-solving and decision-making simulations are all characterized by an underlying model - a representation of the constructs of an entity, phenomenon, or system, visible or invisible, and their interactive relationships. The underlying model in a discipline is essential to any type of computer simulation program. As discussed previously for modeling-based simulation, the core of such simulations is the model that students can explore and experiment with, leading to an understanding of the principles and rules underlying the topic under study. While models play a secondary role in task/skill-based simulation, in which the major task is instead to acquire and practice skills to the required degree of fluency, the model still plays a supporting but essential role because students need to apply it in performing a task and its skill set. Similarly, in problem-solving and decision-making simulation, a model, which is usually embedded, plays a crucial role in facilitating the inquiry and decision-making process. In "COUNTRY X," the underlying model is manifested in the final debriefing that occurs after each student has presented and justified his or her individual policy choice from the perspective of his or her adopted role. Based on the underlying models in the pertinent fields, an updated country condition is then presented to the group, revealing the combined effect of students' individual strategic policy options and thus deepening students' understanding of the complexity and interdependence of the factors involved in mass atrocity prevention (CCNMTL, n.d.b). Similarly, the "Millennium Village" simulation employs several models simultaneously, such as "agricultural, logistical growth, climatological, disease, and subsistence/health, all of which are introduced through the environment and with explanations and tutorials about the models and their role in the simulation" (CCNMTL, n.d.c). The models determine the results of students' decision making, which promotes students' deeper understanding of the complexity and interdisciplinary nature of a real-world issue.
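As a highly simplified illustration of how an embedded, multi-model engine can turn a student's allocation decision into interdependent consequences, the Python sketch below couples a toy crop model with a toy household-health model. The sub-models, coefficients, and variable names are assumptions made only for illustration; they are not the Millennium Village simulation's actual models.

```python
# Highly simplified sketch of two coupled sub-models (crops and household
# health) driven by a single allocation decision. Illustrative only.

def run_village_year(budget, share_to_fertilizer):
    """Split a budget between fertilizer and medicine; return harvest and a health index."""
    fertilizer = budget * share_to_fertilizer
    medicine = budget * (1 - share_to_fertilizer)

    harvest = 1.0 + 0.002 * fertilizer          # assumed crop-yield response (tons)
    nutrition = min(1.0, harvest / 2.0)         # food available per person, capped at 1.0
    health = 0.4 * nutrition + 0.6 * min(1.0, 0.004 * medicine)  # 0-1 health index
    return round(harvest, 2), round(health, 2)

# Students can compare allocation strategies and see how the sub-models interact:
# more fertilizer raises the harvest, but health also depends on medicine.
for share in (0.2, 0.5, 0.8):
    print(share, run_village_year(budget=500, share_to_fertilizer=share))
```

Even at this toy scale, the coupling forces the trade-off reasoning the text describes: no single allocation maximizes both outputs, so the "best" decision depends on which outcomes the decision maker prioritizes.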
Effects on Learning Outcomes

Since 1999, the CCNMTL team and faculty using problem-solving and decision-making simulations have conducted a plethora of research studies on the effects of such simulations on student learning. Similar to the results for modeling-based and task/skill-based simulations, high levels of student engagement in the inquiry process and increased confidence in applying learning to real-world problems have been reported among users of these simulations (Kelsey, 2010). However, Bower, Kelsey, and Moretti (2011) observed that frustration was frequently seen during the process "when outcomes do not provide the immediate sensation of being done or with a clear sense of the end in sight" (p. 12). This is not surprising given that such learning environments and learning tasks, which represent real-world challenges and experiences, are quite demanding, requiring higher levels of thinking and learning as well as persistence from students. It is therefore crucial to provide immediate feedback and explicit instruction and
guidance throughout the inquiry process in problem-solving and decision-making simulation (Barab et al., 2000; Bower, Kelsey, & Moretti, 2011; Weinberger, Ertl, Fischer, & Mandl, 2005). In addition to engagement and confidence in the affective domain, the prominent learning outcomes of problem-solving and decision-making simulations are students' acquisition of expert knowledge, skills, attitudes, and practices in a subject field. In examining the design and effects of the "Millennium Village" simulation, Kelsey (2010) reported that what students could acquire from this simulation surpasses what can be achieved in traditional classrooms, including awareness, appreciation, and deeper understanding of the complexity and interdisciplinary nature of a real-world issue; experimental attitudes and inquiry skills facilitated via the simulation's and/or the instructor's explicit instruction and scaffolding; the ability to work with ambiguity (i.e., the unknown and uncertain) and find a way to move forward; and teamwork skills in effective goal setting, time management, workload/role assignment, communication, and negotiation; all of which are expert practices required in real-world problem-solving and decision-making. Furthermore, a byproduct of students' affective and cognitive outcomes from participating in the simulations is their career decisions and employment. For example, in reviewing the dissemination and outcomes of the "Brownfield Action" curriculum, Bower et al. (2014) reported that many students in "Brownfield Action" showed significantly increased interest in pursuing a career in environmental consulting, and that participation in "Brownfield Action" even assisted some students in gaining employment in the environmental assessment profession.
Using CCNMTL’s Simulations in the Classroom Since 1999, CCNMTL’s problem-solving and decision-making simulations, along with its other efforts in technology-enhanced teaching and learning (e.g., Web 2.0, Annotation, Global Learning), have been broadly used in higher education classrooms and other platforms for education and training, such as workshops. The “Brownfields Action” simulation has been nationally recognized as a model curriculum in environmental science and has been adopted in many universities’ and high schools’ environmental science curriculum across the country (Bower et al., 2014). In reflecting on CCNMTL’s 10-year long innovation in technology-enhanced teaching and learning in higher education, Kelsey (2010) suggested five instructional situations where problem-solving and decision-making simulation is most fitted. First, in traditional curriculum, content and skills siloed into discrete disciplines are normally disconnected; a simulation instead can create learning experience and develop expertise across multiple disciplines so students are better prepared for the real world. Second, a system or model of concepts taught in didactic methods can help students’ comprehension and memorization quickly, but students are normally unable to generalize the concepts or model to new situations. A simulation with a deliberate self-discovery approach facilitates students’ learning of the model through exploration and experimentation, thus creating a meaningful learning experience of the model that enables students to more readily apply to real life situations. Third, some learning tasks involve students in situations that are dangerous, difficult, or rare (e.g., a flight task, an ethical dilemma, a rare surgical case). Simulation provides the most ideal platform for students to practice the skills needed in such situations, repeatedly. Forth, decision-making and prioritizing skills are hard, if not impossible, to acquire and practice in the classroom setting where the complexity, nuances, ambiguity, and authenticity of the real world situation are missing. Computer simulations can instead replicate the real world, immerse students in the complexity where they practice decision-making and prioritizing, see the immediate effect of their 252
actions on the system, and make adjustments accordingly. Fifth, many real-life issues are characterized by incomplete, uncertain, and conflicting information. Working with ambiguity, an important skill in the workplace and in real life, is often missing from college classrooms. A simulation with deliberate design consideration of ambiguity provides an ideal learning environment for students to practice this skill with explicit guidance and implicit scaffolding.
Putting It All Together

Table 1 summarizes the learning outcome-based categorization of computer simulation. For each type of computer simulation, the targeted cognitive levels and expected learning outcomes, unique instructional affordances, and distinctive learning opportunities are listed; exemplary computer simulation programs in higher education and their URLs are also supplied (see Table 1).
FUTURE RESEARCH DIRECTIONS

As evidenced in the exemplary computer simulation programs discussed above, computer simulation has proven to be an engaging, effective, and versatile instructional strategy and technology across a broad spectrum of disciplines in higher education. The effects of computer simulation on student learning outcomes have been extensively examined and well documented, predominantly in three domains: affective (e.g., engagement and engaged exploration, persistence in inquiry), cognitive (e.g., competencies in understanding and manipulating a system model, performing a task that requires skill sets, making a decision to solve a problem), and social (e.g., teamwork skills to get the job done). In contrast, very few studies in computer simulation research thus far have looked into the metacognitive domain. Metacognition refers to learners' awareness of their cognition (i.e., self-appraisal) and their deliberate, conscious regulation of their cognition (i.e., self-management) (Brown, 1987; Jacobs & Paris, 1987). Metacognitive strategies are used by learners to plan, monitor, and assess their learning, thinking, and performance, and such practices further facilitate student learning in the affective, cognitive, and social domains (Vermunt, 1996), as well as increase students' abilities to transfer their learning to new contexts and tasks (Palincsar & Brown, 1984). It has been observed that metacognitive strategies are especially important in open-ended learning environments, such as computer simulation-based learning, where learners have more control over their own learning (Collis & Meeuwsen, 1999; McManus, 2000; Miller & Miller, 2000; Quintana, Zhang, & Krajcik, 2005). Future research should explore how to embed metacognitive support into computer simulation programs and examine its effect on student learning outcomes in the affective, cognitive, and social domains, as well as on students' transfer of learning to real-life situations. In addition, because the effectiveness of metacognitive strategies may depend upon the learning tasks, levels of learning, and students' level, future research should investigate and identify the specific metacognitive strategies that best support student learning in modeling-based, task/skill-based, and problem-solving and decision-making simulations, respectively.

Despite the recent proliferation of computer simulation in higher education classrooms and its great potential to transform teaching and learning, the acceptance and use of simulation in online learning are still lacking. Empirical evidence on the extent to which simulations are used in online learning is limited. It has been observed that only simple simulations, with a low level of complexity, seem to be used and explored in online learning environments (Cameron, 2003). Consequently, no substantive empirical studies appear to exist that examine the effect of simulation on student learning outcomes in online education.
Table 1. Learning outcome-based categorization of computer simulation

Modeling-based
• Cognitive levels / learning outcomes: Memorization. Comprehension.
• Affordances: Visualization. Dynamic. Interactivity.
• Opportunities: Discipline-specific conceptual understanding.
• Exemplar in higher education: The PhET Interactive Simulations project at the University of Colorado Boulder, https://phet.colorado.edu/

Task/Skill-based
• Cognitive levels / learning outcomes: Application.
• Affordances: Accessibility. Safety. Repetition.
• Opportunities: Proficiency. Fluency. Automaticity.
• Exemplars in higher education: Simulation centers at U.S. medical schools. See, for example, Harvard University, https://harvardmedsim.org/; Stanford University, http://cisl.stanford.edu/; Johns Hopkins University, http://www.hopkinsmedicine.org/simulation_center/; Washington University, http://www.simulation.wustl.edu/; The George Washington University, https://smhs.gwu.edu/class/

Problem-solving & Decision-making
• Cognitive levels / learning outcomes: Analysis. Evaluation. Synthesis.
• Affordances: Authenticity. Prediction. Manifestation.
• Opportunities: High-order thinking. Real-life skills.
• Exemplar in higher education: The Columbia Center for New Media Teaching and Learning, http://ccnmtl.columbia.edu/portfolio/simulations/
A number of barriers to using simulation in higher education have been identified, all of which seem applicable to online education: the unfamiliarity of the approach to both instructors and students, the time demanded for preparation and execution, the high cost of purchasing and maintaining programs, and the meticulous design required for real-life situations, modeling, and learner interaction (Hardman, 2007; Lean, Moizer, Towler, & Anney, 2006). To realize the potential of simulation and leverage its use in online education, future research needs to focus on instructional design models and best practices for integrating simulation into online learning. One possible design framework is Kolb's experiential learning theory (Kolb, 1984). With its emphasis on the central role that experience plays in the learning process, experiential learning theory fits well with the distinctive affordance of simulation - a real-life experience in a simulated environment. Future research should look closely at the four stages of the learning cycle posited in experiential learning theory (i.e., concrete experience, reflective observation, abstract conceptualization, and active experimentation) and develop an experiential learning-based instructional design model with specific implementation strategies to guide educators' thinking and design of simulation-based activities in online learning.
CONCLUSION

The use of computer simulation is escalating in higher education, as it holds great potential to facilitate experiential learning and promote higher-order thinking, which will better prepare students for real-world life, work, and challenges. The unique technical affordances of computer simulation enable a variety of learning opportunities, including engaged and guided exploration, real-life environments and experience,
dynamic interaction with the model and with peers, and immediate feedback, all of which are supported across types of computer simulation. The significance of the cognitive level-based categorization of computer simulation lies in its emphasis on the learning outcomes expected of different types of simulation programs. This perspective adds an important pedagogical view to the well-established research on simulation in educational contexts. The appropriate use of the learning outcome-based typology of computer simulation requires thoughtful planning and must reflect alignment between learning tasks and expected learning outcomes. While still underused, simulation also holds great potential for online learning. With the advance of simulation technologies and the advent of practical simulation integration models for the online learning environment, computer simulation will prove to be a viable and effective tool for transforming teaching and learning in online education and for making experiential learning in online education a reality.
REFERENCES

Abdollahian, M., Yang, Z., Coan, T., & Yesilada, B. (2013). Human development dynamics: An agent based simulation of macro social systems and individual heterogeneous evolutionary games. Complex Adaptive Systems Modeling, 1(18), 1–17.

Adams, W. K., Reid, S., LeMaster, R., McKagan, S. B., Perkins, K. K., Dubson, M., & Wieman, C. E. (2008). A study of educational simulations part I - Engagement and learning. Retrieved from http://phet.colorado.edu/publications/PhET_Interviews_I.pdf

Al-Elq, A. (2010). Simulation-based medical teaching and learning. Journal of Family and Community Medicine, 17(1), 35–40. doi:10.4103/1319-1683.68787 PMID:22022669

Aldrich, C. (2004). Simulations and the future of learning. San Francisco, CA: John Wiley and Sons.

Alessi, S. (2000). Simulation design for training and assessment. In H. O'Neil & D. H. Andrews (Eds.), Aircrew training and assessment (pp. 197–222). London, United Kingdom: Routledge.

Anderson, L. W., Krathwohl, D. R., Airasian, P. W., Cruikshank, K. A., Mayer, R. E., Pintrich, P. R., & Wittrock, M. C. et al. (2001). A taxonomy for learning, teaching, and assessing: A revision of Bloom's Taxonomy of educational objectives. New York, NY: Longman.

Association of American Medical Colleges. (2014). Simulation center use at medical schools: 2013-2014. Retrieved from https://www.aamc.org/initiatives/cir/423320/16.html

Baer, L. L. (2005). The generation gap: Bridging learners and educators. The International Digital Media & Arts Association Journal, 2(1), 47–52.

Barab, S. A., Hay, K. E., Squire, K., Barnett, M., Schmidt, R., Karrigan, K., & Johnson, C. et al. (2000). Virtual Solar System Project: Learning through a technology-rich, inquiry-based, participatory learning environment. Journal of Science Education and Technology, 9(1), 7–25. doi:10.1023/A:1009416822783

Beaubien, J., & Baker, D. (2004). The use of simulation for training teamwork skills in health care: How low can you go? Quality & Safety in Health Care, 13(Suppl. 1), 51–56. doi:10.1136/qshc.2004.009845 PMID:15465956
Beck, S. A., & Huse, V. E. (2007). A virtual spin on the teaching of probability. Teaching Children Mathematics, 13(9), 482–486.

Bell, L., Juersivich, N., Hammond, T. C., & Bell, R. L. (2012). The TPACK of dynamic representations. In R. N. Ronau, C. R. Rakes, & M. L. Niess (Eds.), Educational technology, teacher knowledge, and classroom impact (pp. 103–135). Hershey, PA, USA: IGI Global. doi:10.4018/978-1-60960-750-0.ch005

Binkley, M., Erstad, O., Herman, J., Raizen, S., Ripley, M., Miller-Ricci, M., & Rumble, M. (2012). Defining twenty-first century skills. In P. Griffin, B. McGaw, & E. Care (Eds.), Assessment and teaching of 21st century skills (pp. 17–66). Dordrecht, Netherlands: Springer. doi:10.1007/978-94-007-2324-5_2

Bower, P., Kelsey, R., Bennington, B., Lemke, L. D., Liddicoat, J., Miccio, B. S., . . . Datta, S. (2014). Brownfield Action: Dissemination of a SENCER model curriculum and the creation of a collaborative STEM education network. Retrieved from http://seceij.net/files/seceij/winter14/brownfield_action.pdf

Bower, P., Kelsey, R., & Moretti, F. (2011). Brownfield Action: An inquiry based multimedia simulation for teaching and learning environmental science. Science Education & Civic Engagement: An International Journal, 3(1), 1–14.

Brown, A. (1987). Metacognition, executive control, self-regulation, and other more mysterious mechanisms. In F. E. Weinert & R. H. Kluwe (Eds.), Metacognition, motivation, and understanding (pp. 65–116). Hillsdale, NJ: Lawrence Erlbaum Associates.

Cameron, B. H. (2003). The effectiveness of simulation in a hybrid and online networking course. TechTrends, 47(5), 18–21. doi:10.1007/BF02763200

Carpenter, Y., Moore, E. B., & Perkins, K. K. (2015). Representations and equations in an interactive simulation that support student development in balancing chemical equations. Retrieved from http://confchem.ccce.divched.org/sites/confchem.ccce.divched.org/files/2015SpringConfChemP4.pdf

CCNMTL. (n. d.a). Featured project: Brownfield Action 3.0. Retrieved from http://ccnmtl.columbia.edu/projects/feature_pages/277_brownfieldaction.pdf

CCNMTL. (n. d.b). COUNTRY X. Retrieved from http://ccnmtl.columbia.edu/portfolio/political_science_and_social_policy/country_x.html

CCNMTL. (n. d.c). Featured project: Millennium Village simulation. Retrieved from http://ccnmtl.columbia.edu/projects/feature_pages/288_mvsim_08.pdf

Chamberlain, J. M., Lancaster, K., Parson, R., & Perkins, K. K. (2014). How guidance affects student engagement with an interactive simulation. Chemistry Education Research and Practice, 15(4), 628–638. doi:10.1039/C4RP00009A

Chan, M. S., & Black, J. B. (2005). Direct-manipulation animation: Incorporating the haptic channel in the learning process to support middle school students in science learning and mental model acquisition. Proceedings of the 7th International Conference on Learning Sciences (ICLS '06) (pp. 64-70). Santa Monica, CA: International Society of the Learning Sciences.

Clark, A. (2004). Simulations and the future of learning. San Diego, CA: Pfeiffer.
Collis, B., & Meeuwsen, E. (1999). Learning to learn in a WWW-based environment. In D. French, C. Hale, C. Johnson, & G. Farr (Eds.), Internet based learning - A framework for higher education and business (pp. 25-46). Sterling, VA: Stylus Publishing.
Cook, D. A., Hatala, R., Brydges, R., Zendejas, B., Szostek, J. H., & Wang, A. T. (2011). Comparative effectiveness of technology-enhanced simulation versus other instructional methods: A systematic review and meta-analysis. Simulation in Healthcare, 7(5), 308–320. doi:10.1097/SIH.0b013e3182614f95 PMID:23032751
Csikszentmihalyi, M. (1990). Flow: The psychology of optimal experience. New York, NY: Harper and Row.
D’Angelo, C., Rutstein, D., Harris, C., Bernard, R., Borokhovski, E., & Haertel, G. (2013). Simulations for STEM learning: Systematic review and meta-analysis. Menlo Park, CA: SRI International.
Damassa, D. A., & Sitko, T. D. (2010). Simulation technologies in higher education: Uses, trends, and implications. Retrieved from http://www.educause.edu/library/resources/simulationtechnologies-highereducation-uses-trends-and-implications
de Jong, T. (2005). The guided discovery principle in multimedia learning. In R. E. Mayer (Ed.), Cambridge handbook of multimedia learning (pp. 215–229). Cambridge, United Kingdom: Cambridge University Press. doi:10.1017/CBO9780511816819.015
Department of Defense. (1998). DoD modeling and simulation (M&S) glossary. Retrieved from http://www.dtic.mil/whs/directives/corres/pdf/500059m.pdf
Evans, K., Kersh, N., & Kontanien, S. (2004). Recognition of tacit skills: Sustaining learning outcomes in adult learning and work re-entry. International Journal of Training and Development, 8(1), 54–72. doi:10.1111/j.1360-3736.2004.00196.x
Gaba, D. M. (1999). The human work environment and anesthesia simulators. In R. Miller (Ed.), Anesthesia (pp. 2613–2668). New York, NY: Churchill Livingstone.
Gaba, D. M. (2004). The future vision of simulation in health care. Quality & Safety in Health Care, 13(Suppl. 1), i2–i10. doi:10.1136/qshc.2004.009878 PMID:15465951
Gaba, D. M. (2006). The future’s here. We are it. Simulation in Healthcare, 1, 1–2. doi:10.1097/01266021200600010-00001 PMID:19088564
George Washington’s Clinical Learning & Simulation Skills (CLASS) Center. (n. d.). Non-human simulation rooms. Retrieved from https://smhs.gwu.edu/class/about/simlab
Gilbert, N., & Troitzsch, K. G. (2005). Simulation for the social scientist. Berkshire, United Kingdom: Open University Press.
GoldSim. (n. d.). Instruction: Types of simulation tools. Retrieved from http://www.goldsim.com/Web/Introduction/SimulationTypes/
Hardman, S. (2007). Simulation: Transforming technology into teaching. In J. Woodhouse (Ed.), Strategies for healthcare education: How to teach in the 21st century (pp. 91–102). Oxford, United Kingdom: Radcliffe.
Harlen, W., & Jelly, S. (1997). Developing science in the primary classroom. Essex, United Kingdom: Addison Wesley Longman.
Hartmann, S. (1996). The world as a process: Simulations in the natural and social sciences. In R. Hegselmann, U. Mueller, & K. Troitzsch (Eds.), Modeling and simulation in the social sciences from the philosophy of science point of view (pp. 77–100). Dordrecht, Netherlands: Kluwer. doi:10.1007/978-94-015-8686-3_5
Hensberry, K., Moore, E., & Perkins, K. (2015). Effective student learning of fractions with an interactive simulation. Journal of Computers in Mathematics and Science Teaching, 34(3), 273–298.
Houlding, S. (1994). 3D geoscience modeling: Computer techniques for geological characterization. Berlin, Germany: Springer-Verlag.
Jacobs, J. E., & Paris, S. G. (1987). Children’s metacognition about reading: Issues in definition, measurement, and instruction. Educational Psychologist, 22(3 & 4), 235–278.
Jacobs, J. W., & Dempsey, J. V. (1993). Simulation and gaming: Fidelity, feedback, and motivation. In J. V. Dempsey & G. C. Sales (Eds.), Interactive instruction and feedback. Englewood Cliffs, NJ: Educational Technology Publications.
Jonassen, D. (1999). Designing constructivist learning environments. In C. M. Reigeluth (Ed.), Instructional design theories and models: A new paradigm of instructional theory (Vol. 2, pp. 215–239). Mahwah, NJ: Lawrence Erlbaum Associates.
Keller, C. J., Finkelstein, N. D., Perkins, K. K., & Pollock, S. J. (2006). Assessing the effectiveness of a computer simulation in conjunction with tutorials in introductory physics in undergraduate physics recitations. Retrieved from http://www.colorado.edu/physics/EducationIssues/papers/perc2005_keller.pdf
Kelsey, R. (2010). Building to learn: A decade of innovation at the Columbia University Center for New Media Teaching and Learning. Retrieved from http://ccnmtl.columbia.edu/dr/papers/kelsey_jordan2010.pdf
Kernan, M. C., & Lord, R. G. (1990). Effects of valence, expectancies and goal-performance discrepancies in single and multiple goal environments. The Journal of Applied Psychology, 75(2), 194–203. doi:10.1037/0021-9010.75.2.194
King, F. J., Goodson, L., & Rohani, F. (n. d.). High order thinking skills: Definition, teaching strategies, assessment. Retrieved from http://www.cala.fsu.edu/files/higher_order_thinking_skills.pdf
Kirriemuir, J. (2002). The relevance of gaming and gaming consoles to the higher education/further education learning experience. London, United Kingdom: JISC.
Kolb, D. A. (1984). Experiential learning: Experience as the source of learning and development. Englewood Cliffs, NJ: Prentice Hall.
Ku, M., MacDonald, R. H., Andersen, D. L., Andersen, D. F., & Deegan, M. (2006). Using a simulation-based learning environment for teaching and learning about complexity in public policy decision making. Journal of Public Affairs Education, 22(1), 49–66.
Landriscina, F. (2013). Simulation and learning: A model-centered approach. New York, NY: Springer. doi:10.1007/978-1-4614-1954-9
Lean, J., Moizer, J., Towler, M., & Abbey, C. (2006). Simulations and games: Use and barriers in higher education. Active Learning in Higher Education, 7(3), 227–242. doi:10.1177/1469787406069056
Lewis, R., Strachan, A., & Smith, M. M. (2012). Is high fidelity simulation the most effective method for the development of non-technical skills in nursing? A review of the current evidence. The Open Nursing Journal, 6, 82–89. doi:10.2174/1874434601206010082 PMID:22893783
Luna, R. A., Aranhan, R. N., & Spite, D. H. (n. d.). Simulation in medical education. Retrieved from http://www.telessaude.uerj.br/resource/goldbook/pdf/25.pdf
Maier, F. H., & Grobler, A. (2000). What are we talking about? A taxonomy of computer simulations to support learning. System Dynamics Review, 16(2), 135–148. doi:10.1002/1099-1727(200022)16:23.0.CO;2-P
McClusky, D. A. III, & Smith, D. (2008). Design and development of a surgical skills simulation curriculum. World Journal of Surgery, 32(2), 171–181. doi:10.1007/s00268-007-9331-9 PMID:18066685
McHaney, R. (1991). Computer simulation: A practical perspective. San Diego, CA: Academic Press Professional.
McKagan, S. B., Handley, W., Perkins, K. K., & Wieman, C. E. (2008). A research-based curriculum for teaching the photoelectric effect. Retrieved from http://www.colorado.edu/physics/EducationIssues/papers/McKagan_etal/photoelectric.pdf
McManus, T. F. (2000). Individualizing instruction in a web-based hypermedia learning environment: Nonlinearity, advance organizers, and self-regulated learners. Journal of Interactive Learning Research, 11(2), 219–251.
Miller, S. M., & Miller, K. M. (2000). Theoretical and practical considerations in the design of web-based instruction. In B. Abbey (Ed.), Instructional and cognitive impacts of web-based education (pp. 156–177). Hershey, PA: Idea Group Publishing. doi:10.4018/978-1-878289-59-9.ch010
Moore, E. B. (2015). Designing accessible interactive chemistry simulations. Retrieved from http://confchem.ccce.divched.org/2015SpringConfChemP8
Moore, E. B., Chamberlain, J. M., Parson, R., & Perkins, K. K. (2014). PhET interactive simulations: Transformative tools for teaching chemistry. Journal of Chemical Education, 91(8), 1191–1197. doi:10.1021/ed4005084
Munshi, F., Lababidi, H., & Alyousef, S. (2015). Low- versus high-fidelity simulations in teaching and assessing clinical skills. Journal of Taibah University Medical Sciences, 10(1), 12–15. doi:10.1016/j.jtumed.2015.01.008
National Research Council. (1996). National science education standards. Washington, DC: National Academy Press.
New Media Consortium. (2012). Horizon report: 2012 higher education edition. Retrieved from http://www.nmc.org/pdf/2012-horizon-report-HE.pdf
New Media Consortium. (2016). Augmented reality. Retrieved from http://www.nmc.org/horizon_topic/augmented-reality/
Norman, G. R., & Schmidt, H. G. (1992). The psychological basis of problem-based learning: A review of the evidence. Academic Medicine, 67(9), 557–565. doi:10.1097/00001888-199209000-00002 PMID:1520409
Palincsar, A. S., & Brown, A. L. (1984). Reciprocal teaching of comprehension-fostering and comprehension-monitoring activities. Cognition and Instruction, 1(2), 117–175. doi:10.1207/s1532690xci0102_1
Paul, A., Podolefsky, N. S., & Perkins, K. K. (2012). Guiding without feeling guided: Implicit scaffolding through interactive simulation design. In P. V. Engelhardt, A. D. Churukian, & N. S. Rebello (Eds.), 2012 physics education research conference proceedings (pp. 302−305). Philadelphia, PA: AIP.
Perkins, K. K., & Moore, E. B. (2014). Blending implicit scaffolding and games in PhET interactive simulations. In J. L. Polman, E. A. Kyza, D. K. O’Neill, I. Tabak, W. R. Penuel, A. S. Jurow, & L. D’Amico et al. (Eds.), The international conference of the learning sciences: Learning and becoming in practice (pp. 1201–1202). Boulder, CO: International Society of the Learning Sciences.
Peterson, Q. R. (1970). Some reflections on the use and abuse of molecular models. Journal of Chemical Education, 47(1), 24–29. doi:10.1021/ed047p24
PhET. (2016a). Interactive simulations for science and math. Retrieved from http://phet.colorado.edu/
PhET. (2016b). Research. Retrieved from http://phet.colorado.edu/en/research#use
PhET. (2016c). Teaching resources. Retrieved from http://phet.colorado.edu/en/teaching-resources
PhET. (n. d.). Using PhET interactive simulations in college lecture: Ideas for engaging students through inquiry in lecture settings. Retrieved from http://phet.colorado.edu/files/guides/Planning/UG_Phys_GuideLecture-Overview.pdf
Podolefsky, N. S., Moore, E. B., & Perkins, K. K. (2013). Implicit scaffolding in interactive simulations: Design strategies to support multiple educational goals. Retrieved from http://arxiv.org/abs/1306.6544
Podolefsky, N. S., Perkins, K. K., & Adams, W. K. (2010). Factors promoting engaged exploration with computer simulations. Physical Review Physics Education Research, 6(2), 1–11.
Poolton, J. M., Zhu, F. F., Malhotra, N., Leung, G. K., Fan, J. K., & Masters, R. S. (2016). Multitask training promotes automaticity of a fundamental laparoscopic skill without compromising the rate of skill learning. Surgical Endoscopy, 30(1), 1–8. PMID:26743112
Porter, T. S., Riley, T. M., & Ruffer, R. L. (2004). A review of the use of simulations in teaching economics. Social Science Computer Review, 22(4), 426–443. doi:10.1177/0894439304268464
Prensky, M. (2001). Digital game-based learning. New York, NY: McGraw-Hill.
Quintana, C., Zhang, M., & Krajcik, J. (2005). A framework for supporting metacognitive aspects of online inquiry through software-based scaffolding. Educational Psychologist, 40(4), 235–244. doi:10.1207/s15326985ep4004_5
Reznick, R., & McCrae, H. (2006). Teaching surgical skills: Changes in the world. The New England Journal of Medicine, 355(25), 2664–2669. doi:10.1056/NEJMra054785 PMID:17182991
Rolfe, J. M., & Hampson, B. P. (2003). Flight simulation: Viability versus liability issues of accuracy, data and validation. Aeronautical Journal, 107(1076), 631–635.
Schmidt, Goldhaber-Fiebert, Ho, & McDonald. (2013). Simulation exercises as a patient safety strategy: A systematic review. Patient Safety Network, 158(5), 426–432. PMID:23460100
CISL Stanford. (n. d.). Types of learning. Retrieved from http://cisl.stanford.edu/what_is/learning_types/
Stefanidis, D. (2013). Improving surgeon skills with simulator training to automaticity. Retrieved from http://www.physiciansweekly.com/surgeon-skills-and-automaticity-with-simulator-training/
Stefanidis, D., Scerbo, M. W., Montero, P. N., Acker, C. E., & Smith, W. D. (2012). Simulator training to automaticity leads to improved skill transfer compared with traditional proficiency-based training: A randomized controlled trial. Annals of Surgery, 255(1), 30–37. doi:10.1097/SLA.0b013e318220ef31 PMID:21637099
Sultan, C., Corless, M., & Skelton, R. E. (2000). Tensegrity flight simulator. Journal of Guidance, Control, and Dynamics, 23(6), 1055–1064. doi:10.2514/2.4647
Tennyson, R. D., & Jorczak, R. L. (2008). A conceptual framework for the empirical study of instructional games. In H. F. O’Neil & R. S. Perez (Eds.), Computer games and team and individual learning (pp. 39–54). Oxford, United Kingdom: Elsevier.
Vermunt, J. D. (1996). Metacognitive, cognitive and affective aspects of learning styles and strategies: A phenomenographic analysis. Higher Education, 31(1), 25–50. doi:10.1007/BF00129106
Washington University’s Clinical Simulation Center. (n. d.). Clinical simulation center. Retrieved from http://www.simulation.wustl.edu/The-Centers/Clinical-Simulation-Center
Weaver, S. J., Salas, E., Lyons, R., Lazzara, E. H., & Rosen, M. A. (2010). Simulation-based team training at the sharp end: A qualitative study of simulation-based team training design, implementation, and evaluation in healthcare. Journal of Emergencies, Trauma, and Shock, 3(4), 369–377. doi:10.4103/0974-2700.70754 PMID:21063560
Weinberger, A., Ertl, B., Fischer, F., & Mandl, H. (2005). Epistemic and social scripts in computer-supported collaborative learning. Instructional Science, 33(1), 1–30. doi:10.1007/s11251-004-2322-4
Wilson, B. G. (1996). Constructivist learning environments: Case studies in instructional design. Englewood Cliffs, NJ: Educational Technology.
Yang, S. H., Yang, L., & He, C. H. (2001). Improve safety of industrial processes using dynamic operator training simulators. Process Safety and Environmental Protection, 79(6), 329–338. doi:10.1205/095758201753373096
KEY TERMS AND DEFINITIONS
Computer Simulation: A computer-based instructional program designed to reproduce a real-life entity, phenomenon, activity, situation, process, or system, the purpose of which is to provide a real-life or close-to-real-life learning environment.
Fidelity: The degree of realism with which the simulation replicates reality.
Learning Outcome-Based Categorization: A typology of computer simulation built around the six levels of cognitive learning described in the revised Bloom’s taxonomy: memorization, comprehension, application, analysis, evaluation, and synthesis.
Modeling: A physical or mathematical representation of a concrete or an abstract entity, phenomenon, activity, situation, process, or system, with which learners can interact, and which they can manipulate to learn the consequences of their actions.
Modeling-Based Simulation: A computer-based instructional program that displays a static or an animated model representing a realistic/visible or abstract/invisible entity, activity, phenomenon, system, or process.
Task/Skill-Based Simulation: A computer-based instructional program that requires learners to demonstrate mastery of a set of skills in successfully completing a task.
Problem-Solving & Decision-Making Simulation: A computer-based instructional program that requires learners to solve a real-life problem and make decisions that require the highest levels of thinking, including analysis, evaluation, and synthesis.
Chapter 12
A Review of Literature and a Model for Scaffolding Asynchronous Student-Student Interaction in Online Discussion Forums
Kristin L. K. Koskey, The University of Akron, USA
Susan N. Kushner Benson, The University of Akron, USA
DOI: 10.4018/978-1-5225-1851-8.ch012
ABSTRACT
The purpose of this chapter is to overview types of asynchronous student-student interactions with a focus on designed interaction in an online discussion forum context, as well as to illustrate pedagogical approaches to scaffolding interactions. Student-student interaction in asynchronous online discussion is the emphasis of this chapter. The chapter focuses on a review of the literature on the roles of the instructor, student, and learning task in the online teaching and learning process. The ways in which these roles interact are then discussed, including an overview of types of interaction. The chapter then focuses on contextual and designed interactions, including conditions documented in research as to how to effectively use designed interaction to scaffold student-student interaction. Next, a guiding model is presented for how to plan for asynchronous interaction. Finally, challenges faced when designing or implementing asynchronous discussions are discussed, as well as potential recommendations for overcoming these challenges.
INTRODUCTION
The purpose of this chapter is to overview types of asynchronous student-student interactions with a focus on designed interaction in online discussion forum contexts, as well as to illustrate a hierarchy of pedagogical approaches to scaffolding student-student interactions.
Asynchronous online discussion is the focus of this chapter, as this remains a common form of interaction in the online environment, as documented by An, Shin, and Lim (2009). Asynchronous refers to when students “do not have to be online at the same time to communicate,” whereas in synchronous discussion students communicate in real time (Ko & Rossen, 2010, p. 4). Asynchronous discussion can be facilitated using numerous platforms such as discussion forums offered by a learning management system such as Blackboard or Desire2Learn, wiki sites, and blog platforms such as those provided by Yellowdig, to name a few. Although the pedagogy discussed and outlined in this chapter focuses on asynchronous forms of discussion, some of the methods might transfer to synchronous forms of discussion. The chapter is organized in five sections. In the first section of the chapter, a review of the literature on the roles instructors, students, and learning tasks play in the teaching and learning process is presented. The ways in which these roles interact are discussed in the second section. Next, the concepts of contextual and designed interaction are introduced as two ways to promote student-student interaction, with a focus on characteristics of implementing designed interaction. The documented advantages of designed interaction in the literature are discussed, and the current research on the use and implementation of this model is briefly reviewed. In the fourth section, a guiding model for planning for asynchronous student-student interaction is outlined, followed by examples. Finally, the chapter concludes with a discussion of the challenges documented in the literature that might be faced by instructional design experts and instructors of adult learners who seek to implement designed interaction, as well as suggestions for addressing these pedagogical challenges.
THE ROLES OF THE INSTRUCTOR, STUDENT AND LEARNING TASK
Nearly 25 years ago, Alison King (1993) published an article titled From Sage on the Stage to Guide at the Side. In this article King described outdated classrooms where professors were the central figure – transmitting knowledge to students who in turn memorized and then reproduced the knowledge on an exam. In her work, King suggested a new metaphor of professors as guides on the side. King suggested that professors are still responsible for presenting course content but that learning is fostered through less directive approaches. King described the professor’s role as being “to facilitate students’ interaction with the material and with each other in their knowledge-producing endeavor” (p. 30). Since its publication, King’s work has been cited nearly 800 times as scholars in the field of teaching and learning consider how the roles of instructors and students can best maximize the learning process. In no other context is this discussion livelier than within the arena of the online classroom.
The Role of the Instructor in the Online Classroom
Zane Berge (1995) was an early scholar in the field of online teaching and learning. He conceptualized learning as involving interaction with content and interaction with other people. Reflecting on the roles and functions of the online instructor in computer conferencing, Berge described the role of the online instructor as pedagogical, social, managerial, and technical. From a pedagogical perspective, Berge contended that one of the most important roles of an online instructor is that of an educational facilitator – using questions and probes to focus discussion on important concepts and skills. From a social perspective, Berge suggested that online instructors assume a social role in which they promote a friendly and social learning environment that builds relationships among students and develops class cohesiveness.
As managers, online instructors identify and manage the curriculum and course procedures and policies. Finally, in the role of technology support person, online instructors guide students in the use of the system and software. A fundamental goal of the online instructor is to make the technology transparent so students can direct their attention to the academic requirements of the course. Berge explained that historically distance education was limited almost entirely to interactions between the instructor and student. Prophetically, he predicted that “it is increasingly possible for students to interact with one another, even when geographically separated” (p. 23). According to Google Scholar, Berge’s original work has been cited over 900 times. Berge built upon the original 1995 manuscript, co-authoring several more manuscripts on this topic. In the years since his initial publications, the conceptualization of instructor roles has not changed much. For example, Goodyear, Salmon, Spector, Steeples, and Tickner (2001) collaborated to report the outcomes of a workshop attended by 25 researchers and practitioners from the United States, the United Kingdom, and other European countries. The workshop was co-sponsored by several organizations, including the International Board of Standards for Training, Performance and Instruction and the Centre for Studies in Advanced Learning Technology. An outcome of the workshop was the identification of eight roles involved in online teaching. The first seven roles align with Berge’s framework:
1. Content facilitator
2. Assessor
3. Advisor/counselor (pedagogical)
4. Technologist (technical)
5. Designer
6. Manager/administrator
7. Process facilitator (managerial)
Goodyear et al. (2001) identified an eighth role, that of researcher. Building on Berge’s 1995 framework, Liu, Lee, Bonk, Su, and Magjuka (2005) conducted one of the first empirical investigations of the role of instructors in the online classroom. Conducting 27 semi-structured interviews with faculty, the authors further defined and elaborated on Berge’s four roles and extended the pedagogical, managerial, and technology roles by identifying sub-categories. Also citing the pioneering work of Berge, Heuer, King, and Espasa (2010) conducted a comprehensive study to clarify the roles of online instructors. The authors conducted online focus groups, surveys, and Delphi methods across ten European countries as part of the European Project (Elene-TLC). A total of 78 experts across 14 universities participated in the research. The authors concluded that successful online instructors assume roles in five domains: pedagogical, social, design/planning, management, and information and communications technology (ICT). Twenty years after Zane Berge’s foundational work, Professor Julie Taylor-Massy (2015), recipient of the 2013 Colorado State University Online Innovative Educator Award, identified the following five roles as being unique to the online instructor: e-learning designer, technology specialist, content coach, social director, and managing correspondent.
The Role of Students
Initially, research on student roles in the online classroom tended to focus more on the characteristics of successful online learners rather than on specific roles per se. For example, Boyd (2004) discussed four factors that influence student success in online and distance education – technological, environmental, personal, and learner characteristics such as learning styles, reading and writing skills, and self-direction. Similarly, in summarizing previous research on student roles, Craig, Goold, and Mustard (2008) explained that students need to be active, self-regulated, and self-directed learners. Their survey of 2711 college students at Australia’s Deakin University revealed that 80% or more of the students reported that the roles and responsibilities of students include being self-motivated, submitting work on time, submitting original work, assuming responsibility for course requirements, allotting sufficient time to complete assignments, asking for help, and exploring ideas rather than just memorizing facts. With increasing numbers of students enrolled in online classes, contemporary research has shifted from student attributes to student roles. Williams, Morgan, and Cameron (2011) explored the process of group formation in online classes. Using qualitative methods to analyze discussion logs among 127 students enrolled in six online courses, they concluded that “student roles - leader, wannabe, spoiler, agreeable enable, coat-tail, and supportive worker – evolved in the absence of assigned roles” (p. 51). The authors further suggested that there should be a balance between students creating their own roles and faculty assignment of roles. In a qualitative study at a community college, Bork and Rucks-Ahidiana (2013) conducted interviews with 47 students enrolled in a wide range of online classes. They concluded that instructors and students often identified similar role expectations but disagreed on the way to meet the expectations. For example, students and instructors agreed that motivation was an important student characteristic, but instructors purported that students should be self-motivated while students thought it was the responsibility of the instructors to provide engaging instructional materials and activities. This role ambiguity, explained the authors, can result in frustration, confusion, and tension between students and instructors who struggle to understand how roles in an online classroom vary from those in a traditional classroom. An area of increased attention related to student roles is that of student-to-student interaction. As we will elaborate upon in greater detail in subsequent sections of this chapter, student-to-student interaction has the potential to optimize student learning. The collaborative role of students within an asynchronous classroom is not without challenges. Macmillan, Forte, and Grant (2014) described student-student relationships within an asynchronous online classroom as a balancing act. The diversity of ideas, beliefs, and experiences can enrich the online learning environment, but conflicts can arise and negatively impact the learning environment. Instructors are challenged to provide safe environments as a way to prevent and respond to conflict. In contrast to interactions that may be conflictual, Chou (2012) suggested that to maximize the potential of student-student interaction, instructors must promote and guide collaborative learning, actively protecting against “lurking or social loafing” (p. 25).
Finally, one of the most challenging aspects of student-student interaction is assessment. How should student-student interaction be evaluated? Although students need multiple opportunities to interact with each other, it is the quality of interactions that impacts student learning more so than the number of times they post to a discussion thread. Central to the assessment process are clearly articulated learning objectives that define the expectations associated with student interaction. These objectives must be shared with students and supported through sound pedagogical strategies.
Although it is important to consider the roles that students assume in online classrooms, educators would be remiss to ignore completely the student attributes that are precursors for success. I. Elaine Allen and Jeff Seaman (2015), co-directors of the Babson Survey Research Group, published Grade Level: Tracking Online Education in the United States, their twelfth annual report documenting online education in the United States. In the report, 68% of academic leaders stated that they believed students in online classes need more discipline to succeed than students in face-to-face courses. Furthermore, 45% of the leaders expressed a concern about the retention of online students – a steady increase compared to previous years. These data suggest that instructors in an online class can best support the learning of their students both by nurturing students’ understandings of the skills and perspectives that are precursors to successful online learning and by using pedagogical approaches that engage students in more active than passive learning roles.
The Role of the Learning Tasks
Bloom’s Taxonomy has long been valued as a framework for considering the nature, complexity, and intended outcomes of learning tasks. Originally published in 1956 by noted learning theorists Benjamin Bloom, Max Englehart, Edward Furst, Walter Hill, and David Krathwohl, the work was officially titled Taxonomy of Educational Objectives. Generations of P-16 educators have used the original taxonomy as the basis for generating and classifying instructional objectives that ranged from memorizing factual information to more complex learning tasks such as applying concepts and rules to make evaluative judgments. A group of educational theorists collaborated between 1995 and 2000 to review and revise the original taxonomy. David Krathwohl, one of the original authors of the initial taxonomy, and Lorin Anderson, a former student of Benjamin Bloom, led this effort. Together they published A taxonomy for learning, teaching and assessing: A revision of Bloom’s Taxonomy of educational objectives (Anderson & Krathwohl, 2001). There were three striking changes in the revised taxonomy. First, the names of the categories were changed from nouns to verbs to reflect a more active approach to thinking and learning. Second, several categories were renamed. Finally, the two highest levels were reordered, making creating the highest level of learning. The revised taxonomy is illustrated in Figure 1. The foundation level of the revised taxonomy is Remembering. At this level, students recall information from long-term memory. At the Understanding level, students are able to explain and interpret information. Although Remembering and Understanding often require rote learning such as the memorization of facts, concepts, and explanations, these tasks are the foundation for more sophisticated learning and should not be dismissed as incidental. At the Applying level, students use their acquired knowledge in new ways to solve problems, make predictions, and perform tasks. Analyzing involves learning tasks in which students can identify, compare, and contrast the relationships between facets of knowledge. At the Evaluating level, students use criteria and standards to make judgments about the quality, viability, or logic of an idea or product. Finally, at the Creating level, students demonstrate learning at the most sophisticated level by generating a unique product or outcome. Anderson and Krathwohl’s revised taxonomy (2001) has been used in research about online education. For example, Chyung and Stepich (2003) used Bloom’s Taxonomy as a curriculum design tool. They described how they used Bloom’s revised taxonomy to align instructional objectives, learning tasks, and evaluation criteria in designing an asynchronous graduate-level course in instructional technology. In discussing how designing online instruction can be challenging, they concluded that Bloom’s taxonomy can be a useful tool in creating congruence in online education.
Figure 1. Bloom’s revised taxonomy of learning
Halawi, McCarthy, and Pires (2009) used the revised taxonomy as the foundation for evaluating e-learning. Contending that universities have a responsibility to evaluate the effectiveness of e-learning for purposes of program evaluation and accreditation, the authors used the taxonomy to evaluate various aspects of e-learning. Creating a survey based on the taxonomy, the authors examined the correlation between individual factors, instructional factors, and learning outcomes. They concluded that using Bloom’s taxonomy is an effective lens through which to evaluate the effectiveness of e-learning. Comer and Lenaghan (2013) described the methods and strategies they used in the asynchronous online discussions embedded in their online classes to foster higher levels of learning. Through the use of student-posted original examples and value-added comments, they described how asynchronous online discussions can promote higher-level learning. Reflecting on the role of the instructor, Comer and Lenaghan noted that too much instructor participation can impact the quality of the discussion, recognizing the importance of a more active student role. Although instructors, students, and the learning tasks play important roles in the online teaching and learning process, it is the interaction between these three roles that is of greatest importance in using technology tools to maximize student achievement. Discussion forums are not only a standard feature of all learning management systems and online courses, but also provide the most viable technology tool for role integration and interaction within the online classroom. As will be illustrated in subsequent sections of this chapter, the revised taxonomy can serve as a powerful framework for scaffolding interaction in online discussions.
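As a concrete illustration of how the revised taxonomy might scaffold discussion prompts, the following minimal sketch pairs each level with example prompt stems. It is illustrative only: the stems, the dictionary, and the function name are assumptions made for this example rather than materials drawn from the studies cited above.

# Illustrative only: example prompt stems keyed to the six levels of the revised taxonomy.
# The stems are assumptions for demonstration, not items taken from the cited research.
REVISED_TAXONOMY_PROMPT_STEMS = {
    "remembering":   ["List the key terms introduced in this week's reading.",
                      "State the main points of the video presentation."],
    "understanding": ["Explain the central concept in your own words.",
                      "Summarize the author's argument for a classmate."],
    "applying":      ["Use this week's concepts to interpret the attached case.",
                      "Predict what would happen if one condition in the example changed."],
    "analyzing":     ["Compare the two approaches and identify where they conflict."],
    "evaluating":    ["Judge which proposed solution best meets the stated criteria, and defend your choice."],
    "creating":      ["With your team, design an original plan or product that addresses the problem posed."],
}

def prompts_for(level):
    """Return example prompt stems for a given taxonomy level (case-insensitive)."""
    return REVISED_TAXONOMY_PROMPT_STEMS.get(level.lower(), [])

if __name__ == "__main__":
    for stem in prompts_for("Evaluating"):
        print(stem)

A mapping like this is only a planning aid; as the chapter argues below, it is the design of the interaction around the prompt, not the prompt alone, that determines the level of learning students reach.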
TYPES OF INTERACTION IN ONLINE DISCUSSION FORUMS
Providing for interaction or interactivity, either asynchronously or synchronously, is a major factor contributing to student success in the online learning environment (Ingerham, 2012). Indeed, active learning with others is recognized by Quality Matters as one of the quality indicators of an online learning environment. For the reader less familiar with Quality Matters, it is a nationally recognized organization in K-16 education that offers numerous educational resources for those developing an online or hybrid (i.e., blended) course. This organization also provides multiple mediums through which educators can collaborate on research and best practices. Quality Matters established a set of standards for what constitutes a quality environment for hybrid or fully online courses. Relevant to student interaction is Standard 5: “Course Activities and Learner Interaction and Engagement” (Quality Matters Program, 2014). This standard focuses on the use of interactive learning activities that provide for students to be active learners and scaffold their achievement (Quality Matters Program, 2014). Interaction in an online discussion can take on multiple forms depending on a number of factors such as the learning objectives, number of students in the course, experience of the instructor with technology and pedagogy, students’ motivation to participate, and students’ degree of familiarity or history with other students, to name a few. These factors will be explored more in-depth later in the chapter. Consider Scenarios 1-4 in Table 1. All four of the scenarios are in the context of a 16-week long fully online undergraduate course with approximately 50 students enrolled. As part of the course requirements, students read the textbook chapters and other supplementary materials on a weekly basis. The instructor develops discussion prompts related to the course readings with the goal that the students critically analyze and apply the concepts in the readings. The following three questions should be considered when reading the scenarios in Table 1:
1. In what ways does student engagement with the content differ among the scenarios?
2. In what ways does the interaction between the student and instructor differ among the scenarios?
3. In what ways does the interaction between or among students differ among the four scenarios?
The scenarios illustrate different types of student interaction defined originally by Moore (1989) as student-content, student-instructor, and student-student. The three types of interaction are defined next within the context of online discussions. All three types of interaction are found to be important for student learning (Bernard, Abrami, Borokhovski, Wade, Tamim, Surkes, & Bethel, 2009). Student-content interaction is when the student is “…intellectually interacting with the content that results in the learner’s understanding, the learner’s perspective, or the cognitive structures of the learner’s mind” (Moore, 1989, p. 2). Abrami, Bernard, Bures, Borokhovski, and Tamim (2011) described examples of student-content activities as including reading, engaging in simulations, watching videos, using multimedia or software programs, exploring resources to expand their understanding, and working on course assignments. These examples might take place before students engage in an online discussion to prepare students for discussion or to frame the discussion.
For example, students might read using guiding questions provided by an instructor or explore multiple resources to stimulate later discussion. In another example, students might watch videos illustrating a concept or practice to analyze later on the discussion board. In yet another example, students could engage in simulations to explore certain cause-effect relationships to inform the discussion. Students might also engage with the content collaboratively during discussion, targeting higher level learning objectives.
Table 1. Scenarios 1 – 4 illustrating varying degrees of asynchronous interactions using a discussion forum
Scenario 1: The students are required to post a reflection on the readings to an online discussion forum on a weekly basis. The instructor assigns participation points for those who post a reflection but does not respond to each student’s posting.
Scenario 2: The students are required to post a reflection on the readings to an online discussion forum on a weekly basis. The instructor replies with feedback to each student’s reflection.
Scenario 3: The students are randomly assigned into small groups on an online forum. The four discussion prompts focus on the students’ reflection on the concepts in the readings. Each student is required to post at least one reflection and two responses to other group members for each discussion prompt. Over the course of 16 weeks, the instructor signs on multiple times per week to review the postings and assign participation points for those who met the posting requirements.
Scenario 4: The instructor collects data on the students (major, interests, past experiences, future career goals, etc.) the first week of class to purposefully group students into “teams” (see the grouping sketch following this table). The students generate a team name within their groups. The four prompts focus on problems (e.g., problems of practice) that the teams work to solve within a limited time period. The problems require each team to apply the concepts from the readings. Each team member takes a turn leading as facilitator. The facilitator is responsible for structuring the approach the team takes to solving the problem, stimulating ongoing discussion, responding to each team member’s posting indicating when ideas corroborate or conflict within the group, scaffolding the team in reaching a consensus on their final response to the problem, and crafting the team’s final response in a medium that the team agrees on (e.g., PowerPoint, website, discussion posting, Google Doc, audio recording, video recording). All team members must evaluate whether the final product represents the team’s consensus. The instructor signs on multiple times a week to monitor each team’s discussion and addresses any major misconceptions or questions that emerge during discussion. The instructor also provides descriptive feedback on each team’s final response.
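A principal design difference between Scenario 3 and Scenario 4 is how students are placed into groups: random assignment versus purposeful grouping informed by student attributes. The following minimal sketch, offered only as an illustration and not as a procedure prescribed by the scenarios, contrasts the two approaches; the attribute used ("major"), the group size of five, and the function names are assumptions for the example.

# Illustrative sketch: random grouping (Scenario 3) vs. attribute-balanced grouping (Scenario 4).
# Group size, the "major" attribute, and the round-robin dealing are assumptions for illustration.
import random
from itertools import cycle

def random_groups(students, group_size=5, seed=None):
    """Scenario 3: shuffle the roster and split it into groups of roughly equal size."""
    rng = random.Random(seed)
    roster = students[:]
    rng.shuffle(roster)
    return [roster[i:i + group_size] for i in range(0, len(roster), group_size)]

def balanced_groups(students, key, group_size=5):
    """Scenario 4: order students by an attribute (e.g., major) and deal them out
    round-robin so each team mixes backgrounds."""
    n_groups = max(1, -(-len(students) // group_size))  # ceiling division
    groups = [[] for _ in range(n_groups)]
    for student, g in zip(sorted(students, key=key), cycle(range(n_groups))):
        groups[g].append(student)
    return groups

if __name__ == "__main__":
    roster = [{"name": f"Student {i}", "major": m}
              for i, m in enumerate(["Biology", "History", "Nursing", "Education"] * 12)]
    print(len(random_groups(roster, seed=42)), "randomly assigned groups")
    print(len(balanced_groups(roster, key=lambda s: s["major"])), "purposefully balanced teams")

Either approach yields groups of comparable size; the pedagogical difference lies in what the instructor does with the grouping afterward, which the scenarios and the remainder of the chapter describe.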
Student-content interaction occurs at different degrees across Scenarios 1 – 4 in Table 1. In Scenario 1, students engage with the content. In Scenario 2, students engage with the content and instructor. In Scenarios 3-4, students engage with the content, instructor, and other students at varying degrees. Student-content interaction is necessary for meaningful student-instructor and student-student interaction, both defined next. Student-instructor interaction is classified as a student and instructor dialoguing, such as in Scenario 2. Moore (1989) described the purpose of this interaction as to “stimulate or at least maintain the student’s interest in what is to be taught, to motivate the student to learn, to enhance and maintain the learner’s interest…” (p. 2). Student-student interaction is classified as students engaging in discussion with other students. This engagement might take place in small groups or in other forms. Scenario 3 and Scenario 4 in Table 1 illustrate varying degrees of student-student interaction. Noteworthy is that the three types of interactions do not necessarily occur in isolation. Scenario 4 illustrates all three occurring within the same discussion forum. Student-instructor interaction and instructor-student interaction could be further distinguished to reflect the direction of the interaction. The direction refers to who initiates the interaction. Student-instructor interaction occurs when a student initiates the interaction with an instructor, for instance, a student asking a question during discussion directed at the instructor. Instructor-student interaction is when an instructor initiates the interaction with students, such as by posing an initial question to students. The direction of the interaction might indicate the function of the interaction, which can be important when analyzing the nuances of interactions. Function is defined here as the purpose of the initiated interaction. Students initiate interactions for the purpose of elaborating, communicating their degree of understanding, sharing an experience or resource materials, asking for clarification or an example, extending on a previously proposed idea, or communicating their agreement or disagreement. Instructors might initiate an interaction for one or more of these same purposes.
Additional purposes for which instructors initiate interactions include probing or addressing misconceptions, scaffolding further critical thinking, and stimulating further discussion. The level of learning being targeted can drive the function of student-content, student-instructor/instructor-student, and student-student interactions. It could be argued that the level of learning shares a positive relationship not with the amount of engagement necessarily but with the level of engagement with the content and other students. If the learning objectives are lower level (remembering, understanding), then interaction will likely be at a lower level, as is appropriate. For instance, the function of student-instructor and student-student interaction might be mainly for communicating or clarifying the degree of understanding. For higher level learning objectives, students’ level of interaction with the content and other students should increase, with the function of interactions reflecting evidence of higher levels of learning. For instance, the interaction with the content requires transfer of concepts to new contexts or situations, deep analysis, and evaluation. The function of the interaction with the instructor or other students becomes, for instance, to communicate corroboration or disagreement with an idea or solution (requiring students to analyze and evaluate) or to apply concepts to collaborate in creating a solution or product (creating). The instructor’s role is to design the discussion task or context to align with the level of learning being targeted. Indeed, in their study applying Bloom’s taxonomy to classify 850 students’ responses in a discussion forum, Ertmer, Sadaf, and Ertmer (2011) found that higher level question prompts yielded higher level student responses in terms of the taxonomic levels of learning. Regardless of the level of learning, experts in online education consistently argue that interaction opportunities and active learning are major factors contributing to student success in the online learning environment (Harasim, 1989; Ingerham, 2012). As a result of this focus, a number of studies have been conducted examining how different types of interactions and pedagogical approaches to discussion forums relate to student learning outcomes. Bernard et al. (2009) conducted a review of 74 empirical studies focused on comparing the three types of interactions. Their results indicated that all three types of interaction have a positive impact on student learning. Student-student interaction yielded the largest average Hedges’ g effect size of 0.49 on student achievement across studies, compared to 0.46 for student-content interaction and 0.32 for student-instructor interaction (Bernard et al., 2009). Lou et al. (2001) also conducted a meta-analysis of research on types of interaction but focused on comparing when students work in small groups versus independently using online technology. In reviewing 122 studies, they found that students who engaged in small group learning had significantly higher achievement compared to students who engaged in independent work.
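For readers less familiar with the effect size metric reported in these meta-analyses, Hedges' g is a standardized mean difference with a small-sample correction. The conventional definition (not reproduced from Bernard et al., but the standard formulation) is:

g = J \cdot \frac{\bar{X}_{1} - \bar{X}_{2}}{s_{p}}, \qquad
s_{p} = \sqrt{\frac{(n_{1}-1)s_{1}^{2} + (n_{2}-1)s_{2}^{2}}{n_{1}+n_{2}-2}}, \qquad
J \approx 1 - \frac{3}{4(n_{1}+n_{2})-9}

Read this way, the reported averages of 0.49, 0.46, and 0.32 indicate that, across the reviewed studies, the corresponding conditions differed by roughly one-third to one-half of a pooled standard deviation on achievement outcomes.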
FORMS OF STUDENT-STUDENT INTERACTION
Despite the empirical support for the positive impact of student-student interaction on student achievement outcomes, researchers have found that students are not likely to voluntarily interact (An, Shin, & Lim, 2009). If student-student interaction is a goal, the instructor must promote and design specific requirements for this type of interaction. Promoting student-student interaction must be done purposefully and be part of the designed instruction in order to avoid what the current authors refer to as the “I agree Syndrome” in an asynchronous discussion forum. Contextual interaction and designed interaction are two forms of student-student interaction that an instructor might promote. Contextual interaction provides opportunities for students to interact but not necessarily collaborate, whereas designed interaction aims to intentionally scaffold collaboration among students (Borokhovski, Tamim, Bernard, Abrami, & Sokolovskaya, 2012).
Consider the scenarios in Table 1 again. Scenario 3 in Table 1 illustrates contextual interaction where students are required to respond to each other’s postings. Scenario 4 illustrates a task providing for students to extend their interaction by collaborating, as evidenced by the requirement to work as a team to reach a consensus and create a team product. Which of the two forms of interaction to promote depends heavily on the learning objectives. For example, if a goal is to expose students to multiple viewpoints, self-reflect, share resources with other students, or obtain feedback from other students, contextual interaction would serve the purpose. If, however, the goal is for students to be able to evaluate and create through debating viewpoints, developing a learning community, or collaborating to create a product or an artifact, then designed interaction is more appropriate. Product is used as an inclusive term and could consist of, for instance, solutions to problems or a given challenge, creation of a plan of action or response, creation of a tangible product related to the course objectives (e.g., creation of a website), or an artifact representing the group’s discussion. Despite the different purposes, designed interaction has been found to have a greater positive impact on student achievement outcomes based on a meta-analysis conducted by Borokhovski et al. (2012). Social constructivism is the underlying philosophy in designed interaction to achieve the goal of active learning among students who collaborate on a mutual task or goal. Kerr (2011) more recently described the constructivist approach as providing for students to be active learners, constructing their own knowledge through collaborative learning with other students on authentic tasks. Collaborative learning is defined as when students “negotiate meaning with other learners” (Kerr, 2011, p. 230), which is key in designed interaction. Methods to employ and conditions that are necessary to provide for student-student interaction under this constructivist framework are documented across the literature on online learning. Rovai (2004) outlined several key conditions for instructors to consider in applying designed interaction in practice to scaffold student-student interactions. Borokhovski et al. (2012) further summarized what they found to be the key conditions for providing “facilitative collaboration” (p. 320) based on their meta-analysis comparing studies classified as contextual interaction with those classified as designed interaction.
Table 2. Strategies for implementing designed interaction in asynchronous online forum contexts (strategy followed by source(s))
✓ Develop roles for students to specify expectations for collaborative interaction (Borokhovski et al., 2012)
✓ Establish and enforce rules and procedures for exchanges among students (Borokhovski et al., 2012)
✓ Require peer monitoring of interactions through descriptive and timely feedback to other peers (Borokhovski et al., 2012)
✓ Instructor monitoring of interactions through descriptive and timely feedback to students (Borokhovski et al., 2012)
✓ Problem-based learning approach to the collaborative learning task using authentic tasks (Bernard et al., 2000; Rovai, 2004)
✓ Educate students on how to collaborate through use of multimedia vignettes or other methods (Abrami et al., 2011; Lou, 2001)
✓ Small group sizes (Lou, 2001)
✓ Provide for individual accountability (i.e., individual expectations/requirements) and collaborative accountability towards the group’s goal (Abrami et al., 2011)
✓ Structure around a task that is meaningful to students based on their background and interests (Rovai, 2004)
✓ Require consistent interaction (Rovai, 2004)
Lou (2001) documented specific conditions (e.g., providing specific guidelines to the students on how to collaborate) that resulted in enhanced student-student interactions when working in small groups. A list of the key qualities identified across the literature for providing a high level of student-student interaction is summarized in Table 2. A key quality noted by Rovai (2004) in the case of group discussions was to provide for students to solve problems collaboratively to promote an increased dependence on peers compared to the instructor. In this sense, “knowledge is constructed by the individual through his or her interactions” (Rovai, 2004, p. 80). The instructor plays more of a supportive role similar to what Dewey (1916) originally conceptualized as a necessary condition for active learning to occur. These qualities echo earlier descriptions by Vygotsky (1981), Bruner (1966), and Jonassen (1994) of the need for reciprocity in the learning process. Jonassen described this as being created through a “Focus on knowledge construction, not reproduction… Present authentic tasks… [that] provide real world case-based learning environments…support collaborative construction of knowledge through social negotiation, not competition among learners…” (p. 35). The latter quality of “social negotiation” is central to the designed interaction approach.
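To make a few of the Table 2 strategies concrete (assigned roles, small group sizes, and individual accountability), the following minimal sketch models a discussion group with a rotating facilitator and a per-student posting requirement. The class design, the group size of 4-6, and the posting threshold are illustrative assumptions made here, not specifications drawn from the cited studies.

# Illustrative sketch of three Table 2 strategies: assigned roles, small groups, and
# individual accountability. Group size and posting minimums are assumed values.
from dataclasses import dataclass, field

@dataclass
class DiscussionGroup:
    members: list                               # student names, ideally 4-6 per group
    posts: dict = field(default_factory=dict)   # name -> number of substantive posts

    def facilitator_for(self, prompt_number):
        """Rotate the facilitator role so each member leads at least one prompt."""
        return self.members[prompt_number % len(self.members)]

    def record_post(self, name):
        self.posts[name] = self.posts.get(name, 0) + 1

    def meets_individual_accountability(self, minimum_posts=3):
        """Every member must meet the posting minimum, not just the group as a whole."""
        return all(self.posts.get(m, 0) >= minimum_posts for m in self.members)

if __name__ == "__main__":
    team = DiscussionGroup(members=["Ana", "Ben", "Chi", "Dee", "Eli"])
    print("Prompt 2 facilitator:", team.facilitator_for(2))
    for _ in range(3):
        team.record_post("Ana")
    print("All members accountable?", team.meets_individual_accountability())

Tracking of this kind only supports the conditions in Table 2; the quality of the collaboration still depends on the authentic task, the feedback, and the social negotiation described above.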
USING LEVELS OF LEARNING TO GUIDE INTERACTION
As discussed throughout the chapter, the learning objectives should guide the type of interaction and the level of student-content, student-instructor/instructor-student, and student-student interaction. Figure 2 below is a model for instructors on how the level of learning can guide the three types of interactions, separating out the student-instructor and instructor-student interaction functions. This model was informed by the research and theory reviewed in the chapter. As illustrated in Figure 2, as the level of learning increases, the function of each type of interaction changes form. Also, as the level of learning increases, a more designed interaction approach to online discussion is recommended. Further, as the level of learning increases, student-student interaction increases, consistent with a social constructivist learning model.
As illustrated in the left column of Figure 2, when the objective of the learning tasks primarily requires remembering and understanding, interaction is more contextual in nature and student-student interaction is limited. For example, students might be required to complete an online multiple-choice quiz to check their knowledge of basic concepts, repeating the quiz multiple times until a certain threshold score is achieved that signifies mastery of the subject matter (student-content interaction). Or, the instructor might post a series of discussion prompts at the beginning of the instructional time frame that require students to summarize reading assignments or the main points of a video presentation either in writing or verbally using audio or video technology. Students may be required to comment on the posts of their classmates, but responses may require little if any interaction if students’ replies are brief statements of agreement. Although questions posed to classmates may have the potential for student-student interaction, if the questions posed require closed-ended, surface-level responses or remain unanswered, student-student interaction is not authentic. In addition, embedded within the course could be a discussion thread or section on a blog site where students post general questions about course assignments or pose questions to the instructor asking for explanation or clarification of course concepts (student-instructor interaction and instructor-student interaction). Although there is a degree of interaction between students and the instructor, the interaction is limited to lower level learning tasks associated with remembering and understanding factual information. Interaction between students, if any, is limited to static tasks such as reading discussion or blog posts or replying to classmates with generally passive responses.
Figure 2. Using levels of learning to guide forms of interactions
In the middle column of Figure 2, learning tasks become more complex as interactions require applying and analyzing cognitive processes. Students’ interactions with content extend beyond rote learning. Instead, as interaction moves from contextual toward designed, students are provided opportunities to apply content to investigate and solve problems. The instructor begins to assume the role of group facilitator, presenting students with learning tasks that are more authentic and require more complex thinking. For example, imagine a course in program evaluation where students are learning foundational evaluation theories and approaches. The students are presented with a request from a local organization for an evaluation plan of an educational program. Each team is tasked with applying their knowledge of evaluation theory and approaches to develop an evaluation plan, including identification of the evaluation theory applied and the approach implemented, as well as generating specific evaluation questions. Each team might also be required to interact with a representative from the local organization who requested the plan. Or, as a second example, students in a science class could work in research teams to apply the scientific method to a contemporary problem. For example, the use and effectiveness of antibacterial products is a general topic of interest to scientists across multiple fields.
collaborate to identify research questions and pose hypotheses. They could design a research study, collect and analyze data, and create a comprehensive database that can be shared with other students and researchers beyond their classroom. In each of these examples, the instructor might create discussion groups of 4 to 6 students to enhance interaction, encourage students to take a more active role, and promote student-student interaction. The quality and depth of interaction within the small group can vary, however, depending on the learning task. Without a degree of purposeful, designed interaction, small group discussion may merely replicate a low-interaction situation because the student roles and learning tasks have not been transformed from those of the low-interaction situation. As illustrated in the right column of Figure 2, learning tasks increase to requiring students to evaluate and create. A designed interaction approach is implemented, requiring collaboration among students. The content focuses on students transferring their understanding of the concepts and their skills to authentic problem-based tasks. The instructor structures the discussion prompt and context using the strategies listed in Table 2. Students might be grouped into teams consisting of 4 to 6 students, requiring a higher level of collaboration than when targeting mid-level learning objectives. Here the discussion context becomes such that the instructor focuses on scaffolding the collaboration and the students take the lead in actively working together to solve a problem of practice or create a product. Students assume leadership roles, work collaboratively, and depend upon each other to reach consensus as they create a unique solution to the problem posed or the product requested. Each student is responsible for individual contributions, as well as for working with the team to reach a consensus on the final product to be created. This highest level is similar to that illustrated in scenario 4 in Table 1. It is recommended that at this highest level, the instructor purposefully delay instructor-student interaction on the discussion forum. If an instructor tends to take the lead and respond to every post, then the opportunities for students to collaborate can be limited. Purposefully delaying instructor-student interaction means waiting to provide descriptive feedback until all team members have voiced their contributions, or waiting to provide holistic feedback on the process and final product. This delayed interaction is not to say the instructor does not need to monitor the discussion. Instructors need to monitor the ongoing discussion to assess student learning and the effectiveness of the learning task in facilitating the level of learning targeted. An instructor can indicate his or her presence while still purposefully delaying descriptive feedback by using general responses that motivate students to continue the conversation (e.g., "Keep up the collaboration!" "The discussion is on task!") or inform students whether they are generally on the right track (e.g., "Your team's discussion is on the right track"). At times, a response re-directing the group to stay on task might be warranted if the facilitator is not fulfilling his or her role (e.g., "Although this discussion is interesting, let's reflect back on the task at hand to X." "How can the team elaborate on …?" "What is the team's consensus on …?").
One exception to delaying interaction is when students pose a specific question to the instructor that needs to be addressed for their learning or collaboration to progress. Another exception is when a major misconception emerges that needs to be addressed immediately before it becomes deeply embedded. Responding to questions directed at the instructor at this level involves guiding how the group might work together or answering the question with a question to prompt further thinking among the group members.
CHALLENGES AND RECOMMENDATIONS

Several challenges exist to implementing designed interaction reflecting a high level of student-student interaction in the online environment, including, but not limited to, class size, the time involved in assessing student learning, student motivation, and student experience with collaborating using technology. Next is an overview of these challenges and potential strategies for addressing each. Arguably one of the biggest challenges to successful implementation of designed interaction is class size. Large online classes can be as problematic as large face-to-face classes that are held in lecture auditoriums where instructors might literally face their classes from a stage-like structure. Although best-practice research suggests that the size of online classes should be smaller than face-to-face classes, this practice is not always implemented in learning environments where larger class size results in additional tuition dollars without the need for a large physical classroom and sufficient physical seating. Still, student-student interaction can somewhat easily be promoted by organizing students into smaller discussion groups. The process for creating small groups is a standard feature of learning management systems and is essentially an administrative task. Once students are assigned to small groups, instructors can likely replicate learning tasks, initial written directions, and discussion prompts across groups; thus, multiple small groups per se may not directly impact instructional time. Smaller groups within the larger class can also be an effective strategy in courses in which students represent a wide range of academic programs. For example, students from a wide range of majors are often required to take foundation classes in disciplines such as sociology, psychology, business, and the sciences. In survey courses such as these, students may not initially appreciate the importance and relevance of the course to their educational and professional goals. Instructors in courses such as these are often challenged to respond to students who query, "Why do I need to know this?" Without perceived relevance and application, student-student interaction may be more challenging to stimulate. Small group discussion can be particularly effective in general courses like these. One strategy is to group students according to their program of study. Instead of replicating the same learning task across small groups, unique learning and discussion tasks can be developed that apply to specific programs. For example, a learning or discussion task that requires students to apply new concepts and reach consensus to solve a problem can be framed within program-specific contexts. Initially this will require additional instructional planning to create unique applications, but once created, the unique applications can be replicated in subsequent course offerings. Another option is to create diverse groups where group members purposively represent diverse academic perspectives. An explicit instructional task could require students to identify ways in which course content applies across disciplines. Designed student-student interaction on the multi-disciplinary nature of the course content would engage students in analyzing and evaluating. Like all quality instruction, designed interaction also requires that instructors align assessments with learning objectives.
In contrast to student-content contextual interaction, assessing the more complex learning objectives that are the focus of student-student interaction often requires more substantive ways to provide students with corrective feedback. Instructors can facilitate the assessment process in two ways. First, instructors can create scoring rubrics within the learning management system. Well-designed scoring rubrics that align with the learning objectives at hand are an efficient and effective way to assess both the process of interaction and the products that might be the outcome. Second, due to their familiarity with common errors and misunderstandings, instructors can develop a repository of written comments that
can easily be added as comments within the rubric as is, or can be adapted to provide more personalized and specific feedback (a brief illustrative sketch of such a repository appears at the end of this section). Lack of student motivation for participating in the discussion can hinder the effective implementation of designed interaction and ultimately the level of student-student interaction and student learning. In designing the discussion prompt, it is essential to specify individual and group expectations that are weighted toward the students' grade. As noted earlier in the chapter, students are not likely to interact voluntarily (An et al., 2009). The assessment of students' discussion postings too often focuses on the number of postings as a quality indicator rather than on the content of the discussion and collaboration. It is imperative that the expectations for the required individual and group contributions are clear on the rubric. Also, providing tasks that are closely related to students' interests and field of study will likely increase their perception of the relevancy and value of the discussion task. Further, providing students with choice in their discussion medium and in how they express the final product might increase their motivation to engage. For example, students might be allowed to conduct their discussion using a platform of their choice and to present the final product as a verbal, visual, or written representation, so long as it aligns with the learning task. At times, however, the learning tasks might require a specific product (e.g., students collaborating on developing a website). Students' experience collaborating using technology is the final challenge discussed here. Depending on the course, students might have highly varying degrees of experience collaborating with others in an online environment and using technology. A documented method in the literature is to educate students on what effective collaboration looks like in the context of a discussion forum (Abrami et al., 2011; Lou, 2001). This education could be done by specifying on the rubric the criteria that define collaboration and providing illustrative vignettes using video. Also, students can be purposefully grouped such that students with more experience in collaboration, or in using technology as a medium for collaboration, can take the lead as facilitator of the team for the first discussion so as to provide an example to other students. A pre-assessment at the beginning of the semester can be administered to identify these students to guide purposeful grouping. A further strategy for students with varying degrees of experience using technology, or collaborating using technology, is to consider a variety of response mediums and tools that preserve the learning objectives. For instance, current platforms such as Blackboard and Desire2Learn provide tools for collaborating using audio or video in an asynchronous or synchronous forum. In addition, blogs can be created using platforms such as Wiki or Yellowdig to facilitate discussion forums as an alternative to an institution's learning management system, which might be more convoluted to learn. These alternative platforms can also preserve the discussions beyond the immediate semester and offer an efficient option for interaction across sections. Again, a pre-assessment can be administered to determine which platforms or tools students might be familiar with and comfortable using. It is imperative that the platform and response medium preserve the learning objective.
For example, if a component of the learning objective for the discussion relates to communicating in writing, then audio and video response mediums would not be aligned with that objective.
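To make the comment repository discussed above more concrete, the following minimal sketch (written in Python) shows one way reusable rubric comments might be organized by criterion and performance level and lightly personalized before being added to an LMS rubric. The criterion names, performance levels, and comments are hypothetical placeholders rather than features of any particular learning management system.

# A minimal sketch of a rubric comment repository, assuming hypothetical
# criterion names and performance levels; stock comments can be reused as is
# or personalized before being added to the LMS rubric.

COMMENT_BANK = {
    ("collaboration", "developing"): (
        "Your replies restate classmates' points; try extending an idea or "
        "posing a follow-up question that moves the team toward consensus."
    ),
    ("collaboration", "proficient"): (
        "You built on teammates' contributions and helped the group reach "
        "consensus on the final product."
    ),
    ("use_of_evidence", "developing"): (
        "Claims in your posts need support; cite course readings or data the "
        "team collected."
    ),
}

def feedback(criterion: str, level: str, personal_note: str = "") -> str:
    """Return a stock comment for a criterion/level, optionally personalized."""
    stock = COMMENT_BANK.get(
        (criterion, level),
        "No stock comment exists yet for this criterion and level.",
    )
    return f"{stock} {personal_note}".strip()

print(feedback("collaboration", "developing",
               "Nice start on the evaluation questions."))

Whether such a repository lives in a short script like this, in a spreadsheet, or in the LMS rubric tool itself matters less than keeping the comments aligned with the learning objectives being assessed.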
SUMMARY

In this chapter, a review of the literature on the roles that instructors, students, and the learning tasks play in the teaching and learning process was presented. How these roles interact in an online discussion forum context was discussed. Contextual interaction and designed interaction were introduced as two ways to promote student-student interaction, with a focus on the characteristics of applying designed interaction. The guiding model for planning asynchronous student-student interaction was outlined, which can be adapted for the multiple learning platforms used to facilitate online discussions. Finally, challenges faced by instructors and possible pedagogical solutions to those challenges were outlined.
REFERENCES Abrami, P. C., Bernard, R. M., Bures, E., Borokhovski, E., & Tamim, R. M. (2011). Interaction in distance education and online learning: Using evidence and theory to improve practice. Journal of Computing in Higher Education, 23(2-3), 82–103. doi:10.1007/s12528-011-9043-x Allen, I. E., & Seaman, J. (2015). Grade level: Tracking online education in the United States. Babson Survey Group. Retrieved from http://www.onlinelearningsurvey.com/reports/gradelevel.pdf An, H., Shin, S., & Lim, K. (2009). The effects of different instructor facilitation approaches on students interactions during asynchronous online discussions. Computers & Education, 53(3), 749–760. doi:10.1016/j.compedu.2009.04.015 Anderson, L. W., & Krathwohl, D. R. (Eds.). (2001). A taxonomy for learning, teaching and assessing: A revision of Bloom’s Taxonomy of educational objectives: Complete edition. New York, NY: Longman. Berge, Z. L. (1995). Facilitating computer conferencing: Recommendations from the field. Educational Technology, 35(1), 22–30. Bernard, R. M., Abrami, P. C., Borokhovski, E., Wade, C. A., Tamim, R. M., Surkes, M. A., & Bethel, E. C. (2009). A meta-analysis of three interaction treatments in distance education. Review of Educational Research, 79(3), 1243–1289. doi:10.3102/0034654309333844 Bloom, B., Englehart, M., Furst, E., Hill, W., & Krathwohl, D. (1956). Taxonomy of educational objectives: The classification of educational goals. Handbook I: Cognitive domain. New York, Toronto: Longmans, Green. Bork, R. H., & Rucks-Ahidiana, Z. (2013). Role ambiguity in online courses: An analysis of student and instructor expectations. Community College Research Center, Teachers College, Columbia University. Retrieved from http://ccrc.tc.columbia.edu/publications/role-ambiguity-in-online-courses.html Borokhovski, E., Tamim, R., Bernard, R. M., Abrami, P. C., & Sokolovskaya, A. (2012). Are contextual and designed student-student interaction treatments equally effective in distance education? Distance Education, 33(3), 311–329. doi:10.1080/01587919.2012.723162 Boyd, D. (2004). The characteristics of successful online students. New Horizons in Adult Education, 2(18), 31–39. doi:10.1002/nha3.10184 Bruner, J. (1966). Toward a theory of instruction. Cambridge, MA: Harvard University Press. Chou, P. (2012). Teaching strategies in online discussion board: A framework in higher education. Higher Education Studies, 2(2), 25–30. doi:10.5539/hes.v2n2p25
Chyung, S., & Stepich, D. (2003). Applying the “Congruence” Principle of Bloom’s Taxonomy to designing online instruction. Quarterly Review of Distance Education, 4(3), 317–330. Comer, D. R., & Lenaghan, J. A. (2013). Enhancing discussions in the asynchronous online classroom: The lack of face-to-face interaction does not lessen the lesson. Journal of Management Education, 37(2), 261–294. doi:10.1177/1052562912442384 Craig, A., Goold, A., Coldwell, J., & Mustard, J. (2008). Perceptions of roles and responsibilities in online learning: A case study. Interdisciplinary Journal of E-Learning and Learning Objects, 4, 205–223. Dewey, J. (1916). Democracy and Education: An introduction to the philosophy of education. New York, NY: MacMillan Company. Ertmer, P. A., Sadaf, A., & Ertmer, D. J. (2011). Student-content interactions in online courses: The role of question prompts in facilitating higher-level engagement with course content. Journal of Computing in Higher Education, 23(2-3), 157–186. doi:10.1007/s12528-011-9047-6 Goodyear, P., Salmon, G., Spector, J. M., Steeples, C., & Tickner, S. (2001). Competences for online teaching: A special report. Educational Technology Research and Development, 49(1), 65–72. doi:10.1007/ BF02504508 Halawi, L. A., McCarthy, R., & Pires, S. (2009). An evaluation of E-learning on the basis of Blooms Taxonomy: An exploratory study. Journal of Education for Business, 84(6), 374–380. doi:10.3200/ JOEB.84.6.374-380 Harasim, L. (1989). On-line education: A new domain. In R. Mason & A. Kaye (Eds.), Mindweave: Communication, computers, and distance education (pp. 50–62). Oxford: Pergamon Press. Heejung, A., Sunghee, S., & Keol, L. (2009). The effects of different instructor facilitation approaches on students interactions during asynchronous online discussions. Computers & Education, 53(3), 749–760. doi:10.1016/j.compedu.2009.04.015 Heuer, B. P., & King, K. P. (2004). Leading the band: The role of the instructor in online learning for educators. Journal of Interactive Online Learning, 3(1), 1–11. Ingerham, L. (2012). Interactivity in the online learning environment: A study of users of the North Carolina Virtual Public School. The Quarterly Review of Distance Education, 13(2), 65–75. Jonassen, D. H. (1994). Computers in schools: Mindtools for critical thinking. University Park, PA: Pennsylvania State University Press. Kerr, S. (2011). High school online. Pedagogy, preferences, and practices of three online teachers. Journal of Educational Technology Systems, 39(3), 221–244. doi:10.2190/ET.39.3.b King, A. (1993). From sage on the stage to guide on the side. College Teaching, 41(1), 30–35. doi:10. 1080/87567555.1993.9926781 Ko, S., & Rossen, S. (2010). Teaching online: A practical guide (3rd ed.). New York, NY: Routledge. Liu, X., Lee, S., Bonk, C. J., Bude, C., & Magjuka, R. J. (2005). Exploring four dimensions of online instructor roles: A program level case study. Journal of Asynchronous Communication, 9(4), 29–48.
Lou, Y., Abrami, P. C., & d'Appolonia, S. (2001). Small group and individual learning with technology: A meta-analysis. Review of Educational Research, 71(3), 449–521. doi:10.3102/00346543071003449 MacMillan, T., Forte, M., & Grant, C. (2014). Thematic Analysis of the "Games" Students Play in Asynchronous Learning Environments. Journal of Asynchronous Learning Networks, 18(1). Retrieved from http://olc.onlinelearningconsortium.org/publications/olj_main Moore, M. G. (1998). Three types of interaction. American Journal of Distance Education, 3(2), 1–6. doi:10.1080/08923648909526659 Quality Matters Program. (2014). Quality Matters rubric standards 2014 fifth edition with assigned point values. Retrieved from https://www.qualitymatters.org/rubric Rovai, A. P. (2004). A constructivist approach to online college learning. The Internet and Higher Education, 7(2), 79–93. doi:10.1016/j.iheduc.2003.10.002 Taylor-Massey, J. (2015). Redefining teaching: The five roles of the online instructor. Retrieved from http://blog.online.colostate.edu/blog/online-teaching/redefining-teaching-the-five-roles-of-the-onlineinstructor/ Vygotsky, L. S. (1981). The genesis of higher mental functions. In J. V. Wertsch (Ed.), The concepts of activity in Soviet psychology. Armonk, NY: Sharpe. Williams, K. C., Morgan, K., & Cameron, B. A. (2011). How do students define their roles and responsibilities in online learning group projects? Distance Education, 32(1), 49–62. doi:10.1080/01587919.2011.565498
KEY TERMS AND DEFINITIONS

Asynchronous Discussion: Students and/or the instructor are not necessarily engaging in a discussion in real time.
Bloom's Revised Taxonomy: A hierarchical model of learning moving from lower levels of learning to higher levels of learning. The levels of learning are remembering, understanding, applying, analyzing, evaluating, and creating. This model is often used to classify and evaluate levels of learning outcomes.
Contextual Interaction: A form of student-student interaction where the instructor provides students opportunities to interact but not necessarily collaborate.
Designed Interaction: A form of student-student interaction where the instructor provides students opportunities to actively interact through collaborating, typically in small groups, on a problem-based authentic learning task.
Instructor-Student Interaction: Instructor and student dialogue with the instructor initiating the dialogue.
Student-Content Interaction: Students interact with the course content through engaging in such learning activities as reading, watching videos, using software programs, participating in simulations, exploring resources, and working on course assignments.
Student-Instructor Interaction: Student and instructor dialogue with the student initiating the dialogue.
Student-Student Interaction: Students dialogue with other students.
Chapter 13
Best Teaching and Technology Practices for the Hybrid Flipped College Classroom Lori Ogden West Virginia University, USA Neal Shambaugh West Virginia University, USA
ABSTRACT

Two cases of the flipped classroom approach, one an undergraduate course and one a graduate course, are used to demonstrate the different ways that flipping instruction can occur in both F2F and online courses, thus extending the notion of hybrid and flipped teaching decisions to F2F and virtual classrooms. Both cases are summarized in terms of instructional design decisions, the models of teaching framework, and research conducted on the courses. Findings from research conducted on both courses indicate that a flipped classroom approach can enhance the teaching of both F2F and online courses, as it provides instructors an opportunity to adapt instruction to meet the individual needs of students. Recommendations, based on this course development work, are provided for undergraduate and graduate courses in terms of access, meaningful activities, and feedback.
INTRODUCTION

A flipped classroom teaching model promotes active learning in the classroom, provides more in-class time for student-centered activities, and cultivates confidence in students. Direct instruction is moved from the classroom to online delivery, usually through video, while classroom activity focuses on whole-class application of knowledge and skills and instructor attention to student needs. Cited benefits of the flipped classroom approach include increased student achievement, increased student engagement, and improved attitudes toward learning (Hamdan et al., 2013).
DOI: 10.4018/978-1-5225-1851-8.ch013
Two cases are discussed, one undergraduate and one graduate. A face-to-face (F2F) undergraduate course provides an example of a hybrid flipped classroom. Direct instruction was divided between lectures delivered in the classroom and lectures delivered via video that were assigned as homework. By moving half of the lectures outside of the classroom, half of the F2F sessions were focused on student questions, problem solving, and cooperative learning activities. Hybrid instruction is typically understood as a mix of face-to-face and online instruction. Online activity may include a mix of synchronous and asynchronous activities, with instruction being delivered through real-time chats and discussion rooms. The second case, an online graduate course, provides an example of a virtual flipped classroom. Instruction is delivered asynchronously as audio and video clips within a set of modules, while feedback by the instructor and between peers is conducted in both real-time discussions and instructor-produced video assessment of whole-class performance. Hybrid learning can be better understood not only as a mix of F2F and online deliveries but "as a fundamental redesign of the instructional model characterized by increases in interaction between student-instructor, student-student, student-content and student-outside resources" (Dziuban et al., 2004, p. 3). Section one reports on the background and use of the college flipped classroom. Section two documents the two cases, describing course features using two perspectives: instructional design decisions (Shambaugh & Magliaro, 1997) and the models of teaching framework (Joyce, Weil, & Calhoun, 2014). Section three identifies best teaching and technology practices based on the research conducted on these two college courses.
USE OF THE FLIPPED CLASSROOM

Pedagogical Trends Leading to Flipped Classroom

Instructional video was initially used in the 1960s; educators have returned to it through digital formats in F2F classrooms (i.e., movie clips, digital whiteboards) and hybrid classrooms, where video is viewed in both real-time and asynchronous activities. Concerns with passive learning led to more active approaches and an increase in instructor feedback as needed by each student (Ent, 2016). The research for the flipped classroom can be grounded in the work of Bransford, Brown, and Cocking (2000), who reported that students need factual knowledge, conceptual understanding of how this knowledge is structured, and activities which help the student to retrieve and apply this knowledge. In terms of learning mathematics, one of the content areas in this chapter, the literature recommends that educators promote mastery learning and conceptual understanding (Ames & Archer, 1988), model the use of learning strategies (Zimmerman, 1990), and convey the value and utility of mathematics (Eccles et al., 1983). Many universities have integrated technology into their mathematics courses in an effort to combat low levels of motivation and to bolster student mastery of mathematical procedures and conceptual understanding of math concepts. Rakes, Valentine, McGatha, and Ronau (2010) completed a meta-analysis of literature regarding the instruction of algebra and concluded that students perform better when instruction incorporates the use of technology and manipulatives to foster conceptual understanding rather than procedural skills.
Features of Flipped Classrooms Bishop and Verleger (2013) defined the flipped classroom as a pedagogical technique consisting of both interactive group learning activities inside the classroom and computer-based individual instruction outside the classroom. Streaming digital video is cited as the major online activity which characterizes the flipped approach (TED, 2011). The video may be instructor-produced or available from outside sources. Video introduces students to conceptual and procedural skills and prepares students for in-class activity (Brame, 2013).
Factors that Promote or Hinder the Use of Flipped Classrooms for Math Learning

The Flipped Learning Network (2014) suggests that a flipped classroom features a flexible environment, a change in the learning culture from passive to active learning, intentional content in both online and F2F activities, and the need for a professional educator. A major challenge to trying out a flipped approach is the motivation to develop a new approach and learn media production skills (Salifu, 2016). Such motivation is influenced by attitudes towards change and technology, but also by the pragmatic priorities of faculty in academic settings and the pressures of time (Wells & Holland, 2016). Supporting the above features, adequate technology needs to be in place. The key online technology system for the college course is the learning management system (LMS). The LMS houses the course structure and materials and links to the videos, as well as an assessment system to document student time and performance. Video production is another key factor, including the need for short media lengths, visual appeal, incorporation of instructor audio, a mix of text and visuals (Mayer, 2009), and the quality of engagement. Quality video production requires a set of skills provided by professional development activities for faculty. A pragmatic question from educators is determining whether students have watched the videos and what they may have learned or not learned. Assessment, such as short quizzes or worksheets, can be used to determine what students are learning with video (Frydenberg, 2013). Students may be new to online instruction, so instructors of foundational or introductory-level courses may need to determine student experience with online delivery as well as their attitudes, knowledge, and skills in a content area. With a flipped classroom, both the students and the instructor become more active than in the past. Instructor activity within the classroom must tap multiple approaches as opposed to a singular model (Horan, Hersi, & Kelsall, 2016), such as group discussion, mini-lectures for review, and student questioning.
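The question of whether students watched an assigned video, and what they learned from it, can often be approximated from data most LMSs already log, such as minutes viewed and a short pre-class quiz score. The sketch below (in Python) is illustrative only; the record format, field names, and thresholds are assumptions for the example and do not reflect the export format of any specific LMS.

# Illustrative check of video-lecture preparation, assuming hypothetical
# LMS-exported records with minutes viewed and a pre-class quiz score.

from dataclasses import dataclass

@dataclass
class PrepRecord:
    student: str
    minutes_viewed: float  # time logged on the assigned video
    quiz_score: float      # pre-class quiz score, 0-100

def flag_unprepared(records, min_minutes=8.0, min_score=70.0):
    """Return students who neither watched most of the video nor passed the quiz."""
    return [r.student for r in records
            if r.minutes_viewed < min_minutes and r.quiz_score < min_score]

records = [
    PrepRecord("Student A", 10.5, 85.0),
    PrepRecord("Student B", 2.0, 40.0),
]
print(flag_unprepared(records))  # ['Student B']

In practice the thresholds would be set by the instructor, and a flag of this kind would simply prompt a follow-up such as the short quizzes or worksheets noted above, not a grade penalty.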
Outcomes of Flipped Classrooms

Bishop and Verleger (2013) completed a survey of 24 studies on flipped classroom teaching techniques, all of which included video. In general, students liked the flipped classroom and watched videos when they were assigned, although some students disliked the approach. Optional video assignments, compared to textbook assignments, better prepared students for class, suggesting that students are more apt to watch a video to prepare for class than to read the textbook. Students also reported a preference for in-class lectures but valued the interactive class time to address their issues. Instructors at one university assigned pre-existing video lectures for homework in an engineering course and replaced class lectures with discussions and activities. Students in the flipped classroom
outperformed their traditionally taught peers on the midterm exam (Azedevo, 2012). Another variation of the flipped classroom was piloted in a junior level engineering course at another university. Unlike the previous approach, instructors authored their own videos. Although a similar classroom format was followed, feedback reported that students felt the video lectures were effective in teaching them the material, but that the classroom activities were disorganized and caused some students to fall off task (Toto & Nguyen, 2009). Student feedback suggested that implementing video lectures outside of class is not enough to impact student learning. What replaces the lecture during F2F class time is integral to the success of the flipped classroom. Ogden and Shambaugh (2016) reported that undergraduate students in a college algebra course did not like teaching themselves with videos. While liking the online videos, they cited the attentiveness of the instructor and commitment to student learning. Students’ perceptions of online versus in-class instruction may not differ. From a survey of pre-class online activities in an undergraduate science course, no statistically significant differences were found on student attitudes and preferences across class levels, major fields, and previous experiences using video (Long, Logan, & Waugh, 2016). These perception differences may, however, vary depending on content areas (sciences, liberal arts) and class level (introductory, advanced courses). Generalizing research results is difficult owing to course features and context. Research on the flipped classroom lacks results on student performance, while one documented success is improved student motivation (Fawley, 2014).
The College Flipped Classroom: Two Cases Course design decisions are summarized in terms of instructional design, including learning context, learner differences, instructional sequence, teaching and assessment, and flipped classroom features. Instructional design (ID) provides a systematic process that “assists designers in understanding related instructional variables and/or guides them through the process of analyzing, designing, developing, implementing, and evaluating instructional products” (Lee & Jang, 2014, p. 744). Although both cases were existing courses, the researchers/instructors had the liberty to design and implement instruction according to what they determined to be best for their students. The pedagogical features of both courses are then represented using a second approach, the models of teaching framework (Joyce, Weil, & Calhoun, 2014), which reports syntax (procedures), social system, conditions for use, and the direct and indirect effects. Research on both courses is then documented.
Instructional Design Decisions

Undergraduate Course: College Algebra

A F2F undergraduate college algebra course used a flipped classroom approach to provide online video instruction on algebra concepts and skills, while the F2F classroom meetings addressed student questions and provided personal attention.
• Learning Context: The teaching approach was developed within existing reform efforts, which included interactive laboratories that use technology and student activities to emphasize writing and collaboration. Although reform efforts have covered a decade, the rate at which students are withdrawing or earning grades of D or F (the DFW rate, the percentage of students withdrawing or earning grades of a "D" or "F" in a course) in college algebra has continued to be problematic. The charge to improve DFW rates has led to the use of online video lectures for students to view outside of class so that F2F class time is available for more engaging learning activities.
• Learning Outcomes: Explain and apply the concept of a function; solve mathematical application problems; solve equations and inequalities using multiple representations; graph functions and relate graphical features to algebraic and numeric features; use and compare algebraic, graphical, and numerical approaches to solve problems involving lines, parabolas, circles, systems of equations, and matrices.
• Learner Differences: Undergraduate students with a mix of majors, as college algebra is required for most programs of study. Students take a placement exam to establish placement; a score of ten or eleven out of 25 is required for placement into the course.
• Instructional Sequence: Five units, including pre-college algebra review, solving equations and inequalities, functions/graphs, rational functions, and logarithmic and exponential functions.
• Teaching and Assessment: Each unit is taught using online video lecture, F2F lecture, cooperative assignments, online homework, and question/answer sessions. Once a unit is completed, a unit exam is administered. A comprehensive final exam is administered at the end of the course.
• Flipped Approach: In-class activities feature cooperative learning, mentored laboratory sessions, and proctored examinations. Video lectures are viewed outside of class.
Graduate Course: Instructional Design

The second setting, an online graduate instructional design (ID) course, used online videos within modules for content and practice. Synchronous F2F chat sessions revisited key ideas and answered questions on students' design projects. This approach to "flipping" the online delivery, as opposed to using synchronous video to deliver instruction, has been used since 2010, while the course has been taught since 1994 (Shambaugh, 2007).
• Learning Outcomes: Examining learning beliefs, understanding the ID process, and learning from others.
• Learner Differences: Master's and doctoral level students with a mix of teachers, trainers, and university support staff. A course in educational psychology is usually taken concurrently.
• Instructional Sequence: The course is organized by seven phases of ID, including learner beliefs, ID models, needs assessment, instructional sequence, teaching and assessment, prototype, and program evaluation.
• Instructional Sequence: The online course, posted on an LMS, includes seven modules that address the seven ID phases. Within each module the following activities are used: audio-narrated module screens, scenarios to prompt student thinking about each ID phase (see Shambaugh, 2009), a "quick think" to apply key ideas, and a design activity.
• Teaching and Assessment: Each module requires scenario postings, "quick think" responses, and design activity submissions. Student groups provide peer feedback on scenarios, quick thinks, and design activities. The instructor provides individual feedback on design activity submissions. All feedback is posted on the LMS site. Student postings comprise 25% of the course grade, while the design activities, culminating in an ID project, constitute 75% of the course grade (a simple weighting sketch follows this list).
• Flipped Approach Features – Asynchronous Online: F2F instruction, which consisted of in-class mini-lectures and student activities, was replaced with online instructor-narrated modules. Each module included three tasks: scenario responses, quick think applications, and design activities. A textbook supported the modules as supplementary reading.
• Flipped Approach Features – Synchronous Online: Online "F2F features" included chat sessions, which were organized by groups (similar ID projects), and weekly office hour chat sessions.
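Because the 25%/75% split described in the Teaching and Assessment feature above is what makes both the postings and the design project consequential, a minimal weighted-grade sketch follows; the component scores are hypothetical, and only the weights are taken from the course description.

# Weighted course grade for the graduate ID course: postings 25%,
# design activities (culminating in the ID project) 75%.
# The component scores below are hypothetical examples on a 0-100 scale.

WEIGHTS = {"postings": 0.25, "design_activities": 0.75}

def course_grade(scores):
    """Weighted average of component scores (0-100)."""
    return sum(WEIGHTS[part] * scores[part] for part in WEIGHTS)

print(course_grade({"postings": 92.0, "design_activities": 85.0}))  # 86.75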
Models of Teaching Framework The pedagogical features of the two courses are now compared using the models of teaching framework (Joyce, Weil, & Calhoun, 2014). The framework components include model goals, the social system describing student and teacher roles and relationships, syntax or procedures for model use, principles of reaction from students and subsequent decisions from teachers, the support system necessary to provide the conditions for efficient and effective model implementation, and the instructional (direct) and nurturant (indirect) effects of the model. The instructional effects of the overall teaching strategy for each course can be regarded as the learning outcomes, while the nurturant effects can be viewed as additional learning that might occur.
Undergraduate Course: College Algebra

Course goals address classroom norms that encourage student/student interaction and student/teacher interaction and the use of multiple teaching strategies to meet the diverse learning needs of students. The social system describes student and teacher roles and relationships. Typically, college algebra has been taught with the professor as the "sage on the stage", the transmitter of knowledge. Students "passively receive that knowledge, memorize it, and regurgitate it on an exam" (King, 1993, p. 1). The flipped classroom model enables the instructor to assume a facilitator's role, so that the students can take center stage. The instructor is responsible for orienting students to each unit, creating video lectures, maintaining the online homework system, supporting a cooperative learning environment, and assessing student performance. Students are responsible for preparing for class by viewing video lectures and completing online homework assignments. In class, students must be prepared to ask questions and collaborate with their peers on a variety of assignments. The syntax or procedural flow of this course is based on research conducted on the course (Ogden & Shambaugh, 2016) and includes the following activities:

Integrated Instruction of the Mathematics Unit
• Various teaching strategies (video lectures, F2F lectures, question/answer sessions, online homework, and cooperative assignments) are assigned to unit lessons based on available resources, the length of the lesson, and the context of the lesson.
• Video lectures are viewed by students outside of class. F2F lectures take place in the classroom. Students are provided with a "fill-in-the-blank" note-guide to facilitate their organization of ideas.
• Students ask questions regarding material covered in the online videos, F2F classes, or online homework assignments. Question and answer sessions give students the authority to direct instruction toward problem areas.
• Outside of class, students individually complete online homework assignments, which include 10-25 questions. Question formats vary from open response to multiple choice and focus on procedural knowledge, such as solving equations.
• In the F2F class, students complete cooperative learning activities, in groups of three or four, on procedural knowledge, conceptual understanding, and applications.
Exams
• After the integrated instruction of the unit, a unit exam is administered to assess students' understanding of the unit. Depending on class size, unit exams are administered online or on paper.
• After all units of material have been taught, a cumulative final exam is administered.
Each of the teaching approaches elicits an action or reaction from both the teacher and student. It is critical that students learn how to self-regulate and ask questions that will further their understanding of the content. Principles of reaction for students and instructor are described below:

Lectures
• Students worry that online lectures imply that they will have to teach themselves the content. Instructors brief students on the overall approach and reassure students that the video is only one of the teaching strategies used.
• Instructors encourage student questions and spend F2F time re-explaining topics. Students work problems similar to those from the lecture videos.
Cooperative Learning Activities
• Students must be prepared to work with other students and contribute to their group in a meaningful way. Instructors work as facilitators in the cooperative activity. The instructor encourages students to ask each other for help and to refer to their notes before asking for assistance.
Online Homework
• Students are given several days to complete assignments and multiple attempts at each problem. Students attempt the problems before they are due, solicit help in class, and re-visit the problems after receiving help. Instructors set due dates that allow students enough time to try problems, ask questions, and make corrections.
The support system extends beyond the traditional roles of teacher and student in an effort to facilitate a change in the learning environment. Instructors develop their own instructional videos or use existing videos. If existing videos are used, time must be spent finding and screening media. In addition, many
students are not used to this teaching approach outside of the classroom, while in the classroom asking questions and working collaboratively with others may make some students uncomfortable. As high failure rates and high DFW rates have provided the impetus for reform efforts in college algebra, the direct or instructional effects include student grades, student performance on laboratory activities, and the DFW rate. Class averages on unit exams provided a way to assess student understanding of each unit, while class averages on the final exam and overall class averages assessed student understanding of the content across the entire course. Class averages on individual cooperative laboratory activities assessed students' conceptual understanding of specific topics and identified topics of difficulty. Three indirect or nurturant effects emerged (Ogden & Shambaugh, 2016). First, students exhibited better self-regulation skills. Self-regulated learners take responsibility for what they know and develop the strategy skills to learn what they do not know (Zimmerman, 1990). Second, students began to relate success to learning and not just good grades, a by-product of the mastery-oriented learning environment. The teaching approach fostered individual growth rather than competition for grades. Third, students reported feeling more confident in their ability to do mathematics and be successful in future courses.
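Because the direct effects above are monitored through class averages and the DFW rate, the short sketch below illustrates how those two measures might be computed from an end-of-semester roster. The roster records are hypothetical; the computation simply follows the DFW definition used in this chapter (the percentage of students withdrawing or earning a grade of "D" or "F").

# DFW rate and class average from a hypothetical end-of-semester roster.
# DFW rate = percentage of students withdrawing or earning a "D" or "F".

roster = [
    {"student": "s1", "final_pct": 81.0, "grade": "B"},
    {"student": "s2", "final_pct": 58.0, "grade": "F"},
    {"student": "s3", "final_pct": None, "grade": "W"},  # withdrew
    {"student": "s4", "final_pct": 74.5, "grade": "C"},
]

def dfw_rate(roster):
    dfw = sum(1 for r in roster if r["grade"] in {"D", "F", "W"})
    return 100.0 * dfw / len(roster)

def class_average(roster):
    scores = [r["final_pct"] for r in roster if r["final_pct"] is not None]
    return sum(scores) / len(scores)

print(dfw_rate(roster))       # 50.0
print(class_average(roster))  # approximately 71.2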
Graduate Course: Instructional Design

Course goals include examining student beliefs about instruction and students, using the instructional design process to make decisions about ways to promote learning, and learning from each other. The first goal ensures that students examine the learning beliefs that will influence how they think, design, and act. The second goal addresses ID process understanding and the use of tools (e.g., task analysis, learning taxonomies) and processes (e.g., feedback, revision) to analyze instructional problems and make design decisions. The third goal of peer learning was added to the online deliveries as the LMS provided structure to support additional peer and instructor feedback. The social system is described before the procedural syntax to lay out the interactions of the instruction's learning tasks. Teachers and students are viewed as co-participants, as co-learners in the course. Activity is roughly distributed equally between teacher and student. The online participation structures within each module include a set of instructor-narrated screens, student scenarios to represent their initial understanding of an ID phase, quick-think prompts to provide practice in applying ID phase features, and a design activity. Other online activities include students working in groups critiquing each other's work. Groups consist of students with similar projects, such as public school units, college courses, training modules, community education, and professional development. The instructor participates in weekly office chats, individual feedback on design activities, and video feedback on overall student performance across each ID module. A textbook and optional readings support each module. The syntax or procedural flow of the course activities has remained intact across the deliveries and includes three sets of activity across each phase of instructional design:

Setting the Stage: Introducing Students to a Phase of Instructional Design
• A scenario (public school, higher education, training) is presented with a question relevant to that ID phase. Students post online and the instructor responds to most of the postings.
• Module screens are organized based on one major idea. The instructor narrates each screen, providing social contact between student and instructor.
Representing Understanding: What Students Learn is Revealed by Course Products
• At the end of each module a "quick think" activity prompts students to apply key ideas from that ID phase to a specific instructional problem. Students comment on group member postings.
• Following the quick-think activity, students complete a design activity for that ID phase. The design activity is structured with due dates, rationale, procedures, and assessment.
Debriefing the Participants: Using Multiple Forms of Assessment with Peers and Instructor
• Group members critique their design activities, while the instructor posts feedback on the document or in a reply box to the posting.
• Five group chats are scheduled in which members self-organize to meet on selected weeks to respond to structured questions about their projects.
• The instructor produces a video summary of what was seen across student work and suggests changes.
• Weekly office hour chats provide real-time meetings to address student needs.
For any teaching approach studied over time, how students respond becomes better understood, as well as how instructors react to student performance. Principles of reaction for students and instructors include the following:

Learning Beliefs and ID Models
• Students have difficulty matching theory-based learning principles with their learning beliefs. Multiple examples are provided.
• Students demonstrate varying degrees of proficiency in identifying and prioritizing learning principles and translating these items into a mission statement paragraph or sentence. Multiple submissions are encouraged.
• Students construct an initial representation of the ID process but fail to understand the nature of the connections between the components. The instructor assigns model construction at the beginning and end of the course to assess model development.
Analysis Phase: Needs Assessment
• Students do not understand the instructional problem they will be designing for, and feedback prompts them for clarity as to the instructional problem, need, or gap.
• Students resist the work that needs assessment requires. This phase is given a high percentage of the project grade and requires benchmarking student performance over several weeks.
• Some students do not "get outside of their own experience" and use the course to justify what they currently do. Instructors prompt ongoing thinking through revision of the needs assessment write-up, the inclusion of outside sources, and documentation of how and what students discovered.
• Project learning goals may be numerous. The instructor suggests that goals be categorized into five or fewer items and include student learning outcomes as well as instructor project goals.
Design Phases: Sequence, Teaching/Assessment
• Students submit weekly drafts of their design components and the instructor provides timely feedback. An instructor uses the mission statement and project goals from the needs assessment to assess the appropriateness of the design decisions.
Design Phase: Program Evaluation
• Students are tired and want to complete the project and course. Thus, program evaluation is presented concisely as issues of efficiency, effectiveness, and appeal, as well as raising awareness of the differences between formative and summative program evaluation.
The support system, according to Joyce, Weil, and Showers (1992), includes those "additional requirements of the model beyond the usual human skills, capacities, and technical facilities" (p. 15). Students are new to beliefs examination, complex tasks, and detailed feedback. Taking responsibility for design decisions initially prompts student concerns ("Is this what the instructor wants?") but evolves over the course into asking "What do I need to do to help my identified learners?" Submitting drafts of work and reflective activities is uncomfortable for some. The instructional effects include the course's learning outcomes; namely, examining beliefs about learning and teaching, thinking about ways to promote learning, and helping students to learn from each other. The mission statement task became the most efficient means to assess to what extent a student's learning beliefs were exhibited within the project. ID process understanding was evaluated by performance on learning tasks and engagement as recorded by the LMS. Finished ID projects were assessed for completeness, consistency, and coherence. Nurturant effects, according to Joyce, Weil, and Showers, "come from experiencing the environment created by the model" (1992, p. 16). One nurturant effect is building trust to develop a community of learners that values everyone's participation. Another nurturant effect is "stepping outside oneself" to learn from other points of view and consider options. A third nurturant effect is a greater sense of professional identity from representing one's foundational learning beliefs through learning principles. A fourth nurturant effect is developing habits of reflectivity by both students and instructor.
Course Research Findings Undergraduate Course: College Algebra The College Algebra course has been formally studied over three deliveries using design and development research (Richey & Klein, 2007) to document design decisions (pre-teaching), implementation (during teaching), and evaluation (post-teaching), and is summarized in Ogden and Shambaugh (2016). Learning Theory Foundations The nature of design and development research involves developing, testing, and revising designs. This type of research would be limited if confined to one theoretical perspective on learning, but the approach considers multiple perspectives for mathematics education design research (Cobb & Bowers, 1999; Cobb
& Yackel, 1996; Cobb et.al., 2011). Piaget (1977) cited the individual construction of knowledge to make sense of the world. Like Piaget, Bruner (1990) saw learning as a constructive process and suggested that teachers design instruction that promotes student thinking through activity and discovery such as constructing, exploring, and categorizing information. Sociocultural learning theorists see learning as a process of acculturation or the result of being initiated into the ideas and practices of a community (Duffy & Cunningham, 1996). Furthermore, Bronfenbrenner (1979) viewed student learning and development as the result of a student’s perception and interaction with the environment. Learning is treated as an individual process and the learning environment is treated as a micro-culture with emerging social practices (Cobb, 2007). The combination of these perspectives supports the design and development process in which changes are made to the learning environment through design decisions and instructional strategies, and learning assessment informs instructional re-design efforts. Research Findings Research conducted on the course from 2012-2014 documents the design and development of the flipped classroom teaching model across three deliveries of the course. Ogden, Pyzdrowski, and Shambaugh (2014) documents the first two deliveries, and Ogden and Shambaugh (2016) documents the design and development across all three deliveries. Both studies used design and development research methodology, which Richey and Klein (2007) defined as “the systematic study of design, development, and evaluation processes with the aim of establishing an empirical basis for the creation of instructional and non-instructional products and tools and new or enhanced models that govern their development” (p. 1). The design and development framework poses research questions as design objectives, providing a way to organize the reporting of design decisions and research findings throughout multiple deliveries of a course. Data sources for the model’s design decisions included the instructor’s journal, course syllabus, and course weekly schedule. Implementation was documented through the analysis of the instructor’s journal, student surveys, and notes from student conferences. Evaluation of the model was reported in terms of its effectiveness and appeal. “Determining effectiveness requires asking questions about whether the design accomplishes what it sets out to do” (Shambaugh & Magliaro, 1997, p. 228) and determining a program’s appeal requires asking questions about the learners’ perceptions with respect to the design and its impact on their learning. Data sources analyzed to determine the effectiveness of the model in the third delivery of the course included class averages on unit exams and the final exam, the DFW rate (the percentage of students withdrawing or earning grades of a “D” or “F” in a course), and student responses on laboratory activities. Data sources to determine the appeal of the teaching model across the three deliveries of the course included student surveys, student interview data, and university course evaluations. A two semester study was conducted during the 2012-2013 academic year documenting the design and development process over the first two deliveries of the course (Ogden, Pyzdrowski, & Shambaugh, 2014). The design of the first course delivery was a hybrid flipped classroom model. Video lectures were assigned as homework for about 50% of the classes. 
Online homework assignments were assigned on a weekly basis. On days when instruction was flipped, students worked on problem sets or laboratory activities cooperatively. Findings indicated that students felt that the teaching model enabled them to ask more questions in class. The instructor emerged as an integral part of the teaching model, as students indicated that the model could only be as good as the teacher using it.
The design of the second delivery incorporated a truer flipped classroom model, as video lectures were assigned as homework for most topics in the course and F2F class time was spent with students working on problems in groups. Weekly online homework assignments were assigned. Although students felt that watching video lectures for homework and completing online homework assignments outside of class was too much work, they perceived the flipped classroom model as student-centered and felt that having the freedom to ask questions helped them to learn mathematics. F2F class time was devoted to student needs, and as one student said, "…was dedicated to what you didn't know, so you learned. It wasn't a teacher just teaching everything and saying they think you know something you don't. You have time to ask about what you don't know" (Ogden, Pyzdrowski, & Shambaugh, 2014).

The second study documented the design and development of the course from Fall 2012-Fall 2013 (Ogden & Shambaugh, 2016). The course evolved into a hybrid flipped classroom teaching model. Instruction varied daily and included video lectures, F2F lectures, cooperative learning assignments, and online homework problem sets. An effort was made to evaluate the model with respect to both effectiveness and appeal, as the previous study (Ogden, Pyzdrowski, & Shambaugh, 2014) only evaluated the appeal of the teaching model. Student grades, performance on laboratory activities, and the DFW rate were compared across all three deliveries of the course to evaluate how the flipped classroom teaching model was connected to student learning. Results indicated that the DFW rate improved throughout the three semesters of teaching college algebra: it was 52.5% in case 1, 42.5% in case 2, and 20% in two sections of case 3 (30% in case 3a and 10% in case 3b). With the exception of exam two, class averages on exams improved throughout the three semesters as well. In addition, the overall class average improved: it was 69.5% in case 1, 73.7% in case 2, and 79% in case 3 (78.1% in case 3a and 79.9% in case 3b).

In all three deliveries, students felt that the teaching model enabled them to ask more questions in class. In deliveries one and three, students stressed that the course components worked together to foster student learning. Illustrative quotes from student surveys included, "I liked the Flipped Classroom because it kind of made everything a little bit different every day, it wasn't so typical, so it made focus more…not so boring. Also, I learn by doing things, so it really helped me to understand better." Another student added, "I think the structure gave everyone an opportunity, no matter how they learned, to be able to learn and ask questions." In deliveries two and three, students felt that the teaching model fostered a nurturing classroom culture. One student said, "I thought it was very open. I thought that it was very easy to ask questions, I didn't feel uncomfortable. I like that my professor could get to know me." Many students indicated that their feelings regarding their ability to learn mathematics had changed. For example, "I still am not a fan of math, but I think I have the ability to succeed in future math classes after taking this course."
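The DFW rate and class averages reported above are straightforward to reproduce from a gradebook export. The following is a minimal sketch of one way to do so; it is not taken from the studies cited, and the record format, withdrawal flag, and letter-grade cutoffs are assumptions introduced for illustration.

```python
# Hypothetical sketch: computing a DFW rate and class average from gradebook records.
# The record format and letter-grade cutoffs are illustrative assumptions, not taken
# from the studies cited above.

def letter_grade(percent: float) -> str:
    """Map a course percentage to a letter grade (assumed 10-point scale)."""
    if percent >= 90: return "A"
    if percent >= 80: return "B"
    if percent >= 70: return "C"
    if percent >= 60: return "D"
    return "F"

def dfw_rate(records: list[dict]) -> float:
    """Percentage of students who withdrew or earned a D or F."""
    dfw = sum(1 for r in records
              if r.get("withdrew") or letter_grade(r["percent"]) in ("D", "F"))
    return 100 * dfw / len(records)

def class_average(records: list[dict]) -> float:
    """Mean course percentage for students who completed the course."""
    completed = [r["percent"] for r in records if not r.get("withdrew")]
    return sum(completed) / len(completed)

if __name__ == "__main__":
    roster = [
        {"name": "S1", "percent": 82.0},
        {"name": "S2", "percent": 58.5},
        {"name": "S3", "percent": 91.0},
        {"name": "S4", "percent": 0.0, "withdrew": True},
    ]
    print(f"DFW rate: {dfw_rate(roster):.1f}%")       # 50.0% for this toy roster
    print(f"Class average: {class_average(roster):.1f}%")
```

Run against a real export, the same two functions would let an instructor compare deliveries semester to semester, as the studies above did by hand.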
Graduate Course: Instructional Design

Research on teaching a graduate course in instructional design has been conducted systematically since 1994. Two issues influence the teaching of instructional design; namely, how one views learning and teaching and how one understands and depicts instructional design (Shambaugh & Magliaro, 2001). This research documented how instructional design could be taught based on a particular view of teaching, one that regarded teacher and student as mutual learners (co-participatory) and supported critical self-appraisal (reflexivity) by all participants. The aim of the instruction was the construction of shared
understandings through authentic, meaningful activity, informed by the following theoretical assumptions.

Learning Theory Foundations

Learning is a natural aspect of human lives and is constructed by individuals over time in an attempt to make sense out of their world (Bruner, 1990; Jonassen, 1991; Piaget, 1971). Individuals are challenged to build, link, clarify, and transform their thinking and actions (Bruner, 1990). As learning is viewed as an active process of individual construction within social settings, teaching is a process of supporting this construction rather than communicating knowledge (Duffy & Cunningham, 1996). This assistance in the development of cognitive and performance capacities (Tharp & Gallimore, 1988) can take multiple forms, including modeling, contingency management, feeding back, instructing, questioning, cognitive structuring, and reflecting (Shambaugh & Magliaro, 1995; Tharp & Gallimore, 1988). Tharp and Gallimore (1988) considered these multiple means of assisting performance in order to link different theories and disciplines (i.e., behaviorist, cognitive, and neo-Vygotskian (Vygotsky, 1978) theories of development) together to form a general theory of teaching.

Research Findings

Research conducted on the course from 1994-2015 can be separated into periods of F2F instruction, blended instruction, and 100% online delivery. Shambaugh and Magliaro (2001) document F2F teaching from 1994-1999, Shambaugh (2007) documents blended teaching from 1994-2009, and Shambaugh (2016) documents online delivery from 2010-2015.

A five-year, six-delivery study of F2F teaching of instructional design was conducted using design and development research (Shambaugh & Magliaro, 2001). In this approach, research questions were formulated as design objectives, and the analysis addressed design decisions, implementation, and evaluation (Richey & Klein, 2007). Each of the three design and development phases included data sources from which a systematic analysis could be conducted. Data sources for the model's design decisions included course syllabi and instructor emails for course sequence, learning tasks, instructional materials, and the assessment rubric. Model implementation was analyzed from student performance on formative tasks, including co-instructor emails and student interviews. Model evaluation was analyzed from student performance on their final project, a final set of student interviews, and student responses to instruction and instructor efforts to assist learners. The instructional design process consisted of Needs Assessment, Sequencing, Teaching, Assessment, Technology, Prototype, and Program Evaluation. Fifteen design activities were used to structure student thinking and decisions across these ID phases. Major results from the five-year study included adding Learning Beliefs as a distinct instructional design phase and the development of a student guide and a published text (Shambaugh & Magliaro, 1997) that provided additional assistance to help newcomers experience the ID process through design activity. Instructional improvements included adding meaningful examples of designed instruction, structured group activities to transfer design thinking to students' design projects, and instructional media/technology issues integrated throughout the course.

A second study (Shambaugh, 2007) analyzed 12 years of blended teaching of the ID course (also incorporating the deliveries from the 2001 paper) and provided a major source of evidence supporting subsequent "flipping" of learning tasks.
This study of 16 deliveries of the course also used a design and development cycle. The learning tasks within blended instruction began to incorporate a web site for
information and resources, the use of a Wiki to speed up the design review process, and a web board to post student work and receive critiques. Classroom activities were used to introduce, reinforce, or provide practice across each main topic in the course. Classroom activities were designed as "jumpstart" exercises in which students experienced the nature of what was asked in the out-of-class design activities. A class session now followed this sequence:

• Summary of what students posted to the web board.
• Warm-up activity to introduce students to the next ID phase.
• Mini-lecture/presentation.
• Jump-start activity to help students record thinking on paper; usually organized around random groups early on, to promote meeting students, and later organized around content-specific groups, to provide peer feedback on similar projects.
• Scenario descriptions where students record design decisions and peers critique them, resulting in an immediate design-review cycle and subsequent revisions.
• Explanation of the next design activity.
An interesting variation to the teaching during this 12-year period was the use of a Wiki for two deliveries. Students posted draft work to an online "wiki," or a set of collaborative web pages. Students were required to provide feedback on iterative designs (Shambaugh, 2003). The purpose behind the approach was to speed up the design-reflect-revision cycle (Carroll, 2000). The collaborative or CoWeb pages were visually spartan and required little coding; however, students were accustomed to more interface detail. More direct instruction and experience in class sessions were reported as needed to acquaint students with this simple online tool. Overall, the blended approach, as documented across 12 years, served as a test-bed for moving the course eventually to online delivery.

Finally, Shambaugh (2016) summarizes the teaching decisions for 100% online deliveries of the course from 2010-2013 and 2015, hosted on the university's LMS. The phases of instructional design were collapsed to the following seven: Learning Beliefs, ID Models, Needs Assessment, Instructional Sequence, Teaching and Assessment, Prototype, and Program Evaluation. Online instruction was structured around seven modules, one for each ID phase. Each module included the following sequence:

• Scenario activity: introduces students to the human features of that ID phase; organized around public school, college, and training settings.
• Module screens: organized around major ideas of that ID phase, with each screen narrated by the instructor.
• Quick-think activity: students apply what they have learned to a prompting question.
• Design activity: students respond to structured tasks, which serve to build an ID project over time.
Feedback included instructor-to-student feedback on design activities in the form of online replies to postings; peer feedback on scenario responses, quick-think, and design activities for group members; and instructor-to-class feedback (email, video clip) on overall student performance within the module and individual design activity. Feedback on the course was obtained through university-based student evaluations and instructor-developed mid-semester and end-of-course open-ended questions. Key results from student feedback and design activities included the following:
• Students reported prompt and detailed feedback on their work.
• A high number of student and instructor postings were recorded (e.g., two deliveries with 45 students each produced on average 3,000 postings, with one-third of these postings from the instructor).
• Personalized voicing of modules and student work examples increased teacher presence.
• A significant amount of work was expected each week.
• Needs assessment challenged students to "get outside" of their experience and provide data-driven priorities and design decisions.
• The effectiveness of addressing media/technology issues throughout the ID process varied with individuals' interest in and exposure to technology tools.
SOLUTIONS AND RECOMMENDATIONS

Best Teaching and Technology Practices in Using Flipped Classroom for Hybrid Courses

General Recommendations

Three general caveats are suggested. First, Wells and Holland (2016) suggest a reorientation of prevailing learner, educator, and institutional cultures and contexts in order to achieve learner-centered, autonomous, flexible learning experiences. One approach is to invest students in the ongoing revision of flipped classroom course features. Solicit student reactions and ways to improve the use of time. A flipped classroom forces one to re-think how time is used in the classroom, from talking and lecturing to listening and assisting.

A second caveat recommends that a flipped classroom become a contextually based pedagogical practice. This approach takes into account learning outcomes, the range of student characteristics and their understanding of the content, as well as multiple teaching strategies specifically chosen to promote active student learning. Design decisions need to consider the overall curriculum strategy (Ogden & Shambaugh, 2016). In this way, students receive high value for their investment, and academic programs offer instruction that is coherent and appropriate to program goals.

A third caveat advocates systematic study over time to document course effectiveness for student engagement, student learning, and program needs. Design and development research (Richey & Klein, 2007) provides a structured approach to document design decisions, implementation, and evaluation. Course design and assessment files provide data sources, such as course syllabi, student comments, and student performance artifacts. As design decisions evolve, so will data sources.
Undergraduate Teaching Recommendations

Access and Motivation

For on-campus undergraduates, technological access is usually not a problem, but the online environment might present a new experience. There will likely be students who are not motivated by homework or online activity (Hamdan et al., 2013). For the instructor, "The process of planning for and executing a flipped learning experience requires vast amounts of rigor, foresight, deep instructional knowledge,
creativity, and risk-taking" (Hirsch, 2015). The question most frequently asked about the flipped classroom is, "How do you know students have watched the video?" The online activity has to "count" in the assessment system, so that failure to complete the video and supporting activity results in a consequence. Clear instructions establish a clear purpose for the activity and what students need to do, as well as how these activities are assessed (Hirsch, 2015). An instructor could create a course procedures video so students are clear as to what they need to do (Hirsch, 2015). The university's LMS usually has the ability to document which students have accessed course activities and how long they have spent on them; a sketch of how such activity data might be summarized appears below.
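Most LMSs can export such activity logs. The sketch below is hypothetical: it assumes a simple CSV export with student, item, and minutes columns, which would need to be adapted to the report format of a particular LMS.

```python
# Hypothetical sketch: summarizing an LMS activity-log export to see which students
# opened a flipped-classroom video and roughly how long they spent on it.
# The CSV layout (columns: student, item, minutes) is an assumption; real LMS
# exports differ, so column names would need to be adapted.
import csv
from collections import defaultdict

def time_on_item(log_path: str, item_name: str) -> dict[str, float]:
    """Return total minutes each student spent on the named course item."""
    minutes = defaultdict(float)
    with open(log_path, newline="") as f:
        for row in csv.DictReader(f):
            if row["item"] == item_name:
                minutes[row["student"]] += float(row["minutes"])
    return dict(minutes)

def non_viewers(roster: list[str], viewed: dict[str, float]) -> list[str]:
    """Students on the roster with no recorded time on the item."""
    return [s for s in roster if viewed.get(s, 0.0) == 0.0]

if __name__ == "__main__":
    viewed = time_on_item("activity_log.csv", "Unit 3 video lecture")
    roster = ["ali", "bea", "cam", "dee"]          # invented names for illustration
    print("Minutes per student:", viewed)
    print("Did not open the video:", non_viewers(roster, viewed))
```

Even a rough summary like this gives the instructor a defensible basis for following up with students before the F2F session rather than after an exam.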
Meaningful Activities

A flipped classroom design requires significant preparation time to design the learning activities. Online content development requires more work than developing a syllabus and staying ahead of the students week-to-week, and online content creation and media production skills may be new to faculty. The following recommendations are provided:

• Allow more content/media creation time than with F2F deliveries.
• Script online media and produce short media pieces.
• Use screencasting software products, which are faculty-friendly tools to record and edit video and add captions.
• Create content that is a mix of text and visuals and is cognitively structured (Mayer, 2009).
• Use advance organizers to structure the conceptual organization of online materials.
• Obtain feedback from students about both online and in-class activities.
• Revise design decisions for the next course delivery.
• Keep course learning outcomes in the forefront of any mid-course changes.
Feedback

"The critical component of flipped learning occurs in the classroom itself – how teachers pivot from the video's baseline content to deeper, more expansive targets and make room for students to investigate, evaluate, and apply new knowledge in creative ways" (Hirsch, 2015, p. 1). Gonzalez (2014) recommends the "in-class flip," where the online video is viewed at a learning station supplemented by solo and group activities. The classroom viewing reduces technical issues that might be experienced at home, and the teacher observes students watching the videos and responds to questions. A downside to the in-class flip is reduced classroom time for other activities; however, this approach may provide students with a transition to the usual flipped classroom.
Graduate Teaching Recommendations

Access and Motivation

Given their physical locations and time limitations, graduate students have a range of access issues.

• Graduate students may live in different time zones, requiring instructor attention to due dates and times of synchronous activities.
• Older students who work full time face issues with synchronous activities. If such synchronous activities are required in a program, then the program's materials must be clear on this requirement.
• Real-time chat can be accommodated by grouping large enrollments and having the smaller groups schedule online chats.
• Technical requirements for online activity should be clear in materials.
• Program and course orientation web pages, handouts, and video tours can guide students in navigating the online features of a course.
Meaningful Activities

Graduate students, some of whom may be new to hybrid deliveries, value an overall view of program and course features. Both the structure and clarity of tasks require careful design and testing over time, incorporating student feedback.

• As more academic programs are delivered online, learning task structure needs to align with the course's learning outcomes and the program's overall goals.
• Syllabi should provide a clear explanation of how the course works, what features are offered in class, and what features are delivered online.
• A typical class session or week could be described or visualized to give students an idea of activities and performance expectations.
• Activity guidelines should be posted on the online site and available as downloadable files. Learning task guidelines should address the purpose of the task, relevance, due date, instructions, and guidelines.
• A cognitive organizer helps students to see the overall scope of what is to be learned and how learning tasks connect to each other.
• Instructors who are teaching hybrid courses for the first time should secure permission for, and archive, student examples of work and learning activities.
Rather than high-end video production software, both cases in this chapter used screencasting products, which enabled the instructors to capture on-screen activity and provide post-production editing of audio and video.

• Scripting videos ensures that key points are covered and keeps video clips shorter. A script file also provides universal design access for learners with special needs.
• One technical area that is frequently overlooked is the quality of audio. For office or studio-based narrations, an external microphone is preferred over built-in versions.
• Attention should be paid to where the audio is recorded. Avoid noisy office locations or places where disturbances are common.
• Capturing in-class activity through video may be useful to depict appropriate in-class behaviors and activity procedures. Classroom video will need quality audio recordings; a lavalier microphone improves instructor comments but may not adequately capture student activity.
• Most recommendations for video suggest shorter segments, less than 30 minutes. Another approach is to break up videos into segments of 10-15 minutes (a sketch of one way to do this follows the list).
• Ask students for feedback on video clips.
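One way to produce segments of the recommended length from a single long screencast is to split the recording after the fact. The sketch below is only an illustration; it assumes the free ffmpeg command-line tool is installed on the instructor's machine and simply wraps its stream-copy segmenting option from Python.

```python
# Hypothetical sketch: splitting one long lecture recording into ~15-minute segments.
# Assumes the ffmpeg command-line tool is installed and on the PATH; ffmpeg's
# segment muxer cuts on keyframes, so actual segment lengths are approximate.
import subprocess

def split_video(source: str, segment_minutes: int = 15) -> None:
    """Copy the source video into numbered segments without re-encoding."""
    cmd = [
        "ffmpeg", "-i", source,
        "-c", "copy",                       # stream copy: fast, no quality loss
        "-map", "0",                        # keep all audio/video streams
        "-f", "segment",                    # use the segment muxer
        "-segment_time", str(segment_minutes * 60),
        "-reset_timestamps", "1",
        "lecture_part_%03d.mp4",            # hypothetical output naming pattern
    ]
    subprocess.run(cmd, check=True)

if __name__ == "__main__":
    split_video("unit3_lecture.mp4", segment_minutes=15)
```

Because the streams are copied rather than re-encoded, splitting a long recording this way takes seconds and does not degrade the captions or audio prepared during post-production.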
Feedback

In combination with clear and relevant learning tasks, quality feedback is critical, particularly in any course in which the traditional classroom has been changed from what students are used to. Quality feedback can be characterized as prompt, consistent, and responsive to student needs (a small sketch for tracking the turnaround targets below follows the list).

• Specify a time for replies to student queries (e.g., 24 hours).
• Specify a time for comments on student work (e.g., 5 working days).
• Consistent feedback means that instructor comments should remain similar across student submissions. One form of inconsistency is changing feedback across a student's work over time, unless that change is acknowledged.
• Responsive feedback includes appropriate comments related to the student's performance of the task as specified by the assessment tool (e.g., rubric, checklist).
• In-class feedback requires that the instructor remain aware of students' work.
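Turnaround targets such as these are easier to honor when they are tracked. The following is a minimal, hypothetical sketch that flags unanswered queries and ungraded submissions against the 24-hour and five-working-day targets; the record structure is invented for illustration.

```python
# Hypothetical sketch: flagging student queries and submissions whose feedback is
# overdue against stated targets (24 hours for replies, 5 working days for grading).
# The data structure is invented for illustration.
from datetime import datetime, timedelta

REPLY_TARGET = timedelta(hours=24)
GRADING_TARGET_WORKDAYS = 5

def workdays_since(start: datetime, now: datetime) -> int:
    """Count weekdays (Mon-Fri) between start and now."""
    days, d = 0, start
    while d.date() < now.date():
        d += timedelta(days=1)
        if d.weekday() < 5:
            days += 1
    return days

def overdue_items(items: list[dict], now: datetime) -> list[str]:
    """Return descriptions of items whose feedback target has passed."""
    flagged = []
    for item in items:
        if item["answered"]:
            continue
        if item["kind"] == "query" and now - item["received"] > REPLY_TARGET:
            flagged.append(f"query from {item['student']}")
        elif item["kind"] == "submission" and \
                workdays_since(item["received"], now) > GRADING_TARGET_WORKDAYS:
            flagged.append(f"submission from {item['student']}")
    return flagged

if __name__ == "__main__":
    now = datetime(2016, 3, 14, 9, 0)
    inbox = [
        {"kind": "query", "student": "A",
         "received": datetime(2016, 3, 12, 8, 0), "answered": False},
        {"kind": "submission", "student": "B",
         "received": datetime(2016, 3, 4, 17, 0), "answered": False},
    ]
    for entry in overdue_items(inbox, now):
        print("Overdue:", entry)
```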
FUTURE RESEARCH DIRECTIONS

As variations in online instruction continue to appear, teacher presence in terms of course design and assessment will increase in hybrid courses. Future research will be directed at learning-assessment alignment, or connecting learning outcomes and teaching options. Priority will be given to the evaluation of course design in higher education as opposed to the primacy of one's teaching style. With the increased accountability for student learning in higher education programs, studying one's teaching may become an expectation of college teaching. As a result, less attention will be paid to defining hybrid and flipped terminology and more attention will be spent on how different course features support learning outcomes. As college faculty will be expected to know how to produce video and use multiple forms of media in their instruction, new professional development approaches will need to provide ongoing support of F2F and online course innovations.
CONCLUSION

A flipped classroom can be implemented in both F2F and online courses. Documenting course features through an instructional design process examines the coherence of design decisions to address explicit learning outcomes. The models of teaching framework specifically examines how a teaching model is grounded in learning theory and how that model is used given the context of its implementation. As with other studies, the use of a flipped classroom approach in both settings improved student motivation and refocused instructor activity on individual student needs. In both the undergraduate and graduate courses profiled, the instructor remained an integral part of the flipped classroom model, as the approach bolstered student-to-teacher interaction and active learning. Although instructional videos were a large part of the undergraduate course, their use alone did not imply a flipped classroom. The flipped classroom strategy required an involved teacher who paid attention to student needs. In the F2F course, students identified essential instructor characteristics as passion, enthusiasm, and the ability to provide a nurturing learning environment.
One student said, "You need someone who is nurturing, because people won't always understand things from the videos and trying it at home, so you have to have a teacher that is willing to go back through things and answer questions" (Ogden & Shambaugh, 2016, p. 66). Meanwhile, key findings from the graduate online course indicated that the instructor provided prompt and detailed feedback on both solo and group projects and engaged in numerous online conversations with students (Shambaugh, 2016). By flipping the online classroom, the instructor developed online relationships with students while providing detailed and prompt feedback in both synchronous and asynchronous activities.

Research on the use of cooperative learning has indicated higher student achievement and greater student productivity as well as increased social competence and self-esteem (Li & Lam, 2013). In the algebra course, students worked together to address common math learning issues; in the F2F meetings, students worked together on problem sets and laboratory activities. In the graduate online course, students participated in synchronous chat sessions and provided peer critiques on one another's work.

What replaces the lecture in a flipped classroom delivery is critical to the success of the teaching approach. Students lose confidence in the flipped classroom teaching strategy if the class activities are not organized and worthwhile (Toto & Nguyen, 2009). Both courses reported that students engaged in activities that connected the video to the learning activities. Students viewed the instructional videos as a resource that afforded them more time to engage in course content and ask questions. In essence, the hybrid flipped classroom provided instructors with a means to individualize instruction.

A flipped classroom approach can enhance both F2F and online courses, as it provides instructors an opportunity to better understand their students and adapt their teaching decisions to student needs. A flipped classroom enables instructors to promote active learning, provide more in-class time for student-centered activities, and cultivate confidence in students. The approach leverages multiple teaching strategies to enhance student learning, but like any other approach it must be carefully studied over time to reach its full potential, particularly its potential to support learning outcomes.
REFERENCES

Ames, C., & Archer, J. (1988). Achievement goals in the classroom: Students' learning strategies and motivation processes. Journal of Educational Psychology, 80(3), 260–267. doi:10.1037/0022-0663.80.3.260

Azevedo, A. (2012, October 17). Wired Campus: San Jose State U. says replacing live lectures with videos increased test scores. Chronicle of Higher Education Blog. Retrieved from http://chronicle.com/blogs/wiredcampus/san-jose-state-u-says-replacing-live-lectures-with-videos-increased-test-scores

Bishop, J. L., & Verleger, M. A. (2013, June). The flipped classroom: A survey of the research. Proceedings of the ASEE National Conference, Atlanta, GA, USA.

Brame, C. (2013). Flipping the classroom. Vanderbilt University Center for Teaching. Retrieved from http://cft.vanderbilt.edu/guides-sub-pages/flipping-the-classroom/

Bransford, J. D., Brown, A. L., & Cocking, R. R. (2000). How people learn: Brain, mind, experience, and school. Washington, D.C.: National Academy Press.

Bronfenbrenner, U. (1979). The ecology of human development: Experiments by nature and design. Harvard University Press.
Brown, J., Collins, A., & Duguid, P. (1989, January-February). Situated cognition and the culture of learning. Educational Researcher, 18(1), 32–42. doi:10.3102/0013189X018001032

Bruner, J. (1990). Acts of meaning. Cambridge, MA: Harvard University Press.

Carroll, J. M. (2000). Making use: Scenario-based design of human-computer interactions. Cambridge, MA: MIT Press. doi:10.1145/347642.347652

Cobb, P. (2007). Putting philosophy to work. In Second handbook of research on mathematics teaching and learning: A project of the National Council of Teachers of Mathematics.

Cobb, P., & Bowers, J. (1999). Cognitive and situated learning perspectives in theory and practice. Educational Researcher, 28(2), 4–15. doi:10.3102/0013189X028002004

Cobb, P., Stephan, M., McClain, K., & Gravemeijer, K. (2011). Participating in classroom mathematical practices. In A journey in mathematics education research (pp. 117–163). Springer Netherlands.

Cobb, P., & Yackel, E. (1996). Constructivist, emergent, and sociocultural perspectives in the context of developmental research. Educational Psychologist, 31(3-4), 175–190. doi:10.1080/00461520.1996.9653265

Duffy, T. M., & Cunningham, D. J. (1996). Constructivism: Implications for the design and delivery of instruction. In D. H. Jonassen (Ed.), Handbook of research for educational communications and technology (pp. 170–198). New York: Macmillan.

Dziuban, C. D., Hartman, J. L., & Moskal, P. D. (2004). Blended learning. Educause Research Bulletin, 2004(7). Retrieved from https://net.educause.edu/ir/library/pdf/erb0407.pdf

Eccles, J. S., Adler, T. F., Futterman, R., Goff, S. B., Kaczala, C. M., Meece, J. L., & Midgley, C. (1983). Expectancies, values, and academic behaviors. In J. T. Spence (Ed.), Achievement and achievement motivation (pp. 75–146). San Francisco, CA: W. H. Freeman.

Ent, V. L. (2016). Is flipped learning really new to academia? TechTrends, 60(3), 204–206. doi:10.1007/s11528-016-0060-5

Fawley, N. (2014). Flipped classrooms: Turning the tables on traditional library instruction. American Libraries, 45(9-10), 19.

Flipped Learning Network. (2014). The four pillars of F-L-I-P. Retrieved from www.flippedlearning.org/definition

Frydenberg, M. (2013). Flipping Excel. Information Systems Education Journal, 11(1), 63–73.

Gonzalez, J. (2014). Modifying the flipped classroom: The "in-class" version. Flipped Classroom. Retrieved from http://www.edutopia.org/blog/flipped-classroom-in-class-version-jennifer-gonzalez

Hamdan, N., McKnight, P., McKnight, K., & Arfstrom, K. M. (2013). A review of flipped learning. South Bend, IN: Flipped Learning Network. Retrieved from http://flippedlearning.org/domain/41

Hirsch, J. (2015). 100 videos and counting: Lessons from a flipped classroom. Retrieved April 6, 2016, from http://www.edutopia.org/blog/100-videos-lessons-flipped-classroom-joe-hirsch
Horan, D. A., Hersi, A. A., & Kelsall, P. (2016). The dialogic nature of meaning making within a hybrid learning space: Individual, community, and knowledge-building pedagogical tools. In J. Keengwe (Ed.), Handbook of research on active learning and the flipped classroom model in the digital age (pp. 19–40). Hershey, PA: IGI Global. doi:10.4018/978-1-4666-9680-8.ch002

Jonassen, D. H. (1991). Objectivism versus constructivism: Do we need a new philosophical paradigm? Educational Technology Research and Development, 39(3), 5–14. doi:10.1007/BF02296434

Joyce, B., Weil, M., & Calhoun, E. (2014). Models of teaching (9th ed.). Boston, MA: Pearson.

Joyce, B., Weil, M., & Showers, B. (1992). Models of teaching (4th ed.). Boston, MA: Allyn & Bacon.

King, A. (1993). From sage on the stage to guide on the side. College Teaching, 41(1), 30–35. doi:10.1080/87567555.1993.9926781

Lee, J., & Jang, S. (2014). A methodological framework for instructional design model development: Critical dimensions and synthesized procedures. Educational Technology Research and Development, 62(6), 743–765. doi:10.1007/s11423-014-9352-7

Li, M. P., & Lam, B. H. (2013). Cooperative learning. Retrieved from http://www.ied.edu.hk/aclass/Theories/cooperativelearningcoursewriting_LBH%2024June.pdf

Long, T., Logan, J., & Waugh, M. (2016). Students' perceptions of the value of using videos as a pre-class learning experience in the flipped classroom. TechTrends, 60(3), 245–252. doi:10.1007/s11528-016-0045-4

Mayer, R. E. (2009). Multimedia learning (2nd ed.). New York: Cambridge University Press. doi:10.1017/CBO9780511811678

Ogden, L., Pyzdrowski, L., & Shambaugh, N. (2014). A teaching model for the college algebra flipped classroom. In J. Keengwe, G. Onchwari, & J. Oigara (Eds.), Promoting active learning through the flipped classroom model (pp. 47–70). Hershey, PA: IGI Global. doi:10.4018/978-1-4666-4987-3.ch003

Ogden, L., & Shambaugh, N. (2016). The continuous and systematic study of the college algebra flipped classroom. In J. Keengwe (Ed.), Handbook of research on active learning and the flipped classroom model in the digital age (pp. 41–72). Hershey, PA: IGI Global. doi:10.4018/978-1-4666-9680-8.ch003

Piaget, J. (1971). Genetic epistemology. New York: W. W. Norton.

Piaget, J. (1977). The development of thought: Equilibration of cognitive structures. New York: Viking.

Rakes, C. R., Valentine, J. C., McGatha, M. B., & Ronau, R. N. (2010). Methods of instructional improvement in algebra: A systematic review and meta-analysis. Review of Educational Research, 80(3), 372–400. doi:10.3102/0034654310374880

Richey, R., & Klein, J. D. (2007). Design and development research: Methods, strategies, and issues. New York: Routledge.
Salifu, S. (2016). Understanding flipped instructions and how they work in the real world. In J. Keengwe (Ed.), Handbook of research on active learning and the flipped classroom model in the digital age (pp. 72–90). Hershey, PA: IGI Global. doi:10.4018/978-1-4666-9680-8.ch004

Shambaugh, N. (2003). Use of CoWebs in scenario-based ID instruction. Proceedings of the 26th Annual, Anaheim: Selected papers on the practice of educational communications and technology (pp. 400–407). Association for Educational Communications and Technology (AECT).

Shambaugh, N. (2007). Using developmental research to evaluate blended teaching in higher education. In P. Richards (Ed.), Global issues in higher education (pp. 1–28). Hauppauge, NY: Nova Science Publishers.

Shambaugh, N. (2016). Documenting the online course (Working paper EDP640 2010-2015). West Virginia University.

Shambaugh, R. N., & Magliaro, S. G. (1997). Mastering the possibilities: A process approach to instructional design. Boston, MA: Allyn & Bacon.

Shambaugh, R. N., & Magliaro, S. G. (2001). A reflexive model for teaching and learning instructional design. Educational Technology Research and Development, 49(2), 69–92. doi:10.1007/BF02504929

TED. (2011, March). Salman Khan: Let's use video to reinvent education [Online newsgroup]. Retrieved from www.ted.com/talks/salman_kan_let_s_use.videos_to_reinvent_education.html

Tharp, R. G., & Gallimore, R. (1988). Rousing minds to life: Teaching, learning, and schooling in social context. Cambridge: Cambridge University Press.

Toto, R., & Nguyen, H. (2009). Flipping the work design in an industrial engineering course. Proceedings of the 39th ASEE/IEEE Frontiers in Education Conference, San Antonio, TX, USA. doi:10.1109/FIE.2009.5350529

Vygotsky, L. S. (1978). Mind in society: The development of higher psychological processes. Cambridge, MA: Harvard University Press.

Wells, M., & Holland, C. (2016). Flipping learning! Challenges in deploying online resources to flipped learning in higher education. In J. Keengwe (Ed.), Handbook of research on active learning and the flipped classroom model in the digital age (pp. 1–18). Hershey, PA: IGI Global. doi:10.4018/978-1-4666-9680-8.ch001

Zimmerman, B. J. (1990). Self-regulated learning and academic achievement: An overview. Educational Psychologist, 25(1), 3–17. doi:10.1207/s15326985ep2501_2
KEY TERMS AND DEFINITIONS

ADDIE Model: A generic representation of instructional design consisting of the phases of Analysis, Design, Development, Implementation, and Evaluation.

Asynchronous: Online delivery that can be accessed at any time, any place.
Flipped Classroom: An inversion of classroom activities and online instruction, in which the online component provides students with instruction (usually video) and supporting tasks, while in the F2F setting instructors address student needs and online task performance.

Hybrid: A mix of F2F and online activity.

Instructional Design: A structured process of designing educational products and interventions, composed of interconnected phases of activity (see ADDIE Model).

Models of Teaching Framework: A means to document approaches to teaching. Joyce, Weil, and Calhoun (2014) organize teaching models across different families in terms of their major learning outcome (e.g., behavioral, cognitive, social, personal).

Screencasting: A software product that records human activity on a computer and may feature the ability for a user to edit the audio and video and add other media elements.

Synchronous: Online delivery that occurs in real time with users.
Chapter 14
Creating Inclusive Online Learning Environments That Build Community and Enhance Learning Morris Thomas University of the District of Columbia, USA Rachelle Harris Milestone’s Educational Consulting, USA Arlene King-Berry The University of the District of Columbia, USA
ABSTRACT

The current status of today's society is driven by and involves technology. Many people cannot function without their cell phones, social media, gadgets, tablets, and other forms of technology through which people interact. Many of these technologies depend upon and are utilized within an online context. However, as it pertains to online learning environments, many faculty struggle with developing and implementing opportunities that build a sense of community for their learners. This chapter: 1) discusses key factors that impact student engagement, 2) addresses factors that facilitate continued engagement for diverse online learners, 3) provides evidence-based practices for creating and sustaining online learner engagement, and 4) offers real-world suggestions from the online teaching experience of the chapter's authors.
It is virtually impossible to engage learners in purposeful and meaningful inquiry without the Internet and communication technologies to precipitate and sustain discourse that is central to higher order learning… – Dr. Randy Garrison & Norman Vaughan
DOI: 10.4018/978-1-5225-1851-8.ch014
OVERVIEW

The current status of today's society is driven by and involves technology. The average person is not willing to or cannot function without his or her cell phone, which is a popular medium for social media and acts as the gateway for other gadgets such as smart watches, tablets, and other forms of technology people utilize to interact with one another. Many of these technologies depend upon and are utilized within an online context. However, as it pertains to online learning environments, many faculty struggle with developing and implementing opportunities that build a sense of community for their learners.

One of the leading issues in online learning environments involves a lack of community. Oftentimes learners do not feel connected to their peers and, in many cases, their instructors. Given that these same learners' primary social interactions and relationships mostly occur in similar mediums outside of the learning context, this seems odd and serves as the impetus for further discussion. One of the major critiques or concerns regarding online learning environments is their supposed limitations in providing a learning experience that rivals what learners receive in face-to-face learning environments. It has been suggested that online learning environments lack a sense of community and that learning experiences are not as significant as they are in more traditional learning settings. Garrison and Vaughan (2008) posit that it is essentially impossible to engage learners in significant research without utilizing online communication technologies to increase rapid exchange and involve momentous discussions that are needed to facilitate higher order learning. Moreover, well-designed learning environments are likely more meaningful learning experiences than sitting passively in a lecture hall. Therefore, online learning environments and their design should be given adequate consideration to better serve the growing online learner population. Research in neuroscience and the physiology of learning demonstrates the strong link between emotion and cognition; little real learning occurs in the absence of the strong, positive emotions engendered by deep engagement, motivation, interest, and caring (Zull, 2011). This chapter:

1. Discusses key factors that impact learner engagement
2. Addresses factors that facilitate continued engagement for diverse online learners
3. Provides evidence-based practices for creating and sustaining online learner engagement
4. Offers real-world suggestions from online teaching experiences
The notion of environmental dynamics speaks to the circumstances or conditions that surround the social, intellectual, or moral forces that produce activity and change in a given place (Thomas, Hilton & Ingram, 2015). The circumstances and conditions of online learning environments are important to consider since there is an increased demand for online programs. According to the Babson Survey Research Group, online learner enrollments continue to be the fastest growing sector in higher education. Hence, the analysis of these learning environments is extremely important to ensure that they are inclusive for the ever-increasing learning population. Moreover, it is extremely important to consider that learners have varied needs, and these learning environments should be able to accommodate and serve the broadest learner spectrum (Hurtado, Milem, Clayton-Pedersen, & Allen, 1999). Furthermore, building community is important because several studies suggest that when learners experience learning environments where they feel a sense of belonging, they internalize higher adjustment and satisfaction values with those institutions and are more likely to persist to graduation (Schwitzer, Griffin, Ancis, & Thomas, 1999).
Therefore, creating inclusive online environments may be a solution to assist institutions relying on online programs to improve retention and increase graduation rates.

Using WEB 2.0 tools as a means for building community is discussed in this chapter. Many WEB 2.0 tools exist, and they provide multiple online platforms to expand online engagement and build community in simple and inexpensive ways. It is important to discuss these tools to motivate faculty and learners to utilize them in their learning experiences. In this chapter the authors highlight the following WEB 2.0 tools, in addition to other tools that can be used to enhance the online learning environment:

• Google Docs
• MuseoTech.Net
• Blogs
• Doodle
• Crowdgrader
• SlideShare.Net
• PeerWise
The seven aforementioned WEB 2.0 tools provide several opportunities to create inclusive online learning environments that build community and enhance learning. This chapter also describes innovative pedagogies and technologies that enhance learner interaction in online learning regardless of their cultural, linguistic, or diverse learning abilities. In addition, this chapter provides instructional resources for higher education faculty and instructional designers to use in online learning environments. Faculty can gain insight from research-based practices that support high levels of engagement and achievement for all learners. This includes strategies that focus on learner-instructor, learner-content, and learner-learner interaction. This chapter delves into learners’ interests, offers appropriate challenges, and offers tools to increase motivation for online learners. Instructional designers can increase their toolkit with authoring and planning tools for learner interaction in discussion board forums, live chat rooms, and live web conferences, as well as use learning analytics to create engaging, robust learning environments. This chapter is not designed to give faculty an exhaustive online resource toolkit; however, it is designed to encourage the use and practicality of these tools for building inclusive and community oriented online learning environments.
TEACHING AND LEARNING IN HIGHER EDUCATION

Prior to looking specifically into online classroom environments, it is helpful to consider teaching and learning as a whole. Nilson (2010) asserts that some of the most common teaching methods are “lecture, interactive lecture, recitation, directed discussions, writing/speaking exercises, classroom assessment techniques, group work/learning, learner-peer feedback, just-in-time teaching, case method, inquiry-based or guided, problem-based learning, project-based learning, role plays and simulations, service-learning with reflections, and fieldworks/clinicals” (p. 107). These methods are meant to help educators reach specific learning goals. Although training in course design and instructional delivery could enhance instructional impact, it is not a common practice for instructors to receive training in course design and
delivery. In these cases, the instructors are likely to use their imaginations and teach how they would like to be taught and/or recycle pedagogy they experienced as learners (Moore, 2013). Teaching methods are not meant to be concrete additions to the learning process; they must be intertwined with instructor reflection and revision. Teaching methods should be discovered, deployed, and frequently revised to elicit learner learning (Ambrose, Bridges, DiPietro, Lovett & Norman, 2013). The authors provide some information to assist instructors in improving teaching and learning practices, specifically as they pertain to creating online learning environments that build community and enhance learning.
THE ONLINE LEARNING CONTEXT

Online community is a phrase that inundates the higher education environment today; however, attempts to define the phrase often encounter difficulty because the concept keeps morphing in relation to online learning. Regardless of the constantly evolving landscape of the online learning environment, the concept of what constitutes a learning community remains at the foundation of an online community. What does it mean to be a part of a learning community? This is the foundational question that must be addressed before identifying the components of an online learning community. One of the most important aspects of fostering an online learning community rests with having a group of individuals who are set on learning a similar set of objectives. These learners must have a way or various ways of interacting with one another. Furthermore, the online community must have a sense of exclusivity; that is, access to the learning space is limited to a particular group of people or learners. Finally, the interaction within and among the learners is one of reverence within the learner group, and this respect also extends to the interface between the learner group and the instructor. Learners have a level of respect for one another, and they, as a group, have a level of respect for the instructor. Building an online community without such levels of respect yields a breakdown in a somewhat fragile community due to the lack of face-to-face communication. This fragile state of interaction is usually what makes both instructors and learners apprehensive to embrace the online option; however, instructional designers (IDs) who effectively build online learning communities help to decrease anxiety and ensure positive learning experiences within this context. IDs open minds to the novelty and excitement that is associated with online learning communities when successfully executed. Building online communities gives voices to those less boisterous learners who are otherwise doomed to educational experiences that render them muffled back-row or "corner hugger" learners in a traditional face-to-face learning environment. The conservative or shy nature that these learners usually exhibit is lifted when working beyond the constraints and structure of the traditional classroom setting, and inclusive opportunities are afforded to them that allow for various means of successful participation. More inclusive online learning environments even work beyond the constraints and structure of Learning Management Systems (LMSs) through social networking platforms and Web 2.0 applications.
HOW ONLINE LEARNING SERVES ITS LEARNERS

The demand for online learning is rapidly growing. This demand is due in part to learners seeking opportunities to obtain their education in online environments because of their flexibility and cost efficiency.
According to the Babson Survey Research Group (2013), over 6.7 million learners were enrolled in at least one online class in 2011, compared to only 1.6 million in 2002 in higher education institutions. Approximately 72 percent of these institutions offered online courses in 2002. However, that number had steadily increased to nearly 87 percent in 2012. It is certain this number has increased and will continue to grow.

Online learning environments provide a platform for faculty and their learners to experience learning without time or geographic restrictions (Brown & Green, 2011). Learning management systems provide many advantages in the availability of various instructional strategies through the many applications that exist and, in most cases, are built directly into the LMS. Similar to the face-to-face learning environment, instructional strategies are most effective when employed specifically to meet particular learning goals and objectives. Online learning environments permit a range of interactive methodologies. Instructors find that in adapting courses to online models, they pay more attention to the instructional design of their courses. As a result, the quality, quantity, and patterns of communication learners practice during learning are improved. The strength of online learning lies in its capacity to support multiple modes of communication, including any combination of interactions (e.g., learner-learner, learner-faculty, faculty-learner, faculty-faculty, learner-others, or others-learners).

In addition to the aforementioned, online learning meets the needs of its learners by placing the locus of control over learning in the hands of the learners as opposed to the faculty. In traditional face-to-face learning settings, it is often the instructor who provides information. These environments involve mainly direct instruction and are therefore teacher-centered learning environments. However, in online learning, the learners have an inherently greater sense of responsibility and involvement as it pertains to their learning (Brown & Green, 2011). The online learning environment assists in this dynamic by providing instantaneous access to a seemingly unlimited amount of information. As a result, learners are no longer dependent on the faculty for knowledge, thus making learning a more collaborative, contextual, and active process. Online teaching is contingent upon experiences created and facilitated by faculty prepared to deliver instruction through this medium (Brown & Green, 2011).

Today's learners are very diverse. Therefore, it is imperative for faculty to include activities that accommodate the many preferred learning styles that exist (Moore, 2013). Significant learning experiences are facilitated by offering various instructional modalities in a given learning environment. In creating online learning, multiple instructional strategies should be employed to best meet the needs of its learners. Adapting some of the models that exist in traditional face-to-face learning environments provides a rich learning structure through which learners are more likely to achieve desired learning outcomes. Online learning provides increased learning experiences for learners by providing convenience and flexibility, accessibility, and a wide range of course options (Means, Toyama, Murphy, Bakia & Jones, 2010). In addition to learners being able to make more decisions regarding their education, online learning fosters various opportunities for learner enrichment.
Learner enrichment is obtained through the chance for interaction. Online courses provide a space where every learner can have a voice and share their unique ideas and perspectives. In online environments, learners can share without fear of speaking in front of their peers and with decreased anxieties compared with face-to-face instructional settings. Online learning environments also provide a sense of empowerment for learners. Learners find faculty more approachable and often feel more comfortable communicating with their instructors through online chats, emails, and discussions rather than face-to-face. Means et al. (2010) also report improved academic performance in online learning environments. On average, learners in online learning environments perform modestly
better than those receiving face-to-face instruction due to increased time on task. Means et al. (2010) posit that learners in online learning environments spend more time on task than learners in face-to-face courses.

The increased focus on quality for online courses has a direct correlation to instructors' abilities to meet their departments' and colleges' overarching objectives. There is an intense and specific focus on making sure the faculty teaching online courses are effectively using technology to help learners achieve the learning goals of the course. Faculty are encouraged to pay closer attention to course structure and format due to the nature and essence of online learning environments; thus, many faculty members report that preparing to teach in the online structure prepares them to better dissect the courses they teach. This often equates to faculty who pay particular attention to course objectives. When focusing on course objectives, the aim is to make sure the course structure, activities, and assessments are in alignment with the course objectives. This increases the accountability of instructors in online courses, because they are required to build courses with the end in mind.
BUILDING COMMUNITY IN THE ONLINE LEARNING ENVIRONMENT

The sense of community in an online learning environment is an essential part of the overall learning context. The connection between learners and faculty is equally important in face-to-face and online courses. Faculty must build successful rapport with learners, and learners need to make connections with other learners to some degree in order to function as learner units in situations such as collaborative processes. As Rovai (2002) suggests, it is possible that emotional connectedness may provide the support needed for online learners to successfully complete courses and programs as well as learn more when in online learning environments. To foster a sense of community, active engagement is necessary.

Active engagement in the learning process is a cornerstone of meaningful educational outcomes. Educational research consistently confirms that engaged, empowered learners are more likely to undertake challenging activities, to be actively engaged, to enjoy and adopt a deep approach to learning, and to exhibit enhanced performance, persistence, and creativity. Despite attributes of the online environment that can enhance learner engagement (e.g., self-pacing, flexibility, interactivity), it is difficult to elicit and maintain engagement in most instructional situations, and online learning presents special challenges in this regard. Cull, Reed, and Kirk (2010) articulate the following challenges inherent in online learning, where face-to-face contact is absent and learners and instructors are in contact only via the internet:

• Faculty are unable to read nonverbal cues indicating that learners are disengaged, frustrated, or unenthusiastic.
• Faculty may find it harder to express their own feelings of enthusiasm, encouragement, or concern.
• The sense of anonymity that characterizes the online environment may make it easier for learners to withdraw, participate minimally, or completely disappear from the instructional experience.
• Learners may be predisposed to disengagement as a result of having enrolled in online courses based on the erroneous assumption that online courses are easier and will require less of their time.
These possible challenges to the online learning environment make a further argument for the necessity of building online learning communities that foster engagement for all learners with elements such as interactive discussions, collaborative opportunities, and autonomous engagement.
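Because disengaged online learners can quietly disappear without the nonverbal cues available in a F2F classroom, simple participation counts from the discussion board can serve as an early-warning signal. The sketch below is hypothetical: the roster and post records are invented for illustration, and a real course would pull these from the LMS.

```python
# Hypothetical sketch: flagging learners with no discussion-board posts in the last
# N days, as an early-warning signal for disengagement. The roster and post records
# are invented for illustration; a real course would pull these from the LMS.
from datetime import datetime, timedelta

def silent_learners(roster: list[str], posts: list[dict],
                    now: datetime, window_days: int = 7) -> list[str]:
    """Learners on the roster with no posts within the recent window."""
    cutoff = now - timedelta(days=window_days)
    recent_authors = {p["author"] for p in posts if p["posted_at"] >= cutoff}
    return [learner for learner in roster if learner not in recent_authors]

if __name__ == "__main__":
    roster = ["amara", "ben", "chen", "dara"]
    posts = [
        {"author": "amara", "posted_at": datetime(2016, 10, 3, 14, 5)},
        {"author": "chen", "posted_at": datetime(2016, 9, 20, 9, 30)},
    ]
    now = datetime(2016, 10, 5, 8, 0)
    print("No posts in the last 7 days:", silent_learners(roster, posts, now))
```

A flag of this kind is not a judgment about the learner; it is simply a prompt for the kind of personal outreach that sustains community in an online course.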
KEY FACTORS THAT IMPACT LEARNER ENGAGEMENT

Several factors impact the engagement of individual learners in the learning process. Motivation, interest, and caring can be considered key or essential to nurturing and eliciting learner engagement, especially in an online learning environment.
Motivation
Engagement by individual learners in the learning process is strongly influenced by their motivation. Motivation, in this context, is defined as a "state of readiness" that fluctuates with time, according to the situation and circumstances (National UDL Center, 2012). Learning skills and strategies requires sustained attention, effort, and engagement in instructional activities. Learners who are motivated are more likely to engage in activities that help them learn and achieve (Kelly, 2012). Thus, understanding what motivates individual online learners is crucial. Most learners are motivated by success; instructors can provide opportunities for success by setting clear expectations, providing meaningful feedback, and providing additional resources.
Suggested Practices
Following is a non-exhaustive listing of suggestions for motivating online learners in a variety of content areas.
Mathematics
• Instructor could send messages of encouragement when problems are difficult to solve (e.g., "Don't give up!", "You can do it!", "Take a break and come back," "You know this material!").
Science
• Instructor could include publisher-generated tests and quizzes that provide immediate feedback/results. These tests and quizzes let learners identify areas they need to improve.
English
• Instructor might require learners to interview a journalist or news reporter regarding the value of creative writing and public speaking and how those courses contributed to their success.
All Courses
• Instructor could provide an explanation of how a course will contribute to learners' success in their chosen field. S/he could also provide incentives (e.g., rewards, points, and/or special attention from the instructor) that would inspire learners to complete assignments.
• Instructor could encourage use of tools and new learning activities that would allow learners to grapple with real-world problems.
Real World Example
Whenever her delivery of content does not appear to "click," one of the chapter authors asks learner volunteers who do understand the content to use their own experience, perspective, etc. to explain the content to their peers in the Chat Box or some other electronic group discussion forum. She provides extra credit for these volunteers. The instructor also provides opportunities for learner input into required assignments. For example, when required to develop an inclusive lesson for a class containing learners with disabilities, education learners are allowed to choose the specific topic, materials, presentation format (e.g., audio digital devices), etc. they would use to carry out the lesson.
Interest
Several strands of research demonstrate that displaying a personal interest in learners is not only nice but also necessary for real learning to occur. Learners differ significantly from one another with regard to what attracts their attention and engages their interest. The interest of individual learners also differs over time and circumstance. For example, learners' interests change as they develop and gain new knowledge and skills, as their biological environments change, and as they develop into self-determined adolescents and adults. It is, therefore, important that instructors have alternative ways to elicit the interest of learners as well as ways that reflect the important inter- and intra-individual differences learners present (National UDL Center, 2012).
Suggested Practices
Instructors can address the unique preferences and requirements of diverse learners by offering online experiences that are individualized, differentiated, or personalized to fit the interests, prior experiences, and needs of their online learners. Specifically, instructors can provide:
• Individual, small group, and whole group tasks, discussions, and projects;
• A choice of online learning activities that offer a range of interactive options (e.g., completing an individual game or participating in an online simulation with other learners);
• Multiple levels of scaffolding, instructions, and/or task structure (e.g., learners might be tasked with designing a WebQuest or Internet research project featuring several levels of structure, ranging from independent web searches to instructor-guided links with scripted questions and tasks) (National UDL Center, 2012);
• Information and activities that are relevant and valuable to learners' interests and goals;
• Varied activities and sources of information that are:
◦◦ Personalized and contextualized to learners' lives;
◦◦ Culturally relevant and responsive;
◦◦ Socially relevant;
◦◦ Age and ability appropriate;
◦◦ Appropriate for different racial, cultural, ethnic, and gender groups;
• A choice regarding the types and content of rewards or recognition available in the instructional environment.
Real World Example
As an online instructor, one of the authors used myriad activities to help online learners identify and incorporate their interests. For example, she:
• Had learners complete profiles at the onset of the class. Learners were required to post photos of themselves, introduce themselves, and share their interests via webcam or cell phone videos.
• Helped online learners connect their own personal interests with the academic content they intended to teach.
• Joined in online discussions regarding learners' interests. Knowledge of these interests enabled the instructor to make course work appealing to learners.
• Used technologies embedded in course management systems such as Blackboard Collaborate, Wikis, Blogs, text messages, or even a simple telephone call.
• Asked learners to find websites, news stories, and other online resources of relevance to the course, using some kind of social bookmarking service.
• Encouraged learners to look for connections between course material and their personal and professional interests.
• Designed authentic activities whose purpose was clear to the learners.
• Provided tasks that allowed for active participation, exploration, and experimentation.
• Invited personal responses, evaluation, and self-reflection related to content and activities.
Relevant Resources
A number of helpful web-based resources exist that can be used by online instructors to stimulate the interest of their learners. Really Simple Syndication and Crayon.net are illustrative of the variety of resources available.
• Really Simple Syndication (RSS) feeds send free, up-to-date content to one's computer via the Internet (www.weblogg-ed.com). The RSS Quick Start Guide can help learners and teachers learn how to access RSS feeds and put them to use in the classroom.
• Crayon.net (http://www.crayon.net/) offers electronic templates that enable learners to create their own newspapers. The site allows learners to bring multiple sources together, thus creating an individualized and customized newspaper.
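For instructors or instructional designers comfortable with a little scripting, the minimal sketch below shows one way RSS items could be pulled programmatically, for example to be reposted as a weekly LMS announcement. It is only an illustration, not part of any of the tools above: it assumes the third-party feedparser package, and the feed URL is a placeholder rather than a real course feed.

```python
# Minimal sketch: pull the latest items from an RSS feed so they can be
# shared with learners (e.g., pasted into an LMS announcement).
# Requires the third-party "feedparser" package (pip install feedparser).
import feedparser

FEED_URL = "https://example.edu/news/rss"  # placeholder; substitute a real feed URL


def latest_items(url: str, limit: int = 5):
    """Return (title, link) pairs for the most recent entries in the feed."""
    feed = feedparser.parse(url)
    return [(entry.title, entry.link) for entry in feed.entries[:limit]]


if __name__ == "__main__":
    for title, link in latest_items(FEED_URL):
        print(f"- {title}: {link}")
```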
Caring
Caring has not always been seen as an important factor in online learning (Jones, 2011). However, a growing body of research suggests that online learners need to feel that the instructor and other learners care that they learn. For example, in a study of 609 online learners, Jones found that caring was the number one predictor of online instructor ratings.
Suggested Practices
There are many basic strategies that instructors can utilize to communicate a sense of caring. Specifically, they can:
• Provide frequent feedback regarding performance.
• Ask learners whether they feel that they are getting the support they need.
• Maintain a directory of online resources that can be shared with learners who may need help in dealing with health, financial, and childcare issues.
Real World Example
One of the chapter authors communicated caring during an online course she taught as follows:
• As the instructor for an online course in special education, she provided feedback for all multiple-choice test items by programming into the evaluative portion of the test a positive comment for each correct response (e.g., "Great Answer") and encouraging feedback for each incorrect response (e.g., "Sorry. Review the correct answer and you will never miss these points again.").
• When a learner failed to submit an assignment on time, the instructor sent an email asking to schedule a time to meet in the Chat Room or to have a conference call. The instructor let the learner know that she was concerned and wanted to know if all was well. The learner, who had missed several assignments and had failed to respond to voice and text messages, finally contacted the instructor. She explained her non-responsiveness to the phone calls and texts by indicating that she had been undergoing chemotherapy, had been too weak to respond, and had had no one available to respond for her. She stated how good it made her feel that someone had called and expressed concern. She decided to drop the course that semester but returned the next semester, once again thanking the instructor for her words of comfort.
FACTORS THAT FACILITATE LEARNER ENGAGEMENT
In addition to the key factors that impact the engagement of learners, other factors are also critical to facilitating and sustaining the involvement of learners in the learning process. Vital to the ongoing engagement of online learners, especially those whose intellectual, behavioral, emotional, or physical characteristics may increase the challenge of learning online, are: (a) the ability of learners to self-regulate their learning, (b) the interaction relationships available in the online experience, and (c) the accessibility of online course content.
Self-Regulation
In order to sustain the effort and concentration that learning requires, learners must regulate their attention and affect. Learners differ considerably in their ability to self-regulate in this way. These differences may reflect disparities in their initial motivation, in their capacity and skills for self-regulation, in their
susceptibility to contextual interference, and so forth. A key instructional goal for creating inclusive online environments is to help learners develop self-regulatory skills that will sustain their engagement in the online learning process.
Suggested Practices
In order to facilitate online learners' ability to self-regulate their learning, instructors must help learners recognize that self-regulation should occur at each step of the learning process:
1. Before engaging in the learning task (i.e., by analyzing the task, setting goals, and developing a plan of action);
2. While learning or completing the task (i.e., by utilizing learning strategies and monitoring how well the strategies are working); and
3. Following completion of the learning task (i.e., by engaging in self-reflection and self-evaluation) (Zimmerman, 2002).
Instructors can incorporate, throughout their course requirements, features that equip learners with the self-directive dispositions, knowledge, and skills that enable them to learn effectively online. Specifically, instructors can:
• Help learners sustain effort and concentration in the face of distractors:
◦◦ Consistently remind learners of their ultimate goal and its value.
◦◦ Encourage learners to create strategies for self-reinforcement.
• Encourage learners to use computer scheduling and task management tools (e.g., calendar alerts found in most smartphones).
• Provide a range of demands and possible resources that allows all learners to find challenges that are optimally motivating.
• Help learners identify self-regulation methods that can strategically modulate their emotional reactions so that they can effectively cope with and engage in the learning environment:
◦◦ Provide information on, and encourage use of, strategies for stress management.
Real World Example
A persistent challenge faced by one of the authors during an online class was the inability of learners to determine a mutually agreeable time of day to develop online group projects. Many of the learners had classes, were working, or were taking care of their families during the day and on weekends. To assist learners' self-regulation with regard to the learning assignment, the instructor asked each learner to develop a schedule of daily activities and identify a time when s/he would be able to participate in a Blackboard Collaborate session or conference call. Learners identified 11 p.m. as the best time for a 30-minute conference. Additionally, learners agreed that each week one individual would develop a draft of the project and send it to all participants for discussion during their weekly meeting. The lesson learned was this: instructors need not resolve the problems online learners encounter; they merely need to provide, or suggest to learners, tools that can help them manage their own learning more effectively.
Interaction Relationships
Numerous researchers have found that interaction relationships affect engagement within an online environment and that effective online instruction requires opportunities for learners to interact with the instructor and with each other (Maki & Maki, 2007; Zhao, Lei, Yan, Lai, & Tan, 2005). Fletcher, Dowsett, and Austin (2012) have identified three core interaction relationships: learner-instructor, learner-content, and learner-learner interactions. Learner-instructor interactions occur in a live class where the instructor provides content in four areas: participant, chat, audio, and whiteboard. These interactions allow the instructor to provide learners with regular feedback regarding instructional activities, learner performance, etc. Learner-content interactions provide content in a manner that attracts the learner; content can be funny, fascinating, or otherwise engaging enough to pull learners in. Learner-learner interactions provide opportunities for learners to learn from one another.
Suggested Practices
In order to maximize the engagement of learners in online learning, instructors must ensure that activities occurring in the online environment address each of the core interaction relationships.
Learner-Instructor Interactions
• Use discussion board posts to provide learners with regularly occurring feedback.
• Provide Chat Box opportunities for learners to ask questions and seek clarification regarding assignments (e.g., Blackboard, Moodle).
• Use synchronous interactive spaces to provide opportunities for learners to communicate with the instructor (e.g., VoiceThread, FaceTime, Elluminate Live).
Learner-Content Interactions
• Provide interactive sessions with immersive simulations that allow learners to play, fail, and succeed in a safe environment.
• Utilize electronic programs that provide real-time training, demonstration, and an environment of collaboration (e.g., Blackboard Collaborate).
Learner-Learner Interactions
• Require learners to participate in weekly discussions on assigned topics and respond to the posts of their classmates. For example, learners might be required to support or refute a position of their peer(s), providing references that support their own positions.
• Incorporate into instruction and instructional activities electronic programs that can facilitate learner-learner interactions (e.g., the Discussion Board feature of Blackboard).
• Utilize technology to engage learners in online clinical experiences with their peers.
Real World Example
Mandernach (2009) advocated using instructor-personalized multimedia components that promote collaboration in the often "faceless" environment of online learning. In her study of online learners in a general psychology course taught across sequential terms, Mandernach utilized weekly videos and audio-based PowerPoint presentations and designated specific times for e-mail and chatting with learners. She concluded that merging an interactive course protocol with a consistent multimedia presence created a high level of engagement and a strong atmosphere for learning, as evidenced by positive qualitative and quantitative learner feedback at the end of the course. In addition to conveying the expectation of mutual respect in the online environment, it is important to implement activities that promote community building. The first step in building online communities within an online environment is to include a venue for learners to introduce themselves to one another. This is commonly done through an online discussion within the LMS. Instructors may initiate this discussion with their own information to give learners a guide or example of how to do this, or they may post discussion prompts that lead learners to introduce themselves through their perspectives on a particular issue. For instance, an instructor may post a prompt such as one of the following:
1. Write a poem illustrating your teaching philosophy as your introduction to the class. Include as much information about your educational background and current occupational position as possible. Read and respond to four other posts, keeping in mind two people with whom you would like to collaborate on future group activities.
2. You are the facilitator in an online course, and you need to illustrate who you are to your class members without using words. Create a collage that communicates who you are in your personal life and what kind of an educator you are in the online environment. Read and respond to four other posts, keeping in mind two people with whom you would like to collaborate on future group activities.
These activities promote inclusion and take a diverse learning community into account while giving learners opportunities to introduce themselves to one another. They also promote autonomy within the learner population because learners are given a choice of how to respond. Furthermore, these activities go beyond simply providing one's name and educational background to the others in the course; this is important because it quickly conveys who learners are to one another and provides a means for them to select with whom they would like to collaborate on future activities without the instructor making all the choices.
BUILDING COMMUNITY THROUGH COLLABORATION IN ONLINE LEARNING
Collaboration is an essential aspect of building communities in the online learning environment. Collaboration in such environments allows learners to get to know one another in an expeditious manner for the benefit of achieving certain objectives. In various online environments, collaboration should be coordinated within the first week of instruction because these courses do not often last beyond 15 weeks.
Figure 1. Synchronous and asynchronous online tools for collaborative activities
In addition to forming collaborative groups in online courses, learners need a variety of resources to aid them in successfully participating in these collaborative groups. There are both synchronous and asynchronous mediums for successfully implementing collaboration. Synchronous mediums are ways to communicate and collaborate in real time, whereas asynchronous mediums are ways to communicate without coinciding time slots. Figure 1 is a table that illustrates synchronous and asynchronous online tools for collaborative activities.
ONLINE LEARNING ECOSYSTEM INCLUSIVE DESIGN (OLEID)
Morphing technology leads to morphing terms across professional fields, and ecosystem is one of the words that takes on an altered meaning when applied to education. Basic knowledge of an ecosystem brings to mind a common space for plants and animals. The plants and the animals are only some components of the system, which also includes natural nonliving elements such as the earth, the sun, and the climate. All of these components work together in order to ensure that they thrive as a whole unit. The same concept applies to Online Learning Ecosystem Inclusive Design (OLEID) in the sense that course designers create online spaces whose parts work together to allow instructors to interact with learners and learners to thrive.
When one thinks of OLEIDs, the elements that work in unison to make learning possible for a variety of online learners should come to mind. OLEIDs include assessments, discussion boards, online libraries, online tutorials, chat features, webinars, social media networks, email, course syllabi, color layout selection, learning management systems, audio media, visual media, support, reflective surveys, collaborative activities, individual activities, videos, games, blogs, and engaging content. All of these elements work together to promote an online environment whose parts interact in unison for inclusive purposes. These components allow learners with various abilities and technological competencies to interact with one another, learn from one another, and thrive in unison with the instructor and the course content. For instance, the novice chat participant comes into contact with the advanced chat participant because the novice has access to useful tutorials within the learning space that allow for interaction with advanced users. Various ecosystem components meet learners' differing needs. Figure 2 is a sample OLEID. Some of the components of this sample OLEID are multifaceted and may actually be placed in several OLEID categories, which makes them even more valuable and inclusive to the learning environment. OLEIDs provide designers with an abundance of resources that reflect not only the online course but also the surrounding global context, with expert knowledge spread throughout these ecosystems via various online resources, including social media and blogs. OLEIDs' inclusive nature coincides with Beyond E-learning: Approaches and Technologies to Enhance Organizational Knowledge, Learning, and Performance, in which the author is a proponent of fusing all avenues to learning into a space that promotes inclusion (Rosenberg, 2006). The possibilities for inclusion with online instruction are vast due to the electronic resources that are available both centrally and globally.
Figure 2. A sample OLEID
In "Developing and Sustaining the Digital Education Ecosystem: The Value and Possibilities of Online Environments for Learner Learning," the authors state that virtual learning environments focus more on learning than on the restrictions that hinder the traditional learning environment (Ingvarson & Gaffney, 2008). Given this view of the virtual learning environment, it is inevitable that learners in such environments adapt to becoming more independent and willing to travel various roads to acquiring knowledge; therefore, virtual learning environments have the potential to foster more creativity than traditional learning environments in some respects. In assessing such environments, designers have to consider accessibility and usability and design course components that are user-friendly or familiar; this often requires implementing tools that span beyond what is available in the LMS, such as blogs and social media. To assess familiarity and usability, continuous reflective user information is essential to morphing the environment, informing designers of what works and what lacks cohesion with the learning environment. One way to gather reflective user information is via user surveys regarding technological competence; in addition, in-environment tutorials that support learners with limited technological skills are also useful. Providing such supportive material allows learners to flourish into competent users and learners within the OLEID environment, which leads to more competent interaction within the learning community. This competent interaction among the learners works to promote equality in the learner population.
Web 2.0 Applications and Social Networking in the OLEID Community
It is astonishing how web applications have evolved to become so supportive of learning environments and, in some cases, indispensable to learning ecosystems. Web 2.0 environments work within OLEIDs to foster informal learning environments that support the formal learning process. The less structured nature of Web 2.0 applications does not compromise the learning value of these tools within OLEIDs. In the span of the last 20 years, learner interaction with social media has evolved into a catalyst for global educational interaction; therefore, educators have to take these changes into consideration when designing OLEIDs and incorporating acceptable Web 2.0 and social media platforms for learner interaction. This is often difficult for colleges and universities because Web 2.0 features do not fall under the structured arena that LMSs convey; however, it is time for educators to give learners some credit. Just as educators have to trust learners to a particular point within the college or university honor system and believe that most learners will not participate in dubious educational activities, educators need to let go of the perception that Web 2.0 platforms are useless features, because they are proven educational assets. Consider hard copy books compared to digitally generated texts. Although hard copy books are not included in the above sample OLEID, they are not obsolete in the learning ecosystem framework; books now exist in at least three formats (audio, hard copy, and digital), with digital and audio being most convenient or comfortable for some. While some relish opening the crisp pages of a new book, others prefer to open a book on a laptop or to carry an iPad, smartphone, or laptop with access to thousands of online texts. The point is that all forms of books, whether audio, digital, or hard copy, are viable supports for learners' learning habits. All of these mediums work well to meet the varied needs of diverse learners. It is difficult for some to remove themselves from the tangible nature of the hard copy textbook, and it is freeing for others to have access to an audio or digital form of the same textbook. What does it take to enjoy such fruitful outcomes of technology? It just takes acceptance that one is just as beneficial as the other, and educators must accept the beneficial
nature of Web 2.0 applications. However, there are hindrances to the acceptance of Web 2.0 applications in learning communities. The author of "The Facebook Guide for Teachers - eLearning Industry" documents ways to implement Facebook into an online course while imposing some of the structure and restrictions that are usually afforded to LMSs. The author suggests setting boundaries with contracts that illustrate usage restrictions (Pappas, 2013, para. 12-13), a practice that may be amenable to college or university administrators concerned with privacy issues. The author goes on to say that the "age of fast-paced technology" makes it essential to "properly include social media sites like Facebook in meaningful, professional, and engaging ways that reach every learner and encourages inclusion and participation" (Pappas, 2013, para. 22). There is no doubt that social media has benefits that span beyond the LMS, and these benefits are being presented to educators in a more structured manner that is working to alleviate some of the apprehension related to utilizing social media in OLEIDs.
Enhancing Learning in Online Environments
Simple strategies to E.N.H.A.N.C.E. (Engage, Navigate, Highlight, Assessment, Network, Connect, Edutain) the learning that takes place in an online environment are captured in the acronym explained below. In course design, it is easy for instructors to complicate the course for themselves and for the learners. The E.N.H.A.N.C.E. Learning Model is all about doing the opposite. The authors have found simplicity to be the key to increasing the likelihood that their courses will have the capacity to both build community and enhance learning simultaneously. Each word in this acronym offers a simple suggestion to encourage thoughtful preparation in designing a course that will provide an environment conducive to learning. See the E.N.H.A.N.C.E. Learning Model in Figure 3.
Figure 3. The E.N.H.A.N.C.E. Learning Model
• Engage: It is important to think of ways to engage and involve the learners in the learning process. This does not have to be difficult, and it can be achieved through activities that learners can perform to become part of the learning experience. Get to know the learners; ask them what motivates them. How will this material or course be something they can use immediately, or how might it benefit them in the future? Think about what intrinsic and extrinsic factors will contribute to their involvement.
• Navigate: Instructors should ensure that the course is well designed and easy to navigate. They should look at the course from an objective point of view, or try to see it from the learners' perspective. Vet the wording used in the course to determine whether it makes sense to the learners or whether it is simply instructor relevant. If learners require an explanation of the instructor's explanation, the language most likely is not learner friendly. Do the examples make sense for learners, or do they work to confuse them? Are the activities and course structure correlated with the appropriate development time in relation to the course's design? Can the learners navigate the course with ease and clear understanding? The course should not resemble a treasure hunt or maze but should be clearly organized for the learners.
• Highlight: The term highlight is most useful for the learners. The notion of highlighting involves informing learners of what is important as it pertains to the course, and it is related to the previous term, Navigate. Highlighting not only involves placing emphasis on what the learners should pay attention to at any particular moment in the course, but it also encompasses the notion that the course cannot and should not cover everything involving a particular subject. Most semesters are approximately fifteen weeks, or three and a half months. This is a limited time frame, and one cannot realistically expect learners to learn all there is to know about a given subject in such a compacted time frame. It is not the learners' responsibility to decipher what is most important; it is up to the instructors (the subject matter experts) to highlight what is essential for their learners to gain during this limited time frame and to focus on this information. Include activities, readings, assignments, etc. that align with this essential information. Highlighting is a practical strategy for reducing learners' difficulty in comprehending information. This strategy also increases learners' potential to retain and employ the information covered in the course, and it makes it more realistic for learners to attain mastery of the course content.
• Assessment: Assessment should be ongoing and include both formal and informal assessment opportunities to provide learners with accurate gauges of their performance in courses. Learners should not be surprised at mid-terms or on their finals as to how they are performing. Neither should learners be given only a few chances to succeed in a course; it is okay to provide learners with multiple chances to improve. The goal of a course should not be rooted in rigor but in learning content. Learning involves transfer of knowledge, and some knowledge is obtained through trial and error, which includes making mistakes.
The authors are not saying that grading in the traditional sense has become obsolete, but in addition to traditional grading methods, more opportunities should exist in the form of assessment that will enhance learning. Both formative and summative assessments should be utilized in a balanced manner. It is also a common misconception that assessments are separate from instruction and/or learning. For instance, some instructors think, "I teach first, and then I assess the learners," but on the contrary, teaching and learning also occur during the assessment process. The assessment itself can provide teachable moments and learning opportunities when designed properly.
• Network: Network involves providing opportunities for the learners to get to know those with whom they are studying; after all, it is very likely that these learners will transition from classmates into colleagues. In many instances, learners are not provided a chance, built into the course's design, to foster these pivotal relationships. Simple activities like icebreakers, and even content-related projects and tasks that include a social and interactive component, can allow learners to interact with one another. The instructor can facilitate some aspects of these interactions to ensure that networking is taking place. The relationships that are built in these settings can be lasting and also have a positive impact on the learning that takes place (Rovai, 2002).
• Connect: Connect is similar to Network, but in this context it refers to the instructor connecting to the learners. It has become quite normal for instructors to place distance between themselves and their learners for reasons such as professionalism and in an effort to create the instructor-to-learner dynamic. Nevertheless, it can be very powerful for learners to see instructors as individuals with whom they can relate. Learners are more likely to maximize their learning experiences when they perceive a deeper connection to their instructors (Thomas et al., 2015; Maki & Maki, 2007). Should instructors not build a connection with their learners, the community aspect of the learning environment is diminished, and the overall learning can be negatively affected.
• Edutain: Edutain is a combination of the words educate and entertain. The term is fairly self-explanatory; it simply involves learning through methods that both educate and entertain. Just because information is being taught does not mean it cannot be fun and entertaining. In fact, it is more likely that learners will learn more when the activities included in courses afford them chances to enjoy themselves. To edutain, think of simple yet creative ways to capture the attention of the learners while covering the course's content. This can include games, music, videos, and movies. Technology affords many opportunities to deliver fun, entertaining, and educational experiences for learners.
Incorporating the E.N.H.A.N.C.E. Learning Model into a course does not have to be an arduous task. Its simple elements enhance learners' experiences.
Inclusivity and Equity for Online Learners
Another important aspect of enhancing learning in the online environment is to employ strategies that create inclusivity for all learners. The online classroom does not assume sameness or render differentiation moot. The main task for online instructors who are concerned with inclusion is not to view all learners as equal but rather to create an equalizing environment in which learners have access to equal education. An OLEID works with the acceptance that differentiation is essential to working toward inclusion in the online learning environment. Learners are distinctive and must have learning experiences that allow them to showcase their strengths and support their deficiencies. The various components in the sample OLEID (Figure 2) include technology and tools that work to serve all learner populations. The ecosystem includes tools that work to promote equality for those who have disabilities that may hinder their ability to showcase various talents. For example, the Wireless Page Magnifier is a tool that works within the ecosystem to give visually impaired learners access to course content that they may not otherwise be able to access without magnification, and the tool is technologically advanced and efficient in that it replaces the age-old, bulky magnifiers that learners once had to endure. It is portable as well, which is an improvement because learners no longer have to travel to the tool; the new
version may be easily placed in a small box and delivered to learners. The online course designer must work to incorporate OLEIDs that are sympathetically designed with all learners in mind. Without such systematic design, inclusion in the online environment is fraught with hindrances that impede holistic learner progress and success.
Universal Design for Learning
As the conversation about inclusive teaching grows, so does the complexity of the issue. The term "inclusive" has become a moving target, making it difficult for educators to understand educational issues and create solutions to remedy them. A common understanding is needed of the nature of inclusive instruction and its potential efficacy in increasing the quality and reach of education (McGuire, Scott & Shaw, 2006; Tinto, 2008). Universal Design for Learning (UDL) provides a much-needed framework for discussing inclusion in education. Notably, it provides clear recommendations for proactively addressing inclusion issues from a broad perspective that does not simply center on race, class, gender, and/or ability. UDL includes three components that can be utilized as a framework for illustrating inclusive teaching practices among instructors. It provides clear, systematic possibilities for inclusive teaching that are more explicit than mere suggestions from perspectives that heavily critique the educational system but do not provide tangible solutions (Archer, Hutchings & Ross, 2003; Bok, 2006). UDL rejects one-size-fits-all approaches to teaching through traditional delivery of curriculum and assessment (Meo, 2008). Online instructors must design their courses with learner-centered approaches in mind; furthermore, being able to invest in a design that is amenable to various learners is also essential to ensuring a course design that is inclusive. UDL is all-encompassing in terms of the instructors and the learners, and it is a research- and practice-based framework (McGuire & Scott, 2006). UDL is based on research on learner differences and best practices for teaching and assessment (Rose & Meyer, 2006). UDL scholars have identified three "networks" as critical to learning: the recognition, the strategic, and the affective networks. The recognition network constitutes the "what" of learning. Learners recognize and begin to process information and ideas as they are represented through language, symbols, and objects. Recognition is accommodated most effectively when teachers provide multiple means by which learners experience and acquire information and ideas. The strategic network, the "how" of learning, engages learners in multiple means of organizing and expressing what they are in the process of learning. The affective network, the "why" of learning, sparks learners to get engaged in the learning process and stay engaged. To best meet the needs of the affective network, UDL asserts that teachers need to provide multiple means of engagement with content (CAST, 2011; Rose & Meyer, 2006). In sum, UDL promotes, as core principles, multiple means of representation, engagement, and expression. Encouraging educators to be intentional about the development and use of classroom strategies, techniques, and practices with each principle in mind simultaneously, while achieving the learning goal, is a new connection that the implications of this research have to offer existing UDL literature. The authors are simply suggesting that, if an instructor plans to use any teaching technique in the online environment, he or she should be conscious of how this activity will assist learners in reaching desired learning goals. UDL can be used to shape, focus, and improve an instructor's training and professional development (Leichliter, 2010). The application of UDL is necessary to address issues of access and equity in colleges and universities.
UDL should not be viewed as a practice only for learners with disabilities or other characteristics that marginalize them. The inclusive practices of the UDL framework inform best practices for all learners (Sopko, 2009).
Accessibility
The external environment should provide options that equalize accessibility for widely diverse learners. While multimedia provides a dynamic interactive environment, instructors must be aware that its use can exclude some learners. Learners with hearing, speech, language, learning, emotional, or mobility impairments may be excluded, as may those whose primary language is not English. Instructors must consider whether, and how well, the attributes of instructional materials match the attributes of diverse learners, keeping in mind that a single material attribute may have opposite impacts on learners with particular special needs. For example, the presence of graphics produces a barrier for the learner who cannot see, yet it can clarify a concept for the learner who is dyslexic. The challenge for designers is to create appealing and inventive pages that provide content in a variety of ways and formats, with alternatives available for those who cannot utilize every multimedia option (Stinson, 2016). Familiarity with the principles of Universal Design can also help instructors ensure that course content is accessible for all learners.
Suggested Practices
All online instructors should add an accessibility statement to their syllabi. The statement should be in accordance with the Americans with Disabilities Act (ADA) and Section 504 of the Rehabilitation Act. It should let learners who might have disabilities know:
a. That support services are available at the institution;
b. That they can obtain information in the online syllabus regarding how to access these services; and
c. That they must provide information verifying their disability.
Following is a non-exhaustive listing of suggestions that online instructors can use to facilitate accessibility to their online courses for learners with diverse learning characteristics and needs. The first part of the listing addresses accessibility challenges that individual learners may face; the second part addresses accessibility challenges posed by specific academic content.
Visual Impairments
• Increase font size on webpages; use large font on PowerPoint slides.
• Use large text with a high contrast of white or yellow on dark gray so no color inversion is necessary.
• Use ZoomText or NVDA screen reading software.
• Use Class Act, a resource to support instructors and staff who work with Deaf and Hard of Hearing learners in mainstream academic environments.
• Prepare audio presentations as audio files.
• Use standard HTML in program design; standard HTML helps ensure that content is presented in an accessible format.
• Design large buttons and linked images to make it easier for everyone to navigate links.
• Include an accessibility note and symbol on online course entry-level pages that lets learners know that a learning activity or material is available in diverse formats.
• Use professional-level interactive graphing and statistical programs in the study of mathematics to make complex topics more accessible for all learners and to help them connect to datasets that are current and relevant to their lives.
• Use magnification software or CCTV magnifiers.
• Use the speech component of screen reader programs.
• Add captions to video clips.
• Use dark or brightly colored backgrounds, and provide high contrast between text and background so it can easily be read when magnified.
• Use tables for data, not for page layout.
• Incorporate the ALT attribute on the course website; the attribute provides a short descriptive phrase that appears as alternative text for images in browsers.
• Include menu alternatives for image maps.
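To make the ALT-attribute recommendation above more concrete, the following minimal sketch (an illustration only, not part of any LMS or of the tools named in this chapter) scans a single course page and reports images that lack alternative text so the missing descriptions can be added before publishing. It uses only the Python standard library; the URL is a placeholder.

```python
# Minimal sketch: flag <img> tags that lack alt text on a course page.
# Uses only the Python standard library; the URL below is a placeholder.
from html.parser import HTMLParser
from urllib.request import urlopen


class ImgAltChecker(HTMLParser):
    """Collect the src of every <img> whose alt attribute is missing or empty."""

    def __init__(self):
        super().__init__()
        self.missing_alt = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            attrs = dict(attrs)
            if not attrs.get("alt"):
                self.missing_alt.append(attrs.get("src", "<unknown source>"))


def audit_page(url: str):
    """Return the list of image sources on the page that have no alt text."""
    html = urlopen(url).read().decode("utf-8", errors="replace")
    checker = ImgAltChecker()
    checker.feed(html)
    return checker.missing_alt


if __name__ == "__main__":
    for src in audit_page("https://example.edu/course/home.html"):  # placeholder URL
        print(f"Image missing alt text: {src}")
```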
Hearing Impairments
• Download descriptive captions or typed and formatted text for sound clips.
• Provide descriptive captions or a descriptive transcript for all videos that have soundtracks.
• Add closed captions to YouTube videos.
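As a concrete illustration of the captioning suggestions above, the short sketch below writes a minimal WebVTT caption file of the kind most web video players and YouTube uploads accept. It is only an example; the cue timings and text are invented placeholders, not taken from any real lecture.

```python
# Minimal sketch: write a small WebVTT caption file that can be attached to a
# lecture video. Timings and text are illustrative placeholders only.
CUES = [
    ("00:00:00.000", "00:00:04.000", "Welcome to Module 3 on inclusive assessment."),
    ("00:00:04.000", "00:00:09.000", "This week we compare formative and summative approaches."),
]


def write_vtt(path: str, cues) -> None:
    """Write the cues to a WebVTT (.vtt) file."""
    with open(path, "w", encoding="utf-8") as f:
        f.write("WEBVTT\n\n")  # required WebVTT header, followed by a blank line
        for start, end, text in cues:
            f.write(f"{start} --> {end}\n{text}\n\n")


if __name__ == "__main__":
    write_vtt("module3_captions.vtt", CUES)
```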
Learning and Cognitive Impairments
• Utilize audio accompaniment or sound reinforcement for text-based information.
• Use photos and diagrams for explaining complex ideas, relationships, timelines, and sequences.
• Use graphical content to communicate concepts.
Earth Sciences Courses
• Bring professional scientific methods and techniques to online learners of varying abilities by:
◦◦ Collecting data with inquiry tools,
◦◦ Adding geotags with GPS tools,
◦◦ Interactively analyzing visualizations of data patterns through Web browsers.
History Courses
• Engage learners in historical thinking and reasoning by:
◦◦ Utilizing original documents available to historians as digital resources from the Smithsonian and other institutions.
Mathematics Courses
• Make topics more accessible to all learners and help them connect to datasets that are current and relevant to their lives by:
◦◦ Utilizing professional-level interactive graphing and statistical programs.
• Make graphs, tables, and charts easier to develop by:
◦◦ Providing digitized materials.
• Help learners carry out algorithms by:
◦◦ Linking learners to websites that provide digital calculators,
◦◦ Linking learners to electronic mnemonic devices that provide practice problems and examples.
RECOMMENDATIONS FOR FUTURE RESEARCH
Progress has been made as it pertains to online learning communities; however, opportunities exist to increase the literature on learner and faculty outcomes in these environments. This chapter highlighted Web 2.0 tools and applications. These applications continue to be updated, which poses a challenge not only to learners but also to faculty. Research on how best to keep learners and faculty trained in using these features would be beneficial. In addition to Web 2.0 tools, many LMSs exist, and, like the Web 2.0 tools, these systems are consistently updated. Thus, how to practically offer continuous preparation for learners and faculty should be explored. If learners and faculty struggle to utilize these systems and features due to lack of preparation, the online learning environment and its benefits will be limited. This chapter also focused a great deal on building community and enhancing the learning experience in online environments. Therefore, opportunities exist to conduct both qualitative and quantitative studies with faculty and learners who are afforded an opportunity to participate in and/or employ these strategies. Finally, much has been posited about comparing face-to-face and online courses; however, few true comparison studies have been conducted to discover how each mode of instruction could inform the other. The purpose of conducting such a study is not competition but to discover ways to strengthen both learning environments and experiences. Strengthening the online learning environment in particular will also require further exploration and discussion of inclusion and accessibility. Accordingly, there are opportunities to conduct studies specifically involving learners who require reasonable accommodations in accordance with the ADA and Section 504 of the Rehabilitation Act, to ensure that this ever-changing environment is accessible and adequately accommodates all learners.
CONCLUSION
Online instructors are morphing along with the learners within the learning environments that they currently serve. One of the major tasks that instructors must presently engage in with learners is teaching them to decipher the educational value of the information that is freely disseminated to them on Web 2.0 applications. Teaching learners to counter misinformation is essential to building their critical thinking skills. Guiding learners toward becoming effective evaluators of such information has educational value in that it promotes critical
thinking skills, one of the major skills with which learners should leave a college or university. In other words, instructors should not shy away from Web 2.0 applications; on the contrary, they must use them to promote skills and abilities in their learner populations, and in doing so, they promote inclusion. In these cases, inclusion for those learners who thrive in social media environments means teaching them to evaluate and use information to promote learning within their OLEIDs. Another application that is available on Web 2.0 sites to promote inclusion is digital storytelling. One of the best advantages of using digital storytelling as a means of formative assessment in online courses is that many learners now have smartphones that they may utilize to construct their stories. In relation to the curriculum, learners use these digital stories to convey their newfound knowledge and synthesize it with their own experiences in relation to their worldview. This further promotes critical analysis skills that are sought after both in the workplace and in the academy. Twitter is also included in the above OLEID (Figure 2), and it is a powerful microblogging tool. For instance, instructors may create a tweet for their course asking for the major takeaway from a module. The responses from the learners will be quick and to the point, mirroring the texts that many of them send in their personal lives at fewer than 140 characters. Although this is not in the tradition of a formal essay response, it prepares learners to condense or summarize important points into concise statements, and it gives learners access to several responses that may be read in a short period of time. This appeals to learners who have short attention spans and gives them access to information that they may not be able to successfully obtain when reading extensive responses. Some learners who engage in academic discussions via Twitter also forge interpersonal relationships. Junco (2014) notes that when discussing assigned readings, learners make connections, realizing they have shared values and interests. As OLEID instructors, educators should embrace the idea that complex content is not always absorbed successfully or usefully by all learners. Sometimes there is great value in simplifying the dissemination of information. Moreover, this prompt-and-response method is useful to nontraditional learners who work full-time and need to optimize all of their time. Furthermore, Twitter is a great way to get important announcements out to learners who may not have time to frequently peruse the LMS. It has become very clear that the online learning space is growing. This growth is due to simple supply and demand: learners are demanding access to learning in online environments. Thus, institutions must make it a priority to meet this demand and supply high-quality learning experiences that promote positive learning outcomes. In this chapter, many viable solutions have been offered that can assist in creating inclusive online learning environments that build community and enhance learning. As this learning context advances, so too does its diversity of learners. Perhaps the most significant impact of online learning is its potential to positively affect learner learning. Therefore, it is important to proactively address the needs of instructors teaching in these spaces by utilizing cutting-edge pedagogies and applications.
This chapter provides many strategies that can be utilized to enhance online learning environments. However, in order for these strategies to benefit learners, educators must continue to keep themselves informed of the challenges and benefits of online teaching so as to broaden the perspective and lens through which they view this learning environment. In some ways, within the context of the online learning environment and its instructional approaches, one's imagination and ability to maintain an open mind are the only factors that create any limitations.
REFERENCES
Amador, J. A., & Mederer, H. (2013). Migrating successful learner engagement strategies online: Opportunities and challenges using jigsaw groups and problem-based learning. Journal of Online Learning and Teaching, 9, 89–105.
Ambrose, S., Bridges, M. W., DiPietro, M., Lovett, M. C., & Norman, M. K. (2010). How learning works: Seven research-based principles for smart teaching. San Francisco, CA: Jossey-Bass.
Archer, L., Hutchings, M., & Ross, A. (2003). Higher education and social class: Issues of exclusion and inclusion. Psychology Press.
Bok, D. (2006). Our underachieving colleges: A candid look at how much learners learn and why they should be learning more. Princeton, NJ: Princeton University Press.
Brown, A., & Green, T. D. (2011). The essentials of instructional design: Connecting fundamental principles with process and practice (2nd ed.). Upper Saddle River, NJ: Pearson/Merrill Prentice Hall.
CAST. (2011). Universal design for learning guidelines version. Wakefield, MA.
Carey, M. A. (1994). The group effect in focus groups: Planning, implementing, and interpreting focus group research. In Morse, J. (Ed.), Critical issues in qualitative research methods (pp. 225–241). Thousand Oaks, CA: Sage.
Class Act, R. I. T. (n. d.). Promoting access for Deaf & Hard of Hearing learners. National Association for Developmental Education. Retrieved from https://sites.google.com/site/nadedisabilities/universaldesign/ritclassactpromotingaccessfordeafhardofhearinglearners
Cull, S., Reed, D., & Kirk, K. (2010). Teaching Geoscience Online - A Workshop for Digital Faculty.
EDUCAUSE. (2005). 7 things you should know about... social bookmarking. Retrieved from http://net.educause.edu/ir/library/pdf/ELI7001.pdf
Fletcher, G., Dowsett, G. W., & Austin, L. (n. d.). Actively promoting learner engagement: Developing and implementing a signature subject on 'Contemporary Issues in Sex and Sexuality'. Journal of University Teaching & Learning Practice, 9(3), 1-10.
Garrison, D. R., & Vaughan, N. D. (2008). Blended learning in higher education: Framework, principles, and guidelines. San Francisco: Jossey-Bass.
Huber, M. T., & Morreale, S. P. (Eds.). (2002). Disciplinary styles in the scholarship of teaching and learning: Exploring common ground. Washington, DC: American Association for Higher Education and The Carnegie Foundation of Teaching.
Hurtado, S., & Carter, D. F. (1997). Effects of college transition and perceptions of the campus racial climate on Latino learners' sense of belonging. Sociology of Education, 70(4), 324–345. doi:10.2307/2673270
Ingvarson, D., & Gaffney, M. (2008). Developing and sustaining the digital education ecosystem: The value and possibilities of online environments for learner learning. In M. Lee & M. Gaffney (Eds.), Leading a digital school: Principles and practice (pp. 146-167). Camberwell, Australia: ACER Press.
Junco, R. (2014). Engaging learners through social media: Evidence-based practices for use in learner affairs. Hoboken, NJ: John Wiley & Sons.
Kelly, R. (2012). Five factors that affect online learner motivation. Faculty Focus. Madison, WI: Magna Publications.
Leichliter, M. (2010). A case study of universal design for learning applied in the college classroom [Doctoral dissertation].
Maki, R. H., & Maki, W. S. (2007). Online courses. In F. T. Durso (Ed.), Handbook of applied cognition (pp. 527–552). New York: Wiley & Sons, Ltd. doi:10.1002/9780470713181.ch20
McGuire, J. M., Scott, S. S., & Shaw, S. F. (2006). Universal design and its applications in educational environments. Remedial and Special Education, 27(3), 166–175. doi:10.1177/07419325060270030501
Means, B., Toyama, Y., Murphy, R., Bakia, M., & Jones, K. (2010). Evaluation of evidence-based practices in online learning. Washington, DC: U.S. Department of Education.
Meo, G. (2008). Curriculum planning for all learners: Applying universal design for learning (UDL) to a high school reading comprehension program. Preventing School Failure: Alternative Education for Children and Youth, 52(2), 21–30. doi:10.3200/PSFL.52.2.21-30
Moore, C. (2013). Inclusive college teaching: A study of how four award-winning faculty employ universal design [Doctoral dissertation]. Temple University.
National Center on Universal Design for Learning. (2012). UDL Guidelines - Version 2.0. Retrieved from http://www.udlcenter.org/aboutudl/udlguidelines/principle3
Nilson, L. B. (2010). Teaching at its best: A research-based resource for college instructors. San Francisco, CA: Jossey-Bass.
Pappas, C. (2013). The Facebook guide for teachers - eLearning Industry. Retrieved from http://elearningindustry.com/the-facebook-guide-for-teachers
Pascarella, E. T., & Terenzini, P. T. (2005). How college affects learners: A third decade of research (Vol. 2). Jossey-Bass.
Rose, D., & Meyer, A. (Eds.). (2006). A practical reader in universal design for learning. Cambridge, MA: Harvard Education Press.
Rose, D., & Strangman, N. (2007). Universal design for learning: Meeting the challenge of individual learning differences through a neurocognitive perspective. Universal Access in the Information Society, 5(4), 381–391. doi:10.1007/s10209-006-0062-8
Rosenberg, M. J. (2006). Beyond e-learning: Approaches and technologies to enhance organizational knowledge, learning, and performance. San Francisco: Pfeiffer.
Rovai, A. P. (2002). Development of an instrument to measure classroom community. The Internet and Higher Education, 5(3), 197–211. doi:10.1016/S1096-7516(02)00102-1
329
Creating Inclusive Online Learning Environments That Build Community and Enhance Learning
Scwitzer, A. M., Griffin, O. T., Kancis, J. R., & Thomas, C. R. (1999). Social adjustment experiences of African American college learners. Journal of Counseling and Development, 77(2), 189–197. doi:10.1002/j.1556-6676.1999.tb02439.x Sopko, K. M. (2009). Universal design for learning: Policy challenges and recommendations. Project Forum at National Association of State Directors of Special Education. Alexandria, VA: United States Office of Special Education Programs. Stinson, B. (n. d.). Universal Design and Accessibility for Online Classes. Retrieved from https://www. coursesites.com/webapps/portal/frameset.jsp?tab_tab_group_id=null&url=/webapps/blackboard/execute/launcher?type=Course&id=_269867_1&url Thomas, M., Hilton, A. A., & Ingram, T. (2015, November) (Accepted). Campus environments: their importance and impact. Paper presented at theAssociation for the Study of Higher Education (ASHE) 40th Annual Conference. Tinto, V. (2008). When access is not enough. The Hispanic Outlook in Higher Education, 19, 13–13. Tracey, R. (n. d.). A Framework for Content Curation. Retrieved from https://ryan2point0.wordpress. com/2015/06/17/a-framework-for-content-curation/ Wong, K., Kwan, R., & Leung, K. (2011). An exploration of using Facebook to build a virtual community of practice. In Hybrid Learning (Vol. 6837, pp. 316–324). LNCS. Zhao, Y., Lei, J., Yan, B., Lai, C., & Tan, H. S. (2005). What makes the difference? A practical analysis of research on the effectiveness of distance education. Teachers College Record, 107(8), 1836–1884. doi:10.1111/j.1467-9620.2005.00544.x Zimmerman, B. J. (2002). Becoming a self-regulated learner: An overview. Theory into Practice, 41(2), 64–70. doi:10.1207/s15430421tip4102_2 Zull, J. E. (2011). From brain to mind. Sterling, VA: Stylus Publishing. Zydney, J. M., Stegeman, C., Bristol, L., & Hasselbring, T. S. (2010). Improving a multimedia learning environment to enhance learners learning, transfer, attitudes and engagement. IJLT International Journal of Learning Technology, 5(2), 147. doi:10.1504/IJLT.2010.034547
KEY TERMS AND DEFINITIONS
Community: A particular group of people who are connected through commonality.
Engage: To involve.
Enhance: To increase and/or make significant.
Inclusive: Pertaining to and including the largest possible contingent of people.
Instructors: Those providing instruction, teaching, and/or educating.
Learners: Those receiving and/or participating in the learning process facilitated by instructors.
Learning: The transfer of information and knowledge.
Online: The environment that uses technology and exists within a Learning Management System.
Chapter 15
Common Scenario for an Efficient Use of Online Learning: Some Guidelines for Pedagogical Digital Device Development
Walter Nuninger
University of Lille, France
ABSTRACT
The training efficiency required of Higher Education (quality, accessibility, bigger groups with heterogeneous prior experience, funding, competition…) encourages providers to find new ways to facilitate access to knowledge and enhance skills. In this scope, the use of digital pedagogical devices has increased with innovative solutions, such as those based on an LMS to support a blended course, or MOOCs designed for self-education. This evolution has impacted teaching practices, learning and organizations, leading to a new paradigm for trainers and to a new business model to be found for online and distance learning. The innovation mostly relies on the comprehensive use of learner-centered digital learning solutions to commit more active and independent learners and to recognize their skills. Based on a three-year experiment (a hybridized course for CVT) and continuous improvement in WIL, a common scenario is proposed to address this issue for distance training.
INTRODUCTION
The efficiency of Higher Education training relies upon many players at different levels in the multi-level organization of Higher Education Institutions (HEIs) (Nuninger et al., 2016), but once in the classroom the trainer is in charge (referring to the person leading the training; the teacher, who will change role depending on the context and group) and, while respecting his personal workload, has to handle new constraints such as: reduced time for face-to-face learning, groups with large numbers of learners (referring to the students enrolled in Higher Education as trainees in Work Integrated Learning (WIL) and apprentices) with a heterogeneous level (knowledge) and prior experience (skills), and the observed low involvement of digital natives. In addition, the European Standards and Guidelines drawn up by the
European Network of Quality Assurance (ENAQ, 2015) strengthen Quality Assurance, with an impact on the Pedagogical Team and the HEI's organization to meet new requirements. The aim is to develop a learning organization and a Community of Practice (mixing talents) to adapt to world evolution and citizens' needs through innovative learner-centered pedagogical tools. HEIs have to imagine a new business model for funding and for promoting their added value in the scope of Lifelong Learning (LLL): high-level training customizable for all, at any time and from (almost) anywhere (Davies, 2007; Yang et al., 2015). Such a concern goes beyond the European Accessibility Act proposed in 2015 by the European Community; it is part of the challenge of distance learning and complies with the "Europe 2020" European framework (EC, 2010). Indeed, the levers stressed by European policy include innovation and knowledge, high employment partly based on digital evolution, skill development throughout the life cycle, and performance of the education systems. The huge development of Information and Communications Technology (ICT), with cheaper devices to access the net and greater speed to reach a large amount of information, has given HEIs opportunities to develop new pedagogical tools giving access to validated knowledge. Despite the expected results, some discrepancies remain due to social class (Becker, 2000) and to a lack of basic skills and learning autonomy, especially when digital devices are at play. Beyond prior knowledge, such as the European Computer Driving Licence for instance, some learning skills are expected, such as the ability to find and select correct and validated information with respect to personal needs. Therefore, although Learning Management Systems (LMS) such as Moodle are useful tools to support the face-to-face course thanks to many pedagogical activities for huge groups, they are not sufficient for online learning with a concern for autonomy. In the same way, MOOCs, with free access and brief assessments, can motivate autonomous learning and sound like appealing solutions for mass learning, but they lack factual recognition to prove quality; even if some assessment by peers can be integrated and online examinations performed, the business model is still to be found with respect to the necessity to share costs and benefits between parties. But, whatever the business model, once the resources are identified with respect to the expected learning outcomes for the target, the challenge is ultimately for the teachers to drive the behavior change of learners while they re-build the learning autonomously. This is why learner-centered pedagogical digital learning devices should be developed in order to guide learners through the knowledge, going further than just providing knowledge through the net with, for instance, video-recorded lessons and little guidance: the expected ramp-up of skills requires deep personal development and learning ability. It is important to note that the underlying outcome is also the change in the project development team (pedagogical team and IT support) while re-building the learning environment in and outside the classroom (Biddix et al., 2014) and building up the pedagogical digital device based on new rules. This development has to be conducted too, and this is the responsibility of the Higher Education provider.
As a consequence, some areas of freedom of action should be made possible for the trainer-teachers to innovate in pedagogy with the technical and financial support of the HEI. This requires, first, a clear policy from the HEI and, second, good project practices for the trainer, such as the DMAIC approach (Define, Measure, Analyze, Improve and Control), during the development of the innovative pedagogical device but also during its use: the issue is to identify directions of improvement and prove efficiency before dissemination at an institutional level. One result is the autonomous buildup of the parties' Personal Learning Environment (Downes, 2012). Further, a formative and factual assessment should be performed for the recognition of skills in practice; unfortunately, this is not always carried out correctly today. As a consequence, online learning in Higher Education should be developed rigorously, based on continuous improvement, to comply with efficiency criteria while allowing a personal act of learning. In this
way, the aim is to develop reflexive learning to guide towards autonomy. Such an orientation also requires a new behavior of the trainers in the pedagogical teams for the development of innovative solutions. Trainers should also be trained, as ICT is not the original field of expertise of the teacher-researchers (Albion, 2001). The underlying requirement is for the HEIs to support the pedagogical teams with IT support and resources: sharing learning outcomes and technology. The challenge is greater than just developing IT or pedagogy; it should be a global and shared vision of the parties, requiring a pre-requisite in IT and a new pedagogical culture. For the learners, online learning should be appealing, reassuring, facilitating, promising recognition for employment and, of course, at an acceptable cost: economically, but also with respect to the training energy to spend (partly solved thanks to an ability in organization, collective intelligence and learning skills). With respect to the triangle of performances, the challenge is:
• First, to design a common scenario to facilitate knowledge ownership and guidance for the ramp-up of skills, with individual follow-up and remedial solutions;
• Second, to be able to factually assess knowledge ownership and the new skills developed by distance learning, thanks to powerful digital pedagogical devices, with respect to learning outcomes (effectiveness);
• Third, to prioritize resources in the given context and evaluate the return on investment for efficiency.
Finally, in addition to the efficient development of online learning devices with respect to a chosen pedagogy, the difficulty remains in developing business models for sustainability, respecting the values of Higher Education and giving people access to knowledge to develop the key skills required for learning throughout life and for their employability in a complex society. In the following, first, the background is given on the main concepts dealing with e-learning or digital-based distance learning for the satisfaction of the HEIs' issues with respect to performance and a skilled workforce. Second, the various motivations of HEIs to develop such kinds of training are clarified and a proposed definition of digital training is introduced. Third, a synthesis is given of ONAAG (Outil Numérique d'Appui de l'Auto-formation Guidée, in French): a Digital Support for Guided Self-Learning solution developed for quicker and greater ownership of learning outcomes, built on existing tools partly provided by the institution (Moodle and ScenariChain). The aim is to bring to light the common scenario applied to different Teaching Units (TUs) and audiences, including Continuous Vocational Training (CVT) and apprenticeship with alternation. This scenario can be spread out for different uses, adapted to the topic and participants, providing flexibility. Then, based on the various experiences in Work Integrated Learning (WIL) that motivate a learner-centered pedagogy to improve reflexive learning ability (Cendon, 2016), a final proposal is presented for high efficiency of online learning through a deliberate pedagogical use of ICT: the aim is to mix pedagogical activities that put the learner into challenging situations. The innovation relies on the way the pedagogical devices are developed, with a focus on learning outcomes and respecting a continuous improvement process that takes into account all parts of the project: techniques and pedagogy for a shared culture that leads to sustainability and excellence (EFQM, 2013). Obviously, a Community of Practice (CoP) should be developed with all the parties: trainers but also IT developers sharing expertise for effective and quick solutions on the ground, respecting Rapid Application Development and including the learners for swift feedback, because they are the first ones concerned and will become experts after the training. Efficiency and capitalization of the experience are ways to answer the funding problem.
BACKGROUND
Skills of the Citizen, Including Digital Competence for the Parties Involved
The tremendous evolution of ICT has changed citizens' behavior and their relationship to the world, with new media and new uses. The impact was first on the amount of information and data access, and second on business, with new opportunities (such as e-business in competition with traditional trade). Today, ICT is changing jobs based on new opportunities and new skills. What about training, then, as a worldwide requirement for sustainable development or as a business targeting excellence? This is not just a question of using ICT to train but a crucial concern about how to train today, with such tremendous technical developments, being creative and regulating the momentum to ensure true ownership and basic competences for the citizen to be able to change throughout life. Such skills belong to the 8 key transversal competences identified by the European Commission (2007) for the active participation of the European citizen in society and the economy (see Table 1), including "Learning to learn" and "Digital Competences". The emphasis of these interdependent skills is on critical thinking, creativity, initiative, problem solving, risk assessment, decision taking and the constructive management of feelings. Digital competences refer to the ability to mobilize digital technologies to perform a task efficiently in a given situation. The learning-to-learn ability then allows awareness of the required knowledge and skills, and the ability to identify learning methods and means. As a consequence, both areas drive learners towards autonomy in their training route and training attitudes, through a personal network they have to create. In this framework, HEIs are encouraged to develop innovative trainings benefiting from ICT evolution in order to develop global and specific abilities, but also to enhance learning competences while developing the digital ones. In this way, using ICT for learning complies with the "learning by doing" spirit on this specific topic. This remark justifies pre-requisites in ICT (basic user) to avoid rejection by the learner involved in e-learning/digital learning as one of the pedagogical solutions requiring ICT means and use. The "learning ability" is then carried over from other pedagogical approaches, such as active and reflexive pedagogy for instance. The Digital Competences framework (DIGCOMP) defines 5 competence areas (information, communication, content-creation, safety and problem-solving) for 21 competences at 3 levels: basic, intermediate and proficient user (Ferrari, 2013; EC, 2014). For a digital element integrated into the training, the pre-requisite requirements for learners are to be able to reach and analyze digital information, collaborate with cross-cultural awareness, and create and re-elaborate new integrated content with respect to their needs and problem-solving. Note that this also applies to the trainers involved in the design and implementation of this kind of training, but at the proficient level: to use the technology but also to develop their teaching material through their Personal Learning Environment (PLE). Indeed, with ICT, training is not only a question of pedagogy, but also a concern over technology for creativeness and the digital revolution of training; an innovative pedagogical technological solution.
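As a purely hypothetical illustration of how this framework could be operationalized (it is not part of the DIGCOMP specification, nor of the device discussed later), the sketch below encodes the five competence areas and the three proficiency levels named above to check whether a learner meets a basic-user pre-requisite; the "none" level, the function and the sample profile are invented for the example.

```python
# Hypothetical sketch: encoding the DIGCOMP areas and levels mentioned above
# to check a learner's pre-requisites before a digital course. The "none"
# level and the sample profile are added only for illustration.

LEVELS = {"none": 0, "basic": 1, "intermediate": 2, "proficient": 3}
AREAS = ["information", "communication", "content-creation", "safety", "problem-solving"]

def missing_prerequisites(profile, required_level="basic"):
    """Return the DIGCOMP areas where the declared level is below the required one."""
    threshold = LEVELS[required_level]
    return [a for a in AREAS if LEVELS.get(profile.get(a, "none"), 0) < threshold]

learner = {"information": "intermediate", "communication": "basic",
           "content-creation": "none", "safety": "basic", "problem-solving": "basic"}
print(missing_prerequisites(learner))                # ['content-creation']
print(missing_prerequisites(learner, "proficient"))  # all five areas, e.g. for a trainer-author
```

Such a check is only one possible reading of the pre-requisite requirement; in practice the assessment of digital competence is richer than a declared level per area.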
Table 1. 8 key transversal competences for the European Citizen (EC, 2007)
1. Communication in the mother tongue: able to use different communication forms to interact linguistically depending on the context, expressing themselves and interpreting.
2. Communication in foreign languages: an additional means to intercultural understanding.
3. Mathematical competence and basic competences in science and technology: basics to solve issues in everyday life with respect to the context, in a responsible way.
4. Digital competence: basics for a confident and critical use of ICT to handle information and exchange.
5. Learning to learn: basics to voluntarily learn, individually or with others, to target the personal objective with awareness of opportunities.
6. Social and civic competences: to adapt behavior to the context for beneficial interpersonal relationships, knowing codes of conduct and customs.
7. Sense of initiative and entrepreneurship: to turn ideas into action to achieve objectives; creativity, ethics and governance.
8. Cultural awareness and expression: for creative expression and involvement in an international and/or multicultural context.

Integrated Learning Environment Connected to the PLE of the Parties
The new access to a tremendous amount of data has put in jeopardy the role of the teacher as the expert who possesses the knowledge in his field, urging a new paradigm in the act of training. The challenge is now to guide towards validated information with more interactions and, through a new relationship, to be recognized as the "go-to" resource who can help and support, although he does
not know everything. The trainer-leader will have to incent learning, being an exemplary model to target the learning outcomes, but quickly change for a guidance-role within a convenient pedagogical approach based on his interpersonal skills, social and emotional intelligence (Goleman, 2007; Brackett et al., 2011). He will put learners into situations to make them aware and give them the opportunity to experiment with their peers in a secure learning environment. He will question and provide debriefing and feedback to enhance the learners’ development (Schein, 2013; Stone & Heen, 2014). For his personal evolution and professional efficiency, the trainer will develop his Personal Learning Environment (PLE) as a personal construction of an e-learning environment and a tailored use of ICT media. The PLE is a way to generate, organize and share data with dissemination control by the owner and ICT, depending on his personal will for recognition of his added-value; an Ethics and Sustainability concern for the parties with digital competence. The PLE will integrate Personal Web Tools, Personal Learning Network and Cloud Learning Environment. Linked to the training with integrated ICT solutions and with a learner focus, the trainer will create an Integrated Learning Environment (ILE) (Pang, 2010; Nuninger & Châtelet, 2016) based on his own PLE and those of his networked learners (Drexler, 2010) later recognized as experts too. In this way, a digital training solution is created to share information, expertise and professional culture. In contrast with the PLE not depending on HEI, the ILE is a formalized Collective Learning Environment with the captive use restriction from the Learning Management System (LMS provided by the HEI) to give access and exchange data in the online training context. As an extension, Digital Learning will probably give more flexibility with respect to ICT means, depending on the context and prior experience of the parties. The ILE gives access to the content and learning activities with real time guidance, online interaction, synchronous and asynchronous feedback based on personal and collective work to enhance learning outcomes. ILE will be the core of the Digital Learning with respect to the pedagogical approach and the innovative use of ICT decided and mastered by the trainer-teacher.
Learner-Centered Pedagogy to Enhance Abilities
Digital learning or e-learning supposes the ability to use ICT and the internet, but the underlying, unexpressed pre-requisites are involvement in the training and learning autonomy. The commitment will depend on the personal motivations of the learners with respect to their needs and expectations, which are rooted in their personal and professional project in life. The responsibility of the trainer will be to initiate the learning process as the one in charge of the training implementation, for the learning outcomes defined in the repository of skills and the curriculum for the scenario. As for any course, the use of ICT should be justified with respect to this leading act by the trainer. One key lever is to put learners into a new situation to make them aware and then let them experiment in a safe environment for skills recognition (Biggs, 2003); this can be done offline with ICT tools or online with individual or collective e-learning activities. The training scenario will be designed around the situations the learners are put into, with guidance to facilitate the personal development of the parties with respect to the theory by Lave & Wenger (1991): moving from the periphery of the group (individualism) through cooperation towards new community membership when more confident and competent (collective ability). Supervision, regulation and control with feedback and debriefings are known to be the mitigating actions to prevent rejection of the activity and ensure ownership, in addition to follow-up for assessment. In this way, ICT can be a help for conditioning through proposed activities with clear targets. The approach is a learner-centered pedagogy that refers to Problem Based Learning, active pedagogy and reflexive pedagogy (Cendon, 2016; Nuninger & Châtelet, 2014), with formative assessment aligned to the expected upstream-defined outcomes (Brown, 2004; Bloxham & Boyd, 2007; Biggs, 2015). The core is the learning cycle by Kolb (1984) and the 8 learning-events model by Leclerc & Poumay (2005) for gradual learning. The basic strategic pedagogical approaches that can integrate ICT are summarized in Table 2: face-to-face, hybrid-course and distance learning. Whatever the digital solution for the training, its use will be a formalization of these concepts, which require interactions between parties: asynchronous online interactions, but also virtual ones in real time and online. Digital learning, defined further below, will then mix ICT solutions (see Table 3) to give substance to the training scenario.

Table 2. Learning models that can also be performed with ICT integration. The differentiating factors are the training place and time, for synchronous or asynchronous activities with or without trainer supervision.
Face-to-face: Learning through a synchronous interaction between trainees and trainer in the same place and at the same time; in this way, it can be extended to virtual interaction in real time during a virtual classroom.
Blended learning and hybrid-courses: A learning that mixes upstream-prepared synchronous activities in the classroom (virtual or not) with asynchronous activities realized autonomously; in this way, it can combine face-to-face for feedback and online learning within a created virtual learning environment (an LMS for instance) for online tools to support the lesson.
Distance learning: Learning time and/or distance (place) are dissociated and formalized, with or without the internet, leading to an asynchronous relationship between trainers and trainees; in this way, alternation is part of distance learning, as trainer and trainee do not share the same situation as in the classroom (virtual or not), and so are blended courses considering the asynchronous activities. It provides more flexibility in the training for the parties and is especially useful for WIL and alternation, as it allows smooth scheduling to solve work and geographic constraints thanks to delivery of content and information between trainers and trainees at different times. Distance learning takes real benefit from ICT evolution (the internet) with virtual delivery and reduced time of interactions between parties. Focus seems to be put on personal work and collaboration.
Community of Practices to Share Cultures
Requirements for pedagogy with online distance solutions require trainers to be trained: to be able to design, implement, guide, assess, use and improve the trainer's pedagogical toolbox, for instance; a continuous-education responsibility of the HEI but also a challenge to achieve Quality Assurance of the training offer with performance efficiency (Nuninger et al., 2016): cost, learning outcomes and results. In the worldwide competition, the HEIs (universities) are key players in society; their skilled-workforce production should be creative and innovative, based on research work in several fields, among which training engineering and ICT to invent new media and new uses. Going further than the pedagogical teams (for one training or one topic), the challenge for the HEI is to incent more ambitious collective work in order to develop the core of the business based on different fields of expertise and specific training expertise; i.e. a CoP with a focus on training practices and collective intelligence (Couros, 2003). The CoP enhances the co-creation and experimentation of new solutions of learning for identified targets (audience and market), sharing different cultures to enrich the system and reducing the risk-taking of local innovation. Indeed, the U-theory of evolution (Scharmer, 2009) for the trainers' development path will be strengthened thanks to shared values with respect to training (pedagogical culture), and acceptance of possible confrontation and debates for feedback and reflexivity (Nuninger et al., 2016); the foundation of creativity, innovation and change in order to be able to foster the integration of ICT into the course, inventing new uses and solutions. The CoP will be an indicator of the engagement of the trainers, their imagination and alignment in the scope of meaningful knowing (Wenger, 2000); also a way to recognize their added value for sustainability. To be efficient, the CoP needs key elements such as leadership, events, connectivity, membership, projects, and an artifact in a given context. For the training CoP, the artifact is the ICT evolution for opportunities, the change of generation of learners (and of trainers) and the economy that modify the training context and urge adaptation. With a digital training concern, the trainer is not the only one to take action for online guidance, asynchronous feedback or offline follow-up, because of the higher number of learners, the increased number of digital tasks (setting, assistance, online interaction and assessment) and the necessity to share the responsibility equally in the pedagogical team to limit the trainer's workload. The CoP is then a way to limit discrepancies during training implementation. It also facilitates trainers' recruitment, transfer and beneficial exchange with the ones in charge of the technical aspect of the digital training, ensuring membership of the digital training project.
E-LEARNING AND MOTIVATIONS FOR DIGITAL LEARNING
Performance Triangle and Evolution Helix of HEIs
The operational performance triangle is assessed under the control of partners (income) and society or state requirements (needs) for the effectiveness of the training processes (objectives versus results), the efficiency of the HEI organization (action versus results) and the relevance of the means (objectives versus resources) in a competitive context. The tremendous evolution of ICT to access knowledge enhances the increasing use of ICT in education. Indeed, citizens have adopted this new way to communicate (although they do not always master it or control their data) and already learn by themselves. Online tutorials are tremendous examples of the fact that anyone can transfer his expertise at no, low or unevaluated cost through the net (and maybe not validated expertise). But the same natural responses occurred
with HE providers with the LMS or MOOCs for instance, questioning the training business model. For Universities, the challenge is, in their local context, to be recognized training structures (for quality and level of expertise) with funding constraints (industrial partners) and specific societal responsibility (for instance, state universities under state control). In the training market, the dynamic helix of evolution between the stakeholders (Etzkowitz & Leydesdorff, 2000; Etzkowitz, 2008; Farinha & Ferreira, 2013) will evolve towards a sustainable overlapping area of interaction for mutual benefit. The consequence will be an evolving business model depending on the degree of digital integration in the training, the degree of interrelationship and the ability of the learning organization; a model that also depends on the kind of training. The innovation will be collective, with shared risk-taking and benefit. Thus, adding a digital element to training is a logical and compelled evolution to be thought about. As a consequence, the training offer is a development project for innovation with, as for any industrial approach, a compromise to set in order to achieve the results; between the targeted learning outcomes (knowledge and skills) and the selected means (then income and expenses): adequacy of the tools for the chosen pedagogy, among which ICT, leading to Digital Learning.

Figure 1. Overlapping view proposed for the training pedagogical approach (horizontal) and the possible digital learning modalities. Digital learning is an extension of distance learning with new possibilities; distance learning being a hybridization of means
E-Learning Towards Digital Learning
Based on several definitions by authors and on observed practices (Sambrook, 2003; Homan & Macpherson, 2005; Conole, 2010; Moore et al., 2011), Table 3 stands as a proposal to list and clarify the underlying spirit of the learning concepts including ICT, overlapping the pedagogical approaches (see Figure 1 and Table 2); it differentiates pedagogical approaches, choices for implementation (time and place), including ICT or not, and the way to access learning and use it (synchronously or asynchronously, autonomously or with trainer interventions). Note that although many terms exist, not all are so useful with respect to a global view of the challenge of integrating ICT into the course, whatever the pedagogical approach. For instance, M-learning, for e-learning made available from a smartphone or tablet, is a focus on a specific IT solution for a new use, but in fact the pedagogical creativity is already in mind if the impact
of ICT evolution and the accessibility of the training for all are taken into account, whatever the place, the time and the means: for instance, benefiting from GPS localization for new knowledge transmission or expertise through real-time guidance, based on the actual behavior of the learner within the triplet place-time-information. Depending on the trainers' culture, e-learning is the most argued term (Moore et al., 2011). The reason is the way trainers, in their work experience, limit or open the area of intervention: including all online solutions for training ("online learning" in that case includes e-learning) or not; or limiting the extent to a formalized pedagogy within an ICT means (then "e-learning" includes online and offline solutions). This latter view is our focus with respect to our practices (see Figure 1). As a consequence, Digital Learning seems a broader view of the issue, stressing the pedagogical formalization of the use, considering the opportunities given by ICT. Depending on the HEI context (the expertise core of the business in pedagogy and ICT, income and strategy with respect to learning objectives), the digitally and pedagogically integrated solutions for trainings are various (poor or awesome) with respect to the level of development in the organization, with possible jeopardy as not everything can be overcome at once; then, several business models are possible for each situation faced by the HEI, its pedagogical team or CoP. The ONAAG (Nuninger & Châtelet, 2016) experimentation presented further in this chapter is a thought-out digital learning experience during Work Integrated Learning (WIL), based on the existing basic IT tools provided by the University, that seems a good practice background to develop online learning and incent a CoP.

Table 3. Learning approaches with integrated ICT. Focus is put on the spirit and main practices in order to explain the terminology with respect to use and pedagogical approach; the criteria are the level of autonomy of the learner (online or offline) with respect to the guidance by the trainers, in real time or not.
Online learning*: Learning within online pedagogical devices, i.e. accessible within the internet as the key element. Depending on the pedagogical culture, it is mainly considered as a closed-ended and topic-centered training solution for autonomous learning with not much interaction expected (using online tools for learning such as video tutorials).
E-learning*: A digital training device based on selected ICT means with respect to the learning outcomes and mostly dedicated to self-education, as material is designed to be sent out remotely by using electronic communication. The focus is put on accessibility of context (online or offline) and, therefore, supposes mostly asynchronous interventions of the trainer. In comparison with the distance learning approach, e-learning is a specific solution to support distance learning and enhance learning.
Digital learning as a hybridization of the course through ICT: Learning that includes all kinds of learning approaches and ICT solutions in a deliberate pedagogical act that integrates ICT activities or digital devices, accessible within the net or not; among them virtual classrooms, LMS, online video, digital pedagogical serious games…. It can be considered as a hybridization of the courses through ICT, mixing pedagogical solutions and required interactions between the parties that can be synchronous (in real time) or asynchronous (offline); all require an online connection, with the opportunity to promote collective work in addition to personal work based on digital offline learning (local software for instance) by the individual or the group as a learning community.
*This focus for the definitions can be discussed with respect to the trainers' culture (Moore et al., 2011).
Digital Training Solutions and Business Models
A business model is a set of planned activities designed to result in a profit in a marketplace. E-business aims to use and leverage the unique qualities of the internet and the World Wide Web. Based on this definition, the authors propose e-learning as a set of planned training activities through the use of
the internet and the World Wide Web, to achieve learning outcomes and make a profit in the training market. As a consequence, depending on the evolution history of the HEI and the steering committee policy in the Higher Education context, the solutions vary. Different trends are noted by Mittal (2010) for HEIs to integrate ICT or develop e-learning:
• Continuing Convergence (e-learning is an add-on to existing training);
• Market Consolidation (distance learning being a way to reduce cost, for instance);
• Brand Strategy to attract new students and be recognized (e-learning being a quick solution within MOOCs, for instance);
• Added-Value Service thanks to ICT and pedagogy, i.e. a way to enhance partnership and alliance to subcontract this technical aspect;
• More Flexible Training offer to overcome society's challenge for knowledge, with higher modularity to pull through the challenge of dialogism (customization based on standardization of solutions).
This last level represents the evolution of the business models as a learning organization: from the traditional scope of education, through the vocational training framework (private), to a more sustainable Education to Business (E2B) model in the scope of Lifelong Learning. The key levers of success for a digital-based training are:
• Cost effectiveness;
• Ease of use for the parties;
• Security and customization;
• Attractiveness, to ensure successful adoption.
With respect to Table 4, today e-learning has evolved toward a fully integrated pedagogical solution based on ICT for synchronous and asynchronous learning, mixing digital tools (technique) and their pedagogical use for efficiency performance (objectives, results and cost; i.e. time, teaching/learning energy and funds). Digital learning holds several dimensions as a hybrid solution to be carried out with respect to the target (customer focus), taking into account the resources and the competition. As a consequence, the business model is not unique and depends on the context and strategy of the steering committee of the HE providers. It will depend on the place in the market with respect to two areas (Kluijfhout, 2005), which are the dimension at the core of the business (Pedagogical, Technological or Organizational) and the level of dissemination of the training offer: the learning environment (course/training), the institution environment (trainings) as a mezzo position, or a larger scope (macro: national, international environment). The position drives the level of virtual (digital) elements in the training: from the classic university toward the 100% virtual university; the underlying concepts being distance learning and blended lessons. As a consequence, the challenge is to match the learning outcomes and the pedagogical approach in an integrated manner by means of web-based devices; i.e. Digital Learning. This questions the learning environment (a virtual one), in conjunction with design to attract learners for involvement, but also the role of the pedagogical team for training requirement specifications and commitment to the implemented solution. The reasons are the momentum of the Digital Learning based on the trainer's animation and supervision. The issue is the assessment of the learning outcomes and the recognition of the skills. This latter challenge is a problem in itself due to the "anonymity" on the net. As a consequence, success requires interaction and virtual face-to-face between the parties involved. In this sense, the digital element is the extension of human behaviors (Chacón, 1992): process complex information to learn by doing; interact with the media to learn on one's own; and communicate with others through the media to learn from peers in one's network. Then digital learning should be thought about respecting the usual efficient process of a sustainable training offer (Mittal, 2010; Nuninger et al., 2016), summarized in Table 5 with respect to the steering committee and upstream design.

Table 4. Evolution of the e-learning concept over more than twenty years; adapted from Bruet (2015)
Before 2000. Point of view (bias): E-learning is considered as an alternative training solution to face-to-face training (to be replaced). Business model: First huge investment with quick return on investment (ROI) over time. Solutions and results: Static training product available online, but requires new funding for content evolution; ICT evolution questioned the ROI model.
Transition 2002/2007. Point of view (bias): E-learning is an attractive pedagogical means based on ICT but remains just a pedagogical device like any other. Business model: New business models are developed as an extension of the e-commerce model. Solutions and results: Dynamic training solutions benefit from the evolution of ICT, but pedagogical efficiency is questioned with respect to face-to-face training.
Since 2008. Point of view (bias): E-learning should be part of a training system to achieve learning outcomes, leading to the concept of digital learning. Business model: Business models will depend on the level of e-integration for the targeted market and with respect to the position of the provider. Solutions and results: Blended-training-oriented solutions, with face-to-face and distance learning; digital learning solutions are a hybridization of the training with different integrated IT tools.

Table 5. Key points to focus on while designing Digital Learning
1. Upstream training design in the scope of distance learning (requirement specifications): customer focus (identify the target group and its needs with respect to the strategic goal, considering the business logic: a compromise based on a SWOT matrix with funds concern); co-definition of the learning outcomes (repository of skills); curriculum definition with pedagogical concern (choice), taking into account the ICT integration for e-learning; prioritization of the resources for the chosen framework.
2. Sub-contracting training realization to the dedicated team and services: dedicated organization for the implementation of the digital training project, respecting the chosen framework (Quality Assurance); pedagogical device development and implementation by dedicated services; recruitment and contract (fundraising); realization through the digital device; follow-up of the learners; final assessment respecting the rules.
3. Decision-making with respect to the training life-cycle: performance review of the training to decide on improvement areas or abrupt change; capitalization of the added value; skilled workforce recognition (core of the business).
HYBRID-COURSE WITH ONAAG
A synthesis of the Digital Support for Guided Self-Learning solution, called ONAAG (Nuninger & Châtelet, 2016), is briefly presented to highlight the motivations and the Agile development process, with the difficulties met in bringing out a common design proposal for an efficient digital training device. The new results after three years of experimentation are given to underline the key levers of such a solution, which is learner-centered but also focuses on the trainers' needs for efficient interventions in and outside the classroom. The extension will be to adapt the solution to fully virtual interactions with the support of the HEI, for higher dissemination to other topics and a larger audience (i.e. for industrialization in some ways) and a higher level of digital learning.
Training Context and Resources
The pedagogical device was first envisioned at a local level by one teacher-researcher for his needs in the Automatic Control course he leads in a WIL (training chartered engineers), using the basic IT solutions provided by the provider. Based on the observed success, he decided to spread the win-win solution to other courses he conducted (Automation and Computer Programming) with similar requirements due to transverse skills, but in another training context. This decision was facilitated by the University's official support, provided through the selection of his reply to a call for proposals for pedagogical innovation with the integration of digital components; a fixed amount of work hours was given for technical IT support and production of the content. After three years of intensive development, the value of ONAAG is recognized both by the IT service of the University and by the groups of learners, who communicate on the device and its use. Since 2015, the author has started integrating colleagues into the project, giving more visibility to a solution of sufficient quality in the local context. The interest of ONAAG is to incent the process of change for a new pedagogical act at a local level, respecting the U-theory of innovation and continuous improvement approaches. It might impact the training (all teaching units) and the policy of the provider, in the same way it had challenged the teacher, changing the way he leads the blended-oriented course. The steps are summarized in Figure 2, from the initialization by the teacher (motivated by his needs and the learners' expectations) towards the integration of colleagues (for richness and transfer) after the proof of feasibility and efficiency of the device and its use (for skill recognition). Along the way, interaction with the IT support will be closer, with increasing official support by the HEI. Today, the possible dissemination at the institutional level and the future of the device, which copes with a high level of requirements for Higher Education, are submitted to the community of practice in a context of university merging and training evolution on the market. As a consequence, the innovation is at the same time:
• A key to improve training effectiveness for the learners, meeting the learning outcomes;
• A way to guide the pedagogical team towards excellence, developing pedagogy and skills;
• An opportunity to strengthen the economic development of the Higher Education Provider, building common devices based on a shared pedagogical culture for efficiency.
Figure 2. Historical path of the ONAAG project development over three years and of its leader-teacher. He followed the U-theory of innovation, developing different attitudes as the one who leads the courses for different kinds of learners.
Figure 3. Digital competence level of the CVT group in 2014 and 2015 with their expectation with respect to the Automatic Control TU
Motivation of the Pedagogical Device in the WIL Context
In the WIL leading to Chartered Engineer (CVT path) in the field of production (Nuninger & Châtelet, 2014), ONAAG was created for the Automatic Control TU as a reply to the learners' desire for prior training expressed in satisfaction surveys: 77% of the respondents in 2014, although 58% maintained they had no use for the topic; in 2015, the figures are 60% and 33%, with ONAAG already started as a prerequisite (see Figure 3). This pedagogical device that integrated ICT was also a reply to the trainer's needs, due to different factors:
• Low Prior Knowledge of learners in mathematics as a tool to handle the concepts of the TU;
• Low Digital Competences for computer programming: 17% up to 33% have low basics and difficulties with the LMS; besides, they should be independent users for digital simulation in the TU;
• Low Autonomy in learning and collective intelligence (for involvement and to develop the skills through the Automatic Control project, in groups);
• Constraints Due to WIL and Alternation, with formative assessment and an imposed schedule.
The course therefore targets a blended format with active pedagogy implemented with formative assessment. Through the years, the LMS has already been a support of the course for information and for the remote work of the learners. One consequence is an increasing workload for the trainers for individual follow-up, but also a way to point out the lack of autonomy of learners, which gives stress to the group; learners are encouraged to evolve individually in order to graduate, with remediation activities. In the framework of the hybrid-course that already alternates upstream preparation of the course and feedback during face-to-face lessons, with LMS follow-up and a final collective project, the aim of the thought-out device is:
• First, to train with respect to pre-requisites before starting the course, to limit heterogeneity in the group and facilitate access to the new concepts;
• Second, to incent a new behavior of the learners, to be more autonomous in their learning and involved, with adhesion to the imposed digital tools of the HEI (the LMS);
• Third, to put them into a situation of system analysis with free choice, to develop their skills based on their new knowledge and the collective work.
ONAAG is not just adding new ICT in the course (such as the LMS for support with dynamic or static information) but a fully learner-centered integrated ICT solution in the pedagogical act of training and learning to achieve the learning outcomes. In this way, it is a formalized e-learning: a digital learning with part of it carried out face-to-face but based on personal online activities and asynchronous work.
ONAAG Design and Scenario
ONAAG is based on the existing technology provided by the HEI, among which an LMS (Moodle), specific developments based on the ScenariChain publishing solution for content capture by authors using models (a collaborative solution for structured editing and publication by Kelis: scenari-platform.org), and the trainer's pedagogical material (part of his PLE, with access given through the LMS). ONAAG is a training that can be followed autonomously by the learner, alone or collectively, asynchronously or synchronously, with minimal intervention of the trainer for additional guidance during the learning path. Its use during the blended course enhances an evolving interaction toward a new relationship between learners and trainer, the latter not being the unique resource in this connected world. ONAAG is modular to remain flexible. It was designed to remain quickly adaptable to other contexts, topics and groups of participants, and was proved to be compliant with the expectations (thanks to several experimentations on three kinds of audience) as a fully integrated solution controlled by the trainer with no further support by the ICT service (Nuninger & Châtelet, 2016). The modularity of ONAAG is grafted onto a two-area pedagogical framework: hybrid-course and formative situational activities that require an active behavior of the committed learner; a pretext for guided but voluntary learning (Senge et al., 1994). The design is made to facilitate, to be integrated into the training and, by its rational and prudent use, to influence the learning behavior, taking advantage of the new ICT possibilities. Thus, it proposes a progressive increase in levels of knowledge and skills ownership, followed by the learner at his own training rhythm (in the time available), with high adaptability of the scenario for the course. ONAAG has two components connected by the LMS of the course, whose use is incorporated into the course along and through the training time (see Figure 4):
• ONAAG-1, to consolidate the pre-requisites, starts and completes the course in optimum conditions (discovery and reproduction learning levels). It is a generic guided and formal scenario implemented on the LMS Moodle as a succession of content (HTML lessons with videos for examples, exercises or tutorials depending on the topics studied) followed by a dedicated quiz for self-assessment of the knowledge acquired. Each level is unlocked depending partly on the personal progress and the completion of activities, to avoid blockages and discouragements; a structure that allows monitoring by the trainer while allowing freedom for the learner to organize his learning during the allotted time (a minimal sketch of such gating logic is given after this list). The feedback by the trainer is based on individual returns (for customization) within the forums and a flipped classroom for ownership of the group. Figure 11 and Figure 12 in Appendix 1 are two screenshots to illustrate the device.
• ONAAG-2 is skill-oriented, for ownership, operation, and work-in-transposition. It is a case-study workspace designed on the publishing model Topaz as a set of analysis routes (for automatic systems) to be followed in complete autonomy to solve the issue. The training is less formal, as decision-making is based on free choices, proposed issues and a self-assessment quiz that requires the learners to perform calculations and digital simulations, reflect and collaborate; putting the knowledge into practice for deep learning and expertise on transversal skills (project management and collective work). The guidance is limited, based on progress reports through the LMS (to limit unproductive dead-ends) but asynchronous, before a final review with the group competition in the classroom, feedback and recognition by the peers and the trainer. Figure 13 and Figure 14 in Appendix 3 are two screenshots to illustrate the Topaz solution.

Figure 4. Final framework of ONAAG hybrid training since 2014. ONAAG-1 is not specific to a TU and can be easily set thanks to modular sections (topics and levels) that can be activated or not (see Appendix 1)
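The sketch below is a minimal, hypothetical model of the gating described for ONAAG-1; in Moodle this behavior is typically configured through activity-completion and access-restriction settings rather than code, and the level names, pass mark and function are invented for the example.

```python
# Hypothetical model of the ONAAG-1 gating described above: each level pairs
# content with a self-assessment quiz, and the next level opens only once the
# quiz score reaches a pass mark. In practice this is configured in the LMS;
# the names and threshold below are illustrative only.

LEVELS = ["pre-requisites", "discovery", "reproduction"]  # illustrative level names
PASS_MARK = 0.6                                           # illustrative threshold

def unlocked_levels(quiz_scores: dict) -> list:
    """Return the levels a learner can currently access, in order."""
    opened = [LEVELS[0]]                       # the first level is always open
    for current, nxt in zip(LEVELS, LEVELS[1:]):
        if quiz_scores.get(current, 0.0) >= PASS_MARK:
            opened.append(nxt)                 # quiz passed: unlock the next level
        else:
            break                              # stop at the first level not yet passed
    return opened

print(unlocked_levels({"pre-requisites": 0.8}))                    # ['pre-requisites', 'discovery']
print(unlocked_levels({"pre-requisites": 0.8, "discovery": 0.7}))  # all three levels
```

The point of such gating, as the text notes, is to keep the learner free to organize his own time while preventing him from skipping ahead of the pre-requisites.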
Operationally, the synchronous course is adapted with respect to the group's development and customized based on the digital information received, to avoid failure and guarantee the results, in compliance with Ethics, formative assessment rules and the constraints of the accredited diploma.
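As one hedged illustration of how such digital information could feed the face-to-face session (this is not the ONAAG implementation; the data layout, names and threshold are invented), the sketch below condenses per-learner completion data into a short group summary a trainer might scan before the classroom session.

```python
# Hypothetical sketch: summarising LMS completion data before a synchronous
# session so the trainer can target remediation. Data layout and threshold
# are illustrative, not taken from ONAAG.

def group_summary(completion: dict, total_activities: int, at_risk_below: float = 0.5):
    """completion maps learner -> number of activities completed."""
    rates = {name: done / total_activities for name, done in completion.items()}
    average = sum(rates.values()) / len(rates)
    at_risk = sorted(name for name, r in rates.items() if r < at_risk_below)
    return {"average_completion": round(average, 2), "at_risk": at_risk}

print(group_summary({"A": 8, "B": 3, "C": 10}, total_activities=10))
# {'average_completion': 0.7, 'at_risk': ['B']}
```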
Figure 5. Dashboard of ONAAG to measure global satisfaction, the time spent with respect to personal organization, perceived difficulty and prior expectation (see each area of the figure for the CVT groups in 2014 and 2015, after the improvement of ONAAG, 1st component)
Summarized Feedback After Three Years
After three years of experimentation with three groups (more than 155 learners), with different uses with respect to the learning outcomes depending on the year of training or topic (Automatics or Computer Programming), ONAAG is proved to be transposable and to meet the expectations, allowing a real individual and collective progression. The asynchronous time declared by learners to be spent on the pedagogical device ONAAG varies from 12 hours to 24 hours (offline and online, i.e. less than half the duration of the topic). This estimate depends on the topic, level of outcomes, prior experience, autonomy and involvement for better ownership and adherence to this new way to learn (see Figure 5 for criteria with respect to time allotted, satisfaction with the approach, difficulty level felt and the technical aspect of ICT, and Figure 6 for global satisfaction; the target is decided with respect to expected results respecting ECTS and outcomes). Note that net accessibility and learners' availability (CVT) vary. In 2014, the tracked activity peaks are mostly before working hours, after lunch and after 8pm until very late at night; this always reflects the autonomy in the personal organization with respect to work constraints, reminders by the trainer and personal priorities in life. The digital natives are observed to be less curious about the pedagogical aspect and more demanding for immediate access to responses than older ones, who want to reflect and find the
Figure 6. Global satisfaction by learners (CVT groups in 2014 and 2015) based on 5-level Likert scale (5 being the maximum level of satisfaction) with respect to 4 areas: content, video, skating (modularity) and quiz
learning keys. Although ONAAG seems binding in 2014 (only 40% of the CVT group is interested), the curiosity trend is inverted in 2015 thanks to the new initialization of ONAAG during the first lesson to reduce the fear of the unknown and personal responsibility in the learning: guidance is not doing something for someone, but providing support and a safe environment for their personal experimentation. Prior training experience affects the level of perceived difficulty and increases the time spent on the digital learning for the ones involved but, at the end of the learning process, learners are confident in their ability (100% in 2015, among whom 46% are “strongly” confident in comparison to the rates in 2014: 86% and 17% “strongly”) and are satisfied with their new expertise: 92% in 2015, among whom 59% are “strongly” satisfied (only 14% in 2014). The time spent is then consistent with the expected objectives and the results. As times goes by, the individual organizations are moving towards more regularity and dedicated time reflecting impact, change in behavior and involvement. Comparison between 2014 and 2015 shows that the prototype was well improved based on the users’ opinion and the trainer’s decision to achieve an almost finished structure requiring new content. The last version of the first component of ONAAG is now a unique LMS course that can be customized with respect to the transversal skills required for the group of learners, making resources available or not. The solution that overcomes the specific content helps easier updates but had required a prior specific workload for system enrichment and validation. In addition, it required a complete change of the
Today, however, two colleagues involved in similar TUs have joined the project. The second component of ONAAG, with its more complex mapping, was harder to capture with respect to the content and the willingness to keep a standard structure. The difficulty lay in setting up the quiz so that it remains adaptable to any case study of a system (for updates and replication with a new case study), limiting changes to the description of the system and the tuning of the values for the correct replies while keeping the common questions as they are (trainer responsiveness). Indeed, the point of a common structure (quality of the device) is to be able to duplicate it for new example systems while developing the course materials (profitability of production). With respect to learners' satisfaction, several surveys were run and their final results compared with previous groups (taught without ONAAG). The final evaluation results of the 2014 and 2015 cohorts are higher than in 2013 (without ONAAG), with a smaller deviation within the group, denoting a higher internal transfer of expertise. The same observation holds for the group of apprentices, with all developed applications running at the end of their IT project (this was rarely the case before). Nevertheless, the chosen pedagogy disrupts the youngest learners and requires a more mature audience for an efficient digital training focused on skills. Despite some rejection by a few learners due to the active behavior required by the pedagogy and the learning goal (remarks such as "I need more lessons in class," "I want the solutions written down" and "too much information"), the global feedback is really positive: "I can learn at my own pace", "I took pleasure in trying to solve the issue", "the work group is sharing experience for a higher knowledge", "my perception of the TU has evolved, including seeing mathematics as simply a tool for Automatic Control" and "feedback is worthwhile to understand the aim of the activity". In 2016, the global satisfaction of the groups with the pedagogical device and its use is 69% (up to 93% for the CVT group) and 72% agree that ONAAG helped them develop a new set of skills (80% for the CVT audience that followed ONAAG 1 and 2). But the main recognition of the digital training innovation comes from a CVT learner: "despite my preconceived ideas about the topic (unreachable), I realize that everything is done to enable learning". The trainer's qualitative opinion refers to a change in his practice that goes further than expected, and to obstacles met due to the limited ICT possibilities afforded for new development (the technical solution is imposed) with respect to the initial training project (the wish for automatic completion of progress reports).
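To make the reported rates concrete, the following minimal Python sketch illustrates how 5-level Likert responses of the kind collected for Figures 5 and 6 could be summarized into "satisfied" and "strongly satisfied" percentages; the response values and threshold are hypothetical, and the chapter does not describe the actual tooling used.

```python
from collections import Counter

def summarize_likert(responses, satisfied_min=4, strong_level=5):
    """Summarize 5-level Likert responses (1 = lowest, 5 = highest).

    Returns the share of learners who are at least 'satisfied'
    (level >= satisfied_min) and the share who are 'strongly'
    satisfied (level == strong_level), as percentages.
    """
    counts = Counter(responses)
    total = sum(counts.values())
    satisfied = sum(n for level, n in counts.items() if level >= satisfied_min)
    strong = counts.get(strong_level, 0)
    return round(100 * satisfied / total), round(100 * strong / total)

# Hypothetical answers from a CVT group for the area "content"
answers_2015 = [5, 4, 5, 4, 3, 5, 4, 5, 4, 4, 5, 2]
satisfied_pct, strong_pct = summarize_likert(answers_2015)
print(f"satisfied: {satisfied_pct}%  strongly satisfied: {strong_pct}%")
```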
Risk-Taking with ONAAG
The feedback on the experience has demonstrated the value of the digital training solution and shed light on the main risks for the parties, together with mitigating actions:
• Rejection of the chosen pedagogy, if the process is not explained first (course initialization) and the learning is not gradual with respect to prerequisites and new concepts;
• Flyover of the notions by the learners, if the device is only considered as a use of ICT with low integration into the course scenario; this urges a blended solution with face-to-face sessions and interaction to keep momentum;
• Increasing workload, leading to low welfare at work for the parties, if the learning process is not regulated through imposed deadlines for progress reports (equally distributed over the available time), rigorous personal organization and the learner's right to ask for legitimate guidance;
• Failure of the learner, if a clear evaluation framework is not designed in alignment with the outcomes and the (formative) learning process to ensure ownership through follow-up and customized remedial work;
• Excessive risk-taking by the trainer in his area of freedom within the pedagogical team with respect to the innovative approach, if it is not contained by CoP support within the HEI organization, with service support and funds that make new pedagogical projects possible so that they can later be disseminated and industrialized. This questions the critical size and management of the HEI and is a specific concern for a senior lecturer whose mission covers three areas: research in a field of expertise, training in that field of expertise and involvement in the HEI organization.
Figure 7. New training planning with ONAAG for 2017, recursively reinforcing the learning commitment: the situation learners are put into by ONAAG 2 urges knowledge acquisition with ONAAG 1
Future Prospects for ONAAG
Although regarded as sequential components, with a step back both parts of ONAAG can be used in concert (see Figure 7), motivating learners to go deeper into the knowledge (ONAAG 1) when required by the situation (ONAAG 2) to ramp up skills. The first experimentation with ONAAG 2 showed that learners might feel lost when the situation is given as an issue in front of a complex map of unknown routes they have to explore with free decisions. To limit this lack of confidence, new videos will be added at each decision step to explain the issue and to reassure with additional guidance after the quiz; indeed, there is not a unique response but several, depending on the hypotheses made.
With ONAAG 2, what is evaluated is not the correctness of the solution but the process set up to reach it (skills). Thus, the feedback could be handled with a short online class during the alternation period, and the final review could be prepared online by the group. This is a new way to train that will unfortunately depend on the HEI for time constraints, development cost and monitoring of the trainer's work (setting and content, including videos). ONAAG allows the formalization of a Digital Integrated Learning Environment for sustainable training toward higher learning outcomes (knowledge and skills), one that spans the pedagogy, the ICT means and the content through guided activities: motivating for the audience, favoring autonomy, facilitating learner-centered gains, and ensuring the progress of individuals and collective learning through simulation and monitoring for better guidance, in real time and deferred; in this way it is fully integrated into the hybrid course. ONAAG was developed on the trainer's initiative following a continuous improvement process (Plan-Do-Check-Act, with a satisfaction survey at the end of the TU for feedback and identification of areas of improvement) and in the spirit of Agile development with the ICT services (online surveys of the learners throughout the experience and qualitative observations by the trainer himself, using the device to validate choices and prioritize orientations for an immediately available solution). Today, the trainer is autonomous in his innovative use of the device and is able to duplicate the model and adapt it with new content or a new scenario as his needs and the targeted learning outcomes evolve. In this way, ONAAG complies with the HEI orientation for ICT integration, with pedagogical efficiency and with the trainer's personal needs regarding his interest in the work, customer focus and welfare at work. The device is a way to incent more exchange in the pedagogical team, unifying topics thanks to transversal skills for a collective effort towards efficient trainings, and enhancing the CoP with respect to the WIL culture. Digital training with ONAAG strengthens the new paradigm of the teacher-learner relationship, sharing data, work experience and culture for the transfer of expertise. Flexible and accessible, it complies with the LLL and European frameworks and supplements them with the ability to learn autonomously and with digital competence (a social responsibility). The system can be enriched with time allotted to a virtual classroom, an option tested in 2015 for debriefings on the group's progress on the project, a simpler way requested by the members to overcome specific obstacles. The new version will include online meetings with a virtual classroom and virtual face-to-face tutoring for those involved, or on specific points negotiated upstream.
PROPOSAL FOR ONLINE LEARNING
Based on the developed ONAAG solution and the training experience, a proposal is made for the transfer and development of a Digital Training targeting Excellence. An upstream assumption is made that the HEI organization supports the training project (see Figure 2).
Strategic Orientation
The strategic policy of the HEI should be decided upstream based on a SWOT matrix (Strengths, Weaknesses, Opportunities and Threats). Then, resources should be prioritized to achieve the goal and a dedicated team identified, with or without contracting out the technical ICT aspect; this depends on the critical size of the HEI organization. Three areas can be identified for the issue:
• A substitution of an existing training with ICT solutions, for cost efficiency;
• An improvement of the training by adding some digital aspects, to facilitate access by different participants (place and time) or to simplify functioning;
• A change in the training offer towards a new market.
Figure 8. Sustainable circles to develop a digital training project, creating a Digital Integrated Learning Environment based on the PLE of the parties and the ICT means available
Depending on such choices, the digital integration level varies between an extended classroom (virtual), blended learning and distributed learning (with no dependence on time and place). In the same way, the resources will not be the same when the market scope is wider: from the simple solution to a complex and more demanding one for huge groups, several trainings and worldwide access; resources should be prioritized both for technique (connectivity, equipment and e-services for content) and pedagogy development (skills, dedicated time for production and implementation) respecting the expertise of the parties for sustainability (see Figure 8).
Figure 9. Proposed process to develop an efficient digital training, guided by the Excellence model
Process to Develop Digital Learning
As for any development of the training offer, once the strategy is decided, the following development process should be implemented with the selected project team, depending on the step. The proposal is based on the process described by Nuninger et al. (2016) for an efficient WIL training offer. The process in Figure 9 is adapted to include the specific constraints due to ICT integration, with in-house support or contracting out of the technical aspect. Depending on the size of the HEI, the CoP is a way to solve the issue with digital experts and pedagogical experts sharing the training culture of the organization, in order to respect the sustainable choice given in Figure 8.
One risk is to limit digital training possibilities because of the habits of the ICT support services, the technical choices imposed by the HEI for standardization, or the lack of funding or expertise for new developments. This explains the importance of driving the change in the organization based on a mixed culture of expertise. As an example, ONAAG was developed in the area referred to by the black highlighted number 3 in Figure 9, as an innovation in a given context (area 1, with the WIL requirement already defined for the training in area 2) allowed by a free area of action; it can therefore incent the process in area 4 for dissemination, depending on the chosen policy of the provider. The process has four main steps (referred to by the black highlighted numbers in Figure 9):
1. The upstream decision of the HEI to launch a Digital Training project development with identified partners (for funds and resources) and experts from the CoP;
2. The co-design of the Digital Training requirement specifications, as a compromise and shared view of the learning outcomes co-defined in the curriculum, with identified and prioritized resources for the goal;
3. The sub-contracted Digital Training development (the techniques and the content) for the operations. The responsibility is, first, the efficient management of a two-team project for the development and/or setting of ICT solutions and the development and/or capture of the pedagogical content. The issue is a sustainable relationship between teams, with appropriate pedagogical animation and training before and during the Digital Training, enhancing the CoP. The prototype is developed and tested with the first group for agile improvement and further feedback to the HEI, with continuous follow-up of learners' development;
4. The downstream capitalization of the Digital Training experience with the parties involved, for future development based on surveys, results and feedback, and for decision-making at a higher level before industrialization and dissemination through standards and an efficient training offer; this starts the change in the HEI.
The training production line is obviously the sequential content production (repository of skills, curriculum and materials) and ICT development (equipment and efficiency of ICT), then distribution to the customers for training (a business model balancing income against expenses and strategy) with respect to their personal paths and needs, using the training offer (digital or not). Such a process should be supported if a large ambition is targeted. Nevertheless, in structures allowing an area of freedom for trainers, some innovative uses can be quickly tested, based on the trainer's skills (for design, for ICT integration based on existing solutions and even for digital material with limited e-support) to initiate the process of change by being exemplary. Depending on the position and decisions, trainers will be asked to develop new projects or new content, or simply to participate in the implementation as skill resources, depending on the level of interaction required for the digital training and its ambition: delivery of content (a simple solution for online learning) or aiming at tacit skills and social and collective intelligence, which require higher involvement and skills (the complex Digital Hybrid Learning); e-learning is an intermediate solution mainly for cognitive skills.
Figure 10. Excellence concept for digital training offer
Key Levers for Success Once the specification has been made, the key levers for success are mainly related to the third step of the process to develop the prototype with the parties, requiring their full commitment for the issue: matching the learning outcomes and the pedagogical digital approach on the ground. It denotes the triplet: expertise, resources and strategy. First, the role of the trainers has to be defined (and they should be trained with respect to Digital competences but also with respect to the active pedagogy and formative assessment) because they are the ones who initiate, motivate, maintain momentum, guide and regulate the interactions in addition to their added value in terms of skills. Second, the developed Digital Integrated Learning Environment should be aligned with the pedagogy with efficient ICT to limit rejection (easy to use, accessibility and attractive). At the level of the HEI, the factor relies on: the identification of the opportunities and threats for input and mutually beneficial relationships with partners; the facilitation of the CoP to strengthen the core of the business and allow the worthwhile condition of creativity and change for an appropriate pedagogical model. The right decision-making will be made if learners’ attractors and deterrents are perfectly researched and identified with impact on the digital business model, as the funding sources. Excellence model (EFQM, 2015) should be targeted all along the production line as the spirit of the learning organization (see Figure 10): enablers are leadership, people, strategy, partnership and resources then, processes, products and services to get the results with respect to people, customer and society expectations and of course, business results. The Digital Training solution is not
just an addition of digital resources, but the alignment of pedagogical activities through the ICT medium to enhance collective interaction to learn, sharing experience.
Table 6. Levers and obstacles for the success of Digital Learning; adapted from Kluijfhout (2005)
Key factors:
- Quality assurance with the performance triangle and a SWOT analysis of the market
- LLL framework that incents flexibility and customization, using standards for capitalization
- Training opportunities of ICT innovation (access, automation, new pedagogical uses)
- Leadership to drive the change based on the CoP
- Unifying research work (expert) and pedagogy for transfer and mutual enrichment
- Enhancing the learning ability and the collective ability through new means of interaction and guidance
- Worldwide access to the study program
- Openness and autonomy (learning ability)
Obstacles met:
- No thought-out strategy, with a short-term policy
- No prioritized resources for the goal (funds)
- Resistance to change from the parties
- Unreliable technique (connectivity, facilities)
- Lack of a pedagogical approach within the digital solution
- Lack of e-service support for the parties
- Lack of continuous education (pedagogy, tutoring, ICT) and recognition of the trainers
- Lack of digital competence of the learners
- Lack of involvement
- No pedagogical team animation
- Workload due to the lack of resources or regulation
Specification of Digital Solution Versus Pedagogy: The Common Scenario
The scenario of the Digital Training is the formalization of the pedagogical use of ICT to achieve the learning outcomes, stressing the teaching and learning situations and the situational activities in conjunction with the associated technology, to build a coherent whole that reinforces learning thanks to the interaction between parties (within the group between learners, and between learner(s) and trainer); the simple addition of digital data should be avoided, as Digital Hybrid Learning and Training has a higher ambition. Depending on upstream choices and input, the increasing learning ability will come from increasing interactions between parties (content availability, information exchange and collaboration) within the digital solution level (digital access, interactive digital tools and virtual interaction online); for each level of interaction, the degree of virtuality can vary from formal face-to-face up to a 100% virtual classroom. The aim is increasing autonomy based on accessibility (through the ICT means) and on formative assessment through situations of gradual difficulty solved by increasing collective work under the guidance of the trainer, for a reflexive learning ability. Recognition of skills is made by the individual but also through peer review and by the trainer (the expert for evaluation). As a consequence, the Digital Hybrid Learning solution should (inspired by Southard et al., 2015):
• Have a consistent structure (logical, intuitive and compliant with the pedagogical needs);
• Allow a high-impact introduction (for motivation and for clarity about the trainer's role and the learning outcomes);
• Offer rich and dynamic instructional content for guidance and new collective interactions;
• Provide interactive content to increase autonomy.
However, the richness still lies in the skilled trainer, with real-time intervention in the virtual classroom, and in the degree of cooperation to develop the digital aspect and to work through the digital solution. This is a twofold mission for the trainer who should, from an operational point of view, follow a DMAIC approach to design the hybrid solution while implementing the scenario of the blended course:
• "Define": To initiate the learning process at the first meeting with the group, specifying outcomes, pedagogical method and expected attitudes, but also unveiling the online device and proposing some advice for getting started with it, to help acceptance;
• "Measure and Analyze": To guide the learners through an individual discovery of knowledge in order to understand and try to apply concepts in controlled situations with self-assessment, identifying their main difficulties; a way to motivate interactions in the group to overcome the blocks before the next synchronous meeting;
• "Improve and Correct": To regulate and evaluate the learning progression of the group thanks to worthwhile feedback and debriefings based on knowledge sharing and collective work, helping individuals to adapt their attitude and organization to improve their learning ability and skills in the topic.
The process is a recursive one along the training path, with the gradual learning outcomes made clear and real-time setting of the digital online device (including scenario setting, modification of the content and data update but also IT improvement as new innovation) in order to meet the group’s expectation and guarantee the learning outcomes while integrating the group specificities in the given context. The following main criteria for learners’ satisfaction and their attraction should be kept in mind: the interactivity (fun and feedback), the pedagogical approach and quality of pedagogical materials, the efficient guidance by the trainer, then the efficiency of the ICT in addition to the usual motivation levers found for WIL: training based on work expectations, self-education resources, support by the parties among whom the trainer-tutor and of course, diploma or qualification, recognition and quality of materials and collective experience. The recognition by the learners of the Digital Hybrid Training will lead to the recognition of the HEI for return on investment: online learning as a consumer appeal, a support to blended courses through standards and a new motivation for the pedagogical innovation in the CoP. Digital Hybrid Training is a way to customize the learners’ learning route with more flexibility and cost efficiency. Table 6 summarizes the key factors and obstacles met during Digital Hybrid Training development project.
FROM THE LOCAL INITIATIVE TOWARDS THE INSTITUTIONAL LEVEL
Global Vision and Coherence in the Context
The integration of digital elements in the training should be made in a coherent and comprehensive way across the total training offer, depending on the targeted market and the specificities of the training (i.e., the learning outcomes and the learners). The global point of view is a way to smooth the use of resources and to benefit from existing modular digital solutions. The stakes are a higher responsiveness of the HEI to training tenders, profitability as customization is made possible for a personal training route in the scope of LLL, safety of training solutions based on CoP resources, and the flexibility to achieve a high level of quality in trainings. This is the responsibility of the HEI steering committee in its specific context (see the black highlighted number 1 in Figure 9). The business models depend on the critical size of the HEI, with support services in the organization and the possibility of a dedicated team for the digital project; a production line to be identified and managed without neglecting the social responsibility towards the skilled workforce. The partnerships of the HEI represent both opportunities and obstacles considering competition (industrial groups develop their own training systems for professional skills or prefer to buy
turnkey solutions), contract out of the technical part and even subcontract of the operational follow-up of learners based on automation with ICT. It questions the place of the University in the pedagogical innovation and the inner values, then, its ability to make business (see black highlighted numbers 4 and 2 in Figure 9) based on research work and creativity of trainers (senior lecturers) (see number 3).
Risk-Taking and Conflicts of Interest A deeper analysis of the situation points out the following questions: who should be the persons in charge of pedagogical creativeness? Who should be in charge of the industrialization of the training device once imagined and designed (for technical production, dissemination and implementation over time)? How to recognize the added value of the CoP in the HEI, of the individual in the collective work at the university and his personal creativity? How to solve the dialogism between an area of autonomy for experimentation and innovation and at the same time be efficient to the business with immediate return on investment, respectful of the value of education and training, and controlled risk-taking? Indeed, the experimentation in a limited area is always at low cost with limited risk-taking and immediate feedback and control (for a small group, with only one trainer leading the course, when using an already existing IT solution as a basis for change), whereas at a higher scale, substantial resources should be found for quality and safety of the training devices due to the impact on the image of the HEI (market issue) facing the internal issue of change management leading to this new global vision. Indeed, the risk-taking increases with backlashes brought on by specific initiatives that gain momentum, involuntarily highlighting differences in the training and between training paths in the institution, requiring balanced considerations. The model of Excellence for the HEI can allow a sustainable development of the structures: with guidance of the workforce for change and a regulated momentum; with leaders able to move the barriers and then, with processes for decision-making, a decided level for a functional splitting of activities and of responsibilities into profit centers for local regulation and higher supervision.
Integrating the Dissemination Requirements at the Provider Scale from the Start With respect to the presented experimentation, some solutions should be found. Once the digital training device is thought out, experienced as a prototype and validated, the solution should be capitalized. Digital training is both pedagogical use (tested with groups) and technical ICT solution. Although the prototype can be based on available solutions to be adapted, transfer and dissemination require a new level of product quality. Depending on context in the HEI, lack of ICT supports (digital resources to manage a huge number of learners, skills for specific coding, time to implement and test) are faced that might lead to failure or renunciation. Indeed, for ambitious pedagogical innovations, some technical choices imposed by the HEI will kill off the idea before it starts with higher discouragement. Areas of autonomy provided by the CoP will be a way to allow experimentation and a way to afford multi-expertise for the goal, enhancing an agile approach and continuous improvement. These are the responsibilities of the HEI in order to help the evolution from the prototype (for a selected group) to the industrialization at a larger scope with respect to market and training offer. As a consequence, the device should be flexible and upgradeable: first, for the content to enrich the solution and second, with respect to digital possibilities for the parties. But, the underlying requirement is the transferability to the pedagogical team of the expertise of the trainer committed in the prototype. The passing of power between the project develop358
ment team and the operational one can be painful as for the trainer it is somehow a loss of his creation. In addition, many training devices are efficient because of the trainer’s skills; he is the one who makes the difference. The concept of digital learning includes the required interaction between trainers and trainees. Then, only the CoP can mitigate the risk as the e-learning is not limited to technique, richness comes from its use; as a piece of art, the course is a creation for each new situation and a mutual enrichment between parties. The issue of Digital Learning (and then Digital Hybrid Training) is therefore to facilitate such an interaction, taking advantage of new ways to communicate, exchange and manage data for the learning interest; the pedagogical act. ICT is not the challenge in itself, nor the pedagogy, but the association of both for the upstream identified needs. The rest of it is based on the management and the leadership. Teaching is not just a reproduction but a transfer for the personal evolution of the parties, trainer included; otherwise, limited to techniques, the involvement will decrease. ICT is a way to facilitate within automation some tasks as a prior-analysis of the learners’ results to help focus on the key elements for learners’ guidance. Finally, digital learning puts the parties at the center of the training act. In this way, the rate of efficient Digital Hybrid Learning in the training offer is a measure of the changing culture. The dissemination and industrialization amount will depend on the critical size of the HEI and its policy direction.
CONCLUSION
In this chapter a proposal is given to develop efficient Digital Hybrid Training, i.e. training that allows mutual benefit for the parties through increasing pedagogical interaction (training/learning) made possible by the new opportunities of ICT means. It is a way to enhance learning outcomes based on a formalized pedagogical approach in a given context, mixing known efficient learning approaches with guidance by trainers and innovative use of available or improved ICT tools. A development process is given, based on four steps: the strategic decision; digital training specifications based on the learning outcomes shared with partners; agile development by a team with mixed expertise in the techniques and the pedagogy, targeting a prototype validated with the parties involved (learners included); then a decision-making act for capitalization and dissemination with respect to the performance efficiency of the business. Depending on the size of the HEI and the local context supporting innovation and the CoP, the level of virtuality in the training is questioned, as it is not just an addition of ICT means but a reasoned use of them for the goal. Fully integrated solutions will create a Digital Integrated Learning Environment to enhance learning ability, digital competences and collective intelligence, while offering better access and facilitation of the learning to ramp up skills in the targeted fields. The key targets are accessibility, personalization, a lifelong learning framework and interactive activities for autonomy and guidance with virtual social interactions. The business model will depend on the shareholders and the responsibility of the HEI in its context, the added value of the training offer to the business including digital learning (at different levels of interaction, virtual or not, online or not), and the importance given to the core of expertise (i.e. the trainers) through the Community of Practice to share culture. Success allows the identification of good practices to develop the training with an excellence focus, to handle the complex sustainable training system: society needs, economic context (funds/needs/training/applied innovation) and the HEI (upstream innovation/efficient training). As with the Work Integrated Learning requirement, Digital Hybrid Learning encourages a new paradigm for the trainers and also the learners, and a learning organization for the HEI to help transfer knowledge, access and skills with support.
REFERENCES Albion, P. R. (2001). Some Factors in the Development of Self-Efficacy Beliefs for Computer Use Among Teacher Education Students. Journal of Technology and Teacher Education, 9(3), 321–347. Becker, H.-J. (2000). Who’s Wired and Who’s Not: Children’s Access to and Use of Computer Technology. The Future of Children. Children and Computer Technology, 10(2), 44–75. Biddix, J. P., Chungb, C.-J., & Parkc, H. W. (2014). The hybrid shift: Evidencing a student-driven restructuring of the college classroom. Computers & Education, 80, 162–175. doi:10.1016/j.compedu.2014.08.016 Biggs, J. B. (2003). Constructive alignment in university. HERDSA Review of Higher Education, 1 Biggs, J.B. (2915). Teaching for quality learning at university. Buckingham: The Open University Press. Bloxham, S., & Boyd, P. (2007). Developing Assessment in Higher Education: A Practical Guide, Ed:Open University Press Brackett, M. A., Rivers, S. E., & Salovey, P. (2011). Emotional Intelligence: Implications for Personal, Social, Academic, and Workplace Success. Social and Personality Psychology Compass, 5(1), 88–103. doi:10.1111/j.1751-9004.2010.00334.x Brown, S. (2004). Assessment for learning. Learning and Teaching in Higher Education, 1(may), 81–89. Bruet, J. (2015). Intégrer le digital learning: la mutation technologique des services de formation. Suisse:e-doceo. Cendon, E. (2016). Bridging Theory and Practice – Reflective Learning in Higher Education. In W. Nuninger & J.-M. Châtelet (Eds.), Quality Assurance and Value Management in Higher Education. Hershey, PA: IGI Global. doi:10.4018/978-1-5225-0024-7.ch012 Chacón, F. (1992). A taxonomy of computer media in distance education. Open Learning, 7(1), 12–27. doi:10.1080/0268051920070103 Conole, G. (2010). Review of pedagogical models and their use in e-learning, Retrieved from http:// cloudworks.ac.uk/cloud/view/2982 Couros, A. (2003). Communities of Practice: A Literature Review. Retrieved from https://www.tcd.ie/ CAPSL/_academic_practice/pdfdocs/Couros_2003.pdf Davies, P. (2007). The Bologna Process and University Lifelong Learning: The State of Play and future Directions. BeFlexPlus. Retrieved from http://www.eucen.eu/BeFlex/FinalReports/BeFlexFullReportPD. pdf Downes, S. (2012). Connectivism and Connective Knowledge: Essays on meaning and learning networks. Retrieved from http://www.downes.ca/ Drexler, W. (2010). The networked student model for construction of personal learning environments: Balancing teacher control and student autonomy. Australasian Journal of Educational Technology, 26(3), 369–385. doi:10.14742/ajet.1081
Etzkowitz, H. (2008). The Triple Helix: University–Industry–Government Innovation in Action. New York: Routledge. doi:10.4324/9780203929605 Etzkowitz, H., & Leydesdorff, L. (2000). The dynamics of innovation. In Science and Technology (pp. 109-123). European Association for Quality Assurance in Higher Education-ENAQ. (2015). Standards and Guidelines for Quality Assurance in the European Higher Education Area, Retrieved from http://www.enqa. eu/wp-content/uploads/2015/11/ESG_2015.pdf European Commission-EC. (2007), The Key Competences for Lifelong Learning – A European Reference Framework, Annex of a OJ of the EU 2006/L394), Education and culture DG, Ed: Official publications of the EC European Commission-EC. (2010). EUROPE 2020: A strategy for smart, sustainable and inclusive growth, Retrieved 2016, April from http://ec.europa.eu/europe2020/index_en.htm European Commission-EC. (2014), A common European Digital Competence Framework for Citizens (DIGCOMP), Eramus+. Retrieved from http://openeducationeuropa.eu/sites/default/files/DIGCOMP%20 brochure%202014%20.pdf European Foundation for Quality Management-EFQM. (2013). An overview of the EFQM Excellence Model. Retrieved from http://www.efqm.org Farinha, L., & Ferreira, J. J. (2013). Triangulation of the Triple Helix: A Conceptual Framework (White Paper). Triple Helix. Ferrari, A. (2013), DIGCOMP: A Framework for Developing and Understanding Digital Competence in Europe. Goleman, D. (2007). Social intelligence: the new science of human relationships, Reprint Editdion. Bantam. Homan, G., & Macpherson, A. (2005). E-learning in the corporate university. Journal of European Industrial Training, 29(1), 75–90. doi:10.1108/03090590510576226 Kluijfhout, E. (2005). E-learning business models: Pedagogical approaches, design implications, and prerequisites for e-learning. Retrieved from http://fr.slideshare.net/eric.kluijfhout/pedagogical-approachesdesign-implications-and-prerequisites-for-e-learning Kolb, D. A. (1984). Experiential Learning - Experience as the source of learning and development. Englewoods Cliffs, NJ, USA: Prentice-Hall. Lave, J., & Wenger, E. (1991). Situated learning: legitimate peripheral participation. Cambridge University Press. doi:10.1017/CBO9780511815355 Leclercq, D., & Poumay, M. (2005). The 8 Learning Events Model and its principles. LabSET. Retrieved from http://www.labset.net/media/prod/8LEM.pdf Mittal, A. (2010). Framework of e-learning business model. Retrieved from http://fr.slideshare.net/mittalashi/framework-of-e-learning-business-models
Moore, J. L., Dickson-Deane, C., & Galyen, K. (2011). e-Learning, online learning, and distance learning environments: Are they the same? The Internet and Higher Education, 14(2), 129–135. doi:10.1016/j. iheduc.2010.10.001 Nuninger, W., & Châtelet, J.-M. (2014). Engineers Abilities Improved Thanks to a Quality WIL Model in Coordination with the Industry for Two Decades.[IJQAETE]. International Journal of Quality Assurance in Engineering and Technology Education, 3(1), 15–51. doi:10.4018/ijqaete.2014010102 Nuninger, W., & Châtelet, J.-M. (2016). Hybridization-Based Courses Consolidated through LMS and PLE Leading to a New Co-Creation of Learning: Changing All Actors’ Behavior for Efficiency. In D. Fonseca & E. Redondo (Eds.), Handbook of Research on Applied E-Learning in Engineering and Architecture Education (pp. 55–87). Hershey, PA: Engineering Science Reference; doi:10.4018/978-14666-8803-2.ch004 Nuninger, W., Conflant, B., & Châtelet, J.-M. (2016), Roadmap to Ensure the Consistency of WIL with the Projects of Companies and Learners: A Legitimate and Sustainable Training Offer, In Nuninger W. & J.-M. Châtelet J.M (Eds.) Advances in Educational Marketing, Administration, & leadership (AEMAL) Book Series (chap. 8), IGI Gobal. doi:10.4018/978-1-5225-0024-7.ch008 Pang, Y. J. (2010). Techniques for Enhancing Hybrid Learning of Physical Education, In P. Tsang et al. (Ed.) Hybrid Learning 3rd International Conf., Beijing, China, August 16-18,Proceedings LNCS 6248 (pp. 94–105), Berlin Heidelberg: Springer-Verlag doi:10.1007/978-3-642-14657-2_10 Sambrook, S. (2003), E-learning in small organizations, Education + Training, 45 (8/9), pp. 506-516 Scharmer, C. O. (2009). Theory U: Leading from the Future as It Emerges. Berrett-Koeheler Publishers. Schein, E. H. (2013), “Humble Inquiry: The Gentle Art of Asking Instead of Telling”, Ed: BerrettKoehler Publishers Senge, P. & al. (1994). The fifth discipline fieldbook. London: Nicolas Brealey Publishing. Southard, S., Meddaugh, J. & France-Harris, A. (2015). Can SPOC (Self-Paced Online Course) Live Long and Prosper? A Comparison Study of a New Species of Online Course Delivery, Online Journal of Distance Learning Administration, XVIII(2; spring) Stone, D., & Heen, S. (2014), “Thanks for the feedback: The Science and Art of Receiving Feedback”, Ed: Viking Wenger, E. (2000), “Communities of Practice and Social Learning Systems”, in Organization, 7(2), 225-246, Ed: SAGE London doi:10.1177/135050840072002 Yang, J., Schneller, Ch., & Roche, S. (2015). The Role of Higher Education in Promoting Lifelong Learning, Ed: UNESCO Institute for Lifelong Learning
KEY TERMS AND DEFINITIONS Blended Courses (Hybrid Courses): Differ from traditional face to face lessons by mixing learning with synchronous and asynchronous activities to prepare face-to-face feedback and knowledge complements; both could be carried out online with ICT as LMS, virtual classroom or other means. Continuous Vocational Training (CVT): Programs for employees coming back to the university to improve their skills financed by companies, State support or personal funds. By contrast, apprentices in Initial Vocational Training (IVT) are younger people. Digital Hybrid Training (Or Learning): Is considered by the author as a full integration of the digital in the learning scenario with online interactive content, digital activities and increasing virtual interactions (in the group and with the trainer). Focus is put on increasing learning ability and the specific learning outcomes; both based on digital competence and social intelligence. It requires a new trainer’s behavior and a commitment of the parties. Hybridization: Refers to a voluntary act mixing in an integrated manner ICT and traditional pedagogical methods to achieve a specific learning outcome; knowledge, skills and transversal abilities (intercultural, collaborative); changing attitude and creating PLE. Learning Management System (LMS): The LMS is a digital tool focused on the trainer’s expectation and therefore, dedicated to the distribution of training and especially distance-learning. The Content Management System it is based on allows a digital storage of the trainer’s pedagogical material (manage and publish for a given group of users). An LMS is a way to create dynamic interaction in the group. Lifelong Training: Learning throughout life; the aim is to give access to training at all age and at any time, whatever the personal experience is (previous work experience, grade, …) to achieve new goals with respect to qualifications for new opportunities. ONAAG: Acronym of the innovative and learner-centered project “Outil Numérique d’Appui de l’Auto-Formation Guidée” (Digital Support for Guided Self-Learning); in order to facilitate knowledge acquisition and autonomy, stressing the interest of collaborative work in a safe learning environment. The project was supported by the University of Lille in the framework of the development of new pedagogy integrating ICT from 2014-2016. Rapid Application Development (RAD) and Agile approach: Is a software development approach that gives more importance to results than to planning tasks, focusing on continuous adjustment in reaction to progress and new information. In the same way as Agile Development Method, the client is involved at all steps and prototypes are quickly produced and tested to validate the functions. The spirit can be extended to other kinds of project to enhance the sharing of expertise with product focus. Trainer: Generic term for the ones who lead the courses; the expected involved teachers, teacherresearchers, experts and even instructors and tutors who will change attitude in the act of teaching depending on the context and group, guiding and supporting the learners (also referred to as students in Higher Education, apprentices or trainees in WIL) for a better reflexive learning process and autonomy; not just providing knowledge but giving keys to facilitate the personal evolution. 
Work Integrated Learning (WIL): Generic term for trainings that alternate learning at school and training in the workplace with Formative Work Situation (FWS), unifying educational and professional approaches for mutual benefit. Pedagogy is learner-centered, focuses on competences and puts the learner into a situation for personal change of behavior to enhance the “learning by doing” motto. WIL requires joint commitment of parties and develops collective and social intelligence.
APPENDIX 1: ONAAG 1 IMPLEMENTED ON THE LMS
The first component of ONAAG is implemented on Moodle following a formal scenario built on modular sections (see Figure 11), each composed of content (an HTML lesson with some videos of corrected examples) followed by self-assessment tests (see Figure 12) and completed with additional documents (PDF). The solution starts with an introductory video, the specification of the learning outcomes and a survey to identify the group (previous training path). The interest mostly lies in the modular structure, which can easily evolve (number and organization of the sections once created) with respect to the group and its progression, and in the autonomous organization permitted for working through each section. The synchronous lessons allow feedback and validation. Badges of success are issued after each learning level (section) to keep motivation high; the target is mostly knowledge of the topic. At the end, a satisfaction survey is proposed for improvement of the device.
Figure 11. Screenshot showing the modular framework of ONAAG 1 (topic: Laplace Transform) with the 4th section opened, showing the lesson and test
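As a complement to Figure 11, the following short Python sketch models the same modular logic outside of Moodle (this is an illustrative abstraction, not the actual Moodle implementation): each section bundles a lesson, optional videos and a self-assessment quiz, and a badge is awarded once a hypothetical pass mark is reached.

```python
from dataclasses import dataclass, field

@dataclass
class Section:
    """One modular section of ONAAG 1: lesson content, optional videos,
    a self-assessment quiz threshold and a badge awarded on success."""
    title: str
    lesson_html: str
    videos: list = field(default_factory=list)
    pass_mark: float = 0.6          # hypothetical threshold
    badge: str = ""

    def award_badge(self, quiz_score: float):
        # Badges keep motivation high after each learning level (section).
        return self.badge if quiz_score >= self.pass_mark else None

# A hypothetical course outline mirroring the modular framework
course = [
    Section("Introduction and outcomes", "intro.html", ["welcome.mp4"], badge="Starter"),
    Section("Laplace Transform basics", "laplace.html", ["example1.mp4"], badge="Level 1"),
    Section("Transfer functions", "transfer.html", badge="Level 2"),
]

print(course[1].award_badge(quiz_score=0.75))   # -> "Level 1"
```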
Figure 12. Screenshot of a question in a test of ONAAG 1 for functional analysis (topic is Computer programming)
APPENDIX 2: ONAAG 2 BUILT ON THE PUBLISHING MODEL TOPAZ
The second component of ONAAG provides a study workspace and sets a challenge to put the knowledge into practice in a new, unknown context. The path is entirely free to follow since, for each route presenting a situation to be solved, all the required data are given. At each node of the decision tree (see Figure 13), the learners are guided by a short video explaining the challenge. Whatever the situation, it should be thought through to identify the outcomes, then solved under assumptions decided by the learners: the result should be predicted by theory (calculus) and the hypothesis validated through simulation. A quiz allows self-evaluation of the result, with improvement supported by a guided solution and support pages. Progress reports are uploaded to the LMS for the trainer as material for further debriefing. The expertise is validated at the end of the process through an oral presentation based on this material; the target is higher competence in the topic. The interest lies in the autonomous organization and in the collective work required for result validation by peers. Figure 14 is a screenshot showing the layout of the ONAAG 2 screen with the resources.
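The decision tree of Figure 13 can be abstracted as a graph of challenge nodes that learners explore freely, each node pairing a short briefing video with the given data and a quiz for self-evaluation of the predicted result. The Python sketch below is a hypothetical model of that structure; node names, values and the tolerance rule are illustrative and are not taken from the Topaz implementation.

```python
from dataclasses import dataclass, field

@dataclass
class ChallengeNode:
    """A decision step of ONAAG 2: video briefing, given data,
    an expected value checked by the quiz, and the next open routes."""
    name: str
    video: str
    data: dict
    expected: float
    tolerance: float = 0.05
    next_nodes: list = field(default_factory=list)

    def check(self, predicted: float) -> bool:
        # Self-evaluation: the prediction (theory) must match the
        # simulated/expected value within a tolerance before moving on.
        return abs(predicted - self.expected) <= self.tolerance * abs(self.expected)

# Hypothetical fragment of the decision tree
tuning = ChallengeNode("Tune the controller", "tuning.mp4",
                       {"gain": 2.0}, expected=0.95)
modelling = ChallengeNode("Model the system", "modelling.mp4",
                          {"step_response": "data.csv"}, expected=1.8,
                          next_nodes=[tuning])

print(modelling.check(predicted=1.75))   # True: within 5% of the expected value
```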
Figure 13. Simplified decision tree with one described challenge path for the case study implemented in ONAAG 2 to put knowledge of Automatic Control into practice (2016 version): make an assumption, apply the theory and analyze the result with a self-assessment test. The role of the trainer is indicated.
Figure 14. Screenshot of ONAAG 2 (2016 version), with advice on getting started with the pedagogical device
Chapter 16
Electronic Learning: Theory and Applications Kijpokin Kasemsap Suan Sunandha Rajabhat University, Thailand
ABSTRACT This chapter aims to explain the overview of electronic learning (e-learning); the emerging trends in e-learning; the important factors of e-learning; the relationships among e-learning quality, learning satisfaction, and learning motivation; the implementation of e-learning; the approaches and barriers to e-learning utilization; e-learning for medical and nursing education; and the significance of e-learning in modern education. When compared to the traditional mode of classroom learning, there is clear evidence that e-learning brings faster delivery, lower costs, more effective learning, and lower environmental impact in the modern learning environments. E-learning allows each individual to tackle the subject at their own pace, with interactive tasks being set in place to ensure a thorough understanding throughout each module. The chapter argues that utilizing e-learning has the potential to increase educational performance and reach strategic goals in modern education.
INTRODUCTION
In the knowledge society, electronic learning (e-learning) has built on the extensive use of information and communication technology (ICT) to deliver learning and instruction (Navimipour & Zareie, 2015). With the advent of information technology (IT), teaching and learning using e-learning systems have become common phenomena in recent years (Islam, 2016). The widespread utilization of ICT and the resulting access to the Internet have enabled the convergence of e-learning with the daily practices of educational institutions (Bates, 2005). E-learning is part of the educational process at many levels of education, from primary education to higher education and extending to the postgraduate level (Decman, 2015). E-learning is an effective way to provide continuing education and has been shown to be an effective method in modern education (Lahti, Kontio, & Välimäki, 2016). Modern technologies significantly contribute to flexible modes of teaching and learning (Bharuthram & Kies, 2013). E-learning is one of the most significant developments in both schools and companies
(Violante & Vezzetti, 2015) and is a pattern of distance learning that is completely virtualized through the Internet (Lara, Lizcano, Martinez, Pazos, & Riera, 2014) toward delivering the learning contents to learners (Farid et al., 2015). E-learning is considered as a fundamental part of student’s learning experience in higher education (Urh, Vukovic, Jereb, & Pintar, 2015). E-learning allows students to choose learning contents and tools appropriate to their learning interests, needs, and skill levels in modern learning environments (Kasemsap, 2016a). This chapter aims to bridge the gap in the literature on the thorough literature consolidation of elearning. The extensive literature of e-learning provides a contribution to practitioners and researchers by describing the advanced issues of e-learning in order to maximize the educational impact of e-learning in modern education.
Background Since the late 1990s, the utilization of learning management systems (LMSs) for online education has steadily increased in higher education (Islam, 2016). LMSs have become indispensable tools for online education LMS, also known as course management system (CMS) or the virtual learning environment (VLE), is an e-learning system that has been widely adopted by universities (Islam, 2014). CMS in higher education has emerged as one of the most widely adopted e-learning platforms (Kim, Trimi, Park, & Rhee, 2012). VLE is widespread in higher education (McGill & Hobbs, 2008), typically used to support e-learning by providing online courses and other learning activities (Lim & Chiew, 2014). VLE constitutes the current information systems-related category for the online training and development (Mueller & Strohmeier, 2010). The LMS is web-based software that is utilized for the delivery and management of online education and training (Limayem & Cheung, 2011). The LMS contains the important features for distributing courses over the Internet and online collaboration (Islam, 2016). Whether focusing on distance education or classroom-based education, most universities utilize LMSs to improve the learning and teaching processes (McGill & Klobas, 2009). While much of the e-learning development at universities in the past 15 years has been on the institutionally supported LMS, alternative educational technologies are accomplished concerning the rapid growth in emerging technologies and social media platforms (Scott, 2013). Social media enables the creation of knowledge value chain to customize information and delivery for a technological business growth (Kasemsap, 2014). Normark and Cetindamar (2005) defined e-learning as the ability of system to electronically transfer, manage, support, and control both learning activities and learning materials. E-learning plays an important role in education as it supports online teaching via computer networks and provides educational services by utilizing IT (Ahmad, Härdle, Klinke, & Alawadhi, 2013). E-learning can improve the learning efficiency (Ludwig, Bister, Schott, Lisson, & Hourfar, 2016) and should be utilized to arrange the traditional ways of teaching in modern education (Lüdert, Nast, Zielke, Sterry, & Rzany, 2008). The successful adoption of ICT to enhance e-learning can be very challenging, requiring a complex blend of technological, pedagogical, and organizational components, which may require the resolution of contradictory demands and conflicting needs (McPherson & Nunes, 2008). The integration of ICT into educational environments has made the important contributions to the learning processes (Drigas, Ioannidou, Kokkalia, & Lytras, 2014) and has accelerated the developments in the e-learning environments (Ozyurt & Ozyurt, 2015). Web-based learning allows students to learn at their own pace, access the information at a time that is convenient for them, and provides modern 368
education to the remote students (Kasemsap, 2016b). Learning style is one of the most important parameters to be utilized for taking into consideration individual differences while creating adaptive learning environments (Liegle & Janicki, 2006). By means of e-learning systems, teachers can immediately adjust the learning schedule for each student regarding student’s achievement and build more adaptive learning environments (Cheng, Wei, & Chen, 2011). Learners have the different learning styles, cognitive traits, learning goals, and varying progress of their learning over period of time, which affects the learner’s performance while providing the same package of the learning course to all learners (Premlatha & Geetha, 2015). Creating the learning environment for students based on their learning characteristics in e-learning environments, which are the products of technology, is easier than creating it in traditional classroom environments (Ozyurt, Ozyurt, Guven, & Baki, 2014). Marković et al. (2013) stated that adaptive educational systems attempt to maintain a learning style profile for each student and use this profile to adapt the presentation and navigation of instructional content to each student. van Seters et al. (2012) indicated that one way to personalize instruction is by using adaptive e-learning to offer the training of varying complexity. Adaptive e-learning system can create the personalized learning materials and scenarios, individually adapted by students, in accordance with their targeted knowledge level, background knowledge, learning styles, and interests for various learning topics (Jovanovic & Jovanovic, 2015). Adaptive e-learning environments can provide students with individualized environments, such as different learning strategies and sources, support for solution, and interfaces taking into consideration individual differences (Yasir & Sami, 2011). Adaptive e-learning environments create higher satisfaction level, diminish learning time, and increase the educational achievement of students (Popescu, 2010).
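To make the idea of adaptation concrete, the following Python sketch shows one simple, rule-based way a system could pick a content variant from a learner's style profile and latest achievement; the profile dimensions, thresholds and variant names are hypothetical and only illustrate the kind of adaptation the cited works describe.

```python
def select_variant(profile: dict, last_score: float) -> dict:
    """Pick a learning-material variant from a (hypothetical) learner profile.

    profile: e.g. {"modality": "visual" or "verbal", "pace": "fast" or "steady"}
    last_score: achievement on the previous module, in [0, 1].
    """
    return {
        "format": "video" if profile.get("modality") == "visual" else "text",
        # Struggling learners get extra worked examples and a core-level path.
        "worked_examples": last_score < 0.6,
        "difficulty": "advanced" if last_score >= 0.8 else "core",
        "schedule": "compressed" if profile.get("pace") == "fast" else "weekly",
    }

print(select_variant({"modality": "visual", "pace": "steady"}, last_score=0.55))
# {'format': 'video', 'worked_examples': True, 'difficulty': 'core', 'schedule': 'weekly'}
```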
ADVANCED ISSUES OF ELECTRONIC LEARNING IN MODERN EDUCATION This section emphasizes the overview of e-learning; the emerging trends in e-learning; the important factors of e-learning; the relationships among e-learning quality, learning satisfaction, and learning motivation; the implementation of e-learning; the approaches and barriers to e-learning utilization; e-learning for medical and nursing education; and the significance of e-learning in modern education.
Overview of Electronic Learning
The proliferation of the Internet and World Wide Web applications has created new opportunities as well as new learning challenges for institutions and individuals who are either receiving or delivering education (Büyüközkan, Ruan, & Feyzioğlu, 2007). Along with the rapid development of IT and the increasing demand for building business continuity capabilities, e-learning has emerged as an applicable solution for on-demand training and organizational learning (Liu & Wang, 2009). The remarkable velocity and volatility of modern knowledge require modern learning methods that offer learning efficiency, task relevance, and personalization (Acampora, Gaeta, & Loia, 2011). It is desirable for a ubiquitous e-learning environment to provide user-oriented personalization of e-learning materials (Muntean & Muntean, 2009). E-learning content can be produced and embedded in learning practice in modern education, work-based learning, and community learning contexts (de Freitas, 2007). E-learning content includes expert and peer education, case studies, interactive games, and modeling of best-practice processes (Peterson, Robinson, Verrall, Quested, & Saxon, 2007). Learners' performance can be enhanced by delivering e-learning content that suits their learning styles (Deborah, Baskaran, & Kannan, 2014). Recent developments and new directions in education have emphasized learners' needs, profiles, and pedagogical aspects through learner-centered approaches in educational settings (Yalcinalp & Gulbahar, 2010). As technology is used more in education, teachers' roles are increasingly integrated with those of support staff, administrators, and technical staff (Gunga & Ricketts, 2007). E-learning systems play an increasing role in educational environments around the world (Decman, 2015). There has been significant recent interest in the dynamics of institutional change and e-learning (Nichols, 2008). The increase in e-learning application has prompted many institutions to adopt a whole learning organization approach to teacher professional development (Wilson, 2012). Teacher professional development is the process of improving and increasing the capabilities of teachers through access to education and training opportunities in the workplace (Kasemsap, 2017a). Stein et al. (2011) indicated that there are five notions of e-learning: as tool and equipment, as a facilitator of interaction, as learning, as a reduction in distance, and as a collaborative enterprise. For the production of e-learning environments, a wide range of software solutions can be applied, which differ in functionality and vary in cost (Woelber, Hilbert, & Ratka-Krüger, 2012). Research on web-based e-learning education focuses on the inclusion of new technological features and the exploration of software standards (Alonso, López, Manrique, & Viñes, 2005). Research into e-learning has changed in focus and breadth over the last four decades as a consequence of changing technologies and changes in educational policies and practices (Cox, 2013). E-learning research is at the early majority stage, and its focus has shifted from the effectiveness of e-learning to teaching and learning practices (Hung, 2012). The technology acceptance model (TAM) can predict teachers' intentions to continue using e-learning for professional development based on perceived ease of use and perceived usefulness (Smith & Sivo, 2012). The ease of use and intuitiveness of Web 2.0 technologies allow the creation of e-learning environments that realize activity-rich pedagogical models and facilitate the competence development of students (Schneckenberg, Ehlers, & Adelsberger, 2011). E-learning service quality, course quality, perceived usefulness, perceived ease of use, and self-efficacy have direct effects on students' behavioral intention to utilize e-learning systems (Li, Duan, Fu, & Alford, 2012).
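Because TAM is, at its core, a regression-style model of behavioral intention, a small worked sketch may help. The code below is illustrative only: the survey items, scale, toy data, and fitted coefficients are assumptions and do not reproduce the instrument or results of Smith and Sivo (2012).

# Illustrative TAM-style model: predict intention to continue using an
# e-learning system from perceived usefulness (PU) and perceived ease of
# use (PEOU). Toy data and features are assumptions for demonstration only.

import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical survey responses on a 1-7 Likert scale: [PU, PEOU]
X = np.array([[6, 5], [7, 6], [3, 4], [2, 2], [5, 6], [4, 3], [6, 7], [1, 2]])
# 1 = intends to keep using e-learning for professional development, 0 = does not
y = np.array([1, 1, 0, 0, 1, 0, 1, 0])

model = LogisticRegression().fit(X, y)
print("coefficients (PU, PEOU):", model.coef_[0])
print("P(continue | PU=5, PEOU=4):", model.predict_proba([[5, 4]])[0, 1])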
Emerging Trends in Electronic Learning
Recent developments in e-learning specifications, such as learning object metadata (LOM), the sharable content object reference model (SCORM), and learning design, together with other pedagogical research in semantic e-learning, show a trend of applying innovative computational techniques, especially Semantic Web technologies, to evolve existing content-focused learning services into semantic-aware, personalized learning services (Huang, Webster, Wood, & Ishaya, 2006). Learning design involves a wide set of knowledge, skills, and competencies, including learning theory and its applications, course design principles, use of media, use of different technologies, and relevant business processes (MacLean & Scott, 2011). Individual, social, and organizational factors are important to consider in explaining students' behavioral intentions and usage of e-learning environments (Tarhini, Hone, & Liu, 2015). Regarding the theory of planned behavior, major variables (e.g., general-person characteristics, motivation to learn, general and task-specific self-efficacy, situational barriers and enablers, and instructional design characteristics) can predict effective participation in e-learning (Garavan, Carbery, O'Malley, & O'Donnell, 2010). Learning self-efficacy and compatibility of values with learning tools can contribute to the continued utilization of e-learning tools (Hung & Cho, 2008). E-learning strategies need to focus on improving staff awareness of e-learning methods that supplement rather than replace traditional teaching, while providing ongoing support and mentoring for development, technological training, and incentives for staff involvement (Blake, 2009). Extra effort by online instructors is needed to maximize the e-learning process through business games in online training (Hernández, Gorjup, & Cascón, 2010). Linehan et al. (2011) offered several recommendations for teachers organizing learning content, such as rapid learning feedback, balancing tasks with users' skills, experimentation with tasks, and the use of practical game mechanics in learning activities. The digitalization of educational resources and learning materials has enabled the reuse of these resources across countries and scholarly domains (Richter & McPherson, 2012). The usability of computer interfaces has a positive impact on online learning (Davids, Harvey, Halperin, & Chikte, 2015). Online learning pedagogy allows students to have a more accurate perception of the effectiveness of their own learning (Shohreh & Keesling, 2000), increasing student-to-teacher interaction as well as critical thinking (Hay, Peltier, & Drago, 2004). In the age of information explosion, e-learning recommender systems have emerged as effective information filtering techniques that attempt to provide the most suitable learning resources for learners using e-learning systems (Dwivedi & Bharadwaj, 2015). Online education provides students with an alternative to face-to-face courses, permitting them to proceed at their own pace and to set their own course timeline (Shanley, Thompson, Leuchner, & Zhao, 2004), which is particularly valuable for working students who hold full-time jobs. Interaction among peers is privileged by online students as a way of promoting a learning community (Moura, Cunha, Azeiteiro, Aires, & de Almeida, 2010). Flexibility, interaction, teaching presence, collaborative learning, and a strong sense of community are the important categories in online students' discourses (Hansen, 2008). Interaction is the major approach to achieving successful e-learning outcomes in engineering education (Martínez-Caro, 2011).
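To illustrate the recommender idea in the simplest possible terms, the sketch below ranks learning resources against a learner profile using tag overlap. The tags, profiles, and cosine-similarity ranking are assumptions for illustration and are far simpler than the unified learner profile approach of Dwivedi and Bharadwaj (2015).

# Minimal content-based recommender sketch for e-learning resources.
# Tags, profiles, and the similarity measure are illustrative assumptions.

import math

def cosine(a: dict, b: dict) -> float:
    """Cosine similarity between two sparse tag-weight vectors."""
    dot = sum(a[t] * b.get(t, 0.0) for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def recommend(learner_profile: dict, resources: dict, top_n: int = 2) -> list:
    """Rank resources by similarity of their tags to the learner profile."""
    ranked = sorted(resources.items(),
                    key=lambda kv: cosine(learner_profile, kv[1]),
                    reverse=True)
    return [name for name, _ in ranked[:top_n]]

resources = {
    "Intro to SQL video": {"databases": 1.0, "beginner": 0.8},
    "Advanced statistics notes": {"statistics": 1.0, "advanced": 0.9},
    "Python data analysis lab": {"python": 1.0, "statistics": 0.6, "beginner": 0.4},
}
learner = {"statistics": 0.9, "beginner": 0.7}
print(recommend(learner, resources))

Real e-learning recommenders typically combine such content-based signals with collaborative filtering over other learners' interactions.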
Important Factors of Electronic Learning
Technology is a basic infrastructure that enables the implementation of e-learning (Urh et al., 2015). The utilization of digital technologies and social networking has grown rapidly over recent decades, and these technologies are increasingly incorporated into higher education teaching (Garrison, 2011). Educational technologies can improve the involvement of new technologies in the educational system and can encourage the harmonization of necessary knowledge (Bedrule-Grigoruta & Rusua, 2014). However, perhaps because higher education institutions are resistant to change, educational technology in universities has not managed to match the ubiquity of technology in everyday life (White, 2007). An effective online learning environment should encourage contact between students and organizational staff, cooperation between students, feedback, active learning techniques, communication, and respect for the learning diversity of students (Shea, Pickett, & Pelz, 2003). Jara et al. (2012) indicated that online collaborative communication represents a constructivist method of transmitting knowledge and experience from the teacher to students, thus overcoming physical distance and isolation. Teachers should adopt strategies to change negative attitudes about e-learning by introducing more e-learning courses, and should encourage students to utilize the Internet in their education and in communication with teachers and colleagues, thereby promoting e-learning utilization (Brumini et al., 2014). Because e-learning requires learners to have a certain level of computer and Internet skills (Rosenberg, 2001), previous knowledge is an important factor for successful e-learning (Hay et al., 2008). Teachers need skills for operating and planning e-learning as an instructional medium (Triyono, 2015). Administration of e-learning requires knowledge of both technology and people (Urh et al., 2015). Administration of technology and people can be carried out by utilizing a learning management system (LMS). LMS tools are used for instructional tasks that are performed to promote students' learning activities (Schoonenboom, 2014). In addition, finance is an important part of the entire e-learning effort (Urh et al., 2015).
Relationships among Electronic Learning Quality, Learning Satisfaction, and Learning Motivation
Nowadays, various factors affect students' feelings of satisfaction and dissatisfaction with their e-learning experiences in modern education (Youn & Vachon, 2005). Many universities are pursuing increases in online course offerings as a method of offsetting the rising costs of providing high-quality educational opportunities and of better serving their student populations (Albert & Johnson, 2011). Compared with traditional face-to-face education, e-learning can alleviate the restrictions of time and space, thereby promoting students' satisfaction and learning motivation (Lee & Lee, 2015). Students with higher motivational orientations perform better in online group discussions (Zhu, Valcke, Schellens, & Li, 2009). Knowledge, information, and learning technologies are recognized as essential tools for achieving effective e-learning quality (Ehlers & Hilera, 2012). Sun et al. (2008) indicated that there are six antecedents of perceived e-learner satisfaction: learner dimensions (e.g., learner attitude toward computers, learner computer anxiety, learner Internet self-efficacy), instructor dimensions (e.g., instructor response timeliness, instructor attitude toward e-learning), learning course dimensions (e.g., e-learning course flexibility, e-learning course quality), technology dimensions (e.g., technology quality and Internet quality), design dimensions (e.g., perceived usefulness and perceived ease of use), and environmental dimensions (e.g., diversity in assessment and learner perceived interaction with others). A high proportion of the variance in perceived usefulness is explained by perceived content quality, while perceived ease of use is explained by perceived system quality and anxiety (Calisir, Gumussoy, Bayraktaroglu, & Karaali, 2014). With e-learning, universities are trying to achieve learning goals such as a high degree of student satisfaction, motivation, and efficiency (Urh et al., 2015). Learners' satisfaction and learning results are influenced by learners' individual characteristics, learning content, the learning environment, various types of interaction, and instructor characteristics related to teaching style (Driver, 2002). Among learners' individual characteristics, motivation to learn has a close relationship with learners' satisfaction in e-learning (Lee & Lee, 2015). Students' learning styles affect their learning when solving complex problems in a case-based e-learning environment implemented in a conventional lecture-oriented classroom (Choi, Lee, & Kang, 2009). Since learners' learning motivation reflects the level of their willingness to learn the learning content, it directly affects the effectiveness of and satisfaction with e-learning (Tannenbaum, Mathieu, & Cannon-Bowers, 1991). Learning motivation leads to learning transfer in modern learning environments (Kasemsap, 2013). Interactions in online learning environments can be categorized into three types: interaction among learners, interaction between learners and learning content, and interaction between learners and instructors (Moore & Kearsley, 2011). A higher level of interaction with other learners and the instructor has been shown to promote learners' satisfaction with e-learning courses (Bolliger & Martindale, 2004). Interaction with instructors includes prompt responses to learners' questions, feedback on assignments, and encouragement to participate in e-learning activities.
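The multi-dimensional view of satisfaction above can be rolled up into a single indicator for monitoring purposes. The sketch below is only an illustration of such a roll-up across the six dimensions named by Sun et al. (2008); the item names, weights, and 1-7 scale are assumptions rather than the measurement model of that study.

# Illustrative roll-up of survey items into the six satisfaction dimensions
# discussed above. Weights and item scores (1-7 scale) are assumptions.

DIMENSION_WEIGHTS = {
    "learner": 0.15, "instructor": 0.20, "course": 0.20,
    "technology": 0.15, "design": 0.20, "environment": 0.10,
}

def dimension_mean(items: dict) -> float:
    """Average the item scores belonging to one dimension."""
    return sum(items.values()) / len(items)

def satisfaction_index(responses: dict) -> float:
    """Weighted mean of dimension scores, normalized to 0-1."""
    weighted = sum(DIMENSION_WEIGHTS[d] * dimension_mean(items)
                   for d, items in responses.items())
    return weighted / 7.0  # divide by the top of the 1-7 scale

responses = {
    "learner": {"computer_attitude": 6, "internet_self_efficacy": 5},
    "instructor": {"response_timeliness": 4, "attitude_toward_elearning": 6},
    "course": {"flexibility": 7, "quality": 5},
    "technology": {"system_quality": 5, "internet_quality": 6},
    "design": {"perceived_usefulness": 6, "perceived_ease_of_use": 5},
    "environment": {"assessment_diversity": 4, "perceived_interaction": 5},
}
print(f"satisfaction index: {satisfaction_index(responses):.2f}")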
Implementation of Electronic Learning
The implementation of e-learning as a strategy has risen exponentially over the last 20 years as more adults utilize this medium to enhance their skills and acquire knowledge (Arthur-Mensah & Shuck, 2014). Strategic planning of e-learning implementation includes decision making about the suitable pattern of implementing e-learning at different levels in an institution (Begičević, Divjak, & Hunjak, 2007). Many countries are integrating ICT in education to enhance the learner's experience of learning (Pagram & Pagram, 2006). However, e-learning is still in its early stages of adoption and implementation in developing countries (Farid et al., 2015). With the growing demand for e-learning and the striving for excellence associated with globalization, there are worldwide calls for enhancing and assuring quality in e-learning, specifically in the context of developing countries (Masoumi & Lindström, 2012). The overall success of an e-learning initiative depends on attaining success at each of three stages of e-learning systems development: system design, system delivery, and system outcome (Holsapple & Lee-Post, 2006). The e-learning maturity model (eMM) provides a framework for e-learning quality improvement that measures the capability of institutions to sustainably engage in e-learning and visualizes that capability in a way that helps leaders and managers accomplish systematic improvements in their institution's e-learning activities (Marshall, 2012). Lykourentzou et al. (2009) indicated that the increasing popularity of e-learning has created an educational need for accurate student achievement prediction mechanisms, allowing instructors to improve the efficiency of their courses by addressing the specific needs of their students at an early stage. As ranking e-learning resources according to quality criteria is beneficial for end users, e-learning quality measures play a prominent role in modern education (Pons, Hilera, Fernandez, & Pages, 2015). Developing countries face implementation challenges that are quite different from those of developed countries (Bhuasiri, Xaymoungkhoun, Zo, Rho, & Ciganek, 2012). Many developing countries are eager to implement the e-learning paradigm (Gronlund & Islam, 2010) but are experiencing issues with resources, infrastructure, Internet access, institutional support, personal characteristics, and the learning culture in the promotion of e-learning (Nawaz, 2012). The intense use of ICT in the education sector of developed countries has led to the establishment of completely ICT-based universities called virtual universities (Farid et al., 2015). In the education sector, developing countries face problems such as a lack of skilled teachers, educational infrastructure, and technology access to enhance education at different levels (Qureshi, Ilyas, Yasmin, & Whitty, 2012). To date, research into factors affecting the acceptance of e-learning has focused predominantly on students at higher education institutions (Hrtonova, Kohout, Rohlikova, & Zounek, 2015). By aligning principles of the diffusion of innovations adoption model and the concerns-based adoption model (Hall, 1997), Collis and Moonen (2001) utilized the 4E Model (i.e., educational effectiveness, ease of use, engagement, and environment) to describe the likelihood of technology acceptance in education.
Adopting strategic principles for implementing e-learning has the potential to provide authentic e-learning (Herrington, Reeves, & Oliver, 2010), thus improving educational delivery and reducing costs (Sharpe, Benfield, & Francis, 2006). Research and institutional experiences have identified various factors for successful e-learning implementation (Lin, Ma, & Lin, 2011), which are related to the components of managers, instructors, students, technology, and pedagogy.
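Lykourentzou et al. (2009) built their early prediction mechanisms with neural networks; as a deliberately simpler stand-in, the hedged sketch below flags potentially at-risk students from early-course signals using logistic regression. The features, toy data, and risk threshold are assumptions for illustration, not the cited study's method or dataset.

# Simplified early-warning sketch: predict course failure from early-course
# signals. Features and toy data are illustrative; the cited work used
# neural networks on real course data.

import numpy as np
from sklearn.linear_model import LogisticRegression

# Columns: [first-quiz score (0-100), logins in first two weeks, forum posts]
X_train = np.array([
    [85, 12, 5], [40, 3, 0], [72, 9, 2], [30, 2, 1],
    [90, 15, 7], [55, 4, 0], [65, 8, 3], [25, 1, 0],
])
y_train = np.array([0, 1, 0, 1, 0, 1, 0, 1])  # 1 = failed the course

clf = LogisticRegression(max_iter=1000).fit(X_train, y_train)

# Flag current students whose predicted failure risk exceeds a threshold.
current = {"alice": [78, 10, 4], "bob": [35, 2, 0]}
for name, feats in current.items():
    risk = clf.predict_proba([feats])[0, 1]
    if risk > 0.5:
        print(f"{name}: at risk (p={risk:.2f}) -> instructor follow-up")
    else:
        print(f"{name}: on track (p={risk:.2f})")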
Approaches and Barriers to Electronic Learning Utilization
E-learning has been adopted because it goes beyond the role limitations established by traditional learning formats (Oiry, 2009). E-learning incorporates a wide range of educational media and activities designed to engage the learner emotionally, psychologically, and physically (Peterson, Robinson, Verrall, & Quested, 2008) and provides students with opportunities for self-directed learning (Gaikwad & Tankhiwale, 2014). However, there are many obstacles to e-learning, such as limited social interaction, technology problems, and the quality of learning content (Pintar, Jereb, Vukovic, & Urh, 2015). In the presence of a large number of students and e-learning websites, the tasks of learning agents become difficult given limited educational resources (Rosaci & Sarné, 2010). For learners in distributed e-learning environments, it is difficult to locate the right peer for collaboration on the right knowledge, at the right time, and in the right way (Zheng & Yano, 2007). Limited computer access, IT skills, technical issues, and poor peer commitment negatively affect the utilization of e-learning tools in modern education (Moule, Ward, & Lockyer, 2010). Cost, the time to produce interventions, and the training requirements for both educators and learners are considered barriers to e-learning utilization (Gordon, Chandratilake, & Baker, 2013). An e-learning environment that supports social network awareness is a highly effective method of increasing peer interaction and assisting student learning by raising awareness of peers' social and learning contexts (Lin, Huang, & Chuang, 2015). Agent technology plays an important role in today's industrial software development and brings advantages to the development of educational applications (Hammami & Mathkour, 2015). The main solutions to enhance e-learning systems include standardization, strategies, funding, integration of e-learning into the curriculum, blended teaching, user-friendly packages, access to technology, and skills training (Childs, Blenkinsopp, Hall, & Walton, 2005).
Electronic Learning for Medical and Nursing Education
Technology is increasingly applied in an attempt to enhance teaching and learning in medical education, from websites and virtual learning environments (VLEs) to interactive online tutorials, blogs, and podcasts (Asarbakhsh & Sandars, 2013). E-learning plays an increasingly important role in medical education (Goh & Clapham, 2014). As many medical schools focus on student-centered learning strategies, e-learning provides a helpful approach to promoting clinical decision-making skills in a case-based way (Abendroth, Harendza, & Riemer, 2013). E-learning makes it possible to provide quick feedback on prescribing decisions, and this will improve with advances in virtual reality in medical education (Maxwell & Mucklow, 2012). Beeckman et al. (2008) stated that an e-learning program can increase the classification skills of qualified nurses and nursing students. Health information professionals are key to the development of online learning and teaching activities in medical and nursing education (Bury, Martin, & Roberts, 2006). Situated e-learning is a helpful adjunct to traditional learning for medical and nursing students (Feng et al., 2013). E-learning has limited value in teaching complex anatomy to novice learners, but good value in teaching clinical knowledge and medical skills (Fung, 2015). The use of instructional videos to teach clinical skills is a growing area of e-learning based on observational learning, which is recognized as one of the most powerful learning strategies in medical education (Cooper & Higgins, 2015). Changes in the World Wide Web, with a shift to more social-networking activity in modern education and to web-based delivery on ubiquitous and portable devices, significantly increase the learning opportunities for surgical e-learning (Larvin, 2009). Ruiz et al. (2007) stated that e-learning provides a relatively new approach to addressing geriatrics educators' concerns, such as the shortage of professionals trained to care for older people, overcrowded medical curricula, the move to transfer teaching venues to community settings, and the switch to competency-based education models in medical education. Generating e-learning modules aimed at gaps in quality of care is feasible and acceptable to medical learners (Kobewka et al., 2014).
Significance of Electronic Learning in Modern Education
In recent years, e-learning has become an increasingly important method in modern education (Tan et al., 2014). Rapid developments in the use of ICT in higher education require effective methods for evaluating the contribution of such tools to student learning, especially when they complement a face-to-face experience (Ginns & Ellis, 2009). E-learning is a widely used technology in today's teaching environment (Dow, Li, Huang, & Hsuan, 2014). As the latest stage of learning and training evolution, e-learning is expected to provide intelligent functionality for processing multimedia education resources and supporting context-sensitive pedagogical processes (Huang & Mille, 2006). E-learning can be utilized for in-class teaching and learning if the learning techniques are suitable for the teaching goals and allow effective student-teacher interaction (Oztekin, Delen, Turkyilmaz, & Zaim, 2013). When an e-learning program is founded on a platform of evidence-based practice, it is easily transferable to an international context (Sinclair, Schoch, Black, & Woods, 2011). E-learning provides a bridge between the cutting edge of education and training and outdated procedures embedded in institutions and professional organizations (Harden, 2005). Most e-learning studies propose possible improvements in course material (Violante & Vezzetti, 2014). The development of e-learning as a teaching strategy in higher education has significant implications concerning student learning, the role of the teacher, and the institution of higher education (Muirhead, 2007). E-learning courses on postgraduate educational supervision can be employed for continuing professional development (Gupta et al., 2012). E-learning tools have been used to enhance conventional courses in higher education institutions, creating hybrid e-learning modules that aim at improving students' learning experiences (Ahmed, 2010). The advantages of e-learning for teachers are improved distribution of learning content, ease of update, standardization, and tracking of learners' activities (Maxwell & Mucklow, 2012). E-learning processes can support the development of dynamic capabilities (Costello & McNaughton, 2016). Various aspects of accessibility, flexibility, interactivity, personalization, and productivity should be embedded in all levels of management and services within the field of e-learning in higher education (Ossiannilsson & Landgren, 2012). In a web-based learning environment, interactivity has been referred to as the most important element of successful e-learning (Violante & Vezzetti, 2015). Significant factors identified as influencing the adoption of e-learning in modern education include institutional infrastructure, staff attitudes and skills, and perceived student expectations (King & Boyatt, 2015). Effective learning and knowledge management concepts are necessary as part of lifelong learning for individual and organizational spaces (Punie, 2007). Lifelong learning and knowledge management have become a fundamental goal of modern educational policies (Kasemsap, 2016c). E-learning facilitates lifelong learning (Chen, 2014) and utilizes electronic communication for teaching and learning from a distance. Azeiteiro et al. (2015) indicated that e-learning in higher education can be of great relevance to effective lifelong learning education for sustainable development in a student population. Trust and collective learning are useful features enabled by effective collaborative leadership of e-learning projects across higher education institutions promoting lifelong learning (Jameson, Ferrell, Kelly, Walker, & Ryan, 2006). An e-learning system that is independent of time and place (Lee & Lee, 2008), a self-regulated learning process (Narciss, Proske, & Korndle, 2007), and an interdisciplinary approach to teaching and learning constitute key factors in education for sustainable development (ESD) (Lozano, Lozano, Mulder, Huisingh, & Waas, 2013). E-learning has been used for ESD, in particular in the context of lifelong learning and adult education, and many studies have been conducted to assess the outcomes of e-learning for sustainable development in higher education (Azeiteiro, Bacelar-Nicolau, Caetano, & Caeiro, 2015). E-learning brings new dimensions to traditional education when it comes to adult learning, and it increases students' learning motivation regarding environmental issues (Azeiteiro et al., 2015).
FUTURE RESEARCH DIRECTIONS
The classification of the extensive literature in the domain of e-learning provides potential opportunities for future research. The goals of learning analytics are to build better pedagogies, empower students to take an active part in their learning, target at-risk student populations, and evaluate factors affecting completion and student success (Kasemsap, 2016d). Communities of practice (CoPs) help promote a growing cycle of knowledge-sharing activities that allow members to regularly meet, reflect, and evolve in the knowledge management environment (Kasemsap, 2016e). Big data refers to very large sets of data produced by people using the Internet, which can only be stored, understood, and utilized with the help of special tools and methods (Kasemsap, 2016f). Digital libraries comprise digital collections, services, and infrastructure that educationally support lifelong learning, research, and the conservation of recorded knowledge (Kasemsap, 2016g). Educational computer games can motivate students to develop basic competencies and encourage them to challenge themselves and learn additional knowledge related to important tasks (Kasemsap, 2017b). Web mining is the application of data mining techniques to discover interesting patterns in web data in order to better serve the needs of multifaceted web-based applications (Kasemsap, 2017c). An examination of the linkages among e-learning, learning analytics, CoPs, big data, digital libraries, educational computer games, and web mining in modern education would seem viable for future research efforts.
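As a small illustration of how web mining and learning analytics intersect in an e-learning setting, the sketch below scans an LMS-style clickstream log for simple usage patterns; the log format, field names, and engagement threshold are assumptions made purely for illustration.

# Toy web-usage-mining pass over an LMS-style clickstream log.
# Log format, fields, and the "low engagement" threshold are assumptions.

from collections import Counter, defaultdict

clickstream = [
    # (student, resource, action)
    ("s1", "lecture_3.mp4", "view"), ("s1", "quiz_3", "attempt"),
    ("s2", "lecture_3.mp4", "view"), ("s2", "forum", "post"),
    ("s3", "quiz_3", "attempt"), ("s1", "forum", "post"),
    ("s2", "lecture_4.mp4", "view"), ("s1", "lecture_4.mp4", "view"),
]

# Pattern 1: most frequently accessed resources (simple frequency mining).
resource_counts = Counter(resource for _, resource, _ in clickstream)
print("top resources:", resource_counts.most_common(2))

# Pattern 2: per-student activity counts, used to flag low engagement.
activity = defaultdict(int)
for student, _, _ in clickstream:
    activity[student] += 1

LOW_ENGAGEMENT = 2  # assumed cutoff on logged events
for student, events in sorted(activity.items()):
    flag = " (low engagement)" if events <= LOW_ENGAGEMENT else ""
    print(f"{student}: {events} events{flag}")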
CONCLUSION
This chapter highlighted the overview of e-learning; the emerging trends in e-learning; the important factors of e-learning; the relationships among e-learning quality, learning satisfaction, and learning motivation; the implementation of e-learning; the approaches and barriers to e-learning utilization; e-learning for medical and nursing education; and the significance of e-learning in modern education.
E-learning is an affordable solution that provides learners with the ability to fit learning around their lifestyles. E-learning facilitates learning without requiring the individual who is interested in an educational course to organize when and where to be present. Compared with the traditional mode of classroom learning, e-learning can bring faster delivery, lower costs, more effective learning, and a lower environmental impact in modern learning environments. E-learning allows each individual to tackle the subject at their own pace, with interactive tasks set in place to ensure a thorough understanding throughout each module. In comparison with traditional learning, e-learning allows for easier access to online resources, databases, periodicals, journals, and other learning materials. E-learning allows both instructors and students to teach and learn anywhere and at any time. An effective e-learning environment utilizes both learning and training principles throughout its curriculum. This allows instructors to provide their learners with the tools to tackle current issues, develop lifelong skills, improve their problem-solving skills, and utilize e-learning resources to the best of their educational ability. Utilizing e-learning has the potential to increase educational performance and help reach strategic goals in modern education.
REFERENCES
Abendroth, M., Harendza, S., & Riemer, M. (2013). Clinical decision making: A pilot e-learning study. The Clinical Teacher, 10(1), 51–55. doi:10.1111/j.1743-498X.2012.00629.x PMID:23294745 Acampora, G., Gaeta, M., & Loia, V. (2011). Combining multi-agent paradigm and memetic computing for personalized and adaptive learning experiences. Computational Intelligence, 27(2), 141–165. doi:10.1111/j.1467-8640.2010.00367.x Ahmad, T., Härdle, W., Klinke, S., & Alawadhi, S. (2013). Using wiki to build an e-learning system in statistics in the Arabic language. Computational Statistics, 28(2), 481–491. doi:10.1007/s00180-012-0312-6 Albert, L. J., & Johnson, C. S. (2011). Socioeconomic status– and gender-based differences in students’ perceptions of e-learning systems. Decision Sciences Journal of Innovative Education, 9(3), 421–436. doi:10.1111/j.1540-4609.2011.00320.x Alonso, F., López, G., Manrique, D., & Viñes, J. M. (2005). An instructional model for web-based e-learning education with a blended learning process approach. British Journal of Educational Technology, 36(2), 217–235. doi:10.1111/j.1467-8535.2005.00454.x Arthur-Mensah, N., & Shuck, B. (2014). Learning in developing countries: Implications for workforce training and development in Africa. New Horizons in Adult Education and Human Resource Development, 26(4), 41–46. doi:10.1002/nha3.20084 Asarbakhsh, M., & Sandars, J. (2013). E-learning: The essential usability perspective. The Clinical Teacher, 10(1), 47–50. doi:10.1111/j.1743-498X.2012.00627.x PMID:23294744 Awidi, I. T., & Cooper, M. (2015). Using management procedure gaps to enhance e-learning implementation in Africa. Computers & Education, 90, 64–79. doi:10.1016/j.compedu.2015.08.003
Azeiteiro, U. M., Bacelar-Nicolau, P., Caetano, F. J. P., & Caeiro, S. (2015). Education for sustainable development through e-learning in higher education: Experiences from Portugal. Journal of Cleaner Production, 106, 308–319. doi:10.1016/j.jclepro.2014.11.056 Bates, A. W. T. (2005). Technology, e-learning and distance education. London, UK: Routledge. doi:10.4324/9780203463772 Bedrule-Grigoruta, M. V., & Rusua, M. L. (2014). Considerations about e-learning tools for adult education. Procedia: Social and Behavioral Sciences, 142, 749–754. doi:10.1016/j.sbspro.2014.07.610 Beeckman, D., Schoonhoven, L., Boucqué, H., van Maele, G., & Defloor, T. (2008). Pressure ulcers: E-learning to improve classification by nurses and nursing students. Journal of Clinical Nursing, 17(13), 1697–1707. doi:10.1111/j.1365-2702.2007.02200.x PMID:18592624 Begičević, N., Divjak, B., & Hunjak, T. (2007). Prioritization of e-learning forms: A multicriteria methodology. Central European Journal of Operations Research, 15(4), 405–419. doi:10.1007/s10100007-0039-6 Bharuthram, S., & Kies, C. (2013). Introducing e-learning in a South African higher education institution: Challenges arising from an intervention and possible responses. British Journal of Educational Technology, 44(3), 410–420. doi:10.1111/j.1467-8535.2012.01307.x Bhuasiri, W., Xaymoungkhoun, O., Zo, H., Rho, J. J., & Ciganek, A. P. (2012). Critical success factors for e-learning in developing countries: A comparative analysis between ICT experts and faculty. Computers & Education, 58(2), 843–855. doi:10.1016/j.compedu.2011.10.010 Blake, H. (2009). Staff perceptions of e-learning for teaching delivery in healthcare. Learning in Health and Social Care, 8(3), 223–234. doi:10.1111/j.1473-6861.2009.00213.x Bolliger, D. U., & Martindale, T. (2004). Key factors for determining student satisfaction in online courses. International Journal on E-Learning, 3(1), 61–67. Brumini, G., Špalj, S., Mavrinac, M., Biočina-Lukenda, D., Strujić, M., & Brumini, M. (2014). Attitudes towards e-learning amongst dental students at the universities in Croatia. European Journal of Dental Education, 18(1), 15–23. doi:10.1111/eje.12068 PMID:24423171 Bury, R., Martin, L., & Roberts, S. (2006). Achieving change through mutual development: Supported online learning and the evolving roles of health and information professionals. Health Information and Libraries Journal, 23(Suppl. 1), 22–31. doi:10.1111/j.1471-1842.2006.00677.x PMID:17206993 Büyüközkan, G., Ruan, D., & Feyzioğlu, O. (2007). Evaluating e-learning web site quality in a fuzzy environment. International Journal of Intelligent Systems, 22(5), 567–586. doi:10.1002/int.20214 Calisir, F., Gumussoy, C. A., Bayraktaroglu, A. E., & Karaali, D. (2014). Predicting the intention to use a web-based learning system: Perceived content quality, anxiety, perceived system quality, image, and the technology acceptance model. Human Factors and Ergonomics in Manufacturing & Service Industries, 24(5), 515–531. doi:10.1002/hfm.20548 Chen, T. L. (2014). Exploring e-learning effectiveness perceptions of local government staff based on the diffusion of innovations model. Administration & Society, 46(4), 450–466. doi:10.1177/0095399713482313
Cheng, C. H., Wei, L. Y., & Chen, Y. H. (2011). A new e-learning achievement evaluation model based on rough set and similarity filter. Computational Intelligence, 27(2), 260–279. doi:10.1111/j.14678640.2011.00380.x Childs, S., Blenkinsopp, E., Hall, A., & Walton, G. (2005). Effective e-learning for health professionals and students—barriers and their solutions: A systematic review of the literature—findings from the HeXL project. Health Information and Libraries Journal, 22(Suppl. 2), 20–32. doi:10.1111/j.14703327.2005.00614.x PMID:16279973 Choi, I., Lee, S. J., & Kang, J. (2009). Implementing a case-based e-learning environment in a lectureoriented anaesthesiology class: Do learning styles matter in complex problem solving over time? British Journal of Educational Technology, 40(5), 933–947. doi:10.1111/j.1467-8535.2008.00884.x Collis, B., & Moonen, J. (2001). Flexible learning in a digital world: Experiences and expectations. London, UK: Kogan Page. Cooper, D., & Higgins, S. (2015). The effectiveness of online instructional videos in the acquisition and demonstration of cognitive, affective and psychomotor rehabilitation skills. British Journal of Educational Technology, 46(4), 768–779. doi:10.1111/bjet.12166 Costello, J. T., & McNaughton, R. B. (2016). Can dynamic capabilities be developed using workplace e-learning processes? Knowledge and Process Management, 23(1), 73–87. doi:10.1002/kpm.1500 Cox, M. J. (2013). Formal to informal learning with IT: Research challenges and issues for e-learning. Journal of Computer Assisted Learning, 29(1), 85–105. doi:10.1111/j.1365-2729.2012.00483.x Davids, M. R., Harvey, J., Halperin, M. L., & Chikte, U. M. E. (2015). Determining the number of participants needed for the usability evaluation of e-learning resources: A Monte Carlo simulation. British Journal of Educational Technology, 46(5), 1051–1055. doi:10.1111/bjet.12336 de Freitas, S. (2007). Post-16 e-learning content production: A synthesis of the literature. British Journal of Educational Technology, 38(2), 349–364. doi:10.1111/j.1467-8535.2006.00632.x Deborah, L. J., Baskaran, R., & Kannan, A. (2014). Learning styles assessment and theoretical origin in an e-learning scenario: A survey. Artificial Intelligence Review, 42(4), 801–819. doi:10.1007/s10462012-9344-0 Decman, M. (2015). Modeling the acceptance of e-learning in mandatory environments of higher education: The influence of previous education and gender. Computers in Human Behavior, 49, 272–281. doi:10.1016/j.chb.2015.03.022 Dow, C. R., Li, Y. H., Huang, L. H., & Hsuan, P. (2014). Development of activity generation and behavior observation systems for distance learning. Computer Applications in Engineering Education, 22(1), 52–62. doi:10.1002/cae.20528 Drigas, A. S., Ioannidou, R. E., Kokkalia, G., & Lytras, M. D. (2014). ICTs, mobile learning and social media to enhance learning for attention difficulties. Journal of Universal Computer Science, 20(10), 1499–1510.
Driver, M. (2002). Exploring student perceptions of group interaction and class satisfaction in the web enhanced classroom. The Internet and Higher Education, 5(1), 35–45. doi:10.1016/S1096-7516(01)00076-8 Dwivedi, P., & Bharadwaj, K. K. (2015). E-learning recommender system for a group of learners based on the unified learner profile approach. Expert Systems: International Journal of Knowledge Engineering and Neural Networks, 32(2), 264–276. doi:10.1111/exsy.12061 Ehlers, U. D., & Hilera, J. R. (2012). Special issue on quality in e-learning. Journal of Computer Assisted Learning, 28(1), 1–3. doi:10.1111/j.1365-2729.2011.00448.x Farid, S., Ahmad, R., Niaz, I. A., Arif, M., Shamshirband, S., & Khattak, M. D. (2015). Identification and prioritization of critical issues for the promotion of e-learning in Pakistan. Computers in Human Behavior, 51, 161–171. doi:10.1016/j.chb.2015.04.037 Feng, J. Y., Chang, Y. T., Chang, H. Y., Erdley, W. S., Lin, C. H., & Chang, Y. J. (2013). Systematic review of effectiveness of situated e-learning on medical and nursing education. Worldviews on EvidenceBased Nursing, 10(3), 174–183. doi:10.1111/wvn.12005 PMID:23510119 Fung, K. (2015). Otolaryngology–head and neck surgery in undergraduate medical education: Advances and innovations. The Laryngoscope, 125(Suppl. 2), S1–S14. doi:10.1002/lary.24875 PMID:25124523 Gaikwad, N., & Tankhiwale, S. (2014). Interactive e-learning module in pharmacology: A pilot project at a rural medical college in India. Perspectives on Medical Education, 3(1), 15–30. doi:10.1007/s40037013-0081-0 PMID:24072666 Garavan, T. N., Carbery, R., O’Malley, G., & O’Donnell, D. (2010). Understanding participation in elearning in organizations: A large-scale empirical study of employees. International Journal of Training and Development, 14(3), 155–168. doi:10.1111/j.1468-2419.2010.00349.x Garrison, D. R. (2011). E-learning in the 21st century: A framework for research and practice. Marceline MO: Walsworth Publishing Company. Ginns, P., & Ellis, R. A. (2009). Evaluating the quality of e-learning at the degree level in the student experience of blended learning. British Journal of Educational Technology, 40(4), 652–663. doi:10.1111/ j.1467-8535.2008.00861.x Goh, J., & Clapham, M. (2014). Attitude to e–learning among newly qualified doctors. The Clinical Teacher, 11(1), 20–23. doi:10.1111/tct.12117 PMID:24405914 Gordon, M., Chandratilake, M., & Baker, P. (2013). Low fidelity, high quality: A model for e-learning. The Clinical Teacher, 10(4), 258–263. doi:10.1111/tct.12008 PMID:23834573 Gronlund, A., & Islam, Y. M. (2010). A mobile e-learning environment for developing countries: The Bangladesh virtual interactive classroom. Information Technology for Development, 16(4), 244–259. doi:10.1080/02681101003746490 Gunga, S. O., & Ricketts, I. W. (2007). Facing the challenges of e-learning initiatives in African universities. British Journal of Educational Technology, 38(5), 896–906. doi:10.1111/j.1467-8535.2006.00677.x
Gupta, P., Thangaratinam, S., Shehmar, M., Gee, H., Karri, K., Bondili, A., & Khan, K. S. (2012). An electronic training-the-trainers programme: Developing resources for training in educational supervision in obstetrics and gynaecology. The Obstetrician & Gynaecologist, 14(1), 39–44. doi:10.1111/j.17444667.2011.00087.x Hall, G. E. (1997). Stages of concern. Paper presented at the Annual Conference of the Association for Educational Communications and Technology (AECT ‘97), Albuquerque, NM, USA. Hammami, S., & Mathkour, H. (2015). Adaptive e-learning system based on agents and object petri nets (AELS-A/OPN). Computer Applications in Engineering Education, 23(2), 170–190. doi:10.1002/ cae.21587 Hansen, D. E. (2008). Knowledge transfer in online learning environments. Journal of Marketing Education, 30(2), 93–105. doi:10.1177/0273475308317702 Harden, R. M. (2005). A new vision for distance learning and continuing medical education. The Journal of Continuing Education in the Health Professions, 25(1), 43–51. doi:10.1002/chp.8 PMID:16078802 Hay, A., Peltier, J., & Drago, W. (2004). Reflective learning and on-line education: A comparison of traditional and on-line MBA students. Strategic Change, 13(4), 169–182. doi:10.1002/jsc.680 Hay, D. B., Kehoe, C., Miquel, M. E., Hatzipanagos, S., Kinchin, I. M., Keevil, S. F., & Lygo-Baker, S. (2008). Measuring the quality of e-learning. British Journal of Educational Technology, 39(6), 1037–1056. doi:10.1111/j.1467-8535.2007.00777.x Hernández, A. B., Gorjup, M. T., & Cascón, R. (2010). The role of the instructor in business games: A comparison of face-to-face and online instruction. International Journal of Training and Development, 14(3), 169–179. doi:10.1111/j.1468-2419.2010.00350.x Herrington, J., Reeves, T. C., & Oliver, R. (2010). A guide to authentic e-learning. New York, NY: Routledge. Holsapple, C. W., & Lee-Post, A. (2006). Defining, assessing, and promoting e-learning success: An information systems perspective. Decision Sciences Journal of Innovative Education, 4(1), 67–85. doi:10.1111/j.1540-4609.2006.00102.x Hrtonova, N., Kohout, J., Rohlikova, L., & Zounek, J. (2015). Factors influencing acceptance of elearning by teachers in the Czech Republic. Computers in Human Behavior, 51, 873–879. doi:10.1016/j. chb.2014.11.018 Huang, W., & Mille, A. (2006). ConKMeL: A contextual knowledge management framework to support multimedia e-learning. Multimedia Tools and Applications, 30(2), 205–219. doi:10.1007/s11042-0060024-4 Huang, W., Webster, D., Wood, D., & Ishaya, T. (2006). An intelligent semantic e-learning framework using context-aware Semantic Web technologies. British Journal of Educational Technology, 37(3), 351–373. doi:10.1111/j.1467-8535.2006.00610.x
Hung, H., & Cho, V. (2008). Continued usage of e-learning communication tools: A study from the learners’ perspective in Hong Kong. International Journal of Training and Development, 12(3), 171–187. doi:10.1111/j.1468-2419.2008.00302.x Hung, J. I. (2012). Trends of e-learning research from 2000 to 2008: Use of text mining and bibliometrics. British Journal of Educational Technology, 43(1), 5–16. doi:10.1111/j.1467-8535.2010.01144.x Islam, A. K. M. N. (2014). Sources of satisfaction and dissatisfaction with a learning management system: A critical incident approach. Computers in Human Behavior, 30(1), 249–261. doi:10.1016/j. chb.2013.09.010 Islam, A. K. M. N. (2016). E-learning system use and its outcomes: Moderating role of perceived compatibility. Telematics and Informatics, 33(1), 48–55. doi:10.1016/j.tele.2015.06.010 Jameson, J., Ferrell, G., Kelly, J., Walker, S., & Ryan, M. (2006). Building trust and shared knowledge in communities of e-learning practice: Collaborative leadership in the JISC eLISA and CAMEL lifelong learning projects. British Journal of Educational Technology, 37(6), 949–967. doi:10.1111/j.14678535.2006.00669.x Jara, C. A., Candelas, F. A., Torres, F., Dormido, S., & Esquembre, F. (2012). Synchronous collaboration of virtual and remote laboratories. Computer Applications in Engineering Education, 20(1), 124–136. doi:10.1002/cae.20380 Jovanovic, D., & Jovanovic, S. (2015). An adaptive e-learning system for Java programming course, based on Dokeos LE. Computer Applications in Engineering Education, 23(3), 337–343. doi:10.1002/ cae.21603 Kasemsap, K. (2013). Practical framework: Creation of causal model of job involvement, career commitment, learning motivation, and learning transfer. International Journal of the Computer, the Internet and Management, 21(1), 29–35. Kasemsap, K. (2014). The role of social media in the knowledge-based organizations. In I. Lee (Ed.), Integrating social media into business practice, applications, management, and models (pp. 254–275). Hershey, PA, USA: IGI Global. doi:10.4018/978-1-4666-6182-0.ch013 Kasemsap, K. (2016a). The roles of e-learning, organizational learning, and knowledge management in the learning organizations. In E. Railean, G. Walker, A. Elçi, & L. Jackson (Eds.), Handbook of research on applied learning theory and design in modern education (pp. 786–816). Hershey, PA, USA: IGI Global. doi:10.4018/978-1-4666-9634-1.ch039 Kasemsap, K. (2016b). Exploring the role of web-based learning in global education. In M. Raisinghani (Ed.), Revolutionizing education through web-based instruction (pp. 202–224). Hershey, PA, USA: IGI Global. doi:10.4018/978-1-4666-9932-8.ch012 Kasemsap, K. (2016c). The roles of lifelong learning and knowledge management in global higher education. In P. Ordóñez de Pablos & R. Tennyson (Eds.), Impact of economic crisis on education and the next-generation workforce (pp. 71–100). Hershey, PA, USA: IGI Global. doi:10.4018/978-1-46669455-2.ch004
Kasemsap, K. (2016d). The role of learning analytics in global higher education. In M. Anderson & C. Gavan (Eds.), Developing effective educational experiences through learning analytics (pp. 282–307). Hershey, PA, USA: IGI Global. doi:10.4018/978-1-4666-9983-0.ch012 Kasemsap, K. (2016e). Utilizing communities of practice to facilitate knowledge sharing in the digital age. In S. Buckley, G. Majewski, & A. Giannakopoulos (Eds.), Organizational knowledge facilitation through communities of practice in emerging markets (pp. 198–224). Hershey, PA, USA: IGI Global. doi:10.4018/978-1-5225-0013-1.ch011 Kasemsap, K. (2016f). Mastering big data in the digital age. In M. Singh & D. G. (Eds.), Effective big data management and opportunities for implementation (pp. 104–129). Hershey, PA, USA: IGI Global. doi:10.4018/978-1-5225-0182-4.ch008 Kasemsap, K. (2016g). Mastering digital libraries in the digital age. In E. de Smet & S. Dhamdhere (Eds.), E-discovery tools and applications in modern libraries (pp. 275–305). Hershey, PA, USA: IGI Global. doi:10.4018/978-1-5225-0474-0.ch015 Kasemsap, K. (2017a). Encouraging continuing professional development and teacher professional development in global education. In R. Cintron, J. Samuel, & J. Hinson (Eds.), Accelerated opportunity education models and practices (pp. 168–202). Hershey, PA, USA: IGI Global. doi:10.4018/978-15225-0528-0.ch008 Kasemsap, K. (2017b). Mastering educational computer games, educational video games, and serious games in the digital age. In R. Alexandre Peixoto de Queirós & M. Pinto (Eds.), Gamification-based e-learning strategies for computer programming education (pp. 30–52). Hershey, PA, USA: IGI Global. doi:10.4018/978-1-5225-1034-5.ch003 Kasemsap, K. (2017c). Mastering web mining and information retrieval in the digital age. In A. Kumar (Ed.), Web usage mining techniques and applications across industries (pp. 1–28). Hershey, PA, USA: IGI Global. doi:10.4018/978-1-5225-0613-3.ch001 Kim, K., Trimi, S., Park, H., & Rhee, S. (2012). The impact of CMS quality on the outcomes of e-learning systems in higher education: An empirical study. Decision Sciences Journal of Innovative Education, 10(4), 575–587. doi:10.1111/j.1540-4609.2012.00360.x King, E., & Boyatt, R. (2015). Exploring factors that influence adoption of e-learning within higher education. British Journal of Educational Technology, 46(6), 1272–1280. doi:10.1111/bjet.12195 Kobewka, D., Backman, C., Hendry, P., Hamstra, S. J., Suh, K. N., Code, C., & Forster, A. J. (2014). The feasibility of e-learning as a quality improvement tool. Journal of Evaluation in Clinical Practice, 20(5), 606–610. doi:10.1111/jep.12169 PMID:24828785 Lahti, M. E., Kontio, R. M., & Välimäki, M. (2016). Impact of an e-learning course on clinical practice in psychiatric hospitals: Nurse managers’ views. Perspectives in Psychiatric Care, 52(1), 40–48. doi:10.1111/ppc.12100 PMID:25624098
Lara, J. A., Lizcano, D., Martinez, M. A., Pazos, J., & Riera, T. (2014). A system for knowledge discovery in e-learning environments within the European higher education area: Application to student data from Open University of Madrid, UDIMA. Computers & Education, 72, 23–36. doi:10.1016/j. compedu.2013.10.009 Larvin, M. (2009). E-learning in surgical education and training. ANZ Journal of Surgery, 79(3), 133–137. doi:10.1111/j.1445-2197.2008.04828.x PMID:19317777 Lee, J., & Lee, W. (2008). The relationship of e-learner’s self-regulatory efficacy and perception of e-learning environmental quality. Computers in Human Behavior, 24(1), 32–47. doi:10.1016/j.chb.2006.12.001 Lee, Y. J., & Lee, D. (2015). Factors influencing learning satisfaction of migrant workers in Korea with e-learning-based occupational safety and health education. Safety and Health at Work, 6(3), 211–217. doi:10.1016/j.shaw.2015.05.002 PMID:26929830 Li, Y., Duan, Y., Fu, Z., & Alford, P. (2012). An empirical study on behavioural intention to reuse e-learning systems in rural China. British Journal of Educational Technology, 43(6), 933–948. doi:10.1111/j.14678535.2011.01261.x Liegle, J. O., & Janicki, T. N. (2006). The effect of learning styles on the navigation needs of web-based learners. Computers in Human Behavior, 22(5), 885–898. doi:10.1016/j.chb.2004.03.024 Lim, Y. C., & Chiew, T. K. (2014). Creating reusable and interoperable learning objects for developing an e-learning system that supports remediation learning strategy. Computer Applications in Engineering Education, 22(2), 329–339. doi:10.1002/cae.20558 Limayem, M., & Cheung, C. M. K. (2011). Predicting the continued use of Internet-based learning technologies: The role of habit. Behaviour & Information Technology, 30(1), 91–99. doi:10.1080/014 4929X.2010.490956 Lin, C. C., Ma, Z., & Lin, R. C. P. (2011). Re-examining the critical success factors of e-learning from the EU perspective. International Journal of Management in Education, 5(1), 44–62. doi:10.1504/ IJMIE.2011.037754 Lin, J. W., Huang, H. H., & Chuang, Y. S. (2015). The impacts of network centrality and self-regulation on an e-learning environment with the support of social network awareness. British Journal of Educational Technology, 46(1), 32–44. doi:10.1111/bjet.12120 Linehan, C., Kirman, B., Lawson, S., & Chan, G. (2011). Practical, appropriate, empirically-validated guidelines for designing educational games. Paper presented the 29th Annual ACM Conference on Human Factors in Computing Systems (CHI ‘11), Vancouver, Canada. doi:10.1145/1978942.1979229 Liu, Y., & Wang, H. (2009). A comparative study on e-learning technologies and products: From the East to the West. Systems Research and Behavioral Science, 26(2), 191–209. doi:10.1002/sres.959 Lozano, R., Lozano, F., Mulder, K., Huisingh, D., & Waas, T. (2013). Advancing higher education for sustainable development: International insights and critical reflections. Journal of Cleaner Production, 48, 3–9. doi:10.1016/j.jclepro.2013.03.034
Lüdert, T., Nast, A., Zielke, H., Sterry, W., & Rzany, B. (2008). E-learning in the dermatological education at the Charité: Evaluation of the last three years. JDDG: Journal der Deutschen Dermatologischen Gesellschaft, 6(6), 467–472. doi:10.1111/j.1610-0387.2008.06738.x PMID:18400021 Ludwig, B., Bister, D., Schott, T. C., Lisson, J. A., & Hourfar, J. (2016). Assessment of two e-learning methods teaching undergraduate students cephalometry in orthodontics. European Journal of Dental Education, 20(1), 20–25. doi:10.1111/eje.12135 PMID:25560366 Lykourentzou, I., Giannoukos, I., Mpardis, G., Nikolopoulos, V., & Loumos, V. (2009). Early and dynamic student achievement prediction in e-learning courses using neural networks. Journal of the American Society for Information Science and Technology, 60(2), 372–380. doi:10.1002/asi.20970 MacLean, P., & Scott, B. (2011). Competencies for learning design: A review of the literature and a proposed framework. British Journal of Educational Technology, 42(4), 557–572. doi:10.1111/j.14678535.2010.01090.x Marković, S., Jovanović, Z., Jovanović, N., Jevremović, A., & Popović, R. (2013). Adaptive distance learning and testing system. Computer Applications in Engineering Education, 21(Suppl. 1), E2–E13. doi:10.1002/cae.20510 Marshall, S. (2012). Improving the quality of e-learning: Lessons from the eMM. Journal of Computer Assisted Learning, 28(1), 65–78. doi:10.1111/j.1365-2729.2011.00443.x Martínez-Caro, E. (2011). Factors affecting effectiveness in e-learning: An analysis in production management courses. Computer Applications in Engineering Education, 19(3), 572–581. doi:10.1002/cae.20337 Masoumi, D., & Lindström, B. (2012). Quality in e-learning: A framework for promoting and assuring quality in virtual institutions. Journal of Computer Assisted Learning, 28(1), 27–41. doi:10.1111/j.13652729.2011.00440.x Maxwell, S., & Mucklow, J. (2012). E-learning initiatives to support prescribing. British Journal of Clinical Pharmacology, 74(4), 621–631. doi:10.1111/j.1365-2125.2012.04300.x PMID:22509885 McGill, T. J., & Hobbs, V. J. (2008). How students and instructors using a virtual learning environment perceive the fit between technology and task. Journal of Computer Assisted Learning, 24(3), 191–202. doi:10.1111/j.1365-2729.2007.00253.x McGill, T. J., & Klobas, J. E. (2009). A task-technology fit view of learning management system impact. Computers & Education, 52(2), 496–508. doi:10.1016/j.compedu.2008.10.002 McPherson, M. A., & Nunes, J. M. (2008). Critical issues for e-learning delivery: What may seem obvious is not always put into practice. Journal of Computer Assisted Learning, 24(5), 433–445. doi:10.1111/j.1365-2729.2008.00281.x Moore, M. G., & Kearsley, G. (2011). Distance education: A systems view of online learning. Belmont, CA: Cengage Learning. Moule, P., Ward, R., & Lockyer, L. (2010). Nursing and healthcare students’ experiences and use of e-learning in higher education. Journal of Advanced Nursing, 66(12), 2785–2795. doi:10.1111/j.13652648.2010.05453.x PMID:20946565
Moura, A. P. M., Cunha, L. M., Azeiteiro, U. M., Aires, L., & de Almeida, M. D. V. (2010). Food consumer science post-graduate courses: Comparison of face-to-face versus online delivery systems. British Food Journal, 112(5), 544–556. doi:10.1108/00070701011043781 Mueller, D., & Strohmeier, S. (2010). Design characteristics of virtual learning environments: An expert study. International Journal of Training and Development, 14(3), 209–222. doi:10.1111/j.14682419.2010.00353.x Muirhead, R. J. (2007). E-learning: Is this teaching at students or teaching with students? Nursing Forum, 42(4), 178–184. doi:10.1111/j.1744-6198.2007.00085.x PMID:17944698 Muntean, C. H., & Muntean, G. M. (2009). Open corpus architecture for personalised ubiquitous elearning. Personal and Ubiquitous Computing, 13(3), 197–205. doi:10.1007/s00779-007-0189-5 Narciss, S., Proske, A., & Korndle, H. (2007). Promoting self-regulated learning in web-based learning environments. Computers in Human Behavior, 23(3), 1126–1144. doi:10.1016/j.chb.2006.10.006 Navimipour, N. J., & Zareie, B. (2015). A model for assessing the impact of e-learning systems on employees’ satisfaction. Computers in Human Behavior, 53, 475–485. doi:10.1016/j.chb.2015.07.026 Nawaz, A. (2012). E-learning experiences of HEIs in advanced states, developing countries and Pakistan. Universal Journal of Education and General Studies, 1(3), 72–83. Nichols, M. (2008). Institutional perspectives: The challenges of e-learning diffusion. British Journal of Educational Technology, 39(4), 598–609. doi:10.1111/j.1467-8535.2007.00761.x Normark, O. R., & Cetindamar, D. (2005). E-learning in a competitive firm setting. Innovations in Education and Teaching International, 42(4), 325–335. doi:10.1080/14703290500062581 Oiry, E. (2009). Electronic human resource management: Organizational responses to role conflicts created by e-learning. International Journal of Training and Development, 13(2), 111–123. doi:10.1111/j.14682419.2009.00321.x Ossiannilsson, E., & Landgren, L. (2012). Quality in e-learning: A conceptual framework based on experiences from three international benchmarking projects. Journal of Computer Assisted Learning, 28(1), 42–51. doi:10.1111/j.1365-2729.2011.00439.x Oztekin, A., Delen, D., Turkyilmaz, A., & Zaim, S. (2013). A machine learning-based usability evaluation method for eLearning systems. Decision Support Systems, 56, 63–73. doi:10.1016/j.dss.2013.05.003 Ozyurt, O., & Ozyurt, H. (2015). Learning style based individualized adaptive e-learning environments: Content analysis of the articles published from 2005 to 2014. Computers in Human Behavior, 52, 349–358. doi:10.1016/j.chb.2015.06.020 Ozyurt, O., Ozyurt, H., Guven, B., & Baki, A. (2014). The effects of UZWEBMAT on the probability unit achievement of Turkish eleventh grade students and the reasons for such effects. Computers & Education, 75, 1–18. doi:10.1016/j.compedu.2014.02.005 Pagram, P., & Pagram, J. (2006). Issues in e-learning: A Thai case study. The Electronic Journal of Information Systems in Developing Countries, 26(6), 1–8.
386
Electronic Learning
Peterson, D., Robinson, K., Verrall, T., & Quested, B. (2008). Experiences on e-learning projects. ISBT Science Series, 3(1), 175–182. doi:10.1111/j.1751-2824.2008.00163.x Peterson, D., Robinson, K., Verrall, T., Quested, B., & Saxon, B. (2007). E-learning and transfusion medicine. ISBT Science Series, 2(2), 27–32. doi:10.1111/j.1751-2824.2007.00107.x Pintar, R., Jereb, E., Vukovic, G., & Urh, M. (2015). Analysis of web sites for e-learning in the field of foreign exchange trading. Procedia: Social and Behavioral Sciences, 197, 245–254. doi:10.1016/j. sbspro.2015.07.131 Pons, D., Hilera, J. R., Fernandez, L., & Pages, C. (2015). Managing the quality of e-learning resources in repositories. Computer Applications in Engineering Education, 23(4), 477–488. doi:10.1002/cae.21619 Popescu, E. (2010). Adaptation provisioning with respect to learning styles in a web-based educational system: An experimental study. Journal of Computer Assisted Learning, 26(4), 243–257. doi:10.1111/ j.1365-2729.2010.00364.x Premlatha, K. R., & Geetha, T. V. (2015). Learning content design and learner adaptation for adaptive e-learning environment: A survey. Artificial Intelligence Review, 44(4), 443–465. doi:10.1007/s10462015-9432-z Punie, Y. (2007). Learning spaces: An ICT-enabled model of future learning in the knowledge-based society. European Journal of Education, 42(2), 185–199. doi:10.1111/j.1465-3435.2007.00302.x Qureshi, I. A., Ilyas, K., Yasmin, R., & Whitty, M. (2012). Challenges of implementing e-learning in a Pakistani university. Knowledge Management & E-Learning: An International Journal, 4(3), 310–324. Richter, T., & McPherson, M. (2012). Open educational resources: Education for the world? Distance Education, 33(2), 201–219. doi:10.1080/01587919.2012.692068 Rosaci, D., & Sarné, G. M. L. (2010). Efficient personalization of e-learning activities using a multi-device decentralized recommender system. Computational Intelligence, 26(2), 121–141. doi:10.1111/j.14678640.2009.00343.x Rosenberg, M. J. (2001). E-learning: Strategies for delivering knowledge in the digital age. New York, NY: McGraw–Hill. Ruiz, J. G., Teasdale, T. A., Hajjar, I., Shaughnessy, M., & Mintzer, M. J. (2007). The consortium of e-learning in geriatrics instruction. Journal of the American Geriatrics Society, 55(3), 458–463. doi:10.1111/j.1532-5415.2007.01095.x PMID:17341252 Schneckenberg, D., Ehlers, U., & Adelsberger, H. (2011). Web 2.0 and competence-oriented design of learning: Potentials and implications for higher education. British Journal of Educational Technology, 42(5), 747–762. doi:10.1111/j.1467-8535.2010.01092.x Schoonenboom, J. (2014). Using an adapted, task-level technology acceptance model to explain why instructors in higher education intend to use some learning management system tools more than others. Computers & Education, 71, 247–256. doi:10.1016/j.compedu.2013.09.016
387
Electronic Learning
Scott, K. M. (2013). Does a university teacher need to change e-learning beliefs and practices when using a social networking site? A longitudinal case study. British Journal of Educational Technology, 44(4), 571–580. doi:10.1111/bjet.12072 Shanley, E. L., Thompson, C. A., Leuchner, L. A., & Zhao, Y. (2004). Distance education is as effective as traditional education when teaching food safety. Food Service Technology, 4(1), 1–8. doi:10.1111/ j.1471-5740.2003.00071.x Sharpe, R., Benfield, G., & Francis, R. (2006). Implementing a university e-learning strategy: Levers for change within academic schools. ALT-J: Research in Learning Technology, 14(2), 135–151. doi:10.1080/09687760600668503 Shea, P. J., Pickett, A. M., & Pelz, W. E. (2003). A follow-up investigation of teaching presence in the SUNY learning network. Journal of Asynchronous Learning Networks, 7(2), 61–80. Shohreh, A. K., & Keesling, G. (2000). Development of a web-based Internet marketing course. Journal of Marketing Education, 22(2), 84–89. doi:10.1177/0273475300222002 Sinclair, P., Schoch, M., Black, K., & Woods, M. (2011). Proof of concept: Developing a peer reviewed, evidence-based, interactive e-learning programme. Journal of Renal Care, 37(2), 108–113. doi:10.1111/ j.1755-6686.2011.00217.x PMID:21561547 Smith, J. A., & Sivo, S. A. (2012). Predicting continued use of online teacher professional development and the influence of social presence and sociability. British Journal of Educational Technology, 43(6), 871–882. doi:10.1111/j.1467-8535.2011.01223.x Stein, S. J., Shephard, K., & Harris, I. (2011). Conceptions of e-learning and professional development for e-learning held by tertiary educators in New Zealand. British Journal of Educational Technology, 42(1), 145–165. doi:10.1111/j.1467-8535.2009.00997.x Sun, P. C., Tsai, R. J., Finger, G., Chen, Y. Y., & Yeh, D. (2008). What drives a successful e-Learning? An empirical investigation of the critical factors influencing learner satisfaction. Computers & Education, 50(4), 1183–1202. doi:10.1016/j.compedu.2006.11.007 Tan, W., Chen, S., Li, J., Li, L., Wang, T., & Hu, X. (2014). A trust evaluation model for e-learning systems. Systems Research and Behavioral Science, 31(3), 353–365. doi:10.1002/sres.2283 Tannenbaum, S. I., Mathieu, J. E., & Cannon-Bowers, J. A. (1991). Meeting trainees’ expectations: The influence of training fulfillment on the development of commitment, self-efficacy, and motivation. The Journal of Applied Psychology, 76(6), 759–769. doi:10.1037/0021-9010.76.6.759 Tarhini, A., Hone, K., & Liu, X. (2015). A cross-cultural examination of the impact of social, organisational and individual factors on educational technology acceptance between British and Lebanese university students. British Journal of Educational Technology, 46(4), 739–755. doi:10.1111/bjet.12169 Triyono, M. B. (2015). The indicators of instructional design for e-learning in Indonesian vocational high schools. Procedia: Social and Behavioral Sciences, 204, 54–61. doi:10.1016/j.sbspro.2015.08.109
388
Electronic Learning
Urh, M., Vukovic, G., Jereb, E., & Pintar, R. (2015). The model for introduction of gamification into e-learning in higher education. Procedia: Social and Behavioral Sciences, 197, 388–397. doi:10.1016/j. sbspro.2015.07.154 van Seters, J. R., Wellink, J., Tramper, J., Goedhart, M. J., & Ossevoort, M. A. (2012). A web-based adaptive tutor to teach PCR primer design. Biochemistry and Molecular Biology Education, 40(1), 8–13. doi:10.1002/bmb.20563 Violante, M. G., & Vezzetti, E. (2014). Implementing a new approach for the design of an e-learning platform in engineering education. Computer Applications in Engineering Education, 22(4), 708–727. doi:10.1002/cae.21564 Violante, M. G., & Vezzetti, E. (2015). Virtual interactive e-learning application: An evaluation of the student satisfaction. Computer Applications in Engineering Education, 23(1), 72–91. doi:10.1002/ cae.21580 White, S. (2007). Critical success factors for e-learning and institutional change: Some organisational perspectives on campus-wide e-learning. British Journal of Educational Technology, 38(5), 840–850. doi:10.1111/j.1467-8535.2007.00760.x Wilson, A. (2012). Effective professional development for e-learning: What do the managers think? British Journal of Educational Technology, 43(6), 892–900. doi:10.1111/j.1467-8535.2011.01248.x Woelber, J. P., Hilbert, T. S., & Ratka-Krüger, P. (2012). Can easy-to-use software deliver effective elearning in dental education? A randomised controlled study. European Journal of Dental Education, 16(3), 187–192. doi:10.1111/j.1600-0579.2012.00741.x PMID:22783845 Yalcinalp, S., & Gulbahar, Y. (2010). Ontology and taxonomy design and development for personalised web-based learning systems. British Journal of Educational Technology, 41(6), 883–896. doi:10.1111/ j.1467-8535.2009.01049.x Yasir, E. A. M., & Sami, M. S. (2011). An approach to adaptive e-learning hypermedia system based on learning styles (AEHS-LS): Implementation and evaluation. International Journal of Library and Information Science, 3(1), 15–28. Youn, S., & Vachon, M. (2005). An investigation of the profiles of satisfying and dissatisfying factors in e-learning. Performance Improvement Quarterly, 18(2), 97–113. Zheng, Y., & Yano, Y. (2007). A framework of context-awareness support for peer recommendation in the e-learning context. British Journal of Educational Technology, 38(2), 197–210. doi:10.1111/j.14678535.2006.00584.x Zhu, C., Valcke, M., Schellens, T., & Li, Y. (2009). Chinese students’ perceptions of a collaborative e-learning environment and factors affecting their performance: Implementing a Flemish e-learning course in a Chinese educational context. Asia Pacific Education Review, 10(2), 225–235. doi:10.1007/ s12564-009-9021-4
389
Electronic Learning
ADDITIONAL READING Alsabawy, A. Y., Cater-Steel, A., & Soar, J. (2013). IT infrastructure services as a requirement for elearning system success. Computers & Education, 69, 431–451. doi:10.1016/j.compedu.2013.07.035 Bremer, C. (2012). Enhancing e-learning quality through the application of the AKUE procedure model. Journal of Computer Assisted Learning, 28(1), 15–26. doi:10.1111/j.1365-2729.2011.00444.x Brown, M., & Bullock, A. (2014). Evaluating PLATO: Postgraduate teaching and learning online. The Clinical Teacher, 11(1), 10–14. doi:10.1111/tct.12052 PMID:24405912 Cheok, M. L., & Wong, S. L. (2015). Predictors of e-learning satisfaction in teaching and learning for school teachers: A literature review. International Journal of Instruction, 8(1), 75–90. doi:10.12973/ iji.2015.816a Cheung, R., & Vogel, D. (2013). Predicting user acceptance of collaborative technologies: An extension of the technology acceptance model for e-learning. Computers & Education, 63, 160–175. doi:10.1016/j. compedu.2012.12.003 Chyung, S. Y., & Vachon, M. (2013). An investigation of the profiles of satisfying and dissatisfying factors in e-learning. Performance Improvement Quarterly, 26(2), 117–140. doi:10.1002/piq.21147 Cook, D. A., & Triola, M. M. (2014). What is the role of e-learning? Looking past the hype. Medical Education, 48(9), 930–937. doi:10.1111/medu.12484 PMID:25113119 Diamond, S., & Irwin, B. (2013). Using e-learning for student sustainability literacy: Framework and review. International Journal of Sustainability in Higher Education, 14(4), 338–348. doi:10.1108/ IJSHE-09-2011-0060 Dominici, G., & Palumbo, F. (2013). How to build an e-learning product: Factors for student/customer satisfaction. Business Horizons, 56(1), 87–96. doi:10.1016/j.bushor.2012.09.011 Ghislandi, P. M. M., & Raffaghelli, J. E. (2015). Forward-oriented designing for learning as a means to achieve educational quality. British Journal of Educational Technology, 46(2), 280–299. doi:10.1111/ bjet.12257 Jones, A. R. (2013). Increasing adult learner motivation for completing self-directed e-learning. Performance Improvement, 52(7), 32–42. doi:10.1002/pfi.21361 Kim, J., Lee, A., & Ryu, H. (2013). Personality and its effects on learning performance: Design guidelines for an adaptive e-learning system based on a user model. International Journal of Industrial Ergonomics, 43(5), 450–461. doi:10.1016/j.ergon.2013.03.001 Labus, A., Despotović-Zrakić, M., Radenković, B., Bogdanović, Z., & Radenković, M. (2015). Enhancing formal e-learning with edutainment on social networks. Journal of Computer Assisted Learning, 31(6), 592–605. doi:10.1111/jcal.12108 Lau, R. W. H., Yen, N. Y., Li, F., & Wah, B. (2013). Recent development in multimedia e-learning technologies. World Wide Web (Bussum), 17(2), 189–198. doi:10.1007/s11280-013-0206-8
390
Electronic Learning
Li, Z. (2013). Natural, practical and social contexts of e-learning: A critical realist account for learning and technology. Journal of Computer Assisted Learning, 29(3), 280–291. doi:10.1111/jcal.12002 Lin, P. C., Lu, H. K., & Liu, C. (2013). Towards an education behavioral intention model for e-learning systems: An extension of UTAUT. Journal of Theoretical and Applied Information Technology, 47(3), 1120–1127. Safford, K., & Stinton, J. (2016). Barriers to blended digital distance vocational learning for non-traditional students. British Journal of Educational Technology, 47(1), 135–150. doi:10.1111/bjet.12222 Sander, B., & Golas, M. M. (2013). HistoViewer: An interactive e-learning platform facilitating group and peer group learning. Anatomical Sciences Education, 6(3), 182–190. doi:10.1002/ase.1336 PMID:23184574 Skues, J. L., & Cunningham, E. G. (2013). The role of e-learning coaches in Australian secondary schools. Journal of Computer Assisted Learning, 29(2), 179–187. doi:10.1111/j.1365-2729.2012.00488.x Stoffregen, J., Pawlowski, J. M., & Pirkkalainen, H. (2015). A barrier framework for open e-learning in public administrations. Computers in Human Behavior, 51, 674–684. doi:10.1016/j.chb.2014.12.024 Teo, T. (2014). Preservice teachers’ satisfaction with e-learning. Social Behavior and Personality: An International Journal, 42(1), 3–6. doi:10.2224/sbp.2014.42.1.3 Valsamidis, S., Kazanidis, I., Petasakis, I., Kontogiannis, S., & Kolokitha, E. (2014). E-learning activity analysis. Procedia Economics and Finance, 9, 511–518. doi:10.1016/S2212-5671(14)00052-5 van Nuland, S. E., & Rogers, K. A. (2016). E-learning, dual-task, and cognitive load: The anatomy of a failed experiment. Anatomical Sciences Education, 9(2), 186–196. doi:10.1002/ase.1576 PMID:26480302 Wang, T. H. (2014). Developing an assessment-centered e-learning system for improving student learning effectiveness. Computers & Education, 73, 189–203. doi:10.1016/j.compedu.2013.12.002 Yeh, Y. C., & Lin, C. F. (2015). Aptitude-treatment interactions during creativity training in e-learning: How meaning-making, self-regulation, and knowledge management influence creativity. Journal of Educational Technology & Society, 18(1), 119–131. Zafar, A., & Albidewi, I. (2015). Evaluation study of eLGuide: A framework for adaptive e-learning. Computer Applications in Engineering Education, 23(4), 542–555. doi:10.1002/cae.21625
KEY TERMS AND DEFINITIONS

Education: The process of imparting knowledge, skill, and judgment.
E-Learning: Learning conducted via electronic media, especially via the Internet.
Information Technology: The development, installation, and implementation of computer systems and applications.
Internet: The connected group of computer networks allowing for electronic communication.
Knowledge Management: The range of practices used by organizations to identify, create, represent, and distribute knowledge.
Learning: The process of gaining knowledge and skill.
Technology: The application of scientific knowledge to solve problems or to create useful tools.
Website: A page or collection of pages on the World Wide Web containing specific information.
Chapter 17
European Dialogue Project: Collaborating to Improve on the Quality of Learning Environments

Regina Brautlacht
Bonn-Rhein-Sieg University of Applied Sciences BRSU, Germany

Franca Poppi
University of Modena and Reggio Emilia, Italy

Maria Lurdes Martins
School of Technology and Management, Polytechnic Institute of Viseu, Portugal

Csilla Ducrocq
Paris-Sud University, Faculty of Sciences, France and Paris Graduate School of Economics, Statistics and Finance, France
ABSTRACT

Telecollaborating and communicating in online contexts using English as a Lingua Franca (ELF) requires students to develop multiple literacies in addition to foreign language skills and intercultural communicative competence. This chapter looks at the intersection of technology and teaching ELF, examining the mutual contributions of technologies, more specifically Web 2.0, and ELF to each other, and the challenges in designing and implementing collaboration projects across cultures. Moreover, it looks at how the development of digital competencies in ELF (DELF) can be enhanced through the implementation of Web 2.0-mediated intercultural dialogues. The details of the research design, including the internet tools used, the participants and the tasks, are also discussed. Data analysis points to a positive attitude towards telecollaboration, while also confirming some of the problems identified in the theoretical framework, such as different levels of personal engagement.
DOI: 10.4018/978-1-5225-1851-8.ch017
Copyright © 2017, IGI Global. Copying or distributing in print or electronic forms without written permission of IGI Global is prohibited.
EUROPEAN DIALOGUE PROJECT: COLLABORATING TO IMPROVE ON THE QUALITY OF LEARNING ENVIRONMENTS

Globalization is one of the most debated phenomena of the present age. The term itself did not appear before the 1970s, but now pervades contemporary political rhetoric and is a keyword of both academic and popular discourse on economy, society, technology and culture. Given the complexity of globalization, it is unsurprising that several conceptualisations have been proposed to shed light on its many shades of meaning. In Giddens's (1990) terms, for example, it refers to "the intensification of worldwide social relations which link distant localities in such a way that local happenings are shaped by events occurring many miles away and vice versa" (p. 64). Held, McGrew, Goldblatt & Perraton (1999) focus on the spatial dimension of globalization, and they define it as a process or set of processes which involve all social domains, e.g. politics, business, culture and the environment. Finally, Friedman (2005) concludes that the world has shrunk from size "small" to size "tiny" because of technological facilities, which have also contributed to "flattening the playing field at the same time" (p. 10).

In fact, over the past years Information and Communication Technologies (ICTs) have entered every area of society and influenced every aspect of our social and cultural lives. Educational institutions, however, despite being firmly rooted in our broader social and cultural milieu, have, at times, been left largely unchanged by the technological developments that have swept through society. As a consequence, students raised in a world of instant information and interactive technologies have been confronted with educational practices which may have struck them as rigid, inflexible and outmoded. This is no new criticism, as more than 30 years ago Byrne (1976) highlighted the need to reduce the degree of seriality which tended to characterise classroom learning. In the face of these pedagogical concerns, ICTs can prove extremely valuable. One of their more visible attributes lies in their ability to move beyond the sequential nature of classroom teaching while providing other educational benefits. These include, for instance, a virtually unlimited array of authentic materials, contact with up-to-date information about language use, and the possibility of addressing a potentially global audience.

This chapter reports on two online collaboration projects developed by four European universities and carried out in the spring of 2014 and 2015, running from March to June. Students, working in international teams, were asked to discuss and compare the values shared by young people in Italy, Portugal, Germany and France, and were in charge of designing and carrying out a collaborative survey to assess students' views on specific topics. In the first project the topics were related to values within Europe, while in 2015 it was decided to tackle issues connected with corporate social responsibility (CSR). The findings were shared in a joint compendium. Later, students were asked to evaluate the project in an online questionnaire.

Several Web 2.0 tools were available to the participants. Bonn-Rhein-Sieg University of Applied Sciences provided its e-learning platform, ILIAS, known as LEA (Lernen und Arbeiten online, i.e. learning and working online). Furthermore, ADOBE Connect 9, collaboration software used for online meetings, was provided by the German Research Network.
In addition, participants used Facebook groups, Skype and Google Drive for their collaboration.
Figure 1. EDP logo
HIGHER EDUCATION IN THE DIGITAL AGE

Higher education is of vital importance for sociocultural and economic development and has been facing great challenges all over the world, namely those associated with financing, equity of access and of conditions during the course of studies, staff development, skills-based training, enhancement of quality in teaching and research, employability of graduates, and the establishment of co-operation agreements, among others. On the other hand, higher education institutions (HEI) are being challenged by new opportunities relating to technologies that are transforming the ways in which knowledge can be created, managed, disseminated, retrieved and controlled. The second half of the 20th century witnessed a period of remarkable expansion: nearly a sixfold increase in student enrolments worldwide. This confirms that higher learning and research are envisaged as essential components of the cultural, socio-economic and environmentally sustainable development of individuals, communities and nations.

In Europe, the first decade of the 21st century has witnessed a set of changes in higher education. Since June 1999, with the Bologna Declaration, European higher education institutions have walked a long, complex and ongoing path with the purpose of creating a harmonised and attractive European Higher Education Area (EHEA). It is, therefore, a process with political and economic motivations, triggered by a confluence of factors such as the emergence of a knowledge society, increased mobility both within Europe and between continents, and the need for greater competitiveness of European economies and HEI. The development of an EHEA is closely linked to teaching and learning foreign languages, whose relevance in building a more competitive Europe is underlined by Vogel (2001), who states that "a consensus exists among politicians, industries, universities and students, that the global economy depends on the internationally minded, interculturally trained, multilingual graduate" (p. 381). Therefore, it is important to examine the role of foreign languages in achieving the goals of the Bologna Process, focusing on three specific areas: languages and mobility, languages and employability and, finally, languages and lifelong learning.
Foreign Language Learning and Mobility

One of the main principles of the Bologna Process was to promote the mobility of students, teachers, researchers and other staff. This is considered of primary importance in the strategy of building a European knowledge economy, and was highlighted as one of the main goals in the European Council conclusions on the strategic framework for European cooperation in education and training for the decade 2010-2020 ("making lifelong learning and mobility a reality") (EF, 2010, p. 3). The challenge of academic mobility is undoubtedly a linguistic challenge. It is essential to develop communication skills in foreign languages to allow effective communication and integration in the receiving country.
Foreign Language Learning and Employability

The structural changes that characterise the Bologna Process are not an end in themselves. They pursue the aim of contributing to the development of a knowledge economy in a global scenario. In the European labour market, characterised by mobility, multilingualism and multiculturalism, the development of foreign language skills is becoming increasingly important with regard to employability. The Lisbon Strategy (European Council, 2000), which emphasises the importance of knowledge (the ability to research, apply and transfer it) to prosperity and economic dynamism, highlights the critical role of multilingual communicative competence in structuring the knowledge economy. Connell (2002) alludes to the importance of a linguistically able workforce. The author considers this crucial for citizens to play an active role in the international economy.
Foreign Language and Lifelong Learning

Lifelong learning is one of the major cornerstones of the Bologna Process and its prominence has been reinforced in the guidelines for 2010-2020, as it is seen as a fundamental premise for making the European economy the most competitive and dynamic on a global scale. The European Commission (2001) considers the ability to communicate in foreign languages as "a desirable life-skill for all European citizens" (p. 3). The report Key Competencies for Lifelong Learning also refers to communication in foreign languages as one of the eight key competences "necessary for personal fulfilment, active citizenship, social cohesion and employability in a knowledge society" (European Parliament and Council of the European Union, 2006, p. 13).
GLOBAL COMMUNICATIVE COMPETENCIES: DIGITAL ENGLISH AS A LINGUA FRANCA (DELF)

With the widespread use of digital technology, many new applications are readily available in our interconnected world. These new developments require learners to develop specific skills and competencies that are often referred to as global competencies or 21st century skills (Griffin & Care, 2015). As English is the Lingua Franca of the World Wide Web, it is also important in online interactions. Digital English as a Lingua Franca (DELF) is used in business and private online interactions in written and spoken formats. In essence, we are dealing with all digital communication of non-native and native speakers of English and the skills needed for these types of interactions.
Figure 2. Four layers of Global Communication Competencies (GCC). Adapted from Louhiala-Salminen & Kankaanranta 2011, p. 258.
These global competencies include digital literacy, but in the context of communication and collaboration activities using English as a Lingua Franca (ELF). Research on ELF, and in particular on Business English as a Lingua Franca (BELF), has examined how businesses use English globally in their communication environments (Kankaanranta & Louhiala-Salminen, 2013; Kankaanranta & Planken, 2010). More recent Computer Assisted Language Learning (CALL) and Computer Mediated Communication (CMC) research, which we will address in the next sections, has considered the use of different tools in collaborating and problem-solving tasks. Intelligent CALL prepares and involves students in active online projects. Recent research has not specifically addressed digital communication competencies within the ELF paradigm. The authors build upon the model of global communicative competence from Louhiala-Salminen & Kankaanranta (2011, p. 258) and have slightly modified it for more general global communication scenarios. It is thus not specifically geared towards a business environment, but can also be used in this context. This adapted model includes an additional layer of digital competence in an ELF online environment. As seen in the model (Figure 2), the autonomous learner (shaded black) needs four competence layers to be a global communicator. It begins with general knowledge in communication; in addition,
competence in ELF is needed, and more specifically competence in using ELF in a digital environment (DELF). Digital literacy and the knowledge of new digital collaboration methods (e.g. mass communication; collaborative problem-solving) extend beyond traditional communication practices. The competence in multicultural/intercultural skills, shown in the last layer, is used to decode messages from other cultural backgrounds (Louhiala-Salminen & Kankaanranta, 2011, p. 258). Our study examines how students engage in communication strategies in their online collaborative tasks. We identify adaptation repertoires using ELF in CMC (e.g. multimodality strategies, awareness of cultural and language competencies). In addition, our two online communication projects in 2014 and 2015 have provided new ideas for developing our university curricula to include Digital English as a Lingua Franca by sensitising students towards digital global communication practices in real-life scenarios.
English as a Lingua Franca

The dramatic intensification in worldwide relations brought about by globalization has inevitably called to the fore the question of language, the primary medium of human social interaction, and the means through which social relations are constructed and maintained. While much everyday interaction still occurs within local networks, contacts among people living in the most different places of the world have grown out of all proportion. As a consequence, the need for a language that can be used and understood by everyone is one of the most important issues that teachers and scholars have to tackle. This language is a Lingua Franca, namely a language used to enable routine communication among people who speak different languages. In present times, the choice of a Lingua Franca that can be used universally has fallen upon English, defined as "the most widely taught, read and spoken language the world has ever known" (Kachru & Nelson, 2001, p. 9).

At present English is the dominant language in the educational sector all over Europe, where it is primarily taught as the first foreign language. English language teaching is therefore almost totally EFL-biased and accuracy is considered to be the norm. Native and non-native speakers alike demand allegiance to and achievement of native speaker standards. Indeed, what is emerging with some clarity (Seidlhofer, 2008, p. 169) is that, in view of the present globalisation through English and of English, insistence on a monochrome native-speaker standard has now become an anachronism that inevitably leads to some confusion in language teaching and manifests itself in a number of contradictions and discrepancies. What we need is a critical appraisal of language use and language teaching analogous to what we find in other areas of English study, and a focus on how language functions in social contexts of use.

To be able to move forward, it is necessary for teachers to take on board the present reality, which does not rely on the hegemony of norm-based standard English, be it the British or the American model. This move forward will have no chance of survival unless teacher education is carefully re-evaluated and teachers stop simply teaching what, and how, they themselves were taught in their teacher training courses. In particular, it will be the 'new' teachers' task to help their students develop common pragmatic strategies for achieving reciprocal understanding, in the awareness that native speaker varieties might be considered 'unrealistic standards' and consequently unreachable goals for non-native learners, who need the language for different purposes than native speakers do. Non-native speakers have to be intelligible to other non-native speakers, as most of them will never communicate with a native speaker of English. In other words, traditional ELT (English Language Teaching) needs to be complemented with insights from ELF (English as a Lingua Franca).
Digital Competence: Research on CMC

Continuous developments in ICT over the last 60 years have had strong implications for foreign language teaching and learning. The emergence and dissemination of the concept of CALL (Computer Assisted Language Learning) is an example of the growing interest of teachers and researchers in this area of knowledge. It is important first to define this core concept, and for this we will use Levy & Hubbard (2005), who define CALL as "the search for and study of applications of the computer in language teaching and learning" (p. 1). The evolution of the concept is closely related to developments in the area of ICT. The work of Warschauer & Healey (1998) should also be highlighted, since the authors have developed multiple studies related to this theme, namely trying to systematise different stages of CALL. The first phase, associated with behaviourist learning theories, is characterised by stimulus-response activities and repetitive exercises. Next, we find the communicative phase, which is based on a communicative approach to teaching and learning; the focus now lies on the effective use of language. Originality is encouraged, and textual reconstruction activities and role-play are promoted. The integrative phase coincides with the development of multimedia technology and the emergence of new theories which argue that language learning is a social construction. According to this perspective, students should be confronted with rich and authentic learning environments.

Although Warschauer & Healey (1998) had not systematised a fourth phase, they pointed out some directions for the evolution of computer-mediated language learning and referred to this new stage as "intelligent" (p. 68). The main goal of intelligent CALL is to prepare students for active citizenship in a global and networked society. The authors highlight the fact that it is essential to be able to find, evaluate and critically interpret information available on the web, stating that "students themselves create their 'texts' from their own selection of materials from a variety of sources. In teaching reading, we will have to go beyond how to decode texts, or understand them, and pay increasing attention to how to explore and interpret the vast range of online texts" (Warschauer & Healey, 1998, p. 65). The second aspect to be considered relates to effective online writing, which is ubiquitous in the knowledge society and was reinforced by the advent of Web 2.0. Warschauer & Healey (1998) stress that the development of digital literacy is also one of the goals of teaching and learning foreign languages. Thus, the ultimate purpose is that the learner becomes active, autonomous, independent and able to plan her/his "active, conscious, and purposeful self-regulation of learning" (Oxford, 2003, p. 2). This intelligent phase features the concept of multimodality, which refers not only to the variety of media available today and the different ways of constructing meaning, but also to the possibility of combining these modes more easily in an orchestration of meanings (Kress, Jewitt, Ogborn & Tsatsarelis, 2011).

It should be made clear that the emergence of new phases of CALL is not a consequence of rejecting earlier phases. As Thomas, Reinders & Warschauer (2013) point out: "[C]hange does not occur at an identifiable point. Changes such as these occur over time and with a great deal of unevenness and overlap rather than a result of a smooth and linear process of historical transition" (p. 6).
Recent research on CALL is more focused on the collaborative potential of social media, namely on project-based research, underpinned by developments in portable digital devices and by pedagogical principles that view learning as a participative experience, also enhancing critical thinking and making students "active agents and users of the target language" (Thomas, Reinders & Warschauer, 2013, p. 7). Collaborative technologies can enhance the development of authentic tasks from the real world, which students can relate to, thus assuming a more active role rather than a passive one in their language learning process (Thomas & Reinders, 2010). This contemporary CALL, based on decentralised, democratic
and learner-centric environments (Reinders & Darasawang, 2012) is a fertile ground for research-based projects, which will enable students to develop a wide range of skills such as collecting and analysing data, negotiating meaning, collaborating, solving problems or improving understanding of cross-cultural communication in a globalised world (Shaffer, 2008).
CMC Across Cultures and the Need to Develop DELF

In order to be effective communicators in digital environments, students must develop a set of complex skills regarded as vital for an active and reflexive citizenship in a globalised world permeated by multimodal technologies. This has brought to the fore the concept of digital literacy which, according to Dudeney, Hockly and Pegrum (2013), is related to "the individual skills needed to effectively interpret, manage, share and create meaning in the growing range of digital communication channels" (p. 2). This umbrella term involves, according to Martin (2005):

the awareness, attitude and ability of individuals to appropriately use digital tools and facilities to identify, access, manage, integrate, evaluate, analyze and synthesize digital resources, construct new knowledge, create media expressions, and communicate with others, in the context of specific life situations, in order to enable constructive social action; and to reflect on this process. (p. 135)

To be competent communicators in collaborative technology-mediated projects, students must possess some technical skills, allowing them to know how to use the different tools, but they must also be familiar with a range of social practices and behaviours associated with collaborating online. The challenge is, undoubtedly, to promote these skills and attitudes in parallel with teaching ELF. Dudeney, Hockly & Pegrum (2013) look in detail at the concept of digital literacy and at how it can be developed in teaching ELF. The authors have developed a digital literacy taxonomy, considering four main domains: language, connections, information, and (re)design. Concerning the language domain, it is important to combine traditional print literacy with texting literacies, using digital language effectively, processing hyperlinks appropriately, and comprehending and creating texts in multiple media so that students can design their own narrative pathways. Focusing on connections, Dudeney, Hockly & Pegrum (2013) stress the need to develop personal, participatory and intercultural literacies. Web 2.0, with its open, participatory and social nature, has given dialogue a prominent place in the knowledge building process, and the construction of meaningful learning will greatly depend on learners' capacity to engage in the creation and maintenance of connections and dialogical processes which go beyond one's geographical, cultural and religious boundaries. Regarding the information domain, one should consider search and information literacies. Apart from learning where to find relevant and updated information, students should also develop filtering, analytical and lifelong learning skills, so that they can critically evaluate the sources afforded by technologies. As far as the (re)design domain is concerned, Dudeney, Hockly & Pegrum (2013) highlight the importance of remix literacy, which refers to the ability to combine different types of media in order to create something new.

Although a myriad of technological tools and resources is available both for teachers and students, their effective use depends on a constellation of factors. Firstly, technology should never override pedagogy, so only after defining the intended learning outcomes should one be thinking about the best tools to assist this process of knowledge construction. It is also of utmost importance that students have some
time to get familiar with the tools, so that they can focus on the learning experience that is provided. In this regard, Mason & Rennie (2008) emphasize that "the introduction of wikis, blogs, podcasts, discussion boards, and so on, needs to be carefully balanced [as] the learning part of a symbiotic system that brings benefits to the learners rather than confusing, intimidating or undermining their confidence" (p. 60).

Collaborative CALL projects, involving individuals or groups in different locations, have been documented in multiple studies, which highlight both the strengths and weaknesses encountered during the collaboration processes. The increase in students' cultural awareness is pointed out by different authors (Abrams, 2002; Furstenberg, Levet, English & Maillet, 2001; Kramsch & Thorne, 2002). Another perceived advantage is the development of some aspects of intercultural competence (Belz, 2003; Chun & Wade, 2004). Other studies, however, focus on factors that can hinder successful collaborative CALL projects involving participants from different cultures. In this regard, O'Dowd & Ritter (2006) highlight four levels (individual, classroom, socioinstitutional, and interaction) which might explain miscommunication when collaborating online. Thorne (2003) stresses that different expectations from the different groups involved, as well as different levels of personal engagement, might constitute "a strong example of the challenges inherent to cross cultural interaction" (p. 45).
Competence in DELF

Computer-mediated communication (CMC) has increased exponentially and is often exploited for networking practices among people who share professional and/or personal interests, giving life to virtual Communities of Practice, or "constellations of interconnected practices" (Wenger, 1998; Seidlhofer, 2011). Virtual encounters are nowadays as meaningful as face-to-face ones, and "electronic propinquity" (Korzenny, 1978) has become the new substitute for "physical propinquity", with electronic media providing a vital source of internal and external communication, as they enable people to interact at any time, in different places. When interacting online, it is of paramount importance to keep in mind some of the most relevant properties of CMC:

• Immateriality
• Extension in participation framework
• Multimodality
• Hypertextuality
• Co-articulation and interactivity
• Multiple reading modes
• Intertextuality
• Granularity
English most often works as the Lingua Franca for CMC, transcending group memberships and cutting across speech communities, languages, and geographical and national boundaries as traditionally conceived (Jenkins, Cogo & Dewey, 2011; Seidlhofer, 2011), as well as native/non-native/ESL/EFL dichotomies. However, language use in CMC does indeed need to be adapted to the communicative needs of its users, with CMC ELF users skilfully drawing on their (pluri)lingual (and semiotic) repertoires to successfully interact in intercultural contexts.
Intercultural Competence

The term intercultural communication "is used to refer to the communication process among members of different cultural communities" (Ting-Toomey, 1999, p. 16) in which "the native language of one or more participants is not the primary language of the community and its speakers" (Ponterotto, 2005, p. 254). Communication is usually easier among people who share the same culture. Even if speakers from different lingua-cultural backgrounds are proficient in the chosen language of interaction, it can be difficult or challenging to convey the information which should make the exchange successful. Moreover, the same difficulties, even if less frequently, may arise in interactions between speakers of the same language, due to differences emerging from ethnic, age, gender, regional and social variables. It can therefore be stated that miscommunication is a cultural rather than a linguistic phenomenon. Individuals should always bear in mind the cultural differences that exist between themselves and the other group members, and need to learn how to manage such differences constructively.

For an intercultural exchange to be satisfactory, three outcomes should be obtained: the feeling of being understood, the feeling of being respected, and the feeling of being supported. The feeling of being understood does not mean that the listener agrees with the speaker; it implies the willingness to share aspects of our own self-conceptions with others in a culturally sensitive manner. The feeling of being respected shows that our identity-based behaviour is considered credible and on an equal basis with that of members of other groups. It also means treating members of other groups with dignity and courtesy. Finally, the feeling of being supported refers to our sense of being positively valued as useful individuals despite having different group-based identities (Ting-Toomey, 1999).

ELF users draw on, construct, and move between global, national, and local orientations towards cultural characterisations. Therefore, the manner of simplification prevalent in approaches to culture in the ELT language classroom, which easily leads to essentialist representations of language and culture in ELT and an over-representation of "Anglophone cultures", should be replaced by an open attitude, based on acceptance and awareness (Baker, 2015), which allows students to get as much exposure as possible to the languages and cultures of countries other than the United Kingdom and the United States.
Autonomous Learner

The need for instruction to enable learners to become self-sufficient and independent of the teacher has long been highlighted (Bruner, 1996, p. 53). However, for those inside the educational system it can easily appear that there are so many constraints, so many factors over which learners (and teachers) have no control, that learner autonomy is impossible (Little, 1991, p. 11). But to take such a view is to fall into the trap of confusing autonomy with self-instruction, and to forget that autonomy is essentially a matter of the psychological relation between the learner and the content and process of learning; we can recognise it in a wide variety of behaviours as a capacity for detachment, critical reflection, decision making and independent action. In a self-directed learning situation, learners' psychological responses will determine whether they have sufficient motivation to continue with their learning.

It is often argued that the pursuit of learner autonomy requires a shift in the role of the teacher from purveyor of information to facilitator of learning and manager of resources (Little, 1991, p. 11). Obviously it is not easy for teachers to change their role from purveyors of information to counsellors and managers of learning resources. It is not easy for teachers to stop talking: after all, if they stop talking, they stop teaching, and they are afraid that their learners may stop learning.
Table 1. General Information for the EDP in 2014 and 2015

Timeframe: 10.03.14 – 11.05.14 (9 weeks) in 2014; 31.03.15 – 11.06.15 (10 weeks) in 2015
Participating countries: 4 in 2014 (France, Germany, Portugal, Italy); 3 in 2015 (Germany, Portugal, Italy)
Participating students: 75 in 2014 (France 12, Germany 20, Italy 23, Portugal 20); 83 in 2015 (Germany 27, Portugal 25, Italy 31)
Online platforms: LEA (ILIAS LMS System), Adobe Connect, Doodle, Skype, (Google Drive) in 2014; LEA (ILIAS LMS System), Doodle, Skype, (Google Drive) in 2015
For a teacher to commit himself or herself to learner autonomy takes a lot of nerve (Little, 1991, p. 11), not least because it requires him or her to abandon any lingering notion that he or she can somehow guarantee the success of his or her learners by his or her own effort. Instead he or she must dare to trust the learners (Lamb, 2008).
THE PROJECT AND ITS RATIONALE

In 2014, 72 university students took part in the "European Dialogue" project (EDP). They came from France, Germany, Portugal and Italy. France withdrew in 2015, as massive higher education reforms were being negotiated in Paris, making it impossible to commit to the project at that time. In 2015, three countries (Germany, Portugal and Italy) remained and the number of students increased to 83. Students worked online in international teams using various Web 2.0 tools to communicate with each other. The teams were in charge of discussing topics provided by the teacher-coordinators and of designing and implementing a survey to assess and later compare views on specific topics within Europe. The findings were shared in a joint compendium. The participants came from four higher education institutions in four countries (France, Germany, Italy and Portugal), with very different student bodies.
Aim of the Project

In today's globalised world, where web-based technologies prevail for communication, learning, including language learning, is seen not just as an accumulation of specific items of knowledge but also as the ability to adapt this knowledge to vastly differing contexts and to match it to the individual needs of the learner. This involves the ability to use one's resources autonomously, in unforeseen and unexpected situations. Learning is no longer seen as the sole responsibility of educational institutions but as a lifelong process arising from learning through personal and professional experience and through various channels.

The aim of the EDP project was to encourage autonomous communication in English between students from four European countries, using English as a Lingua Franca. Defining autonomous language learning is not a simple matter of pulling a straightforward definition from the literature. Yet most definitions have in common three essential components: structure, control and responsibility. Simply put, learners must be operating within a learning environment which enables them to exercise control over their learning and to take on the responsibility that this entails. To illustrate this, there are several principles which would characterise the structure of an autonomous learning environment:
• An emphasis on learning through the active use and exploration of the target language to accomplish desired ends
• Direct contact with the target language through interaction with a wide variety of media and materials, so that choice becomes a key element of learner control
• Learner management of choices and assessment, empowering learners to determine their own pace and make decisions based on personal need, learning style and interest
The development of new technologies, which might imply a change for the better in the traditional roles whereby the teacher was in tight control, transmitting content, seems to facilitate the task of helping learners become independent. In fact, although ICTs do not necessarily ensure learner independence, they provide the practical means whereby learners can take a more active part in determining their own objectives and their own learning programmes. As we will show in the next sections, the above criteria were the ones which guided us when we decided to set up the European Dialogue Project, a learning environment in which our students could actively use and explore the target language in real-life contexts, while being aware of the impact of their autonomous language choices on their interlocutors, who all came from different lingua-cultural backgrounds.
Digital Tools: Asynchronous and Synchronous Tools

Several online Web 2.0 tools were provided by Bonn-Rhein-Sieg University of Applied Sciences, Germany, to allow the members of the international teams to communicate with their peers: Adobe Connect for the virtual meetings, and a joint online platform, "Lernen und Arbeiten online" (LEA, https://lea.hochschule-bonn-rhein-sieg.de), to upload written work or send out messages to group members. The use of other, more popular Web 2.0 tools (Facebook, Skype) was not recommended in the first stage of this project, as the coordinators preferred to have students use software created particularly for the purpose of online meetings and communication in professional contexts, and also for reasons of privacy. Nevertheless, during the project learners spontaneously decided to use freely available Web 2.0 tools, either because it allowed them to make communication with their international team members more self-directed and spontaneous, or because of technical problems which arose with the professional software, mostly due to the different setups of the computer labs in the four participating schools. Based on this experience, in the second year of the project some adjustments were made and a Facebook group was provided for each international team involved in the project in order to facilitate spontaneous communication. Likewise, in the second year of the project, Skype replaced Adobe Connect as the recommended medium for virtual meetings. In order to avoid misunderstandings and facilitate the writing of minutes after each virtual meeting, the learners were told to record their meetings on Skype. Although a professional online tool for conducting and analysing surveys was considered, the participating teacher-coordinators preferred to allow students freedom in deciding how to carry out their surveys.

Once the time frame was determined, the breakdown of tasks and the selected tools were written down in a joint document, referred to as "Student Guidelines", with all the information students would need for the project to be successfully carried out. The guidelines included the objectives of the project, deadlines and a detailed description of tasks, a list of topics to research, guidelines on how to write a survey, guidelines on how to write the minutes of a meeting, instructions on using the joint platform (LEA), and information about Web 2.0 tools (Facebook, Skype, Google Forms). This document was made accessible to every participant online on the joint platform (LEA).
Figure 3. Overview of the EDP tasks
Tasks

The project ran over 10 weeks, during which 7 tasks had to be completed by a given deadline (see Figure 3). After being introduced to the project in class, students chose one of the topics and were then signed up for the workspace of the team corresponding to the topic they wanted to research. They also had time to get acquainted with the student guidelines. In addition, students joined the Facebook group created for their team and agreed on a date for the first virtual meeting. Apart from getting familiar with the tools, they were also required to do some research on the topic they had chosen (see Appendix, Tables A1 and A2) with regard to the situation in their own country.

Next, in task two, students met virtually for the first time in order to get to know each other and build rapport. They were also expected to discuss the research topic, namely the issues that might be worth including in the survey. In order to facilitate communication, some ground rules were created, namely deciding which members would chair the meetings and write the minutes. As a follow-up to task two, the members from each country drafted some questions to share with the other team members. This brought about some predictability when completing task three, whose purpose was to prepare the team's survey. During that meeting the participants had to agree on a final list of 15-20 questions they would be using for their research. Afterwards, students from each country carried out their surveys (task four) and prepared a written report of their findings, together with graphic information, and made this available online (task five) before the third virtual meeting, whose aim was to compare and contrast critically the observations from the different countries (task six).
Figure 4. Photo of a virtual meeting
Lastly, a compendium was created based on each country's summary and the discussion held during the third meeting. Additionally, students were asked to evaluate the project.
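The country reports in task five combined tabulated survey answers with simple charts. Purely as an illustration (the chapter does not prescribe any particular software for this step), the following minimal sketch assumes a team has exported its responses to a hypothetical CSV file, responses.csv, with a country column and one Likert-style column per question (the column name q1_trust_in_eu is invented for the example), and uses pandas and matplotlib to produce the kind of per-country summary and bar chart a written report might include.

```python
# Illustrative sketch only: the teams were free to choose how to tabulate their data.
# Assumes a hypothetical "responses.csv" exported from the survey tool, containing a
# "country" column and 1-5 ratings in a column such as "q1_trust_in_eu".
import pandas as pd
import matplotlib.pyplot as plt

responses = pd.read_csv("responses.csv")

# Number of answers and mean rating per country for one example question.
summary = responses.groupby("country")["q1_trust_in_eu"].agg(["count", "mean"])
print(summary)

# A simple bar chart of the per-country means for the joint report.
ax = summary["mean"].plot(kind="bar", title="Q1: mean rating by country")
ax.set_ylabel("Mean rating (1-5)")
plt.tight_layout()
plt.savefig("q1_by_country.png")
```

Any spreadsheet tool would serve the same purpose; the point is only that a few lines suffice to turn raw responses into the comparative figures discussed in the third virtual meeting.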
Students' Evaluations

At the end of the 2014 and 2015 projects the students were asked to fill in a questionnaire which was meant to make them reflect upon language awareness and intercultural awareness. The input provided by the students was then analysed with a view to shedding light on their perceptions of working online with peers from different countries using English as a Lingua Franca. The questionnaire collected information on students' previous online experience and documented the frequency of collaboration in written and oral interactions. Furthermore, it asked the students for their overall perception of the project. The main focus of this chapter is the online questionnaire on student perceptions, but it also considers the interactions on Facebook, a sample of the transcripts of the Skype recordings, and the instructors' (coordinators') feedback. The analysis which follows is based on two small-scale investigations of the two projects held in 2014 and 2015. The first study, from 2014, provided some insights which were duly taken into account when organizing the 2015 project:
• Students highly recommended the project to continue in 2015.
• Students had little experience working and collaborating online with other non-native speakers of English (digital literacy).
• Students found this real-life project a good method for learning to collaborate and work together in a global world (autonomous learning in an ELF environment).
• A majority of students experienced language and intercultural challenges during the project.
• Students' preferred online tools were Facebook and Skype.
• EDP coordinators felt that all written and oral interactions online should be made available for further research. Facebook groups were added to the project to examine written interactions.
• Further research in ELF using digital technology (DELF) is needed.
Background Information

The project was conducted during the summer term in 2014 and 2015. The project timeframe was 9 weeks in 2014 and 10 weeks in 2015. Due to the different time schedules of the partner universities, it is challenging to find a coinciding timeframe in which to conduct the project. Although all countries are in Europe, the start and end of an academic year vary, particularly in Germany, which starts at the end of March. Based on the feedback of the students and the observations of the project coordinators, several changes were made after the first project in 2014. Most notably, Skype replaced Adobe Connect. Adobe Connect 9, hosted by the German Research Network, is free of charge to higher education in Germany; however, technical and compatibility issues remained so unresolved for the Portuguese students that the members of the project had to resort to another web conferencing system. In 2014, students had to switch quickly to Skype as an alternative because Adobe Connect did not work for their group meetings, and based on their feedback Skype was chosen as the required video conferencing tool in 2015. Students were asked to install Skype if they did not already have it on their computers, or alternatively to use the Skype-equipped facilities at their respective universities. Finally, the coordinators added one additional assignment for both the Italian and German students: each German team had to record the meetings using a free Skype recording application (MP3 Skype Recorder), and the Italian members of each team then had to transcribe at least one of the meetings. Furthermore, all students were provided with an already set-up Facebook group (secret setting) to allow them to get connected quickly. These set-up Facebook groups offered the opportunity to examine the correspondence after the project had been completed. In 2014 students had created their own Facebook groups and thus the coordinators did not have access to the online communication threads. During the project in 2015 only one e-tutor from Germany had access to each of the 10 Facebook groups, and none of the instructors did, to ensure that students felt comfortable using the groups. The e-tutor periodically monitored the activities to ensure that the groups were not experiencing any technical issues, but the groups were autonomous in their online activities.

The students in Germany were enrolled at Bonn-Rhein-Sieg University, a university of applied sciences. The university has a distinctive international profile with about 7,000 students and 145 faculty members, and it offers 26 degree programmes across five departments. In 2015 the participating students were in the second semester of their six-semester B.Sc. in Business Management. The course, Business English: A Simulation Course in Entrepreneurship, is a required foreign-language module worth 6 ECTS. The prerequisite for the course is a B2 placement on the online Oxford Placement Test, and most students in the project were tested at a C1 level. Classes took place once a week, on either Mondays or Tuesdays, from 10:00 am to 1:30 pm. The EDP tasks accounted for 30% of the total grade. Furthermore, several students had completed a vocational apprenticeship prior to starting their undergraduate work or had been abroad on a student exchange in an English-speaking country (e.g. the USA, Australia).
The Italian students were enrolled in the Master degree programme in Languages for Communication in International Enterprises and Organizations at the University of Modena and Reggio Emilia. In the summer term of 2015 these students did not have a regular class, so they received their primary instruction through the EDP materials that were provided to all members of the project (student guidelines and online platforms); in addition, their professor sent them emails with further updates concerning the EDP. The students were in the second semester of a two-year programme and had previously taken a course focusing on ELF theories and intercultural communication. They are required to have a B2/C1 level of proficiency to be admitted to the master degree programme.

The Portuguese students were also first-year students and the youngest in the project. They were enrolled in an undergraduate degree programme in Tourism at the School of Technology and Management, Polytechnic Institute of Viseu. The English for Tourism course (5 ECTS) is a requirement, and classes take place twice a week on campus. Their English language proficiency varied from B1 to C1. The students were able to sign up voluntarily for this project, and it counted for 30 percent of their final grade. The coordinator provided additional mentoring throughout the project.
Data Collection

The students in the project received access to an online questionnaire (see Appendix, Figure A1) created with EvaSys, an online evaluation suite that offers SPSS data export and was hosted by BRSU. The students' questionnaires, the students' interactions in their Facebook groups and the Skype recordings were used to gather the data. The online questionnaire was slightly modified in 2015 by adding two questions about the perceptions of teacher support before and during the project. In addition, one question was included about how students felt about the Skype recordings. Based on the feedback from 2014, a text clarifying the difference between language awareness and intercultural awareness was added to Part A and Part B of the questionnaire (see the Appendix, Figure A1). The students received access to the survey via a link at the beginning of June 2014 and 2015. Students in Germany and Portugal were allocated time to complete the survey in class in 2015, while the students in Italy were asked to complete it online outside of the classroom. The 2015 questionnaire had a total of 35 closed questions, mostly multiple choice, most of which used a five-point Likert scale; 18 open questions for clarifying comments were also included. It is divided into five sections:
1. Data on demographics
2. Previous experience
3. Students' perception of language awareness and intercultural awareness
4. Students' perception of collaboration
5. Perceptions of the project

Only a fraction of the participants did not complete the survey, as seen in the table below (Table 2).
In 2014, 75 students were part of the project and 63 completed the survey; in 2015, 73 of the 83 participants responded.
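For illustration, these completion counts can be tallied directly from a tabular export of the survey platform. The following is a minimal sketch, assuming a hypothetical CSV layout; the file name and column names are illustrative and not the actual EvaSys export format:

```python
import csv
from collections import Counter

# Hypothetical export: one row per project participant with the columns
# "year", "country" and "completed" ("yes"/"no"). The real EvaSys/SPSS
# export differs; this only illustrates how response rates are tallied.
participants, respondents = Counter(), Counter()

with open("edp_survey_export.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        key = (row["year"], row["country"])
        participants[key] += 1
        if row["completed"].strip().lower() == "yes":
            respondents[key] += 1

for key in sorted(participants):
    done, total = respondents[key], participants[key]
    print(f"{key}: {done}/{total} completed ({100 * done / total:.1f}%)")
```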
Results

We will focus mainly on the data collected in 2015 and also provide some data from the transcribed ELF oral interactions from the Skype meetings and the written interactions posted in the Facebook groups of each team. These oral and written interactions complement the data from the questionnaires. Data analysis will mainly focus on the survey; however, in some cases we will refer back to the survey conducted in 2014 to show similarities or changes.
Table 2. Number of participants in the EDP student questionnaire

Participants | 2014: Participants in the EDP | 2014: Participants in the questionnaire (n) | 2015: Participants in the EDP | 2015: Participants in the questionnaire (n)
French | 12 | 8 | 0 | 0
German | 20 | 16 | 27 | 26
Italian | 23 | 23 | 30 | 30
Portuguese | 20 | 16 | 25 | 17
Total | 75 | 63 | 83 | 73
Demographic Data

In 2015 the students from Germany and Portugal were undergraduate students and younger than the Italians, who were enrolled in a postgraduate degree programme. 38.4% of the students were 21-23 years of age and 27.4% were between 18 and 20. None of the students in Italy were between the ages of 18 and 20, and more than half (56.7%) were 24-26 years old. Roughly one third (32.2%) of the students surveyed were male; most students in Italy and Portugal were female, but two thirds of the German respondents were male.
Previous Experience

The next questions focus on the previous experience the students had with online collaboration and with using English to communicate with other non-native speakers. Before the project, 11% (n = 73) had worked in a virtual team; the Portuguese had the least experience (5.9%). Overall, the majority (60.3%) had participated in an online discussion in English prior to the project, but most (84.9%) had never conducted a survey online before.
Language Awareness

The next part of the online survey looks at the students' perceptions of language awareness, that is, the explicit perceptions students have about their knowledge of language and language learning. More than half the students perceived themselves to have had language difficulties resulting from a lack of language competencies, and these occurred mainly in oral interaction (72.2%) rather than in written discourse. Although two-thirds of the students did not feel that there were any misunderstandings in their international teams, more than half the students (59.2%) found that there was a difference between interacting in English with students from their own country and with those from the other countries in the project. When asked to explain these differences, students frequently mentioned pronunciation (7x) and different accents (5x); the use of different vocabulary and different levels of English were also seen as challenges. In general, students stated that they could understand their interlocutors and make themselves understood. However, the German proficiency level was usually considered similar or better (5x).
Figure 5. Results for the question “Did you experience any difficulties linked with language competence during the project? If you did, did they concern”
In addition, several students commented on the lower level of proficiency of the Portuguese team members. Overall, two-thirds felt that their proficiency level was comparable to that of the other international team members.

Survey Question: Did you experience any differences when interacting in English with the students from your own country and with the ones from another country? Can you explain?

Of 71 respondents, 59.2% chose "yes" and 40.8% "no." The students reported that they tried to solve the challenges they encountered by repeating words or phrases, by reformulating what was said, by clarifying and sometimes writing in parallel in the chat, by talking more slowly and in shorter sentences, and by trying to be as flexible and patient as possible.

Survey Question: How did you try to solve the challenges that you encountered?

Repeating something was mentioned 4 times, adding it to the chat 3 times and using the dictionary twice; rephrasing, paraphrasing and reformulating (mentioned three times), asking for help, and doing exercises to improve one's English skills were also mentioned, together with asking politely for repetition, slowing down the pace of the conversation and using shorter sentences.

Survey Question: List some of the strategies that you adopted in order to try to solve misunderstandings/challenges.

Many students said that they rephrased and reformulated their messages (10x), asked for something to be repeated (4x) or asked questions (4x). The students also mentioned that they adapted their language and used simpler words (7x) and repeated things several times for clarity (3x). In addition, they used the chat function on Skype or Facebook as a written channel to clarify oral communication and negotiated in their Facebook group (4x).

Survey Question: How easy was it for you to communicate orally in the virtual meetings?

There were three main tendencies: oral communication was easy due to a high level of proficiency; the troubles were more technical in nature (e.g. poor internet connection, noisy surroundings, no headsets used); and "Difficulties were mainly based on cultural difference."
Awareness of using English as a Lingua Franca, being precise and speaking in a clear manner, reducing the complexity of the language used, and dealing with different accents and levels of proficiency were all mentioned by the respondents when asked what they had learned about oral communication in this project. In addition, adding text (multimodality) to the conversation was seen as beneficial for oral interactions. Finally, the coordinators wanted to find out whether the additional Skype recording was acceptable to the students, so a further question was added to the survey in 2015. The Skype recording was seen by 54.8% (n = 73) as helping them write the minutes of their virtual meetings, 42.3% used it to check their pronunciation, and 27.4% said it helped them realise the grammar mistakes they made. Less than a quarter (23.3%) did not enjoy using the recordings.
Intercultural Awareness

Part B of the questionnaire measures the students' perceptions of intercultural awareness, that is, their understanding of the differences between themselves and people from other countries or backgrounds, especially in attitudes and values. Nearly half the students (49.3%) experienced a difference in intercultural communication with other members of the project, while the others (50.7%) did not. Of those students who perceived intercultural differences (n = 39), the main concerns arose during oral interaction (64.1%), and 98.4% tried to solve these challenges. Strategies for resolving intercultural communication differences were applied more frequently in oral interaction (61.6%) than in written interaction (46.6%), which coincides with the statements in Part A on language awareness that the main concerns were oral in nature. Some of the cultural differences that were mentioned are different attitudes, commitment levels, work ethics, motivation and the use of polite language. The final question on intercultural awareness asked about the strategies the students adopted in order to solve their intercultural challenges. These included trying to state their point of view, trying to be polite and being informed about other cultural values and habits.
Collaboration

The section on collaboration is divided into three parts. First, it asks about students' perceptions of their own participation and of the group collaboration dynamics. Second, the frequency and duration of the meetings are documented. Finally, the preferred collaboration tools for oral and written interactions are identified.

Survey Question: How do you feel about your group collaboration dynamics?

Students were asked if they felt satisfied with their own level of participation and with the dynamics of collaboration in their teams. Nearly all students (94.4%) felt that they were active in their international team. The next question (see Table 3) shows that the students felt differently about their team's collaboration dynamics; satisfaction levels varied widely. On a scale from 1 to 5, the most interesting figures are those from Italy, as these students were the most satisfied (55.2%) and at the same time had the highest percentage of least satisfied participants (6.9%). The German students felt differently about the collaboration dynamics and were consistently less satisfied than their counterparts in the other countries. Some of the comments reflect the statistics in Table 3.
Table 3. Results for the question "How do you feel about your group collaboration dynamics?"

Country | n | 1 (%) | 2 (%) | 3 (%) | 4 (%) | 5 (%) | M | SD
Germany | 26 | 26.9 | 34.6 | 30.8 | 7.7 | 0 | 2.19 | 0.94
Italy | 29 | 55.2 | 6.9 | 20.7 | 10.3 | 6.9 | 2.07 | 1.36
Portugal | 17 | 23.5 | 52.9 | 17.6 | 0 | 5.9 | 2.12 | 0.99
Total | 72 | 37.5 | 27.8 | 23.6 | 6.9 | 4.2 | 2.13 | 1.13

Note. The scale ranges from 1 = totally satisfied to 5 = unsatisfied.
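The M and SD values in Table 3 can be approximately reproduced from the reported percentage distributions, assuming they are the sample mean and sample standard deviation of the individual 1-5 ratings (an assumption; the chapter does not state the exact formula used). A minimal sketch using Germany's row as a worked example:

```python
from statistics import mean, stdev

# Germany's row in Table 3: n = 26 respondents and the percentage choosing
# each scale point (1 = totally satisfied ... 5 = unsatisfied).
n = 26
distribution = {1: 26.9, 2: 34.6, 3: 30.8, 4: 7.7, 5: 0.0}

# Convert the percentages back into approximate response counts and expand
# them into the individual ratings they represent.
ratings = []
for scale_point, pct in distribution.items():
    ratings.extend([scale_point] * round(pct * n / 100))

print(len(ratings))              # 26 respondents recovered
print(round(mean(ratings), 2))   # 2.19 -> M for Germany in Table 3
print(round(stdev(ratings), 2))  # 0.94 -> SD for Germany in Table 3
```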
Fourteen comments stated that the collaboration was good to excellent. One Italian student commented that "my fellow students were supportive, cooperative, suggested good ideas. A positive attitude towards each other allowed us a smooth team working." Other comments included "Everyone did their duty", "Everyone was involved. We worked in a team." and "I find German students very active and less the Portuguese students". Several students in Germany commented that the workload was not evenly divided among the team members and that not everyone was present at the virtual meetings: "Two team members did 95% of the work which I find very unfair."
Virtual Meetings

The project had three virtual meetings scheduled as individual tasks; students could, of course, add additional meetings. Since the teams in each country had two or three members, it was only required that at least one member from each country attend each virtual meeting. Most students (63.9%) attended three virtual meetings, and another 15.3% had more than three meetings online. Only 5.6% attended just one meeting, and 15.3% attended two. Only a quarter of the meetings lasted less than an hour; 43% lasted about one hour, and about 29% lasted between one and two hours. The virtual meetings were significantly longer in 2014, when the majority (55.6%, n = 63) took one to two hours.

Survey Question: What technologies did you prefer to use with your international team? (more than one answer possible)

The authors also wanted to find out what tools (technologies) the students preferred to use to collaborate. Of the 73 respondents, Skype (87.7%) and Facebook (65.8%) were the preferred online technologies for oral communication in their international teams. For written communication the Facebook group (90.4%) was preferred to email (6.8%) or the LMS (LEA platform). In 2014, email had played a more prominent role, with 50.8% of the respondents naming it as a preferred tool. Facebook was seen as very convenient, fast, easy to use and accessible to everyone. Some students preferred Skype as they received an immediate response from the synchronous tool rather than waiting for a response on an asynchronous tool such as Facebook.

I prefer Facebook Group because everybody has a smartphone and you are available all the time. Everyone looks almost daily on Facebook so it is more practical than e.g. LEA.
Figure 6. Results for the question “What technologies did you prefer to use with your international team? (more than one answer possible)”
Skype, because nearly everyone has it. The same applies to Facebook.

Another technology mentioned by the German students was WhatsApp, as this mobile communication tool was seen as easier and faster than Facebook. In 2014, a number of students (50.8%, n = 63) preferred to use email, alongside Facebook (74.6%), but in 2015 only 6.8% (n = 73) of the students were interested in using email for written communication in their international teams.
Students' Perceptions and Recommendations

The last part of the questionnaire asks about the perceptions students had of the project and the recommendations they would give to future participants and to the project coordinators. All students (100%) felt that the project was useful for training collaboration in international teams. The students overwhelmingly enjoyed the project (91.8%), 93.8% would recommend it to fellow students, and they perceived the project timetable as manageable (93.8%). Students felt the need to be better prepared for the virtual meetings on Skype (their own preparation or technical knowledge). They recommended holding more Skype meetings than required and planning, organizing and being patient if things do not run smoothly. Being open-minded and active in the project was also offered as advice. The Italian students did not have weekly scheduled lectures with their professor, so they were more dependent on the student guidelines; consequently, they felt that the support before and during the project was not as satisfactory and requested more scaffolding during the project. The German and Portuguese students had weekly classes and face-to-face support during the entire project. A Portuguese student recommended that students run their own simulation of using Skype and writing minutes before the project begins, to help them master these oral and written tasks.
Perceptions such as "oral meetings improved my English skills in communication. The written tasks improved my vocabulary." and "awareness of my level of English and motivation to improve" were mentioned as the most useful aspects of the learning process. A few German students felt the workload of the project was too high, as they still had to complete the regular tasks of their Business English module. In addition, some students asked the coordinators to make sure that all students take the project work seriously. Some German students also suggested adding other countries, in Africa or the US, as they felt Europeans were not that culturally different.
Facebook Conversations – Observations in DELF

A total of 10 Facebook groups were created at the beginning of the project in 2015, based on the students' feedback and on the coordinators' wish to be able to analyse the interactions that took place on Facebook. In 2014 the idea of offering a social media platform as part of the project had initially been rejected, as it is not a purely business-related communication tool and is used mainly for social and private interactions; the focus then was more on BELF, looking at teaching competencies in business. In 2015, however, it was decided to gain insight into the authentic oral and written interactions the students had online using English as a Lingua Franca, focusing more on digital tools in general and observing digital interactions in ELF, which we have named DELF (Digital English as a Lingua Franca) interactions. It was therefore necessary to expand the initial pedagogical approach of exclusively offering opportunities for real-life business scenarios, and to focus on how the students communicated collaboratively on social software they were already accustomed to using in their private communication. During the first project it had been difficult to observe the communication interactions online, as the coordinators had no access to the participants' Facebook groups. The original idea had been for students to use the forum functions on LEA, but the students had left the project LMS platform to communicate via Facebook. Hence, in 2015 all Facebook groups were created and offered as part of the project. The students were not required to use these groups, and none of the EDP tasks was related to a Facebook activity. One e-tutor had access to each team's Facebook group during the project but did not interact with the group. Afterwards, the conversations and posts were analysed to see who initiated posts and what types of interactions occurred. In most groups the first posts dealt with exchanging Skype names and finalizing the first meeting dates. In some groups members posted comments and icons to build rapport and discussed issues related to the project or personal matters. At the same time, there were several differences in task completion that challenged the teams' collaboration efforts; two such conflict situations are shown in Figures 7 and 8. Finding a suitable time for the virtual meetings was also an issue in several groups and posed a potential for conflict when members were not willing to compromise on a date to meet on Skype. Some groups used their Facebook group more intensively than others, as seen in the graphs that follow. The frequency of posts in each team was counted to determine the average participation: we counted all the interactions in each Facebook group by country and per person, where an entry was either a written response, an initiated post or a posted icon. All three countries were least active in three different teams (see Appendix, Tables A3 and A4 for more data).
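The averaging described above (entries per team member, and each country's share of those per-member averages, as reported in Tables A3 and A4 and plotted in Figure 9) can be sketched in a few lines. Team 1's raw counts from Table A3 serve as a worked example; the same calculation applies to the spoken-word data in Tables A5 and A6:

```python
# Raw Facebook entries and group sizes for Team 1 (Table A3).
entries = {"Germany": 22, "Italy": 30, "Portugal": 19}
members = {"Germany": 2, "Italy": 3, "Portugal": 2}

# Average entries per team member for each country.
per_member = {c: entries[c] / members[c] for c in entries}

# Each country's share of the per-member averages (the values reported
# in Table A4 and shown in Figure 9).
total = sum(per_member.values())
shares = {c: 100 * v / total for c, v in per_member.items()}

for country in entries:
    print(f"{country}: {per_member[country]:.2f} entries per member, "
          f"{shares[country]:.0f}% share")
# Germany: 11.00 entries per member, 36% share
# Italy: 10.00 entries per member, 33% share
# Portugal: 9.50 entries per member, 31% share
```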
Figure 7. Student Facebook conversation about a challenge with completing a task
Figure 8. Student Facebook conversation about unethical collaboration within the team
Skype Conversations – Observations

Each team met on Skype three times to discuss the individual tasks. In 2015 the Italian members of each team transcribed at least one of these three virtual meetings. The question we raise is whether there was a significant difference between the countries in the number of words spoken. The pie chart in Figure 10 illustrates that the differences in the average number of words each person spoke per meeting across the three countries were not significant. The number of members from each country varied from two to four persons, and therefore the data do not show an average per meeting but an average per person in each country.
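Counting spoken words per person and per country from the transcripts can be done with a short script. This is only a sketch: the line format "Speaker (Country): utterance" is assumed here, since the chapter does not describe the transcription conventions the Italian students used:

```python
from collections import Counter

words_per_country = Counter()
speakers_per_country = {}

# Assumed transcript line format: "Anna (Italy): so, shall we start?"
with open("team01_meeting2_transcript.txt", encoding="utf-8") as f:
    for line in f:
        if ":" not in line:
            continue  # skip blank lines or stage directions
        header, utterance = line.split(":", 1)
        if "(" not in header:
            continue  # not a speaker turn in the assumed format
        speaker, country = header.rstrip(") ").split("(", 1)
        speaker, country = speaker.strip(), country.strip()
        n_words = len(utterance.split())
        words_per_country[country] += n_words
        speakers_per_country.setdefault(country, set()).add(speaker)

for country, total in sorted(words_per_country.items()):
    per_person = total / len(speakers_per_country[country])
    print(f"{country}: {total} words in total, {per_person:.1f} per person")
```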
Figure 9. Average participation per country per team on Facebook as a share of average posts and comments
Figure 10. Average words spoken per person per Skype meeting
There was no required length of time for each of the three virtual meetings. When looking at the team contributions and the details of who spoke the most words, one can see a much more detailed picture of the group dynamics (see Figure 11). This bar graph shows the group dynamics and the percentage of participation based on words spoken in each transcribed recording. As can be seen, the percentages for each country vary significantly within each virtual team. For example, the Italians in team 2 spoke 85% of the entire meeting, whereas in team 5 they spoke 11%. Only team 10 had a similar contribution from each country, with no dominant country speaking the most. In three teams the Portuguese contributed less than 10%, i.e. the lowest level of participation, as seen in team 1 (7%), team 2 (5%) and team 9 (5%); however, in team 5 they contributed 69% of the meeting.
Figure 11. Average participation per country per team meeting as a share of average spoken words
The dynamics varied not only between the teams: the members' contributions (spoken words) within each country were also quite different, as seen in the transcribed meetings (see the Appendix, Tables A5 and A6, for more data).
Discussion of the Results

The findings suggest that the collaborative and challenging EDP tasks, by allowing students to experience a multicultural work environment, fostered individual responsibility and motivation for learning about other cultures and about their own language awareness. Data analysis showed that students' autonomy was challenged, since students played an active role in exploring and using ELF to solve the proposed tasks. Moreover, they had to make choices about materials and tools and also to negotiate meaning and make collaborative decisions (Lamb, 2008). It was also found that online discourse does not work properly, and sometimes breaks down, when team members fail to work together on their assignments, either because students have low commitment or different expectations about the project (Thorne, 2003). It should also be mentioned that ten weeks may be insufficient for those new to collaborating online to develop both the skills and the confidence to participate fully in collaborative learning. The data on students' perceptions show that the few cases of unsuccessful communication were often commented upon; yet, looking at the ratio of spoken words per minute, the students' performances were overall very similar, and the data did not change when a particular country chaired a meeting. In general, students' sensitivity to the project and its tasks made them respond adequately by doing what they were asked in order to complete their group project, resulting in a positive learning experience for all of the students, including those who did not contribute much.
Students' skill level in using digital technology can both enhance and hinder a positive collaborative dynamic among team members; when the technology is not working properly, problems may arise. Accordingly, when the students realized that they were having problems understanding each other's accents, they decided to add a chat channel to the tools they were already using. In doing so they made use of the multimodal affordances of ICTs, a very important aspect of CMC. This may be a good reason to place emphasis on the development of DELF, promoting competences that ensure the effectiveness of collaborative learning experiences. Students' active involvement in solving authentic tasks, the encouragement of cultural awareness, and the development of individual and collaborative (meta)competencies all attest to the importance of online collaboration in the development of global communication skills. It also adds another component to English as a Lingua Franca, namely the need to teach digital competencies and communication strategies when working online with speakers worldwide. This project represents an innovative and very positive contribution to ELT in a globalised and digital world (DELF), legitimising the role of technologies in enhancing self-directed language learning.
CONCLUSION

The aim of the EDP was to encourage autonomous communication in English between students from four European countries, using English as a Lingua Franca. In order to make this possible, provisions were made to create a learning environment characterised by:
• An active use and exploration of the target language to accomplish desired ends (the students had to choose one of the available topics; join the Facebook group created for their team and agree on a date for the first virtual meeting; do some research on the topic they had chosen, with regard to the situation in their own country; and write minutes of the meetings)
• Direct contact with the target language through interaction with a wide variety of media, materials and online platforms (Skype, Facebook, Adobe Connect), so that choice becomes a key element of learner control
• Freedom in deciding how to carry out their surveys
• Learner management of choices and assessment, as learners were free to determine their own pace and make decisions based on personal needs, learning style and interests
Undoubtedly, as the feedback of the students has shown, a learning environment like the one provided by the EDP, characterised by learner-centredness and democratic rights, can enable students to hone their skills in preparation for their future professional life. In fact, by negotiating meaning, collaborating, solving problems and improving their understanding of intercultural communication, they are already playing a role as active stakeholders in today's globalised world. This is shown, for instance, by the fact that most students stated that "oral meetings improved [their] English skills in communication. The written tasks improved [their] vocabulary." Moreover, most of them also openly declared that the project had helped them to raise their awareness of their level of English and of the differences which may arise when interacting in intercultural groups, as well as their motivation to improve. Obviously, learners who have never experienced 'learning without a teacher' before may feel overwhelmed by the autonomy and self-direction that is expected of them in an intercultural
technology-mediated collaboration, and may therefore be reluctant to participate or may adopt a 'wait and see' strategy, waiting for their peers to get involved before committing. However, by adopting a positive attitude towards communicative misunderstandings and errors, breaking down tasks into manageable chunks and showing some flexibility with deadlines, such fears may be overcome, and learners will gain in confidence and in motivation to improve their language skills. In other words, rather than adopting a 'swim-or-drown' strategy, it is advisable for teachers to pave the way for their students' autonomous global interactions by providing the right amount of scaffolding, both in terms of information and in terms of awareness of the different issues involved.
Limitations of the Study

The study focuses mainly on the perceptions of the students. It briefly considers the oral interactions on Skype and the Facebook threads by analysing the spoken or written words per student, but not how the students collaborated (compromised, adjusted, negotiated) in detail. It would be advisable to gain a better understanding of the explicit linguistic and intercultural adjustments students made in their spoken and written discourse. In addition, it would be important to identify and classify in detail the oral and written ELF interactions in online communication, to analyse the challenges in their online ELF, i.e. DELF, interactions, and to see how digital communication differs from face-to-face ELF interactions.
REFERENCES

Abrams, Z. I. (2002). Surfing to cross-cultural awareness: Using internet-mediated projects to explore cultural stereotypes. Foreign Language Annals, 35(2), 141–160. doi:10.1111/j.1944-9720.2002.tb03151.x

Baker, W. (2015). Culture and complexity through English as a lingua franca: Rethinking competences and pedagogy in ELT. Journal of English as a Lingua Franca, 4(1), 9–30. doi:10.1515/jelf-2015-0005

Belz, J. (2003). Linguistic perspectives on the development of intercultural competence in telecollaboration. Language Learning & Technology, 7(2), 68–99.

Bruner, J. (1966). Toward a theory of instruction. Cambridge, MA: Harvard University Press.

Byrne, D. (1976). Teaching Oral English. London: Longman.

Chun, D. M., & Wade, E. R. (2004). Collaborative cultural exchanges with CMC. In L. Lomicka & J. Cooke-Plagwitz (Eds.), Teaching with technology (pp. 220–247). Boston, MA: Heinle.

Connell, T. (2002). Languages and Employability: A Question of Careers. London: National Centre for Languages.

Dudeney, G., Hockly, N., & Pegrum, M. (2013). Digital Literacies. London: Routledge.

EF. (2010). Work programme on the follow-up of the objectives of education and training systems in Europe. Retrieved from http://eur-lex.europa.eu/legal-content/EN/TXT/?uri=URISERV%3Ac11086
European Commission. (2001). Making a European Area of Lifelong Learning a Reality. Retrieved from http://eur-lex.europa.eu/LexUriServ/LexUriServ.do?uri=COM:2001:0678:FIN:EN:PDF

European Council. (2000). Lisbon European Council 23 and 24 March 2000, Presidency Conclusions. Brussels: EU.

European Parliament and the Council of the European Union. (2006). Recommendation 2006/962/EC of the European Parliament and of the Council of 18 December 2006 on key competences for lifelong learning. Retrieved from http://europa.eu/legislation_summaries/education_training_youth/lifelong_learning/c11090_en.htm

Friedman, T. L. (2005). The World is Flat. London: Penguin.

Furstenberg, G., Levet, S., English, K., & Maillet, K. (2001). Giving a virtual voice to the silent language of culture: The CULTURA project. Language Learning & Technology, 5(1), 55–102. Retrieved from llt.msu.edu/vol5num1/furstenberg/default.html

Giddens, A. (1990). The Consequences of Modernity. Stanford, CA: Stanford University Press.

Griffin, P., & Care, E. (2015). Assessment and teaching of 21st century skills. In P. Griffin & E. Care (Eds.), THE ATC21S Method (pp. 3–33). Dordrecht: Springer. doi:10.1007/978-94-017-9395-7

Held, D., McGrew, A. G., Goldblatt, D., & Perraton, J. (1999). Global Transformations. Politics, Economics and Culture. Cambridge: Polity Press.

Jenkins, J., Cogo, A., & Dewey, M. (2011). Review of developments in research into English as a lingua franca. Language Teaching, 44(3), 281–315. doi:10.1017/S0261444811000115

Kachru, B., & Nelson, C. (2001). World Englishes. In A. Burns & C. Coffin (Eds.), Analysing English in a Global Context (pp. 9–25). London: Routledge.

Kankaanranta, A., & Louhiala-Salminen, L. (2013). "What language does global business speak?" – The concept and development of BELF. Ibérica, 26, 17–33.

Kankaanranta, A., & Planken, B. (2010). BELF Competence as Business Knowledge of Internationally Operating Business Professional. Journal of Business Communication, 47(4), 380–407. doi:10.1177/0021943610377301

Korzenny, F. (1978). A theory of electronic propinquity: Mediated Communication in organizations. Communication Research, 5(1), 3–23. doi:10.1177/009365027800500101

Kramsch, C., & Thorne, S. (2002). Foreign language learning as global communicative practice. In D. Block & D. Cameron (Eds.), Language learning and teaching in the age of globalization (pp. 83–100). London: Routledge.

Kress, G., Jewitt, C., Ogborne, J., & Tsatsarelis, C. (2001). Multimodal teaching and learning: The rhetorics of the science classroom. London and New York, NY: Continuum.

Lamb, T. (2008). Learner autonomy and teacher autonomy: Synthesising an agenda. In T. Lamb & H. Reinders (Eds.), Learner and Teacher autonomy (pp. 269–284). Amsterdam: John Benjamins Publishing Company. doi:10.1075/aals.1.21lam

Levy, M., & Hubbard, P. (2005). Why call CALL CALL? Computer Assisted Language Learning, 18(3), 143–149. doi:10.1080/09588220500208884
Little, D. (1991). Learner Autonomy. 1: Definitions, Issues and Problems. Dublin: Authentik.

Louhiala-Salminen, L., & Kankaanranta, A. (2011). Professional Communication in a Global Business Context: The Notion of Global Communicative Competence. IEEE Transactions on Professional Communication, 54(3), 244–262. doi:10.1109/TPC.2011.2161844

Martin, A. (2005). DigEuLit – a European Framework for Digital Literacy: a Progress Report. Journal of eLiteracy, 2, 130–136.

Mason, R., & Rennie, F. (2008). E-Learning and Social Networking Handbook. Abingdon: Routledge.

O'Dowd, R., & Ritter, M. (2006). Understanding and Working with 'Failed Communication' in Telecollaborative Exchanges. CALICO, 23(3), 623–642.

Oxford, R. (2003). Language learning styles and strategies: Concepts and relationships. International Review of Applied Linguistics in Language Teaching Journal, 41(4), 271–278. Retrieved from http://web.ntpu.edu.tw/~language/workshop/read2.pdf

Ponterotto, D. (2005). Intercultural Communication: Searching for Categories in Conversation Analysis. In M. Bondi & N. Maxwell (Eds.), Cross-Cultural Encounters: Linguistic Perspectives (pp. 253–265). Roma: Officina Edizioni.

Reinders, H., & Darasawang, P. (2012). Diversity in language support. In G. Stockwell (Ed.), Computer-assisted language learning: Diversity in research and practice. Cambridge: Cambridge University Press. doi:10.1017/CBO9781139060981.004

Seidlhofer, B. (2008). Standard future or half-baked quackery: descriptive and pedagogic bearings on the globalisation of English. In C. Gnutzmann & F. Inteman (Eds.), The globalisation of English and the English language classroom (2nd ed., pp. 159–173). Tübingen: Gunter Narr Verlag.

Seidlhofer, B. (2011). Understanding English as a Lingua Franca. Oxford: Oxford University Press.

Shaffer, D. (2008). Education in the digital age. The Nordic Journal of Digital Literacy, 4(1), 39–51.

Thomas, M., & Reinders, H. (2010). Task-Based Language Teaching and Technology. New York, NY: Continuum.

Thomas, M., Reinders, H., & Warschauer, M. (2013). Contemporary Computer-Assisted Language Learning. London: Bloomsbury Publishing Plc.

Thorne, S. L. (2003). Artifacts and cultures-of-use in intercultural communication. Language Learning & Technology, 7(2), 38–67. Retrieved from llt.msu.edu/vol7num2/pdf/thorne.pdf

Ting-Toomey, S. (1999). Communicating Across Cultures. New York, NY: The Guilford Press.

Tudor, I. (2005). Higher Education language policy in Europe: A snapshot of action and trends [Discussion brief]. Retrieved from http://web.fu-berlin.de/enlu/

UNESCO. (1998). Higher Education in the Twenty-first Century Vision and Action Final Report. Retrieved from unesdoc.unesco.org/images/0011/001163/116345e.pdf
Vogel, T. (2001). Internationalization, Interculturality and the Role of Foreign Languages in Higher Education. Higher Education in Europe, 26(3), 381–389. Retrieved from http://elearning.surf.nl/elearning/english/3793

Warschauer, M., & Healey, D. (1998). Computers and language learning: An overview. Language Teaching, 31, 57–71. Retrieved from http://www.gse.uci.edu/person/warschauer_m/overview.html

Wenger, E. (1998). Communities of Practice. Learning, Meaning and Identity. New York, NY: Cambridge University Press. doi:10.1017/CBO9780511803932
APPENDIX

Student Questionnaire EDP 2015

Table A1. EDP Student Survey Topics 2014

Teams | Survey Topics | General Description
Team 1 | Gender Issues in Europe | Women's role in the workforce/country comparison, women in senior level positions.
Team 2 | Higher Education in Europe | Value of higher education/country comparison. How important is it in your country to go to university? Can you get a better or higher paid job if you attend undergraduate or postgraduate studies? What is the proportion of the population in your country that holds a university degree today and 20-30 years ago?
Team 3 | Recruitment within Europe | Working in Europe (brainpower vs. braindrain, willingness to relocate to another country, value of skilled workers etc.)
Team 4 | Data Protection & Data Privacy in Europe | Using social networks for employment and for private use, attitudes to privacy and data protection in society. Value of personal data and who has access to it.
Team 5 | Happiness in Europe | How do you define happiness, comparison of the countries and the idea of being happy with family, education, career and professional work opportunities.
Team 6 | Ethical and Environmental Issues in Europe | Ethical consumption: environmentally friendly policies, sustainable development and organic production etc.
Team 7 | Role Models in Europe | Who are the main local and international role models in your country and why are they role models? What values do they stand for?
Team 8 | National Values in Europe | Family, friends, sport, wealth, careers, health, the environment, etc., other values
Team 9 | Cultural Diversity in Europe | Keeping culturally distinct peoples while maintaining a healthy balance between international and national rights and respect for national values (e.g. debate about using a veil or other religious, ethical or cultural issues in a united Europe).
Team 10 | Career Opportunities in Europe | Men vs. women, young vs. old, skilled vs. unskilled, citizens vs. immigrants.
Table A2. EDP Student Survey Topics 2015

Teams | Survey Topics | General Description
Team 1 | Recycling and reuse | E-waste, consumer behavior, awareness of environmental impact, how waste is avoided
Team 2 | Organic consumption | Fair trade, fair wage, ethical sourcing, responsible and environmental conditions for production
Team 3 | Ethical consumption | Child labour, ethical products, such as fair trade-certified coffee and chocolate, fair labor-certified garments, cosmetics produced without animal testing
Team 4 | Social equality | Impact and tens of social benefits, smaller vs. larger enterprises
Team 5 | Corruption in business and government | Money laundering, human rights issues, bribery practices by giving or offering favours to influence the action of others, misuse of power in public and corporate offices for personal gain
Team 6 | Companies' responsibilities to their local communities and society | How do companies look beyond how to make the most money and commit to building a better society?
Team 7 | The Impact of Globalization (positive aspects) | New technologies, social networking, transcending traditional political, cultural and economic boundaries. Positive developments
Team 8 | The Impact of Globalization (negative aspects) | Wealth, poverty, and equality, organized crime, terrorism, pollution, deforestation, overfishing, climate change, water scarcity. How does globalization affect local communities? Negative impact of multinational corporations on local industry and communities.
Team 9 | Responsible tourism | Extra fees to help conserve natural heritage and biodiversity; respect the socio-cultural authenticity; make optimal use of environmental resources, etc.
Team 10 | What lies ahead? The world in 50 years | Look at the economic, social and environmental developments of the future. Will the developments be positive or negative in respect of the earth's ecosystem? Can the world provide a better quality of life; are the moral and legal obligations of companies, governments and civil society organizations better than they are now?
Table A3. EDP 2015: Facebook Group entries

Team | Members: Germany | Members: Italy | Members: Portugal | Entries: Germany | Entries: Italy | Entries: Portugal
Team 1 | 2 | 3 | 2 | 22 | 30 | 19
Team 2 | 2 | 3 | 3 | 46 | 69 | 46
Team 3 | 3 | 4 | 3 | 20 | 19 | 19
Team 4 | 3 | 2 | 2 | 22 | 48 | 29
Team 5 | 2 | 4 | 3 | 21 | 32 | 32
Team 6 | 3 | 3 | 3 | 9 | 20 | 8
Team 7 | 3 | 3 | 2 | 12 | 45 | 32
Team 8 | 3 | 4 | 2 | 34 | 36 | 31
Team 9 | 2 | 2 | 3 | 12 | 53 | 20
Team 10 | 2 | 2 | 2 | 108 | 75 | 29
Sum | 25 | 30 | 25 | 306 | 427 | 265
Average | 2.5 | 3 | 2.5 | 30.6 | 42.7 | 26.5

Note. Entries are posts or comments. Three German team members did not have a Facebook account and thus were not included in the statistics.
Table A4. EDP 2015: Average Facebook Group entries

Team | Avg. entries per team member: Germany | Avg. entries per team member: Italy | Avg. entries per team member: Portugal | Avg. entries per country: Germany | Avg. entries per country: Italy | Avg. entries per country: Portugal
Team 1 | 11 | 10 | 9.50 | 36% | 33% | 31%
Team 2 | 23 | 23 | 15.33 | 38% | 38% | 25%
Team 3 | 6.67 | 4.75 | 6.33 | 38% | 27% | 36%
Team 4 | 7.33 | 24 | 14.50 | 16% | 52% | 32%
Team 5 | 10.50 | 8 | 10.67 | 36% | 27% | 37%
Team 6 | 3 | 6.67 | 2.67 | 24% | 54% | 22%
Team 7 | 4 | 15 | 16 | 11% | 43% | 46%
Team 8 | 11.33 | 9 | 15.50 | 32% | 25% | 43%
Team 9 | 6 | 26.50 | 6.67 | 15% | 68% | 17%
Team 10 | 54 | 37.50 | 14.50 | 51% | 35% | 14%
Sum | 136.83 | 164.42 | 111.67 | | |
Average | 13.68 | 16.42 | 11.17 | 33% | 40% | 27%

Note. Entries are posts or comments. Three German team members did not have a Facebook account and thus were not included in the statistics.
Table A5. EDP 2015: Spoken words during virtual meetings (Skype transcripts)

Team | Attendees: Germany | Attendees: Italy | Attendees: Portugal | Spoken words: Germany | Spoken words: Italy | Spoken words: Portugal
Team 1 | 2 | 3 | 2 | 583 | 923 | 89
Team 2 | 3 | 3 | 3 | 241 | 2027 | 129
Team 3 | 3 | 4 | 3 | 956 | 1050 | 442
Team 4 | 2 | 2 | 2 | 528 | 1239 | 769
Team 5 | 2 | 2 | 1 | 739 | 376 | 1231
Team 6 | 2 | 3 | 3 | 2150 | 1339 | 665
Team 7 | 3 | 3 | 2 | 2953 | 3214 | 2331
Team 8 | 3 | 4 | 2 | 652 | 1591 | 1576
Team 9 | 2 | 2 | 3 | 1193 | 1020 | 160
Team 10 | 2 | 2 | 2 | 1994 | 2716 | 2481
Sum | 24 | 28 | 23 | 11989 | 15495 | 9873
Average | 2.4 | 2.8 | 2.3 | 1198.9 | 1549.5 | 987.3
Table A6. EDP 2015: Average spoken words during virtual meetings (Skype transcripts)

Team | Avg. spoken words per team member: Germany | Avg. spoken words per team member: Italy | Avg. spoken words per team member: Portugal | Avg. spoken words per country: Germany | Avg. spoken words per country: Italy | Avg. spoken words per country: Portugal
Team 1 | 291.50 | 307.67 | 44.50 | 45% | 48% | 7%
Team 2 | 80.33 | 675.67 | 43 | 10% | 85% | 5%
Team 3 | 318.67 | 262.50 | 147.33 | 44% | 36% | 20%
Team 4 | 264 | 619.50 | 384.50 | 21% | 49% | 30%
Team 5 | 369.50 | 188 | 1,231 | 21% | 11% | 69%
Team 6 | 1,075 | 446.33 | 221.67 | 62% | 26% | 13%
Team 7 | 984.33 | 1,071.33 | 1,165.50 | 31% | 33% | 36%
Team 8 | 217.33 | 397.75 | 788 | 15% | 28% | 56%
Team 9 | 596.50 | 510 | 53.33 | 51% | 44% | 5%
Team 10 | 997 | 1,358 | 1,240.50 | 28% | 38% | 35%
Sum | 5194.2 | 5836.8 | 5319.3 | | |
Average | 519.42 | 583.68 | 531.93 | 32% | 36% | 33%
Figure A1. Student Questionnaire EDP 2015, page 3
Chapter 18
Evaluation Methods for E-Learning Applications in Terms of User Satisfaction and Interface Usability

Nouzha Harrati, University of Souk Ahras, Algeria
Zohra Mahfouf, University of Souk Ahras, Algeria
Imed Bouchrika, University of Souk Ahras, Algeria
Ammar Ladjailia, University of Souk Ahras, Algeria
ABSTRACT

The use of online technology has become ubiquitous and an integral part of our daily life, from education to entertainment. Because of the ubiquity of e-learning and its vital influence on the educational process, it is no surprise that many research studies are conducted to explore different aspects of the use of e-learning in higher education. The assessment and evaluation aspects are arguably the most influential for measuring the success and effectiveness of the e-learning experience. As more and more universities worldwide have opted to use online technology for their course delivery, research into e-learning systems has attracted considerable interest in order to understand how effective and usable e-learning systems are in terms of principles related to human-computer interaction.
INTRODUCTION

In a modern society, the use of online technology has become ubiquitous and an integral part of our daily life, from education to entertainment. This is mainly due to the proliferation of computers and smart devices combined with the availability and affordability of internet connectivity in most places. In fact, digital networks and modern communication have greatly transformed and reshaped the way we live and work in such a contemporary era, yielding a tremendous effect on the necessity and opportunity to learn (Garrison, 2011). Although there are advocates in the academic community who prefer traditional teaching methods involving face-to-face communication, considerable efforts are being devoted
to promoting e-learning and the use of new technology for course delivery and teaching. The learning paradigm is shifting from lecturer-centered to student-centered, as learning can be undertaken anywhere, from classrooms to homes. In fact, several scholars have described the growth rate of e-learning as unprecedented and explosive, as the adoption of e-learning has gone beyond academic institutions to be considered seriously in corporate companies and public administrations as part of their employee training programs. Because of the ubiquity of e-learning and its vital influence on the educational process, it is no surprise that many research studies are conducted to explore different aspects of the use of e-learning in higher education, including, for instance, learning models, software interactivity and human behaviors. The assessment and evaluation aspects are arguably the most influential for measuring the success and effectiveness of the e-learning experience (Anderson, 2008). Evaluation of e-learning goes beyond assessing learner performance: evaluating the delivery procedure is just as critical to understanding and harvesting a meaningful and fruitful learning experience (Granić, 2008; Harrati, Bouchrika, Tari, & Ladjailia, 2016). In fact, considerable criticism regarding the quality of existing e-learning systems is cited in a number of studies (Chua & Dyson, 2004), in addition to further issues including low performance, poor usability and limited customizability. Furthermore, online education has been criticized for not supporting student-centred learning but instead replicating the traditional face-to-face teaching paradigm.

Regarding the definition of e-learning, although the term can be simply explained as an educational software system that allows a user to learn anywhere and at any time, an agreed definition is still elusive among scholars (Moore, Dickson-Deane, & Galyen, 2011). The term e-learning starts with the letter e, which conventionally stands for electronic, in the same way as e-mail. The term "online learning" is occasionally used synonymously with e-learning, in which case the learning process takes place away from formal classrooms and is facilitated by the use of internet-based technologies. The terms e-learning and online learning can vaguely overlap with other terms such as distance learning, which is often associated with older technologies (Moore et al., 2011; Pachler & Daly, 2011). Horton (2011) defined e-learning as the practice of using information and communication technology (ICT) to simulate a learning experience that can be created, organized and managed with enough freedom, decoupled from any temporal or geographical boundaries. Triacca et al. (Triacca, Bolchini, Botturi, & Inversini, 2004) argued that a certain level of interactivity needs to be included to render the definition applicable for describing the learning experience. Pachler and Daly believe that the primary aspect in the debate over the elusive definition of the term is which specific pedagogical model needs to be designed and integrated within the use of digital and online technology. They further stressed that e-learning is no longer about distance or remote learning, but forms part of a modern paradigm and a conscious choice in education for the best and most appropriate ways of promoting effective teaching.
The Joint Information Systems Committee (JISC), an influential organization within the United Kingdom supporting higher education institutions in the implementation and adoption of new technologies, referred to e-learning as "enhanced learning", with the definition of "learning facilitated and supported through the use of information and communications technology". Blended learning is another frequently used term; it refers to a teaching process in which computer-based learning is integrated in tandem with face-to-face classical teaching activities (Garrison & Kanuka, 2004). This is known as a hybrid form of e-learning in which online technologies are employed to enhance or supplement traditional teaching (Garrison, 2011). The flipped classroom is a pedagogical form of blended learning in which the typical lecture and homework of a course are reversed. Lectures are viewed
by students at home via distance education before the class session, while in-class time is dedicated to exercises and discussions. In addition to the use of e-learning in academic institutions for acquiring knowledge, one of the other important goals of e-learning is to develop professional skills and understanding in the corporate world to help employees accomplish their career objectives (Colvin Clark & Mayer, 2008). In the commercial sphere, e-learning is synonymous with the terms Computer-Based Training (CBT) and Web-Based Training (WBT), both of which refer to the delivery of training courses and materials through the use of computers or web technology. Within the university context, meanwhile, the term tends to denote a mode of study in which physical presence inside a classroom is not required. Semantically, it is vital to understand and differentiate between the terms learning and training, as they are inextricably linked and have common aspects within the educational process. Training is the act of giving instructions, knowledge or information through voice, written words or other communicative methods of demonstration in a fashion that instructs the trainee. Learning, meanwhile, refers to the process of absorbing that information in order to enrich and increase the skills and abilities that can make use of it in various contexts (Garrison, 2011).
Importance and Benefits

Because of the vital importance of online technology as a medium for distance or virtual education, corporations and schools are investing substantial amounts of money, time and resources in developing alternatives to traditional methods of education and training. On the corporate side, employees ought to be kept up to date with the latest information and knowledge in a very competitive business world. Various companies have kept pace in adopting e-learning solutions for their corporate training, such as CISCO e-Learning and Dell Learning (Wang, Wang, & Shee, 2007). The global e-learning market has witnessed remarkable growth, exceeding hundreds of billions of US dollars according to a recent report by Global Industry Analysts (Chuo, Liu, & Tsai, 2015), with millions of students enrolling in web-based courses (Wirt et al., 2005). Annual growth rates in technology-based learning are expected to be around 27% for the next several years. In a contemporary era where technological and educational modernization are shaping and redefining the standards of education, e-learning is considered the converging point of this evolution. Because of the importance of e-learning, which has evolved greatly with the rapid advancement of internet technology, the US Web-based Education Commission published the following statement:

The question is no longer if the Internet can be used to transform learning in new and powerful ways. The Commission has found that it can. The Web-based Education Commission calls upon the new Congress and Administration to embrace an 'e-learning' agenda as a centerpiece of our nation's federal education policy.

The Commission's statement for its development and innovation program further recommends that embracing e-learning should be accompanied by a deeper understanding of how students learn, how technological tools support, assist and assess learning gains and, more importantly, what is required to keep e-learning moving positively forward. Technology has progressed so much that the geographical gap is virtually bridged by tools that let people collaborate and interact remotely with the feeling that they are in the same room. The use of e-learning in schools and corporations gained popularity mainly due
to the perceived advantages of flexibility in fitting students' time requirements and of overcoming geographical restrictions. Time is one of the issues that both instructors and learners have to deal with in learning or tutoring sessions. In traditional face-to-face teaching, scheduling can restrict attendance to the group of students who are able and available at a specific time. Along with timing restrictions, travelling to and being present at the location where learning takes place can be a major obstacle. E-learning, on the other hand, facilitates the learning process without requiring every learner to be available and present at the same time and place. In other words, e-learning gives students the ability to fit learning and training around busy lifestyles, effectively granting even the busiest person the opportunity to pursue their career further and earn new qualifications. Welsh, Wanberg, Brown, and Simmering (2003) reported that organizations can realize numerous benefits from implementing e-learning programs, including consistency in training, reduced cycle time and cost, better convenience for learners and improved tracking capabilities. Zhang and Nunamaker (2003) suggest that effective and efficient computer-based training methods are in great demand by industry to ensure that employees and partners are equipped with the most advanced skills. Likewise, academics and practitioners alike consider e-learning software systems to be a valuable platform for knowledge sharing and transfer in the educational world. Garrison (2011) pointed out that, apart from knowledge transfer and education, academic institutions pursue the deployment of e-learning systems as a means to boost their revenues and retain their share of the student market, in addition to improving national recognition or prestige. Despite the benefits discussed above, such as flexibility, convenience and the ability to access and participate in classrooms remotely from the student's own comfort, students may experience a feeling of isolation (Garrison, 2011). This is because the e-learning process is, most of the time, a solo activity that leaves the learner with the sense of acting completely alone. A number of studies have argued, however, that the use of social computing technology can largely overcome such setbacks and enhance learner satisfaction by growing stronger peer connections inside a virtual learning community, thereby reducing feelings of isolation (Johnson, Hornik, & Salas, 2008). Another concern for the deployment of e-learning is the health aspect: e-learning requires the use of computers and tablets, and consequently bad posture, eyestrain and other physical issues may adversely affect the learner's well-being. Welsh et al. (2003) also listed further potential drawbacks of e-learning, including higher up-front costs and a lack of trainee interaction, but argued that these drawbacks can be compensated for by blended learning, a hybrid form of traditional teaching and online learning.
The United States Department of Education has further echoed concerns that distance education courses and programs can lead academic institutions in directions that are not congruent with their mission of equipping learners with the right skills and knowledge.
History of E-Learning

Distance education has been around for more than a century, whereas e-learning has evolved over the last two decades, having a prominent impact on the educational and training paradigm of academic institutions, corporations and public administrations. There is no reliable source documenting the origin of the term "e-learning", although some suggest that it most likely originated during the 1980s (Moore et al., 2011). Other terms such as online learning
and virtual learning began to spring up at the same time, in search of a better definition of exactly what e-learning was. The history of distance education dates back to Sir Isaac Pitman's use of courses delivered by the postal system in the 1840s. Pitman started a distance course by sending assignments to his students by post, which they completed and returned to him the same way. This is widely claimed to be the pioneering work of distance learning, and the concept has remained the same throughout the evolution of distance education, with the exception of the delivery medium, which has advanced enormously with technology (Horton, 2011). Other studies trace the idea of online learning back to 1926, when the educationalist J. C. Stobart wrote a memo suggesting the creation of a "wireless university". In 1969, the establishment of the British Open University marked a turning point for the development of distance learning (Bates, 2005). It is no wonder that e-learning has its roots in mail-based learning via correspondence courses. Educational content for distance learning has been delivered in various forms, including instructions sent by post, printed materials, classes over electronic media, smart devices and, now, virtual classrooms. The notion of a testing machine first emerged in the 1920s with Sidney Pressey, an educational psychology professor at Ohio State University, who invented a machine to provide drill and practice for students in his introductory courses. Pressey (1926) stated that "the procedure in mastery of drill and informational material were in many instances simple and definite enough to permit handling of much routine teaching by mechanical means." In 1954, B. F. Skinner of Harvard University introduced a series of studies designed to improve the teaching of spelling, mathematics and other subjects by inventing a mechanical machine intended to surpass the traditional teaching experience. Skinner believed that classrooms suffer from the drawbacks that the learning rate varies between learners and that reinforcement is delayed, because individual attention cannot be given to every student. He was motivated by the fact that it is impossible to provide a personal tutor for every student at all times, and developed a theory of programmed learning to be implemented by teaching machines. The teaching machine consists mainly of a system program containing teaching materials and test items through which the student is gradually taken. It is composed of fill-in-the-blank exercises: if the answer is correct, the student receives reinforcement and moves on to the next question; otherwise, the learner is presented with the correct answer to increase the chance of being reinforced later. The first computer-based training system was introduced in 1960 with the invention of the PLATO (Programmed Logic for Automatic Teaching Operation) program (Bitzer, Braunfeld, & Lichtenberger, 1961). It was originally designed for students attending the University of Illinois, and its basic layout, consisting of graphic elements and textual information along with forums and chat rooms, is still used in modern e-learning applications. With the rapid evolution of the internet and the world wide web, e-learning took a new turn with the introduction of the first online web-based learning management system (LMS), named Cecil, in 1996 (Sheridan, White, & Gardner, 2002).
Many new concepts and topics have emerged and flourished recently within the area of e-learning, including the three major trends discussed next:
• M-Learning: The development of mobile technology gave birth to a new era known as m-learning. Mobile learning can be defined as a portable and lightweight platform in which the learner can engage in learning or training activities without any geographical constraint, via mobile phones, smartphones, handheld computers, tablets, notebooks and media players. The mobility of the learner and the portability of the hardware form the basis of m-learning.
• Micro-Learning: Theo Hug was one of the earliest scholars to discuss the concept of micro-learning, a new form of learning in the field of adult learning and training (Hug, Lindner, & Bruck, 2005). It is regarded as a practical mode of achieving informal learning in an uncluttered environment. Micro-learning is based on the design of micro, or lighter, activities delivered through micro-steps in digital environments, and these learning activities are made part of the learner's daily routines. It represents an important paradigm shift that avoids the need for separate learning sessions, since the learning process is embedded in the daily routine of the end-user. Unlike common e-learning approaches, micro-learning tends towards the use of push technology, which reduces the cognitive load on learners. The choice of micro-learning objects and the timing and progression pace of micro-learning activities are important didactical design decisions for keeping learners engaged efficiently.
• Gamification: Gamification is defined as the use of game thinking, aesthetics and game mechanics in a non-game context to engage and motivate learners and solve problems in an educational setting; essentially, it is the use of gaming technology to solve problems outside the games sector. The word was first coined in 2002 by Nick Pelling, a British IT professional, but it was not widely used until 2010. According to research by numerous educational scholars, what makes games effective and attractive for learning is the students' level of motivation, activity, interactivity, competition and engagement. A meta-analysis by Traci Sitzmann of the University of Colorado reported that staff trained with video games are more willing to learn, acquire more factual information, attain a higher skill level and retain knowledge longer than employees trained in less interactive environments (Sitzmann, Kraiger, Stewart, & Wisher, 2006). Sitzmann argued that even though learners can be overwhelmed by the high level of instruction within a game, the interactivity and the game elements keep the game engaging, and this engagement leads to an efficient and satisfying learning experience. Connolly, Boyle, MacArthur, Hainey, and Boyle (2012) presented a literature review on gamification for e-learning focusing on positive outcomes; the study stresses the need for more rigorous evidence on the effectiveness and real impact of gamification.
E-LEARNING PLATFORMS

The basic components of an e-learning process can be identified as: technological infrastructure, e-learning software platform, e-learning content and participants. The technological infrastructure refers to the communication medium and the hardware platform hosting the e-learning operations. Educational materials are now mostly transmitted via the internet, although in the past courses were delivered using a blend of traditional computer-based media such as CD-ROM. Technological tools supporting the e-learning process include some or all of the following devices: desktop and laptop computers, interactive whiteboards, video cameras, and mobile and wireless tools, including mobile phones. The most vital component of the e-learning process is the e-learning software platform, usually called the Learning Management System (LMS). The LMS is a software system developed for managing online courses, including the administration, documentation, reporting and delivery of educational and training programs. The e-learning software allows the instructor or institution administrator to manage every aspect of a course, from the enrollment of students and the delivery of educational materials
to assessment through the digital delivery of assignments and exam preparation. Further, the LMS provides a platform for interaction between students and lecturers via chat rooms, discussion boards or video conferencing. Most learning management systems are developed as web applications using platforms such as PHP, .NET and Java, integrated with a classical relational database engine such as PostgreSQL, SQL Server or MySQL for storing data. There are a number of features and functionalities that a learning management system should minimally offer to achieve the ideal e-learning experience; most systems include most of the following: Course Content Delivery, Student Registration and Administration, Event Scheduling, Tracking, Curriculum and Certification Management, Assignment and Assessment, Reporting and Courseware Authoring. There is a plethora of e-learning systems on the market, available either as open source or as commercial products. This section reviews the most popular learning management systems holding the dominant market share in the e-learning sector.
Moodle

Moodle is a free, online learning management system enabling lecturers and instructors to create their own private website filled with dynamic courses that extend learning anytime and anywhere. Built on pedagogical principles, Moodle is used for blended learning, distance education, flipped classrooms and other e-learning projects in schools, universities, workplaces and other sectors. Recent versions of Moodle support responsive design, giving users the ability to create mobile-friendly online courses and integrate third-party add-ons. Moodle is an acronym for Modular Object-Oriented Dynamic Learning Environment; it was developed by Martin Dougiamas in 2002 using the PHP programming language. In terms of usage, Moodle is the second largest provider with a 23% market share, behind Blackboard (41%), while having the largest number of users, estimated at over 70 million registered students. Although the software offers rich functionality and robustness, its main drawback is the complexity perceived by new users (Harrati et al., 2016).
Blackboard Learn

Blackboard Learn, commonly known as Blackboard, is a web-based content management system created in 1997 by faculty members at Cornell University as a course management system for education. Blackboard helps create a virtual place, or classroom, where interaction between students and their instructors is achieved through discussion forums, email, chat rooms and other functionalities. The LMS can be extended and customized according to the varying needs of institutions. It is one of the most popular and commercially successful e-learning systems.
Claroline

Claroline is a collaborative online learning and working open source platform released under the GPL open source license. It offers many institutions the possibility to create and administer collaborative online learning spaces. Claroline is available in more than 100 countries and has been translated into 35 languages. Its use is intuitive and easy and does not require particular skills. Claroline is compatible with GNU/Linux, Mac OS and Microsoft Windows, and is based on PHP and MySQL, the widely used relational database management system.
EdX

EdX is a free, open-source learning management system offered by edX.org. It is the same framework that universities such as MIT and Harvard use to offer online education to over 100,000 students. It was released as open source in March 2013 with the goal of acting as the WordPress of Massive Open Online Course (MOOC) platforms, allowing developers and users to integrate plug-ins that expand the core functionality of the system. edX has a fast, modern feel and can accommodate large enrollments. Although it is open source, investment is still needed for both installation and maintenance of the system.
Sakai

Sakai is a service-oriented, Java-based open source learning management system founded in 2004 by the universities of Michigan, Indiana and Stanford and the Massachusetts Institute of Technology with the purpose of developing a new LMS that is scalable, reliable, interoperable and extensible. The project was funded by a grant from the Mellon Foundation. Sakai is deployed at over 300 academic institutions offering online education.
ASSESSMENT AND EVALUATION

The terms assessment and evaluation are often used synonymously in education, but they carry different meanings in the area of e-learning. Assessment usually refers to the role, in formal education, of judging students' attainment of the educational objectives of a specific course (Garrison, 2011). Student assessment is by nature multifaceted, exploring aspects such as the acquisition of skills and competencies and the capacity to apply critical and creative solutions to challenging problems in different contexts. Generally, assessment occurs throughout the course in order to provide formative, continuous feedback to learners, while summative assessment at the completion of the course provides both student and instructor with information on learning accomplishments. Assessment of student learning is a key component of the evaluation of the e-learning paradigm and is among the factors of concern to educators involved in e-learning. Black and Wiliam (1998) argued that feedback can have a remarkable effect on self-esteem and motivation, which in turn directly influence how and what students learn. Assessment through technological systems, sometimes called e-assessment, can allow students to engage with their own learning with greater confidence, as opposed to norm-referenced comparisons with fellow students. The ability to easily review and revisit records and feedback on one's own learning activities and outcomes is considered an important aspect of the assessment process (Pachler & Daly, 2011); this is referred to as self-regulation in e-learning, defined as students' control over aspects of their own learning. Evaluation, on the other hand, refers to the process of comparing or measuring a unit, course, program or other element of e-learning against a set of performance or outcome criteria. Comprehensive evaluation spans measures of satisfaction, perception of learning, costs and cost benefits, and other criteria for assessing success as defined by the relevant stakeholders and participants. Effective evaluation of the e-learning process requires a close examination of the instructional design incorporated
during the course. Garrison (2011) listed different types of proactive evaluation, starting with the determination of the strategic intent of the e-learning program: being able to clearly state why a particular pedagogical program has been developed for online learning is critical to assessing its effectiveness. The second form of proactive evaluation looks closely at the content of the courses and examines its cohesion and consistency, as well as its ease of access and modification. The third element of evaluation focuses on an examination of the interface design of the learning management system. An effective graphical interface can be mastered by users with ease and makes it possible to present the educational content in a variety of formats, including graphics, video, and other advanced interactive and dynamic formats. The design of the interface should be based on a familiar metaphor that helps users navigate among the different components of the course, and it should be customizable by both students and educators to increase their comfort and the readability of the educational content. The fourth form of evaluation assesses the amount of interactivity supported by the course and the learning management system. Garrison (2011) concluded that the final evaluation process revolves around the quality, quantity and thoroughness of the assessment of student learning and engagement in using the e-learning system. In spite of the widespread use of e-learning systems and the substantial investments in purchasing, developing and maintaining learning management systems, there is as yet no consensus on a standard framework or taxonomy for evaluating the quality and effectiveness of e-learning systems. The dearth of conventional e-learning system quality models stands in stark contrast to the considerable body of work on software quality assurance. Chua and Dyson (2004) proposed the ISO 9126 Quality Model as a useful framework for evaluating learning management systems, with particular emphasis on teachers and educational administrators as the primary stakeholders. The ISO 9126 evaluation model was adopted by the International Organization for Standardization (ISO) and belongs to a large group of internationally recognized standards. Although the authors stressed the potency of the model as an evaluation tool that can crystallize insights relevant to educators, ISO 9126 has not been used extensively in e-learning environments. Holsapple and Lee-Post (2006) introduced the E-Learning Success Model, adapted from DeLone and McLean's Information Systems Success Model, which is in turn an extension of their original model. The E-Learning Success Model posits that overall success depends on the attainment of success at each of the three stages of the e-learning process: system design, system delivery, and system outcome. Figure 1 shows the model with the sub-components of each of the three stages.

Figure 1. E-Learning Success Model (Holsapple & Lee-Post, 2006)
USABILITY EVALUATION

As more and more universities worldwide opt to use online technology for course delivery, research on e-learning systems has attracted considerable interest in understanding how effective and usable these systems are in terms of principles related to human-computer interaction (Bringula, 2013; Escobar-Rodriguez & Monge-Lozano, 2012; Navimipour & Zareie, 2015). Positive user experience has emerged as an important pillar of the adoption of educational learning systems, mainly because the mere availability of technological infrastructure and systems is not enough to ensure the uptake of new educational approaches by either teachers or learners (Laurillard, Oliver, Wasson, & Hoppe, 2009; Persico, Manca, & Pozzi, 2014; Phillips, McNaught, & Kennedy, 2012). The usability of e-learning software products is a key characteristic for achieving acceptance and satisfaction for both
parties, regardless of their background, experience or orientation. The satisfaction component relates to the extent to which users believe, or feel positively, that the system meets their requirements (Capece & Campisi, 2013; Islam, 2014; Lee, Kim, & Lee, 1995; Yeh & Lin, 2015); other researchers have defined satisfaction as the gap between the expected gain and the actual gain from using the system (Tsai, Yen, Huang, & Huang, 2007). Positive user experience is of prime importance for online systems, playing a vital role in technology acceptance as well as in the continuing commercial success of software companies. Considerable research within the human-computer interaction literature concerns the analytical quantification of the various factors that determine and shape software usability (Albert & Tullis, 2013; Hornbæk, 2006). The most commonly examined covariate factors relate to the user, such as age, academic level, social status, gender or specific impairments (Mentes & Turan, 2012; Pariente-Martinez, Gonzalez-Rodriguez, Fernandez-Lanvin, & De Andres-Suarez, 2014). In spite of the numerous research studies on child-computer interaction, on the performance of older people and on accessibility for users with special needs, most web applications are designed and developed for younger people while ignoring other groups of users with specific requirements. Studying these factors, along with the constraints faced by these groups, is undoubtedly crucial for enhancing system usability and adapting the user interface to different user requirements. Usability is defined as the extent to which a product can be used easily by specified users to achieve certain goals with effectiveness, efficiency and satisfaction (Mayhew, 1999). In practice, the usability aspect of software products is marginalized during the classical stages of the software development life-cycle, with more effort and resources pushed into the software back-end to address functional requirements (Burton-Jones & Grange, 2012). In fact, regardless of how neatly coded or sophisticated the software is,
recent studies of software sales report that many software failures are due to usability reasons, where the user simply does not know how to use the purchased product (Cassino, Tucci, Vitiello, & Francese, 2015). Software systems are valued on the basis of their graphical interface and its power to communicate and express the implemented functionalities (Cassino et al., 2015). There is no doubt that usability is now recognized as an important software quality attribute, earning its place among more traditional attributes such as performance, robustness, content and security (Henriksson, Yi, Frost, & Middleton, 2007; Ismailova, 2015). Moreover, research focus has recently shifted from the study of "use" to exploring effective and easy use of information systems (Burton-Jones & Grange, 2012).
Usability Evaluation Methods

Usability evaluation consists of methodologies for measuring the ease-of-use aspects of the user interface of a given software system and identifying specific problems. Usability evaluation plays a vital role within the overall user interface design process, which undergoes continuous and iterative cycles of design, prototyping and testing. Evaluating the usability of interactive systems is itself a process involving various activities, depending on the method utilized (Ivory & Hearst, 2001).
Empirical Methods

Empirical usability methods require the participation of end users, who are instructed to interact with the software system while their behavior and interaction are recorded and observed by an expert. Results are obtained from the users through interviews and questionnaires in which they are asked for their opinions and concerns, as well as suggestions on how to improve the interface design and its usability. Interestingly, there is a recent trend of using medical equipment to assess user satisfaction with information systems. Dimoka et al. (2012) pointed out the potential of employing brain imaging and psychophysiological tools such as skin conductance response, eye tracking and facial electromyography (Eckhardt, Maier, & Buettner, 2012). Liapis, Katsanos, Sotiropoulos, Xenos, and Karousos (2015) conducted experiments to recognize stress by analysing skin conductance signals, as part of an evaluation of user emotional experience aimed at identifying stressful tasks in human-computer interaction. Indeed, one of the challenges in software development is to involve end users in the design and development stages so as to observe and analyze their behavior and collect feedback in an effective and efficient manner. There are a number of methods and theories in the literature for understanding, predicting and assessing the interaction process and its constituent parts, including personal factors, behavior and the environment. One of the most well-established models for assessing user acceptance of technological products is the Technology Acceptance Model (TAM) proposed by Davis (Davis, Bagozzi, & Warshaw, 1989). The TAM includes questions exploring two aspects of user satisfaction: perceived ease of use and perceived usefulness. Perceived ease of use refers to the degree to which users believe that adopting a particular technological product would be free of effort and hassle (Davis et al., 1989), while perceived usefulness concerns the degree to which a user believes that using a particular software system would improve their job performance. The TAM has been used in various studies to assess the factors affecting individuals' use of technology (Venkatesh & Davis, 2000). Other related models and instruments exist, such as the System Usability Scale (SUS), which was proposed mainly for evaluating web applications along two aspects: learnability and usability.
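As a brief illustrative aside before turning to the SUS in detail, the following Python sketch shows one common way TAM questionnaire responses are aggregated into the two construct scores. The item wording, the 7-point Likert scale and the example values are assumptions made for demonstration, not data or procedures taken from this chapter.

```python
# Minimal sketch of aggregating TAM questionnaire data (illustrative only).
# Assumes each respondent answered several 7-point Likert items per construct;
# the item examples and the 7-point scale are assumptions, not from the chapter.
from statistics import mean

# Hypothetical responses: construct -> list of item ratings
# (1 = strongly disagree ... 7 = strongly agree)
respondent = {
    "perceived_usefulness": [6, 5, 6, 7],   # e.g. "Using the LMS improves my job performance"
    "perceived_ease_of_use": [4, 3, 5, 4],  # e.g. "Learning to operate the LMS is easy for me"
}

def construct_scores(responses):
    """Average the Likert items of each TAM construct into a single score."""
    return {construct: float(mean(items)) for construct, items in responses.items()}

print(construct_scores(respondent))
# -> {'perceived_usefulness': 6.0, 'perceived_ease_of_use': 4.0}
```

In TAM studies such construct scores are then typically related to usage intention or actual usage through regression or structural equation modelling; the averaging step above is only the first part of that analysis.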
Table 1. System Usability Scale for usability evaluation (Brooke, 1996). Each item is rated on a 5-point scale from Strongly disagree (1) to Strongly agree (5).
1. I think that I would like to use this system frequently.
2. I found the system unnecessarily complex.
3. I thought the system was easy to use.
4. I think that I would need the support of a technical person to be able to use this system.
5. I found the various functions in this system were well integrated.
6. I thought there was too much inconsistency in this system.
7. I would imagine that most people would learn to use this system very quickly.
8. I found the system very cumbersome to use.
9. I felt very confident using the system.
10. I needed to learn a lot of things before I could get going with this system.
The System Usability Scale (SUS) (Brooke, 1996) is a well-researched and widely used questionnaire, and one of the most popular methods in the literature, devised mainly for evaluating the usability of web applications. Its popularity in the HCI community is mainly due to its desirable psychometric properties, including high reliability and validity (Bangor, Kortum, & Miller, 2008; Brooke, 1996; Lewis & Sauro, 2009). The SUS questionnaire is composed of ten questions with a mix of positively and negatively worded items, listed in Table 1. For each question, the respondent rates their agreement on a 5-point Likert scale from strongly disagree (1) to strongly agree (5). To compute the overall SUS score, the score contribution of each odd-numbered question, which is positively worded, is the scale position minus 1; for the even-numbered items, the score contribution is 5 minus the scale position. Each contribution therefore ranges from 0 to 4. The SUS score is the sum of the score contributions of the 10 items multiplied by 2.5, as shown in Equation (1), where U_i refers to the rating of the i-th item. SUS scores range between 0 and 100 in 2.5-point increments, with higher values reflecting higher user satisfaction.
SUS = 2.5 × \sum_{n=1}^{5} [ (U_{2n-1} − 1) + (5 − U_{2n}) ]    (1)
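To make the scoring procedure in Equation (1) concrete, the following Python sketch computes a SUS score from the ten raw item ratings; the example ratings are illustrative values, not data from this chapter.

```python
def sus_score(ratings):
    """Compute the System Usability Scale score from ten 1-5 Likert ratings.

    Odd-numbered (positively worded) items contribute (rating - 1);
    even-numbered (negatively worded) items contribute (5 - rating).
    The summed contributions are scaled by 2.5 to give a 0-100 score.
    """
    if len(ratings) != 10 or not all(1 <= r <= 5 for r in ratings):
        raise ValueError("SUS expects exactly ten ratings between 1 and 5")
    contributions = [
        (r - 1) if i % 2 == 0 else (5 - r)  # index 0, 2, ... hold items 1, 3, ... (odd-numbered)
        for i, r in enumerate(ratings)
    ]
    return 2.5 * sum(contributions)

# Illustrative ratings for items 1..10 (not real study data)
print(sus_score([4, 2, 5, 1, 4, 2, 5, 1, 4, 2]))  # 85.0
```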
Inspection-Based Methods

Alternatively, usability evaluation can be carried out through inspection methods, which aim to identify interaction problems within the interface without involving end users. The interface is assessed manually by an expert or usability consultant for compliance with a set of predefined usability guidelines or a conventional set of heuristics in order to detect usability deficiencies (Fernandez, Abrahão, & Insfran, 2013; Fernandez, Insfran, & Abrahão, 2011). The most widely used usability heuristics for user interface design are those developed by Jakob Nielsen and Rolf Molich in 1990; Nielsen (1994) later summarized them as ten heuristics for user interface design, which are described briefly in Table 2.
Table 2. Nielsen's 10 heuristics for user interface design
• Visibility of system status: The system should keep users informed about the status and progression of the tasks they are performing.
• Match between system and the real world: The system's language and the logical presentation of information should be familiar and compatible with the users' world.
• User control and freedom: The user should be given the option of cancelling, undoing or redoing a task.
• Consistency and standards: The design should conform to interface standards so that users are not confused by different words, commands or actions for the same situation.
• Error prevention: Systems should eliminate or prevent the occurrence of errors through error-checking mechanisms, for example confirmation options.
• Recognition rather than recall: System objects should be clearly visible in order to minimize the user's memory load.
• Flexibility and efficiency of use: Accelerators such as shortcuts allow expert users to speed up their interactions with the system.
• Aesthetic and minimalist design: Dialogs should be as simple and relevant as possible.
• Help users recognize, diagnose, and recover from errors: Error messages should be expressed in plain language (no codes), precisely indicate the problem, and constructively suggest a solution.
• Help and documentation: Help and documentation should be provided; help content should be simple, brief and easy to search.
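Heuristic inspections are usually reported as lists of findings per heuristic, each with a severity rating assigned by the evaluator. As a rough illustration of how such findings might be collated across several experts, the following Python sketch uses an assumed data structure and an assumed 0-4 severity scale; neither is prescribed in this chapter.

```python
# Illustrative sketch of collating heuristic-evaluation findings (assumed data
# model; severity 0 = not a problem ... 4 = usability catastrophe).
from collections import defaultdict

findings = [  # hypothetical findings from two evaluators of an LMS interface
    {"evaluator": "E1", "heuristic": "Visibility of system status", "severity": 3,
     "note": "No progress indicator while a quiz is being submitted"},
    {"evaluator": "E2", "heuristic": "Visibility of system status", "severity": 2,
     "note": "File upload completes silently"},
    {"evaluator": "E2", "heuristic": "Error prevention", "severity": 4,
     "note": "Deleting a course has no confirmation dialog"},
]

def summarize(findings):
    """Group findings by heuristic and report the count and mean severity."""
    grouped = defaultdict(list)
    for f in findings:
        grouped[f["heuristic"]].append(f["severity"])
    return {h: {"count": len(sev), "mean_severity": sum(sev) / len(sev)}
            for h, sev in grouped.items()}

for heuristic, stats in sorted(summarize(findings).items(),
                               key=lambda kv: kv[1]["mean_severity"], reverse=True):
    print(f"{heuristic}: {stats['count']} finding(s), mean severity {stats['mean_severity']:.1f}")
```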
Automated Usability Evaluation

Inspection and empirical approaches require usability practitioners to examine a graphical user interface manually in order to detect usability deficiencies, by inspecting usage test cases or analyzing the results of questionnaires. Such methods are not only laborious and very expensive, but also often yield results that are biased by the acquisition environment or by the experts' subjectivity. Alternatively, several automated evaluation methods have been conceived for the automatic discovery of usability faults, avoiding these drawbacks by reducing costs and time, liberating usability experts from repetitive manual tasks, and increasing the coverage of tested features (Quade, Lehmann, Engelbrecht, Roscher, & Albayrak, 2013). Furthermore, because of the immense volume of data acquired in usability evaluation, the total or partial use of automated methods can be very beneficial during the development of web applications (Cassino et al., 2015; de Santana & Baranauskas, 2015). However, the majority of the surveyed research studies are purely based on manual or statistical analysis of the participants' recorded activity data. Usability evaluation can be conducted with users either remotely or locally; Tullis, Fleischman, McNulty, Cianchette, and Bergel (2002) conducted a comparative experiment between remote and laboratory-based testing and emphasized the advantages of remote evaluation in terms of cost and effectiveness. Methods for usability evaluation are conventionally grouped by the HCI community into two main categories. The first class is based on analyzing the graphical interface by reading the source code of the website to examine the content and structure of the application; Cassino and Tucci (2011) assessed the source code to infer the design model of the interface and the interaction styles implemented on every page of the website, generating a quantitative evaluation report based on heuristic factors. The other methods rely on examining usage data, i.e. logs. The user logs used for usability evaluation are captured on either the server side or the client side. Many studies advocate that logging
techniques are more reliable and efficient in providing useful usability insights for evaluators (de Santana & Baranauskas, 2015). Server-side logs are generated automatically by the web server, where each line in the log file corresponds to a request made by a user to access a given resource on the server, such as an HTML page or image. Server logs can be analyzed to produce usability insight from many real users performing particular tasks over a long period of time in natural working conditions, as opposed to simulated or artificial laboratory settings with a limited sample of users (Geng & Tian, 2015). Data preparation and mining methods are used to process the raw web server logs in order to derive users' visiting patterns and other metrics for usability analysis. Data preparation from log files consists of data cleaning, re-identification of user sessions across different requests and navigation path completion. However, Geng and Tian (2015) pointed out that log processing is time consuming and computationally intensive, and many critics have argued that recorded server logs carry only bare information about users' goals and lack essential data about in-page interactivity and events. For client-side logs, usage data is acquired either through data loggers integrated into the web application or via custom browser plugins or third-party software that tracks user activities. Client-side logs can capture accurate, detailed and comprehensive user traces for usability evaluation, as the logger is usually implemented via custom event listeners that record low-level interaction events such as keystrokes and mouse clicks. Hence, the recorded data contains elaborate detail about the user's interaction with the interface through a particular input device; events are triggered by actions generated either by the user or by the system. The main drawback of client-side loggers is privacy: users need to legally grant their permission for their activity traces on a given website to be recorded, and users are generally unwilling to have additional software installed on their computers to record their activities online (Geng & Tian, 2015). Therefore, capturing client-side data is best achieved in laboratory settings where the explicit consent of the users can be obtained.
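As an illustration of the session re-identification step described above, the following Python sketch groups pre-parsed server-log requests into per-user sessions using a 30-minute inactivity timeout. The record layout and the timeout value are common conventions assumed here for demonstration, not details taken from the chapter.

```python
# Illustrative sketch of session re-identification from server-side logs
# (assumes pre-parsed (user_id, timestamp, url) records and a 30-minute
# inactivity timeout; both are assumptions, not details from the chapter).
from datetime import datetime, timedelta

SESSION_TIMEOUT = timedelta(minutes=30)

requests = [  # hypothetical records, already cleaned of images and bot traffic
    ("10.0.0.5", datetime(2016, 3, 1, 9, 0),  "/course/view.php?id=12"),
    ("10.0.0.5", datetime(2016, 3, 1, 9, 4),  "/mod/quiz/view.php?id=7"),
    ("10.0.0.5", datetime(2016, 3, 1, 10, 2), "/course/view.php?id=12"),  # new session
    ("10.0.0.9", datetime(2016, 3, 1, 9, 1),  "/login/index.php"),
]

def sessionize(records, timeout=SESSION_TIMEOUT):
    """Group requests into per-user sessions, splitting on inactivity gaps."""
    sessions = {}  # user_id -> list of sessions, each a list of (timestamp, url)
    for user, ts, url in sorted(records, key=lambda r: (r[0], r[1])):
        user_sessions = sessions.setdefault(user, [])
        if not user_sessions or ts - user_sessions[-1][-1][0] > timeout:
            user_sessions.append([])  # start a new session for this user
        user_sessions[-1].append((ts, url))
    return sessions

for user, user_sessions in sessionize(requests).items():
    print(user, "->", [len(s) for s in user_sessions], "requests per session")
# 10.0.0.5 -> [2, 1] requests per session
# 10.0.0.9 -> [1] requests per session
```

The resulting per-session navigation paths are what the mining and path-completion steps mentioned above would then operate on.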
Website Evaluation Tools

Paganelli and Paternò (2002) developed a desktop-based application for recording and analysing interaction logs of website systems based on a predefined task model. The activities to be performed on a website are specified using the notation of the ConcurTaskTrees environment (Paternò, Santoro, & Spano, 2012), which provides a graphical representation of the hierarchical logical structure of the task model. Tiedtke, Märtin, and Gerth (2002) described a framework implemented in Java and XML for the automated usability evaluation of interactive websites, combining different techniques for data gathering and analysis; their system uses a task-based approach and incorporates usability issues. Atterer and Schmidt (2007) presented UsaProxy, an application that provides website usage tracking functionality using an HTTP proxy approach. More recently, de Vasconcelos and Baldochi (2012) implemented an automated system called USABILICS for remote evaluation based on an interface model. Tasks to be performed by a user are predefined using an intuitive approach that can be applied to larger web systems, and the evaluation is based on matching the usage pattern performed by the user against the one produced by an expert of the system, yielding a usability index for the probed application. Muhi, Szőke, Fülöp, Ferenc, and Berger (2013) proposed a general framework for usability evaluation that can be tested in production systems. The framework takes as input an XML configuration file describing the positioning of the different interface elements of an application, whilst user activities are logged into a separate XML file. A validator module is deployed to check the log files
according to semantic rules defined within the usability data model. Andrica and Candea (2011) presented WaRR, an automated tool that records and replays with high fidelity the interaction between users and modern web applications; because the recording functionality is embedded in the web browser, the tool has direct access to user keystrokes and clicks. There are also a number of commercially available tools for recording user traces for usability purposes. CrazyEgg logs mouse events and can visualize activity maps of the most popular click locations on a page. Web Criteria Site Profile is another tool used mainly to assess simple attributes of usability, including page loading time and the ease of finding content on a website; it is based on automated agents that browse the website to retrieve data, making use of the GOMS model. Web TANGO is a tool that employs Monte Carlo simulation and information retrieval methods to predict the user's behavior and navigation paths, based on data acquired from extensive experiments conducted on websites rated as successful by users.
Usability for E-Learning Systems

Among research studies assessing the usability of learning management systems, Persico et al. (2014) employed the Technology Acceptance Model to investigate the willingness of university users to adopt e-learning systems, with evaluation based on three dimensions: usefulness, ease of use and effectiveness. Escobar-Rodriguez and Monge-Lozano (2012) analyzed how university students use the Moodle platform in order to determine and understand the factors that might influence their intention to use it; the Technology Acceptance Model is used to assess the usability of the system in terms of perceived usefulness and ease of use against actual usage behavior. Surprisingly, only a few studies in the literature have used the SUS to evaluate the perceived usability of learning management systems (Orfanou, Tselios, & Katsanos, 2015). The first study using the SUS for an e-learning system was conducted by Renaut, Batier, Flory, and Heyde (2006) to inspect usability problems of the SPIRAL platform; the researchers employed the SUS scale as a post-assessment of usability, reporting that 72% of the participating university lecturers described the platform as easy to use. Simões and de Moraes (2012) examined the usability of the Moodle e-learning platform using three different evaluation methods, including the SUS questionnaire, to assess the satisfaction of a sample of 59 students; the authors concluded that the SUS is an effective tool for exploring usability, without reporting the obtained SUS score. Marco et al. (2013) proposed a means of real-time remote collaboration within the Moodle platform through the use of Drag & Share, a collaborative tool that enables the sharing and synchronization of files; user efficiency was quantified by the time taken for task completion, while user satisfaction was assessed using the SUS questionnaire, with a reported score of 89.5%. Given the dearth of studies devoted to the exploratory evaluation of the acceptance and usability of e-learning applications by university lecturers, and motivated by the fact that the introduction of e-learning systems is bound to follow a slow and complex trend that needs to be understood and evaluated beyond purely summative measures, Harrati et al. (2016) conducted an empirical study to assess the satisfaction level of lecturers interacting with an e-learning environment, based on a predefined task model describing low-level interactivity details. The main thrust of that research is to evaluate the usability of the e-learning platform, as usability is considered a vital attribute for the adoption of educational systems by lecturers. An online automated system for formalizing user interaction with a given system, guided by a set of rules describing certain goals
to be achieved by the end user, is set up for usability practitioners. The task model is mainly utilized to capture all the interactions and the navigation path to be carried out by the university staff. Empirical client-side log data is collected, in a non-intrusive fashion and without the need to install additional tools, from university lecturers in the Electrical and Computer Science departments participating in the usability evaluation of the e-learning system. The Moodle e-learning platform is used as the case study for this research. Data analysis is subsequently conducted to infer the usability level, in compliance with the defined task model and with usability metrics describing efficiency of use. Even though users expressed high satisfaction scores on the System Usability Scale (SUS), the empirical results of inspecting the usability of the e-learning platform revealed that a key reason impeding the adoption of new technologies in the teaching process is the complex nature of the software interface, with the majority of lecturers failing to complete simple tasks.
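To illustrate the general idea of checking a recorded interaction trace against a predefined task model, the following Python sketch compares an event sequence with the expected sequence of steps and derives simple completion and efficiency indicators. The event names and the metrics are illustrative assumptions, not the actual instrumentation or measures used in the study discussed above.

```python
# Simplified sketch of matching a recorded client-side event trace against a
# predefined task model (event names and metrics are illustrative assumptions).

# Expected sequence of interface actions for the task "upload a course resource"
task_model = ["open_course", "turn_editing_on", "add_resource", "upload_file", "save"]

# Hypothetical recorded trace for one lecturer (detours and missing steps allowed)
recorded_trace = ["open_course", "open_gradebook", "turn_editing_on",
                  "add_resource", "save"]

def evaluate_trace(expected, actual):
    """Return completion and efficiency indicators for one recorded trace."""
    matched, i = 0, 0
    for event in actual:
        if i < len(expected) and event == expected[i]:
            matched += 1  # required step performed in the expected order
            i += 1
    completion = matched / len(expected)                    # share of required steps done
    efficiency = matched / len(actual) if actual else 0.0   # penalize detours and extra actions
    return {"completed": completion == 1.0,
            "completion": completion,
            "efficiency": efficiency}

print(evaluate_trace(task_model, recorded_trace))
# {'completed': False, 'completion': 0.6, 'efficiency': 0.6}
```

In a real evaluation such indicators would be aggregated over many users and tasks and combined with timing data, but the matching step above conveys how a task model turns raw logs into usability measures.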
CONCLUSION

Because of the vital importance of online technology as a medium for distance or virtual education, corporations and schools are investing substantial amounts of money, time and resources in developing alternatives to traditional methods of education and training. Several scholars have described the growth rate of e-learning as unprecedented and explosive, as its adoption has gone beyond academic institutions to be taken seriously by corporations and public administrations as part of their employee training programs. Given the ubiquity of e-learning and its vital influence on the educational process, it is no surprise that many research studies explore different aspects of the use of e-learning in higher education. Assessment and evaluation are arguably the most influential aspects for measuring the success and effectiveness of the e-learning experience. As more and more universities worldwide opt to use online technology for course delivery, research on e-learning systems has attracted considerable interest in understanding how effective and usable these systems are in terms of principles related to human-computer interaction. Usability evaluation consists of methodologies for measuring the ease-of-use aspects of the user interface of a given software system and identifying specific problems, and it plays a vital role within the overall user interface design process, which undergoes continuous and iterative cycles of design, prototyping and testing.
REFERENCES

Albert, W., & Tullis, T. (2013). Measuring the user experience: Collecting, analyzing, and presenting usability metrics. Newnes. Anderson, T. (2008). The theory and practice of online learning. Athabasca University Press. Andrica, S., & Candea, G. (2011). WaRR: A tool for high-fidelity web application record and replay. Proceedings of the 41st International Conference on Dependable Systems & Networks. doi:10.1109/DSN.2011.5958253
Atterer, R., & Schmidt, A. (2007). Tracking the interaction of users with AJAX applications for usability testingProceedings of the SIGCHI conference on Human factors in computing systems (pp. 1347-1350). doi:10.1145/1240624.1240828 Bangor, A., Kortum, P. T., & Miller, J. T. (2008). An empirical evaluation of the system usability scale. International Journal of Human-Computer Interaction, 24(6), 574–594. doi:10.1080/10447310802205776 Bates, A. T. (2005). Technology, e-learning and distance education. Routledge. doi:10.4324/9780203463772 Bitzer, D., Braunfeld, P., & Lichtenberger, W. (1961). PLATO: An automatic teaching device. IRE Transactions on Education, 4(4), 157–161. doi:10.1109/TE.1961.4322215 Black, P., & Wiliam, D. (1998). Assessment and classroom learning. Assessment in Education: Principles, Policy & Practice, 5(1), 7–74. doi:10.1080/0969595980050102 Bringula, R. P. (2013). Influence of faculty-and web portal design-related factors on web portal usability: A hierarchical regression analysis. Computers & Education, 68, 187–198. doi:10.1016/j. compedu.2013.05.008 Brooke, J. (1996). SUS-A quick and dirty usability scale. Usability evaluation in industry, 189(194), 4-7. Burton-Jones, A., & Grange, C. (2012). From use to effective use: A representation theory perspective. Information Systems Research, 24(3), 632–658. doi:10.1287/isre.1120.0444 Capece, G., & Campisi, D. (2013). User satisfaction affecting the acceptance of an e-learning platform as a mean for the development of the human capital. Behaviour & Information Technology, 32(4), 335–343. doi:10.1080/0144929X.2011.630417 Cassino, R., & Tucci, M. (2011). Developing usable web interfaces with the aid of automatic verification of their formal specification. Journal of Visual Languages and Computing, 22(2), 140–149. doi:10.1016/j. jvlc.2010.12.001 Cassino, R., Tucci, M., Vitiello, G., & Francese, R. (2015). Empirical validation of an automatic usability evaluation method. Journal of Visual Languages and Computing, 28, 1–22. doi:10.1016/j.jvlc.2014.12.002 Chua, B. B., & Dyson, L. E. (2004). Applying the ISO 9126 model to the evaluation of an e-learning system. Paper presented atASCILITE. Chuo, Y., Liu, C., & Tsai, C. (2015). Effectiveness of e-learning in hospitals. Technology and Health Care, 23(Suppl. 1), S157–S160. doi:10.3233/thc-150949 PMID:26410320 Colvin Clark, R., & Mayer, R. (2008). E-Learning and the Science of Instruction. San Francisco, USA: Pfeiffer. Connolly, T. M., Boyle, E. A., MacArthur, E., Hainey, T., & Boyle, J. M. (2012). A systematic literature review of empirical evidence on computer games and serious games. Computers & Education, 59(2), 661–686. doi:10.1016/j.compedu.2012.03.004 Davis, F. D., Bagozzi, R. P., & Warshaw, P. R. (1989). User acceptance of computer technology: A comparison of two theoretical models. Management Science, 35(8), 982–1003. doi:10.1287/mnsc.35.8.982
de Santana, V. F., & Baranauskas, M. C. C. (2015). WELFIT: A remote evaluation tool for identifying Web usage patterns through client-side logging. International Journal of Human-Computer Studies, 76, 40–49. doi:10.1016/j.ijhcs.2014.12.005 de Vasconcelos, L. G., & Baldochi, L. A. Jr. (2012). Towards an automatic evaluation of web applicationsProceedings of the 27th Annual ACM Symposium on Applied Computing (pp. 709-716). doi:10.1145/2245276.2245410 Dimoka, A., Banker, R. D., Benbasat, I., Davis, F. D., Dennis, A. R., & Gefen, D. et al. others. (2012). On the use of neurophysiological tools in is research: Developing a research agenda for neurois. Management Information Systems Quarterly, 36(3), 679–702. Eckhardt, A., Maier, C., & Buettner, R. (2012). The Influence of Pressure to Perform and Experience on Changing Perceptions and User Performance: A Multi-Method Experimental Analysis. Proceedings ICIS ‘12. Escobar-Rodriguez, T., & Monge-Lozano, P. (2012). The acceptance of Moodle technology by business administration students. Computers & Education, 58(4), 1085–1093. doi:10.1016/j.compedu.2011.11.012 Fernandez, A., Abrahão, S., & Insfran, E. (2013). Empirical validation of a usability inspection method for model-driven Web development. Journal of Systems and Software, 86(1), 161–186. doi:10.1016/j. jss.2012.07.043 Fernandez, A., Insfran, E., & Abrahão, S. (2011). Usability evaluation methods for the web: A systematic mapping study. Information and Software Technology, 53(8), 789–817. doi:10.1016/j.infsof.2011.02.007 Garrison, D. R. (2011). E-learning in the 21st century: A framework for research and practice. Taylor & Francis. Garrison, D. R., & Kanuka, H. (2004). Blended learning: Uncovering its transformative potential in higher education. The Internet and Higher Education, 7(2), 95–105. doi:10.1016/j.iheduc.2004.02.001 Geng, R., & Tian, J. (2015). Improving web navigation usability by comparing actual and anticipated usage. IEEE Transactions on Human-Machine Systems, 45(1), 84–94. Granić, A. (2008). Experience with usability evaluation of e-learning systems. Universal Access in the Information Society, 7(4), 209–221. doi:10.1007/s10209-008-0118-z Harrati, N., Bouchrika, I., Tari, A., & Ladjailia, A. (2016). Exploring user satisfaction for e-learning systems via usage-based metrics and system usability scale analysis. Computers in Human Behavior, 61, 463–471. doi:10.1016/j.chb.2016.03.051 Henriksson, A., Yi, Y., Frost, B., & Middleton, M. (2007). Evaluation instrument for e-government websites. Electronic Government, an International Journal, 4(2), 204-226. Holsapple, C. W., & Lee‐Post, A. (2006). Defining, Assessing, and Promoting E‐Learning Success: An Information Systems Perspective*. Decision Sciences Journal of Innovative Education, 4(1), 67–85. doi:10.1111/j.1540-4609.2006.00102.x
Hornbæk, K. (2006). Current practice in measuring usability: Challenges to usability studies and research. International Journal of Human-Computer Studies, 64(2), 79–102. Horton, W. (2011). E-learning by design. John Wiley & Sons. doi:10.1002/9781118256039 Hug, T., Lindner, M., & Bruck, P. A. (2005). Microlearning: Emerging concepts, practices and technologies after e-learning. Proceedings of Microlearning ‘05. Islam, A. N. (2014). Sources of satisfaction and dissatisfaction with a learning management system in post-adoption stage: A critical incident technique approach. Computers in Human Behavior, 30, 249–261. doi:10.1016/j.chb.2013.09.010 Ismailova, R. (2015). Web site accessibility, usability and security: A survey of government web sites in Kyrgyz Republic. Universal Access in the Information Society. Ivory, M. Y., & Hearst, M. A. (2001). The state of the art in automating usability evaluation of user interfaces. ACM Computing Surveys, 33(4), 470–516. doi:10.1145/503112.503114 Johnson, R. D., Hornik, S., & Salas, E. (2008). An empirical examination of factors contributing to the creation of successful e-learning environments. International Journal of Human-Computer Studies, 66(5), 356–369. doi:10.1016/j.ijhcs.2007.11.003 Laurillard, D., Oliver, M., Wasson, B., & Hoppe, U. (2009). Implementing technology-enhanced learning (pp. 289–306). Technology-Enhanced Learning. doi:10.1007/978-1-4020-9827-7_17 Lee, S. M., Kim, Y. R., & Lee, J. (1995). An empirical study of the relationships among end-user information systems acceptance, training, and effectiveness. Journal of Management Information Systems, 12(2), 189–202. doi:10.1080/07421222.1995.11518086 Lewis, J. R., & Sauro, J. (2009). The factor structure of the system usability scale (pp. 94–103). Human Centered Design. doi:10.1007/978-3-642-02806-9_12 Liapis, A., Katsanos, C., Sotiropoulos, D., Xenos, M., & Karousos, N. (2015). Recognizing emotions in Human Computer Interaction: Studying stress using skin conductance. Human-Computer Interaction INTERACT ‘15, 255–262. Marco, F. A., Penichet, V. M. R., & Gallud, J. A. (2013). Collaborative e-Learning through Drag & Share in Synchronous Shared Workspaces. J. UCS, 19(7), 894–911. Mayhew, D. J. (1999). The usability engineering lifecycle. In CHI’99 Extended Abstracts on Human Factors in Computing Systems (pp. 147-148). Mentes, S. A., & Turan, A. H. (2012). Assessing the Usability of University Websites: An Empirical Study on Namik Kemal University. Turkish Online Journal of Educational Technology, 11(3), 61–69. Moore, J. L., Dickson-Deane, C., & Galyen, K. (2011). e-Learning, online learning, and distance learning environments: Are they the same? The Internet and Higher Education, 14(2), 129–135. doi:10.1016/j.iheduc.2010.10.001 Muhi, K., Szőke, G., Fülöp, L. J., Ferenc, R., & Berger, Á. (2013). A Semi-automatic Usability Evaluation Framework. Computational Science and Its Applications ICCSA ‘13 (pp. 529-542).
445
Evaluation Methods for E-Learning Applications in Terms of User Satisfaction
Navimipour, N. J., & Zareie, B. (2015). A model for assessing the impact of e-learning systems on employees satisfaction. Computers in Human Behavior, 53, 475–485. doi:10.1016/j.chb.2015.07.026 Nielsen, J. (1994). Heuristic evaluation. Usability inspection methods, 17(1), 25-62. Orfanou, K., Tselios, N., & Katsanos, C. (2015). Perceived usability evaluation of learning management systems: Empirical evaluation of the System Usability Scale. The International Review of Research in Open and Distributed Learning, 16(2). doi:10.19173/irrodl.v16i2.1955 Pachler, N., & Daly, C. (2011). Key issues in e-learning: Research and practice. Bloomsbury Publishing. Paganelli, L., & Paternò, F. (2002). Intelligent analysis of user interactions with web applications Proceedings of theInternational conference on Intelligent user interfaces (pp. 111-118). doi:10.1145/502716.502735 Pariente-Martinez, B., Gonzalez-Rodriguez, M., Fernandez-Lanvin, D., & De Andres-Suarez, J. (2014). Measuring the role of age in user performance during interaction with computers. Universal Access in the Information Society. Paternò, F., Santoro, C., & Spano, L. D. (2012). Improving support for visual task modelling (pp. 299–306). Human-Centered Software Engineering. Persico, D., Manca, S., & Pozzi, F. (2014). Adapting the Technology Acceptance Model to evaluate the innovative potential of e-learning systems. Computers in Human Behavior, 30, 614–622. doi:10.1016/j. chb.2013.07.045 Phillips, R., McNaught, C., & Kennedy, G. (2012). Evaluating e-learning: Guiding research and practice. Routledge. Pressey, S. L. (1926). A simple apparatus which gives tests and scores-and teaches. School and society, 23(586), 373-376. Quade, M., Lehmann, G., Engelbrecht, K.-P., Roscher, D., & Albayrak, S. (2013). Automated usability evaluation of model-based adaptive user interfaces for users with special and specific needs by simulating user interaction (pp. 219–247). User Modeling and Adaptation for Daily Routines. doi:10.1007/978-14471-4778-7_9 Renaut, C., Batier, C., Flory, L., & Heyde, M. (2006). Improving web site usability for a better e-learning experience. Current developments in technology-assisted education (pp. 891–895). Badajoz, Spain: FORMATEX. Sheridan, D., White, D., & Gardner, L. A. (2002). Cecil: the first web-based LMS. Paper presented at the ASCILITE. Simões, A. P., & de Moraes, A. (2012). The ergonomic evaluation of a virtual learning environment usability. Work (Reading, Mass.), 41, 1140. PMID:22316872 Sitzmann, T., Kraiger, K., Stewart, D., & Wisher, R. (2006). The comparative effectiveness of web‐based and classroom instruction: A meta‐analysis. Personnel Psychology, 59(3), 623–664. doi:10.1111/j.17446570.2006.00049.x
446
Evaluation Methods for E-Learning Applications in Terms of User Satisfaction
Tiedtke, T., Märtin, C., & Gerth, N. (2002). AWUSA-A tool for automated website usability analysis. Proceedings of theWorkshop on Interactive Systems. Design, Specification, and Verification. Rostock, Germany (pp. 12-14). Triacca, L., Bolchini, D., Botturi, L., & Inversini, A. (2004). MiLE: Systematic Usability Evaluation for E-learning Web Applications. Paper presented at theWorld Conference on Educational Multimedia, Hypermedia and Telecommunications. Tsai, P. C.-F., Yen, Y.-F., Huang, L.-C., & Huang, C. (2007). A study on motivating employees learning commitment in the post-downsizing era: Job satisfaction perspective. Journal of World Business, 42(2), 157–169. doi:10.1016/j.jwb.2007.02.002 Tullis, T., Fleischman, S., McNulty, M., Cianchette, C., & Bergel, M. (2002). An empirical comparison of lab and remote usability testing of web sites. Proceedings of theUsability Professionals Association Conference. Venkatesh, V., & Davis, F. D. (2000). A theoretical extension of the technology acceptance model: Four longitudinal field studies. Management Science, 46(2), 186–204. doi:10.1287/mnsc.46.2.186.11926 Wang, Y.-S., Wang, H.-Y., & Shee, D. Y. (2007). Measuring e-learning systems success in an organizational context: Scale development and validation. Computers in Human Behavior, 23(4), 1792–1808. doi:10.1016/j.chb.2005.10.006 Welsh, E. T., Wanberg, C. R., Brown, K. G., & Simmering, M. J. (2003). E‐learning: Emerging uses, empirical results and future directions. International Journal of Training and Development, 7(4), 245–258. doi:10.1046/j.1360-3736.2003.00184.x Wirt, J., Choy, S., Rooney, P., Hussar, W., Provasnik, S., & Hampden-Thompson, G. (2005). The Condition of Education, 2005. NCES 2005-094. National Center for Education Statistics. Yeh, Y.-c., & Lin, C. F. (2015). Aptitude-Treatment Interactions during Creativity Training in E-Learning: How Meaning-Making, Self-Regulation, and Knowledge Management Influence Creativity. Journal of Educational Technology & Society, 18(1), 119–131. Zhang, D., & Nunamaker, J. F. (2003). Powering e-learning in the new millennium: An overview of e-learning and enabling technology. Information Systems Frontiers, 5(2), 207–218. doi:10.1023/A:1022609809036
KEY TERMS AND DEFINITIONS

Assessment: Usually refers, in formal education, to judging students' attainment of the educational objectives for a specific course.
Blended Learning: Refers to a teaching process in which computer-based learning activities are integrated with traditional face-to-face teaching activities.
E-Learning: The learning process facilitated and supported through the use of information and communications technology.
Evaluation: Refers to the process of comparing or measuring a unit, course, program, or other element of e-learning against a set of performance or outcome criteria.
Learning Management System: A software system developed to manage online courses, including the administration, documentation, reporting, and delivery of educational content.
Micro-Learning: A form of learning based on the design of small, lightweight activities delivered in micro-steps within digital environments; these learning activities are made part of the learner's daily routines.
M-Learning: Learning delivered through portable, lightweight mobile devices, allowing the learner to engage in learning or training activities without geographical constraint.
Usability: The extent to which a product can be used by specified users to achieve specified goals with effectiveness, efficiency, and satisfaction in a specified context of use.
Chapter 19
Improve the Flipped Classroom with Universal Design for Learning

Thomas J. Tobin, The Pennsylvania State University, USA
Barbi Honeycutt, FLIP It Consulting, USA

DOI: 10.4018/978-1-5225-1851-8.ch019
ABSTRACT

The flipped-classroom approach has been adopted widely across higher education. Some faculty members have moved away from it because of the perceived workload required in order to implement a full course "flip." Faculty members can adopt the three principles of Universal Design for Learning (UDL) in order to reduce their own workload and make their flipped-classroom content and interactions more engaging, meaningful, and accessible for students. Adopting both the classroom flip and UDL provides benefits to learners and instructors that go beyond adopting either separately.
INTRODUCTION

The flipped-classroom model holds the promise of allowing faculty members to focus on higher-order thinking and application of course concepts with students during in-class meetings. While the term "flipped classroom" is relatively new, the concept is not. Faculty members at colleges and universities have been experimenting with the idea of "the inverted classroom" since 2000 (see Lage, Platt, & Treglia), but there are still different interpretations and definitions, especially since 2007 when the term "flipped classroom" was coined (Noonoo, 2012). In this chapter, the authors will suggest a three-part solution that helps faculty members to create robust and engaging flipped classrooms across the higher-education curriculum to improve learning, encourage engagement, and enhance access to learning interactions for all learners.

As scholars continue to analyze the different models and definitions of the flipped classroom, educators are eager to learn more and to assess the value of the approach for student learning. Research has shown that the flipped-classroom model improves student achievement of learning outcomes and increases student engagement (see Roach, 2014; Deslauriers, Schelew, & Wieman, 2011; Moravec, Williams, Aguilar-Roca, & O'Dowd, 2010; and Shaver, 2010). However, faculty experiences with the model vary, based on how the flipped classroom is defined and implemented. For example, the following case shows how one definition can result in negative experiences for both faculty members and students.

In 2012, Dr. Barbi Honeycutt visited a campus to facilitate a faculty-development workshop focused on the flipped classroom. As she was arranging her materials, distributing handouts, and organizing her work space, a faculty member came into the room, introduced himself, and said, "I flipped all of my lectures last semester for my 300-level course. I recorded all three lectures a week for the entire semester."

Dr. Honeycutt smiled politely and asked, "Oh, you did? So how did that work out for you?" She suspected what was coming next, but let the faculty member share his story. He said, "I'm exhausted. My teaching assistant is exhausted. After each recording, we spent about six hours per lecture finishing the editing and uploading files. It took so much time. And I'm not sure it was worth it."

Of course they were exhausted. A three-credit course meets for approximately an hour three times a week over the course of a typical semester. That's a minimum of 45 hours of video-recorded lectures. Add another six hours to each of those one-hour video segments and the whole idea of the flipped classroom seems impossible, especially on top of all of the other responsibilities that faculty members and students have.

Dr. Honeycutt was curious: "Wow, that's a lot of time spent in the recording studio. What kind of feedback did you get from your students?" The faculty member said, "That's the thing. I'm not so sure my students ever watched all of the videos. Very few came to class prepared. I eventually found myself just going back to my routine and delivering the same lectures during class that I had recorded in the videos. I figured that the videos were there if students wanted to re-watch a lecture or if they missed class. I guess I don't see the point of the flipped classroom. It took too much time, and it didn't seem to matter to the students anyway. At least I can say I tried it, but it's probably not something I would do again."

Dr. Tom Tobin and Dr. Honeycutt have heard versions of this record-every-minute story from faculty members in colleges and universities across the world. It is a common misconception about the flipped-classroom model. In a recent article on "The Condensed Classroom" for The Atlantic, Ian Bogost even perpetuates this misunderstanding: "condensed classes actually seem to require more work rather than less … [T]hey require the creation of elaborate video lectures" (2013).

Faculty adopters nearly always begin with excitement and enthusiasm about the possibilities of the flipped model, especially after hearing from other faculty members about positive changes in their students and even reports of having collaborative fun in the classroom. Some faculty members, however, respond with resistance and hesitance when the words "flipped classroom" are mentioned, for reasons much like those expressed by the faculty member in the case above. It is tempting to want to "dive in head first" and move all of one's content-sharing, lectures, readings, and student self-guided study away from in-classroom time.
The most common method for doing this is by recording videos for students to watch—not surprising, since the original “flipped classroom” concept was primarily video-focused (see Noonoo, 2012). In cases like the one described above, three challenges can be addressed to help faculty members to be successful with the flipped approach. First is the lack of a common definition of what is meant by the term “flipped classroom,” and consequently, confusion over how best to apply it. Think of the exhausted instructor who recorded all of his
in-class lectures and then spent hours editing, arranging, and sharing 45 hour-long videos—which most of his students didn’t watch, and who could blame them? Or, at least, think of his exhausted teaching assistant who spent weeks editing and uploading files rather than actually learning more about how to teach and engage students. We need a better definition of “flipping” that helps us to avoid the “record it all and post it” temptation. A second challenge is lack of clarity, purpose, and structure when it comes to designing assignments, assessments, and activities in the flipped classroom. This has a lot to do with how people feel about the reasons for which they do the work of putting interactions and content into fixed formats to begin with—something this chapter will explore in depth. The faculty member in the narrative above achieved the goal of making information available to students. In fact, he probably did too thorough a job, focusing on access to information at the expense of interaction and learner engagement. His students likely gave up on watching those hour-long lectures three times a week because there was too much to sort through, a lack of focus, and too much of a similarity to the in-class lectures. We need a better way to design the out-of-classroom learning experiences for our students so that they feel a connection between the in-class and out-of-class tasks. The two experiences need to be distinct yet connected, so that students feel supported, encouraged, and motivated to continue with the course. Students need to see the value of both the in-class and out-of-class learning experiences. Finally, there is the challenge of how to adjust to the shifting roles of both the faculty member and the students in the flipped learning environment (Mazur, 2013). The educator in our case study understood what he thought he and his students were supposed to do for the flipped activities outside of their in-class meetings: create and watch videos. This mental model wasn’t sufficient to keep the students engaged, however. It also taught the faculty member that the flipped-classroom approach was too much work for very little reward. By not formulating and clarifying roles for himself and his students that went beyond just watching videos of lectures, he set himself and his students up for frustration and burnout. It’s no wonder that he and his students reverted to what was more familiar (and less time consuming) to them—traditional lectures. We need a better way to set expectations for what faculty members and students do in the flipped-classroom model. We need a better way to encourage students to adopt the model and see the value of this type of learning experience, not only for the content knowledge in their discipline, but for all of the other “transferable” skills they learn when participating in these dynamic learning experiences. Fortunately, researchers in the area of Universal Design for Learning (UDL) have been addressing these three challenges for many years, but in a different context: supporting learners across the ability spectrum. Especially with the advent of multimedia technology and the widespread use of mobile devices among learners in higher education, Dr. Tobin (2014) argues that it is time to broaden the scope of UDL in order to apply it for reasons beyond disability support (pp. 13-14). 
UDL is an ideal way to address the three challenges of definition, design, and expectation setting that can help faculty members to create a positive and engaging flipped-classroom experience. As objectives for this chapter, readers will be able to define the flipped-classroom and universal-design-for-learning approaches, implement specific ways that the two can be used simultaneously to create a seamless means of support for learners, and create interactions that allow learners to obtain information, demonstrate skills, and stay engaged with their learning.
CHALLENGE 1: DEFINING THE FLIPPED CLASSROOM

It may help to craft a better definition of flipping the classroom by looking at the limitations of the current understanding and application of the concept, both among practitioner faculty members and in the scholarly literature. Lage, Platt, and Treglia published in 2000 on the "inverted classroom," which has the same premise as the flipped model. They explain, "Inverting the classroom means that events that have traditionally taken place inside the classroom now take place outside the classroom and vice versa" (p. 32). Since then, a robust scholarly literature has developed around how best to implement the flipped-classroom model in order to engage and support learning. As technological innovations have emerged, they have allowed faculty members and course designers to create more opportunities and resources for what the flipped classroom can be. As a result, the definition of the inverted, or flipped, classroom model has evolved. Strayer (2012) expands on Lage, Platt, and Treglia's definition, explaining that "an inverted (or flipped) classroom is a specific type of blended learning design that uses technology to move lectures outside the classroom and uses learning activities to move practice with concepts inside the classroom" (p. 171). The conversation around the flipped classroom model has broadened exponentially since teachers in K-12 settings started implementing the model (see Bergmann & Sams, 2012).

In addition to the literature, when Dr. Honeycutt and Dr. Tobin speak with instructors across the world, they hear many definitions and interpretations of what it actually means to flip a classroom. As cited above, the most common—and most debated—definition centers on the creation and integration of videos as tools for delivering instruction. Videos can indeed be an effective way to deliver and review information, and some educators are embracing the advantages of video-based education. Salman Khan is one of the pioneers in creating videos for educational purposes. His creation, Khan Academy, provides instructional videos to audiences through free Internet platforms such as YouTube (Bishop & Verleger, 2013). Khan's work opened the door for other educators to re-think the role videos can play in education. Likewise, the definition of the flipped classroom from EDUCAUSE, a nonprofit association which conducts research on instructional-technology integration in higher education, specifically mentions video: "The flipped classroom is a pedagogical model in which the typical lecture and homework elements of a course are reversed … Short video lectures are viewed by students at home before the class session, while in-class time is devoted to exercises, projects, or discussions" (EDUCAUSE, 2012).

Somewhere along the way, the flipped-classroom model became synonymous with recording entire video lectures for students to watch on their own outside of class time. There are many design flaws and assumptions made when this is the only way the flipped classroom is defined. For example: How can instructors assess whether or not students watched the video? What if students do not have access to video-watching technology outside of campus? What would happen if all professors recorded all of their lectures and students were suddenly tasked with watching more than 15 hours of lectures every week just to prepare for their classes? Who has time to record and edit all of these videos? Why exactly does an entire lecture have to be recorded for every class?
Most importantly, what is the goal? Watching a recorded lecture is not teaching, and not automatically learning. A video of a lecture is still just a lecture. Where's the innovation?

Many educators challenge the idea that the flipped-classroom model must include videos of lectures to enhance learning. High-school science teachers Jonathan Bergmann and Aaron Sams are the authors of Flip Your Classroom: Reach Every Student in Every Class Every Day. They define the classroom flip as a mindset, rather than as a set of strategies or techniques: "It's about flipping the attention away from the teacher and toward the learner … and leveraging educational tools to enhance the learning experience" (Sams & Bennett, 2012).

In order to help to avoid the frustration felt by the faculty member who had recorded 45 hours of lecture videos, the authors advocate re-framing the definition of the flipped classroom back to one of its earliest forms. Instead of focusing the definition on a specific technology or practice—in this case, recorded videos of lectures—the key to the classroom flip is intentionally leveraging various types of technology as pedagogical tools. This approach hearkens back to the original concept of inverting, or flipping, the design of the learning environment as first identified by Lage, Platt, and Treglia (2000, p. 34). They made it clear that inverted, or flipped, instructional design could indeed include videos as instructional tools, but the main idea is to reverse the types of interactions that happen in and out of the classroom. Additionally, if a video is used, it should be designed with a specific purpose and connected to a learning outcome. It is not a replacement or substitute for the instructor. It is not used solely for the purpose of accommodating students who miss class. The purpose of the video should be to prepare students for the in-class activities where the content will be applied and analyzed. Such an approach when using video has been shown to increase student learning outcomes as measured in test grades and assignment grades (Calimeris & Sauer, 2015, p. 14).

In addition to reversing when and where activities occur, educators can make class activities serve more structured purposes. The adoption of a definition of the classroom flip that is not limited to a specific technology or technique allows faculty members intentionally to design learning interactions in specific sequences based on the progressive categories of Bloom's Taxonomy of learning domains (see Anderson & Krathwohl, 2001). For each course topic, students engage with learning concepts within the lowest levels of Bloom's Taxonomy—knowledge and comprehension—before class. In-class time is spent focusing on the higher levels of Bloom's Taxonomy: analysis, evaluation, and creation (Honeycutt & Garrett, 2013, p. 2).

Derek Bruff (2012) provides a useful distinction between the traditional classroom and flipped classroom in terms of when learners first encounter new ideas. In the traditional classroom, new concepts are typically first encountered via the classroom lecture. The instructor introduces learners to new material, with deeper understanding and content reinforcement taking place via homework. Although this is an oversimplification—inquiry-based learning (IBL), problem-based learning (PBL), and active learning approaches also provide learning gains—many courses are still taught using the traditional lecture format, and research demonstrates that adopting the flipped-classroom model for individual or group activities leads to improved student learning outcomes (see Foldnes, 2016, pp. 47-48). In the flipped classroom, learners first encounter new ideas via videos, readings, interviews with practitioners in the field, or other self-directed study. Deeper learning, application, and critical thinking are then emphasized in the classroom via individual and group activities.
The promise of the flipped classroom is the potential to connect students effectively, encourage engagement, and enhance learning. As scholars continue to expand what it means to flip a classroom, educators are seeing promising results and are looking for ways to make the flipped model more successful (see Mason, Shuman, & Cook, 2013; Riendeau, 2012; Talbert, 2014; Wilson, 2013). One connection that has started to emerge is the intersection between the flipped classroom model and Universal Design for Learning. Both approaches emphasize the importance of intentionally designing course interactions around learners' needs, with the goals of enhancing learning, increasing student engagement, connecting students with the course material, and allowing learners to demonstrate their skills in supported and varied ways.

In Dr. Honeycutt's conversation with the early-adopter faculty member, these challenges can be examined more carefully, with an eye toward how to address them using UDL. The faculty member's overly-rigid interpretation of the flipped-classroom model all but guaranteed his students' and his own burnout. He was creating and sharing too much information, all without a purposeful focus on the design and structure for the students' learning experiences, both during and outside of class meetings. There was confusion around the roles and expectations for the students and the instructor, which eventually led everyone to revert to their comfort zones in more traditional lecture-based class roles. To address these common challenges, concepts from UDL can be introduced into the flipped model to help faculty members and students succeed in this type of learning environment.
CHALLENGE 2: OVERCOMING BARRIERS (NOT THE ONES YOU THINK)

In adopting the flipped classroom, we need a better way to design the out-of-classroom experiences for our students so they feel supported, motivated to stick with the course, and engaged with the learning. The theory and practice of UDL allows us to create out-of-classroom interactions and support that meet exactly these needs. Faculty members often associate UDL with the use of technology to help extend learning opportunities to students with disabilities, since most faculty members have had the experience of students with disabilities requesting specific accommodations, such as extra time on tests or alternative formats being created for course materials. Unlike accommodations, UDL is not a means of making specific changes for individual students upon request; in fact, it allows us to do much more than merely accommodate student disabilities.

In order to blend UDL and the flipped classroom model, we must first address an obstacle: people may be hard-wired not to want to use UDL or the flipped classroom model at all. Most faculty members and institutional staff members have had the experience of working on requests for accommodations from students with disabilities. Also, most faculty members have not yet received formal training or conducted research about Universal Design for Learning (Lombardi & Murray, 2011), and are unlikely to know specific details about what UDL encompasses. This sets people up to color their emotional response to UDL with the valence that they associate with accommodations. For neuropsychologists, the term "valence" has to do with how people add emotional coloring to "events, objects, and situations" that "may possess positive or negative valence; that is, they may possess intrinsic attractiveness or aversiveness" (Frijda, 1986, p. 207). In plain English, this means that people's emotions affect how they perceive the world around them and the events that they experience.

Researchers have been asking college and university faculty members for decades about how they respond to having students with learning challenges in their courses (see Fonosch & Schwab, 1981; Fichten, 1986; Nelson et al., 1990; Houck et al., 1992; Bento, 1996; Benham, 1997; Bigaj et al., 1999; Cook et al., 2009; Murray et al., 2009; Zhang et al., 2010; Lombardi & Murray, 2011; Murray et al., 2011). Faculty members should always respond supportively when students come to them with forms for accommodating challenges. Say you are a faculty member teaching a business-writing course, and a student comes to you at the end of the second week with a piece of paper to request an accommodation. The student says, "I need time and a half on tests and quizzes, and I need either software that can read the test questions out loud for me, or a live human being to do the same." What should your answer be? Of course, it should be "Sure, I'll set that up. Thank you for letting me know."
But how do faculty members actually feel when presented with accommodation requests? Based on the twelve large research studies mentioned above, the emotional valence associated with accommodations is almost uniformly negative. In many faculty members' minds, the fact that one must accommodate learners with disabilities brings up feelings of uncertainty, confusion, annoyance, and even anger, as the following excerpts from the research indicate.

Although faculty were willing to accommodate students with learning disabilities, they were concerned about maintaining academic integrity. For instance, several faculty members indicated that they would be willing to make accommodations for students with LD only if they could be assured that it would not lower academic standards … Some faculty commented that each student's case would have to be treated on an individual basis. Finally, faculty indicated that the student's attitude would influence whether or not they would provide him or her accommodations. (Nelson et al., 1990, p. 198)

Among the statistical group comparisons presented … four are somewhat troubling. Specifically, if faculty [members] perceive that having a learning disability could limit the selection of a major … and influences whether students with a learning disability can complete a degree program, such views may be inadvertently communicated to students with learning disabilities … This could be especially damaging at the university level, where student-teacher interaction is often limited, thus making alterations in one's perceptions more difficult. Whether conscious or unconscious, misconceptions or prejudicial attitudes may create barriers to the pursuit of certain careers or result in unequal opportunities. (Houck et al., 1992, p. 693)

Faculty attitudes towards disabled students were typically characterized by … the perception that disabled students were somehow "less able" and that their "disability" could jeopardize not only their own individual performance, but also limit the other students and the instructor. In several cases, these unfavorable feelings were compounded by the phenomenon of reactance: the professors grieved the perceived loss of their academic freedom, curtailed by the legal requirements of special accommodations. (Bento, 1996, p. 494)

Faculty members rated the majority of items under one theme, Accommodations-Willingness, as low importance and low agreement … It is possible that faculty members felt negatively about … accommodations because they are relatively difficult to implement, perceived as altering the nature of the course, or both. (Cook et al., 2009, p. 93)

[M]any faculty members are not fully supporting students with disabilities according to legal requirements or recommendations for best practices. This is an area of concern that institutions of higher education need to address to make certain that faculty members provide the necessary and reasonable accommodations and supports to students with disabilities. (Zhang et al., 2010, p. 283)

For many faculty members, interactions with students with disabilities, especially specific requests for accommodations, carry an aversive and negative emotional valence. Regardless of what faculty members think ought to be the case—and regardless of whether people act consciously on such negative emotions—these emotions ground their reactions and approaches to learners with disabilities. In interviews with faculty members
throughout North America about adopting UDL, the authors have heard feedback similar to the research findings above:

•	"I don't have time to do all of that work if it benefits just a few students with disabilities."
•	"My institution doesn't have a service for captioning videos, so I would have to do it all myself, and I have a lot of video clips for each of my courses."
•	"I've had a number of students come to me with 'learning disabilities,' but they don't seem to have disabilities when I interact with them in my courses."
•	"I think at least a few of my students are trying to game the system by claiming to have disabilities."
•	"I know that I should be following the law, but I'm not always clear about what it requires, and no one at my institution is enforcing it."
•	"I haven't had a student with a disability in my courses for years. I will wait until I have a request for an accommodation before I start doing all of that work."
Based on this kind of feedback, any conversations, training programs, or advocacy for adopting Universal Design for Learning principles would stand poor chances of success even before they happened, due to the negative emotional valence associated with making accommodations for students with disabilities, even though UDL is not a means of granting specific accommodations. In fact, the contrary actually applies. As Sam Johnston, a research scientist at the Center for Applied Special Technology (CAST), puts it, "we want a situation that is good for everybody … Part of it is thinking about what has to happen at the level of design that makes accommodation less necessary" (personal communication, November 15, 2013). Dr. Johnston means that adopting UDL principles in the design of course interactions greatly reduces the need for specific accommodation requests, because it increases access for all learners—including people with learning challenges.

So why hasn't everyone adopted UDL yet? Because people apply their negative emotions related to their experience of granting accommodations when they are presented with the option to apply UDL principles in their course design and teaching. Conversations about UDL are often smothered by aversive emotional valence before they can be considered on their merits.
Like Chocolate and Peanut Butter

Before addressing how and why to adopt UDL principles when designing interactions for flipped classrooms, here are a few core definitions. The research scientists at CAST came up with the concept of UDL, based on the various ways in which our brains process learning tasks:

Universal design for learning (UDL) is one part of the overall movement toward universal design … While providing access to information or to materials is often essential to learning, it is not sufficient. UDL requires that we not only design accessible information, but also an accessible pedagogy … The framework for UDL is based in findings from cognitive neuroscience that tell us about the needs of individual learners. It embeds accessible pedagogy into three specific and central considerations in teaching: the means of representing information, the means for students' expression of knowledge, and the means of engagement in learning. (Rose et al., 2006, p. 17)
UDL is often associated with the use of technology to help extend learning opportunities for students. UDL is an approach to the creation of learning experiences that incorporates multiple means of

•	Engaging with content and people
•	Representing information
•	Expressing skills and knowledge
The CAST web site for higher education, UDL on Campus, notes the importance of learner engagement among UDL principles (CAST, 2014); in light of this, CAST re-arranged the presentation of the components of UDL in 2014 to list engagement first. For the category of engagement, this means creating multiple ways to help students to self-regulate, such as helping them to keep pace with readings for the course, suggesting ways to portion out writing and research to avoid last-minute “crunching,” and encouraging learners to make connections beyond the immediate work they are doing in our courses. Making multiple representations of information is the part of UDL with which most of us are familiar. Faculty members largely know that they should be captioning or transcribing video clips, providing audio versions of text-based lecture notes, and segmenting longer items into more manageable (and more reviewable) chunks (see Zhang et al., 2010; Lombardi & Murray, 2011; Murray et al., 2011). Allowing learners to express their skills in multiple ways is often a new exercise for faculty members and course designers: if learners can write a 3-page essay, they can also be allowed to choose to create an audio podcast or video report, so long as all of the alternatives provide students with the means to meet the required assignment objectives. Choice is a powerful motivational strategy. Not only is choice part of UDL, but it can be helpful in addressing some of the challenges of the flipped-classroom model as well. Student motivation is often one of the top concerns for faculty members who flip their courses. If choice as an option for completing assignments is “built in,” then this challenge is addressed through the application of the principles of UDL. Before diving in to practical applications about how best to apply UDL principles to course interactions, however, we must first determine how to uncouple UDL from the negative emotional valence of people’s experiences with requests for specific accommodations. UDL began in the disability-advocacy community as a way of creating a more inclusive society, generally, which began with an architectural movement called universal design. “Recognition of disability as a civil right entails making sure that a person with a disability has access to the buildings, classrooms, and courts where those rights are learned and adjudicated” (Davidson, 2006, p. 126). UDL is an outgrowth of universal-design ideas in the built environment—such as allocating parking spaces for drivers with disabilities. UDL can often get mired in people’s perceptions of a “medical model” that perceives disability as primarily a health issue, where disabilities are deficits in function that reside within the individuals themselves. This medical model of disability helps to explain why many people unconsciously associate negative emotions with their interactions with people who have disabilities (Stodden et al., 2011, p. 83): the “otherness” is associated with the people with whom they interact. Contrast this with a social model of disability, in which the disabling factor is seen to be in the environment. If a student in a wheelchair encounters a library building with stairs but no ramp or flat-ground entryway, the disability is not inherent in the student—it is the poor design of the building that presents the challenge. 
Because higher education is largely in transition between these two mental models of disability, our first radical reflection about UDL is to re-frame it away from the concept of disability altogether, and
situate UDL in a narrative with which all faculty members and staffers are familiar—and one which ties in with the flipped-classroom approach through a much more neutral emotional valence: mobile learning. In comparison with learners from only fifteen years ago, the students who come to college today are significantly

•	More likely to require remedial instruction (Adams, 2015)
•	More likely to have poor study habits and time-management challenges (College Board, 2015)
•	Less likely to have significant time for study outside of the classroom (College Board, 2015)
In North America, more college students than ever before are adult learners with family and job responsibilities—and precious little time for studying: "Adult learners are juggling family, work, and educational responsibilities. They don't do optional" (Mason, 2014). A recent EDUCAUSE study shows that 86% of college students own smartphones, and a significant percentage also own other mobile devices such as tablets (Chen et al., 2015). The authors of an article reporting on the study extol the potential benefits of any-time, anywhere learning and collaboration:

As an integral part of students' daily lives, mobile technology has changed how they communicate, gather information, allocate time and attention, and potentially how they learn. The mobile platform's unique capabilities—including connectivity, cameras, sensors, and GPS—have great potential to enrich the academic experience. Learners are no longer limited to the classroom's geographical boundaries; for example, they can now record raw observations and analyze data on location. Furthermore, mobile technology platforms let individuals discuss issues with their colleagues or classmates in the field. The ever-growing mobile landscape thus represents new opportunities for learners both inside and outside the classroom. (Chen et al., 2015)

The argument for adopting Universal Design for Learning has always been based on the broad benefits that UDL methods provide to all learners, but for years there wasn't a compelling and simple case that demonstrated in a concrete way how those benefits play out. Now there is one: UDL is a way to reach out to adult learners on their mobile devices to help them to find more time for studying and engaging in learning. Mobile devices can also be valuable learning tools in the flipped-classroom model. They can be used before, during, and after class for reviewing content, conducting research, engaging in problem solving, and assessing learning. They can connect students to the course material, to other students, and to the instructor.
The Fight for a Level Playing Field

An important aside about re-focusing on mobile devices: if the fight for access to the built environment has largely been won, thanks to the protests and voices of disability-rights advocates, the fight for the rights of people with disabilities in higher education continues in earnest. In recent years, several high-profile court cases took nationally-prominent institutions to task for failing to meet even the minimum legal requirements, such as the decision against Harvard and MIT for not captioning their edX course materials (Lewin, 2015). In advocating for a switch in tactics to broaden the argument for adopting UDL practices, the authors are cognizant of the way that such a switch can take the spotlight off the needs of people with disabilities. Hao (2016) performed research that shows strong student preferences for
flipped-classroom approaches; the results incidentally demonstrate a corresponding strong preference for mobile-device-positive UDL, as well: The students showed the highest preference levels for BYOD [bring your own device] and IRS [instant response system]. Not only did the IRS feature impress the students, but also, the students repeatedly mentioned how much they loved both IRS and BYOD. One junior wrote in response to the open-ended questions, “The IRS got me less anxious about pop quizzes and made me look forward to taking a quiz.” Comparatively, the freshmen who reportedly had low motivation and self-discipline skills especially loved BYOD. One student noted, “I was never allowed to use my cell phone in class. I’m amazed that cell phones can be used for educational use in class. It’s so cool!” Another favorite feature was the absence of formal lectures and inclusion of a group discussion format. “I like to exchange opinions with my peers. The discussions stimulated me to think from different aspects,” one emphasized. “Lectures are boring. I prefer doing activities in class!” (p. 86) Our hope is that making the argument that adopting UDL benefits students using mobile devices will reduce the need for making specific accommodations and move us closer to the larger goal of disability advocacy, which is to allow everyone to have the same opportunities to learn—to erase difference and the need for separate accommodations as far as possible. UDL applications create opportunities for learners to encounter new information on their own, outside of interactions with instructors—leaving more time and space for collaborative review and exploration when instructors and learners are together. This is also the goal of the flipped-classroom approach. This is why the flipped-classroom approach and UDL work so well together, like “peanut butter and chocolate: two great tastes that taste great together,” to borrow a phrase from the candy-sales world.
Flipped + UDL Narratives

Imagine a single mother—call her Melissa—who is taking business-management courses at her local community college. She has a job in order to be able to support her family, and she takes courses in the evenings and on weekends. She does her homework, engages with the course readings, and completes her course projects after 10:00 p.m., when the kids are finally in bed. Her statistics-course professor has posted video clips in the learning-management system as study aids toward the midterm and final exams, but Melissa cannot take advantage of the videos because she doesn't want to wake her children and she doesn't want to tune her kids out altogether by using headphones. Melissa does not have a disability, but she does have a challenge: time.

Now, imagine if Melissa's professor provided transcripts of the audio in the video clips, or, better yet, captions. Melissa can turn down the sound, turn on the captions, and study for her course examinations, while remaining available in case her children need her. Adopting good UDL practices lets Melissa's professor reach out to her—and to all of her classmates—with options that allow her to choose how she experiences the materials that the professor has posted. This is a double win: the professor's work in creating the videos, plus one alternative version, is rewarded with more students actually using the resources, and the professor's students are rewarded with more flexibility in how they study for the course and learn its materials, concepts, and processes.

Imagine, too, a student on the football team at a large university in the Southeastern United States: call him Jamaal. Jamaal is often on a bus or train, traveling to away games with his teammates. He already has a special arrangement that allows him to miss a certain number of in-person course meetings in his chemistry course, and he realizes that he's missing out on an opportunity for learning. He wants to keep up with his professor's narrated lecture slides, but his Internet connection is spotty when he is traveling. Jamaal has to wait until he is back on campus to be able to download and open his professor's PowerPoint slides from the course web-resources page, since his mobile phone doesn't have Microsoft Office on it. Jamaal does not have a disability, but he does have a challenge: resource availability.

Now, imagine if Jamaal's chemistry professor took the same narrated PowerPoint slides and created a screen-capture video version that the professor then uploaded to YouTube. Jamaal—and all of his classmates—could then stream the video, even under challenging bandwidth conditions, and he would not need any specific software title in order to experience the lecture slides.

Finally, imagine a student—call her Amanda—whose National Guard unit is called up for an active-duty military tour, right in the middle of her studies toward her nursing degree. Amanda's professor in her anatomy and physiology course requires all students to pass a two-part final examination, in which the professor and student meet one on one and the professor quizzes the student on the name and location of various parts of the human body, with the professor providing one piece of information (the name of the part or its location on an anatomical model), and the student providing the other, by naming the part or pointing to the location on the model associated with the name. Amanda suspects that she will need to drop the course, since she will not be present to be able to complete the final examination, and there are no options for demonstrating her knowledge in a different way. Amanda thought she could buy her own anatomical model, but a quick look online showed her that the model used by her professor costs more than $6,000.00. Amanda does not have a disability, but she does have a challenge: distance.

Now, imagine that Amanda's professor offered students two different ways to take the final examination: in person (as above) or by Skype or other video-call software, using un-labeled diagrams provided by the professor ahead of time. The professor asks students to pan their cameras around themselves to show that there are no open books or study sheets being used; students can schedule the one-on-one time when and where it is most convenient to conduct the exam. Amanda uses the "private calls to home" area where she is deployed in order to do her live session for the final exam, and is able to continue her studies.

These examples highlight professors adopting UDL techniques in order to reach out to their students who are using mobile devices in order to overcome distance, time, and resource limitations: challenges to which everyone can relate, and which are not freighted with aversive emotional valence. If anything, these stories about designing course interactions for mobile learners are uplifting, and they provide faculty members with motivation to put in the effort up front to intentionally design experiences to enhance student engagement, increase learning, and decrease potential barriers.
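A brief, concrete aside on Melissa's scenario: a captions file is a small, plain-text artifact rather than a major production effort. The sketch below is a hypothetical illustration only—the file name, timestamps, and helper function are invented here, not drawn from the chapter or from any particular captioning tool—showing how a plain transcript with rough timings can be turned into a WebVTT captions file that most modern video players and learning management systems can display.

```python
# Hypothetical sketch: turning a timed transcript into a WebVTT captions file.
# File name, timings, and transcript text are invented for illustration.

def to_vtt(segments):
    """Build a WebVTT document from (start_seconds, end_seconds, text) tuples."""
    def stamp(seconds):
        hours, rem = divmod(int(seconds), 3600)
        minutes, secs = divmod(rem, 60)
        millis = int((seconds - int(seconds)) * 1000)  # truncate to whole milliseconds
        return f"{hours:02d}:{minutes:02d}:{secs:02d}.{millis:03d}"

    lines = ["WEBVTT", ""]            # required header, then a blank line
    for start, end, text in segments:
        lines.append(f"{stamp(start)} --> {stamp(end)}")  # cue timing line
        lines.append(text)                                # cue text
        lines.append("")                                  # blank line ends the cue
    return "\n".join(lines)

if __name__ == "__main__":
    transcript = [
        (0.0, 4.5, "Welcome back. Today we review the normal distribution."),
        (4.5, 9.0, "About 68 percent of values fall within one standard deviation."),
    ]
    with open("lecture01.vtt", "w", encoding="utf-8") as handle:
        handle.write(to_vtt(transcript))
```

The point is not the particular script but the design choice it represents: producing the text alternative once, at creation time, is far cheaper than fielding accommodation requests one at a time later.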
Educator Allison Posey is interviewed in CAST’s recent book, Universal Design for Learning: Theory and Practice, about why she adopts UDL practices: “I work the hardest the first time I design a lesson; then it gets much easier and I even find that I do not have to re-teach the content as often: most students get it the first time” (Meyer et al., 2014, p. 161).
CHALLENGE 3: ADJUSTING STUDENT AND FACULTY ROLES

Remember that UDL is an approach to the creation of learning experiences that incorporates multiple means of

•	Representing information,
•	Engaging with content and people, and
•	Expressing skills and knowledge.
As the definition of the flipped classroom evolves, one characteristic remains constant. Students must complete foundational knowledge work prior to class in order to be prepared to engage in the higher-level discussions and participate in the activities that will take place during class time. If that individual engagement doesn't happen, and students aren't well prepared to discuss and analyze, then it's very challenging to have a productive group session during class. Lack of student out-of-class preparation is one of the most frequently cited reasons for perceived flipped-classroom failures (see Perchik, 2014; Brame, 2013; and Bishop & Verleger, 2013). If instructors are not introducing new concepts and materials during the in-person classroom setting, they need some way to ensure that learners' first encounters with new ideas will be productive—and they need to find ways to make sure learners actually do encounter the new ideas and don't skip that first exposure or bail out if things get challenging.

Here is where UDL plays a significant role. UDL gives all individuals equal opportunities to learn and provides a blueprint for creating instructional goals, methods, materials, and assessments that work for everyone—not a single, one-size-fits-all solution, but rather flexible approaches that can be customized and adjusted for individual needs (CAST, 2013). UDL helps with two aspects of the flipped-classroom approach, specifically:

1. 	Deep student engagement with first-exposure concepts, processes, and ideas; and
2. 	Meaningful student expression during in-person group interactions.
Student Motivation and Engagement

UDL offers a framework for shifting the roles of faculty members and students in the flipped-learning environment. In fact, the first tenet of UDL, engaging with content and each other, is where adopting UDL practices does the most service to the flipped-classroom model. In addition to merely presenting new information to learners outside of class time, that presentation can be designed so that it triggers interaction and engagement, providing roles and specifying responsibilities for learners and faculty members to inhabit and explore.

UDL works best when learners receive feedback and encouragement about their progress in as close to real time as possible (CAST, 2014). Especially since learners encounter new topics on their own, it's up to faculty members and course designers to structure the materials they are using in order to help keep learners engaged and interacting with the course concepts and with each other. This is one of the benefits of the flipped classroom model, as well. When students are engaging in activities and experiences during class, the instructor can immediately see their progress, identify areas of confusion, and provide resources and support to clarify misunderstandings. This type of "assessment in action" is done in real time, and it's what makes the flipped-classroom model so effective for learning and maintaining momentum throughout a course. Students and instructors don't have to wait until a midterm or a final exam to make adjustments. This characteristic highlights the power of combining both the flipped model and UDL principles.

Some educators misunderstand the flipped-classroom model to require only that learners read the textbook or watch the lecture or review course materials on their own, and thus are expected to master
key concepts without any guidance from the instructor (Plotnikoff, 2013). Such a scenario is no different from the traditional-classroom model where students study their notes on their own or prepare individually for class. The flipped-classroom model asks learners to collaborate most closely and immediately when they are in the classroom together. Not only does this approach not preclude the possibility of interaction and collaboration during out-of-class activities, the level of engagement in out-of-class activities correlates to how well learners are prepared to be active participants as part of in-class activities. Thinking about out-of-class interactions through the lens of UDL helps us to design activities that keep learners engaged, focused, and on task. In the case of the faculty member who abandoned the flipped classroom, it is clear that neither he nor his students were clear about their roles and expectations. The students were required to watch a video of a lecture. But what were they supposed to do with that information? How do they know what information is important? How do they know what information will be applied during the in-person class? Often when instructors create assignments or tasks for students, they do so through the lens of being an expert. They are used to the expert role because they are scholars in their disciplines and they have been studying this information for a long time. As a result, they suffer from “the curse of knowledge” which means they know so much about the topic that they can’t remember what it’s like not to know it (see Heath & Heath, 2007). Faculty members often have a difficult time putting themselves into the role of being the novice, which is exactly what most of their students are. Their students are not experts yet. They are encountering the course material for the first time. If the instructor has not clearly designed a path towards success from the perspective of the novice, then students will disengage and resist. This ambiguity causes frustration and can lead to apathy, which is then often interpreted by the instructor as “it didn’t seem to matter to the students.” In our narrative, the students and the faculty member reverted back to the more traditional roles of lecturing and taking notes during class time rather than engaging in the higher domains of critical thinking and analysis. One way to design interactions so that the applications for learned content are evident to learners is to adopt the “ten and two” mental model. In driver’s education, learners are often told to place their hands at the ten o’clock and two o’clock positions on the steering wheel in order to have the strongest and most flexible control over the car. A similar model for course-interaction design offers maximum flexibility and control for learners. It has to do not with the clock face, but with minutes of time. Ask learners to spend no more than ten minutes encountering new things and ideas, and then ask them to spend at least two minutes taking an action related to it. For example, students might watch a five-minute video, read the first part of a textbook chapter for five minutes, and then take two minutes to reflect on the commonalities between the two resources. These reflection notes then feed forward into the next “take two” break, and create a foundation for richer in-class “flipped” conversation, as well. Good UDL supports interactions and new collaborative roles in the face-to-face elements of the flipped classroom model, as well. 
Faculty members and course designers can move beyond the learning management system (LMS) and use social-media and Web 2.0 tools like Google Apps in order to foster collaborative exploration and application of course concepts during face-to-face time together. Students can choose their roles as writers or speakers (or both) in the face-to-face environment if options are available for collaborative work in spaces that extend beyond the face-to-face classroom:

To flip the classroom discussions, I instituted a 200-word maximum (or a 1-minute time limit for audio and video posts), forcing students to be pithy, but concise. Whatever they did not get to talk about in the main post could be discussed in the comments field with other students. Students were much more likely to engage multiple times with the same post. (Cummings, 2016, p. 92)
Representing Information

In the flipped-classroom model, learners encounter new ideas outside of class time. Students stand the best chance of understanding those ideas if the ideas are presented in the ways in which the learners take in information best. But how do professors and course designers know how each student learns best? Well, they don't. That is why UDL counsels people to present each piece of new information in at least two ways.

Some might perceive UDL as requiring all possible alternative formats for any content created for learners. For videos of professors explaining course concepts, for example, one would have to create captions, a separate text transcript, an audio-only version, and so on. Think, in such a mindset, of how many separate files would have to be created just to support the class flip in the first place. Having to make five times that number of files seems like an insurmountable obstacle. Only, UDL doesn't work like that. And neither does the flipped-classroom model.

With UDL, two questions help to maximize the utility of out-of-class content without multiplying the workload. First, think about the course content as it is taught in a traditional fashion. Which concepts do learners traditionally find challenging? Where do they always get things wrong on tests and assignments? Where do they benefit from different approaches to the content? Those are the "first pass" places to create multiple alternative versions of content files. Second, select one primary and one secondary format for all course materials. Existing text-based course materials, such as lecture notes, study guides, and practice quizzes, can serve as the primary format; by then selecting a secondary format, such as audio-only, a whole-course application of UDL can be done within a narrowly defined scope.

With the flipped-classroom model, likewise, it is best not to try to flip everything in a whole course. Start by identifying "flippable moments," or places where active learning will add value to content mastery (Honeycutt, 2013): places within the course where students are confused or bored, or where there is information they absolutely must know before moving on to the next part of the course. These moments are where to invest the most time and energy when flipping, and they provide a good starting point for figuring out where to integrate UDL principles. UDL doesn't ask us to create materials that anticipate every possible use (e.g., students with visual disabilities, learners with poor Internet connections when they are going home on the bus), just to "design for the extremes" and add more ways of representing information later on, if and when new learner needs get expressed. CAST has created an "Educator Worksheet" to help professors and designers work through this decision-making process (CAST, 2011).

Adopting a "plus one" mentality regarding the formats in which course content is offered also automatically takes advantage of differentiated instruction (DI), which aims to assess the level from which students are beginning and offers varied ways for them to learn:

One of the primary objectives of differentiated instruction is that it acknowledges that not all students learn the same way. By offering instructional choices, students can use the learning style(s) that works best for them. The differentiated instructional process begins with an assessment of the students' prior knowledge and experiences. (Livingston, 2006)
Just having more than one pathway through learning content and interactions opens up the benefits of differentiation; it is not necessary to plan for and execute every media approach in order to have a positive effect on student learning and persistence. Likewise, UDL doesn't advocate a narrow adherence to rule-bound structures. For example, Quality Matters (QM) Standard 8 deals with "Accessibility and Usability" and is founded on the principles of UDL. QM pointedly does not prescribe the forms that such design must take:

• 8.1 Course navigation facilitates ease of use.
• 8.2 Information is provided on accessibility of technologies required in the course.
• 8.3 The course provides alternative means of access to course materials in formats that meet the needs of diverse learners.
• 8.4 The course design facilitates readability.
• 8.5 Course multimedia facilitate ease of use. (Quality Matters, 2016)
The faculty member who had problems flipping his classroom had narrowly defined the flipped classroom as "recorded videos of lectures," and he expected his students to watch each full hour of his lectures prior to coming to class. He exhausted himself recording all of those videos, and he did not successfully integrate them into the overall learning experience within the course. The students were burned out and frustrated because they could not see the purpose and value of spending time watching the videos. Had the professor considered alternative formats, integrated existing resources, segmented the videos into smaller chunks, and intentionally designed them to enhance or support specific learning outcomes, the approach would have been more effective and engaging. Video allows students to "pause and rewind the professor" (Ehlers, 2014), but relying on video alone to deliver the message does not address the needs of all learners, nor does it allow faculty members to identify more effective means of delivering information and engaging students in the learning process.
Expressing Concept Mastery

This final part of UDL and the flipped-classroom model can be the hardest to implement, but it is also the most fun (yes, that says "fun") for professors and students alike. Where possible, provide students with multiple ways to demonstrate their skills and learning. This does not necessarily mean creating separate alternative assignments. Rather, look at the objectives for assignments and consider whether students must use a particular format in order to demonstrate those objectives, or whether they can accomplish the same tasks in different ways. Research shows that giving students choices about their work leads to increased satisfaction and learning gains. For example, Wanner's idea of the "flexible student" is supported by "designing in" choices to course interactions:

Choosing assessment tasks was also well received by the students. They were content with the number of choices available and felt that additional choices could have been confusing. They did not always choose areas of strength for their assignments (such as an accomplished essay writer choosing to write an essay rather than another form of assessment), but this was the most common response. (Wanner & Palmer, 2015, p. 361)
For some assignments, such as learning how to write a business memo, the format is an integral part of the assignment; learners who created videos in response to such an assignment would not demonstrate good memo format. However, for many kinds of assignments the format is not integral to the skill set being demonstrated. In those situations, offer students the chance to create their responses in any format that meets the objectives, or provide a list of possible formats, such as a written response, a short video report, an audio podcast, or a hand-drawn diagram that students then photograph and submit. Not only does allowing multiple means of expression free learners to select their best skill sets, but it also makes grading less of a repetitive chore. Many faculty members would rather see varied and creative responses to an assignment than have to grade dozens of five-page essays.

However, if varied assignments are used, then the instructor needs to be prepared with several assessment strategies to accommodate them. This is not meant as a deterrent to inviting students to submit assignments in other formats; designed carefully, it can actually open up the potential for more feedback and less grading. One way to address this challenge is to integrate both formative and summative assessment processes. Formative assessments, such as classroom assessment techniques, are designed for practice and allow students to test their skills and knowledge and receive feedback without the high stakes involved in grading (see Angelo & Cross, 1993). Formative assessment tasks may be graded, but their percentage of the overall course grade should be low. These "practice" assessments allow students to test their knowledge and correct their mistakes while giving the instructor valuable information about how to proceed, based on learner performance.

Formative assessment activities provide ideal opportunities to try alternative assignment formats. In the business-memo example, a flipped strategy would be for students to write a summary or record a one-minute video explaining in their own words how a memo is formatted and why it matters. In another flipped strategy, students could be given a poorly formatted memo and asked to correct it by circling the errors and explaining how to make corrections. These activities would be completed during class time so that students can practice the skills of analyzing, evaluating, and creating before a summative evaluation, such as a test, project, or final exam, in which they demonstrate mastery.

The instructor who spoke with Dr. Honeycutt had uploaded at least 45 hours of video lectures without carefully considering whether video was the best medium, and he had not made strong connections between the videos and the activities designed for the in-person class time. He delivered the message in only one way: video. He did not specify what to look for in the videos or how students could test their own knowledge of the information presented. Had he introduced multiple ways for students to demonstrate mastery and test their knowledge, his students would have been more engaged and better prepared for their in-person class time. And by modeling different approaches and formats, the faculty member could have motivated his students to do the same when they completed their assignments.
CONCLUSION

As faculty members continue to implement the flipped-classroom model and other active-learning strategies, they will face many challenges and opportunities. It is important to identify which flipped-classroom techniques work best for both instructors and their students. Clearly defining the flip and its expectations for faculty and student roles is essential in creating a successful learning environment. It is also important to consider carefully the design and structure of both pre-class and in-class activities, in order to ensure that students are set up for success and not burnout. Designing with this intent creates a clear path for assessment by allowing students to practice before demonstrating mastery. Integrating the elements of UDL into the flipped-classroom model clarifies expectations and provides a variety of ways to connect students to the professor, to the course material, and to each other as partners in the learning process.

In early 2015, Dr. Tobin visited a campus to facilitate a faculty-development workshop on UDL. As he was arranging his materials, distributing handouts, and organizing his work space, a faculty member came into the room, introduced herself, and said, "You know, I expanded all of my course content according to UDL last year for my freshman-level writing course. I had no idea how much work accessibility was! I had to make three, four, and sometimes five different versions of everything in my course." Tom smiled politely and asked, "Oh, you did? So how did that work out for you?"
REFERENCES

Academic Partnerships. (2015, November 30). Quality Matters Monday: Standard 8.1. Faculty eCommons blog. Retrieved from http://facultyecommons.com/quality-matters-monday-standard-8-1/

Adams, C. J. (2015, September 9). 2015 SAT, ACT scores suggest many students aren't college-ready. Education Week. Retrieved from http://www.edweek.org/ew/articles/2015/09/09/2015-sat-act-scores-suggest-many-students.html

Anderson, L. W., & Krathwohl, D. (2001). A taxonomy for learning, teaching, and assessing: A revision of Bloom's taxonomy of educational objectives. New York: Longman.

Angelo, T., & Cross, P. (1993). Classroom assessment techniques: A handbook for college faculty (2nd ed.). San Francisco: Jossey-Bass.

Benham, N. E. (1997). Faculty attitudes and knowledge regarding specific disabilities and the Americans with Disabilities Act. College Student Journal, 31, 124–129.

Bento, R. F. (1996). Faculty decision-making about "reasonable accommodations" for disabled college students. College Student Journal, 30(4), 494.

Bergman, J., & Sams, A. (2012). Flip your classroom: Reach every student in every class every day. Arlington, VA: International Society for Technology in Education.

Bigaj, S. J., Shaw, S. F., & McGuire, J. M. (1999). Community-technical college faculty willingness to use and self-reported use of accommodation strategies for students with learning disabilities. Journal for Vocational Special Needs Education, 21(2), 3–14.

Bishop, J. L., & Verleger, M. A. (2013). The flipped classroom: A survey of the research. Proceedings of the American Society for Engineering Education annual conference. Retrieved from http://www.asee.org/public/conferences/20/papers/6219/download

Bogost, I. (2013). The condensed classroom. The Atlantic. Retrieved from http://www.theatlantic.com/technology/archive/2013/08/the-condensed-classroom/279013/
Brame, C. (2013). Flipping the classroom. Vanderbilt University Center for Teaching. Retrieved from http://cft.vanderbilt.edu/guides-sub-pages/flipping-the-classroom/

Bruff, D. (2012, Sep. 15). The flipped classroom FAQ. CIRTL Network. Retrieved from http://www.cirtl.net/node/7788

Calimeris, L., & Sauer, K. M. (2015). Flipping out about the flip: All hype or is there hope. International Review of Economics Education, 20, 13–28. doi:10.1016/j.iree.2015.08.001

Center for Applied Special Technology. (2011). UDL Guidelines–Educator worksheet. CAST UDL Online Modules. Retrieved from http://udlonline.cast.org/guidelines

Center for Applied Special Technology. (2013). About UDL. Retrieved from http://cast.org/udl/index.html

Center for Applied Special Technology. (2014). UDL on campus: Universal Design for Learning in higher education—A guide. Retrieved from http://udloncampus.cast.org/

Chen, B., Seilhamer, R., Bennett, L., & Bauer, S. (2015, Jun. 22). Students' mobile learning practices in higher education: A multi-year study. EDUCAUSE Review. Retrieved from http://er.educause.edu/articles/2015/6/students-mobile-learning-practices-in-higher-education-a-multiyear-study

College Board. (2015). 2015 College Board program results: Expanding access, challenging students, equipping educators. Retrieved from https://www.collegeboard.org/program-results

Cook, L., Rumrill, P. D., & Tankersley, M. (2009). Priorities and understanding of faculty members regarding college students with disabilities. International Journal of Teaching and Learning in Higher Education, 21(1), 84–96.

Cummings. (2016). Flipping the online classroom with web 2.0: The asynchronous workshop. Business and Professional Communication Quarterly, 79, 81–101.

Davidson, M. (2006). Universal design: The work of disability in an age of globalization. In L. Davis (Ed.), The Disability Studies Reader (2nd ed., pp. 117–130). New York, NY: Routledge.

Deslauriers, L., Schelew, E., & Wieman, C. (2011). Improved learning in a large-enrollment physics class. Science, 332(1), 862–864. doi:10.1126/science.1201783 PMID:21566198

EDUCAUSE. (2012). 7 things you should know about flipped classrooms. EDUCAUSE Learning Initiative. Retrieved from https://net.educause.edu/ir/library/pdf/eli7081.pdf

Ehlers, T. (2014, Oct. 23). Online education is the future of learning. Method Test Prep blog. Retrieved from http://info.methodtestprep.com/blog/online-education-is-the-future-of-learning

Fichten, C. S. (1986). Self, other, and situation-referent automatic thoughts: Interaction between people who have a physical disability and those who do not. Cognitive Therapy and Research, 10(5), 571–587. doi:10.1007/BF01177820

Foldnes, N. (2016). The flipped classroom and cooperative learning: Evidence from a randomized experiment. Active Learning in Higher Education, 17(1), 39–49. doi:10.1177/1469787415616726
Fonosch, G., & Schwab, L. O. (1981). Attitudes of selected university faculty members toward disabled students. Journal of College Student Personnel, 22(3), 229–235.

Frijda, N. H. (1986). The emotions. Studies in Emotion and Social Interaction series. Cambridge, UK: Cambridge University Press.

Hao, Y. (2016). Exploring undergraduates' perspectives and flipped learning readiness in their flipped classrooms. Computers in Human Behavior, 59, 82–92. doi:10.1016/j.chb.2016.01.032

Heath, C., & Heath, D. (2007). Made to stick: Why some ideas survive and others die. New York: Random House.

Heer, R. (2015). Revised Bloom's taxonomy. Effective teaching practices. Center for Excellence in Learning and Teaching (CELT). Ames, IA: Iowa State University. Retrieved from http://www.celt.iastate.edu/wp-content/uploads/2015/09/RevisedBloomsHandout-1.pdf

Honeycutt, B. (2013, March 25). Looking for "flippable moments" in your class. Faculty Focus blog. Retrieved from http://www.facultyfocus.com/articles/instructional-design/looking-for-flippable-moments-in-your-class/

Honeycutt, B., & Garrett, J. (2013). The flipped approach to a learner-centered class (White paper). Madison, WI: Magna Publications.

Houck, C. K., Asselin, S. B., Troutman, G. C., & Arrington, J. M. (1992). Students with learning disabilities in the university environment: A study of faculty and student perceptions. Journal of Learning Disabilities, 25(10), 678–684. doi:10.1177/002221949202501008 PMID:1460390

Lage, M. J., Platt, G. J., & Treglia, M. (2000). Inverting the classroom: A gateway to creating an inclusive learning environment. The Journal of Economic Education, 31(1), 30–43. doi:10.1080/00220480009596759

Lewin, T. (2015, February 13). Harvard and MIT are sued over lack of closed captions. New York Times. Retrieved from http://www.nytimes.com/2015/02/13/education/harvard-and-mit-sued-over-failing-to-caption-online-courses.html

Livingston, D. (2006, February 11). Differentiated instruction and assessment in the college classroom. Paper presented at the 12th Annual Conference on College and University Teaching. Kennesaw, GA: Kennesaw State University. Retrieved from http://home.lagrange.edu/dlivingston/differentiated.htm

Lombardi, A. R., & Murray, C. (2011). Measuring university faculty attitudes toward disability: Willingness to accommodate and adopt universal design principles. Journal of Vocational Rehabilitation, 34(1), 43–56.

Mason, G., Shuman, T. R., & Cook, K. E. (2013). Comparing the effectiveness of an inverted classroom to a traditional classroom in an upper-division engineering course. IEEE Transactions on Education, 56(4), 430–435. doi:10.1109/TE.2013.2249066

Mason, K. C. (2014, August 25). Colleges adjust to new reality that more students juggle work, family. PBS News Hour. Retrieved from http://www.pbs.org/newshour/updates/colleges-adjust-to-new-reality-that-students-juggle-work-family-more/
Mazur, E. (2013, March 13). The flipped classroom will redefine the role of educators. EvoLLLution blog. Retrieved from http://www.evolllution.com/distance_online_learning/audio-flipped-classroom-redefine-role-educators-10-years/

Meyer, A., Rose, D., & Gordon, D. (2014). Universal design for learning: Theory and practice. Wakefield, MA: CAST.

Moravec, M., Williams, A., Aguilar-Roca, N., & O'Dowd, D. K. (2010). Learn before lecture: A strategy that improves learning outcomes in a large introductory biology class. CBE Life Sciences Education, 9(4), 473–481. doi:10.1187/cbe.10-04-0063 PMID:21123694

Murray, C., Lombardi, A., & Wren, C. (2011). The effects of disability-focused training on the attitudes and perceptions of university staff. Remedial and Special Education, 32(4), 290–300. doi:10.1177/0741932510362188

Murray, C., Lombardi, A., Wren, C. T., & Keys, C. (2009). Associations between prior disability-focused training and disability-related attitudes and perceptions among university faculty. Learning Disability Quarterly, 32(2), 87–100. doi:10.2307/27740359

Nelson, J., Dodd, J., & Smith, D. (1990). Faculty willingness to accommodate students with learning disabilities. Journal of Learning Disabilities, 23(3), 185–189. doi:10.1177/002221949002300309 PMID:2313192

Noonoo, S. (2012, June 20). Flipped learning founders set the record straight. THE Journal. Retrieved from https://thejournal.com/articles/2012/06/20/flipped-learning-founders-q-and-a.aspx

Perchik, J. (2014, May 14). Flipped classroom: When it fails and why. In-Training blog. Retrieved from http://in-training.org/flipped-classroom-fails-7052

Plotnikoff, D. (2013, July 16). Classes should do hands-on exercises before reading and video, Stanford researchers say. Stanford Report. Retrieved from http://news.stanford.edu/news/2013/july/flipped-learning-model-071613.html

Quality Matters. (2016). Higher Ed Program Rubric. Retrieved from https://www.qualitymatters.org/rubric

Riendeau, D. (2012). Flipping the classroom. The Physics Teacher, 50(1), 507. doi:10.1119/1.4758164

Roach, T. (2014). Student perceptions toward flipped learning: New methods to increase interaction and active learning in economics. International Review of Economics Education, 17, 74–84. doi:10.1016/j.iree.2014.08.003

Rose, D., Harbour, W., Johnston, C. S., Daley, S., & Abarbanell, L. (2006). Universal design for learning in postsecondary education: Reflections on principles and their application. Journal of Postsecondary Education and Disability, 19(2), 17. Retrieved from http://www.udlcenter.org/sites/udlcenter.org/files/UDLinPostsecondary.pdf

Sams, A., & Bennett, B. (2012). The truth about flipped learning. eSchool News. Retrieved from http://www.eschoolnews.com/2012/05/31/the-truth-about-flipped-learning/3/
Shaver, M. (2010). Using low tech interactions in the chemistry classroom to engage students in active learning. Journal of Chemical Education, 87(12), 1320–1323. doi:10.1021/ed900017j

Stodden, R. A., Brown, S. E., & Roberts, K. (2011). Disability-friendly university environments: Conducting a climate assessment. New Directions for Higher Education, 1(154), 83–92. doi:10.1002/he.437

Strayer, J. F. (2012). How learning in an inverted classroom influences cooperation, innovation and task orientation. Learning Environments Research, 15(2), 171–193. doi:10.1007/s10984-012-9108-4

Talbert, R. (2014, January 27). The inverted calculus course: Overture. Casting Out Nines column. The Chronicle of Higher Education. Retrieved from http://chronicle.com/blognetwork/castingoutnines/2014/01/27/the-inverted-calculus-course-overture/

Tobin, T. J. (2014). Increase online student retention with universal design for learning. Quarterly Review of Distance Education, 15(3), 13–24.

Wanner, T., & Palmer, E. (2015). Personalising learning: Exploring student and teacher perceptions about flexible learning and assessment in a flipped university course. Computers & Education, 88, 354–369. doi:10.1016/j.compedu.2015.07.008

Wilson, S. G. (2013). The flipped class: A method to address the challenges of an undergraduate statistics course. Teaching of Psychology, 40(3), 193–199. doi:10.1177/0098628313487461

Zhang, D., Landmark, L., Reber, A., Hsu, H., Kwok, O., & Benz, M. (2010). University faculty knowledge, beliefs, and practices in providing reasonable accommodations to students with disabilities. Remedial and Special Education, 31(4), 276–286. doi:10.1177/0741932509338348
KEY TERMS AND DEFINITIONS

Accommodation: A specific change that allows an individual with a learning challenge to enjoy equal access to educational opportunities. Accommodations are usually provided reactively, in response to requests from students via campus units, such as disability-services offices, which act as advocates.

Bloom's Taxonomy of Learning Domains: A pyramid-shaped representation of the various ways in which learners acquire and demonstrate their learning, ranging in cognitive complexity along domains of remembering, understanding, applying, analyzing, evaluating, and creating. Anderson and Krathwohl's 2001 update to the original 1956 pyramid structure redefines the cognitive domain as the intersection of cognitive processes and knowledge dimensions, creating a three-dimensional hierarchy along their intersections. See Heer (2015) for a visual representation of the model.

Emotional Valence: A term from psychology that refers to the intrinsic attractiveness or aversiveness (or "emotional coloring") of the events, objects, and situations that people experience.

Flipped Classroom: A model for teaching and learning in which what is traditionally thought of as "homework" is performed together during in-class sessions and the "lecture" is experienced by students during individual study away from the classroom. Learners encounter new concepts first on their own, then work collaboratively with the instructor and classmates to examine, test, and demonstrate new applications and skills.
Plus-One Approach: A mental model for prioritizing which interactions to expand by adding media alternatives, learner choices, and modes of learner engagement. Think of the places in the course where learners always: 1) have questions, 2) get things wrong on tests and assignments, and 3) request explanations in different terms. Apply "plus one" design to these elements: add one choice, alternative, or means of self-regulation in each place identified. Plus-one thinking helps to focus one's design efforts on the places where they are likely to have the greatest impact for learners.

Ten and Two: The practice of providing no more than ten minutes of information or content before asking learners to spend at least two minutes taking an action of some kind. The ten-and-two rule of thumb is a handy way to ensure that content and interactions are appropriately "chunked" throughout one's course.

Universal Design for Learning (UDL): A set of principles for designing interactions with learners that create multiple means of learner engagement, multiple means of representing information, and multiple means for learners to demonstrate their knowledge and skills. UDL increases access for all learners, not just individuals with learning challenges.
COMPILATION OF REFERENCES
Abdollahian, M., Yang, Z., Coan, T., & Yesilada, B. (2013). Human development dynamics: An agent based simulation of macro social systems and individual heterogeneous evolutionary games. Complex Adaptive Systems Modeling, 1(18), 1–17. Abendroth, M., Harendza, S., & Riemer, M. (2013). Clinical decision making: A pilot e-learning study. The Clinical Teacher, 10(1), 51–55. doi:10.1111/j.1743-498X.2012.00629.x PMID:23294745 Abrami, P. C., Bernard, R. M., Bures, E., Borokhovski, E., & Tamim, R. M. (2011). Interaction in distance education and online learning: Using evidence and theory to improve practice. Journal of Computing in Higher Education, 23(2-3), 82–103. doi:10.1007/s12528-011-9043-x Abrams, Z. I. (2002). Surfing to cross-cultural awareness: Using internet-mediated projects to explore cultural stereotypes. Foreign Language Annals, 35(2), 141–160. doi:10.1111/j.1944-9720.2002.tb03151.x Academic Partnerships. (2015, November 30). Quality Matters Monday: Standard 8.1. Faculty eCommons blog. Retrieved from http://facultyecommons.com/quality-matters-monday-standard-8-1/ Acampora, G., Gaeta, M., & Loia, V. (2011). Combining multi-agent paradigm and memetic computing for personalized and adaptive learning experiences. Computational Intelligence, 27(2), 141–165. doi:10.1111/j.1467-8640.2010.00367.x Adams, C. J. (2015, September 9). 2015 SAT, ACT scores suggest many students aren’t college-ready. Education Week. Retrieved from http://www.edweek.org/ew/articles/2015/09/09/2015-sat-act-scores-suggest-many-students.html Adams, W. K., Reid, S., LeMaster, R., McKagan, S. B., Perkins, K. K., Dubson, M., & Wieman, C. E. (2008). A study of educational simulations part I - Engagement and learning. Retrieved from http://phet.colorado.edu/publications/ PhET_Interviews_I.pdf Ahmad, T., Härdle, W., Klinke, S., & Alawadhi, S. (2013). Using wiki to build an e-learning system in statistics in the Arabic language. Computational Statistics, 28(2), 481–491. doi:10.1007/s00180-012-0312-6 Ajayi, L. (2009). An Exploration of Pre-Service Teachers’ Perceptions of Learning to Teach while Using Asynchronous Discussion Board. Journal of Educational Technology & Society, 12(2), 86-n/a. Akyol, Z., & Garrison, D. R. (2011). Understanding cognitive presence in an online and blended community of inquiry: Assessing outcomes and processes for deep approaches to learning. British Journal of Educational Technology, Cognitive presence in an online and blended community of inquiry, 42(2), 233–250. Akyol, Z., & Garrison, D. R. (2008). The development of a community of inquiry over time in an online course: Understanding the progression and integration of social, cognitive and teaching presence. Journal of Asynchronous Learning Networks, 12(3-4), 3–22.
Akyol, Z., & Garrison, D. R. (2008). The development of a Community of Inquiry over time in an online Course: Understanding the progression and integration of social, cognitive and teaching presence. Journal of Asynchronous Learning Networks, 12(3-4), 3–22. Akyol, Z., & Garrison, D. R. (2011). Understanding cognitive presence in an online and blended community of inquiry: Assessing outcomes and processes for deep approaches to learning. British Journal of Educational Technology, 42(2), 233–250. doi:10.1111/j.1467-8535.2009.01029.x Akyol, Z., Garrison, D. R., & Ozden, M. Y. (2009). Online and blended communities of inquiry: Exploring the developmental and perceptual differences. International Review of Research in Open and Distance Learning, 10(6), 65–83. Albert, L. J., & Johnson, C. S. (2011). Socioeconomic status– and gender-based differences in students’ perceptions of elearning systems. Decision Sciences Journal of Innovative Education, 9(3), 421–436. doi:10.1111/j.1540-4609.2011.00320.x Albert, W., & Tullis, T. (2013). Measuring the user experience: collecting, analyzing, and presenting usability metrics. Newnes. Albion, P. R. (2001). Some Factors in the Development of Self-Efficacy Beliefs for Computer Use Among Teacher Education Students. Journal of Technology and Teacher Education, 9(3), 321–347. Aldrich, C. (2004). Simulations and the future of learning. San Francisco, CA: John Wiley and Sons. Al-Elq, A. (2010). Simulation-based medical teaching and learning. Journal of Family and Community Medicine, 17(1), 35–40. doi:10.4103/1319-1683.68787 PMID:22022669 Alessi, S. (2000). Simulation design for training and assessment. In H. O’Neil & D. H. Andrews (Eds.), Aircrew training and assessment (pp. 197–222). London, United Kingdom: Routledge. Allen, E. I., & Seaman, J. (2013). Changing course: Ten years of tracking online education in the United States. Babson Survey Research Group. Retrieved from http://www.onlinelearningsurvey.com/reports/changingcourse.pdf Allen, E. I., & Seaman, J. (2013). Changing course: Ten years of tracking online education in the United States. Quahog Research Group, LLC and Babson Survey Research Group. Retrieved from http://www.onlinelearningsurvey.com/ reports/changingcourse.pdf Allen, E. I., & Seaman, J. (2014). Grade change: Tracking online education in the United States. Babson Survey Research Group. Retrieved from http://www.onlinelearningsurvey.com/reports/gradechange.pdf Allen, E., & Seaman, J. (2008). Staying the course: Online education in the United States, 2008. The Sloan Consortium. Retrieved from http://sloanconsortium.org/publications/survey/staying_course Allen, I. E., & Seaman, J. (2013). Changing course: Ten years of tracking online education in the United States. Retrieved from http://www.onlinelearningsurvey.com/reports/changingcourse.pdf Allen, I. E., & Seaman, J. (2015). Grade level: Tracking online education in the United States. Babson Survey Group. Retrieved from http://www.onlinelearningsurvey.com/reports/gradelevel.pdf Ally, M. (2008). Foundation of educational theory for online learning. In T. Anderson (Ed.), The Theory and Practice of online Learning (2nd ed.). Edmonton: AU Press. Alonso, F., López, G., Manrique, D., & Viñes, J. M. (2005). An instructional model for web-based e-learning education with a blended learning process approach. British Journal of Educational Technology, 36(2), 217–235. doi:10.1111/j.14678535.2005.00454.x
Altowairiki, N. (2013). Instructors’ and students’ experiences with online collaborative learning in Higher Education [Master’s thesis]. University of Calgary. Amador, J. A., & Mederer, H. (2013). Migrating successful learner engagement strategies online: Opportunities and challenges using jigsaw groups and problem-based learning. Journal of Online Learning and Teaching, 9, 89–105. Amaral, K. E., Shank, J. D., Shibley, I., & Shibley, L. R. (2011). Designing a blended course: Using ADDIE to guide instructional design. Journal of College Science Teaching, 40(6), 80. Ambrose, S., Bridges, M. W., DiPietro, M., Lovett, M. C., & Norman, M. K. (2010). How learning works: seven researchbased principles for smart teaching. San Francisco, CA: Jossey-Bass. Ames, C., & Archer, J. (1988). Achievement goals in the classroom: Students learning strategies and motivation processes. Journal of Educational Psychology, 80(3), 260–267. doi:10.1037/0022-0663.80.3.260 Andersen, P., & Andersen, J. (1982). Nonverbal Immediacy in Instruction. In L. Barker (Ed.), Communication in the Classroom (pp. 98–120). Englewood Cliffs, NJ: Prentice-Hall. Anderson, J. L., & Barnett, M. (2013). Learning physics with digital game simulations in middle school science. Journal of Science Education and Technology, 22(1), 914–926. doi:10.1007/s10956-013-9438-8 Anderson, L. W., & Krathwohl, D. (2001). A taxonomy for learning, teaching, and assessing: A revision of Bloom’s taxonomy of educational objectives. New York: Longman. Anderson, L. W., & Krathwohl, D. R. (Eds.). (2001). A taxonomy for learning, teaching and assessing: A revision of Bloom’s Taxonomy of educational objectives: Complete edition. New York, NY: Longman. Anderson, L. W., & Krathwohl, D. R. (Eds.). (2001). A taxonomy for learning, teaching, and assessing: A revision of Bloom’s taxonomy of educational objectives. New York, NY: Longman. Anderson, L. W., Krathwohl, D. R., Airasian, P. W., Cruikshank, K. A., Mayer, R. E., Pintrich, P. R., & Wittrock, M. C. et al. (2001). A taxonomy for learning, teaching, and assessing: A revision of Bloom’s Taxonomy of educational objectives. New York, NY: Longman. Anderson, T. (2008). The theory and practice of online learning. Athabasca University Press. Anderson, T., Rourke, L., Garrison, D. R., & Archer, W. (2001). Assessing teacher presence in a computer conferencing context. Journal of Asynchronous Learning Networks, 5(2), 1–7. Anderson, T., Rourke, L., Garrison, D. R., & Archer, W. (2001). Assessing Teacher Presence in a computer conferencing context. Journal of Asynchronous Learning Networks, 5(2). Retrieved from http://auspace.athabascau.ca/handle/2149/725 Andresen, M. A. (2009). Asynchronous discussion forums: success factors, outcomes, assessments, and limitations. Journal of Educational Technology & Society, 12(1), 249. Andrica, S., & Candea, G. (2011). WaRR: A tool for high-fidelity web application record and replay. Proceedings of the41st International Conference on Dependable Systems & Networks. doi:10.1109/DSN.2011.5958253 Angelo, T., & Cross, P. (1993). Classroom assessment techniques: A handbook for college faculty (2nd ed.). SanFrancisco: Jossey-Bass. An, H., Shin, S., & Lim, K. (2009). The effects of different instructor facilitation approaches on students interactions during asynchronous online discussions. Computers & Education, 53(3), 749–760. doi:10.1016/j.compedu.2009.04.015
Antonova, A., & Martinov, M. (2010). Serious Games and Virtual Worlds in education and business.Paper presented on the SAI conference, Sofia. Antonova, A., & Todorova, K. (2010). Serious Games and Virtual Worlds for high-level learning experiences. Proceedings of the S3T Conference, Varna. Aragon, S. R. (2010). Creating social presence in online environments. In New Directions for Adult and Continuing Education (pp. 57-68). San Francisco, CA, USA: Jossey Bass. Arbaugh, J. B. (2001). How instructor immediacy behaviors affect student satisfaction and learning in web courses. Business Communication Quarterly, 64(4), 42–54. doi:10.1177/108056990106400405 Arbaugh, J. B., Cleveland-Innes, M., Diaz, S. R., Garrison, D. R., Ice, P., Richardson, K., & Swan, K. P. (2008). Developing a Community of Inquiry instrument: Testing a measure of the Community of Inquiry framework using a multiinstitutional sample. The Internet and Higher Education, 11(3), 133–136. doi:10.1016/j.iheduc.2008.06.003 Arbaugh, J. B., & Hwang, A. (2006). Does teaching presence exist in online MBA courses? The Internet and Higher Education, 9(1), 9–21. doi:10.1016/j.iheduc.2005.12.001 Archer, L., Hutchings, M., & Ross, A. (2003). Higher education and social class: Issues of exclusion and inclusion. Psychology Press. Argon, S. (2003). Creating social presence in online environments. New Directions for Adult and Continuing Education, 100(100), 57–68. doi:10.1002/ace.119 Argyle, M., & Dean, J. (1965). Eye-contact, distance and affiliation. Sociometry, 28(3), 289–304. doi:10.2307/2786027 PMID:14341239 Arthur-Mensah, N., & Shuck, B. (2014). Learning in developing countries: Implications for workforce training and development in Africa. New Horizons in Adult Education and Human Resource Development, 26(4), 41–46. doi:10.1002/ nha3.20084 Asarbakhsh, M., & Sandars, J. (2013). E-learning: The essential usability perspective. The Clinical Teacher, 10(1), 47–50. doi:10.1111/j.1743-498X.2012.00627.x PMID:23294744 Association of American Medical Colleges. (2014). Simulation center use at medical schools: 2013-2014. Retrieved from https://www.aamc.org/initiatives/cir/423320/16.html Asterhan, C. S., & Rosenberg, H. (2015). The promise, reality and dilemmas of secondary school teacher–student interactions in Facebook: The teacher perspective. Computers & Education, 85, 134–148. doi:10.1016/j.compedu.2015.02.003 Atterer, R., & Schmidt, A. (2007). Tracking the interaction of users with AJAX applications for usability testingProceedings of the SIGCHI conference on Human factors in computing systems (pp. 1347-1350). doi:10.1145/1240624.1240828 Awidi, I. T., & Cooper, M. (2015). Using management procedure gaps to enhance e-learning implementation in Africa. Computers & Education, 90, 64–79. doi:10.1016/j.compedu.2015.08.003 Azedevo, A. (2012, October 17). Wired Campus: San Jose State U. Says replacing live lectures with videos increased test scores. Chronicle of Higher Education Blog. Retrieved from http://chronicle.com/blogs/wiredcampus/san-jose-stateu-says-replacing-live-lectures-with-videos-increased-test-scores Azeiteiro, U. M., Bacelar-Nicolau, P., Caetano, F. J. P., & Caeiro, S. (2015). Education for sustainable development through e-learning in higher education: Experiences from Portugal. Journal of Cleaner Production, 106, 308–319. doi:10.1016/j.jclepro.2014.11.056 475
Baer, L. L. (2005). The generation gap: Bridging learners and educators. The International Digital Media & Arts Association Journal, 2(1), 47–52. Baird, J. R., & Northfield, J. R. (1992). Learning from the PEEL experience. Melbourne, Australia: Monash University. Baker, J. D. (2004). An investigation of relationships among instructor immediacy and affective and cognitive learning in the online classroom. The Internet and Higher Education, 7(1), 1–13. doi:10.1016/j.iheduc.2003.11.006 Baker, S., Gersten, R., & Graham, S. (2003). Teaching Expressive Writing to Students with Learning Disabilities ResearchBased Applications and Examples. Journal of Learning Disabilities, 36(2), 109–123. doi:10.1177/002221940303600204 PMID:15493427 Baker, W. (2015). Culture and complexity through English as a lingua franca: Rethinking competences and pedagogy in ELT. Journal of English as a Lingua Franca, 4(1), 9–30. doi:10.1515/jelf-2015-0005 Bakker, M., van der Hauvel-Panhuizen, M., van Borkulo, S., & Robitzsch, A. (2012). Effects of mini-games for enhancing multiplicative abilities: A first exploration. Serious games: The challenge. Communications in Computer and Information Science, 280, 53–57. doi:10.1007/978-3-642-33814-4_7 Balım, A. G., İnel, D., & Evrekli, E. (2007). Probleme dayalı öğrenme (PTÖ) yönteminin kavram karikatürleriyle birlikte kullanımı: Fen ve teknoloji dersi etkinliği. Turkish Republic of Northern Cyprus. Proceedings of theInternational Educational Technologies Conference, Famagusta, Cyprus. Ball, M. (2012). Using VoiceThread in a PK-2 Classroom. Learning and Leading with Technology, 40(3), 34–35. Bandura, A. (1981). Self-referent thought: A developmental analysis of self-efficacy. In J. H. Flavell & L. D. Ross (Eds.), Social cognitive development: Frontiers and possible futures. Cambridge, England: Cambridge University Press. Bandura, A. (1993). Perceived self-efficacy in cognitive development and functioning. Educational Psychologist, 28(2), 117–149. doi:10.1207/s15326985ep2802_3 Bangor, A., Kortum, P. T., & Miller, J. T. (2008). An empirical evaluation of the system usability scale. International Journal of Human-Computer Interaction, 24(6), 574–594. doi:10.1080/10447310802205776 Baños, R. M., Cebolla, A., Oliver, E., Alcañiz, M., & Botella, C. (2013). Efficacy and acceptability of an Internet platform to improve the learning of nutritional knowledge in children: The ETIOBE mates. Health Education Research, 28(2), 234–248. doi:10.1093/her/cys044 PMID:22498924 Barab, S. A., Hay, K. E., Squire, K., Barnett, M., Schmidt, R., Karrigan, K., & Johnson, C. et al. (2000). Virtual Solar System Project: Learning through a technology-rich, inquiry-based, participatory learning environment. Journal of Science Education and Technology, 9(1), 7–25. doi:10.1023/A:1009416822783 Barab, S., & Squire, K. (2004). Design-based research: Putting a stake in the ground. Journal of the Learning Sciences, 13(1), 1–14. doi:10.1207/s15327809jls1301_1 Barkley, E., Cross, K. P., & Mayor, C. H. (2005). Collaborative learning techniques. San Francisco: Jossey-Bass Publishers. Bassett, P. (2011). How Do Students View Asynchronous Online Discussions As A Learning Experience? Interdisciplinary Journal of E-Learning and Learning Objects, 7, 69-79. Bates, A. W. (2015). Teaching in a digital age: Guidelines for designing teaching and learning. Anthony William Bates. Retrieved from http:open.bccampus.ca Bates, A. W. T. (2005). Technology, e-learning and distance education. London, UK: Routledge. doi:10.4324/9780203463772
Battistoni, R. (1997). Service learning and democratic citizenship. Theory into Practice, 36(3), 150–156. Retrieved from http://www.tandfonline.com/doi/ref/10.1080/00405849709543761 doi:10.1080/00405849709543761 Beach, R., & OBrien, D. (2015). Fostering Students Science Inquiry Through App Affordances of Multimodality, Collaboration, Interactivity, and Connectivity. Reading & Writing Quarterly, 31(2), 119–134. doi:10.1080/10573569.201 4.962200 Beaubien, J., & Baker, D. (2004). The use of simulation for training teamwork skills in health care: How low can you go? Quality & Safety in Health Care, 13(Suppl. 1), 51–56. doi:10.1136/qshc.2004.009845 PMID:15465956 Becker, H.-J. (2000). Who’s Wired and Who’s Not: Children’s Access to and Use of Computer Technology. The Future of Children. Children and Computer Technology, 10(2), 44–75. Beck, S. A., & Huse, V. E. (2007). A virtual spin on the teaching of probability. Teaching Children Mathematics, 13(9), 482–486. Bedrule-Grigoruta, M. V., & Rusua, M. L. (2014). Considerations about e-learning tools for adult education. Procedia: Social and Behavioral Sciences, 142, 749–754. doi:10.1016/j.sbspro.2014.07.610 Beeckman, D., Schoonhoven, L., Boucqué, H., van Maele, G., & Defloor, T. (2008). Pressure ulcers: E-learning to improve classification by nurses and nursing students. Journal of Clinical Nursing, 17(13), 1697–1707. doi:10.1111/j.13652702.2007.02200.x PMID:18592624 Begičević, N., Divjak, B., & Hunjak, T. (2007). Prioritization of e-learning forms: A multicriteria methodology. Central European Journal of Operations Research, 15(4), 405–419. doi:10.1007/s10100-007-0039-6 Bell, L., Juersivich, N., Hammond, T. C., & Bell, R. L. (2012). The TPACK of dynamic representations. In R. N. Ronau, C. R. Rakes, & M. L. Niess (Eds.), Educational technology, teacher knowledge, and classroom impact (pp. 103–135). Hershey, PA, USA: IGI Global. doi:10.4018/978-1-60960-750-0.ch005 Belz, J. (2003). Linguistic perspectives on the development of intercultural competence in telecollaboration. Language Learning & Technology, 7(2), 68–99. Benham, N. E. (1997). Faculty attitudes and knowledge regarding specific disabilities and the Americans with Disabilities Act. College Student Journal, 31, 124–129. Bennet, A., & Bennet, D. (2006). Organizational survival in the new world. Burlington, UK: Elsevier. Bennett, R. E. (2006). Technology and writing assessment: Lessons learned from the US National Assessment of Educational Progress. Annual Conference of the International Association for Educational Assessment. Singapore: IAEA. Retrieved from http://www.iaea.info/documents/paper_1162a26d7.pdf Bento, R. F. (1996). Faculty decision-making about “reasonable accommodations” for disabled college students. College Student Journal, 30(4), 494. Berge, Z. L. (1995). Facilitating computer conferencing: Recommendations from the field. Educational Technology, 35(1), 22–30. Bergman, J., & Sams, A. (2012). Flip Your Classroom: Reach Every Student in Every Class Every Day. Arlington, VA: International Society for Technology in Education. Bernard, R. M., Abrami, P. C., Borokhovski, E., Wade, C. A., Tamim, R. M., Surkes, M. A., & Bethel, E. C. (2009). A meta-analysis of three types of interaction treatments in distance education. Review of Educational Research, 79(3), 1243–1289. doi:10.3102/0034654309333844 477
Berthon, P. R., Pitt, L. F., Plangger, K., & Shapiro, D. (2012). Marketing meets Web 2.0, social media, and creative consumers: Implications for international marketing strategy. Business Horizons, 55(3), 261–271. doi:10.1016/j. bushor.2012.01.007 Bharuthram, S., & Kies, C. (2013). Introducing e-learning in a South African higher education institution: Challenges arising from an intervention and possible responses. British Journal of Educational Technology, 44(3), 410–420. doi:10.1111/j.1467-8535.2012.01307.x Bhuasiri, W., Xaymoungkhoun, O., Zo, H., Rho, J. J., & Ciganek, A. P. (2012). Critical success factors for e-learning in developing countries: A comparative analysis between ICT experts and faculty. Computers & Education, 58(2), 843–855. doi:10.1016/j.compedu.2011.10.010 Biddix, J. P., Chungb, C.-J., & Parkc, H. W. (2014). The hybrid shift: Evidencing a student-driven restructuring of the college classroom. Computers & Education, 80, 162–175. doi:10.1016/j.compedu.2014.08.016 Bigaj, S. J., Shaw, S. F., & McGuire, J. M. (1999). Community-technical college faculty willingness to use and selfreported use of accommodation strategies for students with learning disabilities. Journal for Vocational Special Needs Education, 21(2), 3–14. Biggs, J. B. (2003). Constructive alignment in university. HERDSA Review of Higher Education, 1 Biggs, J.B. (2915). Teaching for quality learning at university. Buckingham: The Open University Press. Bill & Malinda Gates Foundation. (n. d.). What we do. Retrieved from http://www.gatesfoundation.org/What-We-Do/ US-Program/Postsecondary-Success Binkley, M., Erstad, O., Herman, J., Raizen, S., Ripley, M., Miller-Ricci, M., & Rumble, M. (2012). Defining twenty-first century skills. In P. Griffin, B. McGaw, & E. Care (Eds.), Assessment and teaching of 21st century skills (pp. 17–66). Dordrecht, Netherland: Springer. doi:10.1007/978-94-007-2324-5_2 Bishop, J. L., & Verleger, M. A. (2013). The flipped classroom: A survey of the research. Proceedings of the American Society for Engineering Education] annual conference. Retrieved from http://www.asee.org/public/conferences/20/ papers/6219/download Bishop, J. L., & Verleger, M. A. (2013, June). The flipped classroom: A survey of the research.Proceedings of the ASEE National Conference, Atlanta, GA, USA. Bitzer, D., Braunfeld, P., & Lichtenberger, W. (1961). PLATO: An automatic teaching device. IRE Transactions on Education, 4(4), 157–161. doi:10.1109/TE.1961.4322215 Black, P., & Wiliam, D. (1998). Assessment and Classroom Learning. Assessment in Education: Principles, Policy & Practice, 5(1), 7–74. doi:10.1080/0969595980050102 Blake, H. (2009). Staff perceptions of e-learning for teaching delivery in healthcare. Learning in Health and Social Care, 8(3), 223–234. doi:10.1111/j.1473-6861.2009.00213.x Bloom, B., Englehart, M., Furst, E., Hill, W., & Krathwohl, D. (1956). Taxonomy of educational objectives: The classification of educational goals. Handbook I: Cognitive domain. New York, Toronto: Longmans, Green. Bloxham, S., & Boyd, P. (2007). Developing Assessment in Higher Education: A Practical Guide, Ed:Open University Press Blumenfeld, P., Soloway, E., Marx, R., Krajcik, J. S., Guzdial, M., & Palincsar, A. (1991). Motivating project-based learning. Educational Psychologist, 26(3-4), 369–398. doi:10.1080/00461520.1991.9653139
Bodie, L. W., & Bober-Michel, M. (2014). An Experimental Study of Instructor Immediacy and Cognitive Learning in an Online Classroom. Proceedings of the2014 International Conference on Intelligent Environments, Shanghai, China. IEEE. doi:10.1109/IE.2014.50 Bogost, I. (2013). The condensed classroom. The Atlantic. Retrieved from http://www.theatlantic.com/technology/ archive/2013/08/the-condensed-classroom/279013/ Bok, D. (2006). Our underachieving colleges: A candid look at how much learners learn and why they should be learning more. Princeton, New Jersey: Princeton University Press. Bolliger, D. U., & Martindale, T. (2004). Key factors for determining student satisfaction in online courses. International Journal on E-Learning, 3(1), 61–67. Bonk, C. J. (2009). R2D2: A model for using technology in education. eCampus News. Retrieved from http://www. ecampusnews.com/top-news/r2d2-a-model-for-using-technology-in-education/ Bonk, C. J., & Zhang, K. (2008). The R2D2 Model: Read, reflect, display, and do. In Empowering Online Learning: 100+ Activities for Reading, Reflecting, Displaying, and Doing. San Francisco, CA, USA: Jossey-Bass. Retrieved from http://www.publicationshare.com/pdfs/Chapter-1-of-R2D2-100-activities-Book-by-Bonk-and-Zhang.pdf Bork, R. H., & Rucks-Ahidiana, Z. (2013). Role ambiguity in online courses: An analysis of student and instructor expectations. Community College Research Center, Teachers College, Columbia University. Retrieved from http://ccrc. tc.columbia.edu/publications/role-ambiguity-in-online-courses.html Borokhovski, E., Tamim, R., Bernard, R. M., Abrami, P. C., & Sokolovskaya, A. (2012). Are contextual and designed student-student interaction treatments equally effective in distance education? Distance Education, 33(3), 311–329. do i:10.1080/01587919.2012.723162 Borup, J., Graham, C. R., & Velasquez, A. (2011). The use of asynchronous video communication to improve instructor immediacy and social presence in a blended learning environment. In A. Kitchenham (Ed.), Blended learning across disciplines: models for implementation (pp. 38–57). Hershey, PA, USA: IGI Global. doi:10.4018/978-1-60960-479-0.ch003 Borup, J., West, R. E., & Graham, C. R. (2012). Improving online social presence through asynchronous video. The Internet and Higher Education, 15(3), 195–203. doi:10.1016/j.iheduc.2011.11.001 Borup, J., West, R. E., & Graham, C. R. (2013). The influence of asynchronous video communication on learner social presence: A narrative analysis of four cases. Distance Education, 34(1), 48–63. doi:10.1080/01587919.2013.770427 Borup, J., West, R. E., & Thomas, R. (2015). The impact of text versus video communication on instructor feedback in blended courses. Educational Technology Research and Development, 63(2), 161–184. doi:10.1007/s11423-015-9367-8 Boston, W. & Helm, J. S. (2012). Why student learning outcomes assessment is key to the future of MOOCs. National Institute for Learning Outcomes Assessment. Retrieved from http://illinois.edu/blog/view/915/84723?displayType=mo nth&displayMonth=201212 Boston, W., & Helm, J. S. (2012). Why student learning outcomes assessment is key to the future of MOOCs. National Institute for Learning Outcomes Assessment. Retrieved from http://illinois.edu/blog/view/915/84723?displayType=mo nth&displayMonth=201212 Bouzidi, L., & Jaillet, A. (2009). Can Online Peer Assessment be Trusted? Journal of Educational Technology & Society, 12(4), 257–268.
Bower, P., Kelsey, R., Bennington, B., Lemke, L. D., Liddicoat, J., Miccio, B. S., . . . Datta, S. (2014). Brownfield Action: Dissemination of a SENCER model curriculum and the creation of a collaborative STEM education network. Retrieved from http://seceij.net/files/seceij/winter14/brownfield_action.pdf Bower, P., Kelsey, R., & Moretti, F. (2011). Brownfield Action: An inquiry based multimedia simulation for teaching and learning environmental science. Science Education & Civic Engagement. International Journal (Toronto, Ont.), 3(1), 1–14. Bowman, J. (2014). Online learning in music: Foundations, frameworks, and practices. Oxford, New York: Oxford University Press. doi:10.1093/acprof:oso/9780199988174.001.0001 Boyd, D. (2004). The characteristics of successful online students. New Horizons in Adult Education, 2(18), 31–39. doi:10.1002/nha3.10184 Brackett, M. A., Rivers, S. E., & Salovey, P. (2011). Emotional Intelligence: Implications for Personal, Social, Academic, and Workplace Success. Social and Personality Psychology Compass, 5(1), 88–103. doi:10.1111/j.1751-9004.2010.00334.x Brame, C. (2013). Flipping the classroom. Vanderbilt University Center for Teaching. Retrieved from http://cft.vanderbilt. edu/guides-sub-pages/flipping-the-classroom/ Brammer, C., & Rees, M. (2007). Peer review from the students’ perspective: Invaluable or invalid? Composition Studies, 35(2), 71. Branch, R. M., & Kopcha, T. J. (2014). Instructional design models. In J. M. Spector, M. D. Merrill, & J. Elen (Eds.), Handbook of research on educational communications and technology (4th ed., pp. 77–87). New York, NY: Springer. doi:10.1007/978-1-4614-3185-5_7 Bransford, J. D., Brown, A. L., & Cocking, R. R. (2000). How people learn: Brain, mind, experience, and school. Washington, D.C.: National Academy Press. Bridgeman, B., Trapani, C., & Yigal, A. (2012). Comparison of human and machine scoring of essays: Differences by gender, ethnicity, and country. Applied Measurement in Education, 25(1), 27–40. doi:10.1080/08957347.2012.635502 Brindley, J. E., Walti, C., & Blaschke, L. M. (2009). Creating effective collaborative learning groups in an online environment. International Review of Research in Open and Distance Learning, 10(3), 1–18. Bringle, R. G., Phillips, M. A., & Hudson, M. (2001). The measure of service-learning: Research scales to assess student experiences. Washington, DC: American Psychological Association. Bringula, R. P. (2013). Influence of faculty-and web portal design-related factors on web portal usability: A hierarchical regression analysis. Computers & Education, 68, 187–198. doi:10.1016/j.compedu.2013.05.008 Bronfenbrenner, U. (1979). The ecology of human development: Experiments by nature and design. Harvard University Press. Brook, C., & Oliver, R. (2007). Exploring the influence of instructor actions on community development in online settings. In N. Lambropoulos & P. Zaphiris (Eds.), User-centered design of online learning communities (pp. 341–364). Hershey, PA, USA: IGI Global. doi:10.4018/978-1-59904-358-6.ch015 Brooke, J. (1996). SUS-A quick and dirty usability scale. Usability evaluation in industry, 189(194), 4-7. Brown, A. (1987). Metacognition, executive control, self-regulation, and other more mysterious mechanisms. In F. E. Weinert & R. H. Kluwe (Eds.), Metacognition, motivation, and understanding (pp. 65–116). Hillsdale, NJ: Lawrence Erlbaum Associates.
Brown, A., & Green, T. D. (2011). The essentials of instructional design: connecting fundamental principles with process and practice (2nd ed.). Upper Saddle River, NJ: Pearson/Merrill Prentice Hall. Brown, J., Collins, A., & Duguid, P. (1989, January-February). Situated cognition and the culture of learning. Educational Researcher, 18(1), 32–42. doi:10.3102/0013189X018001032 Brown, S. (2004). Assessment for learning. Learning and Teaching in Higher Education, 1(may), 81–89. Bruet, J. (2015). Intégrer le digital learning: la mutation technologique des services de formation. Suisse:e-doceo. Bruff, D. (2012, Sep. 15). The flipped classroom FAQ. CIRTL Network. Retrieved from http://www.cirtl.net/node/7788 Bruffee, K. (1984). Collaborative Leaming and the Conversation of Mankind. College English, 46(7), 635–652. doi:10.2307/376924 Brumini, G., Špalj, S., Mavrinac, M., Biočina-Lukenda, D., Strujić, M., & Brumini, M. (2014). Attitudes towards elearning amongst dental students at the universities in Croatia. European Journal of Dental Education, 18(1), 15–23. doi:10.1111/eje.12068 PMID:24423171 Bruner, J. (1966). Toward a theory of instruction. Cambridge, MA: Harvard University Press. Bruner, J. (1990). Acts of meaning. Cambridge, MA: Harvard University Press. Bruner, J. S. (1960). The process of education. Cambridge, MA: Harvard University Press. Bruns, A., & Humphreys, S. (2005, October 16-18). Wikis in teaching and assessment: The M/Cyclopedia project. Proceedings of theInternational Wiki Symposium, San Diego, CA, USA. doi:10.1145/1104973.1104976 Brunvand, S., & Byrd, S. (2011). Using VoiceThread to Promote Learning Engagement and Success for All Students. Teaching Exceptional Children, 43(4), 28–37. doi:10.1177/004005991104300403 Bryant, J., & Zillman, D. (Eds.). (2002). Media effects: Advances in theory and research (2nd ed.). Hillsdale, N J: Lawrence Erlbaum. Burden, K., & Atkinson, S. (2008). Evaluating pedagogical affordances of media sharing Web 2.0 technologies: A case study. Paper presented at theproceedings of ASCILITE, Melbourne, Australia. Burton-Jones, A., & Grange, C. (2012). From use to effective use: A representation theory perspective. Information Systems Research, 24(3), 632–658. doi:10.1287/isre.1120.0444 Bury, R., Martin, L., & Roberts, S. (2006). Achieving change through mutual development: Supported online learning and the evolving roles of health and information professionals. Health Information and Libraries Journal, 23(Suppl. 1), 22–31. doi:10.1111/j.1471-1842.2006.00677.x PMID:17206993 Büyüközkan, G., Ruan, D., & Feyzioğlu, O. (2007). Evaluating e-learning web site quality in a fuzzy environment. International Journal of Intelligent Systems, 22(5), 567–586. doi:10.1002/int.20214 Byrne, D. (1976). Teaching Oral English. London: Longman. Byrne, R., Tang, M., Truduc, J., & Tang, M. (2010). eGrader, a software application that automatically scores student essays: With a postscript on the ethical complexities. Journal of Systemics. Cybernetics & Informatics, 8(6), 30–35. Byrnes, J. P. (2001). Cognitive development and learning in instructional contexts. Allyn & Bacon. Calimeris, L., & Sauer, K. M. (2015). Flipping out about the flip: All hype or is there hope. International Review of Economics Education, 20, 13–28. doi:10.1016/j.iree.2015.08.001 481
Calisir, F., Gumussoy, C. A., Bayraktaroglu, A. E., & Karaali, D. (2014). Predicting the intention to use a web-based learning system: Perceived content quality, anxiety, perceived system quality, image, and the technology acceptance model. Human Factors and Ergonomics in Manufacturing & Service Industries, 24(5), 515–531. doi:10.1002/hfm.20548 Cameron, B. H. (2003). The effectiveness of simulation in a hybrid and online networking course. TechTrends, 47(5), 18–21. doi:10.1007/BF02763200 Campbell, D. (2014). The Influence of Teacher Immediacy Behaviors on Student Performance in an Online Course (and the Problem of Method Variance). Teaching of Psychology, 41(2), 163–166. doi:10.1177/0098628314530351 Cao, Y., & Hong, P. (2011). Antecedents and consequences of social media utilization in college teaching: A proposed model with mixed‐methods investigation. On the Horizon, 19(4), 297–306. doi:10.1108/10748121111179420 Capdeferro, N., & Romero, M. (2012). Are online learners frustrated with collaborative learning experiences? International Review of Research in Open and Distance Learning, 13(2), 26–44. Capece, G., & Campisi, D. (2013). User satisfaction affecting the acceptance of an e-learning platform as a mean for the development of the human capital. Behaviour & Information Technology, 32(4), 335–343. doi:10.1080/014492 9X.2011.630417 Carey, M.A. (1994). The group effect in focus groups: Planning, implementing, and interpreting focus group research. InMorse, J. (Ed.), Critical issues in qualitative research methods (pp. 225–241). Thousand Oaks, CA: Sage. Carpenter, Y., Moore, E. B., & Perkins, K. K. (2015). Representations and equations in an interactive simulation that support student development in balancing chemical equations. Retrieved from http://confchem.ccce.divched.org/sites/ confchem.ccce.divched.org/files/2015SpringConfChemP4.pdf Carroll, J. M. (2000). Making use: Scenario-based design of human-computer interactions. Cambridge, MA: MIT Press. doi:10.1145/347642.347652 Cartwright, V., & Hammond, M. (2003). The integration and embedding of ICT into the school curriculum: more questions than answers. Paper presented at theITTE 2003 Annual Conference of the Association of Information Technology for Teacher Education, Trinity and All Saints College, Leeds. Case Study Description. (n. d.). Writing@CSU. Retrieved from http://writing.colostate.edu/guides/guide.cfm?guideid=60 Casey, K. M., Davidson, G., Billig, S. H., & Springer, N. C. (2005). Service-learning: Research to transform the field. Greenwich, CT: Information Age Publishing. Cassino, R., & Tucci, M. (2011). Developing usable web interfaces with the aid of automatic verification of their formal specification. Journal of Visual Languages and Computing, 22(2), 140–149. doi:10.1016/j.jvlc.2010.12.001 Cassino, R., Tucci, M., Vitiello, G., & Francese, R. (2015). Empirical validation of an automatic usability evaluation method. Journal of Visual Languages and Computing, 28, 1–22. doi:10.1016/j.jvlc.2014.12.002 CAST. (2011). Universal Design for Learning Guidelines version 2.0. Wakefield, MA: Author. CAST. (2011). Universal design for learning guidelines version. Wakefield, MA. CAST. (2014). UDL Guidelines version 2.0. Retrieved from http://www.udlcenter.org/aboutudl/udlguidelines/principle1 CAST. (2015). About Universal Design for Learning. Retrieved from http://www.cast.org/our-work/about-udl.html#. VutQj7uFN9C
CCNMTL. (n. d.a). Featured project: Brownfield Action 3.0. Retrieved from http://ccnmtl.columbia.edu/projects/feature_pages/277_brownfieldaction.pdf CCNMTL. (n. d.b). COUNTRY X. Retrieved from http://ccnmtl.columbia.edu/portfolio/political_science_and_social_policy/country_x.html CCNMTL. (n. d.c). Featured project: Millennium Village simulation. Retrieved from http://ccnmtl.columbia.edu/projects/ feature_pages/288_mvsim_08.pdf Cendon, E. (2016). Bridging Theory and Practice – Reflective Learning in Higher Education. In W. Nuninger & J.-M. Châtelet (Eds.), Quality Assurance and Value Management in Higher Education. Hershey, PA: IGI Global. doi:10.4018/9781-5225-0024-7.ch012 Center for Applied Special Technology. (2011). UDL Guidelines–Educator worksheet. CAST UDL Online Modules. Retrieved from http://udlonline.cast.org/guidelines Center for Applied Special Technology. (2013). About UDL. Retrieved from http://cast.org/udl/index.html Center for Applied Special Technology. (2014). UDL on campus: Universal Design for Learning in higher education—A guide. Retrieved from http://udloncampus.cast.org/ Chacón, F. (1992). A taxonomy of computer media in distance education. Open Learning, 7(1), 12–27. doi:10.1080/0268051920070103 Chamberlain, J. M., Lancaster, K., Parson, R., & Perkins, K. K. (2014). How guidance affects student engagement with an interactive simulation. Chemistry Education Research and Practice, 15(4), 628–638. doi:10.1039/C4RP00009A Chan, M. S., & Black, J. B. (2005). Direct-manipulation animation: Incorporating the haptic channel in the learning process to support middle school students in science learning and mental model acquisition. Proceedings of the 7th international conference on Learning sciences ICLS ‘06 (pp. 64-70). Santa Monica, CA: International Society of the Learning Sciences. Chang, B., & Kang, H. (2016). Challenges facing group work online. Distance Education, 37(1), 73–88. doi:10.1080/ 01587919.2016.1154781 Chen, B., Seilhamer, R., Bennett, L., & Bauer, S. (2015, Jun. 22). Students’ mobile learning practices in higher education: A multi-year study. EDUCAUSE Review. Retrieved from http://er.educause.edu/articles/2015/6/students-mobilelearning-practices-in-higher-education-a-multiyear-study Chen, C.-F. E., & Cheng, W.-Y. E. (2008). Beyond the design of automated writing evaluation: Pedagogical practices and perceived learning effectiveness in EFL writing classes. Language Learning & Technology, 12(2), 94–112. Cheng, C. H., Wei, L. Y., & Chen, Y. H. (2011). A new e-learning achievement evaluation model based on rough set and similarity filter. Computational Intelligence, 27(2), 260–279. doi:10.1111/j.1467-8640.2011.00380.x Cheng, E., & Lee, J. (2014). Developing strategies for communities of practice. International Journal of Educational Management, 28(6), 751–764. doi:10.1108/IJEM-07-2013-0105 Chen, T. L. (2014). Exploring e-learning effectiveness perceptions of local government staff based on the diffusion of innovations model. Administration & Society, 46(4), 450–466. doi:10.1177/0095399713482313 Cheville, J. (2004). Automated scoring technologies and the rising influence of error. English Journal, 93(4), 47–52. doi:10.2307/4128980
Chickering, A. W., & Gamson, Z. F. (1987). Seven principles for good practice in undergraduate education. AAHE Bulletin, 39(7), 3–7. Chickering, A. W., & Gamson, Z. F. (1987). Seven Principles for Good Practice. AAHE Bulletin, 39(7), 3–7. Childs, S., Blenkinsopp, E., Hall, A., & Walton, G. (2005). Effective e-learning for health professionals and students— barriers and their solutions: A systematic review of the literature—findings from the HeXL project. Health Information and Libraries Journal, 22(Suppl. 2), 20–32. doi:10.1111/j.1470-3327.2005.00614.x PMID:16279973 Ching, Y. H., & Hsu, Y. C. (2013). Collaborative learning using VoiceThread in an online graduate course. Knowledge Management & E-Learning: An International Journal, 5(3), 298–314. Ching, Y.-H., & Hsu, Y.-C. (2015). Online Graduate Students’ Preferences of Discussion Modality: Does Gender Matter? Journal of Online Learning and Teaching, 11(1), 31. Chodorow, M., & Burstein, J. (2004). Beyond essay length: Evaluating e-rater’s performance on TOEFL essays (TOEFL research report, No. RR-04-73). Princeton, NJ: Educational Testing Service. Choi, I., Lee, S. J., & Kang, J. (2009). Implementing a case-based e-learning environment in a lecture-oriented anaesthesiology class: Do learning styles matter in complex problem solving over time? British Journal of Educational Technology, 40(5), 933–947. doi:10.1111/j.1467-8535.2008.00884.x Cho, K., Schunn, C. D., & Wilson, R. W. (2006). Validity and reliability of scaffolded peer assessment of writing from instructor and student perspectives. Journal of Educational Psychology, 98(4), 891–901. doi:10.1037/0022-0663.98.4.891 Chou, P. (2012). Teaching strategies in online discussion board: A framework in higher education. Higher Education Studies, 2(2), 25–30. doi:10.5539/hes.v2n2p25 Chua, B. B., & Dyson, L. E. (2004). Applying the ISO 9126 model to the evaluation of an e-learning system. Paper presented atASCILITE. Chun, D. M., & Wade, E. R. (2004). Collaborative cultural exchanges with CMC. In L. Lomicka & J. Cooke-Plagwitz (Eds.), Teaching with technology (pp. 220–247). Boston, MA: Heinle. Chuo, Y., Liu, C., & Tsai, C. (2015). Effectiveness of e-learning in hospitals. Technology and Health Care, 23(Suppl. 1), S157–S160. doi:10.3233/thc-150949 PMID:26410320 Churcher, K. M., Downs, E., & Tewksbury, D. (2014). Friending Vygotsky: A social constructivist pedagogy of knowledge building through classroom social media use. The Journal of Effective Teaching, 14(1), 33–50. Chyung, S., & Stepich, D. (2003). Applying the “Congruence” Principle of Bloom’s Taxonomy to designing online instruction. Quarterly Review of Distance Education, 4(3), 317–330. Cicconi, M. (2014). Vygotsky Meets Technology: A Reinvention of Collaboration in the Early Childhood Mathematics Classroom. Early Childhood Education Journal, 42(1), 57–65. doi:10.1007/s10643-013-0582-9 Cindy, J. (2007). Validating a computerized scoring system for assessing writing and placing students in composition courses. Assessing Writing, 11(3), 167–178. CISL Stanford. (n. d.). Types of learning. Retrieved from http://cisl.stanford.edu/what_is/learning_types/ Clark, C., Strudler, N., & Grove, K. (2015). Comparing Asynchronous and Synchronous Video vs. Text Based Discussions in an Online Teacher Education Course. Online Learning, 19(3), 48–69.
Clark, R. C., & Mayer, R. E. (2011). E-learning and the science of instruction: Proven guidelines for consumers and designers of multimedia learning. John Wiley & Sons. doi:10.1002/9781118255971 Class Act, R. I. T. Promoting Access for Deaf & Hard of Hearing learners. National Association for Developmental Education. (n. d.). Retrieved from https://sites.google.com/site/nadedisabilities/universal-design/ritclassactpromotingaccessfordeafhardofhearinglearners Çoban, S., & Tuncer, I. (2008). An experimental study of game‐based music education of primary school children. Proceedings of the 2nd European Conference on Games‐Based Learning (EC‐GBL), Barcelona, Spain. Cobb, P. (2007). Putting philosophy to work. In Second handbook of research on mathematics teaching and learning: A project of the National Council of Teachers of Mathematics. Cobb, P., & Bowers, J. (1999). Cognitive and situated learning perspectives in theory and practice. Educational Researcher, 28(2), 4–15. doi:10.3102/0013189X028002004 Cobb, P., Stephan, M., McClain, K., & Gravemeijer, K. (2011). Participating in classroom mathematical practices. In A Journey in Mathematics Education Research (pp. 117–163). Springer Netherlands. Cobb, P., & Yackel, E. (1996). Constructivist, emergent, and sociocultural perspectives in the context of developmental research. Educational Psychologist, 31(3-4), 175–190. doi:10.1080/00461520.1996.9653265 Coll, C., Rochera, M. J., de Gispert, I., & Diaz-Barriga, F. (2013). Distribution of feedback among teacher and students in online collaborative learning in small groups. Digital Education Review, 23, 27–45. College Board. (2015). 2015 College Board program results: Expanding access, challenging students, equipping educators. https://www.collegeboard.org/program-results Collins, A., Brown, J. S., & Holum, A. (1991). Cognitive apprenticeship: Making thinking visible. American Educator, 6, 38–46. Collis, B., & Meeuwsen, E. (1999). Learning to learn in a WWW-based environment. In French, D., Hale, C., Johnson, C., & Farr, G. (Eds.), Internet based learning - A framework for higher education and business (pp. 25-46). Sterling, VA: Stylus Publishing. Collis, B., & Moonen, J. (2001). Flexible learning in a digital world: Experiences and expectations. London, UK: Kogan Page. Colvin Clark, R., & Mayer, R. (2008). E-Learning and the Science of Instruction. San Francisco, USA: Pfeiffer. Comer, D. R., & Lenaghan, J. A. (2013). Enhancing discussions in the asynchronous online classroom: The lack of face-to-face interaction does not lessen the lesson. Journal of Management Education, 37(2), 261–294. doi:10.1177/1052562912442384 Condon, W. (2013). Large-scale assessment, locally-developed measures, and automated scoring of essays: Fishing for red herrings? Assessing Writing, 18(1), 100–108. doi:10.1016/j.asw.2012.11.001 Connell, T. (2002). Languages and Employability: A Question of Careers. London: National Centre for Languages. Connolly, T. M., Stansfield, M. H., Gould, C., Tsvetkova, N., Kusheva, R., & Stoimenova, B., … Dimitrova, N. (2011). Understanding the pedagogy Web 2.0 Supports: The presentation of a Web 2.0 pedagogical model. Proceedings of International Conference on European Transnational Education (ICEUTE), Salamanca, Spain. Connolly, T., Stansfield, M., Josephson, J., Lazaro, N., Rubio, G., & Ortz, C. R., … Tsvetanova, S. (2008). Using alternate reality games to support language learning. Proceedings of the 2nd European Conference on Games Based Learning. 
UK: Academic Conferences and Publishing International Reading.
Connolly, T. M., Boyle, E. A., MacArthur, E., Hainey, T., & Boyle, J. M. (2012). A systematic literature review of empirical evidence on computer games and serious games. Computers & Education, 59(2), 661–686. doi:10.1016/j. compedu.2012.03.004 Connolly, T. M., & Stansfield, M. H. (2007). From eLearning to games-based eLearning: Using interactive technologies in teaching Information Systems. International Journal of Information Technology Management, 6(2), 188–208. doi:10.1504/IJITM.2007.014000 Connors, R. (1997). Composition-rhetoric: Backgrounds, theory, and pedagogy. Pittsburg: University of Pittsburg Press. Connor, U., & Asenavage, K. (1994). Peer response groups in ESL writing classes: How much impact on revision? Journal of Second Language Writing, 3(3), 257–276. doi:10.1016/1060-3743(94)90019-1 Conole, G. (2010). Review of pedagogical models and their use in e-learning, Retrieved from http://cloudworks.ac.uk/ cloud/view/2982 Conrad, D. (2002). Engagement, excitement, anxiety and fear: Learners experience of starting an online course. American Journal of Distance Education, 16(4), 205–226. doi:10.1207/S15389286AJDE1604_2 Cook, D. A., Hatala, R., Brydges, R., Zendejas, B., Szostek, J. H., & Wang, A. T. (2011). Comparative effectiveness of technology-enhanced simulation versus other instructional methods: A systematic review and meta-analysis. Simulation in Healthcare, 7(5), 308–320. doi:10.1097/SIH.0b013e3182614f95 PMID:23032751 Cook, D. A., & Steinert, Y. (2013). Online learning for faculty development: A review of the literature. Medical Teacher, 35(11), 930–937. doi:10.3109/0142159X.2013.827328 PMID:24006931 Cook, L., Rumrill, P. D., & Tankersley, M. (2009). Priorities and understanding of faculty members regarding college students with disabilities. International Journal of Teaching and Learning in Higher Education, 21(1), 84–96. Cooper, D., & Higgins, S. (2015). The effectiveness of online instructional videos in the acquisition and demonstration of cognitive, affective and psychomotor rehabilitation skills. British Journal of Educational Technology, 46(4), 768–779. doi:10.1111/bjet.12166 Corti, K. (2006). Games-based learning: A serious business application. PIXELearning Limited. Cosman, P. H., Cregan, P. C., Martin, C. J., & Cartmill, J. A. (2001). Virtual reality simulators: Current status in acquisition and assessment of surgical skills. ANZ Journal of Surgery, 72(1), 30–34. doi:10.1046/j.1445-2197.2002.02293.x PMID:11906421 Costello, J. T., & McNaughton, R. B. (2016). Can dynamic capabilities be developed using workplace e-learning processes? Knowledge and Process Management, 23(1), 73–87. doi:10.1002/kpm.1500 Couros, A. (2003). Communities of Practice: A Literature Review. Retrieved from https://www.tcd.ie/CAPSL/_academic_practice/pdfdocs/Couros_2003.pdf Coursera. (2012). Peer Assessment. Coursera. Duke University. Retrieved from https://www.coursera.org/course/composition Coursera. (n. d.). Georgia Institute of Technology. Retrieved from https://www.coursera.org/course/gtcomp Coursera. (n. d.). The Ohio State University. Retrieved from https://www.coursera.org/course/writing2 Cox, M. J. (2013). Formal to informal learning with IT: Research challenges and issues for e-learning. Journal of Computer Assisted Learning, 29(1), 85–105. doi:10.1111/j.1365-2729.2012.00483.x
Craig, A., Goold, A., Coldwell, J., & Mustard, J. (2008). Perceptions of roles and responsibilities in online learning: A case study. Interdisciplinary Journal of E-Learning and Learning Objects, 4, 205–223. Creswell, J. W., & Plano Clark, V. L. (2011). Designing and conducting mixed methods research (2nd ed.). Thousand Oaks, CA, USA: SAGE Publications, Inc. Creswell, J. W. (2003). Research design: Qualitative, quantitative, and mixed methods approaches (2nd ed.). Thousand Oaks, CA: Sage Publications. Creswell, J. W., & Clark, P. V. L. (2007). Designing and conducting mixed methods research. Thousand Oaks, CA: Sage. Creswell, J. W., Plano Clark, V. L., Gutmann, M. L., & Hanson, W. E. (2003). Advanced mixed methods research designs. In A. Tashakkori & C. Teddlie (Eds.), Handbook of mixed methods in social and behavioral research (pp. 209–240). Thousand Oaks, CA, USA: Sage. Crookall, D., Oxford, R., & Saunders, D. (1987). Towards a reconceptualization of simulation: From representation to reality. Simulation/Games for Learning, 17(4), 147‐171. Csikszentmihalyi, M. (1990). Flow: The psychology of optimal experience. New York, NY: Harper and Row. Cull, S., Reed, D., & Kirk, K. (2010). Teaching Geoscience Online - A Workshop for Digital Faculty. Cummings, (2016). Flipping the online classroom with web 2.0: The asynchronous workshop. Business and Professional Communications Quarterly, 79, 81-101. D’Angelo, C., Rutstein, D., Harris, C., Bernard, R., Borokhovski, E., & Haertel, G. (2013). Simulations for STEM learning: Systematic review and meta-analysis. Menlo Park, CA: SRI International. da Rosa dos Santos, L. Altowairiki, N., Johnson, C., Liu, Y.F., Hill, L., & Lock, J. (2015). It’s not just a book club: A novel approach to prepare researchers for practice. In P. Preciado Babb, M. Takeuchi, & J. Lock (Eds.). Proceedings of the IDEAS: Designing Responsive Pedagogy Conference (pp. 53-61). da Rosa dos Santos, L., Seidel, J., & Lock, J. (2013). Integrating an LMS into field experience: An insightful experiment. In J. Herrington, A. Couros & V. Irvine (Eds.), Proceedings of EdMedia: World Conference on Educational Media and Technology 2013 (pp. 1927-1931). Dail, J., & Giles, T. (2012). The Hunger Games and Little Brother come to life on VoiceThread. The ALAN Review, Summer, 6-11. Damassa, D. A., & Sitko, T. D. (2010). Simulation technologies in higher education: Uses, trends, and implications. Retrieved from http://www.educause.edu/library/resources/simulationtechnologies-higher-education-uses-trends-andimplications Darabi, A., Arrastia, M. C., Nelson, D. W., Cornille, T., & Liang, X. (2011). Cognitive presence in asynchronous online learning: A comparison of four discussion strategies. Journal of Computer Assisted Learning, 27(3), 216–227. doi:10.1111/j.1365-2729.2010.00392.x Davids, M. R., Harvey, J., Halperin, M. L., & Chikte, U. M. E. (2015). Determining the number of participants needed for the usability evaluation of e-learning resources: A Monte Carlo simulation. British Journal of Educational Technology, 46(5), 1051–1055. doi:10.1111/bjet.12336 Davidson, M. (2006). Universal design: The work of disability in an age of globalization. In L. Davis (Ed.), The Disability Studies Reader (2nd ed., pp. 117-130). New York, NY: Routledge.
Davies, P. (2007). The Bologna Process and University Lifelong Learning: The State of Play and future Directions. BeFlexPlus. Retrieved from http://www.eucen.eu/BeFlex/FinalReports/BeFlexFullReportPD.pdf Davies, P., Schelly, C. L., & Spooner, C. (2013). Measuring the effectiveness of Universal Design for Learning intervention in postsecondary education. Journal of Postsecondary Education and Disability, 36(3), 195–220. Davis, F. D., Bagozzi, R. P., & Warshaw, P. R. (1989). User acceptance of computer technology: A comparison of two theoretical models. Management Science, 35(8), 982–1003. doi:10.1287/mnsc.35.8.982 Davis, T. (2015). Visual design for online learning. San Francisco, CA: Jossey-Bass. De Chesnay, M., & Anderson, B. (Eds.). (2016). Caring for the vulnerable: Perspectives in nursing theory, practice and research. Burlington, MA: Jones and Bartlett Learning. de Freitas, S. (2007). Post-16 e-learning content production: A synthesis of the literature. British Journal of Educational Technology, 38(2), 349–364. doi:10.1111/j.1467-8535.2006.00632.x De Freitas, S. (2008). Serious virtual worlds. JISC. De Jong, T. (2013). Learning by design. In Wild, Lefrere, Scott (Eds.), TEL2020: Technology and knowledge in the future, a roadmap. de Jong, T. (2005). The guided discovery principle in multimedia learning. In R. E. Mayer (Ed.), Cambridge handbook of multimedia learning (pp. 215–229). Cambridge, United Kingdom: Cambridge University Press. doi:10.1017/ CBO9780511816819.015 De Laat, M., & Lally, V. (2004). Its not so easy: Researching the complexity of emergent participant roles and awareness in asynchronous networked learning discussions. Journal of Computer Assisted Learning, 20(3), 165–171. doi:10.1111/ j.1365-2729.2004.00085.x De Oliveira, L. C., & Olesova, L. (2013). Learning about the Literacy Development of English Language Learners in Asynchronous Online Discussions. Journal of Education, 193(2), 15–23. de Santana, V. F., & Baranauskas, M. C. C. (2015). WELFIT: A remote evaluation tool for identifying Web usage patterns through client-side logging. International Journal of Human-Computer Studies, 76, 40–49. doi:10.1016/j.ijhcs.2014.12.005 de Vasconcelos, L. G., & Baldochi, L. A. Jr. (2012). Towards an automatic evaluation of web applicationsProceedings of the 27th Annual ACM Symposium on Applied Computing (pp. 709-716). doi:10.1145/2245276.2245410 De Wever, B., Schellens, T., Valcke, M., & Van Keer, H. (2006). Content analysis schemes to analyze transcripts of online asynchronous discussion groups: A review. Computers & Education, 46(1), 6–28. doi:10.1016/j.compedu.2005.04.005 De Wever, B., Van Keer, H., Schellens, T., & Valcke, M. (2010). Roles as a structuring tool in online discussion groups: The differential impact of different roles on social knowledge construction. Computers in Human Behavior, 26(4), 516–523. doi:10.1016/j.chb.2009.08.008 Deane, P. (2013). On the relationship between automated essay scoring and modern views of the writing construct. Assessing Writing, 18(1), 7–24. doi:10.1016/j.asw.2012.10.002 Deborah, L. J., Baskaran, R., & Kannan, A. (2014). Learning styles assessment and theoretical origin in an e-learning scenario: A survey. Artificial Intelligence Review, 42(4), 801–819. doi:10.1007/s10462-012-9344-0 Decman, M. (2015). Modeling the acceptance of e-learning in mandatory environments of higher education: The influence of previous education and gender. Computers in Human Behavior, 49, 272–281. doi:10.1016/j.chb.2015.03.022
deNoyelles, A., Mannheimer Zydney, J., & Baiyun, C. (2014). Strategies for Creating a Community of Inquiry through Online Asynchronous Discussions. Journal of Online Learning & Teaching, 10(1), 153–165. deNoyelles, A., Zydney, J., & Chen, B. (2014). Strategies for creating a community of inquiry through online asynchronous discussions. Journal of Online Learning and Teaching., 10(1), 153–165. Department of Defense. (1998). DoD modeling and simulation (M&S) glossary. Retrieved from http://www.dtic.mil/ whs/directives/corres/pdf/500059m.pdf Deperlioğlu, Ö., & Köse, U. (2010, 10-12 Şubat). Web 2.0 teknolojilerinin eğitim üzerindeki etkileri ve örnek bir öğrenme yaşantısı. Akademik Bilişim’10 - XII. Akademik Bilişim Konferansı Bildirileri, Muğla Üniversitesi, Muğla. Deslauriers, L., Schelew, E., & Weiman, C. (2011). Improved learning in a large-enrollment physics class. Science, 332(1), 862–864. doi:10.1126/science.1201783 PMID:21566198 Dewey, J. (1916). Democracy and Education: An introduction to the philosophy of education. New York, NY: MacMillan Company. Dewey, J. (1938). Experience & education. New York, NY: Simon & Schuster. Dimitrova, M., Mimirinis, M., & Murphy, M. (2004). Evaluating the flexibility of a pedagogical framework for e-learning. Proceedings of International Conference on Computer Systems and Applications (AICCSA-05), Cairo, Egypt. IEEE. doi:10.1109/ICALT.2004.1357422 Dimoka, A., Banker, R. D., Benbasat, I., Davis, F. D., Dennis, A. R., & Gefen, D. et al. others. (2012). On the use of neurophysiological tools in is research: Developing a research agenda for neurois. Management Information Systems Quarterly, 36(3), 679–702. DiPardo, A., & Freedman, S. (1988). Peer response groups in the writing classroom: Theoretic foundations and new directions. Review of Educational Research, 58(2), 119–150. doi:10.3102/00346543058002119 Douglas, K. M., & McGarty, C. (2001). Identifiability and self‐presentation: Computer‐mediated communication and intergroup interaction. The British Journal of Social Psychology, 40(3), 399–416. doi:10.1348/014466601164894 PMID:11593941 Dow, C. R., Li, Y. H., Huang, L. H., & Hsuan, P. (2014). Development of activity generation and behavior observation systems for distance learning. Computer Applications in Engineering Education, 22(1), 52–62. doi:10.1002/cae.20528 Downes, S. (2012). Connectivism and Connective Knowledge: Essays on meaning and learning networks. Retrieved from http://www.downes.ca/ Draves, W. A. (2007). Advanced teaching online (3rd ed.). River Falls, WI: LERN Books. Drexler, W. (2010). The networked student model for construction of personal learning environments: Balancing teacher control and student autonomy. Australasian Journal of Educational Technology, 26(3), 369–385. doi:10.14742/ajet.1081 Drigas, A. S., Ioannidou, R. E., Kokkalia, G., & Lytras, M. D. (2014). ICTs, mobile learning and social media to enhance learning for attention difficulties. Journal of Universal Computer Science, 20(10), 1499–1510. Driver, M. (2002). Exploring student perceptions of group interaction and class satisfaction in the web enhanced classroom. The Internet and Higher Education, 5(1), 35–45. doi:10.1016/S1096-7516(01)00076-8 Dudeney, G., Hockly, N., & Pegrum, M. (2013). Digital Literacies. London: Routledge.
Duffy, T. M., & Cunningham, D. J. (1996). Constructivism: implications for the design and delivery of instruction. In D. H. Jonassen (Ed.), Hand Book of Research For Educational Communications and Technology (pp. 170–197). New York: Simon and Schuster Macmillan Publishing. Duffy, T. M., & Cunningham, D. J. (1996). Constructivism: Implications for the design and delivery of instruction. In D. H. Jonassen (Ed.), Handbook of Research for Educational Communications and Technology (pp. 170–198). New York: Macmillan. Duffy, T. M., & Jonassen, D. H. (1991). Constructivism: New implications for instructional technology? Educational Technology, 31(5), 7–12. Du, J., Havard, B., & Li, H. (2005). Dynamic online discussion: Task-oriented interaction for deep learning. Educational Media International, 42(3), 207–218. doi:10.1080/09523980500161221 Duphorne, P. L., & Gunawardena, C. N. (2005). The effect of three computer conferencing designs on critical thinking skills of nursing students. American Journal of Distance Education, 19(1), 37–50. doi:10.1207/s15389286ajde1901_4 Durrington, V. A., Berryhill, A., & Swafford, J. (2006). Strategies for Enhancing Student Interactivity in an Online Environment. College Teaching, 54(1), 190–193. doi:10.3200/CTCH.54.1.190-193 Dwivedi, P., & Bharadwaj, K. K. (2015). E-learning recommender system for a group of learners based on the unified learner profile approach. Expert Systems: International Journal of Knowledge Engineering and Neural Networks, 32(2), 264–276. doi:10.1111/exsy.12061 Dziuban, C. D., Hartman, J. L., & Moskal, P. D. (2004). Blended learning. Educause Research Bulletin, 2004(7). Retrieved from https://net.educause.edu/ir/library/pdf/erb0407.pdf Easton, S. S. (2003). Clarifying the instructors role in online distance learning. Communication Education, 52(2), 87–105. doi:10.1080/03634520302470 Eccles, J. S., Adler, T. F., Futterman, R., Goff, S. B., Kaczala, C. M., Meece, J. L., & Midgley, C. (1983). Expectancies, values, and academic behaviors. In J. T. Spence (Ed.), Achievement and achievement motivation (pp. 75–146). San Francisco, CA: W. H. Freeman. Eckhardt, A., Maier, C., & Buettner, R. (2012). The Influence of Pressure to Perform and Experience on Changing Perceptions and User Performance: A Multi-Method Experimental Analysis. Proceedings ICIS ‘12. Eddy, P. L., & Lawrence, A. (2013). Wikis as platforms for authentic assessment. Innovative Higher Education, 38(4), 253–265. doi:10.1007/s10755-012-9239-7 EDUCAUSE. (2005). 7 Things you should know about...Social bookmarking. Retrieved from http://net.educause.edu/ ir/library/pdf/ ELI7001.pdf EDUCAUSE. (2012). 7 things you should know about flipped classrooms. EDUCAUSE Learning Initiative. Retrieved from https://net.educause.edu/ir/library/pdf/eli7081.pdf Educause. (2012). Retrieved from http://net.educause.edu/ir/library/pdf/eli7078.pdf Educause. (2013a). Peer Assessment in MOOCs. Retrieved from https://net.educause.edu/ir/library/pdf/ELI139_OL14.pdf Educause. (2013b). Writing II: Rhetorical Composing. Retrieved from http://www.educause.edu/sites/default/files/library/ presentations/E13/SESS008/Writing2-Final-Report.pdf EF. (2010). Work programme on the follow-up of the objectives of education and training systems in Europe. Retrieved from http://eur-lex.europa.eu/legal-content/EN/TXT/?uri = URISERV%3Ac11086 490
Ehlers, T. (2014, Oct. 23). Online education is the future of learnings. Method Test Prep blog. Retrieved from http://info. methodtestprep.com/blog/online-education-is-the-future-of-learning Ehlers, U. D., & Hilera, J. R. (2012). Special issue on quality in e-learning. Journal of Computer Assisted Learning, 28(1), 1–3. doi:10.1111/j.1365-2729.2011.00448.x Ehrlich, T. (2000). Civic responsibility and higher education. New York, NY: Oryx Press. Elbow, P. (1981). Writing with power: Techniques for mastering the writing process. Oxford University Press. Engleman, M., & Schmidt, M. (2007). Testing an experimental universally designed learning unit in a graduate level online teacher education course. Journal of Online Learning and Teaching, 3(2), 112–132. Ent, V. L. (2016). Is flipped learning really new to academia? TechTrends, 60(3), 204–206. doi:10.1007/s11528-016-0060-5 Ertmer, P. A., Sadaf, A., & Ertmer, D. J. (2011). Student-content interactions in online courses: The role of question prompts in facilitating higher-level engagement with course content. Journal of Computing in Higher Education, 23(23), 157–186. doi:10.1007/s12528-011-9047-6 Escobar-Rodriguez, T., & Monge-Lozano, P. (2012). The acceptance of Moodle technology by business administration students. Computers & Education, 58(4), 1085–1093. doi:10.1016/j.compedu.2011.11.012 Etzkowitz, H., & Leydesdorff, L. (2000). The dynamics of innovation. In Science and Technology (pp. 109-123). Etzkowitz, H. (2008). The Triple Helix: University–Industry–Government Innovation in Action. New York: Routledge. doi:10.4324/9780203929605 European Association for Quality Assurance in Higher Education-ENAQ. (2015). Standards and Guidelines for Quality Assurance in the European Higher Education Area, Retrieved from http://www.enqa.eu/wp-content/uploads/2015/11/ ESG_2015.pdf European Commission. (2001). Making a European Area of Lifelong Learning a Reality. Retrieved from http://eur-lex. europa.eu/LexUriServ/LexUriServ.do?uri = COM:2001:0678:FIN:EN:PDF European Council (2000). Lisbon European Council 23 and 24 March 2000, Presidency Conclusions. Brussels: EU. European Commission. Directorate General for Employment, Social Affairs and Inclusion. (2011). The Social Dimension of the Europe 2020 Strategy: A Report of the Social Protection Committee. Luxembourg: Publications Office of the European Union. European Commission-EC. (2007), The Key Competences for Lifelong Learning – A European Reference Framework, Annex of a OJ of the EU 2006/L394), Education and culture DG, Ed: Official publications of the EC European Commission-EC. (2010). EUROPE 2020: A strategy for smart, sustainable and inclusive growth, Retrieved 2016, April from http://ec.europa.eu/europe2020/index_en.htm European Commission-EC. (2014), A common European Digital Competence Framework for Citizens (DIGCOMP), Eramus+. Retrieved from http://openeducationeuropa.eu/sites/default/files/DIGCOMP%20brochure%202014%20.pdf European Foundation for Quality Management-EFQM. (2013). An overview of the EFQM Excellence Model. Retrieved from http://www.efqm.org European Parliament and the Council of the European Union. (2006). Recommendation 2006/962/EC of the European Parliament and of the Council of 18 December 2006 on key competences for lifelong learning. Retrieved fromhttp:// europa.eu/legislation_summaries/education_training_youth/lifelong_learning/c11090_en.htm Friedman, T. L. (2005). The World is Flat. London: Penguin. 491
Evans, K., Kersh, N., & Kontanien, S. (2004). Recognition of tacit skills: Sustaining learning outcomes in adult learning and work re-entry. International Journal of Training and Development, 8(1), 54–72. doi:10.1111/j.1360-3736.2004.00196.x Eyler, J., & Giles, D. (1999). Where’s the learning in service-learning? San Francisco, CA: Jossey-Bass. Falchikov, N. (1995). Peer feedback marking: Developing peer assessment. Innovations in Education and Training International, 32(2), 175–187. doi:10.1080/1355800950320212 Falchikov, N. (2001). Learning together: Peer tutoring in higher education. London: Routledge Falmer. doi:10.4324/9780203451496 Falchikov, N. (2005). Improving assessment through student involvement. New York: Routledge Falmer. Falchikov, N., & Goldfinch, J. (2000). Student peer assessment in higher education: A meta-analysis comparing peer and teacher marks. Review of Educational Research, 70(3), 287–322. doi:10.3102/00346543070003287 Farid, S., Ahmad, R., Niaz, I. A., Arif, M., Shamshirband, S., & Khattak, M. D. (2015). Identification and prioritization of critical issues for the promotion of e-learning in Pakistan. Computers in Human Behavior, 51, 161–171. doi:10.1016/j. chb.2015.04.037 Farinha, L., & Ferreira, J. J. (2013). Triangulation of the Triple Helix: A Conceptual Framework (White Paper). Triple Helix. Farrel, D., Kostkova, P., Weinberg, J., Lazareck, L., Weerasinghe, D., Lecky, D. M., & McNulty, C. A. M. (2011). Computer games to teach hygiene: An evaluation of the e‐Bug junior game. The Journal of Antimicrobial Chemotherapy, 66(Suppl. 5), 39–44. doi:10.1093/jac/dkr122 PMID:21680586 Fawley, N. (2014). Flipped classrooms: Turning the tables on traditional library instruction. American Libraries, 45(910), 19. Fear, W. J., & Erikson-Brown, A. (2014). Good quality discussion is necessary but not sufficient in asynchronous tuition: a brief narrative review of the literature. Journal of Asynchronous Learning Networks, 18(2), 21–28. Feng, J. Y., Chang, Y. T., Chang, H. Y., Erdley, W. S., Lin, C. H., & Chang, Y. J. (2013). Systematic review of effectiveness of situated e-learning on medical and nursing education. Worldviews on Evidence-Based Nursing, 10(3), 174–183. doi:10.1111/wvn.12005 PMID:23510119 Fernandez, A., Abrahão, S., & Insfran, E. (2013). Empirical validation of a usability inspection method for model-driven Web development. Journal of Systems and Software, 86(1), 161–186. doi:10.1016/j.jss.2012.07.043 Fernandez, A., Insfran, E., & Abrahão, S. (2011). Usability evaluation methods for the web: A systematic mapping study. Information and Software Technology, 53(8), 789–817. doi:10.1016/j.infsof.2011.02.007 Ferrari, A. (2013), DIGCOMP: A Framework for Developing and Understanding Digital Competence in Europe. Fer, S., & Cırık, İ. (2007). Yapılandırmacı öğrenme: Kuramdan uygulamaya. İstanbul: Morpa Kültür Yayınları. Fichten, C. S. (1986). Self, other, and situation-referent automatic thoughts: Interaction between people who have a physical disability and those who do not. Cognitive Therapy and Research, 10(5), 571–587. doi:10.1007/BF01177820 Finegold, D., & Notabartolo, A. S. (2010). 21st-Century Competencies and Their Impact: An Interdisciplinary Literature Review. Retrieved from http://www.hewlett.org/library/grantee-publication/21st-century-competencies-and-their-impactinterdisciplinary-literature-review
Fink, L. D. (2003). Creating significant learning experiences: An integrated approach to designing college courses. San Francisco, CA: Jossey-Bass. Fisher, C. (2010). Discussion, participation and feedback in online courses. Proceedings of Information Systems Educators Conference. Retrieved from http://www.westga.edu/~distance/ojdla/winter114/hixon114.html Flanigan, A. E., & Babchuk, W. A. (2015). Social media as academic quicksand: A phenomenological study of student experiences in and out of the classroom. Learning and Individual Differences, 44, 40–45. doi:10.1016/j.lindif.2015.11.003 Fletcher, G., Dowsett, G. W., & Austin, L. (n. d.). Actively promoting learner engagement: developing and implementing a signature subject on ‘Contemporary Issues in Sex and Sexuality’. Journal of University Teaching & Learning Practice, 9(3), 1-10. Flipped Learning Network. (2014). The four pillars of F-L-I-P. Retrieved from www.flippedlearning.org/definition Foldnes, N. (2016). The flipped classroom and cooperative learning: Evidence from a randomized experiment. Active Learning in Higher Education, 17(1), 39–49. doi:10.1177/1469787415616726 Fonosch, G., & Schwab, L. O. (1981). Attitudes of selected university faculty members toward disabled students. Journal of College Student Personnel, 22(3), 229–235. Fredericksen, E., Pickett, A., Shea, P., Pelz, W., & Swan, K. (2000). Student satisfaction and perceived learning with online courses: Principles and examples from the SUNY learning network. Journal of Asynchronous Learning Networks, 4(2), 7–41. Frijda, N. H. (1986). The Emotions. Studies in Emotion and Social Interaction series. Cambridge, UK: Cambridge University Press. Frydenberg, M. (2013). Flipping excel. Information Systems Education Journal, 11(1), 63–73. Fry, S. A. (1990). Implementation and evaluation of peer marking in higher education. Assessment & Evaluation in Higher Education, 15(3), 177–189. doi:10.1080/0260293900150301 Fung, K. (2015). Otolaryngology–head and neck surgery in undergraduate medical education: Advances and innovations. The Laryngoscope, 125(Suppl. 2), S1–S14. doi:10.1002/lary.24875 PMID:25124523 Furco, A., & Billig, S. H. (Eds.). (2002). Service-learning: The essence of the pedagogy. Greenwich, CT: Information Age Publishing. Furió, D., González‐Gancedo, S., Juan, M. C., Segui, I., & Rando, N. (2013). Evaluation of learning outcomes using an educational iPhone game vs. traditional game. Computers & Education, 64, 1–23. doi:10.1016/j.compedu.2012.12.001 Furstenberg, G., Levet, S., English, K., & Maillet, K. (2001). Giving a virtual voice to the silent language of culture: The CULTURA project. Language Learning & Technology, 5(1), 55-102. Retrieved fromllt.msu.edu/vol5num1/furstenberg/ default.html Giddens, A. (1990). The Consequences of Modernity. Stanford, CT: Stanford University Press. Gaba, D. M. (1999). The human work environment and anesthesia simulators. In R. Miller (Ed.), Anesthesia (pp. 2613–2668). New York, NY: Churchill Livingstone. Gaba, D. M. (2004). The future vision of simulation in health care. Quality & Safety in Health Care, 13(Suppl. 1), i2–i10. doi:10.1136/qshc.2004.009878 PMID:15465951 Gaba, D. M. (2006). The futures here. We are it. Simulation in Healthcare, 1, 1–2. doi:10.1097/01266021-20060001000001 PMID:19088564
Gagné, R. (1985). The Conditions of Learning (4th ed.). New York, NY: Holt, Rinehart & Winston. Gagné, R., & Driscoll, M. (1988). Essentials of learning for instruction (2nd ed.). Englewood Cliffs, NJ: Prentice-Hall. Gaikwad, N., & Tankhiwale, S. (2014). Interactive e-learning module in pharmacology: A pilot project at a rural medical college in India. Perspectives on Medical Education, 3(1), 15–30. doi:10.1007/s40037-013-0081-0 PMID:24072666 Gao, F. (2014). Exploring the Use of Discussion Strategies and Labels in Asynchronous Online Discussion. Online Learning, 18(3), 1–19. Gao, F., Zhang, T., & Franklin, T. (2013). Designing asynchronous online discussion environments: Recent progress and possible future directions. British Journal of Educational Technology, 44(3), 469–483. doi:10.1111/j.1467-8535.2012.01330.x Garavan, T. N., Carbery, R., O’Malley, G., & O’Donnell, D. (2010). Understanding participation in e-learning in organizations: A large-scale empirical study of employees. International Journal of Training and Development, 14(3), 155–168. doi:10.1111/j.1468-2419.2010.00349.x Garrett, T. (2008). Student-centered and teacher-centered classroom management: A case study of three elementary teachers. Journal of Classroom Interaction, 43(1), 34–47. Garrison, D. R. (2011). E-learning in the 21st century: A framework for research and practice. Marceline MO: Walsworth Publishing Company. Garrison, D. R. (2011). E-learning in the 21st century: A framework for research and practice. Taylor & Francis. Garrison, D. R., & Anderson, T. (2003). E-learning in the 21st Century: A framework for research and practice. NY: Routledge Falmer. Garrison, D. (2011). E- learning 21st century: A framework for research and practice (2nd ed.). NY: RoutledgeFalmer. Garrison, D. R. (2006). Online collaboration principles. Journal of Asynchronous Learning Networks, 10(1), 25–34. Garrison, D. R. (2009). Implications of online learning for the conceptual development and practice of distance education. Journal of Distance Education, 23(2), 93–104. Garrison, D. R., & Akyol, Z. (2013). The community of inquiry theoretical framework. In M. G. doi:10.4324/9780203803738. ch7 Garrison, D. R., Anderson, T., & Archer, W. (1999). Critical inquiry in a text-based environment: Computer conferencing in higher education. The Internet and Higher Education, 2(2), 87–105. doi:10.1016/S1096-7516(00)00016-6 Garrison, D. R., Anderson, T., & Archer, W. (2001). Critical thinking, cognitive presence, and computer conferencing in distance education. American Journal of Distance Education, 15(1), 7–23. doi:10.1080/08923640109527071 Garrison, D. R., Anderson, T., & Archer, W. (2010). The first decade of the community of inquiry framework: A retrospective. The Internet and Higher Education, 13(1-2), 5–9. doi:10.1016/j.iheduc.2009.10.003 Garrison, D. R., & Arbaugh, J. B. (2007). Researching the community of inquiry framework: Review, issues, and future directions. The Internet and Higher Education, 10(3), 157–172. doi:10.1016/j.iheduc.2007.04.001 Garrison, D. R., & Cleveland-Innes, M. (2005). Facilitating cognitive presence in online learning: Interaction is not enough. American Journal of Distance Education, 19(3), 133–148. doi:10.1207/s15389286ajde1903_2 Garrison, D. R., Cleveland-Innes, M., & Fung, T. S. (2010). Exploring causal relationships among teaching, cognitive and social presence: Student perceptions of the community of inquiry framework. The Internet and Higher Education, 13(1-2), 31–36. doi:10.1016/j.iheduc.2009.10.002 494
Garrison, D. R., & Kanuka, H. (2004). Blended learning: Uncovering its transformative potential in higher education. The Internet and Higher Education, 7(2), 95–105. doi:10.1016/j.iheduc.2004.02.001 Garrison, D. R., & Vaughan, N. D. (2008). Blended learning in higher education: Framework, principles, and guidelines. San Francisco: Jossey-Bass. Gašević, D., Adesope, O., Joksimović, S., & Kovanović, V. (2015). Externally-facilitated regulation scaffolding and role assignment to develop cognitive presence in asynchronous online discussions. The Internet and Higher Education, 24, 53–65. doi:10.1016/j.iheduc.2014.09.006 Gaytan, J., & McEwen, B. C. (2007). Effective Online Instructional and Assessment Strategies. American Journal of Distance Education, 21(3), 117–132. doi:10.1080/08923640701341653 Geng, R., & Tian, J. (2015). Improving web navigation usability by comparing actual and anticipated usage. IEEE Transactions on Human-Machine Systems, 45(1), 84–94. George Washington’s Clinical Learning & Simulation Skills (CLASS) Center. (n. d.). Non-human simulation rooms. Retrieved from https://smhs.gwu.edu/class/about/simlab George, D., & Mallery, P. (2009). SPSS for Windows step by step: a simple guide and reference 16.0 update. Boston, MA: Allyn and Bacon. Gielen, S., Peeters, E., Dochy, F., Onghena, P., & Struyven, K. (2010). Improving the effectiveness of peer feedback for learning. Learning and Instruction, 20(4), 304–315. doi:10.1016/j.learninstruc.2009.08.007 Gikandi, J. W., Morrow, D., & Davis, N. E. (2011). Online formative assessment in higher education: A review of the literature. Computers & Education, 57(4), 2333–2351. doi:10.1016/j.compedu.2011.06.004 Gilbert, N., & Troitzsch, K. G. (2005). Simulation for the social scientist. Berkshire, United Kingdom: Open University Press. Gillespie, H. (2006). Unlocking teaching and learning with ICT: Identifying and overcoming barriers. London: David Fulton. Ginns, P., & Ellis, R. A. (2009). Evaluating the quality of e-learning at the degree level in the student experience of blended learning. British Journal of Educational Technology, 40(4), 652–663. doi:10.1111/j.1467-8535.2008.00861.x Girasoli, A. J., & Hannafin, R. D. (2008). Using asynchronous AV communication tools to increase academic selfefficacy. Computers & Education, 51(4), 1676–1682. doi:10.1016/j.compedu.2008.04.005 Goh, J., & Clapham, M. (2014). Attitude to e–learning among newly qualified doctors. The Clinical Teacher, 11(1), 20–23. doi:10.1111/tct.12117 PMID:24405914 GoldSim. (n. d.). Instruction: Types of simulation tools. Retrieved from http://www.goldsim.com/Web/Introduction/ SimulationTypes/ Goleman, D. (2007). Social intelligence: the new science of human relationships, Reprint Editdion. Bantam. Gonzalez, J. (2014). Modifying the flipped classroom: The “in-class” version. Flipped Classroom. Retrieved from http:// www.edutopia.org/blog/flipped-classroom-in-class-version-jennifer-gonzalez Goodyear, P., Salmon, G., Spector, J. M., Steeples, C., & Tickner, S. (2001). Competences for online teaching: A special report. Educational Technology Research and Development, 49(1), 65–72. doi:10.1007/BF02504508
Gordon, M., Chandratilake, M., & Baker, P. (2013). Low fidelity, high quality: A model for e-learning. The Clinical Teacher, 10(4), 258–263. doi:10.1111/tct.12008 PMID:23834573 Gorham, J. (1988). The relationship between verbal teacher immediacy behaviors and student learning. Communication Education, 37(1), 40–53. Graff, M. (2003). Cognitive style and attitudes towards using online learning and assessment methods. Electronic Journal of E-Learning, 1(1), 21–28. Granić, A. (2008). Experience with usability evaluation of e-learning systems. Universal Access in the Information Society, 7(4), 209–221. doi:10.1007/s10209-008-0118-z Griffin, P., & Care, E. (2015). Assessment and teaching of 21st century skills. In P. Griffin & E. Care (Eds.), THE ATC21S Method (pp. 3–33). Dordrecht: Springer. doi:10.1007/978-94-017-9395-7 Griffiths, M. E., & Graham, C. R. (2009) Using asynchronous video in online classes: Results from a pilot study. Instructional technology & distance Learning, 6(3), 65-76. Griffiths, M. E., & Graham, C. R. (2009). The Potential of Asynchronous Video in Online Education. Distance Learning, 6(2), 13–22. Griffiths, M., & Graham, C. R. (2010). Using Asynchronous Video to Achieve Instructor Immediacy and Closeness in Online Classes: Experiences from Three Cases. International Journal on E-Learning, 9(3), 325–340. Gronlund, A., & Islam, Y. M. (2010). A mobile e-learning environment for developing countries: The Bangladesh virtual interactive classroom. Information Technology for Development, 16(4), 244–259. doi:10.1080/02681101003746490 Grosseck, G. (2009). To Use or Not to Use Web 2.0 in Higher Education? Paper presented at the Procedia Social and Behavioral Sciences, World Conference on Educational Science. Guiller, J., Durndell, A., & Ross, A. (2008). Peer interaction and critical thinking: Face-to-face or online discussion? Learning and Instruction, 18(2), 187–200. doi:10.1016/j.learninstruc.2007.03.001 Gunawardena, C. N., & McIsaac, M. S. (2004). Distance education. In D. H. Jonassen (Ed.), Handbook of research on educational communications and technology (pp. 355–395). Mahwah, NJ: Lawrence Erlbaum. Gunawardena, C. N., & Zittle, F. J. (1997). Social presence as a predictor of satisfaction within a computer‐mediated conferencing environment. American Journal of Distance Education, 11(3), 8–26. doi:10.1080/08923649709526970 Gunga, S. O., & Ricketts, I. W. (2007). Facing the challenges of e-learning initiatives in African universities. British Journal of Educational Technology, 38(5), 896–906. doi:10.1111/j.1467-8535.2006.00677.x Guo, P. J., Kim, J., & Rubin, R. (2014). How video production affects student engagement: An empirical study of mooc videos. Proceedings of the first ACM Learning@scale conference (pp. 41–50). ACM. Gupta, P., Thangaratinam, S., Shehmar, M., Gee, H., Karri, K., Bondili, A., & Khan, K. S. (2012). An electronic trainingthe-trainers programme: Developing resources for training in educational supervision in obstetrics and gynaecology. The Obstetrician & Gynaecologist, 14(1), 39–44. doi:10.1111/j.1744-4667.2011.00087.x Haaga, D. A. F. (1993). Peer review of term papers in graduate psychology courses. Teaching of Psychology, 20(1), 28–32. doi:10.1207/s15328023top2001_5 Halawi, L. A., McCarthy, R., & Pires, S. (2009). An evaluation of E-learning on the basis of Blooms Taxonomy: An exploratory study. Journal of Education for Business, 84(6), 374–380. doi:10.3200/JOEB.84.6.374-380
Hall, G. E. (1997). Stages of concern. Paper presented at the Annual Conference of the Association for Educational Communications and Technology (AECT ‘97), Albuquerque, NM, USA. Hamdan, N., McKnight, P., McKnight, K., & Arfstrom, K. M. (2013). A review of flipped learning. South Bend, IN: Flipped Learning Network. Retrieved from http://flippedlearning.org/domain/41 Hammami, S., & Mathkour, H. (2015). Adaptive e-learning system based on agents and object petri nets (AELS-A/OPN). Computer Applications in Engineering Education, 23(2), 170–190. doi:10.1002/cae.21587 Hammond, L., Austin, K., Orcutt, S., & Rosso, J. (2001). How people learn. Introduction to learning theories. Stanford University School of Education, Stanford University. Hansen, D. E. (2008). Knowledge transfer in online learning environments. Journal of Marketing Education, 30(2), 93–105. doi:10.1177/0273475308317702 Hao, Y. (2016). Exploring undergraduates perspectives and flipped learning readiness in their flipped classrooms. Computers in Human Behavior, 59, 82–92. doi:10.1016/j.chb.2016.01.032 Harasim, L. (1989). On-line education: A new domain. In R. Mason & A. Kaye (Eds.), Mindweave: Communication, computers, and distance education (pp. 50–62). Oxford: Pergamon Press. Harasim, L. (2012). Learning theory and online technologies. New York, NY: Routledge. Harden, R. M. (2005). A new vision for distance learning and continuing medical education. The Journal of Continuing Education in the Health Professions, 25(1), 43–51. doi:10.1002/chp.8 PMID:16078802 Hardman, S. (2007). Simulation: Transforming technology into teaching. In J. Woodhouse (Ed.), Strategies for healthcare education: How to teach in the 21st century (pp. 91–102). Oxford, United Kingdom: Radcliffe. Harlen, W., & Jelly, S. (1997). Developing science in the primary classroom. Essex, United Kingdom: Addison Wesley Longman. Harrati, N., Bouchrika, I., Tari, A., & Ladjailia, A. (2016). Exploring user satisfaction for e-learning systems via usage-based metrics and system usability scale analysis. Computers in Human Behavior, 61, 463–471. doi:10.1016/j.chb.2016.03.051 Hartmann, S. (1996). The world as a process: Simulations in the natural and social sciences. In R. Hegselmann, U. Mueller, & K. Troitzsch (Eds.), Modeling and simulation in the social sciences from the philosophy of science point of view (pp. 77–100). Dordrecht, Netherlands: Kluwer. doi:10.1007/978-94-015-8686-3_5 Hay, A., Peltier, J., & Drago, W. (2004). Reflective learning and on-line education: A comparison of traditional and on-line MBA students. Strategic Change, 13(4), 169–182. doi:10.1002/jsc.680 Hay, D. B., Kehoe, C., Miquel, M. E., Hatzipanagos, S., Kinchin, I. M., Keevil, S. F., & Lygo-Baker, S. (2008). Measuring the quality of e-learning. British Journal of Educational Technology, 39(6), 1037–1056. doi:10.1111/j.14678535.2007.00777.x Head, K. (2013). Lessons Learned from a Freshman-Composition MOOC. The Chronicle of Higher Education. Retrieved from http://chronicle.com/blogs/wiredcampus/lessons-learned-from-a-freshman-composition-mooc/46337 Heath, C., & Heath, D. (2007). Made to stick: Why some ideas thrive and others die. New York: Random House. Heer, R. (2015). Revised Bloom’s taxonomy. Effective teaching practices. Center for Excellence in Learning and Teaching (CELT). Ames, IA: Iowa State University. Retrieved from http://www.celt.iastate.edu/wp-content/uploads/2015/09/ RevisedBloomsHandout-1.pdf
497
Compilation of References
Held, D., McGrew, A. G., Goldblatt, D., & Perraton, J. (1999). Global Transformations. Politics, Economics and Culture. Cambridge: Polity Press. Henriksson, A., Yi, Y., Frost, B., & Middleton, M. (2007). Evaluation instrument for e-government websites. Electronic Government, an International Journal, 4(2), 204-226. Hensberry, K., Moore, E., & Perkins, K. (2015). Effective student learning of fractions with an interactive simulation. Journal of Computers in Mathematics and Science Teaching, 34(3), 273–298. Henson, K. T. (2003). Foundations for learner-centered educational: A knowledge base. Computers & Education, 124(1), 5–16. Hernández, A. B., Gorjup, M. T., & Cascón, R. (2010). The role of the instructor in business games: A comparison of face-to-face and online instruction. International Journal of Training and Development, 14(3), 169–179. doi:10.1111/ j.1468-2419.2010.00350.x Hernandez, M. D. (2015). Academic Dishonesty Using Social Media: A Comparative Study of College Students from Canada and China. SAM Advanced Management Journal, 80(4), 45. Herrington, J., Reeves, T. C., & Oliver, R. (2010). A guide to authentic e-learning. New York, NY: Routledge. Herrington, J., Reeves, T. C., & Oliver, R. (2014). Authentic learning environments. In J. M. Spector, M. D. Merrill, J. Elen, & M. J. Bishop (Eds.), Handbook of Research on Educational Communications and Technology (4th ed., pp. 401–412). New York, NY: Springer. doi:10.1007/978-1-4614-3185-5_32 Herrmann, N., & Popyack, J. L. (2003). Electronic grading: When the tablet is mightier than the pen. Syllabus: Technology for Higher Education. Heuer, B. P., & King, K. P. (2004). Leading the band: The role of the instructor in online learning for educators. Journal of Interactive Online Learning, 3(1), 1–11. He, W. (2013). Examining students online interaction in a live video streaming environment using data mining and text mining. Computers in Human Behavior, 29(1), 90–102. doi:10.1016/j.chb.2012.07.020 Hew, K. F., & Cheung, W. S. (2012a). Audio-based versus text-based asynchronous online discussion: Two case studies. Instructional Science, 41(2), 365–380. doi:10.1007/s11251-012-9232-7 Hew, K. F., & Cheung, W. S. (2012b). Students’ use of Asynchronous Voice Discussion in a Blended-Learning Environment: A study of two undergraduate classes. Electronic Journal of E-Learning, 10(4), 360–367. He, Y. (2014). Universal design for learning in an online teacher education course: Enhancing learners’ confidence to teach online. MERLOT Journal of Online Learning and Teaching, 10(2), 283–298. Hickey, D., & McCaslin, M. (2001). Educational psychology, social constructivism, and educational practice: A case of emergent identity. Educational Psychologist, 36(2), 133–140. doi:10.1207/S15326985EP3602_8 Hirsch, J. (2015). 100 Videos and counting: Lessons from a flipped classroom. Retrieved on April 6, 2016 from http:// www.edutopia.org/blog/100-videos-lessons-flipped-classroom-joe-hirsch Holsapple, C. W., & Lee-Post, A. (2006). Defining, assessing, and promoting e-learning success: An information systems perspective. Decision Sciences Journal of Innovative Education, 4(1), 67–85. doi:10.1111/j.1540-4609.2006.00102.x Holt, M. (1992). The value of written peer criticism. College Composition and Communication, 43(2), 384–392. doi:10.2307/358229
498
Compilation of References
Homan, G., & Macpherson, A. (2005). E-learning in the corporate university. Journal of European Industrial Training, 29(1), 75–90. doi:10.1108/03090590510576226 Honeycutt, B. (2013, March 25). Looking for “flippable moments” in your class. Faculty Focus blog. Retrieved from http://www.facultyfocus.com/articles/instructional-design/looking-for-flippable-moments-in-your-class/ Honeycutt, B., & Garrett, J. (2013). The flipped approach to a learner-centered class (White paper). Madison, WI: Magna Publications. Horan, D. A., Hersi, A. A., & Kelsall, P. (2016). The dialogic nature of meaning making within a hybrid learning space: Individual, community, and knowledge-building pedagogical tools. In J. Keengwe (Ed.), Handbook of Research on Active Learning and the Flipped Classroom Model in the Digital Age (pp. 19–40). Hershey, PA: IGI Global. doi:10.4018/9781-4666-9680-8.ch002 Hornb\aek, K. (2006). Current practice in measuring usability: Challenges to usability studies and research. International journal of human-computer studies, 64(2), 79-102. Horton, W. (2011). E-learning by design. John Wiley & Sons. doi:10.1002/9781118256039 Houck, C. K., Asselin, S. B., Troutman, G. C., & Arrington, J. M. (1992). Students with learning disabilities in the university environment: A study of faculty and student perceptions. Journal of Learning Disabilities, 25(10), 678–684. doi:10.1177/002221949202501008 PMID:1460390 Houlding, S. (1994). 3D geoscience modeling: Computer techniques for geological characterization. Berlin, Germany: Springer-Verlag. How Does IMS Enable Better Learning Experiences? (2016). Retrieved from https://www.imsglobal.org/ How Millennials use and control social media. (2015). American Press Institute. Retrieved from http://www.americanpressinstitute.org/publications/reports/survey-research/millennials-social-media/ Hoyt, J., & Oviatt, D. (2013). Governance, faculty incentives, and course ownership in online education at doctorategranting universities. American Journal of Distance Education, 27(3), 165–178. doi:10.1080/08923647.2013.805554 Hrtonova, N., Kohout, J., Rohlikova, L., & Zounek, J. (2015). Factors influencing acceptance of e-learning by teachers in the Czech Republic. Computers in Human Behavior, 51, 873–879. doi:10.1016/j.chb.2014.11.018 Huang, H. M. (2002). Toward constructivism for adult learners in online learning environments. British Journal of Educational Technology, 33(1), 27–37. doi:10.1111/1467-8535.00236 Huang, W., & Mille, A. (2006). ConKMeL: A contextual knowledge management framework to support multimedia e-learning. Multimedia Tools and Applications, 30(2), 205–219. doi:10.1007/s11042-006-0024-4 Huang, W., Webster, D., Wood, D., & Ishaya, T. (2006). An intelligent semantic e-learning framework using contextaware Semantic Web technologies. British Journal of Educational Technology, 37(3), 351–373. doi:10.1111/j.14678535.2006.00610.x Huber, M. T., & Morreale, S. P. (Eds.). (2002). Disciplinary styles in the scholarship of teaching and learning: Exploring common ground. Washington, DC: American Association for Higher Education and The Carnegie Foundation of Teaching. Hug, T., Lindner, M., & Bruck, P. A. (2005). Microlearning: Emerging concepts, practices and technologies after elearning. Proceedings of Microlearning ‘05. Hung, M.-L., & Chou, C. (2014). The Development, Validity, and Reliability of Communication Satisfaction in an Online Asynchronous Discussion Scale. Asia-Pacific Education Researcher, 23(2), 165-177. doi:10.1007/s40299-013-0094-9 499
Compilation of References
Hung, H., & Cho, V. (2008). Continued usage of e-learning communication tools: A study from the learners’ perspective in Hong Kong. International Journal of Training and Development, 12(3), 171–187. doi:10.1111/j.1468-2419.2008.00302.x Hung, J. I. (2012). Trends of e-learning research from 2000 to 2008: Use of text mining and bibliometrics. British Journal of Educational Technology, 43(1), 5–16. doi:10.1111/j.1467-8535.2010.01144.x Hurtado, S., & Carter, D. F. (1997). Effects of college transition and perceptions of the campus racial climate on Latino learners sense of belonging. Sociology of Education, 70(4), 324–345. doi:10.2307/2673270 Ice, P., Curtis, R., Phillips, P., & Wells, J. (2007). Using asynchronous audio feedback to enhance teaching presence and students’ sense of community. Journal of Asynchronous Learning Networks, 11(2), 3–25. Illeris, K. (2003). Towards a contemporary and comprehensive theory of learning. International Journal of Lifelong Education, 22(4), 396–406. doi:10.1080/02601370304837 Ingerham, L. (2012). Interactivity in the online learning environment: A study of users of the North Carolina Virtual Public School. The Quarterly Review of Distance Education, 13(2), 65–75. Ingvarson, D., & Gaffney, M. (2008). Developing and Sustaining the Digital Education Ecosystem: The Value and Possibilities of Online Environments for Learner Learning. In M. Lee, M. Gaffney (Eds.), Leading a digital school: Principles and practice (pp. 146-167). Camberwell, Australia: ACER Press. International Association of Schools of Social Work, & International Federation of Social Workers (2004). Global Standards for Social Work Education and training of the Social Work Profession. Adelaide, Australia. Islam, A. K. M. N. (2014). Sources of satisfaction and dissatisfaction with a learning management system: A critical incident approach. Computers in Human Behavior, 30(1), 249–261. doi:10.1016/j.chb.2013.09.010 Islam, A. K. M. N. (2016). E-learning system use and its outcomes: Moderating role of perceived compatibility. Telematics and Informatics, 33(1), 48–55. doi:10.1016/j.tele.2015.06.010 Ismailova, R. (2015). Web site accessibility, usability and security: A survey of government web sites in Kyrgyz Republic. Universal Access in the Information Society. ISTE. (2002). National Educational Technology Standards and Performance Indicators for All Teachers. Retrieved from http://cnets.iste.org/teachers/t_stands.html Ivory, M. Y., & Hearst, M. A. (2001). The state of the art in automating usability evaluation of user interfaces. ACM Computing Surveys, 33(4), 470–516. doi:10.1145/503112.503114 Jacobsen, A. D., Egen, P., & Donald, K. (2002). Methods for teaching promoting student learning (6th ed.). Ohio: Merrill Prentice Hall. Jacobs, J. E., & Paris, S. G. (1987). Children’s metacognition about reading: Issues in definition, measurement, and instruction. Educational Psychologist, 22(3 & 4), 235–278. Jacobs, J. W., & Dempsey, J. V. (1993). Simulation and gaming: Fidelity, feedback, and motivation. In J. V. Dempsey & G. C. Sales (Eds.), Interactive instruction and feedback. Englewood Cliffs, NJ: Educational Technology Publications. Jacoby, B., & Brown, N. C. (2009). Preparing students for global civic engagement. In B. Jacoby (Ed.). Civic engagement in higher education: Concepts and practices. San Francisco, CA, USA: Jossey-Bass. Jacoby, B. (2015). Service-Learning essentials: Questions, answers, and lessons learned. San Francisco, CA: Jossey-Bass.
500
Compilation of References
Jameson, J., Ferrell, G., Kelly, J., Walker, S., & Ryan, M. (2006). Building trust and shared knowledge in communities of e-learning practice: Collaborative leadership in the JISC eLISA and CAMEL lifelong learning projects. British Journal of Educational Technology, 37(6), 949–967. doi:10.1111/j.1467-8535.2006.00669.x Januszewski, A., & Molenda, M. (2008). Educational technology: a definition with commentary. New York: Lawrence Erlbaum Associates. Jara, C. A., Candelas, F. A., Torres, F., Dormido, S., & Esquembre, F. (2012). Synchronous collaboration of virtual and remote laboratories. Computer Applications in Engineering Education, 20(1), 124–136. doi:10.1002/cae.20380 Jenkins, J., Cogo, A., & Dewey, M. (2011). Review of developments in research into English as a lingua franca. Language Teaching, 44(3), 281–315. doi:10.1017/S0261444811000115 Jennings, S. E., & McCuller, M. Z. (2004). Meeting the challenges of grading online business communication assignments. Paper presented at the69th Annual Convention, Association for Business Communication, Cambridge, Massachusetts. Johnson, C. (2013). Exploring effective online course design components. In T. Bastiaens & G. Marks (Eds.), Proceedings of World Conference on E-Learning in Corporate, Government, Healthcare, and Higher Education 2013 (pp. 1183-1188). Chesapeake, VA: AACE. Johnson, C. (2016). Developing a teaching framework for online music course [Unpublished Ph.D. dissertation]. University of Calgary, Calgary, Alberta, Canada. Johnson, C., & Altowairiki, N. (2015, May 12-13). Building and maintaining online teaching presence: A practical starting point. Presentation at2015 University of Calgary Conference on Postsecondary Learning and Teaching, Calgary, Alberta, Canada. Johnson, D. W., & Johnson, R. T. (1990). Cooperative learning. Blackwell Publishing Ltd. Retrieved from http://onlinelibrary.wiley.com/doi/10.1002/9780470672532.wbepp066/abstract;jsessionid=08D8977240813DB2D09E28FF56 D79C96.f02t02?deniedAccessCustomisedMessage=&userIsAuthenticated=false Johnson, L., Adams Becker, S., Estrada, V., & Freeman, A. (2014). NMC Horizon Report: 2014 Higher Education Edition. Austin, TX: The New Media Consortium. Retrieved from http://cdn.nmc.org/media/2014-nmc-horizon- reporthe-EN-SC.pdf Johnson, D. W. (1981). Student-student interaction: The neglected variable in education. Educational Researcher, 10(1), 5–10. doi:10.3102/0013189X010001005 Johnson, D. W., & Johnson, F. P. (1991). Joining together: Group theory and group skills. Prentice-Hall, Inc. Johnson, D. W., Johnson, R. T., & Holubec, E. J. (1991). Cooperation in The Classroom, Interaction Book. MN: Edina Publications. Johnson, D. W., Johnson, R. T., & Smith, K. A. (1991). ASHE-ERIC Higher Education Report: Vol. 4. Cooperative learning: Increasing college faculty instructional productivity. Washington, DC: The George Washington University, School of Education and Human Development. Johnson, P. A. (1999). Problem-based, cooperative learning in the engineering classroom. Journal of Professional Issues in Engineering Education and Practice, 125(1), 8–11. doi:10.1061/(ASCE)1052-3928(1999)125:1(8) Johnson, R. D., Hornik, S., & Salas, E. (2008). An empirical examination of factors contributing to the creation of successful e-learning environments. International Journal of Human-Computer Studies, 66(5), 356–369. doi:10.1016/j. ijhcs.2007.11.003
501
Compilation of References
Jonassen, D. (1999). Designing constructivist learning environments. In C. M. Reigeluth (Ed.), Instructional design theories and model: A new paradigm of instructional theory (Vol. 3, pp. 215–241). Hillsdale, NJ: Lawrence Erlbaum Associates. Jonassen, D. (1999). Designing constructivist learning environments. In C. M. Reigeluth (Ed.), Instructional design theories and models: A new paradigm of instructional theory (Vol. 2, pp. 215–239). Mahwah, NJ: Lawrence Erlbaum Associates. Jonassen, D. H. (1991). Objectivism versus constructivism; Do we need a new philosophical paradigm? Educational Technology Research and Development, 39(3), 5–14. doi:10.1007/BF02296434 Jonassen, D. H. (1992). Evaluating constructivistic learning. In T. M. Duffy & D. H. Jonassen (Eds.), Constructivism and the Technology of Instruction: A Conversation. Florence, KY: Routledge. Jonassen, D. H. (1994). Computers in schools: Mindtools for critical thinking. University Park, PA: Pennsylvania State University Press. Jonassen, D., Davidson, M., Collins, M., Campbell, J., & Haag, B. B. (1995). Constructivism and computer‐mediated communication in distance education. American Journal of Distance Education, 9(2), 7–26. doi:10.1080/08923649509526885 Jones, B. D. (2010). An examination of motivation model components in face-to-face and online instruction. Electronic Journal of Research in Educational Psychology, 8(3), 915–944. Jones, N., Georghiades, P., & Gunson, J. (2012). Student feedback via screen capture digital video: Stimulating students modified action. Higher Education, 64(5), 593–607. doi:10.1007/s10734-012-9514-7 Joo, Y. J., Kim, E. K., & Park, S. Y. (2009). The structural relationship among cognitive presence, flow and learning outcome in corporate cyber education. The Journal of Educational Information and Media, 15(3), 21–38. Joo, Y. J., Lim, K. Y., & Kim, E. K. (2011). Online university students satisfaction and persistence: Examining perceived level of presence, usefulness and ease of use as predictors in a structural model. Computers & Education, 57(2), 1654–1664. doi:10.1016/j.compedu.2011.02.008 Jovanovic, D., & Jovanovic, S. (2015). An adaptive e-learning system for Java programming course, based on Dokeos LE. Computer Applications in Engineering Education, 23(3), 337–343. doi:10.1002/cae.21603 Joyce, B., Weil, M., & Calhoun, E. (2014). Models of teaching (9th ed.). Boston, MA: Pearson. Junco, R. (2014). Engaging learners through social media: evidence-based practices for use in learner affairs. Hoboken, NJ: John Wiley & Sons. Jung, I. (2005). ICT-Pedagogy Integration in Teacher Training: Application Cases Worldwide. Journal of Educational Technology & Society, 8(2), 94–101. Kachru, B., & Nelson, C. (2001). World Englishes. In A. Burns & C. Coffin (Eds.), Analysing English in a Global Context (pp. 9–25). London: Routledge. Kahne, J., & Westheimer, J. (1996). In the service of what? The politics of service learning. Phi Delta Kappan, 77(9), 592–599. Kalelioglu, F., & Gülbahar, Y. (2014). The Effect of Instructional Techniques on Critical Thinking and Critical Thinking Dispositions in Online Discussion. Journal of Educational Technology & Society, 17(1), 248–258. Kankaanranta, A., & Louhiala-Salminen, L. (2013). “What language does global business speak?” – The concept and development of BELF. Ibérica, 26, 17–33.
502
Compilation of References
Kankaanranta, A., & Planken, B. (2010). BELF Competence as Business Knowledge of Internationally Operating Business Professional. Journal of Business Communication, 47(4), 380–407. doi:10.1177/0021943610377301 Kanuka, H., Rourke, L., & Laflamme, E. (2007). The influence of instructional methods on the quality of online discussion. British Journal of Educational Technology, 38(2), 260–271. doi:10.1111/j.1467-8535.2006.00620.x Kaplan, A. M., & Haenlein, M. (2010). Users of the world, unite! The challenges and opportunities of Social Media. Business Horizons, 53(1), 59–68. doi:10.1016/j.bushor.2009.09.003 Kasemsap, K. (2013). Practical framework: Creation of causal model of job involvement, career commitment, learning motivation, and learning transfer. International Journal of the Computer, the Internet and Management, 21(1), 29–35. Kasemsap, K. (2016f). Mastering big data in the digital age. In M. Singh & D. G. (Eds.), Effective big data management and opportunities for implementation (pp. 104–129). Hershey, PA, USA: IGI Global. doi:10.4018/978-1-5225-0182-4.ch008 Kasemsap, K. (2014). The role of social media in the knowledge-based organizations. In I. Lee (Ed.), Integrating social media into business practice, applications, management, and models (pp. 254–275). Hershey, PA, USA: IGI Global. doi:10.4018/978-1-4666-6182-0.ch013 Kasemsap, K. (2016a). The roles of e-learning, organizational learning, and knowledge management in the learning organizations. In E. Railean, G. Walker, A. Elçi, & L. Jackson (Eds.), Handbook of research on applied learning theory and design in modern education (pp. 786–816). Hershey, PA, USA: IGI Global. doi:10.4018/978-1-4666-9634-1.ch039 Kasemsap, K. (2016b). Exploring the role of web-based learning in global education. In M. Raisinghani (Ed.), Revolutionizing education through web-based instruction (pp. 202–224). Hershey, PA, USA: IGI Global. doi:10.4018/978-14666-9932-8.ch012 Kasemsap, K. (2016c). The roles of lifelong learning and knowledge management in global higher education. In P. Ordóñez de Pablos & R. Tennyson (Eds.), Impact of economic crisis on education and the next-generation workforce (pp. 71–100). Hershey, PA, USA: IGI Global. doi:10.4018/978-1-4666-9455-2.ch004 Kasemsap, K. (2016d). The role of learning analytics in global higher education. In M. Anderson & C. Gavan (Eds.), Developing effective educational experiences through learning analytics (pp. 282–307). Hershey, PA, USA: IGI Global. doi:10.4018/978-1-4666-9983-0.ch012 Kasemsap, K. (2016e). Utilizing communities of practice to facilitate knowledge sharing in the digital age. In S. Buckley, G. Majewski, & A. Giannakopoulos (Eds.), Organizational knowledge facilitation through communities of practice in emerging markets (pp. 198–224). Hershey, PA, USA: IGI Global. doi:10.4018/978-1-5225-0013-1.ch011 Kasemsap, K. (2016g). Mastering digital libraries in the digital age. In E. de Smet & S. Dhamdhere (Eds.), E-discovery tools and applications in modern libraries (pp. 275–305). Hershey, PA, USA: IGI Global. doi:10.4018/978-1-52250474-0.ch015 Kasemsap, K. (2017a). Encouraging continuing professional development and teacher professional development in global education. In R. Cintron, J. Samuel, & J. Hinson (Eds.), Accelerated opportunity education models and practices (pp. 168–202). Hershey, PA, USA: IGI Global. doi:10.4018/978-1-5225-0528-0.ch008 Kasemsap, K. (2017b). Mastering educational computer games, educational video games, and serious games in the digital age. In R. 
Alexandre Peixoto de Queirós & M. Pinto (Eds.), Gamification-based e-learning strategies for computer programming education (pp. 30–52). Hershey, PA, USA: IGI Global. doi:10.4018/978-1-5225-1034-5.ch003
503
Compilation of References
Kasemsap, K. (2017c). Mastering web mining and information retrieval in the digital age. In A. Kumar (Ed.), Web usage mining techniques and applications across industries (pp. 1–28). Hershey, PA, USA: IGI Global. doi:10.4018/978-15225-0613-3.ch001 Kaufman, J. H., & Schunn, C. D. (2011). Students perceptions about peer assessment for writing: Their origin and impact on revision work. Instructional Science, 39(3), 387–406. doi:10.1007/s11251-010-9133-6 Kaye, C. (2004). The complete guide to service-learning; Proven, practical ways to engage students in civic responsibility, academic curriculum, and social action. Minneapolis, MN: Free Spirit. Kay, R. H. (2006). Developing a comprehensive metric for assessing discussion board effectiveness. British Journal of Educational Technology, 37(5), 761–783. doi:10.1111/j.1467-8535.2006.00560.x Kear, K. (2004). Peer learning using asynchronous discussion systems in distance education. Open Learning: The Journal of Open, Distance and e-Learning, 19(2), 151-164. doi:10.1080/0268051042000224752 Keast, D. A. (2009). A constructivist application for online learning in music. Research Issues in Music Education, 7(1), 1–8. Keller, C. J., Finkelstein, N. D., Perkins, K. K., & Pollock, S. J. (2006). Assessing the effectiveness of a computer simulation in conjunction with tutorials in introductory physics in undergraduate physics recitations. Retrieved from http:// www.colorado.edu/physics/EducationIssues/papers/perc2005_keller.pdf Kelly, R. (2009). Jump start program prepares faculty to teach online. Faculty Focus Special Report: 12 Tips for Improving Your Faculty Development Plan. Retrieved from http://www.facultyfocus.com/free-reports/12-tips-for-improvingyour-faculty-development-plan/ Kelly, R. (2012). Five factors that affect online learner motivation faculty focus. Madison, WI: Magna Publications. Kelsey, R. (2010). Building to learn: A decade of innovation at the Columbia University Center for New Media Teaching and Learning. Retrieved from http://ccnmtl.columbia.edu/dr/papers/kelsey_jordan2010.pdf Kernan, M. C., & Lord, R. G. (1990). Effects of valence, expectancies and goal-performance discrepancies in single and multiple goal environments. The Journal of Applied Psychology, 75(2), 194–203. doi:10.1037/0021-9010.75.2.194 Kerr, S. (2011). High school online. Pedagogy, preferences, and practices of three online teachers. Journal of Educational Technology Systems, 39(3), 221–244. doi:10.2190/ET.39.3.b Khan, B. H. (Ed.). (1997). Web-based instruction. Educational Technology. Kiesler, S., Siegel, J., & McGuire, T. W. (1984). Social psychological aspects of computer-mediated communication. The American Psychologist, 39(10), 1123–1134. doi:10.1037/0003-066X.39.10.1123 Kim, B., & Reeves, T. (2007). Reframing research on learning with technology: In search of the meaning of cognitive tools. Instructional Science, 35(1), 207–256. doi:10.1007/s11251-006-9005-2 Kim, K. S., & Sin, S. C. J. (2016). Use and Evaluation of Information from Social Media in the Academic Context: Analysis of Gap Between Students and Librarians. Journal of Academic Librarianship, 42(1), 74–82. doi:10.1016/j. acalib.2015.11.001 Kim, K., Trimi, S., Park, H., & Rhee, S. (2012). The impact of CMS quality on the outcomes of e-learning systems in higher education: An empirical study. Decision Sciences Journal of Innovative Education, 10(4), 575–587. doi:10.1111/ j.1540-4609.2012.00360.x King, F. J., Goodson, L., & Rohani, F. (n. d.). 
High order thinking skills: Definition, teaching strategies, assessment. Retrieved from http://www.cala.fsu.edu/files/higher_order_thinking_skills.pdf 504
Compilation of References
King, A. (1993). From sage on the stage to guide on the side. College Teaching, 41(1), 30–35. doi:10.1080/87567555. 1993.9926781 King, E., & Boyatt, R. (2015). Exploring factors that influence adoption of e-learning within higher education. British Journal of Educational Technology, 46(6), 1272–1280. doi:10.1111/bjet.12195 Kirriemuir, J. (2002). The relevance of gaming and gaming consoles to the higher education/further education learning experience. London, United Kingdom: JISC. Kirriemuir, J., & McFarlane, A. (2004). Literature Review in Games and Learning. Bristol: NESTA Futurelab. Kirschner, P., Sweller, J., & Richard, E. (2006). Why minimal guidance during instruction does not work: An analysis of the failure of constructivist, discovery, problem-based, experiential, and inquiry-based teaching. Educational Psychologist, 41(2), 75–86. doi:10.1207/s15326985ep4102_1 Kluijfhout, E. (2005). E-learning business models: Pedagogical approaches, design implications, and prerequisites for e-learning. Retrieved from http://fr.slideshare.net/eric.kluijfhout/pedagogical-approaches-design-implications-andprerequisites-for-e-learning Knight-McCord, J., Cleary, D., Grant, N., Herron, A., Lacey, T., Livingston, T., & Emanuel, R. et al. (2016). What social media sites do college students use most? Journal of Undergraduate Ethnic Minority Psychology, 2, 21. Knight, P. (2002). Summative assessment in Higher Education: Practices in disarray. Studies in Higher Education, 27(3), 276–286. doi:10.1111/1468-2273.00218 Knight, P. T., & Wilcox, S. (1998). Effectiveness and ethics in educational development: Changing contexts, changing notions. The International Journal for Academic Development, 3(2), 97–106. doi:10.1080/1360144980030202 Knowles, M. S. (1970). The modern practice of adult education (Vol. 41). New York Association Press New York. Kobewka, D., Backman, C., Hendry, P., Hamstra, S. J., Suh, K. N., Code, C., & Forster, A. J. (2014). The feasibility of e-learning as a quality improvement tool. Journal of Evaluation in Clinical Practice, 20(5), 606–610. doi:10.1111/ jep.12169 PMID:24828785 Koehler, M., Mishra, P., Kereluik, K., Shin, T., & Graham, C. (2014). The technological pedagogical content knowledge framework. In J. M. Spector, M. D. Merrill, J. Elen, & M. J. Bishop (Eds.), Handbook of Research on Educational Communications and Technology (4th ed., pp. 101–111). New York, NY: Springer. doi:10.1007/978-1-4614-3185-5_9 Koh, M. H., Hill, J. R., & Barbour, M. K. (2010). Strategies for instructors on how to improve online groupwork. Journal of Educational Computing Research, 43(2), 183–205. doi:10.2190/EC.43.2.c Kolb, D. A. (1984). Experiential Learning - Experience as the source of learning and development. Englewoods Cliffs, NJ, USA: Prentice-Hall. Kolb, A. Y., & Kolb, D. A. (2005). Learning styles and learning spaces: Enhancing experiential learning in higher education. Academy of Management Learning & Education, 4(2), 193–212. doi:10.5465/AMLE.2005.17268566 Kolb, D. A. (1984). Experiential learning: Experience as the source of learning and development. Englewood Cliffs, NJ: Prentice Hall. Kolb, D. A., Rubin, I. M., & McIntyre, J. M. (1994). Organizational psychology: An experiential approach to organizational behavior (4th ed.). London: Prentice Hall. Koricich, A. (2013). Technology Review: Multimedia Discussions Through VoiceThread. Community College Enterprise, 19(1), 76–79. 505
Compilation of References
Korzenny, F. (1978). A theory of electronic propinquity: Mediated Communication in organizations. Communication Research, 5(1), 3–23. doi:10.1177/009365027800500101 Ko, S., & Rossen, S. (2010). Teaching online: A practical guide (3rd ed.). New York, NY: Routledge. Kramsch, C., & Thorne, S. (2002). Foreign language learning as global communicative practice. In D. Block & D. Cameron (Eds.), Language learning and teaching in the age of globalization (pp. 83–100). London: Routledge. Krashen, S. (1982). Principles and practice in second language learning and acquisition. Oxford: Pergamon. Krause, S. (2013). The end of the Duke Composition MOOC: again, what did we learn here? Retrieved from http:// stevendkrause.com/2013/06/21/the-end-of-the-duke-composition-mooc-again-what-did-we-learn-here/comment-page-1/ Kress, G., Jewitt, C., Ogborne, J., & Tsatsarelis, C. (2001). Multimodal teaching and learning: The rhetorics of the science classroom. London and New York, NY: Continuum. Kuh, G. (2012). What matters to student success (Keynote address).Proceedings of the annual National Symposium on Student Retention, New Orleans, LA, USA. Kuh, G. D. (2009). What student affairs professionals need to know about student engagement. Journal of College Student Development, 50(6), 683–706. doi:10.1353/csd.0.0099 Ku, M., MacDonald, R. H., Andersen, D. L., Andersen, D. F., & Deegan, M. (2006). Using a simulation-based learning environment for teaching and learning about complexity in public policy decision making. Journal of Public Affairs Education, 22(1), 49–66. Kumar, V. S. (1996). Computer-supported collaborative learning: issues for research. Proceedings of theEighth Annual Graduate Symposium on Computer Science, University of Saskatchewan. Kumar, K., & Wideman, M. (2014). Accessible by design: Applying UDL principles in a first year undergraduate course. Canadian Journal of Higher Education, 44(1), 125–147. Kupczynski, L., Ice, P., Weisenmayer, R., & McCluskey, F. (2010). Student perceptions of the relationship between indicators of teaching presence and success in online courses. Journal of Interactive Online Learning, 9(1), 23–43. Kurt, A. A. (2011). Personalization principle in multimedia learning: Conversational versus formal style in written word. The Turkish Journal of Educational Technology, 10(3), 185–192. Lage, M. J., Platt, G. J., & Treglia, M. (2000). Inverting the classroom: A gateway to creating an inclusive learning environment. The Journal of Economic Education, 31(1), 30–43. doi:10.1080/00220480009596759 Lahti, M. E., Kontio, R. M., & Välimäki, M. (2016). Impact of an e-learning course on clinical practice in psychiatric hospitals: Nurse managers’ views. Perspectives in Psychiatric Care, 52(1), 40–48. doi:10.1111/ppc.12100 PMID:25624098 Lai, L. S., & Turban, E. (2008). Groups formation and operations in the Web 2.0 environment and social networks. Group Decision and Negotiation, 17(5), 387–402. doi:10.1007/s10726-008-9113-2 Lambert, N., & McCombs, B. (2000). Introduction: Learner-centered schools and classrooms as a direction for school reform. In N. Lambert & B. McCombs (Eds.), How students learn (pp. 1–15). Washington, D.C.: American Psychological Association. doi:10.4000/books.pur.16454 Lamb, T. (2008). Learner autonomy and teacher autonomy: Synthesising an agenda. In T. Lamb & H. Reinders (Eds.), Learner and Teacher autonomy (pp. 269–284). Amsterdam: John Benjamins Publishing Company. doi:10.1075/aals.1.21lam
506
Compilation of References
Landriscina, F. (2013). Simulation and learning: A model-centered approach. New York, NY: Springer. doi:10.1007/9781-4614-1954-9 Lara, J. A., Lizcano, D., Martinez, M. A., Pazos, J., & Riera, T. (2014). A system for knowledge discovery in e-learning environments within the European higher education area: Application to student data from Open University of Madrid, UDIMA. Computers & Education, 72, 23–36. doi:10.1016/j.compedu.2013.10.009 Larvin, M. (2009). E-learning in surgical education and training. ANZ Journal of Surgery, 79(3), 133–137. doi:10.1111/ j.1445-2197.2008.04828.x PMID:19317777 Laurillard, D., Oliver, M., Wasson, B., & Hoppe, U. (2009). Implementing technology-enhanced learning (pp. 289–306). Technology-Enhanced Learning. doi:10.1007/978-1-4020-9827-7_17 Lave, J., & Wenger, E. (1991). Situated learning: legitimate peripheral participation. Cambridge University Press. doi:10.1017/CBO9780511815355 Lazonder, A. (2014). Inquiry learning. In J. M. Spector, M. D. Merrill, J. Elen, & M. J. Bishop (Eds.), Handbook of Research on Educational Communications and Technology (4th ed., pp. 453–464). New York, NY: Springer. doi:10.1007/9781-4614-3185-5_36 Leal, F. (2012, September 14). Report: U.S students lack writing skills. Retrieved from http://www.ocregister.com/ articles/students-371409-writing-graders.html Lean, J., Moizer, J., Towler, M., & Abbey, C. (2006). Simulations and games: Use and barriers in higher education. Active Learning in Higher Education, 7(3), 227–242. doi:10.1177/1469787406069056 Lear, J. L., Isernhagen, J. C., LaCost, B. A., & King, J. W. (2009). Instructor Presence for Web-Based Classes. Delta Pi Epsilon Journal, 51(2), 86–98. Retrieved from http://search.proquest.com.liblink.uncw.edu/docview/195592989?ac countid=14606 Learning from Experience: A collection of service-learning projects linking academic standards to curriculum. (2000). Madison, WI: Wisconsin Department of Public Instruction. Learning in Deed. (2002). Battle Creek, MI: W.K. Kellogg Foundation. Leclercq, D., & Poumay, M. (2005). The 8 Learning Events Model and its principles. LabSET. Retrieved from http:// www.labset.net/media/prod/8LEM.pdf Lee, H.-J., & Lim, C. (2012). Peer Evaluation in Blended Team Project-Based Learning: What Do Students Find Important? Journal of Educational Technology & Society, 15(4), 214–224. Lee, J., & Jang, S. (2014). A methodological framework for instructional design model development: Critical dimensions and synthesized procedures. Educational Technology Research and Development, 62(6), 743–765. doi:10.1007/ s11423-014-9352-7 Lee, J., & Lee, W. (2008). The relationship of e-learner’s self-regulatory efficacy and perception of e-learning environmental quality. Computers in Human Behavior, 24(1), 32–47. doi:10.1016/j.chb.2006.12.001 Lee, L. (2014). Digital news stories: Building language learners content knowledge and speaking skills. Foreign Language Annals, 47(2), 338–356. doi:10.1111/flan.12084 Lee, S. M., Kim, Y. R., & Lee, J. (1995). An empirical study of the relationships among end-user information systems acceptance, training, and effectiveness. Journal of Management Information Systems, 12(2), 189–202. doi:10.1080/07 421222.1995.11518086 507
Compilation of References
Lee, Y. J., & Lee, D. (2015). Factors influencing learning satisfaction of migrant workers in Korea with e-learning-based occupational safety and health education. Safety and Health at Work, 6(3), 211–217. doi:10.1016/j.shaw.2015.05.002 PMID:26929830 Leichliter, M. (2010). A case study of universal design for learning applied in the college classroom [Doctoral Dissertation]. Leki, I. (1990). Potential problems with peer responding in ESL writing classes. CATESOL Journal, 3, 5–19. Levine, S. J. (2007). The online discussion board. New Directions for Adult and Continuing Education, 2007(113), 67–74. doi:10.1002/ace.248 Levy, M. (2009). WEB 2.0 implications on knowledge management. Journal of Knowledge Management, 13(1), 120–134. doi:10.1108/13673270910931215 Levy, M., & Hubbard, P. (2005). Why call CALL CALL? Computer Assisted Language Learning, 18(3), 143–149. doi:10.1080/09588220500208884 Lewin, T. (2015, February 13). Harvard and MIT are sued over lack of closed captions. New York Times. Retrieved from http://www.nytimes.com/2015/02/13/education/harvard-and-mit-sued-over-failing-to-caption-online-courses.html Lewis, J. R., & Sauro, J. (2009). The factor structure of the system usability scale (pp. 94–103). Human Centered Design. doi:10.1007/978-3-642-02806-9_12 Lewis, R., Strachan, A., & Smith, M. M. (2012). Is high fidelity simulation the most effective method for the development of non-technical skills in nursing? A review of the current evidence. The Open Nursing Journal, 6, 82–89. doi:10.2174/1874434601206010082 PMID:22893783 Leyland, B. (1996). How can computer games offer deep learning and still be fun.Proceedings of ASCILITE Conference, Adelaide, Australia. Li, M. P., & Lam, B. H. (2013). Cooperative learning. Retrieved from http://www.ied.edu.hk/aclass/l’heories/cooperativelearningcoursewriting_LBH% 2024June.pdf Liapis, A., Katsanos, C., Sotiropoulos, D., Xenos, M., & Karousos, N. (2015). Recognizing emotions in Human Computer Interaction: Studying stress using skin conductance. Human-Computer Interaction-INTERACT ‘15, 255–262. Liegle, J. O., & Janicki, T. N. (2006). The effect of learning styles on the navigation needs of web-based learners. Computers in Human Behavior, 22(5), 885–898. doi:10.1016/j.chb.2004.03.024 Limayem, M., & Cheung, C. M. K. (2011). Predicting the continued use of Internet-based learning technologies: The role of habit. Behaviour & Information Technology, 30(1), 91–99. doi:10.1080/0144929X.2010.490956 Lim, C. P., & Ching, C. S. (2004). An activity-theoretical approach to research of ICT integration in Singapore schools: Orienting activities and learner autonomy. Computers & Education, 43(3), 215–236. doi:10.1016/j.compedu.2003.10.005 Lim, Y. C., & Chiew, T. K. (2014). Creating reusable and interoperable learning objects for developing an e-learning system that supports remediation learning strategy. Computer Applications in Engineering Education, 22(2), 329–339. doi:10.1002/cae.20558 Lin, C. C., Ma, Z., & Lin, R. C. P. (2011). Re-examining the critical success factors of e-learning from the EU perspective. International Journal of Management in Education, 5(1), 44–62. doi:10.1504/IJMIE.2011.037754 Linehan, C., Kirman, B., Lawson, S., & Chan, G. (2011). Practical, appropriate, empirically-validated guidelines for designing educational games. Paper presented the 29th Annual ACM Conference on Human Factors in Computing Systems (CHI ‘11), Vancouver, Canada. doi:10.1145/1978942.1979229 508
Compilation of References
Lin, J. W., Huang, H. H., & Chuang, Y. S. (2015). The impacts of network centrality and self-regulation on an e-learning environment with the support of social network awareness. British Journal of Educational Technology, 46(1), 32–44. doi:10.1111/bjet.12120 Little, D. (1991). Learner Autonomy. 1: Definitions, Issues and Problems. Dublin: Authentik. Liu, E. Z. F., Lin, S. S. J., & Yuan, S. M. (2002). Alternatives to instructor assessment: A case study of comparing self and peer assessment with instructor assessment under a networked innovative assessment procedures. International Journal of Instructional Media, 29(4), 10. Liu, X., Lee, S., Bonk, C. J., Bude, C., & Magjuka, R. J. (2005). Exploring four dimensions of online instructor roles: A program level case study. Journal of Asynchronous Communication, 9(4), 29–48. Liu, Y., & Wang, H. (2009). A comparative study on e-learning technologies and products: From the East to the West. Systems Research and Behavioral Science, 26(2), 191–209. doi:10.1002/sres.959 Livingston, D. (2006, February 11). Differentiated instruction and assessment in the college classroom. Paper presented at the 12th Annual Conference on College and University Teaching. Kennesaw, GA: Kennesaw State University. Retrieved from http://home.lagrange.edu/dlivingston/differentiated.htm Li, Y., Duan, Y., Fu, Z., & Alford, P. (2012). An empirical study on behavioural intention to reuse e-learning systems in rural China. British Journal of Educational Technology, 43(6), 933–948. doi:10.1111/j.1467-8535.2011.01261.x Lobry de Bruyn, L. (2004). Monitoring online communication: Can the development of convergence and social presence indicate an interactive learning environment? Distance Education, 25(1), 67–81. doi:10.1080/0158791042000212468 Lock, J. V. (2003). Building and sustaining virtual communities [Doctoral Dissertation]. University of Calgary, Calgary, Alberta, Canada. Lombardi, A. R., & Murray, C. (2011). Measuring university faculty attitudes toward disability: Willingness to accommodate and adopt universal design principles. Journal of Vocational Rehabilitation, 34(1), 43–56. Loncar, M., Barrett, N. E., & Liu, G.-Z. (2014). Towards the refinement of forum and asynchronous online discussion in educational contexts worldwide: Trends and investigative approaches within a dominant research paradigm. Computers & Education, 73(0), 93–110. doi:10.1016/j.compedu.2013.12.007 Long, T., Logan, J., & Waugh, M. (2016). Students perceptions of the value of using videos as a pre-class learning experience in the flipped classroom. TechTrends, 60(3), 245–252. doi:10.1007/s11528-016-0045-4 Louhiala-Salminen, L., & Kankaaranta, A. (2011). Professional Communication in a Global Business Context: The Notion of Global Communicative Competence. IEEE Transactions on Professional Communication, 54(3), 244–262. doi:10.1109/TPC.2011.2161844 Lou, Y., Abrami, P. C., & dAppolonia, S. (2001). Small group and individual learning with technology: A meta-analysis. Review of Educational Research, 71(3), 449–521. doi:10.3102/00346543071003449 Lowenthal, P., & Parscal, T. (2008). Teaching presence. The Learning Curve, 3(4), 1–2. Lowyck, J. (2014). Bridging learning theories and technology-enhanced environments: A critical appraisal of its history. In J. M. Spector, M. D. Merrill, J. Elen, & M. J. Bishop (Eds.), Handbook of Research on Educational Communications and Technology (4th ed., pp. 3–20). New York: Springer. doi:10.1007/978-1-4614-3185-5_1
509
Compilation of References
Lozano, R., Lozano, F., Mulder, K., Huisingh, D., & Waas, T. (2013). Advancing higher education for sustainable development: International insights and critical reflections. Journal of Cleaner Production, 48, 3–9. doi:10.1016/j. jclepro.2013.03.034 Lüdert, T., Nast, A., Zielke, H., Sterry, W., & Rzany, B. (2008). E-learning in the dermatological education at the Charité: Evaluation of the last three years. JDDG: Journal der Deutschen Dermatologischen Gesellschaft, 6(6), 467–472. doi:10.1111/j.1610-0387.2008.06738.x PMID:18400021 Ludwig, B., Bister, D., Schott, T. C., Lisson, J. A., & Hourfar, J. (2016). Assessment of two e-learning methods teaching undergraduate students cephalometry in orthodontics. European Journal of Dental Education, 20(1), 20–25. doi:10.1111/ eje.12135 PMID:25560366 Luna, R. A., Aranhan, R. N., & Spite, D. H. (n. d.). Simulation in medical education. Retrieved from http://www.telessaude.uerj.br/resource/goldbook/pdf/25.pdf Luo, H., Robinson, A. C., & Park, J. Y. (2014). Peer grading in a mooc: Reliability, validity, and perceived effects. Online Learning: Official Journal of the Online Learning Consortium, 18(2). Lykourentzou, I., Giannoukos, I., Mpardis, G., Nikolopoulos, V., & Loumos, V. (2009). Early and dynamic student achievement prediction in e-learning courses using neural networks. Journal of the American Society for Information Science and Technology, 60(2), 372–380. doi:10.1002/asi.20970 MacLean, P., & Scott, B. (2011). Competencies for learning design: A review of the literature and a proposed framework. British Journal of Educational Technology, 42(4), 557–572. doi:10.1111/j.1467-8535.2010.01090.x MacMillan, T., Forte, M., & Grant, C. (2014). Thematic Analysis of the “Games” Students Play in Asynchronous Learning Environments. Journal of Asynchronous Learning Networks, 18(1). Retrieved from http://olc.onlinelearningconsortium. org/publications/olj_main MacNeill, H., Telner, D., Sparaggis-Agaliotis, A., & Hanna, E. (2014). All for one and one for all: Understanding health professionals’ experience in individual versus collaborative online learning. Journal of Continuing Education in the Health Profession, 34, 102-111. doi:10.1002/chp.21226 Magin, D. (1993). Should student peer ratings be used as part of summative assessment? Research and Development in Higher Education, 16, 537–542. Maier, F. H., & Grobler, A. (2000). What are we talking about? A taxonomy of computer simulations to support learning. System Dynamics Review, 16(2), 135–148. doi:10.1002/1099-1727(200022)16:23.0.CO;2-P Maki, R. H., & Maki, W. S. (2007). Online courses. In F. T. Durso (Ed.), Handbook of applied cognition (pp. 527–552). New York: Wiley & Sons, Ltd. doi:10.1002/9780470713181.ch20 Manca, S., & Ranieri, M. (2016a). Facebook and the others. Potentials and obstacles of Social Media for teaching in higher education. Computers & Education, 95, 216–230. doi:10.1016/j.compedu.2016.01.012 Manca, S., & Ranieri, M. (2016b). Yes for sharing, no for teaching!: Social Media in academic practices. The Internet and Higher Education, 29, 63–74. doi:10.1016/j.iheduc.2015.12.004 Mandermach, B. J., Gonzales, R. M., & Garrett, A. L. (2006). An examination of online instructor presence via threaded discussion participation. Journal of Online Learning and Teaching, 2(4), 248–260. Mangelsdorf, K. (1992). Peer reviews in the ESL composition classroom: What do the students think? ELT Journal, 46(3), 274–284. doi:10.1093/elt/46.3.274
510
Compilation of References
Mangiatordi, A., & Serenelli, F. (2013). Universal design for learning: A meta-analytic review of 80 abstracts from peer reviewed journals. Research on Education and Media, 5(1), 109–113. Mao, J., & Peck, K. (2013). Assessment strategies, self-regulated learning skills and perceptions of assessment in online learning. Quarterly Review of Distance Education, 14(2), 75–95. Marco, F. A., Penichet, V. M. R., & Gallud, J. A. (2013). Collaborative e-Learning through Drag & Share in Synchronous Shared Workspaces. J. UCS, 19(7), 894–911. Marković, S., Jovanović, Z., Jovanović, N., Jevremović, A., & Popović, R. (2013). Adaptive distance learning and testing system. Computer Applications in Engineering Education, 21(Suppl. 1), E2–E13. doi:10.1002/cae.20510 Markus, G. B., Howard, J. P., & King, D. C. (1993). Notes: Integrating community service and classroom instruction enhances learning: Results from an experiment. Educational Evaluation and Policy Analysis, 15(4), 410–419. Marshall, S. (2012). Improving the quality of e-learning: Lessons from the eMM. Journal of Computer Assisted Learning, 28(1), 65–78. doi:10.1111/j.1365-2729.2011.00443.x Martin, A. (2005). DigEuLit – a European Framework for Digital Literacy: a Progress Report. Journal of eLiteracy, 2, 130-136. Martínez-Caro, E. (2011). Factors affecting effectiveness in e-learning: An analysis in production management courses. Computer Applications in Engineering Education, 19(3), 572–581. doi:10.1002/cae.20337 Mason, K. C. (2014, August 25). Colleges adjust to new reality that more students juggle work, family. PBS News Hour. Retrieved from http://www.pbs.org/newshour/updates/colleges-adjust-to-new-reality-that-students-juggle-work-familymore/ Mason, G., Shuman, T. R., & Cook, K. E. (2013). Comparing the effectiveness of an inverted classroom to a traditional classroom in an upper-division engineering course. IEEE Transactions on Education, 56(4), 430–435. doi:10.1109/ TE.2013.2249066 Mason, R., & Rennie, F. (2008). E-Learning and Social Networking Handbook. Abingdon: Routledge. Masoumi, D., & Lindström, B. (2012). Quality in e-learning: A framework for promoting and assuring quality in virtual institutions. Journal of Computer Assisted Learning, 28(1), 27–41. doi:10.1111/j.1365-2729.2011.00440.x Maxwell, S., & Mucklow, J. (2012). E-learning initiatives to support prescribing. British Journal of Clinical Pharmacology, 74(4), 621–631. doi:10.1111/j.1365-2125.2012.04300.x PMID:22509885 Mayer, C. L. (2004). An analysis of the dimensions of a Web-delivered problem based learning environment [Ph. D. Dissertation]. University of Missouri, Columbia. Mayer, R. E. (2009). Multimedia learning (2nd ed.). New York: Cambridge University Press. doi:10.1017/ CBO9780511811678 Mayhew, D. J. (1999). The usability engineering lifecycle. In CHI’99 Extended Abstracts on Human Factors in Computing Systems (pp. 147-148). Mazur, E. (2013, March 13). The flipped classroom will redefine the role of educators. EvoLLLution blog. Retrieved from http://www.evolllution.com/distance_online_learning/audio-flipped-classroom-redefine-role-educators-10-years/ Mazzolini, M., & Maddison, S. (2003). Sage, guide or ghost? The effect of instructor intervention on student participation in online discussion forums. Computers & Education, 40(3), 237–253. doi:10.1016/S0360-1315(02)00129-X
511
Compilation of References
Mazzolini, M., & Maddison, S. (2007). When to jump in: The role of the instructor in online discussion forums. Computers & Education, 49(2), 193–213. doi:10.1016/j.compedu.2005.06.011 McClusky, D. A. III, & Smith, D. (2008). Design and development of a surgical skills simulation Curriculum. World Journal of Surgery, 32(2), 171–181. doi:10.1007/s00268-007-9331-9 PMID:18066685 McCombs, B. (2015). Learner-Centered Online Instruction. New Directions for Teaching and Learning, 2015(144), 57–71. doi:10.1002/tl.20163 McConnell, D. (2000). Implementing computer supported cooperative learning (2nd ed.). London: Kogan Page. McConnell, D. (2006). E-learning groups and communities. Berkshire: Open University Press. McGee, P., & Reis, A. (2012). Blended course design: A synthesis of best practices. Journal of Asynchronous Learning Networks, 16(4), 7–22. McGill, T. J., & Hobbs, V. J. (2008). How students and instructors using a virtual learning environment perceive the fit between technology and task. Journal of Computer Assisted Learning, 24(3), 191–202. doi:10.1111/j.1365-2729.2007.00253.x McGill, T. J., & Klobas, J. E. (2009). A task-technology fit view of learning management system impact. Computers & Education, 52(2), 496–508. doi:10.1016/j.compedu.2008.10.002 McGrath, J. E. (1992). Time, interaction, and performance (TIP): A theory of groups. Small Group Research, 22(2), 147–174. doi:10.1177/1046496491222001 Mcgrath, J. E., Arrow, H., Gruenfeld, D. H., Hollingshead, A. B., & OConnor, K. M. (1993). Groups, tasks, and technology The effects of experience and change. Small Group Research, 24(3), 406–420. doi:10.1177/1046496493243007 McGuire, J. M., Scott, S. S., & Shaw, S. F. (2006). Universal design and its applications in educational environments. Remedial and Special Education, 27(3), 166–175. doi:10.1177/07419325060270030501 McHaney, R. (1991). Computer simulation: A practical perspective. San Diego, CA: Academic Press Professional. McKagan, S. B., Handley, W., Perkins, K. K., & Wieman, C. E. (2008). A Research-based curriculum for teaching the photoelectric effect. Retrieved from http://www.colorado.edu/physics/EducationIssues/papers/McKagan_etal/photoelectric.pdf McKenney, S., & Reeves, T. C. (2012). Conducting educational design research. New York, NY: Routledge. McLeod, G. (2003). Learning theory and instructional design. Learning Matters, 2, 35–43. McLoughlin, C., & Lee, M. J. W. (2007). Social software and participatory learning: Pedagogical choices with technology affordances in the Web 2.0 era. In ICT: Providing choices for learners and learning. Proceedings of the ascilite Singapore. http://www.ascilite.org.au/conferences/singapore07/procs/mcloughlin.pdf McLoughlin, C., & Luca, J. (2001). Quality in online delivery: What does it mean for assessment in e-learning environments. Paper presented at theAnnual Conference of the Australasian Society for Computers in Learning in Tertiary Education. McManus, T. F. (2000). Individualizing instruction in a web-based hypermedia learning environment: Nonlinearity, advance organizers, and self-regulated learners. Journal of Interactive Learning Research, 11(2), 219–251. McPherson, M. A., & Nunes, J. M. (2008). Critical issues for e-learning delivery: What may seem obvious is not always put into practice. Journal of Computer Assisted Learning, 24(5), 433–445. doi:10.1111/j.1365-2729.2008.00281.x Mcquiggan, C. A. (2012). Faculty development for online teaching as a catalyst for change. Journal of Asynchronous Learning Networks, 16(2), 27–61. 512
Compilation of References
Means, B., Bakia, M., & Murphy, R. (2014). Learning online: What research tells us about whether, when and how. New York, NY: Routledge. Means, B., Toyama, Y., Murphy, R., Bakia, M., & Jones, K. (2010). Evaluation of evidence-based practices in online learning. Washington, DC: U.S. Department of Education. Mehrabian, A. (1971). Silent messages. Belmont, CA: Wadsworth. Mentes, S. A., & Turan, A. H. (2012). Assessing the Usability of University Websites: An Empirical Study on Namik Kemal University. Turkish Online Journal of Educational Technology, 11(3), 61–69. Meo, G. (2008). Curriculum planning for all learners: Applying universal design for learning (UDL) to a high school reading comprehension program. Preventing School Failure: Alternative Education for Children and Youth, 52(2), 21–30. doi:10.3200/PSFL.52.2.21-30 Methodology Manual. (n. d.). Retrieved from http://www.preciousheart.net/chaplaincy/Auditor_Manual/13casesd.pdf Meyer, A., Rose, D. H., & Gordon, D. (2014). Universal design for learning: Theory and practice. Wakefield, MA: CAST. Miles, M. B., & Huberman, A. M. (1994). Qualitative data analysis (2nd ed.). Thousand Oaks, CA: Sage Publications. Miller, M. (2014). Minds online: Teaching effectively with technology. Cambridge, MA: Harvard University Press. doi:10.4159/harvard.9780674735996 Miller, S. M., & Miller, K. M. (2000). Theoretical and practical considerations in the design of web-based instruction. In B. Abbey (Ed.), Instructional and cognitive impacts of web-based education (pp. 156–177). Hershey, PA: Idea Group Publishing. doi:10.4018/978-1-878289-59-9.ch010 Minocha, S., & Roberts, D. (2008). Social, usability, and pedagogical factors influencing studentslearning experiences with wikis and blogs. Pragmatics & Cognition, 16(2), 272–306. doi:10.1075/p&c.16.2.05min Mittal, A. (2010). Framework of e-learning business model. Retrieved from http://fr.slideshare.net/mittalashi/frameworkof-e-learning-business-models Moedtricher, F. (2006). E-learning theories in practice: A comparison of three methods. Journal of Universal Science and Technology of Learning, 0(0), 3–18. Molenda, M. (2015). In search of the elusive ADDIE Model. Performance Improvement, 54(2), 40–42. Retrieved from: http://doi.org/10.1002/pfi.21461 Molenda, M. (2003). In search of the elusive ADDIE model. Performance Improvement, 42(5), 34–36. doi:10.1002/ pfi.4930420508 Moore, C. (2013). Inclusive college teaching: a study of how four award-winning faculty employ universal design [Doctoral dissertation]. Temple University. Moore, E. B. (2015). Designing accessible interactive chemistry simulations. Retrieved from http://confchem.ccce. divched.org/2015SpringConfChemP8 Moore, E. B., Chamberlain, J. M., Parson, R., & Perkins, K. K. (2014). PhET interactive simulations: Transformative tools for teaching chemistry. Journal of Chemical Education, 91(8), 1191–1197. doi:10.1021/ed4005084 Moore, J. L., Dickson-Deane, C., & Galyen, K. (2011). e-Learning, online learning, and distance learning environments: Are they the same? The Internet and Higher Education, 14(2), 129–135. doi:10.1016/j.iheduc.2010.10.001
513
Compilation of References
Moore, M. (1973). Toward a theory of independent learning and teaching. The Journal of Higher Education, 44(12), 661–679. doi:10.2307/1980599 Moore, M. G. (1998). Three types of interaction. American Journal of Distance Education, 3(2), 1–6. doi:10.1080/08923648909526659 Moore, M. G. (2007). The theory of transactional distance. In M. G. Moore (Ed.), Handbook of distance education (pp. 89–105). Mahwah, NJ: Lawrence Erlbaum Associates. Moore, M. G. (2007). The Theory of Transactional Distance. In M. G. Moore (Ed.), The Handbook of Distance Education (2nd ed., pp. 89–108). Mahwah, NJ: Lawrence Erlbaum. Moore, M. G. (Ed.), Handbook of Distance Education (3rd ed.). New York, NY: Routledge. Moore, M. G., & Kearsley, G. (2011). Distance education: A systems view of online learning. Belmont, CA: Cengage Learning. Moore, M., & Kearsley, G. (2005). Distance education: A systems view. Toronto, Canada: Nelson. Moran, M., Seaman, J., & Tinti-Kane, H. (2011). Teaching, Learning, and Sharing: How Today’s Higher Education Faculty Use Social Media. Babson Survey Research Group. Moravec, M., Williams, A., Aguilar-Roca, N., & ODowd, D. K. (2010). Learn before lecture: A strategy that improves learning outcomes in a large introductory biology class. CBE Life Sciences Education, 9(4), 473–481. doi:10.1187/ cbe.10-04-0063 PMID:21123694 Moule, P., Ward, R., & Lockyer, L. (2010). Nursing and healthcare students’ experiences and use of e-learning in higher education. Journal of Advanced Nursing, 66(12), 2785–2795. doi:10.1111/j.1365-2648.2010.05453.x PMID:20946565 Moura, A. P. M., Cunha, L. M., Azeiteiro, U. M., Aires, L., & de Almeida, M. D. V. (2010). Food consumer science postgraduate courses: Comparison of face-to-face versus online delivery systems. British Food Journal, 112(5), 544–556. doi:10.1108/00070701011043781 Moussiaux, S. J., & Norman, J. T. (2003). Constructivist teaching practices: perceptions of teachers and students. Retrieved from http://www.ed.psu.edu Mueller, D., & Strohmeier, S. (2010). Design characteristics of virtual learning environments: An expert study. International Journal of Training and Development, 14(3), 209–222. doi:10.1111/j.1468-2419.2010.00353.x Muhi, K., SzHoke, G., Fülöp, L. J. H., Ferenc, R., & Berger, Á. (2013). A Semi-automatic Usability Evaluation Framework, Computational Science and Its Applications ICCSA ‘13 (pp. 529-542). Muirhead, R. J. (2007). E-learning: Is this teaching at students or teaching with students? Nursing Forum, 42(4), 178–184. doi:10.1111/j.1744-6198.2007.00085.x PMID:17944698 Munshi, F., Lababidi, H., & Alyousef, S. (2015). Low-versus high-fidelity simulations in teaching and assessing clinical skills. Journal of Taibah University Medical Sciences, 10(1), 12–15. doi:10.1016/j.jtumed.2015.01.008 Muntean, C. H., & Muntean, G. M. (2009). Open corpus architecture for personalised ubiquitous e-learning. Personal and Ubiquitous Computing, 13(3), 197–205. doi:10.1007/s00779-007-0189-5 Murau, A. M. (1993). Shared Writing: Students’ Perceptions and Attitudes of Peer Review. Working Papers in Educational Linguistics, 9(2), 71-79.
514
Compilation of References
Murray, C., Lombardi, A., & Wren, C. (2011). The effects of disability-focused training on the attitudes and perceptions of university staff. Remedial and Special Education, 32(4), 290–300. doi:10.1177/0741932510362188 Murray, C., Lombardi, A., Wren, C. T., & Keys, C. (2009). Associations between prior disability-focused training and disability-related attitudes and perceptions among university faculty. Learning Disability Quarterly, 32(2), 87–100. doi:10.2307/27740359 Murray, M., Perez, J., Geist, D., & Hedrick, A. (2012). Student interaction with online course content: Build it and they might come. Journal of Information Technology Education, 11, 125–139. Nakagawa, A. S. (2001). Using VoiceThread for professional development: Probeware training for science teachers. Paper presented at the Annual Technology, Colleges, and Community Worldwide Online Conference. Nandi, D., Hamilton, M., & Harland, J. (2015). What Factors Impact Student-Content Interaction in Fully Online Courses. International Journal of Modern Education and Computer Science, 7(7), 28–35. doi:10.5815/ijmecs.2015.07.04 Narciss, S., Proske, A., & Korndle, H. (2007). Promoting self-regulated learning in web-based learning environments. Computers in Human Behavior, 23(3), 1126–1144. doi:10.1016/j.chb.2006.10.006 National Center on Universal Design for Learning. (2012). UDL Guidelines-Version 2.0. Retrieved from http://www. udlcenter.org/aboutudl/udlguidelines/principle3 National Center on Universal Design for Learning. (2014). About UDL. Retrieved from http://www.udlcenter.org/ aboutudl/whatisudl National Research Council. (1996). National science education standards. Washington, DC: National Academy Press. Navimipour, N. J., & Zareie, B. (2015). A model for assessing the impact of e-learning systems on employees’ satisfaction. Computers in Human Behavior, 53, 475–485. doi:10.1016/j.chb.2015.07.026 Nawaz, A. (2012). E-learning experiences of HEIs in advanced states, developing countries and Pakistan. Universal Journal of Education and General Studies, 1(3), 72–83. Neier, S., & Zayer, L. T. (2015). Students Perceptions and Experiences of Social Media in Higher Education. Journal of Marketing Education, 37(3), 133–143. doi:10.1177/0273475315583748 Nelson, G. L., & Murphy, J. M. (1992). An L2 writing group: Talk and social dimension. Journal of Second Language Writing, 1(3), 171–193. doi:10.1016/1060-3743(92)90002-7 Nelson, G., & Murphy, J. (1993). Peer response groups: Do L2 writers use peer comments in revising their drafts? Journal of Second Language Writing, 27, 135–142. Nelson, J., Dodd, J., & Smith, D. (1990). Faculty willingness to accommodate students with learning disabilities. Journal of Learning Disabilities, 23(3), 185–189. doi:10.1177/002221949002300309 PMID:2313192 Nelson, L. M. (2009). Collaborative problem solving. In C. M. Reigeluth (Ed.), Instructional-design theories and models: A new paradigm of ınstructional theory (Vol. 2) (pp. 241–269). New York: Lawrence Erlbaum Associates Inc. Publisher. Neo, M. (2003). Developing a collaborative learning environment using a web based design. Journal of Computer Assisted Learning, 19(4), 462–473. doi:10.1046/j.0266-4909.2003.00050.x New Media Consortium. (2012). Horizon report: 2012 Higher education edition. Retrieved from http://www.nmc.org/ pdf/2012-horizon-report-HE.pdf New Media Consortium. (2016). Augmented reality. Retrieved from http://www.nmc.org/horizon_topic/augmented-reality/ 515
Compilation of References
New Media Horizon and EDUCAUSE. (2013). NMC Horizon Report: 2013 Higher Education Edition. New Media Consortium. Newman, D., Webb, B., & Cochrane, C. (1995). A content analysis method to measure critical thinking in face-to-face and computer supported group learning. Interpersonal Computer and Technology: An Electronic Journal for the 21st Century, 3(2), 56-77. Ng, C. S. L., Cheung, W. S., & Hew, K. F. (2012). Interaction in asynchronous discussion forums: Peer facilitation techniques. Journal of Computer Assisted Learning, 28(3), 280–294. doi:10.1111/j.1365-2729.2011.00454.x Nichols, M. (2008). Institutional perspectives: The challenges of e-learning diffusion. British Journal of Educational Technology, 39(4), 598–609. doi:10.1111/j.1467-8535.2007.00761.x Nicol D.J., & Macfarlane-Dick, D. (2004). Rethinking Formative Assessment in HE: a theoretical model and seven principles of good feedback practice Nielsen, J. (1994). Heuristic evaluation. Usability inspection methods, 17(1), 25-62. Nilson, L. B. (2010). Teaching at its best: a research-based resource for college instructors. San Francisco, CA: Jossey-Bass. Ni, S., & Aust, R. (2008). Examining teacher verbal immediacy and sense of classroom community in online classes. International Journal on E-Learning, 7(3), 477–498. Retrieved from http://search.proquest.com.liblink.uncw.edu/docv iew/210333926?accountid=14606 Njoroge, B. (2016). How College Instructors Use Social Media for Instruction. WEST VIRGINIA UNIVERSITY. Nold, E. (1981). Revising. In C. H. Frederiksen & J. F. Dominic (Eds.), Writing: The Nature, Development, and Teaching of Written Communication (pp. 67-79). Hillsdale, NJ: Erlbaum. Noonoo, S. (2012, June 20). Flipped learning founders set the record straight. THE Journal. Retrieved from https:// thejournal.com/articles/2012/06/20/flipped-learning-founders-q-and-a.aspx Norman, G. R., & Schmidt, H. G. (1992). The psychological basis of problem-based learning: A review of the evidence. Academic Medicine, 67(9), 557–565. doi:10.1097/00001888-199209000-00002 PMID:1520409 Normark, O. R., & Cetindamar, D. (2005). E-learning in a competitive firm setting. Innovations in Education and Teaching International, 42(4), 325–335. doi:10.1080/14703290500062581 Nuninger, W., Conflant, B., & Châtelet, J.-M. (2016), Roadmap to Ensure the Consistency of WIL with the Projects of Companies and Learners: A Legitimate and Sustainable Training Offer, In Nuninger W. & J.-M. Châtelet J.M (Eds.) Advances in Educational Marketing, Administration, & leadership (AEMAL) Book Series (chap. 8), IGI Gobal. doi:10.4018/978-1-5225-0024-7.ch008 Nuninger, W., & Châtelet, J.-M. (2014). Engineers Abilities Improved Thanks to a Quality WIL Model in Coordination with the Industry for Two Decades.[IJQAETE]. International Journal of Quality Assurance in Engineering and Technology Education, 3(1), 15–51. doi:10.4018/ijqaete.2014010102 Nuninger, W., & Châtelet, J.-M. (2016). Hybridization-Based Courses Consolidated through LMS and PLE Leading to a New Co-Creation of Learning: Changing All Actors’ Behavior for Efficiency. In D. Fonseca & E. Redondo (Eds.), Handbook of Research on Applied E-Learning in Engineering and Architecture Education (pp. 55–87). Hershey, PA: Engineering Science Reference; doi:10.4018/978-1-4666-8803-2.ch004 O’Dowd, R., & Ritter, M. (2006). Understanding and Working with ‘Failed Communication’ in Telecollaborative Exchanges. CALICO, 23(3), 623–642. 516
Compilation of References
O’Sullivan, P. B., Hunt, S., & Lippert, L., Owens, S/. & Whyte, A. (2001, November). Mediated immediacy: Affiliation at distance in educational contexts. Paper presented at theannual meeting of the National Communication Association, Atlanta, GA, USA. Ogden, L., Pyzdrowski, L., & Shambaugh, N. (2014). A Teaching Model for the College Algebra Flipped Classroom. In J. Keengwe, G. Onchwari, & J. Oigara (Eds.), Promoting Active Learning through the Flipped Classroom Model (pp. 47–70). Hershey, PA, USA: IGI Global. doi:10.4018/978-1-4666-4987-3.ch003 Ogden, L., & Shambaugh, N. (2016). The continuous and systematic study of the college algebra flipped classroom. In J. Keengwe (Ed.), Handbook of Research on Active Learning and the Flipped Classroom Model in the Digital Age (pp. 41–72). Hershey, PA, USA: IGI Global. doi:10.4018/978-1-4666-9680-8.ch003 Oiry, E. (2009). Electronic human resource management: Organizational responses to role conflicts created by e-learning. International Journal of Training and Development, 13(2), 111–123. doi:10.1111/j.1468-2419.2009.00321.x Olofsson, A., Lindberg, J. O., & Stödberg, U. (2011). Shared video media and blogging online: Educational technologies for enhancing formative e‐assessment? Campus-Wide Information Systems, 28(1), 41–55. doi:10.1108/10650741111097287 Orfanou, K., Tselios, N., & Katsanos, C. (2015). Perceived usability evaluation of learning management systems: Empirical evaluation of the System Usability Scale. The International Review of Research in Open and Distributed Learning, 16(2). doi:10.19173/irrodl.v16i2.1955 Orlando, J. (2011). How to effectively assess online learning (White Paper). Retrieved from http://www.stjohns.edu/sites/ default/files/documents/ir/f63bd49dcf56481e9dbd6975cce6c792.pdf Orús, C., Barlés, M. J., Belanche, D., Casaló, L., Fraj, E., & Gurrea, R. (2016). The effects of learner-generated videos for YouTube on learning outcomes and satisfaction. Computers & Education, 95, 254–269. doi:10.1016/j.compedu.2016.01.007 Osguthorpe, R. T., & Graham, C. R. (2003). Blended learning environments: Definitions and directions. Quarterly Review of Distance Education, 4(3), 227–233. Ossiannilsson, E., & Landgren, L. (2012). Quality in e-learning: A conceptual framework based on experiences from three international benchmarking projects. Journal of Computer Assisted Learning, 28(1), 42–51. doi:10.1111/j.13652729.2011.00439.x Otten, H., & Ohana, Y. (2009). The eight key competencies for lifelong learning: an appropriate framework within which to develop the competence of trainers in the field of European youth work or just plain politics? Retrieved from: http:// www.ikab.de/reports/Otten_Ohana_8keycompetence_study_2009.pdf Oxford, R. (2003). Language learning styles and strategies: Concepts and relationships. International Review of Applied Linguistics in Language Teaching Journal, 41(4), 271–278. Retrieved from http://web.ntpu.edu.tw/~language / workshop/read2.pdf Özden, Y. (2003). Öğrenme ve öğretme. Ankara: Pegem Akademik. Oztekin, A., Delen, D., Turkyilmaz, A., & Zaim, S. (2013). A machine learning-based usability evaluation method for eLearning systems. Decision Support Systems, 56, 63–73. doi:10.1016/j.dss.2013.05.003 Oztok, M., Zingaro, D., Makos, A., Brett, C., & Hewitt, J. (2015). Capitalizing on social presence: The relationship between social capital and social presence. The Internet and Higher Education, 26, 19–24. doi:10.1016/j.iheduc.2015.04.002 Ozyurt, O., & Ozyurt, H. (2011). 
Investigating the effects if asynchronous discussions on students’ learning and understanding of mathematics subjects. Turkish Online Journal of Distance Education, 12(4), 17–33.
517
Compilation of References
Ozyurt, O., & Ozyurt, H. (2015). Learning style based individualized adaptive e-learning environments: Content analysis of the articles published from 2005 to 2014. Computers in Human Behavior, 52, 349–358. doi:10.1016/j.chb.2015.06.020 Ozyurt, O., Ozyurt, H., Guven, B., & Baki, A. (2014). The effects of UZWEBMAT on the probability unit achievement of Turkish eleventh grade students and the reasons for such effects. Computers & Education, 75, 1–18. doi:10.1016/j. compedu.2014.02.005 Pachler, N., & Daly, C. (2011). Key issues in e-learning: Research and practice. Bloomsbury Publishing. Paganelli, L., & Paternò, F. (2002). Intelligent analysis of user interactions with web applications Proceedings of theInternational conference on Intelligent user interfaces (pp. 111-118). doi:10.1145/502716.502735 Pagram, P., & Pagram, J. (2006). Issues in e-learning: A Thai case study. The Electronic Journal of Information Systems in Developing Countries, 26(6), 1–8. Palincsar, A. S., & Brown, A. L. (1984). Reciprocal teaching of comprehension-fostering and comprehension-monitoring activities. Cognition and Instruction, 1(2), 117–175. doi:10.1207/s1532690xci0102_1 Palloff, R., & Pratt, K. (2007). Building online learning communities (2ndEd.). San Francisco: Jossey-Bass. Palloff, R. M., & Pratt, K. (2010). Collaborating online: Learning together in community (Vol. 32). John Wiley & Sons. Palloff, R. M., & Pratt, K. (2011). The excellent online instructor: Strategies for professional development. San Francisco, CA: Jossey-Bass. Palloff, R. M., & Pratt, K. (2011). The excellent online instructor: Strategies for Professional Development. San Francisco: Jossey-Bass. Pang, Y. J. (2010). Techniques for Enhancing Hybrid Learning of Physical Education, In P. Tsang et al. (Ed.) Hybrid Learning 3rd International Conf., Beijing, China, August 16-18,Proceedings LNCS 6248 (pp. 94–105), Berlin Heidelberg: Springer-Verlag doi:10.1007/978-3-642-14657-2_10 Panitz, T. (1996). A Definition of collaborative vs cooperative learning. Retrieved from http://www.londonmet.ac.uk/ deliberations/collaborative-learning/panitz-paper.cfm Panitz, T. (1999). Collaborative versus cooperative learning: A comparison of the two concepts which will help us understand the underlying nature of interactive learning. ERIC Clearinghouse. Pappas, C. (2013). The Facebook Guide for Teachers-eLearning Industry. Retrieved from http://elearningindustry.com/ the-facebook-guide-for-teachers Pariente-Martinez, B., Gonzalez-Rodriguez, M., Fernandez-Lanvin, D., & De Andres-Suarez, J. (2014). Measuring the role of age in user performance during interaction with computers. Universal Access in the Information Society. Park, Y. J., & Bonk, C. (2007). Is online life a breeze? A case study for promoting synchronous learning in a blended graduate course. Journal of Online Learning and Teaching, 3(3), 1–14. Pascarella, E. T., & Terenzini, P. T. (2005). How college affects learners: A third decade of research (Vol. 2). Jossey-Bass. Paternò, F., Santoro, C., & Spano, L. D. (2012). Improving support for visual task modelling (pp. 299–306). HumanCentered Software Engineering. Paul, A., Podolefsky, N. S., & Perkins, K. K. (2012). Guiding without feeling guided: Implicit scaffolding through interactive simulation design. In P. V. Engelhardt, A. D. Churukian, & N. S. Rebello (Eds.), 2012 physics education research conference proceedings (pp. 302−305). Philadelphia, PA: AIP. 518
Compilation of References
Pavlenko, P. D. (2010). Theory, history and methodology of social work (Handbook). Moscow: Dashkov & Co.(In Russian) Pawan, F., Paulus, T. M., Yalcin, S., & Chang, C. F. (2003). Online learning: Patterns of engagement and interaction among in-service teachers. Language Learning & Technology, 7(3), 119–140. Pecot-Hebert, L. (2012). To hybrid or not to hybrid, that is the question! Incorporating VoiceThread Technology into a traditional communication course. Communication Teacher, 26(3), 129–134. doi:10.1080/17404622.2011.650703 Pelgrum, W. J. (2001). Obstacles to the integration of ICT in education: Results from a worldwide educational assessment. Computers & Education, 37(2), 163–178. doi:10.1016/S0360-1315(01)00045-8 Perchik, J. (2014, May 14). Flipped classroom: When it fails and why. In-Training blog. Retrieved from http://in-training. org/flipped-classroom-fails-7052 Perkins, K. K., & Moore, E. B. (2014). Blending implicit scaffolding and games in PhET interactive simulations. In J. L. Polman, E. A. Kyza, D. K. O’Neill, I. Tabak, W. R. Penuel, A. S. Jurow, & L. D’Amico et al. (Eds.), The international conference of the learning sciences: Learning and becoming in practice (pp. 1201–1202). Boulder, CO: International Society of the Learning Sciences. Persall, N. R., Skipper, J. E. J., & Mintzes, J. J. (1997). Knowledge restructuring in the life sciences: A longitudinal study of conceptual change in biology. Science Education, 81(2), 193–215. doi:10.1002/(SICI)1098-237X(199704)81:23.0.CO;2-A Persico, D., Manca, S., & Pozzi, F. (2014). Adapting the Technology Acceptance Model to evaluate the innovative potential of e-learning systems. Computers in Human Behavior, 30, 614–622. doi:10.1016/j.chb.2013.07.045 Peterson, D., Robinson, K., Verrall, T., & Quested, B. (2008). Experiences on e-learning projects. ISBT Science Series, 3(1), 175–182. doi:10.1111/j.1751-2824.2008.00163.x Peterson, D., Robinson, K., Verrall, T., Quested, B., & Saxon, B. (2007). E-learning and transfusion medicine. ISBT Science Series, 2(2), 27–32. doi:10.1111/j.1751-2824.2007.00107.x Peterson, Q. R. (1970). Some reflections on the use and abuse of molecular models. Journal of Chemical Education, 47(1), 24–29. doi:10.1021/ed047p24 Pew Internet Project. (2014). Mobile technology fact sheet. Pew Research Center’s Internet & American Life Project. Retrieved from http://www.pewinternet.org/fact-sheets/mobile-technology-fact-sheet/ PhET. (2016a). Interactive simulations for science and math. Retrieved from http://phet.colorado.edu/ PhET. (2016b). Research. Retrieved from http://phet.colorado.edu/en/research#use PhET. (2016c). Teaching resources. Retrieved from http://phet.colorado.edu/en/teaching-resources PhET. (n. d.). Using PhET interactive simulations in college lecture: Ideas for engaging students through inquiry in lecture settings. Retrieved from http://phet.colorado.edu/files/guides/Planning/UG_Phys_Guide-Lecture-Overview.pdf Phillips, R., McNaught, C., & Kennedy, G. (2012). Evaluating e-learning: Guiding research and practice. Routledge. Piaget, J. (1971). Genetic epistemology. New York: W. W. Norton. Piaget, J. (1977). The development of thought: Equilibration of cognitive structures. New York: Viking. Picciano, A. G. (2002). Beyond student perceptions: Issues of interaction, presence, and performance in an online course. Journal of Asynchronous Learning Networks, 6(1), 21–38.
519
Compilation of References
Pintar, R., Jereb, E., Vukovic, G., & Urh, M. (2015). Analysis of web sites for e-learning in the field of foreign exchange trading. Procedia: Social and Behavioral Sciences, 197, 245–254. doi:10.1016/j.sbspro.2015.07.131 Plotnikoff, D. (2013, July 16). Classes should do hands-on exercises before reading and video, Stanford researchers say. Stanford Report. Retrieved from http://news.stanford.edu/news/2013/july/flipped-learning-model-071613.html Podolefsky, N. S., Moore, E. B., & Perkins, K. K. (2013). Implicit scaffolding in interactive simulations: Design strategies to support multiple educational goals. Retrieved from http://arxiv.org/abs/1306.6544 Podolefsky, N. S., Perkins, K. K., & Adams, W. K. (2010). Factors promoting engaged exploration with computer simulations. Physical Review Physics Education Research, 6(2), 1–11. Pollard, H., Minor, M., & Swanson, A. (2014). Instructor Social Presence within the Community of Inquiry Framework and Its Impact on Classroom Community and the Learning Environment. Online Journal of Distance Learning Administration, 17(2), n2. Pons, D., Hilera, J. R., Fernandez, L., & Pages, C. (2015). Managing the quality of e-learning resources in repositories. Computer Applications in Engineering Education, 23(4), 477–488. doi:10.1002/cae.21619 Ponterotto, D. (2005). Intercultural Communication: Searching for Categories in Conversation Analysis. In M. Bondi & N. Maxwell (Eds.), Cross-Cultural Encounters: Linguistic Perspectives (pp. 253–265). Roma: Officina Edizioni. Poolton, J. M., Zhu, F. F., Malhotra, N., Leung, G. K., Fan, J. K., & Masters, R. S. (2016). Multitask training promotes automaticity of a fundamental laparoscopic skill without compromising the rate of skill learning. Surgical Endoscopy, 30(1), 1–8. PMID:26743112 Popescu, E. (2010). Adaptation provisioning with respect to learning styles in a web-based educational system: An experimental study. Journal of Computer Assisted Learning, 26(4), 243–257. doi:10.1111/j.1365-2729.2010.00364.x Porter, T. S., Riley, T. M., & Ruffer, R. L. (2004). A review of the use of simulations in teaching economics. Social Science Computer Review, 22(4), 426–443. doi:10.1177/0894439304268464 Pratt, N. (2008). Multi-point e-conferencing with initial teacher training students in England: Pitfalls and potential. Teaching and Teacher Education, 24(6), 1476–1486. doi:10.1016/j.tate.2008.02.018 Premlatha, K. R., & Geetha, T. V. (2015). Learning content design and learner adaptation for adaptive e-learning environment: A survey. Artificial Intelligence Review, 44(4), 443–465. doi:10.1007/s10462-015-9432-z Prensky, M. (2001). Digital game-based learning. New York, NY: McGraw-Hill. Pressey, S. L. (1926). A simple apparatus which gives tests and scores-and teaches. School and society, 23(586), 373-376. Punie, Y. (2007). Learning spaces: An ICT-enabled model of future learning in the knowledge-based society. European Journal of Education, 42(2), 185–199. doi:10.1111/j.1465-3435.2007.00302.x Quade, M., Lehmann, G., Engelbrecht, K.-P., Roscher, D., & Albayrak, S. (2013). Automated usability evaluation of model-based adaptive user interfaces for users with special and specific needs by simulating user interaction (pp. 219–247). User Modeling and Adaptation for Daily Routines. doi:10.1007/978-1-4471-4778-7_9 Quality Matters Program. (2014). Quality Matters rubric standards 2014 fifth edition with assigned point values. Retrieved from https://www.qualitymatters.org/rubric Quality Matters. (2016). Higher Ed Program Rubric. 
Retrieved from https://www.qualitymatters.org/rubric
520
Compilation of References
Quintana, C., Zhang, M., & Krajcik, J. (2005). A framework for supporting metacognitive aspects of online inquiry through software-based scaffolding. Educational Psychologist, 40(4), 235–244. doi:10.1207/s15326985ep4004_5 Qureshi, I. A., Ilyas, K., Yasmin, R., & Whitty, M. (2012). Challenges of implementing e-learning in a Pakistani university. Knowledge Management & E-Learning: An International Journal, 4(3), 310–324. Rada, R., Michailidis, A., & Wang, W. (1994). Collaborative hypermedia in a classroom setting. Journal of Educational Multimedia and Hypermedia, 3, 21–36. Rakes, C. R., Valentine, J. C., McGatha, M. B., & Ronau, R. N. (2010). Methods of instructional improvement in algebra: A systematic review and meta-analysis. Review of Educational Research, 80(3), 372–400. doi:10.3102/0034654310374880 Rao, K. (2012). Universal design for online courses: Addressing the needs of non-traditional learners. Proceedings of the2012 IEEE International Conference on Technology Enhanced Education (ICTEE). doi:10.1109/ICTEE.2012.6208664 Rao, K., Edelen-Smith, P., & Wailehua, C. U. (2015). Universal design for online courses: Applying principles to pedagogy. Open Learning: The Journal of Open, Distance and e-Learning, 30(1), 35–52. Rao, K., & Tanners, A. (2011). Curb cuts in cyberspace: Universal instructional design for online courses. Journal of Postsecondary Education and Disability, 24(3), 211–229. Rao, K., & Tanners, A. (2011). Curb cuts in cyberspace: Universal Instructional Design for online courses. Journal of Postsecondary Education and Disability, 24(3), 211–229. Redmond, P., & Lock, J. V. (2006). A flexible framework for online collaborative learning. The Internet and Higher Education, 9(4), 267–276. doi:10.1016/j.iheduc.2006.08.003 Rees, J. (2013). The MOOC Racket. Retrieved from http://www.slate.com/articles/technology/future_tense/2013/07/ moocs_could_be_disastrous_for_students_and_professors.html Reigeluth, C. M., & Carr-Chellman, A. A. (2009). Understanding instructional theory. In C. M. Reigeluth & A. A. Carr-Chellman (Eds.), Instructional design theories and models (Vol. III, pp. 3–26). New York, NY: Taylor & Francis. Reinders, H., & Darasawang, P. (2012). Diversity in language support. In G. Stockwell (Ed.), Computer-assisted language learning: Diversity in research and practice. Cambridge: Cambridge University Press. doi:10.1017/CBO9781139060981.004 Renaut, C., Batier, C., Flory, L., & Heyde, M. (2006). Improving web site usability for a better e-learning experience. Current developments in technology-assisted education (pp. 891–895). Badajoz, Spain: FORMATEX. Reznick, R., & McCrae, H. (2006). Teaching surgical skills: Changes in the world. The New England Journal of Medicine, 355(25), 2664–2669. doi:10.1056/NEJMra054785 PMID:17182991 Riccomini, P. (2002). The comparative effectiveness of two forms of feedback: Web-based model comparison and instructor delivered corrective feedback. Journal of Educational Computing Research, 27(3), 213–228. doi:10.2190/ HF6D-AXQX-AXFF-M760 Richards, C. (2006). Towards an integrated framework for designing effective ICT-supported learning environments: The challenge to better link technology and pedagogy. Technology, Pedagogy and Education, 15(2), 239–255. doi:10.1080/14759390600769771 Richardson, J. C., & Ice, P. (2010). Investigating students level of critical thinking across instructional strategies in online discussions. The Internet and Higher Education, 13(1), 52–59. doi:10.1016/j.iheduc.2009.10.009 Richardson, J. C., & Swan, K. (2003). 
Examining social presence in online courses in relation to students’ perceived learning and satisfaction. Journal of Asynchronous Learning Networks, 7(1). 521
Compilation of References
Richardson, J., & Swan, K. (2003). Examining social presence in online courses in relation to students’ perceived learning and satisfaction. Journal of Asynchronous Learning Networks, 7(1), 68–88. Richey, R., & Klein, J. D. (2007). Design and development research: Methods, strategies, and issues. New York: Routledge. Richmond, V. P. (2002). Teacher nonverbal immediacy: Uses and outcomes. In Communications for teachers . Richter, T., & McPherson, M. (2012). Open educational resources: Education for the world? Distance Education, 33(2), 201–219. doi:10.1080/01587919.2012.692068 Riendeau, D. (2012). Flipping the classroom. The Physics Teacher, 50(1), 507. doi:10.1119/1.4758164 Roach, T. (2014). Student perceptions toward flipped learning: New methods to increase interaction and active learning in economics. International Review of Economics Education, 17, 74–84. doi:10.1016/j.iree.2014.08.003 Robles, M., & Braathen, S. (2002). Online assessment techniques. Delta Pi Epsilon Journal, 44(1), 39–49. Rockinson-Szapkiw, A. J., Baker, J. D., Neukrug, E., & Hanes, J. (2010). The efficacy of computer mediated communication technologies to augment and support effective online helping profession education. Journal of Technology in Human Services, 28(3), 161–177. doi:10.1080/15228835.2010.508363 Rolfe, J. M., & Hampson, B. P. (2003). Flight simulation: Viability versus liability issues of accuracy, data and validation. Aeronautical Journal, 107(1076), 631–635. Rosaci, D., & Sarné, G. M. L. (2010). Efficient personalization of e-learning activities using a multi-device decentralized recommender system. Computational Intelligence, 26(2), 121–141. doi:10.1111/j.1467-8640.2009.00343.x Rose, D., Harbour, W., Johnston, C. S., Daley, S., & Abarbanell, L. (2006). Universal design for learning in postsecondary education: Reflections on principles and their application. Journal of Postsecondary Education and Disability, 19(2), 17. Retrieved from http://www.udlcenter.org/sites/udlcenter.org/files/UDLinPostsecondary.pdf Rose, D., Harbour, W., Johnston, C., Daley, S., & Abarbanell, L. (2006). Universal Design for Learning in postsecondary education: Reflections on principles and their application. Journal of Postsecondary Education and Disability, 19(2), 17. Rose, D., & Meyer, A. (2002). Teaching every student in the digital age: Universal Design for Learning. Alexandria, VA: Association for Supervision & Curriculum Development. Rose, D., & Meyer, A. (Eds.), (2006). A practical reader in universal design for learning. Cambridge, MA: Harvard Education Press. Rose, D., & Strangman, N. (2007). Universal design for learning: Meeting the challenge of individual learning differences through a neurocognitive perspective. Universal Access in the Information Society, 5(4), 381–391. doi:10.1007/ s10209-006-0062-8 Rosenberg, J. M., Terry, C. A., Bell, J., Hiltz, V., & Russo, T. E. (2016). Design Guidelines for Graduate Program Social Media Use. TechTrends. Rosenberg, M. J. (2001). E-learning: Strategies for delivering knowledge in the digital age. New York, NY: McGraw–Hill. Rosenberg, M. J. (2006). Beyond e-learning: Approaches and technologies to enhance organizational knowledge, learning, and performance. San Francisco: Pfeiffer. Rourke, L., & Kanuka, H. (2009). Learning in Communities of Inquiry: A Review of the Literature. Journal of Distance Education, 23(1), 19–48.
522
Compilation of References
Rovai, A. P. (2002). Development of an instrument to measure classroom community. The Internet and Higher Education, 5(3), 197–211. doi:10.1016/S1096-7516(02)00102-1 Rovai, A. P. (2002). Sense of community, perceived cognitive learning, and persistence in asynchronous learning networks. The Internet and Higher Education, 5(4), 319–332. doi:10.1016/S1096-7516(02)00130-6 Rovai, A. P. (2004). A constructivist approach to online college learning. The Internet and Higher Education, 7(2), 79–93. doi:10.1016/j.iheduc.2003.10.002 Rubin-Vaughan, A., Pepler, D., Brown, S., & Craig, W. (2011). Quest for the golden rule: An effective social skills promotion and bullying prevention program. Computers & Education, 56(1), 166–175. doi:10.1016/j.compedu.2010.08.009 Rudestam, K. J., & Schoenholtz-Read, J. (2010). The flourishing of adult online education. In K. J. Rudestam & J. Schoenholtz-Read (Eds.), Handbook of Online Learning (2nd ed.). California: SAGE. Rudestam, K. J., & Schoenholtz-Read, J. (2010). The flourishing of adult online education: An overview. In K. J. Rudestam & J. Schoenholtz-Read (Eds.), Handbook of Online Learning (2nd ed.). Thousand Oaks, CA: Sage Publications. Ruiz, J. G., Teasdale, T. A., Hajjar, I., Shaughnessy, M., & Mintzer, M. J. (2007). The consortium of e-learning in geriatrics instruction. Journal of the American Geriatrics Society, 55(3), 458–463. doi:10.1111/j.1532-5415.2007.01095.x PMID:17341252 Rushton, C., Ramsey, P., & Rada, R. (1993). Peer assessment in a collaborative hypermedia environment: A case study. Journal of Computer-Based Instruction, 20, 75–80. Russel, J., Elton, L., Swinglehurst, D., & Greenhalgh, T. (2006). Using the online environment assessment for learning: A case study of a web-based course in primary care. Assessment & Evaluation in Higher Education, 31(4), 465–478. doi:10.1080/02602930600679209 Sadler, D. R. (1989). Formative assessment and the design of instructional systems. Instructional Science, 18(2), 119–144. doi:10.1007/BF00117714 Sadler, P. M., & Good, E. (2006). The impact of self-and peer-grading on student learning. Educational Assessment, 11(1), 1–31. doi:10.1207/s15326977ea1101_1 Salifu, S. (2016). Understanding flipped instructions and how they work in the real world. In J. Keengwe (Ed.), Handbook of Research on Active Learning and the Flipped Classroom Model in the Digital Age (pp. 72–90). Hershey, PA, USA: IGI Global. doi:10.4018/978-1-4666-9680-8.ch004 Salloum, S. (2012). Student perceptions of computer-mediated communication tools in online learning| Helpfulness and effects on teaching, social, and cognitive presence. doi:10.1007/s10726-011-9234-x Salter, N. P., & Conneely, M. R. (2015). Structured and unstructured discussion forums as tools for student engagement. Computers in Human Behavior, 46, 18–25. doi:10.1016/j.chb.2014.12.037 Saltmarsh, J., Hartley, M., & Clayton, P. (2009). Democratic engagement white paper. New England Resource Center for Higher Education Texas State Auditor’s Office. Sambrook, S. (2003), E-learning in small organizations, Education + Training, 45 (8/9), pp. 506-516 Sams, A., & Bennett, B. (2012). The truth about flipped learning. eSchool News. Retrieved from http://www.eschoolnews. com/2012/05/31/the-truth-about-flipped-learning/3/
523
Compilation of References
Sang, G., Valcke, M., Braak, J. V., & Tondeur, J. (2010). Student teachers thinking processes and ICT integration: Predictors of prospective teaching behaviors with educational technology. Computers & Education, 54(1), 103–112. doi:10.1016/j.compedu.2009.07.010 Sapundzhieva, K. (2011). The academic status of social pedagogy in the context of challenges of the modern socialization situation and social practice(In Bulgarian). Pedagogy, 1, 11–20. Sargeant, J., Curran, V., Allen, M., Jarvis-Selinger, S., & Ho, K. (2006). Facilitating interpersonal interaction and learning online: Linking theory and practice. The Journal of Continuing Education in the Health Professions, 26(2), 128–136. doi:10.1002/chp.61 PMID:16802307 Savery, J. R., & Duffy, T. M. (1995). Problem based learning: An instructional model and its constructivist framework. Educational Technology, 35(5), 31–38. Saxena, S. (2013). Best classroom practices for student-centric teaching. EdTechReview. Retrieved from http://edtechreview.in/news/news/trends-insights/insights/775-best-classroom-practices-for-student-centricteaching?goback=%2Egde_5092459_member_5810429914867843076#%21 Saywer, R. K. (2006). The Cambridge handbook of the learning sciences. Cambridge University Press. Scharmer, C. O. (2009). Theory U: Leading from the Future as It Emerges. Berrett-Koeheler Publishers. Schein, E. H. (2013), “Humble Inquiry: The Gentle Art of Asking Instead of Telling”, Ed: Berrett-Koehler Publishers Schellens, T., Van Keer, H., & Valcke, M. (2005). The impact of role assignment on knowledge construction in asynchronous discussion groups a multilevel analysis. Small Group Research, 36(6), 704–745. doi:10.1177/1046496405281771 Schmidt, Goldhaber-Fiebert, Ho, & McDonald. (2013). Simulation exercises as a patient safety strategy: A systematic review. Patient Safety Network, 158(5), 426–432. PMID:23460100 Schneckenberg, D., Ehlers, U., & Adelsberger, H. (2011). Web 2.0 and competence-oriented design of learning: Potentials and implications for higher education. British Journal of Educational Technology, 42(5), 747–762. doi:10.1111/j.14678535.2010.01092.x Schoonenboom, J. (2014). Using an adapted, task-level technology acceptance model to explain why instructors in higher education intend to use some learning management system tools more than others. Computers & Education, 71, 247–256. doi:10.1016/j.compedu.2013.09.016 Schrader, D. E. (2015). Constructivism and Learning in the Age of Social Media: Changing Minds and Learning Communities. New Directions for Teaching and Learning, 2015(144), 23–35. doi:10.1002/tl.20160 Schrire, S. (2006). Knowledge building in asynchronous discussion groups: Going beyond quantitative analysis. Computers & Education, 46(1), 49–70. doi:10.1016/j.compedu.2005.04.006 Scott, K. M. (2013). Does a university teacher need to change e-learning beliefs and practices when using a social networking site? A longitudinal case study. British Journal of Educational Technology, 44(4), 571–580. doi:10.1111/bjet.12072 Scott, S. S., Mcguire, J. M., & Shaw, S. F. (2003). Universal design for instruction: A new paradigm for adult instruction in postsecondary education. Remedial and Special Education, 24(6), 369–379. doi:10.1177/07419325030240060801 Scwitzer, A. M., Griffin, O. T., Kancis, J. R., & Thomas, C. R. (1999). Social adjustment experiences of African American college learners. Journal of Counseling and Development, 77(2), 189–197. doi:10.1002/j.1556-6676.1999.tb02439.x
524
Compilation of References
Seidlhofer, B. (2008). Standard future or half-baked quackery: descripitve and pedagogic bearings on the globalisation of English. In C. Gnutzmann & F. Inteman (Eds.), The globalisation of English and the English language classroom (2nd ed., pp. 159–173). Tübingen: Gunter Narr Verlag. Seidlhofer, B. (2011). Understanding English as a Lingua Franca. Oxford: Oxford University Press. Semenov, A. (2005). Information and communication technologies in schools: A handbook for teachers or How ICT can create new, open learning environments. UNESCO. Retrieved from http://unesdoc.unesco.org/images/0013/001390/139028e.pdf Senge, P. & al. (1994). The fifth discipline fieldbook. London: Nicolas Brealey Publishing. Servonsky, E. J., Daniels, W. L., & Davis, B. L. (2005). Evaluation of Blackboard(TM) as a Platform for Distance Education Delivery. The ABNF Journal, 16(6), 132–135. PMID:16382797 Shaffer, D. (2008). Education in the digital age. The Nordic Journal of Digital Literacy, 4(1), 39–51. Shambaugh, N. (2003). Use of CoWebs in scenario-based ID instruction. Proceedings of the 26th Annual Anaheim: Selected Papers On the Practice of Educational Communications and Technology (pp. 400-407) Association for Educational Communications and Technology (AECT). Shambaugh, N. (2016). Documenting the online course (Working paper EDP640 2010-2015). West Virginia University. Shambaugh, N. (2007). Using developmental research to evaluate blended teaching in higher education. In P. Richards (Ed.), Global Issues in Higher Education (pp. 1–28). Hauppauge, NY: Nova Science Publishers. Shambaugh, R. N., & Magliaro, S. G. (1997). Mastering the possibilities: A process approach to instructional design. Boston, MA: Allyn & Bacon. Shambaugh, R. N., & Magliaro, S. G. (2001). A reflexive model for teaching and learning instructional design. Educational Technology Research and Development, 49(2), 69–92. doi:10.1007/BF02504929 Shanley, E. L., Thompson, C. A., Leuchner, L. A., & Zhao, Y. (2004). Distance education is as effective as traditional education when teaching food safety. Food Service Technology, 4(1), 1–8. doi:10.1111/j.1471-5740.2003.00071.x Shapiro, N. Z., & Anderson, R. H. (1985). Toward an Ethics and Etiquette for Electronic Mail. ERIC. Sharif, A., & Magrill, B. (2015). Discussion forums in MOOCs. International Journal of Learning, Teaching and Educational Research, 12(1). Sharpe, R., Benfield, G., & Francis, R. (2006). Implementing a university e-learning strategy: Levers for change within academic schools. ALT-J: Research in Learning Technology, 14(2), 135–151. doi:10.1080/09687760600668503 Shaver, M. (2010). Using low tech interactions in the chemistry classroom to engage students in active learning. Journal of Chemical Education, 87(12), 1320–1323. doi:10.1021/ed900017j Shea, P. J., Pickett, A. M., & Pelz, W. E. (2003). A follow-up investigation of “teaching presence” in the SUNY Learning Network. Journal of Asynchronous Learning Networks, 7(2), 61–80. Shea, P. J., Pickett, A. M., & Pelz, W. E. (2003). A follow-up investigation of teaching presence in the SUNY learning network. Journal of Asynchronous Learning Networks, 7(2), 61–80. Shea, P., & Bidjerano, T. (2010). Learning presence: Towards a theory of self-efficacy, self-regulation, and the development of a communities of inquiry in online and blended learning environments. Computers & Education, 55(4), 1721–1731. doi:10.1016/j.compedu.2010.07.017
525
Compilation of References
Shea, P., Hayes, S., & Vickers, J. (2010). Online instructional effort measured through the lens of teaching presence in the community of inquiry framework: A re-examination of measure and approach. International Review of Research in Open and Distance Learning, 11(3), 127–154. Shea, P., Li, C., & Pickett, A. (2006). A Study of Teaching Presence and Student Sense of Learning Community in fully Online and Web-enhanced College Courses. The Internet and Higher Education, 9(3), 175–190. doi:10.1016/j. iheduc.2006.06.005 Shea, P., Li, C., Swan, K., & Pickett, A. (2005). Developing learning community in online asynchronous college courses: The role of teaching presence. Journal of Asynchronous Learning Networks, 9(4), 59–82. Shea, P., Vickers, J., & Hayes, S. (2010). Online instructional effort measured through the lens of teaching presence in the Community of Inquiry framework: A re-examination of measures and approach. International Review of Research in Open and Distance Learning, 11(3), 1–29. Sheridan, D., White, D., & Gardner, L. A. (2002). Cecil: the first web-based LMS. Paper presented at the ASCILITE. Sheridan, K., & Kelly, M. A. (2010). The indicators of instructor presence that are important to students in online courses. Journal of Online Learning and Teaching, 6(4), 767. Retrieved from http://search.proquest.com/docview/1497198590 ?accountid=14606 Shihab, M. M. (2008). Web 2.0 tools improve teaching and collaboration in high school English language classes [Unpublished Doctoral Dissertation]. Nova Southeastern University Graduate School of Computer and Information Sciences, USA. Shin, N. (2002). Beyond Interaction: The relational construct of ‘Transactional Presence’. Open Learning: The Journal of Open, Distance and e-Learning, 17(2), 121-137. doi:10.1080/02680510220146887 Shi, S., Mishra, P., Bonk, C., Tan, S., & Zhao, Y. (2006). Thread theory: A framework applied to content analyses of synchronous computer mediated communication data. International Journal of Instructional Technology & Distance Learning, 3(3). Retrieved from http://www.itdl.org/journal/mar_06/index.htm Shohreh, A. K., & Keesling, G. (2000). Development of a web-based Internet marketing course. Journal of Marketing Education, 22(2), 84–89. doi:10.1177/0273475300222002 Short, J. A., Williams, E., & Christie, B. (1976). The social psychology of telecommunications. London: Wiley. Shulman, L. S. (1986). Those who understand: Knowledge growth in teaching. Educational Researcher, 15(2), 4–14. doi:10.3102/0013189X015002004 Simkin, M. G., & Ramarapu, N. K. (1997). Student perceptions of the peer review process in student writing projects. Journal of Technical Writing and Communication, 27(3), 249–263. Simões, A. P., & de Moraes, A. (2012). The ergonomic evaluation of a virtual learning environment usability. Work (Reading, Mass.), 41, 1140. PMID:22316872 Sinclair, P., Schoch, M., Black, K., & Woods, M. (2011). Proof of concept: Developing a peer reviewed, evidence-based, interactive e-learning programme. Journal of Renal Care, 37(2), 108–113. doi:10.1111/j.1755-6686.2011.00217.x PMID:21561547 Sitzmann, T., Kraiger, K., Stewart, D., & Wisher, R. (2006). The comparative effectiveness of web‐based and classroom instruction: A meta‐analysis. Personnel Psychology, 59(3), 623–664. doi:10.1111/j.1744-6570.2006.00049.x Skramstad, E., Schlosser, C., & Orellana, A. (2012). Teaching presence and communication timeliness in asynchronous online courses. Quarterly Review of Distance Education, 13(3), 183. 526
Compilation of References
Slavin, R. (1996). Research on cooperative learning and achievement: What we know, what we need to know. Contemporary Educational Psychology, 21(2), 43–69. doi:10.1006/ceps.1996.0004 Sloan, C. (2013). 10th Annual Survey of Online Learning: Changing Course: Ten Years of Tracking Online Education in the United States. Retrieved from http://sloanconsortium.org/node/384451 Smith, R. O. (2005). Working with difference in online collaborative groups. Adult Education Quarterly, 55, 182-199. doi:10.1177/0741713605274627 Smith, F. (2012). Analyzing a college course that adheres to the Universal Design for Learning (UDL) framework. Journal of the Scholarship of Teaching and Learning, 12(3), 31–61. Smith, G. G., Sorensen, C., Gump, A., Heindel, A. J., Caris, M., & Martinez, C. D. (2011). Overcoming student resistance to group work: Online versus face-to-face. The Internet and Higher Education, 14(2), 121–128. doi:10.1016/j. iheduc.2010.09.005 Smith, J. A., & Sivo, S. A. (2012). Predicting continued use of online teacher professional development and the influence of social presence and sociability. British Journal of Educational Technology, 43(6), 871–882. doi:10.1111/j.14678535.2011.01223.x Smith, M., & Winking-Diaz, A. (2004). Increasing students’ interactivity in an online course. Journal of Interactive Online Learning, 2(3), 1–25. So, H.-J., & Brush, T. A. (2008). Student perceptions of collaborative learning, social presence and satisfaction in a blended learning environment: Relationships and critical factors. Computers & Education, 51(1), 318–336. doi:10.1016/j. compedu.2007.05.009 Sommers, N. (1980). Revision strategies of student writers and experienced adult writers. College Composition and Communication, 3(4), 378–388. doi:10.2307/356588 Sopko, K. M. (2009). Universal design for learning: Policy challenges and recommendations. Project Forum at National Association of State Directors of Special Education. Alexandria, VA: United States Office of Special Education Programs. Southard, S., Meddaugh, J. & France-Harris, A. (2015). Can SPOC (Self-Paced Online Course) Live Long and Prosper? A Comparison Study of a New Species of Online Course Delivery, Online Journal of Distance Learning Administration, XVIII(2; spring) Stahl, G., Koschmann, T., & Suthers, D. (2006). Computer-supported collaborative learning: An historical perspective. In Cambridge handbook of the learning sciences (pp. 409-426). Stanley, J. (1992). Coaching student writers to become effective peer evaluators. Journal of Second Language Writing, 1(2), 17–233. Stefanidis, D. (2013). Improving surgeon skills with simulator training to automaticity. Retrieved from http://www. physiciansweekly.com/surgeon-skills-and-automaticity-with-simulator-training/ Stefanidis, D., Scerbo, M. W., Montero, P. N., Acker, C. E., & Smith, W. D. (2012). Simulator training to automaticity leads to improved skill transfer compared with traditional proficiency-based training: A randomized controlled trial. Annals of Surgery, 255(1), 30–37. doi:10.1097/SLA.0b013e318220ef31 PMID:21637099 Stein, S. J., Shephard, K., & Harris, I. (2011). Conceptions of e-learning and professional development for e-learning held by tertiary educators in New Zealand. British Journal of Educational Technology, 42(1), 145–165. doi:10.1111/j.14678535.2009.00997.x
527
Compilation of References
Stinson, B. (n. d.). Universal Design and Accessibility for Online Classes. Retrieved from https://www.coursesites.com/ webapps/portal/frameset.jsp?tab_tab_group_id=null&url=/webapps/blackboard/execute/launcher?type=Course&id= _269867_1&url Stodden, R. A., Brown, S. E., & Roberts, K. (2011). Disability-friendly university environments: Conducting a climate assessment. New Directions for Higher Education, 1(154), 83–92. doi:10.1002/he.437 Stone, D., & Heen, S. (2014), “Thanks for the feedback: The Science and Art of Receiving Feedback”, Ed: Viking Stover, K., Kissel, B., Wood, K., & Putman, M. (2015). Examining Literacy Teachers Perceptions of the Use of VoiceThread in an Elementary, Middle School, and a High School Classroom for Enhancing Instructional Goals. Literacy Research and Instruction, 54(4), 341–362. doi:10.1080/19388071.2015.1059911 Straus, S. G., & McGrath, J. E. (1994). Does the medium matter? The interaction of task type and technology on group performance and member reactions. Journal of Applied Psychology, 79, 87-97. doi:10.1037/0021-9010.79.1.87 Strayer, J. F. (2012). How learning in an inverted classroom influences cooperation, innovation and task orientation. Learning Environments Research, 15(2), 171–193. doi:10.1007/s10984-012-9108-4 Strijbos, J. W., & De Laat, M. F. (2010). Developing the role concept for computer-supported collaborative learning: An explorative synthesis. Computers in Human Behavior, 26(4), 495–505. doi:10.1016/j.chb.2009.08.014 Strijbos, J. W., & Weinberger, A. (2010). Emerging and scripted roles in computer-supported collaborative learning. Computers in Human Behavior, 26(4), 491–494. doi:10.1016/j.chb.2009.08.006 Sue, B. S., Sarah, C. W., & Warren, S. H. (2006). Reaching through the screen: Using a tablet PC to provide feedback in online classes. Rural Special Education Quarterly, 25(2), 8-12. Retrieved from http://search.proquest.com.liblink.uncw. edu/docview/227213654?accountid=14606 Sultan, C., Corless, M., & Skelton, R. E. (2000). Tensegrity flight simulator. Journal of Guidance, Control, and Dynamics, 23(6), 1055–1064. doi:10.2514/2.4647 Sun, P. C., Tsai, R. J., Finger, G., Chen, Y. Y., & Yeh, D. (2008). What drives a successful e-Learning? An empirical investigation of the critical factors influencing learner satisfaction. Computers & Education, 50(4), 1183–1202. doi:10.1016/j.compedu.2006.11.007 Swales, J. M., & Feak, C. B. (2004). Academic writing for graduate students: Essential tasks and skills (Vol. 1). Ann Arbor, MI: University of Michigan Press. Swan, K. (2005). A constructivist model for thinking about learning online. In Elements of quality online education: Engaging communities (Vol. 6, pp. 13-31). Swan, K. (2005). A constructivist model for thinking about learning online. In J. Bourne & J. C. Moore (Eds.), Elements of Quality Online Education: Engaging Communities. Needham, MA: Sloan. Retrieved from http://www.kent.edu/rcet/ Publications/upload/constructivist%20theory.pdf Swan, C. (2014). Tech tools for assessing the “soft” skills. Tech & Learning, 34(8), 38–40. Swan, K. (2001). Virtual interaction: Design factors affecting student satisfaction and perceived learning in asynchronous online courses. Distance Education, 22(2), 306–331. doi:10.1080/0158791010220208 Swan, K. (2003). Learning effectiveness online: What the research tells us. Elements of Quality Online Education. Practice and Direction, 4, 13–47.
528
Compilation of References
Swan, K., Garrison, D. R., & Richardson, J. (2009). A constructivist approach to online learning: the Community of Inquiry framework. In C. R. Payne (Ed.), Information technology and constructivism in higher education: Progressive learning frameworks (pp. 43–57). Hershey, PA, USA: IGI Global. doi:10.4018/978-1-60566-654-9.ch004 Swan, K., & Shih, L. F. (2005). On the nature and development of social presence in online course discussions. Journal of Asynchronous Learning Networks, 9(3), 115–136. Szeto, E. (2015). Community of Inquiry as an instructional approach: What effects of teaching, social and cognitive presences are there in blended synchronous learning and teaching? Computers & Education, 81, 191–201. doi:10.1016/j. compedu.2014.10.015 Talbert, R. (2014, January 27). The inverted calculus course: Overture. Casting Out Nines column. The Chronicle of Higher Education. Retrieved from http://chronicle.com/blognetwork/castingoutnines/2014/01/27/the-inverted-calculuscourse-overture/ Tambouris, E., Panopoulou, E., Tarabanis, K., Ryberg, T., Buus, L., Peristeras, V., & Porwol, L. et al. (2012). Enabling problem based learning through web 2.0 technologies: Pbl 2.0. Journal of Educational Technology & Society, 15(4), 238–251. Tam, M. (2000). Constructivism, instructional design, and technology: Implications for transforming distance learning. Journal of Educational Technology & Society, 3(2), 50–60. Tannenbaum, S. I., Mathieu, J. E., & Cannon-Bowers, J. A. (1991). Meeting trainees’ expectations: The influence of training fulfillment on the development of commitment, self-efficacy, and motivation. The Journal of Applied Psychology, 76(6), 759–769. doi:10.1037/0021-9010.76.6.759 Tan, W., Chen, S., Li, J., Li, L., Wang, T., & Hu, X. (2014). A trust evaluation model for e-learning systems. Systems Research and Behavioral Science, 31(3), 353–365. doi:10.1002/sres.2283 Taras, M. (2005). Assessment – Summative and Formative – Some theoretical reflections. British Journal of Educational Studies, 53(4), 466–478. doi:10.1111/j.1467-8527.2005.00307.x Taras, M. (2010). Back to basics: Definitions and processes of assessments. Práxis Educativa, 5(2), 123–130. doi:10.5212/ PraxEduc.v.5i1.123130 Tarhini, A., Hone, K., & Liu, X. (2015). A cross-cultural examination of the impact of social, organisational and individual factors on educational technology acceptance between British and Lebanese university students. British Journal of Educational Technology, 46(4), 739–755. doi:10.1111/bjet.12169 Taylor, P., Morin, R., Cohn, D., Kochhar, R., & Clark, A. (2008). Inside the Middle Class: Bad Times Hit the Good Life. Washington: PewResearch Center. Retrieved from http://www.pewsocialtrends.org/2008/04/09/inside-the-middle-classbad-times-hit-the-good-life/ Taylor-Massey, J. (2015). Redefining teaching: The five roles of the online instructor. Retrieved from http://blog.online. colostate.edu/blog/online-teaching/redefining-teaching-the-five-roles-of-the-online-instructor/ TED. (2011, March). Salman Khan: Let’s use video to reinvent education [Online newsgroup]. Retrieved from www. ted.com/talks/salman_kan_let_s_use.videos_to_reinvent_education.html Tennyson, R. D., & Jorczak, R. L. (2008). A conceptual framework for the empirical study of instructional games. In H. F. O’Neil & R. S. Perez (Eds.), Computer games and team and individual learning (pp. 39–54). Oxford, United Kingdom: Elsevier.
529
Compilation of References
Tess, P. A. (2013). The role of social media in higher education classes (real and virtual)–A literature review. Computers in Human Behavior, 29(5), A60–A68. doi:10.1016/j.chb.2012.12.032 Tharp, R. G., & Gallimore, R. (1988). Rousing minds to life: Teaching, learning, and schooling in social context. Cambridge: Cambridge University Press. Thomas, M., Hilton, A. A., & Ingram, T. (2015, November) (Accepted). Campus environments: their importance and impact. Paper presented at theAssociation for the Study of Higher Education (ASHE) 40th Annual Conference. Thomas, M. J. (2002). Learning within incoherent structures: The space of online discussion forums. Journal of Computer Assisted Learning, 18(3), 351–366. doi:10.1046/j.0266-4909.2002.03800.x Thomas, M., & Reinders, H. (2010). Task-Based Language Teaching and Technology. New York, NY: Continuum. Thomas, M., Reinders, H., & Warschauer, M. (2013). Contemporary Computer-Assisted Language Learning. London: Bloomsbury Publishing Plc. Thompson, R. F. (1981). Peer grading: Some promising advantages for composition research and the classroom. Research in the Teaching of English, 15(2), 172–174. Thorne, S. L. (2003). Artifacts and cultures-of-use in intercultural communication. Language Learning & Technology, 7(2), 38–67. Retrieved from llt.msu.edu/vol7num2/pdf/thorne.pdf Tiedtke, T., Märtin, C., & Gerth, N. (2002). AWUSA-A tool for automated website usability analysis. Proceedings of theWorkshop on Interactive Systems. Design, Specification, and Verification. Rostock, Germany (pp. 12-14). Ting-Toomey, S. (1999). Communicating Across Cultures. New York, NY: The Guilford Press. Tinoca, L., Oliveira, I., & Pereira, A. (2010). Online group work patterns: how to promote a successful collaboration. Proceedings of the Seventh International Conference on Networked Learning (pp. 429–438). Tinto, V. (2008). When access is not enough. The Hispanic Outlook in Higher Education, 19, 13–13. Tobin, T. J. (2014). Increase online student retention with universal design for learning. Quarterly Review of Distance Education, 15(3), 13–24. Tomlinson, C. A., & Moon, T. R. (2013). Assessment and student success in a differentiated classroom. Alexandria, VA, USA: Association for Supervision & Curriculum Development. Topping, K. J. (2009). Peer assessment. Theory into Practice, 48(1), 20–27. doi:10.1080/00405840802577569 Topping, K. J., Smith, E. F., Swanson, I., & Elliot, A. (2000). Formative peer assessment of academic writing between post students. Assessment & Evaluation in Higher Education, 25(2), 151–169. doi:10.1080/713611428 Toto, R., & Nguyen, H. (2009). Flipping the work design in an industrial engineering course.Proceedings of the 39th ASEE/IEEE Frontiers in Education Conference, San Antonio, TX, USA. doi:10.1109/FIE.2009.5350529 Tracey, R. (n. d.). A Framework for Content Curation. Retrieved from https://ryan2point0.wordpress.com/2015/06/17/aframework-for-content-curation/ Triacca, L., Bolchini, D., Botturi, L., & Inversini, A. (2004). MiLE: Systematic Usability Evaluation for E-learning Web Applications. Paper presented at theWorld Conference on Educational Multimedia, Hypermedia and Telecommunications. Triyono, M. B. (2015). The indicators of instructional design for e-learning in Indonesian vocational high schools. Procedia: Social and Behavioral Sciences, 204, 54–61. doi:10.1016/j.sbspro.2015.08.109
530
Compilation of References
Tsai, P. C.-F., Yen, Y.-F., Huang, L.-C., & Huang, C. (2007). A study on motivating employees learning commitment in the post-downsizing era: Job satisfaction perspective. Journal of World Business, 42(2), 157–169. doi:10.1016/j. jwb.2007.02.002 Tu, C. H., & Corry, M. (2003). Designs, management tactics, and strategies in asynchronous learning discussions. The quarterly Review of Distance Education, 4(3), 303-315. Tu, C. (2004). Online collaborative learning communities: Twenty-one design to building an online collaborative learning communities. Libraries Unlimited. Tu, C. H. (2005). From presentation to interaction: New goals for online learning. Educational Media International, 42(3), 189–206. doi:10.1080/09523980500161072 Tu, C.-H., & Corry, M. (2003). Building active online interaction via a collaborative learning community. Computers in the Schools, 20(3), 51–59. doi:10.1300/J025v20n03_07 Tudor, I. (2005). Higher Education language policy in Europe: A snapshot of action and trends [Discussion brief ]. Retrieved from http://web.fu-berlin.de/enlu/ Tullis, T., Fleischman, S., McNulty, M., Cianchette, C., & Bergel, M. (2002). An empirical comparison of lab and remote usability testing of web sites. Proceedings of theUsability Professionals Association Conference. Ullah, H., & Wilson, M. A. (2007). Students’ academic success and its association to student involvement with learning and relationships with faculty and peers. College Student Journal, 41(4), 1192–1202. UNESCO. (1998). Higher Education in the Twenty-first Century Vision and Action Final Report. Retrieved fromunesdoc. unesco.org/images/0011/001163/116345e.pdf UNESCO. (2002). ICT (Information and Communication Technologies) in teacher education, a planning guide. Retrieved from http://unesdoc.unesco.org/images/0012/001295/129533e.pdf University of Arkansas at Little Rock. (n. d.). Ten simple steps toward universal design of online courses. Retrieved from http://ualr.edu/pace/tenstepsud/ Urh, M., Vukovic, G., Jereb, E., & Pintar, R. (2015). The model for introduction of gamification into e-learning in higher education. Procedia: Social and Behavioral Sciences, 197, 388–397. doi:10.1016/j.sbspro.2015.07.154 van der Pol, J., van den Berg, B. A. M., Admiraal, W. F., & Simons, P. R. J. (2008). The nature, reception, and use of online peer feedback in higher education. Computers & Education, 51(4), 1804–1817. doi:10.1016/j.compedu.2008.06.001 Van Leeuwen, A., Janssen, J., Erkens, G., & Brekelmans, M. (2015). Teacher regulation of multiple computer-supported collaborating groups. Computers in Human Behavior, 52, 233–242. doi:10.1016/j.chb.2015.05.058 van Seters, J. R., Wellink, J., Tramper, J., Goedhart, M. J., & Ossevoort, M. A. (2012). A web-based adaptive tutor to teach PCR primer design. Biochemistry and Molecular Biology Education, 40(1), 8–13. doi:10.1002/bmb.20563 Vaughan, N., & Garrison, D. R. (2005). Creating cognitive presence in a blended faculty development community. The Internet and Higher Education, 8(1), 1–12. doi:10.1016/j.iheduc.2004.11.001 Venkatesh, V., & Davis, F. D. (2000). A theoretical extension of the technology acceptance model: Four longitudinal field studies. Management Science, 46(2), 186–204. doi:10.1287/mnsc.46.2.186.11926 Vermunt, J. D. (1996). Metacognitive, cognitive and affective aspects of learning styles and strategies: A phenomenographic analysis. Higher Education, 31(1), 25–50. doi:10.1007/BF00129106
Vestal, W. (2006). Sustaining communities of practice. KM World, 15(3), 8–40. Retrieved from http://www.kmworld.com/Articles/Editorial/Features/Sustaining-communities-of-practice-15159.aspx
Villamil, O. S., & Guerrero, M. C. M. (1996). Peer revision in the L2 classroom: Social cognitive activities, mediating strategies, and aspects of social behaviors. Journal of Second Language Writing, 5(1), 51–75. doi:10.1016/S1060-3743(96)90015-6
Violante, M. G., & Vezzetti, E. (2014). Implementing a new approach for the design of an e-learning platform in engineering education. Computer Applications in Engineering Education, 22(4), 708–727. doi:10.1002/cae.21564
Violante, M. G., & Vezzetti, E. (2015). Virtual interactive e-learning application: An evaluation of the student satisfaction. Computer Applications in Engineering Education, 23(1), 72–91. doi:10.1002/cae.21580
Vitale, A. T. (2010). Faculty development and mentorship using selected online asynchronous teaching strategies. Journal of Continuing Education in Nursing, 41(12), 549–556. doi:10.3928/00220124-20100802-02 PMID:20704095
Vogel, T. (2001). Internationalization, interculturality and the role of foreign languages in higher education. Higher Education in Europe, 26(3), 381–389. Retrieved from http://elearning.surf.nl/e-learning/english/3793
Von Glasersfeld, E. (1995). Radical Constructivism: A Way of Knowing and Learning, SMES (Vol. 6). Taylor & Francis Inc. doi:10.4324/9780203454220
Vonderwell, S., Liang, X., & Alderman, K. (2007). Asynchronous discussions and assessment in online learning. Journal of Research on Technology in Education, 39(3), 309–328. doi:10.1080/15391523.2007.10782485
Vygotsky, L. S. (1978). Thought and language, 1962. In Mind and Society.
Vygotsky, L. S. (1978). Educational implications. In M. Cole, V. John-Steiner, S. Scribner, & E. Souberman (Eds.), Mind in society: The development of higher psychological processes (pp. 79–153). Cambridge, MA: Harvard University Press.
Vygotsky, L. S. (1978). Mind in society: The development of higher psychological processes. Cambridge, MA: Harvard University Press.
Vygotsky, L. S. (1978). Mind in society: The development of higher psychological processes. In M. Cole, V. John-Steiner, S. Scribner, & E. Souberman (Eds.), Mind in society: The development of higher psychological processes. Cambridge, MA: Harvard University Press.
Vygotsky, L. S. (1981). The genesis of higher mental functions. In J. V. Wertsch (Ed.), The concept of activity in Soviet psychology. Armonk, NY: Sharpe.
Vygotsky, L. S. (1987). Thinking and speech. In L. S. Vygotsky (Ed.), Collected Works (pp. 39–285). New York, NY: Plenum Press.
Wade, R. (Ed.). (1997). Community service-learning: A guide to including service in the public school curriculum. Albany, NY: State University of New York Press.
Wang, T. (2008). Web-based quiz-game-like formative assessment: Development and evaluation. Computers & Education, 51(3), 1247–1263. doi:10.1016/j.compedu.2007.11.011
Wang, X., Yang, D., Wen, M., Koedinger, K., & Rosé, C. P. (2015). Investigating How Student's Cognitive Behavior in MOOC Discussion Forums Affect Learning Gains. International Educational Data Mining Society.
Wang, Y.-S., Wang, H.-Y., & Shee, D. Y. (2007). Measuring e-learning systems success in an organizational context: Scale development and validation. Computers in Human Behavior, 23(4), 1792–1808. doi:10.1016/j.chb.2005.10.006
Wankel, C. (2009). Management education using social media. Organizational Management Journal, 6(4), 251–262. doi:10.1057/omj.2009.34
Wanner, T., & Palmer, E. (2015). Personalising learning: Exploring student and teacher perceptions about flexible learning and assessment in a flipped university course. Computers & Education, 88, 354–369. doi:10.1016/j.compedu.2015.07.008
Warnock, S. (2013, April 18). Frequent, low-stakes grading: Assessment for communication confidence. Retrieved from http://www.facultyfocus.com/articles/educational-assessment/frequent-low-stakes-grading-assessment-for-communication-confidence/
Warschauer, M., & Healey, D. (1998). Computers and language learning: An overview. Language Teaching, 31, 57–71. Retrieved from http://www.gse.uci.edu/person/warschauer_m/overview.html
Washington University's Clinical Simulation Center. (n.d.). Clinical simulation center. Retrieved from http://www.simulation.wustl.edu/The-Centers/Clinical-Simulation-Center
Waterworth, J., & Waterworth, E. (1999). Education as exploration: Being, going and changing the world. Proceedings of Didactics of Informatics and Mathematics in Education, Crete.
Watters, A. (2012, August 27). The problems with peer grading in Coursera. Inside Higher Ed. Retrieved from http://www.insidehighered.com/blogs/hack-higher-education/problems-peer-grading-coursera
Weaver, S. J., Salas, E., Lyons, R., Lazzara, E. H., & Rosen, M. A. (2010). Simulation-based team training at the sharp end: A qualitative study of simulation-based team training design, implementation, and evaluation in healthcare. Journal of Emergencies, Trauma, and Shock, 3(4), 369–377. doi:10.4103/0974-2700.70754 PMID:21063560
Weinberger, A., Ertl, B., Fischer, F., & Mandl, H. (2005). Epistemic and social scripts in computer-supported collaborative learning. Instructional Science, 33(1), 1–30. doi:10.1007/s11251-004-2322-4
Weinel, M., Bannert, M., Zumbach, J., Hoppe, H., & Malzahn, N. (2011). A closer look on social presence as a causing factor in computer-mediated collaboration. Computers in Human Behavior, 27(1), 513–521. doi:10.1016/j.chb.2010.09.020
Weiner, M., & Mehrabian, A. (1968). Language within language: Immediacy, a channel in verbal communication. New York: Appleton-Century-Crofts.
Wells, M., & Holland, C. (2016). Flipping learning! Challenges in deploying online resources to flipped learning in higher education. In J. Keengwe (Ed.), Handbook of Research on Active Learning and the Flipped Classroom Model in the Digital Age (pp. 1–18). Hershey, PA: IGI Global. doi:10.4018/978-1-4666-9680-8.ch001
Welsh, E. T., Wanberg, C. R., Brown, K. G., & Simmering, M. J. (2003). E-learning: Emerging uses, empirical results and future directions. International Journal of Training and Development, 7(4), 245–258. doi:10.1046/j.1360-3736.2003.00184.x
Wenger, E. (2000). Communities of practice and social learning systems. Organization, 7(2), 225–246. doi:10.1177/135050840072002
Wenger, E. (1998). Communities of practice: Learning, meaning, and identity. Cambridge, UK: Cambridge University Press. doi:10.1017/CBO9780511803932
Wen, M. L., & Tsai, C.-C. (2006). University students' perceptions of and attitudes toward (online) peer assessment. Higher Education, 27(18), 27–44. doi:10.1007/s10734-004-6375-8
White, S. (2007). Critical success factors for e-learning and institutional change: Some organisational perspectives on campus-wide e-learning. British Journal of Educational Technology, 38(5), 840–850.
doi:10.1111/j.1467-8535.2007.00760.x
Wiggins, G., & McTighe, J. (2005). Understanding by Design. Ontario, Canada: ASCD.
Wilcox, G., & Lock, J. (2014, October). Student perceptions of online practicum. In E-Learn: World Conference on E-Learning in Corporate, Government, Healthcare, and Higher Education (pp. 2059–2064).
Williams, P. E. (2000). Defining distance education roles and competencies for higher education institutions: A computer-mediated Delphi study [Doctoral dissertation].
Williams, K. C., Morgan, K., & Cameron, B. A. (2011). How do students define their roles and responsibilities in online learning group projects? Distance Education, 32(1), 49–62. doi:10.1080/01587919.2011.565498
Wilson, B. G. (Ed.). (1996). Constructivist learning environments: Case studies in instructional design. Englewood Cliffs, NJ: Educational Technology Publications. Retrieved from http://www.google.com.tr/books?hl=tr&lr=&id=mpsHa5f712wC&oi=fnd&pg=PR5&dq=Constructivist+learning+environments:+Case+studies+in+instructional+design&ots=sXhbBjbSOk&sig=cBifRbQXFXjX7wsOrwLISyDdPak&redir_esc=y#v=onepage&q=Constructivist%20learning%20environments%3A%20Case%20studies%20in%20instructional%20design&f=false
Wilson, A. (2012). Effective professional development for e-learning: What do the managers think? British Journal of Educational Technology, 43(6), 892–900. doi:10.1111/j.1467-8535.2011.01248.x
Wilson, B. G. (1996). Constructivist learning environments: Case studies in instructional design. Englewood Cliffs, NJ: Educational Technology.
Wilson, B., Ludwig-Hardman, S., Thornam, C., & Dunlap, J. (2004). Bounded community: Designing and facilitating learning communities in formal courses. The International Review of Research in Open and Distributed Learning, 5(3), 1–22. Retrieved from http://www.irrodl.org/index.php/irrodl/article/view/204 doi:10.19173/irrodl.v5i3.204
Wilson, S. G. (2013). The flipped class: A method to address the challenges of an undergraduate statistics course. Teaching of Psychology, 40(3), 193–199. doi:10.1177/0098628313487461
Wimba. (2009). Wimba voice for higher education. Retrieved from http://www.wimba.com/solutions/higher-education/wimba_voice_for_higher_education
Wirt, J., Choy, S., Rooney, P., Hussar, W., Provasnik, S., & Hampden-Thompson, G. (2005). The Condition of Education, 2005. NCES 2005-094. National Center for Education Statistics.
Wise, A., Saghafian, M., & Padmanabhan, P. (2012). Towards more precise design guidance: Specifying and testing the functions of assigned student roles in online discussions. Educational Technology Research and Development, 60(1), 55–82. doi:10.1007/s11423-011-9212-7
Wise, A., Speer, J., Marbouti, F., & Hsiao, Y.-T. (2013). Broadening the notion of participation in online discussions: Examining patterns in learners' online listening behaviors. Instructional Science, 41(2), 323–343. doi:10.1007/s11251-012-9230-9
Wiseman, A., Haynes, C., & Hodge, S. (2013). Implementing professional integrity and simulation-based learning in health and social care: An ethical and legal maze or a professional requirement for high-quality simulated practice learning? Clinical Simulation in Nursing, 9(10), 437–443. doi:10.1016/j.ecns.2012.12.004
Wisneski, J. E., Ozogul, G., & Bichelmeyer, B. A. (2015). Does teaching presence transfer between MBA teaching environments? A comparative investigation of instructional design practices associated with teaching presence. The Internet and Higher Education, 25, 18–27. doi:10.1016/j.iheduc.2014.11.001
Woelber, J. P., Hilbert, T. S., & Ratka-Krüger, P. (2012). Can easy-to-use software deliver effective e-learning in dental education?
A randomised controlled study. European Journal of Dental Education, 16(3), 187–192. doi:10.1111/j.1600-0579.2012.00741.x PMID:22783845
Wong, J.-S., Pursel, B., Divinsky, A., & Jansen, B. J. (2015). An analysis of MOOC discussion forum interactions from the most active users. Paper presented at the International Conference on Social Computing, Behavioral-Cultural Modeling, and Prediction. doi:10.1007/978-3-319-16268-3_58
Wong, K., Kwan, R., & Leung, K. (2011). An exploration of using Facebook to build a virtual community of practice. In Hybrid Learning (Vol. 6837, pp. 316–324). LNCS.
Wood, K. D., Stover, K., & Kissel, B. (2013). Using digital VoiceThreads to promote 21st century learning. Middle School Journal, 44(4), 58–64. doi:10.1080/00940771.2013.11461865
World Links. (2007). Final report on the Asian policy forum on ICT integration into education. Retrieved from http://cache-www.intel.com/cd/00/00/38/07/380769_380769.pdf
Xie, K. (2013). What do the numbers say? The influence of motivation and peer feedback on students' behaviour in online discussions. British Journal of Educational Technology, 44(2), 288–301. doi:10.1111/j.1467-8535.2012.01291.x
Xie, K., Yu, C., & Bradshaw, A. C. (2014). Impacts of role assignment and participation in asynchronous discussions in college-level online classes. The Internet and Higher Education, 20, 10–19. doi:10.1016/j.iheduc.2013.09.003
Yalcinalp, S., & Gulbahar, Y. (2010). Ontology and taxonomy design and development for personalised web-based learning systems. British Journal of Educational Technology, 41(6), 883–896. doi:10.1111/j.1467-8535.2009.01049.x
Yang, J., Schneller, Ch., & Roche, S. (2015). The Role of Higher Education in Promoting Lifelong Learning. UNESCO Institute for Lifelong Learning.
Yang, J. C., Chen, C. H., & Jeng, M. C. (2010). Integrating video-capture virtual reality technology into a physically interactive learning environment for English learning. Computers & Education, 55(3), 1346–1356. doi:10.1016/j.compedu.2010.06.005
Yang, S. H., Yang, L., & He, C. H. (2001). Improve safety of industrial processes using dynamic operator training simulators. Process Safety and Environmental Protection, 79(6), 329–338. doi:10.1205/095758201753373096
Yasir, E. A. M., & Sami, M. S. (2011). An approach to adaptive e-learning hypermedia system based on learning styles (AEHS-LS): Implementation and evaluation. International Journal of Library and Information Science, 3(1), 15–28.
Yeh, Y.-C., & Lin, C. F. (2015). Aptitude-treatment interactions during creativity training in e-learning: How meaning-making, self-regulation, and knowledge management influence creativity. Journal of Educational Technology & Society, 18(1), 119–131.
Yien, J. M., Hung, C. M., Hwang, G. J., & Lin, Y. C. (2011). A games-based learning approach to improving students' learning achievements in a nutrition course. Turkish Online Journal of Educational Technology, 10(2).
Youn, S., & Vachon, M. (2005). An investigation of the profiles of satisfying and dissatisfying factors in e-learning. Performance Improvement Quarterly, 18(2), 97–113.
Zhang, D., Landmark, L., Reber, A., Hsu, H., Kwok, O., & Benz, M. (2010). University faculty knowledge, beliefs, and practices in providing reasonable accommodations to students with disabilities. Remedial and Special Education, 31(4), 276–286. doi:10.1177/0741932509338348
Zhang, D., & Nunamaker, J. F. (2003). Powering e-learning in the new millennium: An overview of e-learning and enabling technology. Information Systems Frontiers, 5(2), 207–218. doi:10.1023/A:1022609809036
Zhang, S. (1995).
Reexamining the affective advantage of peer feedback in the ESL writing class. Journal of Second Language Writing, 4(3), 209–222. doi:10.1016/1060-3743(95)90010-1
Zhao, Y., Lei, J., Yan, B., Lai, C., & Tan, H. S. (2005). What makes the difference? A practical analysis of research on the effectiveness of distance education. Teachers College Record, 107(8), 1836–1884. doi:10.1111/j.1467-9620.2005.00544.x
Zheng, S., Han, K., Rosson, M. B., & Carroll, J. M. (2016). The role of social media in MOOCs: How to use social media to enhance student retention. Paper presented at the Proceedings of the Third (2016) ACM Conference on Learning @ Scale, Edinburgh, Scotland, UK. doi:10.1145/2876034.2876047
Zheng, L., & Huang, R. (2016). The effects of sentiments and co-regulation on group performance in computer supported collaborative learning. The Internet and Higher Education, 28, 59–67. doi:10.1016/j.iheduc.2015.10.001
Zheng, Y., & Yano, Y. (2007). A framework of context-awareness support for peer recommendation in the e-learning context. British Journal of Educational Technology, 38(2), 197–210. doi:10.1111/j.1467-8535.2006.00584.x
Zhu, C., Valcke, M., Schellens, T., & Li, Y. (2009). Chinese students' perceptions of a collaborative e-learning environment and factors affecting their performance: Implementing a Flemish e-learning course in a Chinese educational context. Asia Pacific Education Review, 10(2), 225–235. doi:10.1007/s12564-009-9021-4
Zimmerman, B. J. (1990). Self-regulated learning and academic achievement: An overview. Educational Psychologist, 25(1), 3–17. doi:10.1207/s15326985ep2501_2
Zimmerman, B. J. (2002). Becoming a self-regulated learner: An overview. Theory into Practice, 41(2), 64–70. doi:10.1207/s15430421tip4102_2
Zlotkowski, E. (Ed.). (1998). Successful service-learning programs. Bolton, MA: Anker.
Zlotkowski, E., Longo, N., & Williams, J. (2006). Students as colleagues: Expanding the circle of service-learning leadership. Providence, RI: Campus Compact.
Zull, J. E. (2011). From brain to mind. Sterling, VA: Stylus Publishing.
Zydney, J. M., Stegeman, C., Bristol, L., & Hasselbring, T. S. (2010). Improving a multimedia learning environment to enhance learners' learning, transfer, attitudes and engagement. International Journal of Learning Technology, 5(2), 147. doi:10.1504/IJLT.2010.034547
About the Contributors
Carl Moore is currently the Acting Director of the Research Academy for Integrated Learning (RAIL) at the University of DC. Prior to his current role, he served as an adjunct assistant professor in the College of Education as well as the Assistant Director of the Teaching and Learning Center at Temple University. He has a Doctorate of Education in Urban Education from Temple University and a Master of Arts in Higher Education Administration from The Ohio State University. His dissertation investigated how exemplary college faculty employ Universal Design for Learning principles in their teaching practices. Carl has been teaching for over 12 years and has created and instructed a variety of courses in education at Temple, Cabrini College, and Arcadia University in both face-to-face and online formats. Prior to his career in faculty development, Carl served in a number of student services roles that focused on providing individual and institutional support to retain and advance the success of college students. As a self-described social justice advocate and “techie,” the sum of Carl’s passion lies in the development of programs on teaching with technology and inclusion in higher education.
***
Noha Altowairiki is a PhD Candidate in Educational Technology in the Werklund School of Education at the University of Calgary. Her PhD research focuses on developing online teaching capacity to implement the Universal Design for Learning principles in the design and facilitation of online graduate programs. She is also interested in online collaborative learning and educational development for online instruction.
Albena Antonova is a lecturer at Sofia University, Faculty of Economics and Business Administration, in the field of ICT implementation in business and economics. She works on a number of research projects related to technology implementation, serious games, knowledge management, and others. Her main research interests include knowledge management, emerging technologies, serious games, business and management education, service science, e-learning and TEL models and methods, e-business, technology entrepreneurship, open innovation and innovation management, and living labs; she has more than 30 publications in the field.
Handan Atun is an Information Technology teacher at the Turkish Ministry of Education. She graduated from Middle East Technical University with a bachelor’s degree in Computer Education and Instructional Technology in 2013. After graduation, she started her career in Konya, where she is still working. Since 2015, she has attended the post-graduate program in the Department of Computer Education and Instructional Technology at Necmettin Erbakan University, where she continues her research.
Daisyane Barreto is an Assistant Professor at the University of North Carolina Wilmington (UNCW). Before coming to the U.S. to pursue her graduate degree, Dr. Barreto was a middle school teacher/media specialist in a private school in Brazil. In addition to teaching experiences, Dr. Barreto had opportunities to collaborate in innovative projects such as coordinating an exchange program between Brazil and the USA to integrate technology and promote multicultural awareness in K-12 environments. Dr. Barreto also worked as a multimedia developer for the UGA Center for Teaching and Learning as well as the UGA J.W. Fanning Institute. Her interests involve game-based learning, distance education, technology integration, and multimedia design and production.
Imed Bouchrika received his BSc and PhD degrees in Electronics and Computer Science from the University of Southampton (United Kingdom) in 2004 and 2008, respectively. Since 2008, he has worked as a research fellow at the Information: Signals, Images, Systems Research Group of the University of Southampton. He is now a lecturer of Computer Science at the University of Souk Ahras. His research areas are human-computer interaction, image processing, and biometrics.
Regina Brautlacht is a Senior Lecturer in English and Business Communication at Bonn-Rhein-Sieg University of Applied Sciences. She heads the English Language and Business Communication Programme for the Management Sciences Department and is responsible for curriculum design. She has recently started publishing about her international online projects that raise issues in competencies (intercultural, collaboration, EFL). She holds an MA in Educational Media from the University of Duisburg-Essen. She is an appointed member of the university's teaching and learning commission and a member of the e-learning task force.
Sheri Anderson Conklin graduated from UNC Wilmington with a BA in Art History. While completing her undergraduate degree, she worked as a substitute teacher. She then pursued a certification as a special education teacher, concentrating in the areas of Learning Disabilities and Mental Retardation. As a special education teacher, she was able to identify and adopt various assistive technology tools. In May 2008, she graduated with a Master of Science degree in Instructional Technology from UNCW. In January 2009, Sheri began assisting faculty with online development as an Instructional Designer with the Office of e-Learning in the Academic Affairs Division. Currently, Sheri is a doctoral candidate at Boise State University.
Luciano da Rosa dos Santos is at the Werklund School of Education, University of Calgary, supporting the development of strategies and initiatives aimed at enhancing the quality of teaching and learning. He is also working towards his PhD in Educational Technology at the University of Calgary, where he is conducting research on how universities and faculties interact for the support and development of online teaching capacity among academic staff.
Nouzha Harrati received her bachelor's degree in computer science from the University of Annaba, Algeria, and obtained a Magister degree in Computer Science from the University of Souk Ahras. Harrati is now working towards her PhD degree at the Image Processing Research Group at the University of Bejaia. She is an assistant lecturer of Computer Science. Her research includes usability analysis, e-learning, affective computing, and automated classification of facial expressions.
Rachelle Harris’ professional experience includes teaching graduate writing courses, English Literature, and facilitating secondary online instruction. Her academic interests include online learning, instructional design, and gender dynamics in online and face-to-face learning environments.
S. Laurie Hill, PhD, is an Assistant Professor of Education at St. Mary’s University. Her research interests include pre-service teacher education and, specifically, the connections between on-campus coursework and field practicum experiences. She is also interested in pre-service teacher professional identity, student transitions, and the variety of learning environments that support undergraduate student success.
Barbi Honeycutt is a speaker, scholar, and author. She facilitates workshops, shares strategies, and develops resources to support educators who want to involve students, increase engagement, and improve learning. She provides ideas, insights, and inspiration for educators on topics including teaching, active learning, student engagement, instructional design, faculty development, and professional development for graduate students and postdoctoral scholars. Over the past 16 years, Dr. Honeycutt has facilitated more than 3,000 workshops, presentations, and professional-development events for more than 10,000 graduate students, postgraduate scholars, faculty members, and leaders representing nearly every profession within industry, education, government, and nonprofit settings. In 2011, she created FLIP It Consulting, where she supports educators in creating engaging learning environments using the FLIP, or “Focus on your Learners by Involving them in the Process.” The FLIP means reversing how learning environments are designed so learners focus on higher-level learning outcomes to increase engagement and enhance learning during in-class time. Dr. Honeycutt “practices what she teaches”: she FLIPs her own events so you can see and feel how the inverted-instruction model works from the learner’s point of view.
Plama Hristova is a Chief Assistant Professor in Organizational Behaviour at Sofia University. Dr. Hristova also lectures and leads seminar classes at the VUZF University. She is a psychodrama therapist and a CBT therapist at the International Medical Center, Sofia. She is an international coordinator of three bilingual scientific journals: Journal of Education, Child Development and Counselling; Journal of Management, Consulting and Organizational Development; and Journal of Innovative Behavior, Entrepreneurship and Sustainable Development. Dr. Hristova is an observer member of the Bulgarian Academy of Sciences and Arts, and a member of the Bulgarian Association of Psychodrama and Group Therapy, the Institute for Psychodrama Practice “Chiron,” and the Bulgarian Association of Cognitive-Behavioral Psychotherapies. Her research interests are immigration, humanitarian action, professional stress, organizational behavior, leadership, entrepreneurship, and innovation.
Carol Johnson holds a PhD in Educational Technology from the Werklund School of Education (Calgary, Alberta, Canada) and was the 2014-2015 Werklund Doctoral Fellow. Her doctoral research focused on the development of an online music education model for effective teaching and learning. Linked to her dissertation research, Carol is also researching online faculty professional development to support understanding of how to effectively deploy the resulting online teaching framework.
An established curriculum writer, Carol has published numerous music book series for elementary through professional music students. Additionally, she mentors higher education administrators and professors in their transition to teaching in the online environment.
Kijpokin Kasemsap received his BEng degree in Mechanical Engineering from King Mongkut’s University of Technology, Thonburi, his MBA degree from Ramkhamhaeng University, and his DBA degree in Human Resource Management from Suan Sunandha Rajabhat University. He is a Special Lecturer in the Faculty of Management Sciences, Suan Sunandha Rajabhat University, based in Bangkok, Thailand. He is a Member of the International Association of Engineers (IAENG), the International Association of Engineers and Scientists (IAEST), the International Economics Development and Research Center (IEDRC), the International Association of Computer Science and Information Technology (IACSIT), the International Foundation for Research and Development (IFRD), and the International Innovative Scientific and Research Organization (IISRO). He also serves on the International Advisory Committee (IAC) for the International Association of Academicians and Researchers (INAAR). He has published numerous original research articles in top international journals, conference proceedings, and books on the topics of business management, human resource management, and knowledge management.
Arlene King-Berry is Professor of Special Education at the University of the District of Columbia, where she also serves as Chair of the Faculty Senate and of the Institutional Review Board. She is a dynamic educator, lawyer, writer, and Principal Investigator for federal and local grants, with 25+ years of organizational, instructional, and business management experience. A seasoned conference presenter, King-Berry has made national and international presentations in venues such as the Oxford Roundtable, Oxford, England; the Howard University Scientific Symposium, Bahia, Brazil; and the USDOE Special Education Conference, Washington, DC. She is actively engaged in scholarly publication: Allyn & Bacon published her Special Education Law Manual, and she has published in peer-reviewed journals (e.g., Journal of Negro Education and Black History Bulletin).
Agah Korucu is a computer engineer and instructional technologist who works as an Assistant Professor at Necmettin Erbakan University. He also serves as an academic consultant for various companies. He has conducted many studies on student and educator engagement, dynamic web technologies, academic achievement, collaborative technologies, ICT integration, augmented reality, fuzzy logic, and their impacts on teaching. Dr. Korucu has published widely in his research areas (book chapters, papers, study reports, etc.). He has also attended many international conferences and organised several of them.
Kristin Koskey is an Associate Professor in the LeBron James Family Foundation College of Education at The University of Akron, where she co-developed the fully online graduate program in Assessment and Evaluation. She teaches courses in evaluation, assessment, and statistics. Two of her courses are Quality Matters recognized for online learning. Her work is published in leading journals such as Studies in Educational Evaluation, Journal of Applied Measurement, Journal of Mixed Methods Research, Journal of Experimental Education, and International Journal of Qualitative Methods. Further, she has authored book chapters on Norming and Scaling for Automated Essay Scoring and Data-driven STEM Assessment. Finally, Dr.
Koskey has secured grant funding from the Ohio Department of Education and the National Science Foundation, and has contributed to evaluations of grants funded by the ODE, the Ohio Board of Regents, the U.S. Department of State, the NSF, and local foundations.
Susan N. Kushner Benson is an associate professor in the LeBron James Family Foundation College of Education at the University of Akron. She has been teaching fully online since 2001 and is the co-designer of a fully online master’s degree in Assessment, Evaluation, and Data Literacy. She is the recipient of the College of Education Outstanding Teaching Award. Dr. Kushner Benson’s areas of expertise include classroom assessment, the pedagogy of online teaching and learning, and applied research methods. Her work is published in journals such as Journal of Educational Computing Research, Journal of Online Learning and Teaching, and Social Psychology of Education. Dr. Kushner Benson is a co-author of the book Decision-Making in Planning and Teaching, and the book chapter The Essential Role of Pedagogical Knowledge in Technology Integration for Transformative Teaching and Learning.
Ammar Ladjialia is a research fellow at the Image Processing research group at the University of Souk Ahras, where he works as a computer science lecturer. His research interests include gestural interaction for human-computer interaction.
Jieun Lim is a doctoral student in the Learning Design and Technology program at Purdue University. Her research interests include effective design and facilitation strategies for online learning, the Community of Inquiry, and technology integration for learning.
Jennifer Lock, PhD, is a Professor and the Associate Dean of Teaching and Learning in the Werklund School of Education at the University of Calgary, Canada. Her area of specialization is in online learning, ICT integration, change and innovation, and educational development in higher education.
Zohra Mahfouf is a research fellow at the Image Processing research group at the University of Souk Ahras. She is working towards her PhD in the area of Image Processing and Human Computer Interaction.
Maria de Lurdes Martins is an Assistant Lecturer at the Polytechnic Institute of Viseu, where she teaches Business English and English for Tourism to undergraduate students. She holds a PhD in Linguistics (2012) from the University of Aveiro. Her research interests include Web 2.0 enhanced foreign language learning, dialogical and dialectical language learning, and social networked language learning.
Walter Nuninger (Senior Lecturer) is a Chartered Engineer (1993) with a PhD in Automatic Control awarded in 1997. He further worked as a research engineer at ALSTOM, where he developed his expertise in friction and traction for trains. Since 1999, he has been an Associate Professor at the University of Lille. He works at the engineering school Polytech’Lille, where he directed the Production Department dedicated to Lifelong Learning with Work Integrated Learning (2008-2011). He was then commissioned for management and financial control at the CUEEP until 2014, in charge of Continuing Education. He teaches automatic control, computer science, data mining and mathematics... and is involved in learner-centred pedagogy with hybrid courses, using digital support. He is a tutor of learners in Formative Work Situations in the industry, guiding their reflexive attitude. Through the years, he has had several experiences in management, leadership, and financial control in a quality framework. He is interested in organization, inter-culturalism, and excellence. Since 2014, he has been a member of the Harassment Prevention Unit, offering advice and guidance to university students and staff.
Lori Ogden is a Teaching Assistant Professor at West Virginia University and a Faculty Associate with the WVU Teaching and Learning Commons. She is the course coordinator for Precalculus and has taught a variety of undergraduate mathematics courses, including College Algebra and Applied Calculus. Her research interests include course design and development, online and blended learning environments, and teacher education.
Larisa Olesova, Ph.D., is an instructional designer and adjunct professor specializing in distance education at George Mason University. Her research focuses on the effectiveness of instructional strategies in online learning environments.
Christopher P. Ostrowski is a PhD student in Learning Sciences in the Werklund School of Education. His recent research focused on understanding the lived experiences of university students with visual impairments and their use of technology to support learning. He is also interested in educational development approaches for implementing universal design for learning to improve teaching and learning experiences in higher education.
Beth Allred Oyarzun earned a PhD in Instructional Design and Technology from Old Dominion University in May 2016. Beth works as an Instructional Designer in the Office of e-learning at the University of North Carolina Wilmington, in addition to being an adjunct lecturer for the University of North Carolina at Charlotte. She has worked in the higher education environment and taught online courses for more than ten years. Beth was previously a high school mathematics teacher.
Franca Poppi is an Associate Professor of English Languages and Translation at the University of Modena and Reggio Emilia, where she is also Director of the Master Degree Programme in Languages for Communication in International Enterprises and Organizations. She has published on various aspects of teacher-learner interaction, learner autonomy, and advising in self-instruction. Her current research centres on English as an international Lingua Franca.
Yufeng Qian is a faculty member in the Doctor of Education program at Northeastern University. She teaches courses in quantitative research design and data analysis and advises doctoral research studies. Her fields of research include emerging technologies, online education, and teaching effectiveness. Dr. Qian is the author of a number of book chapters and journal articles on 3D virtual learning environments, game-based learning, and digital media literacy. Prior to joining Northeastern University, Dr. Qian worked as an Associate Professor of Education at St. Thomas University. Her prior experiences in higher education include Dartmouth College, Grand Valley State University, Lehigh University, SUNY at Buffalo, and Beijing Capital University of Economics and Business.
Cristiane Rocha Vicentini holds master's degrees in TESOL and Instructional Design and Technology. She has focused her career on English language instruction, with experience teaching English language learners as well as K-12 pre-service teachers.
Enilda Romero-Hall is an Assistant Professor of Instructional Design & Technology in the Department of Education at The University of Tampa. In her research, Dr. Romero-Hall is currently exploring different web and computer-based instruction and learning. Her research interests include multimedia
instruction, human-computer interaction, social media in teaching and learning, and distance/online education. Dr. Romero-Hall obtained her Ph.D. in Education with a concentration in Instructional Design & Technology from Old Dominion University. As part of her doctoral studies, Dr. Romero-Hall completed a Certificate in Modeling & Simulation in Education and Training. Dr. Romero-Hall is a past Link Foundation Fellow.
JoAnne Dalton Scott is an Instructional Design Practitioner and Researcher with research interests that include topics related to aging learners, such as the normal developmental progression of the aging brain, the cognitive aptitude of the aging, and the needs of the aging learner. She is currently exploring how these topics intersect so as to promote optimally designed learning experiences for aging learners. JoAnne is a graduate of the University of Tampa, where she earned her Master’s Degree in Instructional Design & Technology in 2014. She was awarded the Nova Southeastern University Award for Outstanding Practice by a Graduate Student in Instructional Design for her design and development of a course titled “Principles of Learner Motivation.” As a presenter at AECT International Conventions, JoAnne has spoken on topics that include increasing motivation through the use of andragogical principles and maximizing learning experiences for geriatric learners. In 2016, she will serve AECT as the Communications Officer for the Research and Theory Division.
Neal Shambaugh is a Professor of Learning Sciences and Human Development. He is a former Associate Dean of Academic Affairs and a Graduate Programs Coordinator of Instructional Design and Technology (IDT) in the College of Education and Human Services at West Virginia University. The master’s IDT program has been delivered 100% online since 2010, and he teaches courses in IDT and Educational Psychology. He has been a university liaison to an elementary/middle public school for a dual-degree, five-year teacher education program. Prior to academia, his work experience included radio station ownership, engineering, announcing, and sales, as well as work as a training program consultant and an audio and video producer.
Geraldine Stirtz, MA.Ed., Director of the Office for Service-Learning, serves as Senior Lecturer at the University of Nebraska at Kearney (UNK). Her interest and involvement are in the field of academic service-learning. Based in the College of Education, she has worked exclusively in the area of service-learning for the past 25 years with teacher candidates completing a required community-based service-learning experience. She has published journal articles and presented nationally and regionally, and mentors individuals on other campuses in service-learning pedagogy. She serves as the UNK representative to the Great Plains Regional Campus Compact. Her interest is focused on training and promoting service-learning pedagogy by engaging Higher Education and K-12 faculty and administrators in training experiences.
Morris Thomas, PhD, has a background in Instructional Technology Management, Higher Education Administration & Student Affairs, Accessibility & Disability Services, as well as Teaching & Learning. He serves as an Associate Professor at the University of the District of Columbia in the Research Academy for Integrated Learning. His professional/academic experience includes service as a trainer and developer, consultant, instructor, administrator, researcher, and speaker.
Morris is also the Founder and Executive Director of Excellence Enterprise, a consulting firm. He is the author of Focus: The Missing Factor; A Practical Guide To Accomplishing Your Goals. Morris’ research interests include institutional/learning environments, teaching & learning, and diversity.
Thomas J. Tobin is an instructional designer at the Pennsylvania State University in State College, PA. In the field of online-course and -program quality, he is best known for his work on administrative-evaluation techniques; his article on “Best Practices for Administrative Evaluation of Online Faculty” (2004) is considered a seminal work in the field, and has been cited in more than 150 publications. His latest work is Evaluating Online Teaching: Implementing Best Practices (Wiley, 2015) with B. Jean Mandernach and Ann H. Taylor. He is currently writing Reach Everyone, Teach Everyone: A Practitioner’s Guide to Universal Design for Learning in Higher Education, expected from West Virginia University Press in 2017. Since the advent of online courses in higher education in the late 1990s, Tom’s work has focused on using technology to extend the reach of higher education beyond its traditional audience. He advocates for the educational rights of people with disabilities and people from disadvantaged backgrounds. Tom serves on the editorial boards of InSight: A Journal of Scholarly Teaching, the Online Journal of Distance Learning Administration, and the Journal of Interactive Online Learning, and he is an internationally-recognized speaker and author on topics related to quality in distance education, especially copyright, evaluation of teaching practice, academic integrity, and accessibility/universal design for learning.
Nikolina Tsvetkova, PhD, has many years of experience in teacher training, materials design, and teaching English across a variety of contexts and levels. She has defended a PhD thesis in the sphere of developing intercultural competence and continues working in this field. Currently, she teaches EU terminology to European Studies BA and MA students at Sofia University. She has been and is currently involved in teacher training for the British Council and the EI Centre (Sofia, Bulgaria). She has researched and published on issues of intercultural education and intercultural dialogue, implementing ICT solutions in different educational settings, and the EU dimensions of teacher training, and has delivered talks and workshops on these topics at various academic events. Nikolina Tsvetkova has also acted as a manager and a team member on a number of EU-funded and national projects. She is a member of the editorial board of the Foreign Language Education journal issued by the Bulgarian Ministry of Education.
Lan Vu is ABD in Rhetoric and Composition in the English Department at Southern Illinois University, Carbondale, USA. Her research interests include composition, e-learning, testing and assessment, and technology integration in education.
Index
A accommodation 454-456, 470 ADDIE model 162-163, 302 asynchronous assessment 95-97 asynchronous discussion 25, 41, 45, 86, 91, 93, 97-98, 102, 160-161, 187, 220, 222, 227, 264, 271, 280 autonomy 91, 119, 316, 332-334, 336, 344, 347, 351, 356, 358-359, 363, 402-403, 417-418
B Blended Courses (Hybrid Courses) 363 blended learning 11, 24, 100, 141, 219, 352, 428, 430, 433, 447, 452 Bloom’s Revised Taxonomy 241, 267-268, 280 Bloom’s Taxonomy of Learning Domains 453, 470
C CAST 221-222, 226, 235, 323, 456-457, 460-461, 463 citizenship 60-62, 69, 74, 80, 396, 399-400 coaching 93, 162, 225, 228-229, 235 cognitive levels 182, 241, 253 collaboration 2, 12, 22, 40-41, 43, 52, 56, 60, 63, 68, 72, 75, 78, 81-82, 89-94, 101, 119, 134, 152, 154-161, 167, 220, 227-228, 237, 248, 272-273, 275, 277, 316-317, 356, 368, 374, 393-394, 397-398, 401, 406, 409, 411, 413-415, 418-419, 441, 458, 462 collaborative learning 5-9, 12-13, 20-22, 30, 117, 119, 129, 154, 156, 158, 160, 165, 167, 177, 266, 272, 371, 417-418 Community of Inquiry 20, 22, 107, 115, 120, 151, 153-156, 169, 177 Community of Inquiry framework 120, 151, 153, 177 Community of Practice 226, 229, 231, 235, 332-333, 342, 359 composition MOOCs 178, 180-181, 183-186, 192, 211-212
computer simulation 236-244, 248-249, 251, 253255, 262 constructive learning 1-2 Constructivist Learning Environment (CLE) 3 Constructivist learning theories 42 contextual interaction 271-273, 276, 278, 280 Continuous Vocational Training (CVT) 333, 363 critical reflection 62, 68-69, 155, 402
D DELF 393, 396, 398, 400-401, 414, 418-419 designed interaction 263-264, 271-273, 275-278, 280 development project 338, 357 Digital competences 334, 355, 359 Directed Google+ Community 40-41, 43-44, 51, 56, 59 Directed Google+ Community model (DG+) 40-41, 43, 59 discussion board 19, 25, 40-41, 43-45, 56, 59, 65, 93, 110, 113, 120, 269, 306
E educational development 157, 169, 219-220, 223-226, 228-229, 231-232, 235 e-learning 129-130, 132, 141, 152, 177, 236, 265, 268, 318, 333-340, 345, 354, 359, 367-377, 391, 394, 427-436, 441-442, 447 e-learning systems 367, 369-371, 373-374, 427-428, 430, 433, 435, 441-442 ELF 393, 397-398, 400-402, 408, 417, 419 emergent roles 21 emotional valence 455-458, 460, 470 experiential learning 62, 130, 254-255
F faculty 61, 65, 67-70, 72-73, 81-83, 108, 120, 153, 158, 162-163, 179, 235, 249, 251, 265-266, 283, 296,
Index
298, 304-306, 308-309, 326, 407, 433, 449-455, 457-458, 460-462, 464-466 fidelity 238-240, 247-249, 262, 441 field experience 218-219, 224-225, 227-228, 231232, 235 flipped classroom 281-284, 286, 291-292, 295-296, 298-299, 303, 428, 433, 449-454, 461-462, 464, 470
G game-based learning 128-130, 132, 146, 149, 237 global communicative competence 397 Google+ 40-41, 43-44, 46-48, 51-56, 59 Google+ community 40-41, 43-44, 46-48, 51-56, 59
H higher education 20, 22, 41-42, 61, 70-73, 81, 83, 85, 97, 106, 136, 144-145, 157, 160, 178, 187, 218-220, 222-224, 232, 236, 238, 240-241, 244, 252-254, 298, 305-306, 308, 331-333, 340, 342, 363, 367-368, 371, 373, 375-376, 395, 403, 407, 427-428, 442, 449, 451-452, 455, 457-458 hybrid courses 43, 49, 295, 298, 363 Hybridization 338, 363
I immediate feedback 4, 41, 182, 240, 244, 247-248, 251, 255, 358 Information Technology 367, 391 instructional design 43, 49-50, 53, 56, 97-98, 118, 134, 152, 156, 162, 167, 254, 264, 281-282, 284-285, 288, 292-294, 298, 302-303, 308, 371, 434, 453 instructional design model 50, 162, 254 instructor immediacy 106-107, 112-113 Instructor-grading Scores 178 instructors 1, 10, 13, 19-20, 22, 41-43, 45, 47, 49, 52, 56, 59, 86, 88-90, 93-95, 97, 101, 107-108, 110-117, 119-120, 151-153, 156-158, 160, 162-167, 169, 178-179, 183-187, 190-192, 195, 207-213, 218221, 223-225, 227-232, 245-246, 254, 264-268, 270, 272-273, 275-278, 281, 283-284, 287, 289, 297, 299, 303, 305-317, 320, 322-324, 326-327, 330, 363, 371, 373-374, 377, 406-407, 430, 433, 449, 452, 459, 461-462, 465 instructor-student interaction 270, 273, 275, 280 Integrated Learning Environment 334-335, 351-352, 355, 359 interaction 2-7, 9-10, 13, 20-23, 40, 43, 55, 85-86, 88,
546
94, 96-97, 107-109, 112, 117-118, 120, 128, 132, 134, 137-139, 141, 143, 149, 154, 156, 158, 160, 165-166, 212, 227, 229, 235, 238-240, 245, 254255, 263-264, 266, 268-278, 280, 282, 286, 291, 298, 306-308, 313, 315, 318-319, 335, 337-338, 341-342, 345, 354, 356, 359, 363, 370-375, 398, 401-402, 409, 411, 427, 430, 433, 435-442, 451, 455, 461-462 interactive tool 96
K knowledge management 40, 43-44, 375-376, 392
L learning content 119, 162-163, 177, 181, 222, 226, 372-375, 464 learning management system (LMS) 41, 44, 59, 283, 363, 431-432, 462 Learning Management System (LMS) Shell 59 learning outcome-based categorization 236, 238, 240, 253, 262 learning outcomes 13, 23, 25, 32, 40-41, 51-52, 56, 68, 109, 112-113, 117, 137, 152, 155-157, 163-165, 167, 177, 230-231, 236-237, 239, 241, 245, 248, 251-255, 268, 271, 280, 286, 290, 295, 298-299, 308, 327, 332-333, 335-338, 340-341, 345, 347, 351, 355-357, 359, 363-364, 400, 450, 453, 464 Lifelong Training 363 LMS 41, 43-44, 46, 48-49, 53, 59, 96, 119, 187, 190, 283, 288, 290, 294, 296, 308, 316, 319-320, 327, 331-332, 335, 338, 344-345, 348, 363-365, 368, 372, 412, 431-434, 462
M Micro-Learning 448 M-Learning 338, 448 mobile 114, 132, 158, 413, 432, 448, 451, 458-460 modeling-based simulation 241-242, 244-245, 251, 262 models of teaching framework 281-282, 284, 286, 298, 303 motivation 8, 42, 53, 86, 88-89, 95-96, 107, 109-110, 112, 119-120, 131, 157, 162-163, 266, 269, 276277, 282-284, 295-296, 298, 305-306, 310, 313, 344, 357, 364, 367, 369-370, 372, 376, 402, 411, 414, 417-419, 434, 457, 459-461 multimodal 85-86, 88-102, 180, 400, 418
O
Index
ONAAG 333, 339, 342-351, 354, 363-366 online discussions 19-33, 85-91, 94, 99-101, 154-155, 164, 167, 268-269, 278 online education 56, 88, 106, 182, 254-255, 267, 271, 368, 371, 428, 434 online learning 1, 4, 6, 8-9, 19, 21-22, 41, 43, 73, 85-86, 88, 95, 101-102, 106-107, 115, 117-118, 151161, 163-165, 168, 177-179, 186-187, 218-220, 224-226, 228, 231, 235, 253-255, 266-267, 269, 271-272, 304-310, 312, 314-317, 322, 326-327, 331-333, 339, 351, 354, 357, 371-372, 374, 428, 430-431, 433, 435 online learning environments 6, 19, 22, 95, 101, 115, 151, 154, 156-157, 218-220, 224-226, 228, 304309, 327, 372 online teaching 117, 120, 151-152, 154, 157, 164-165, 177, 223-225, 228-229, 231, 263-265, 268, 304, 308, 327, 368
P pedagogical framework 127, 129, 131, 134-136, 140141, 145-146, 149, 345 peer assessment 96, 178-179, 181-188, 190-194, 196, 198-201, 203-206, 208-212 peer review 40, 43, 54, 98, 166, 180-181, 183, 185 peer-grading scores 178, 210 Personal Learning Environment 332, 334-335 Play-Test-Get-Feedback 149 Plus-One Approach 471 problem-solving and decision-making simulation 243, 249-252
R Rapid Application Development (RAD) and Agile approach 363 Rose 160, 219, 221-222, 225, 323, 456
S Screencasting 297, 303 scripted roles 21-22 serious games 129, 132-134, 146, 149 simulated learning 127-129, 133, 141, 145-146, 240, 243, 250
social constructivism 2, 22, 129, 152, 166, 272 social media 1, 4, 10-13, 40-43, 51-52, 55-56, 59, 219, 304-305, 318-320, 327, 368, 399, 414 social presence 41-43, 48, 53-54, 86, 90-91, 101, 106-107, 112-115, 117-118, 120, 151, 153-154, 156-157, 159, 161, 177 social work 128, 132, 136, 145, 149 student-content interaction 269-271, 273, 280 student-instructor interaction 270-271, 273, 280 student-student interaction 4, 20, 263-264, 266, 270273, 275-278, 280
T task 2, 5, 20-22, 43, 47, 68, 70, 89, 138, 154-155, 161-162, 165-166, 182, 188, 222, 230, 239-240, 242-243, 247-249, 251-253, 262-264, 271-272, 275-277, 280, 284, 288, 290, 303, 309, 322, 334, 369, 398, 404-405, 414-415, 440-442, 458, 462 teaching presence 22, 88, 91, 101, 106-108, 110, 112, 115, 151-159, 161, 164-165, 167-169, 177, 371 Ten and Two 462, 471 Threaded Discussion Board 41, 59
U
UDL 109, 160-161, 218-226, 228-232, 235, 310-311, 323, 449, 451, 454, 456-464, 466, 471 Universal Design for Learning 218-220, 222, 225-226, 235, 323, 449, 451, 453-454, 456, 458, 460, 471 Universal Design for Learning (UDL) 218-220, 235, 323, 449, 451, 456, 471 usability 9, 48, 50, 142, 144-145, 319, 371, 427-428, 435-442, 448, 464 usability evaluation 435, 437-440, 442 usefulness 197-198, 204-206, 209, 211, 370, 372, 437, 441
V vulnerable people 127-129, 131, 134-142, 146, 149
W Wade 61-62, 66-67, 269, 401 Work Integrated Learning (WIL) 331, 333, 339, 363