Formative Design in Learning: Design Thinking, Growth Mindset and Community (Educational Communications and Technology: Issues and Innovations) 3031419499, 9783031419492

Learning design is an ill-structured process that must account for multiple stakeholders, contextual constraints, and ot

English Pages 325 [317] Year 2023


Table of contents:
Contents
Chapter 1: What is Formative Learning Design? Collaborative Meaning-Making From the 2022 AECT Summer Research Symposium
Collaborative Meaning-Making: Towards a Shared Understanding of Formative Learning Design
What is Formative Learning Design?
Formative Learning Design is Not Well-Defined
Formative Learning Design is Messy
Formative Learning Design Relies on Learning From Failure
Formative Learning Design Requires a Specific Mindset
Formative Learning Design Considers Issues of Culture and Equity
What is the Process of Formative Learning Design?
Formative Learning Design Processes are Iterative
Formative Learning Design Processes are Feedback-Driven
Formative Learning Design Processes Rely on Data-Based Decision Making
Formative Learning Design Processes are Rooted in Communication and Collaboration
Implications and Future Research
Appendix
“Formatively Designing a Book Together” Activity
Chapter 2: Formative Design in the Holistic 4D Model
Why a New Model
The Holistic 4D Model
Formative Design in the Model
Research on Formative Design
Conclusion
References
Chapter 3: Intern Observations and Reflections From the 2022 AECT Summer Research Symposium
Symposium Structure
Developing Chapter Manuscripts Using the Pro Action Café Format
Critical Thinking Activity at the Art Museum
Defining Formative Design: The “Sticky Note Activity”
Final Thoughts
Redefining Formative Design
The Success of the Pro Action Café
Reflection
References
Chapter 4: Closing the Professional Learning Loop: Designing for Performance Improvement
Closing the Professional Learning Loop: Designing for Performance Improvement
The Design Context
Design and Delivery Models
Constraints and Results
Rethinking the Design Process
Pragmatic and Efficient Designing Methods
Prioritizing Research-Based and Proven Techniques
Incorporating Learner Feedback
Embedding Systematic Learner-Centered Strategies
Reimagined Formative Design Process
Using the New Design Process to Develop the LAM
Component 1: In-Person Virtual Learning
Component 2: Self-Directed Learning
Component 3: Professional Learning Communities
Refining the LAM During Implementation
Observed Results of the Design Process and LAM Implementation
Applying Formative Design Processes to Your Professional Learning
Summary
References
Chapter 5: Designing the Museum of Instructional Design, a 3D Learning Environment: A Learning Experience Design Case
Introduction
Project Description
Software Used to Design the MID
Method
Successive Approximation Model
Data Sources
Study Procedures
Learner Evaluations
Expert Evaluations
Analysis
Empathy Maps
Quantitative Analysis
Qualitative Analysis
Results
Results of Phase 1: Preparation (RQ1)
Empathy Maps
Personas
Results of Phase 2: Iterative Design (RQ2)
Results of Phase 3: Iterative Development (RQ3)
Discussion
Design Implications
Limitations
Nature of UX Research
Same Participants and Initial Impressions
References
Chapter 6: Enhancing Problem Based Learning Through Design Thinking and Storying
Introduction
Problem Based Learning Shares Common Goals with Many Instructional Methods
Enhancing the Sustainability of PBL with Design Thinking
Enhancing One’s Knowledge Through Embracing Failure
Enhancing PBL Through Storying
Using a Logic Model to Foster Design Thinking
Root Cause Analysis
Summary
References
Chapter 7: Formative Design of Authentic Scenarios for a Virtual Reality-Based Parent-Teacher Conference Training Simulation
Introduction
The Parent-Teacher Communication Competences Model
Three-Phase Design Model
Evaluation Questions
Formative Design and Evaluation
Phase 1 Developing Simulation Structure
Phase 1 Scenario Evaluation
Phase 2 Elaborating Contextualized Scenarios
Phase 2 Scenario Evaluation
Phase 3 Finalizing Simulation Authenticity
Conclusion
References
Chapter 8: Formative Learning Design in the COVID-19 Pandemic: Analysis, Synthesis, and Critique of Learning Design and Delivery Practices
Introduction and Purpose of the Study
Literature Review
Teaching and Learning During the COVID-19 Pandemic—Is It “Online Learning”?
Faculty/Teachers as Designers—Are They Professionally Equal to Learning Designers?
Theoretical Orientation—What is Design and What is Formative Design?
Significance of the Study
Researchers’ Roles and Positionality
Methods
Scoping of Articles
Analysis of Articles
Study Limitations
Findings
(RQ1) What Learning Design and Delivery Practices are Reported by Educators During or in Relation to the COVID-19 Pandemic?
(RQ2) Taking Into Account Contexts, What Unique or New Learning Design and Delivery Practices Appeared During, or in Relation to, the COVID-19 Pandemic?
Discussion & Critique
Learning Design and Delivery Practices Reported by Educators During or in Relation to the COVID-19 Pandemic
Learning Design and Delivery Practices Appeared to be Unique or New During or in Relation to the COVID-19 Pandemic
Future Work
Conclusion
Appendices
Appendix A: Coding Book
Appendix B: Categories of Design and Delivery Practices With Examples
References
Chapter 9: Formative Learning Design Within Project Evaluation: Case of a Food Bank Disaster Planning and Recovery Tool
Background
Considerations for the Formative Evaluation Design
Immersive Tabletop Exercise Strategy
Building on Design Precedents
Conducting the Sessions
Lessons Learned
References
Chapter 10: How a Novice Instructional Designer Embraced a Design Thinking Mindset Through a Learning Design Course
Context
Data
Data Analysis
Results (Table 10.1)
Evidence of the Implementation of the Design Thinking Skills in the First Design Document
Evidence of the Implementation of the Design Thinking Skills in the Second Design Document
Evidence of the Implementation of the Design Thinking Skills in the Third Design Document
Evidence of the Implementation of the Design Thinking Skills in the Final Design Document
Evidence of the Novice Instructional Designer’s Growth
Conclusion & Future Study
References
Chapter 11: How has Virtual Reality Technology Been Used to Deliver Interventions That Support Mental Well-Being? A Systematic Literature Review
Virtual Reality Technology
Mental Well-Being and Virtual Reality Technologies
Methods
Information Source and Search Strategy
Inclusion and Exclusion Criteria
Screening Procedures
Results
Discussion and Significance
Limitations
Conclusion
References
Chapter 12: Layering Views of Experience to Inform Design
Formative Design
Use of Video in Generating Truth Spaces for Formative Design
Experiences in School Based Maker Spaces
Methods
Examples
Example One From Phase One
Example Two From Phase Two
Discussion and Conclusion
References
Chapter 13: Making a Framework for Formative Inquiry Within Integrated STEM Learning Environments
Introduction
An Authentic STEM Instructional Design
Abridged History of Inquiry-Based Learning
Science Education Through an Integrated STEM Context
Design Problem
First Drafts
Development of Formative Framework
Design Decisions
Making Meaning
Target Audience Assumptions
Connecting Practice to Theory
Conclusion
References
Chapter 14: Measuring Informal Learning: Formative Feedback Towards the Validity of the Informal SOM-SCI
Libraries as Addressing STEM Equity Issues in Underserved Urban Settings
Goal of the Project – Instrument Development for Informal STEM Learning in Libraries
School Observation Measure (SOM)
Content Validity: Additional Measures of Classroom Problem-Solving
Towards Validity of the Informal SOM Science Development of Informal School Observational Measurement (SOM)-Science
Face Validity: Formative Feedback
Future Directions
Conclusion
References
Chapter 15: Multipurpose Practicum: Feeding a Hunger for Justice via a Mainstream Academic Requirement
Normalizing Productive Struggle Toward Justice
Mainstream LDT Practicum Purposes/Goals
A Multipurpose Practicum Design
Impetus and Context
Topic and Approach
Structure and Support
Project Deliverables
Design Decisions and Challenges During the Practicum
Outcomes and Students’ Perspectives on the Experience
Culminating Report
Proof of Concept
Critical Considerations in the Design and Support of This Case
Authentic Context and Need
Thoughtful and Supported Team Formation
Relationship, Belonging, Competence, and Trust
Formative Process Matters
References
Chapter 16: Practicing 360-Degree Innovation: Experiencing Design Thinking, Exhibiting Growth Mindset, and Engaging Community in a French Business School Graduate-Level Intensive Course
Introduction
Determining Course Focus and Learning Outcomes
The Situational Context of the Course
The Specific Design of the Course and Learning Sciences Rationales
Day One
Day Two
Day Four
Assessment as Part of Course Design
Conclusion
References
Chapter 17: Preparing Elementary Teachers to Design Learning Environments That Foster STEM Sensemaking and Identity
What Is Sensemaking?
Development of a STEM Identity
Professional Learning (PL) to Encourage Formative Design Thinking
Project-Based Learning for Sensemaking
Examining Student Work and Thinking
Teacher-Created Books to Foster an Engineering Identity
Conclusion
References
Chapter 18: Profound Learning for Formative Learning Design and Technology
What Is Profound Learning?
Learning Processes of Deformation, Reformation, and Transformation
Meta-Learning and Meta-Practices
Profound Learning and Human Flourishing
Developing a Disposition for Depth
Profound Learning as a Formative Process
What Does LDT Formative Design for PL Look Like?
Linking Profound Learning to Formative Learning Design, Design Thinking and Community
Research Accomplished
Conclusion
References
Chapter 19: Tapping Into How We Teach What We Teach: A Journey in Explicit and Implicit Reflection
Reflection and Designer Professional Identity
Our Design Story: What We Teach
Moment of Use
The 3I’s: Introspection, Interaction, Intention
Our Journey: Formative Design in Our Moment of Use (How We Teach)
Journey Participants
Documenting Our Journey
Resulting Improvements
Additional Results from Implicit and Explicit Reflection
The Journey Continues
References
Chapter 20: The Formative Design of the SRL-OnRAMP: A Reflective Self-Regulated Learning Intervention
Problem: Self-Regulation Requirements for Distance Learning
Theoretical Background: The Zimmerman Self-Regulation of Learning Model
Formative Design Process
Scholarship of Teaching and Learning Study
Literature Review of SRL Interventions
Finding 1: Embedded SRL Instruction
Finding 2: Frequency of Deployment
Finding 3: Sensitivity to Time Constraints
Finding 4: Emphasis on Feedback Cycle
SRL-OnRAMP-v2 Revision
Mixed Methods Evaluation Study
Implications
References
Chapter 21: The Many Hats – Accidental Roles on an Interdisciplinary Research and Implementation Project: A Collaborative Autoethnography
Brief Description of the Project
Methods
Procedure
Findings
Thematic Summary of Individual Reflections
Theme 1: Starting the Project as Educational Researchers
Marisa
Deepti
Theme 2: Challenges Within Interdisciplinary Team
Marisa
Deepti
Theme 3: Formative Emergence of the ID role
Marisa
Deepti
Theme 4: Evolving Nature of the ID role
Marisa
Deepti
Theme 5: Multiple Dimensions of the Researcher Role
Marisa
Deepti
Joint Sensemaking Discussions
Non-Autoethnographer’s Meta-Reflection by Iryna
Discussion & Conclusion
References
Chapter 22: Using Personas to Leverage Students’ Understanding of Human Performance Technology to Support Their Instructional Design Practice
Introduction
Use of Personas and Journeys to Support Curricular Design
Our Process
Development of HPT Student Personas
Journey Mapping Our HPT Personas
Implications for Designing Instruction
References
Chapter 23: Using Photo-Journals to Formatively Evaluate Elementary Student Robotic Construction
Introduction
Methods
Rubrics
Build Types
Examples: Evaluating Robotic Builds
Evaluation Over Time
Conclusion
References
Chapter 24: What Is Is Not What Has to Be: The Five Spaces Framework as a Lens for (Re)design in Education
Case 1: The Teacher’s Desk
Case 2: The PISA Test
Case 3: The Learning Management System (LMS)
Conclusion
References
Chapter 25: Brad Hokanson and the Summer Research Symposium: The Quiet Force Behind a Signature Event

Educational Communications and Technology: Issues and Innovations

Brad Hokanson · Matthew Schmidt · Marisa E. Exter · Andrew A. Tawfik · Yvonne Earnshaw
Editors

Formative Design in Learning

Design Thinking, Growth Mindset and Community

Educational Communications and Technology: Issues and Innovations

Series Editors
J. Michael Spector, Department of Learning Technologies, University of North Texas, Denton, TX, USA
M. J. Bishop, College of Education, Lehigh University / University System of Maryland, Bethlehem, PA, USA
Dirk Ifenthaler, Learning, Design and Technology, University of Mannheim, Mannheim, Baden-Württemberg, Germany
Allan Yuen, Faculty of Education, Runme Shaw Bldg, Rm 214, University of Hong Kong, Hong Kong, Hong Kong

This book series, published collaboratively between the AECT (Association for Educational Communications and Technology) and Springer, represents the best and most cutting edge research in the field of educational communications and technology. The mission of the series is to document scholarship and best practices in the creation, use, and management of technologies for effective teaching and learning in a wide range of settings. The publication goal is the rapid dissemination of the latest and best research and development findings in the broad area of educational information science and technology. As such, the volumes will be representative of the latest research findings and developments in the field. Volumes will be published on a variety of topics, including:

• Learning Analytics
• Distance Education
• Mobile Learning Technologies
• Formative Feedback for Complex Learning
• Personalized Learning and Instruction
• Instructional Design
• Virtual tutoring

Additionally, the series will publish the bi-annual AECT symposium volumes, the Educational Media and Technology Yearbooks, and the extremely prestigious and well-known Handbook of Research on Educational Communications and Technology. Currently in its 4th volume, this large and well respected Handbook will serve as an anchor for the series and a completely updated version is anticipated to publish once every 5 years. The intended audience for Educational Communications and Technology: Issues and Innovations is researchers, graduate students and professional practitioners working in the general area of educational information science and technology; this includes but is not limited to academics in colleges of education and information studies, educational researchers, instructional designers, media specialists, teachers, technology coordinators and integrators, and training professionals.

Brad Hokanson • Matthew Schmidt • Marisa E. Exter • Andrew A. Tawfik • Yvonne Earnshaw
Editors

Formative Design in Learning
Design Thinking, Growth Mindset and Community

Editors
Brad Hokanson, College of Design, University of Minnesota, St. Paul, MN, USA
Matthew Schmidt, Mary Frances Early College of Education and College of Pharmacy, University of Georgia, Athens, GA, USA
Marisa E. Exter, College of Education, Purdue University, West Lafayette, IN, USA
Andrew A. Tawfik, College of Education, University of Memphis, Memphis, TN, USA
Yvonne Earnshaw, School of Instructional Technology & Innovation, Bagwell College of Education, Kennesaw State University, Kennesaw, GA, USA

ISSN 2625-0004    ISSN 2625-0012 (electronic)
Educational Communications and Technology: Issues and Innovations
ISBN 978-3-031-41949-2    ISBN 978-3-031-41950-8 (eBook)
https://doi.org/10.1007/978-3-031-41950-8

© Association for Educational Communications & Technology 2023

This work is subject to copyright. All rights are solely and exclusively licensed by the Publisher, whether the whole or part of the material is concerned, specifically the rights of translation, reprinting, reuse of illustrations, recitation, broadcasting, reproduction on microfilms or in any other physical way, and transmission or information storage and retrieval, electronic adaptation, computer software, or by similar or dissimilar methodology now known or hereafter developed.

The use of general descriptive names, registered names, trademarks, service marks, etc. in this publication does not imply, even in the absence of a specific statement, that such names are exempt from the relevant protective laws and regulations and therefore free for general use.

The publisher, the authors, and the editors are safe to assume that the advice and information in this book are believed to be true and accurate at the date of publication. Neither the publisher nor the authors or the editors give a warranty, expressed or implied, with respect to the material contained herein or for any errors or omissions that may have been made. The publisher remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

This Springer imprint is published by the registered company Springer Nature Switzerland AG. The registered company address is: Gewerbestrasse 11, 6330 Cham, Switzerland.

Paper in this product is recyclable.

Contents

1  What is Formative Learning Design? Collaborative Meaning-Making From the 2022 AECT Summer Research Symposium    1
   Yvonne Earnshaw, Brad Hokanson, Marisa E. Exter, Matthew Schmidt, and Andrew A. Tawfik

2  Formative Design in the Holistic 4D Model    13
   Charles M. Reigeluth

3  Intern Observations and Reflections From the 2022 AECT Summer Research Symposium    25
   Vanessa Johnson and Alyse Harris

4  Closing the Professional Learning Loop: Designing for Performance Improvement    33
   Rita Fennelly-Atkinson, Courtney L. Teague, and Jillian Doggett

5  Designing the Museum of Instructional Design, a 3D Learning Environment: A Learning Experience Design Case    49
   Noah Glaser, Yvonne Earnshaw, Dana AlZoubi, Mohan Yang, and Elisa L. Shaffer

6  Enhancing Problem Based Learning Through Design Thinking and Storying    65
   Robert F. Kenny and Glenda A. Gunter

7  Formative Design of Authentic Scenarios for a Virtual Reality-Based Parent-Teacher Conference Training Simulation    77
   Jeeheon Ryu, Sanghoon Park, Eunbyul Yang, and Kukhyeon Kim

8  Formative Learning Design in the COVID-19 Pandemic: Analysis, Synthesis, and Critique of Learning Design and Delivery Practices    93
   Ahmed Lachheb, Jacob Fortman, Victoria Abramenka-Lachheb, Peter Arashiro, Ruth N. Le, and Hedieh Najafi

9  Formative Learning Design Within Project Evaluation: Case of a Food Bank Disaster Planning and Recovery Tool    115
   Susie L. Gronseth and Ioannis A. Kakadiaris

10  How a Novice Instructional Designer Embraced a Design Thinking Mindset Through a Learning Design Course    127
    Jing Song and Wanju Huang

11  How has Virtual Reality Technology Been Used to Deliver Interventions That Support Mental Well-Being? A Systematic Literature Review    139
    Minyoung Lee, Matthew Schmidt, and Jie Lu

12  Layering Views of Experience to Inform Design    157
    Signe E. Kastberg, Amber Simpson, and Caro Williams-Pierce

13  Making a Framework for Formative Inquiry Within Integrated STEM Learning Environments    167
    Stuart Kent White

14  Measuring Informal Learning: Formative Feedback Towards the Validity of the Informal SOM-SCI    179
    Andrew A. Tawfik, Linda Payne, and Carolyn R. Kaldon

15  Multipurpose Practicum: Feeding a Hunger for Justice via a Mainstream Academic Requirement    193
    Amy C. Bradshaw

16  Practicing 360-Degree Innovation: Experiencing Design Thinking, Exhibiting Growth Mindset, and Engaging Community in a French Business School Graduate-Level Intensive Course    207
    Dennis Cheek

17  Preparing Elementary Teachers to Design Learning Environments That Foster STEM Sensemaking and Identity    219
    Kim A. Cheek

18  Profound Learning for Formative Learning Design and Technology    229
    Davin J. Carr-Chellman, Alison A. Carr-Chellman, Carol Rogers-Shaw, Michael Kroth, and Corinne Brion

19  Tapping Into How We Teach What We Teach: A Journey in Explicit and Implicit Reflection    241
    Monica W. Tracey and John Baaki

20  The Formative Design of the SRL-OnRAMP: A Reflective Self-Regulated Learning Intervention    251
    Alexis Guethler and William A. Sadera

21  The Many Hats – Accidental Roles on an Interdisciplinary Research and Implementation Project: A Collaborative Autoethnography    267
    Deepti Tagare, Marisa E. Exter, and Iryna Ashby

22  Using Personas to Leverage Students’ Understanding of Human Performance Technology to Support Their Instructional Design Practice    281
    Jill E. Stefaniak, Marisa E. Exter, and T. Logan Arrington

23  Using Photo-Journals to Formatively Evaluate Elementary Student Robotic Construction    295
    Anna V. Blake, Lauren Harter, and Jason McKenna

24  What Is Is Not What Has to Be: The Five Spaces Framework as a Lens for (Re)design in Education    305
    Melissa Warr, Kevin Close, and Punya Mishra

25  Brad Hokanson and the Summer Research Symposium: The Quiet Force Behind a Signature Event    317
    Elizabeth Boling

Chapter 1

What is Formative Learning Design? Collaborative Meaning-Making From the 2022 AECT Summer Research Symposium

Yvonne Earnshaw, Brad Hokanson, Marisa E. Exter, Matthew Schmidt, and Andrew A. Tawfik

Abstract  The 2022 AECT Summer Research Symposium focused on formative design in learning. During the two-day event, attendees engaged in an interactive affinity mapping activity around the concept of formative design to come to a more formalized definition. Analysis revealed a range of themes, suggesting formative design as an iterative and learner-centered design process that involves ongoing assessments, feedback loops, and data-informed decision making. Formative design is characterized as a systematic approach that focuses on continuous improvement, problem-solving, and refining designs. It emphasizes collaboration and communication and recognizes the influence of cultural, contextual, and identity factors on learning and design decisions. In addition, it embraces successes and failures. These findings provide valuable insights for researchers, practitioners, and educators, contributing to a better understanding of formative learning design and guiding future research and practice.

Y. Earnshaw (*), Kennesaw State University, Kennesaw, GA, USA; e-mail: [email protected]
B. Hokanson, University of Minnesota, Minneapolis, MN, USA; e-mail: [email protected]
M. E. Exter, Purdue University, West Lafayette, IN, USA; e-mail: [email protected]
M. Schmidt, University of Georgia, Athens, GA, USA; e-mail: [email protected]
A. A. Tawfik, University of Memphis, Memphis, TN, USA; e-mail: [email protected]

© The Author(s), under exclusive license to Springer Nature Switzerland AG 2023
B. Hokanson et al. (eds.), Formative Design in Learning, Educational Communications and Technology: Issues and Innovations, https://doi.org/10.1007/978-3-031-41950-8_1

Keywords  Formative design · Collaborative meaning-making · Affinity mapping · Instructional design · Iterative design

The Association for Educational Communications and Technology (AECT) Summer Research Symposium is an event that brings together researchers and practitioners from various fields, including instructional design, educational technology, communication, landscape architecture, design, computer science, and higher education administration, among others. The symposium serves as a platform for participants to present their research on a pre-determined theme and engage in meaningful discussions and collaborations with their peers. The chapter entitled “Intern Observations and Reflections From the 2022 AECT Summer Research Symposium” in this volume, written by this year’s interns, goes into detail about the structure of the symposium.

The theme for this year’s symposium was “Formative Design in Learning: Design Thinking, Growth Mindset, and Community.” The symposium was held as a collaborative effort with the Journal of Formative Design in Learning, aiming to bring together researchers, practitioners, and scholars in the field of formative learning design to exchange knowledge, share insights, and explore new developments. As an official journal of AECT, the Journal of Formative Design in Learning is a reputable publication that bridges the gap between theory and practice in education and training. It fosters collaborations between researchers and practitioners, publishing original research papers that inform instructional improvement and advancements in design. Dr. Robert Kenny, Editor-in-Chief of the journal, gave a presentation in which he discussed formative design in learning and introduced the journal as a platform for scholarly contributions in this space. He invited authors to submit extended manuscripts to the journal, encouraging researchers to contribute to the advancement of formative learning design through publication.

This year’s symposium was particularly special as it marked the first time in 3 years that it was held in person, following fully virtual, online events in 2020 and 2021. The symposium took place on the Indiana University campus in Bloomington, Indiana over a two-day period in July 2022. The keynote speaker at the symposium was Dr. Charles Reigeluth, who presented his Holistic 4D Model of instructional design and highlighted its connections with formative learning design. Dr. Reigeluth’s chapter on the Holistic 4D Model is included in this volume. His presentation provided valuable insights into the integration of formative learning design principles within the framework of instructional design, contributing to the overall discourse and understanding of formative learning design approaches.

Collaborative Meaning-Making: Towards a Shared Understanding of Formative Learning Design

This year, the symposium organizers planned an interactive affinity mapping activity to drive collaborative meaning-making around the concept of formative learning design (see Appendix). Affinity mapping is a method of organizing and grouping related ideas or information based on their similarities or shared characteristics. During the symposium, our affinity mapping activity spanned both days and all attendees participated. Throughout this activity, participants were encouraged to jot down their thoughts on sticky notes, which were put up on a wall in the symposium room (see Fig. 1.1). We started off the activity by asking participants to address the question “what is formative learning design?” Additional prompts and reminders to write notes were given throughout the day. Breaks were held, during which participants could add additional thoughts and group notes based on perceived affinity and commonalities. An extended period of affinity mapping was held as the final activity at the end of the symposium, after which all participants debriefed to discuss the meaning-making we had undergone collectively.

Following the symposium, sticky notes were digitized by the editorial team and added to the shared whiteboard tool Miro (see Fig. 1.2). Following this, the editorial team underwent a further affinity mapping process so as to group the sticky notes

Fig. 1.1  Attendees reading sticky notes on the wall

Fig. 1.2  Sticky notes grouped by themes using Miro

into the overarching themes represented in the participants’ responses. Groups of sticky notes were assigned descriptive titles. After this, the included information on the sticky notes was summarized and synthesized, as discussed in the following section.
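The grouping described above was carried out by hand on the wall and then in Miro. Purely as an illustrative sketch of the kind of structure that process produces (notes gathered under descriptive theme titles), the short Python example below tallies digitized notes by an assigned theme label. The note texts and theme names are hypothetical examples (two echo participant quotes reported later in this chapter); this is not the symposium dataset or part of the authors’ method.

```python
from collections import defaultdict

# Hypothetical digitized sticky notes: (theme assigned during affinity mapping, note text).
notes = [
    ("Iterative process", "Formative design relies on ongoing feedback loops."),
    ("Iterative process", "Design, evaluate, revise, repeat."),
    ("Messiness", "Formative design is messy, and that is okay."),
    ("Culture and equity", "How can formative design include historically excluded students?"),
]

# Group note texts under their theme titles, mirroring the affinity-mapping wall.
themes = defaultdict(list)
for theme, text in notes:
    themes[theme].append(text)

# Print a simple summary: each theme title with its note count and grouped notes.
for theme, grouped in sorted(themes.items()):
    print(f"{theme} ({len(grouped)} notes)")
    for text in grouped:
        print(f"  - {text}")
```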

What is Formative Learning Design?

Analysis of participants’ data suggests that formative learning design is a systematic and iterative approach that involves ongoing improvements, addressing identified problems, and creating ways to refine and correct designs. It is both generative and evaluative, striking a productive balance between creativity and assessment. This learner-centered process emphasizes continuous feedback loops, data-informed decision making, and the cultivation of a reflective and adaptive mindset.

Formative Learning Design is Not Well-Defined

The participants recognized the lack of clear definitions and methods in formative learning design, which sometimes hinders its recognition and respect compared to more empirical approaches. Their responses highlighted that it is a “highly iterative learner-centered design process with ongoing feedback loops & data-informed

decision making,” as one participant described. It involves “iterative assessments” and is both generative and evaluative, producing a tension that is productive. Participants recognized the importance of framing and judgment in evoking the formative nature of the design process. One participant noted it “is under-valued and does not receive the same respect as more empirical approaches.” While formative learning design is typically associated with learning environments and assessing students to meet their needs, there was curiosity about its applicability in non-formal spaces. One participant questioned, “Seems to be confined to learning environments w/ clear objectives in mind. Is there room for formative design in non-­formal spaces?” Finally, the idea of formative learning design challenging existing knowledge and promoting reflection and change was also emphasized. As one participant stated, “Formative design is really (in)formative design since it helps us see what we ‘know’ isn’t so!”

Formative Learning Design is Messy

A key finding had to do with the innate ambiguity of formative learning design. As one participant stated, “Formative design is inherently messy, just like the real world it seeks to influence.” Another participant stated, “Formative design is messy, and that is okay.” Participants recognized it as a valuable aspect of the process, and described formative learning design as being in the moment and authentic, requiring designers to be open and vulnerable. The acceptance of feedback and ideas in formative learning design was seen as a formative process in itself. As one participant pointed out, “Accepting feedback + ideas in F.D. is, itself, a formative process.” They emphasized the importance of gaining comfort and creating norms for navigating emotions in dealing with the messiness of formative learning design. Taken together, participant responses acknowledged that formative learning design aligns with the complexities and unpredictabilities of the real world, emphasizing that this messiness is both acceptable and valuable in the design process.

Formative Learning Design Relies on Learning From Failure

Related to messiness is the underlying notion of productive failure, a crucial aspect of formative learning design. Participants emphasized the importance of identifying problems, asking questions, and being open to the possibility of being wrong. They stressed that failure should be seen as a step forward, indicating progress and learning. As one participant noted, “Failure is a good step and means you are heading [...] moving forward.” Another participant expressed the mindset of delighting in failure, seeing it “as another opportunity to learn.” Participants emphasized the need to build upon these experiences, using them as a foundation for iterative improvement. Reflective thinking was mentioned as a key component in the process, enabling designers to analyze design failures and make informed adjustments.

Formative Learning Design Requires a Specific Mindset

Formative design involves a mindset and purpose (a “dispositional choice”) that goes beyond the process itself. It is driven by a vision of creating positive outcomes for learners and fostering innovation. The influence of personal positionality on the design journey and the recognition of formative learning design as a developmental approach were also highlighted. Participants recognized that formative learning design is not just a process but also a mindset and dispositional choice. It involves reforming thoughts, actions, intents, and impact, reflecting a commitment to continuous improvement and learner-centeredness. The participants also considered the influence of their own positionality in the formative learning design process. They questioned how their perspectives, experiences, and positions affect their design journey and decision-making, highlighting the awareness of the impact of personal bias and subjectivity on the design process, as well as emphasizing the importance of reflexivity and critical self-reflection.

Formative Learning Design Considers Issues of Culture and Equity

Participants highlighted the importance of equity and culture/context in formative learning design. They asked how formative learning design can promote equity and include historically excluded students. One participant expressed, “How can formative design be used to promote equity & include historically excluded students?” Another participant questioned, “How can FD also be framed around equity?” They emphasized consideration of learners’ contexts and cultures, and highlighted how this necessitates iteration. The role of ongoing assessments in formative learning design was also discussed, with some questioning whether assessment must be solely summative and suggesting the value of formative assessment for equity and contextual understanding. Ultimately, findings underscore the importance of equity and the consideration of culture and context in formative learning design, and highlight the need for a continuous and iterative design process that acknowledges and celebrates learners’ diverse identities and experiences.

What is the Process of Formative Learning Design?

In addition to issues pertaining to defining formative learning design, participants also engaged in a discussion about its overarching process, exploring its timing, relationship to design thinking, prototyping, and its iterative nature. Responses suggest that participants recognized that formative learning design occurs at various stages of the design process and can be interpreted and applied differently,

highlighting the need for flexibility and adaptability to suit specific contexts and needs. The process of formative learning design was also described in terms of prototypes, models, and measures. Notably, participants particularly recognized the critical importance of iterative processes, aligning and realigning design elements to needs and goals. One participant characterized formative learning design as an “asymptotic process,” implying a continuous approach that seeks to approach a desired goal, but that may never fully reach it. Taken together, responses suggest that the process of formative learning design is one of constant refinement and adjustment until expectations and requirements of the intended learners are met.

Formative Learning Design Processes are Iterative

Participants characterized formative learning design as an iterative, reflective, and ongoing process that involves frequent evaluation, continuous improvement, and problem-solving through iterative feedback loops. They highlighted the importance of reflection, implementation, and a focus on listening to feedback in order to inform the design process. As one participant expressed, “Formative design is a metaphor for interaction-assess-iteration until ended by either exhaustion or ‘success’,” capturing the idea of formative learning design as a process that continues until a desired outcome is achieved (see Fig. 1.3). Participants also recognized that formative learning design is a constructive approach that requires listening, watching, and taking feedback into account for the next steps. They highlighted the importance of ongoing feedback loops and the inhibitions that can be addressed through formative learning design.

Formative Learning Design Processes are Feedback-Driven

Participants emphasized the significance of feedback in the formative learning design process, stressing its role in improvement, reflection, and adaptation. They also questioned who generates feedback, thereby underscoring the importance of

Fig. 1.3  This sticky note from a symposium participant illustrates how iteration in formative learning design is characterized by both generative and evaluative activities

involving various stakeholders and learners in the feedback process and recognizing the value of feedback from experts and experienced researchers. The participants also raised questions about the role of designers and the importance of co-designing with participants and stakeholders. They recognized that involving various perspectives in the feedback process, including learners, stakeholders, and peer designers, contributes to a more comprehensive and informed design approach. They also acknowledged that feedback is essential for good design and work, considering it a formative element that drives improvement and refinement.

Formative Learning Design Processes Rely on Data-Based Decision Making

Participants’ discussion about the role of data in formative learning design and its connection to evaluation and decision-making suggests that formative learning design involves utilizing both quantitative and qualitative data to inform the design process. As one participant noted, “Formative design should really articulate how the data informs a specific element or process or design decision.” The importance of considering multiple stakeholders, including students, parents, teachers, and supervisors, in the data collection and analysis process was emphasized. Participants acknowledged the importance of having criteria by which design decisions are made, and highlighted that formative learning design should clearly articulate how data informs specific elements, processes, and design decisions. Ultimately, participants’ shared perspectives highlight the crucial role of data-based decision making in formative learning design.

Formative Learning Design Processes are Rooted in Communication and Collaboration

Participants emphasized the significant role of communication and collaboration in the formative learning design process. They recognized that formative learning design is a collaborative effort where individuals come together to share feedback and engage in an iterative approach. One participant stated, “Formative design involves communicating thinking and reasoning, fostering collaboration and generating constructive representations.” Indeed, collaboration, team building, and effective communication were identified as essential components of formative learning design. Participants highlighted the communal effort and problem-solving nature of formative learning design, emphasizing the need to work together and generate constructive representations. They also agreed that formative learning design requires the involvement of other people, emphasizing the importance of engaging with diverse perspectives and expertise. They acknowledged that formative learning design involves communicating thinking and reasoning, allowing for collective

insights and generating possibilities. Communication and collaboration are integral to the formative learning design process, underscoring the value of working together, sharing feedback, and engaging in problem-solving as key elements of formative learning design.

Implications and Future Research

Analysis of participants’ responses during the 2022 AECT Summer Research Symposium sheds light on several important aspects of formative learning design. These findings have implications for researchers, practitioners, and educators in the field, offering valuable insights into the nature of formative learning design and guiding future directions for research and practice.

While formative learning design has been discussed as important for developing learning technologies, how it is defined and applied within the field remains ambiguous. However, findings from the analysis described in this chapter suggest a provisional definition of the phenomenon. Formative design is an iterative and learner-centered design process that involves ongoing assessments, feedback loops, and data-informed decision making. It focuses on addressing identified problems, refining designs, and promoting continuous improvement. It embraces a mindset of collaboration, communication, and reflection, incorporating diverse perspectives and considering the cultural and contextual aspects of learning environments. Formative design leverages feedback, data, and successes and failures to inform design decisions and create meaningful learning experiences. It encompasses the integration of data-driven approaches, stakeholder engagement, and the use of design thinking principles. Formative design is distinguished by the following characteristics:

• Iterative and Learner-Centered: Emphasizes an ongoing and adaptive approach to design that considers the needs and experiences of learners.
• Evaluation and Feedback: Incorporates frequent evaluation and feedback loops to inform design decisions that promote learning.
• Continuous Improvement: Strives for continual refinement and enhancement of designs through reflection and data-informed decision making.
• Collaboration and Communication: Emphasizes collaborative efforts, effective communication, and the involvement of stakeholders throughout the design process.
• Cultural and Contextual Considerations: Recognizes the influence of cultural, contextual, and identity factors on learning and design decisions.
• Incorporation of Data: Utilizes both quantitative and qualitative data to inform design choices and measure the effectiveness of interventions.
• Integration of Design Thinking: Applies design thinking principles to approach challenges, generate ideas, and prototype solutions.
• Embracing Successes and Failures: Considers failures as opportunities for learning and improvement, while celebrating successes and building upon them.

The definition presented here has implications for the design and development process, such as how it aligns with broader approaches to learning design that incorporate formative elements, such as learning experience design (LXD) and learning engineering. At the same time, further elaboration and differentiation may be needed to distinguish formative learning design from other, well-established approaches to learning design, such as design-based research or rapid prototyping. Because the definition emerged from collaborative meaning-making among a diverse group of scholars and practitioners, it offers an emerging concept that could provide useful insights into how the field of learning design can effectively advance formative learning design.

The Symposium Team  Brad Hokanson, Matthew Schmidt, Marisa Exter, Andrew A. Tawfik, and Yvonne Earnshaw.

Many people have participated in the development of the symposium and in the subsequent production of the printed volume. They include:

Reviewers  Ilene Alexander, Dennis Cheek, Marisa Exter, Noah Glaser, Jason McDonald, Pamela Moore, Angelica Pazurek, Tracy Robinson, Matthew Schmidt, Jill Stefaniak, and Andrew A. Tawfik.

Interns  Vanessa Johnson, Indiana University; Alyse Harris, Indiana University.

AECT Staff  We thank you for all of your assistance in helping to make this a successful symposium.

Appendix

“Formatively Designing a Book Together” Activity

Goal
• We are all designing this book together and adding to the understanding of what “formative learning design” means/is.

Activity Structure
• Part 1: Introduce
• Part 2 & 3: Start putting your notes up around the room.
  – Feel free to start reading others’ notes and grouping.
• Part 4: Collaborative activity (see below)
• Part 5: Final reflection

Prompts
• What is formative learning design
• Other thoughts related to formative learning design

Collaborative Activity (Part 4)
• Introduce
• Additional ideation
• Use wall-space to jointly organize stickies
• Share out (have people summarize walls and discuss)
• We take pictures

Chapter 2

Formative Design in the Holistic 4D Model

Charles M. Reigeluth

Abstract  This chapter begins by explaining 10 major reasons why we developed the Holistic 4D Model for instructional design and development. One of the reasons was to incorporate formative design – that is, frequent formative evaluation during the instructional development process. Next, the chapter describes the Holistic 4D Model and focuses specifically on the ways that formative design is incorporated into the model. The four phases of the ID process are Define, Design, Develop, and Deploy, and the Design phase has three levels of design, each of which has multiple short cycles of analysis, design, and evaluation. This allows formative evaluation to be done often throughout the design phase. The chapter ends with a few thoughts about the most useful kinds of research methods for advancing knowledge about formative design in the ID process. It is particularly important to conduct design-based or formative research to build design theory that identifies different methods for formative design that are found to be preferable for different situations.

Keywords  Formative design · Formative evaluation · Instructional design · Formative research · Instructional design theory

Formative design is a central part of the Holistic 4D Model (Reigeluth & An, 2021) for instructional design and development. This chapter begins by explaining why we developed a new ID model when there are already so many, followed by describing the model, then describing the ways that formative design is incorporated into the model, and ending with a few thoughts about the most useful kinds of research methods for advancing knowledge about formative design in the ID process.

C. M. Reigeluth (*), Indiana University, Bloomington, IN, USA; e-mail: [email protected]

© The Author(s), under exclusive license to Springer Nature Switzerland AG 2023
B. Hokanson et al. (eds.), Formative Design in Learning, Educational Communications and Technology: Issues and Innovations, https://doi.org/10.1007/978-3-031-41950-8_2

Why a New Model The Holistic 4D Model offers 10 innovations that are not common in other ID models: 1. It offers a holistic ID process, rather than the typical fragmented one. 2. Both task and topic expertise are addressed, to provide more effective content and instructional strategies for education-type instruction. 3. Instructional theory is incorporated, whereas most ID process models offer little-to-no guidance about which instructional methods to use when. 4. Its instructional theory is focused on the learner-centered paradigm, because it is more effective than the teacher-centered paradigm. 5. Holistic instructional sequences are offered, in contrast to the fragmented instruction that results from a Gagné-type hierarchical sequence. 6. Just-in-time analysis is used, rather than pretending that all the analysis can be done at the beginning of a project. 7. Rapid prototyping is incorporated, which is a powerful formative design tool to increase the efficiency of the ID process. 8. “Designer objectives” are distinguished from “learner objectives,” to provide better information to both designers and learners. 9. Performance analysis is included, to identify non-instructional interventions that improve performance, thereby avoiding the design of unnecessary instruction. 10. Continuous evaluation is included, another formative design tool to increase the efficiency of the ID process and the effectiveness of the instruction. I will describe each of these features next. 1. Holistic ID process. Perhaps the most important innovation is a holistic ID process that begins by creating a fuzzy vision (Banathy, 1991, 1996) of the entire instructional system you want to design. On the top level of design, we offer guidance for creating a fuzzy vision of the content. We also offer guidance for a fuzzy vision of how that content will be sequenced and what methods will be used in general. This fuzzy vision is a chance to be creative, far more so than in a traditional ADDIE process. On the mid level, we offer guidance for more details on the fuzzy vision. You get a clearer idea of the content by developing mid-level objectives, and you design more details about the sequence, assessments, and instructional methods. This way, you can design each part of the instructional system with all the other parts in mind, resulting in more coherent and creative instruction. This gives you sufficient perspective to complete a detailed design that includes lower-level objectives, instructional sequence, assessments, and instructional methods. We also offer guidance for beginning some development work in tandem with this design work. 2. Task and topic expertise. Another huge innovation is that we address both task and topic expertise (Romiszowski, 1981, 1986). So, what is the difference between the two? With the behaviorist roots of our field, we have focused almost

2  Formative Design in the Holistic 4D Model Table 2.1  Task and topic expertise

Task expertise Performance – Goal oriented Examples:  Medicine  Engineering  Instructional theory

15 Topic expertise Understanding – Descriptive Related examples:  Biology  Physics  Learning theory

exclusively on task expertise, which concerns how to do things (see Table 2.1). All instruction had to be targeted to behavioral objectives. Of course, task expertise is certainly important in many situations. But it is not everything. Much of education – and even some of training – is focused on understanding topics – specifically, understanding concepts and principles that allow many different ways of demonstrating that sufficient understanding has occurred. The important point here is that the nature of the ID process is different in important ways for developing topic expertise than for developing task expertise. For example, different kinds of analysis are needed for each of the two kinds of expertise. And different guidance is needed for sequencing instruction for each. Perhaps most importantly, different instructional strategies are needed for each. 3. Instructional theory incorporated. Perhaps the most important innovation is that instructional theory is integrated into guidance for the ID process. There are three major kinds of instructional theory for which we provide guidance. First is instructional theory for the learner-centered paradigm of education (McCombs & Whisler, 1997; Reigeluth et  al., 2017; Reigeluth & Karnopp, 2020), which I describe in greater detail in the next paragraph. Second is the Elaboration Theory (Reigeluth, 1999), which includes guidance for a holistic sequence, but uses procedural and hierarchical sequencing when appropriate within an elaboration sequence. I’ll say more about this shortly. Third, and in some ways most importantly for learning, we provide guidance for the design of motivational strategies and tactics. This is based partly on McClelland’s (1987) Needs Theory, in which he proposes three major human needs or motivations: need for achievement, which we address through competency-based education; need for affiliation, which we address through collaborative team-­ based learning; and need for power, which we address through self-directed learning. We also provide guidance based on John Keller’s (1987, 2010) ARCS Model for the motivational design of instruction. 4. The learner-centered paradigm. The instructional theory that is integrated with the ID process is up-to-date, state-of-the-art, learner-centered theory (Marzano et al., 2017; McCombs, 2013; McCombs & Whisler, 1997; Reigeluth et al., 2017; Reigeluth & Karnopp, 2020). It includes guidance for designing competency-based education (Marzano et  al., 2017; Voorhees & Bedard-­ Voorhees, 2017), including competency-based student progress, competency-­ based learning targets, competency-based student assessment, and competency-based student records (Reigeluth & Karnopp, 2020). The feeling

16

C. M. Reigeluth

that comes with successful mastery of the material is highly motivating for learners (satisfying McClelland’s, 1987, “need for achievement”), especially when the material is perceived as important. The instructional theory also includes guidance for designing project-based learning (Francom, 2017; Jonassen, 2011; Savery, 2009) for content that is primarily topics as well as content that is primarily tasks, as I described earlier. Learning by doing is also highly motivating when there is sufficient scaffolding. So, the guidance for PBL is accompanied by guidance for scaffolding, primarily in the form of justin-time tutorials during the projects (Reigeluth, 2012), where those tutorials are very different when the focus is on topics versus tasks. And it includes guidance for personalizing the learning targets, projects, tutorials, assessments, and reflections (Watson & Watson, 2017), which also enhances motivation. Our model provides guidance for designing several types of collaboration into the instruction. One is team-based learning, where team composition is a big factor. The other is peer assistance, where peers – not on one’s team – help out. Often, this is a more advanced student who has recently learned what this student needs assistance on. Those peer assistants often benefit as much as those whom they are helping. Done well, collaboration is also highly motivating. Guidance for learner-centered instructional theory also includes the design of self-directed learning experiences (Huh & Reigeluth, 2017). Students are helped to select several things: their own learning goals or targets, the projects and tutorials that will be vehicles for reaching those goals, the assessments that will be used to certify mastery of those goals, and the reflections that broaden and deepen the learning. The amount of teacher support for that self-direction is also personalized for every student. 5. Holistic instructional sequences. A holistic ID process is one of the most important aspects of our ID model, but another aspect of holism is the instructional sequence that students go through. Rather than the designers breaking the content down into little pieces and students progressing from one piece to another, our model offers guidance for beginning with a holistic view of the topic or task. Based on schema theory (Widmayer, 2004), this results in more meaningful learning and better long-term retention. It also improves motivation. Our model uses the Elaboration Theory (Reigeluth, 1999, 2007), which offers guidance for different kinds of sequences for tasks than for topics that are predominantly concepts and/or principles. For tasks, we use the Simplifying Conditions Method (Reigeluth & Rodgers, 1980), which begins the instruction on a task with the simplest real-world version of the task. For example, in teaching someone to drive a car, simplifying conditions include an automatic rather than manual transmission, no traffic, no parallel parking, no hill starts, good weather, and much more. Once the simplest version of the task is mastered, the instruction moves on to progressively eliminate the simplifying conditions until the learner has mastered the task under the most complex conditions specified by the objectives. When topics constitute the majority of the content, the Elaboration Theory offers guidance for teaching the broadest, most inclusive concepts first (Reigeluth & Darwazeh, 1982), or the simplest, most broadly
applicable principles first (Reigeluth, 2007), and then gradually elaborating with more detailed concepts or more complex principles related to them. Elaboration sequences thus provide a meaningful context, a kind of “ideational scaffolding” (Ausubel, 1968), that results in considerably more effective and appealing instruction. 6. Just-in-time analysis. Another important innovation is guidance for just-in-­ time analysis. Traditionally, in an ID project, you do all the analysis, and then move on to the design phase. But that doesn’t make sense, because different design decisions lead to different information needs for making your next design decision. For example, the decision to use a topical elaboration sequence requires a very different kind of analysis than a decision to use a simplifying conditions sequence or a hierarchical sequence. Similarly, the decision to use a simulation game requires a very different kind of analysis than the decision to use literature-based discussion groups. So, analysis should be done periodically throughout the ID process, just in time before you need to make a design decision. And you should do some kind of formative evaluation after each major design decision – usually through expert review. Therefore, we offer guidance for conducting cycles of analysis, synthesis (or design), and formative evaluation. This way you avoid doing analysis that ends up never being used, you get the analysis information just when you need it, so it is still fresh in your mind, and you get the formative evaluation information and revision information soon enough to benefit future design decisions. This is good formative-design practice. 7. Rapid prototyping. Another feature of our Holistic 4D model is rapid prototyping (Desrosier, 2011; Pham & Dimov, 2003; Tripp & Bichelmeyer, 1990). While rapid prototyping is not new, we offer guidance for two kinds of rapid prototyping. One entails reducing the amount of instruction in the rapid prototype – a quantitative reduction. It is a small but representative part of the whole instruction under development. The other kind entails reducing the fidelity of the instruction – a qualitative reduction. For example, maybe you just create storyboards for a “quick and dirty” version of a video or computer-based simulation to initially formatively evaluate the prototype. Of course, you can use a combination approach to rapid prototyping, as well, utilizing both kinds. 8. Designer and learner objectives. Another innovation has to do with the objectives. Objectives serve two purposes: to help designers design, and to help learners learn. But each of these audiences needs different information in the objective. For example, designers benefit from knowing the behaviors, conditions, standards, and test instrument specs when designing instruction for tasks. In contrast, learners benefit from a general sense of what is to come. This can take two forms: abstract objectives, which are most common and just describe the desired performance, and demonstration objectives, which show the desired performance being done and thereby tend to be more effective and motivating. We offer guidance as to when and how to use each of these kinds of objectives. 9. Performance analysis. In training contexts, instruction is not always the solution to performance problems. Such problems may be due to poor motivation,
inadequate tools or materials, poor processes, or other factors. Therefore, we offer guidance for conducting performance analysis (Rossett, 2009) and selecting appropriate interventions to improve performance. 10. Continuous evaluation. The last innovation I’ll talk about here is continuous formative evaluation – the heart and soul of formative design. We offer guidance for several kinds of evaluation, including both formative and summative. Formative evaluations should be done for the instruction you are designing, but you shouldn’t wait until all the design work is done. Rather, such evaluation should be done throughout the design process, as I mentioned earlier for each cycle of analysis, synthesis, and evaluation and for each rapid prototype you design. This way, you can avoid repeating the same design mistakes during the rest of the design process. But it is also important to formatively evaluate your version of the Holistic 4D Model’s ID process throughout the project and into future projects, so you can constantly improve your ID process. We offer guidance for all these kinds of evaluation and when to do each.

The Holistic 4D Model

Given these innovations, what does the Holistic 4D Model look like? Well, of course, it has the 4 Ds – four major phases, but there is overlap and recursion among these phases (see Fig. 2.1). It also has smaller interactive cycles of activity within each of these four phases. In the case of Define, the cycle entails analyzing, defining, and evaluating. In the case of Design, the cycles entail analyzing, designing, and evaluating, but this work is done on three levels (as described earlier), to provide a holistic approach to the design process. For Develop, the cycles just entail developing and evaluating, and some of this development work is done during the design process when appropriate. We also offer guidance for the four different kinds of formative evaluation: expert review, one-on-one, small-group, and field test. Finally, for Deliver, the cycle entails implementing, managing, and evaluating. These delivery considerations are often overlooked in many ID models.

Fig. 2.1  The Holistic 4D Model

Formative Design in the Model

There are several ways that formative design is factored into the Holistic 4D Model. First, as I mentioned earlier, rapid prototyping tailors the ID process to early and frequent formative evaluation. Both approaches to rapid prototyping – quantitative and qualitative reduction – allow formative evaluation to be done earlier in the ID process. The evaluation results not only help you improve the design just evaluated, but they also inform future designs that are similar to this one. Second, we offer guidance for engaging in design cycles that begin with just-in-time analysis before each design activity and immediate formative evaluation and
revision at the end of each design activity, as warranted. In fact, these formative design cycles occur in all four phases of the ID process – all four Ds. In the Define Phase, you formatively evaluate and revise the definition of the project. In the Design Phase, you formatively evaluate and revise your design decisions on each of the three levels of design. A major choice you need to make is how large a chunk of the design you evaluate in one cycle. The larger it is, the less often formative evaluation and revision will occur, which could save some time and money. But less frequent formative data and revision may allow problematic designs to be replicated, causing more wasted time and money. You also need to decide whether to use expert reviews, one-on-one learner trials, and/or small-group learner trials - based primarily on time and budget. In the Develop Phase, you formatively evaluate and revise each chunk of the design product. Again, you need to decide how much of the instruction to develop before you evaluate it. And, now that you have a product, you can do actual field tests, as well as the quicker and easier expert reviews, one-on-one learner trials, and small-group trials. Finally, in the Deploy Phase, formative evaluation and revision should continue to be done on a regular basis. Summative evaluation may also be important occasionally for decisions about whether or not to discontinue the instructional system, but we recommend no summative evaluations during at least the first three iterations of deployment, so there is enough time to work out the bugs in the system.

These tools for formative design in the Holistic 4D Model (rapid prototyping and design cycles) should provide information about the strengths of each chunk of instruction you design, its weaknesses, and possible improvements, through interviews with experts and/or trials with learners. We offer guidance for using the evaluation data to improve your current design and future designs that are similar to the current one. Possible improvements are evaluated in further iterations of formative evaluation of similar designs that you create in the future. Later improvements may lead you to cycle back and make revisions in earlier designs. A third use of formative design in the Holistic 4D Model is ongoing formative evaluation of the version of its ID process that you use in your context. Every ID model needs to be tailored to each project you undertake. This use of formative design allows you to improve your application of the Holistic 4D Model sooner rather than later during a project, which saves time and money in your ID process, as well as improving the quality of the design product. As with your designs, formative evaluations of your ID process identify its strengths, weaknesses, and possible improvements for your project through reflections of your designers. And we offer guidance for using the evaluation data to improve the ID process for your subsequent use of that part of the Holistic 4D Model – tailored to your situation.
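
For readers who think in code, the cycle logic described in this section can be sketched schematically. The following Python fragment is purely illustrative – the function and field names are ours, not part of the Holistic 4D Model’s guidance – and it simply shows a chunk of instruction passing repeatedly through just-in-time analysis, design, and formative evaluation until no further weaknesses are found or the revision budget is spent.

from dataclasses import dataclass, field
from typing import Callable, Dict, List

@dataclass
class Chunk:
    """One chunk of instruction moving through formative design cycles."""
    name: str
    design: Dict = field(default_factory=dict)
    findings: List[str] = field(default_factory=list)

def formative_cycle(chunk: Chunk,
                    analyze: Callable[[Chunk], Dict],
                    design: Callable[[Chunk, Dict], Dict],
                    evaluate: Callable[[Chunk], List[str]],
                    max_rounds: int = 3) -> Chunk:
    """Run analysis, synthesis (design), and formative evaluation on one chunk,
    revising until the evaluation surfaces no weaknesses or the round budget runs out."""
    for _ in range(max_rounds):
        info = analyze(chunk)                # just-in-time analysis: only what this decision needs
        chunk.design = design(chunk, info)   # make or revise the design decision
        chunk.findings = evaluate(chunk)     # formative evaluation, e.g., expert review notes
        if not chunk.findings:               # nothing left to fix, so move to the next chunk
            break
    return chunk

The same skeleton applies whether the chunk is a project definition, a design decision, or a developed product; only the analyze, design, and evaluate steps change.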

Research on Formative Design

The AECT Summer Research Symposium was created because of concerns about the quality and usefulness of research in AECT and our field in general. Peter Honebein and I have recently published several articles and a chapter about these concerns and have offered some suggestions for addressing them (Honebein & Reigeluth, 2020a, 2020b, 2021). These suggestions are highly relevant to research on formative design, so I will highlight the most important ones here. First is the importance of building design theory about formative design rather than descriptive theory, because design theory offers guidance for selecting methods to achieve goals – it is instrumental. This means we need to develop more guidance about how to do formative design effectively and efficiently. Second, it is important to understand the S curve of knowledge development (Branson, 1987). This pattern of development is valid for hard and soft technologies and for the development of systems of all kinds. Our interest is in methods for formative design. When first developed, a method begins relatively low on the performance scale, the vertical dimension of this chart (see Fig. 2.2). Through research, improvements in its performance gradually increase, and their rate of increase accelerates to a period of rapid improvement. Then improvements slow down as performance approaches the upper limit of performance for that method. One of the important insights from understanding this pattern of development is that it is almost never useful to compare an older method that has approached its upper limit with a newer method that is near the bottom of its S curve. Doing so could result in abandoning the newer method when, given time and resources for
further development, it could have significantly outperformed the older method. Another insight is recognizing that it is important that our research moves the newer method along its S curve, rather than doing research to compare the newer method to the older one. Therefore, it is important for research on formative design to focus on improving the guidelines rather than on proving them. Finally, we know that no method works best all the time. Different methods work best in different situations. So, it is important for our research to carefully document the situational variables (situationalities) prevailing in each research study. The knowledge we generate should help us understand which methods should be used in which situations.

Fig. 2.2  The S curve of development
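
As a purely illustrative formalization – ours, not Branson’s or this chapter’s – the S-curve pattern can be written as a logistic function in which performance rises slowly at first, accelerates, and then levels off as it approaches the method’s upper limit:

P(t) = P_{\min} + \frac{P_{\max} - P_{\min}}{1 + e^{-k\,(t - t_{0})}}

Here P_max stands for the upper limit of performance for the method, P_min its starting level, t_0 the midpoint of the period of rapid improvement, and k the rate of improvement. Reading the curve this way makes the comparison problem obvious: an older method sitting near its P_max will outperform a newer method still near P_min, no matter how much higher the newer method’s P_max may eventually prove to be.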

Conclusion

Formative design is a powerful tool for increasing the effectiveness and efficiency of the ID process. The challenge is figuring out how best to do formative design, understanding that different ways of doing it are going to be preferable in different situations. No single method is best for all situations. While there are some principles for doing formative design that are pretty much universal, our field needs to focus on advancing knowledge about which variations of which methods of formative design are most appropriate for different situations. Yunjo An and I have developed the Holistic 4D Model to incorporate the best available knowledge about using formative design in an ID project. But more knowledge is needed. We encourage researchers to work in this area and to please let us know of any improvements you can think of for the Holistic 4D Model’s formative design guidance.

References

Ausubel, D. P. (1968). Educational psychology: A cognitive view. Holt, Rinehart & Winston.
Banathy, B. H. (1991). Systems design of education: A journey to create the future. Educational Technology Publications.
Banathy, B. H. (1996). Designing social systems in a changing world. Plenum Press.
Branson, R. K. (1987). Why the schools can’t improve: The upper limit hypothesis. Journal of Instructional Development, 10(4), 15–26.
Desrosier, J. (2011). Rapid prototyping reconsidered. Journal of Continuing Higher Education, 59, 135–145.
Francom, G. M. (2017). Principles for task-centered instruction. In C. M. Reigeluth, B. J. Beatty, & R. D. Myers (Eds.), Instructional-design theories and models, volume IV: The learner-centered paradigm of education (Vol. IV, pp. 65–91). Routledge.
Honebein, P. C., & Reigeluth, C. M. (2020a). Making good design judgments via the instructional theory framework. In J. K. McDonald & R. E. West (Eds.), Design for learning: Principles, processes, and praxis (1st ed.). EdTech Books. https://edtechbooks.org/id/
Honebein, P. C., & Reigeluth, C. M. (2020b). The instructional theory framework appears lost. Isn’t it time we find it again? Revista de Educación a Distancia, 20(64). https://doi.org/10.6018/red.405871
Honebein, P. C., & Reigeluth, C. M. (2021). To prove or improve, that is the question: The resurgence of comparative, confounded research between 2010 and 2019. Educational Technology Research and Development, 69, 465–496. https://doi.org/10.1007/s11423-021-09988-1
Huh, Y., & Reigeluth, C. M. (2017). Designing instruction for self-regulated learning. In C. M. Reigeluth, B. J. Beatty, & R. D. Myers (Eds.), Instructional-design theories and models, Vol. IV: The learner-centered paradigm of education (pp. 243–267). Routledge.
Jonassen, D. H. (2011). Learning to solve problems: A handbook for designing problem-solving learning environments. Routledge.
Keller, J. M. (1987). Strategies for stimulating the motivation to learn. Performance & Instruction, 26, 1–7.
Keller, J. M. (2010). Motivational design for learning and performance: The ARCS model approach. Springer.
Marzano, R. J., Norford, J. S., Finn, M., & Finn, D. (2017). A handbook for personalized competency-based education. Marzano Research.
McClelland, D. C. (1987). Human motivation. Cambridge University Press.
McCombs, B. L. (2013). The learner-centered model: From the vision to the future. In J. H. D. Cornelius-White, R. Motschnig-Pitrik, & M. Lux (Eds.), Interdisciplinary handbook of the person centered approach: Connections beyond psychotherapy. Springer.
McCombs, B. L., & Whisler, J. S. (1997). The learner-centered classroom and school: Strategies for increasing student motivation and achievement. Jossey-Bass Publishers.
Pham, D., & Dimov, S. (2003). Rapid prototyping: A time compression tool. Ingenia, 17, 43–48.
Reigeluth, C. M. (1999). The elaboration theory: Guidance for scope and sequence decisions. In C. M. Reigeluth (Ed.), Instructional-design theories and models: A new paradigm of instructional theory (Vol. II, pp. 425–453). Lawrence Erlbaum Associates.
Reigeluth, C. M. (2007). Order, first step to mastery: An introduction to sequencing in instructional design. In F. Ritter, J. Nerb, E. Lehtinen, & T. O’Shea (Eds.), In order to learn: How the sequence of topics influences learning (pp. 19–40). Oxford University Press.
Reigeluth, C. M. (2012). Instructional theory and technology for the new paradigm of education. RED, Revista de Educación a Distancia, 32. http://www.um.es/ead/red/32
Reigeluth, C. M., & An, Y. (2021). Merging the instructional design process with learner-centered theory: The holistic 4D model. Routledge. www.reigeluth.net/holistic-4d
Reigeluth, C. M., Beatty, B. J., & Myers, R. D. (Eds.). (2017). Instructional-design theories and models, volume IV: The learner-centered paradigm of education. Routledge.
Reigeluth, C. M., & Darwazeh, A. (1982). The elaboration theory’s procedure for designing instruction: A conceptual approach. Journal of Instructional Development, 5(3), 22–32.
Reigeluth, C. M., & Karnopp, J. R. (2020). Vision and action: Reinventing schools through personalized competency-based education. Marzano Research.
Reigeluth, C. M., & Rodgers, C. A. (1980). The elaboration theory of instruction: Prescriptions for task analysis and design. NSPI Journal, 19(1), 16–26.
Romiszowski, A. J. (1981). Designing instructional systems: Decision making in course planning and curriculum design. Nichols Publishing.
Romiszowski, A. J. (1986). Developing auto-instructional materials: From programmed texts to CAL and interactive video. Nichols Publishing.
Rossett, A. (2009). First things fast: A handbook for performance analysis. Pfeiffer.
Savery, J. R. (2009). Problem-based approach to instruction. In C. M. Reigeluth & A. A. Carr-Chellman (Eds.), Instructional-design theories and models: Building a common knowledge base (Vol. III, pp. 143–165). Routledge.
Tripp, S., & Bichelmeyer, B. (1990). Rapid prototyping: An alternative instructional design strategy. Educational Technology Research & Development, 38(1), 31–44.
Voorhees, R. A., & Bedard-Voorhees, A. (2017). Principles for competency-based education. In C. M. Reigeluth, B. J. Beatty, & R. D. Myers (Eds.), Instructional-design theories and models, volume IV: The learner-centered paradigm of education (Vol. IV, pp. 33–63). Routledge.
Watson, W. R., & Watson, S. L. (2017). Principles for personalized instruction. In Instructional-design theories and models, volume IV: The learner-centered paradigm of education (Vol. IV, pp. 93–120). Routledge.
Widmayer, S. A. (2004). Schema theory: An introduction. https://pdfs.semanticscholar.org/47b1/5487db915f62aec7a1f57c6f64c0c1c5234f.pdf

Chapter 3

Intern Observations and Reflections From the 2022 AECT Summer Research Symposium

Vanessa Johnson and Alyse Harris

Abstract  In this chapter, we reflect on our experience as interns during the 2022 AECT Summer Research Symposium. The two-day symposium follows a unique iterative and formative structure, where authors engage in multiple rounds of critique, revision, and reflection to improve their manuscripts for a collective edited volume. We describe the structure of the symposium from our perspective as interns, which includes activities such as Pro Action Café discussions, critical thinking activities at an art museum, and a collaborative reflection activity using sticky notes to discuss the definition and characteristics of formative design. The symposium fosters a supportive environment for scholarly input and feedback, contributing to the development of high-quality scholarship. We conclude by acknowledging the malleability and flexible nature of formative design and the success of the Pro Action Café format in facilitating productive discussions and feedback during the symposium.

Keywords  Formative design · Iterative feedback · Research symposium · Manuscript development · Intern reflection

The Association for Educational Communications and Technology (AECT) Summer Research Symposium (SRS) is a unique two-day symposium that allows attendees to collaboratively refine their ideas around a novel theme. The SRS has a unique structure because it is not just an event that concludes at the end of these two days, but it is also a year-long manuscript development process from proposal to published chapter in a collective edited volume. The book editors also act as facilitators during the two-day symposium. In terms of scholarship, the SRS tends to attract a range of attendees such as faculty and graduate students from a range of fields,
instructional designers, and practitioners from industry, higher education, and non-­ profit organizations. The AECT SRS began in 2006 with a goal to facilitate deep, meaningful discussions about current research in the instructional design field based on a predetermined theme. From 2010 until 2018, the SRS met in-person bi-annually. Since 2019, the SRS has met annually, alternating between online and in-person. However, in 2020 and 2021, the symposium met virtually due to the COVID-19 pandemic. In 2022, the symposium returned to a two-day in-person event at Indiana University in Bloomington, Indiana. The theme of the 2022 SRS was Formative Design in Learning: Design Thinking, Growth Mindset, and Community. Formative design is a relatively undefined topic in the instructional design field. In the Journal of Formative Design in Learning’s first issue, the editor-in-chief, Dr. Robert Kenny, stated in the introduction that “we have noticed there does not exist one central idea as to what ‘formative’ really connotes” (2017, p. 1). Despite the focus on defining formative design, the literature that comprehensively defines this term is sparse. Formative design is also described as an iterative design and evaluation process that is continually refined and evaluated through testing phases until the design is converged into a product (Calongne et al., 2019, p. 200).

Symposium Structure

The idea of formative design aligns well with the structure of the symposium to support the development of the collective, edited volume. Prior to the two-day symposium, authors drafted a proposal that was reviewed by the editorial team, and if the proposal was accepted, authors were then invited to prepare a draft manuscript. The draft was sent to the other authors for review prior to the symposium. Then, during the symposium, authors engaged in a series of activities to receive feedback on their manuscripts. After the symposium, authors would then fully expand on their manuscripts for the next stage in the review process. If the manuscripts were accepted, they were extended as chapters in an edited book dedicated to formative design and its process. As mentioned previously, during the 2022 symposium attendees engaged in pre-planned activities to more deeply explore the topic of formative design and to receive feedback on their manuscripts. These activities included:
• Participating in multiple rounds of “Pro Action Café” style discussions in order to help each author further develop their manuscript
• Engaging in a Visual Thinking Strategy activity
• Brainstorming and reflecting on a definition of formative design
Each activity will be described in the following sub-sections.

Developing Chapter Manuscripts Using the Pro Action Café Format

A foundational element of the SRS is the Pro Action Café format for generating discussion about attendees’ in-formation manuscripts. This format is a combination of the World Café (Owen, 2008) and Open Space Technology (Herman, n.d.) concepts that are similar in format and designed for engaging participants in active discussions at meetings and events. A schedule of conversation sessions is followed with the goal of providing feedback on a presented project or work. In the general discussion structure of the SRS, the same method was used. Two Pro Action Café sessions took place each morning, and two took place each afternoon throughout the symposium. Prior to arriving at the symposium, attendees were encouraged to read the manuscripts. Then during the symposium, each manuscript was discussed twice in order to provide authors with a variety of perspectives from other attendees and to allow attendees to participate in discussions of a wider range of topics. The discussion structure used during each session was as follows:
• Introductions: (5 min)
• Statement: (5 min) for the presenter to describe the research (without interruption)
• Clarify: (5 min) for the participants to ask clarifying questions of the presenter (with response from the presenter)
• Incubate: (10–15 min) for the participants to discuss the research (without any input from the presenter), and finally
• Rejoin: (5–10 min) for the presenter to become ungagged and respond, ask questions, and/or summarize what they’ve heard.
This structure provided a generous opportunity for authors to gain scholarly input, insight, and general feedback from fellow attendees that was iterative within the 35–40 min table sessions. During the Pro Action Café table sessions, attendees were encouraged to provide feedback to the authors on how to strengthen their papers. Consistent with Participatory Learning and Action Principles (Chambers, 2007), attendees of the symposium presented, shared, analyzed, reflected, evaluated, and planned to act through this exchange of ideas. Repeated suggestions related to reframing theoretical concepts, adding theoretical framing to a narrative paper, adding further discussion on future goals, and providing more information to the reader surrounding thought processes and design decisions. To a lesser extent, some themes of the Pro Action Café table discussions included making more explicit the relevance of the paper to formative design, reorganizing or condensing sections of the paper, and providing clear definitions or context of introduced terms and concepts in the writing. The critique and conversation among colleagues, with the shared goal of strengthening and improving the authors’ writing, is representative of a formatively organized assignment. The structure and the theme offered a real-time demonstration of formative design in action. Because the goal of the symposium is to generate high-quality
scholarship, the opportunity to participate in several rounds of research writing critique, revision, and reflection is a formative process. Discussing, note-taking, asking clarifying questions, critiquing, providing feedback, and reflecting were interactive and engaging.
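
To make the timing of the format easy to reuse, the discussion structure above can be captured in a few lines of Python. This sketch is our own illustration rather than part of the symposium’s materials; where the original structure gives a range, a midpoint duration is assumed.

# Phases and durations from the session structure described above;
# 12 and 8 minutes are assumed midpoints for the 10-15 and 5-10 minute ranges.
SESSION_PHASES = [
    ("Introductions", 5),
    ("Statement: presenter describes the research", 5),
    ("Clarify: participants ask clarifying questions", 5),
    ("Incubate: participants discuss while the presenter listens", 12),
    ("Rejoin: presenter responds and summarizes", 8),
]

def print_agenda(phases=SESSION_PHASES):
    """Print a running agenda with each phase's start time in minutes."""
    elapsed = 0
    for name, minutes in phases:
        print(f"{elapsed:>2} min  {name} ({minutes} min)")
        elapsed += minutes
    print(f"Total table session: {elapsed} min")

if __name__ == "__main__":
    print_agenda()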

Critical Thinking Activity at the Art Museum

The attendees engaged in an activity to support critical and creative thinking. First, the group viewed a painting by Diego Velázquez titled Las Meninas and was then asked to evaluate the painting using the questions from Visual Thinking Strategies (Housen, 2002). Dr. Brad Hokanson led the group through the activity by introducing each of the following questions:
1. What is going on in this picture, sculpture, etc.?
2. What makes you say that?
3. What more can we find? (Housen, 2002, p. 100).
After the group discussion, attendees were invited to divide into small groups of 2–4 people and visit the Sidney and Lois Eskenazi Museum of Art on the Indiana University campus. The groups were allotted 1 hr to explore the museum and use the activity questions to discuss the artwork. Several attendees took photographs of the artwork they discussed and shared their observations with other attendees. These conversations continued into the roundtable discussions, acting as an inspiration for integrating the critical thinking questions into the research paper feedback by promoting an ongoing generative process of further questioning among attendees. Housen (2002) discussed the desired learning outcomes of these three questions as a “‘critical thinking studio’ in which learners observe carefully, evaluate, synthesize, justify and speculate” (p. 101). Similar to the goal of expanding these ideas in art, these learning outcomes can be applied to research.

Defining Formative Design: The “Sticky Note Activity”

The third component of the SRS consisted of a collaborative reflection activity enabling attendees to partake in meaning construction of formative design and its processes with the objective of progressing toward a collective understanding of the term. This collaborative activity occurred using sticky notes as communicative devices between attendees as they expanded upon each other’s posted ideas on the activity wall. Mirroring the theme of the 2022 SRS, the activity was also part of the formative process. Both at designated times and informally throughout the symposium, attendees used sticky notes to respond to prompts that asked them to define formative design, reflect on the symposium activities, and respond to other attendees’
comments. Each ideation session lasted 10–20 min. Attendees could also respond by adding a new sticky note beneath another to agree, disagree, refine, expand upon, or ask a question. If attendees had ideas or contributions outside of the designated time, the facilitators encouraged them to capture their thoughts on sticky notes and add them to the wall. The posted sticky notes were then arranged into groups such as “reflection,” “questions,” “definition,” and “feedback/data.” Attendees actively participated in formative design as they collectively, collaboratively, and iteratively developed a description of formative design. During these engagements, it became apparent that formative design is a process. The term “process” became a descriptive factor and a characteristic of formative design. Several other terms emerged in the sticky notes and subsequent conversations. High-­ frequency terms used throughout the symposium by attendees and facilitators and written on brainstorming sticky notes were “iterative,” “feedback,” “design,” “formative,” and “repetitive.”

Final Thoughts

Attending the 2022 SRS as an intern and engaging with the formative feedback cycle from various perspectives was educational and worthwhile as a student of Instructional Systems Technology and as a developing academic, researcher, and writer. Observing and participating in the formative process alongside other attendees, while working collaboratively to create, edit, recreate, and generate ideas, is a productive way to embody the characteristics associated with the formative design process. The Pro Action Café was a fitting choice for the symposium and was helpful in prompting and sustaining the instrumental conversations taking place.

Redefining Formative Design

Formative design is not easily defined and has a malleable, amorphous range of characteristics that take form through application. Staying true to the approach modeled during the research symposium, we continue to strive towards a definition that embodies the formative design process by revisiting the endeavor of outlining its conceptualization. The components of the formative design process were also somewhat undefined. Attendees generally agreed upon several characteristics of formative design (iteration, assessment, and evaluation), although these characteristics were not explicitly formalized into a model or framework. Another part of the formative design conversation at the research symposium was the idea of “failure,” and what it means within the formative design context. Through the discussion, the axiom from IDEO founder, David Kelley, “we fail faster to succeed sooner” (Muoio, 1997) was referenced to illustrate the relationship
between failure and success in formative design. Attendees underscored that in the design process, failure is expected. The term “failure” was not used lightly and was at the center of some questioning or clarification throughout the conference. Failure was mostly associated with failing forward. That is, we repeatedly fail throughout the formative design process to continue making constructive progress.

The Success of the Pro Action Café

The 2022 SRS successfully integrated the Open Space Technology and Pro Action Café formats to create an iterative, interactive symposium. This collaborative, generative structure encouraged attendees to engage with each other, benefiting not only the author of the manuscript being discussed but also the reviewers providing the feedback. As attendees changed roles from author to reviewer and vice versa, they also gained perspective on ways to critique research writing and research design and to communicate those ideas with their peers. Through ongoing small group discussions, authors received feedback to improve their writing, and we observed attendees engaging in critical thinking, critiquing, synthesizing, and analyzing research. Despite slight adaptations to the Pro Action Café structure in some groups, attendees had similar experiences and productive conversations. The interactive process of critique and feedback mirrored the formative design process. As attendees engaged in discussion, they supported each other in a formative design process. The symposium successfully improves scholarship through feedback and provides space for attendees to explore new concepts in the instructional design field. The Pro Action Café’s environment was a harmonious one in the case of the SRS. Incorporating the Open Space Technology and Pro Action Café formats within the structure and schedule of the SRS served as a productive facilitation method for discussion. The generation of feedback, thoughts, and notes for the authors to consider in editing their writing was consistent with collaboration, cooperation, participatory learning and action, and the formative design process. In our observations of the research symposium’s table discussions, it was evident that the iterative cycle of feedback, criticism, and questioning offered to authors was delivered with respectful hesitation. This may reflect uncertainty and partial understanding of the research writings, as some attendees had not fully read the papers submitted for review. From our experience at the SRS, we tentatively conclude that formative design is a living, flexible, expandable, changeable process. Formative design applies itself to several areas of study and practice, resulting in a characteristically indefinite concept that resists a static, narrow designation that might otherwise restrict the way it exists.

Reflection

Reflecting on the organization of the SRS and its use of the Pro Action Café, sticky note brainstorming activities, and paper feedback structure and timing, we conclude that this use of time and space was very productive and effective for the purposes of the SRS. The symposium’s time was dedicated to manuscript improvement in every way. The environment brought together educated and accomplished attendees who were qualified to contribute feedback and ideas to authors seeking honest, critical, and valuable responses and guidance on their research and academic writing. Anyone attending the SRS as an author receives a value-added return on the time spent among peers, as the experience is unique and distinct from that of conferences. We recommend that the SRS continue to use this approach, as the environment was conducive to creative idea formation, collaborative thinking, and especially formative feedback and design.

References

Calongne, C., Stricker, A. G., Truman, B., & Arenas, F. J. (2019). Cognitive apprenticeship for teaching computer science and leadership in virtual worlds. In A. Stricker, C. Calongne, B. Truman, & F. Arenas (Eds.), Recent advances in applying identity and society awareness to virtual learning (pp. 180–200). IGI Global. https://doi.org/10.4018/978-1-5225-9679-0.ch010
Chambers, R. (2007). From PRA to PLA and pluralism: Practice and theory. Institute of Development Studies. https://opendocs.ids.ac.uk/opendocs/handle/20.500.12413/660
Herman, M. (n.d.). What is open space technology? OpenSpaceWorld.org. https://openspaceworld.org/wp2/what-is/
Housen, A. C. (2002). Aesthetic thought, critical thinking and transfer. Arts and Learning Research Journal, 18(1), 99–132.
Kenny, R. (2017). Introducing journal of formative design in learning. Journal of Formative Design in Learning, 1(1), 1–2. https://doi.org/10.1007/s41686-017-0006-0
Muoio, A. (1997, August 31). They have a better idea…do you? Fast Company. https://www.fastcompany.com/29116/they-have-better-idea-do-you
Owen, H. H. (2008). Open space technology: A user’s guide. Berrett-Koehler Publishers, Inc.

Chapter 4

Closing the Professional Learning Loop: Designing for Performance Improvement

Rita Fennelly-Atkinson, Courtney L. Teague, and Jillian Doggett

Abstract  Organizations can use a variety of methods to develop professional learning that improves performance. Formative design can be used effectively as part of research-based design methods to develop learning models that support performance improvement. Developed to serve secondary school educators in the United States, the Learner in Action Model (LAM) incorporates iterative processes and feedback loops that are focused on supporting specific changes in learner behavior. People in organizations that understand their people’s intrinsic needs for learning will gain proficiency, increase the creative application of learning, enhance performance improvement outcomes, and develop ownership of the performance improvement process (Senge, 2006). While developed for adult learners in a K-12 learning environment, the LAM can be applied and modified to meet the needs of adult learners in various contexts.

Keywords  Performance improvement · Professional learning · Formative design · Learning models · Instructional design

Closing the Professional Learning Loop: Designing for Performance Improvement

Organizations often struggle with professional learning programs that do not clearly connect performance improvements with overall organizational outcomes (Kirkpatrick & Kirkpatrick, 2016). This design case examines the development of a
formative design process, which was subsequently used to develop an improved professional learning delivery model. Formative design is a type of iterative design process that uses timely evaluation throughout the design process. It responds to the evaluation with refinement, further evaluation, and iterative teaching and learning strategies (Calongne et al., 2019). A formative design process can be leveraged to integrate elements of learner experience design and agile project management into traditional instructional design methods toward achieving performance improvements. This responsive process was used to develop the Learner in Action Model (LAM) to close the professional learning loop by ensuring that instruction resulted in implementation and performance improvement for adult learners. For the purposes of this design case, the adult learners are secondary educators.

The Design Context

This chapter addresses the design team’s journey in rethinking the models used to design and deliver professional learning. Examining the period between 2018 and 2022, the design team engaged in an array of choices informed by program constraints and results, as shown in Table 4.1. The professional learning initiative served secondary educators in U.S. public schools who had committed to engaging in a multi-year program to support the effective use of 1:1 devices with students during instruction. As part of this program, educators also committed to engaging in ongoing professional learning to support the effective implementation of pedagogical strategies to enhance technology integration.

Table 4.1  Timeline and overview of professional learning methods, design, constraints, and results outcomes

2018–2019 Academic year
Delivery model:
• Three 4-hour face-to-face workshops delivered on campus during the academic year
Constraints:
• Required learning seat time
• Incompatible schedules between schools and organization
• Competing initiatives and priorities for recipients
Results:
• Slow and inconsistent implementation of practices
• Too much variability in professional learning delivery
• Learning experiences did not address learner variability

2019 Summer program evaluation and design
Design model:
• ADDIE process was used to inform professional learning design iterations using the following data: learner outcomes, observations, focus groups, research and evaluation data
Constraints:
• Program required continuing the existing delivery model
• Virtual delivery of professional learning was allowed on a case-by-case basis
Results:
• Professional learning topic tracks based on school needs were identified
• Consistent professional learning format and facilitator guides were developed

2019–2020 Academic year
Delivery model:
• One 4-hour face-to-face workshop delivered on campus during the academic year
• One virtual 2-hour workshop delivered
• COVID-19 disruptions shifted the third workshop into responsive school support
Constraints:
• Competing local school board learning time restrictions
• Educators’ at-home broadband and device access
• Rapid shift to delivering professional learning experiences fully online
Results:
• Data indicated an increased need for professional learning and support
• Need and interest in engaging in virtual professional learning options increased
• On-site instructional coaches took greater ownership of facilitating professional learning experiences and took a more active role in the follow-up to the implementation of learning into practice

2020 Summer program evaluation and design
Design model:
• Design process was adjusted and updated based on multiple research-based methods
• Formative process was used to engage in ongoing agile and iterative design sprints that were responsive to ongoing data collection, which included learner outcomes, observations, focus groups, research, and evaluation data
Constraints:
• Program required an updated professional learning virtual delivery model
• Professional learning tracks were formalized to focus on high-interest and high-need topics
• Professional learning required some synchronous training to be conducted
Results:
• The professional learning goals and outcomes were updated
• The content was updated and redesigned to accommodate synchronous and asynchronous virtual delivery
• Learning design included increased opportunities for continued learning for choice and application
• Workflows incorporated ongoing, iterative, and data-informed design decisions

2020–2022
Delivery model:
• The LAM prototype was deployed and iterated using the updated formative design process
Constraints:
• Teacher turnover due to leaving the field
• Short-term leave due to COVID-19
• Decreased availability of professional learning team
Results:
• The LAM prototype was developed and refined, as shown in Fig. 4.2
• Preference for virtual professional learning increased
• Preference for increased autonomy stated
• Program data indicated that the model was effective

Design and Delivery Models

Because the program primarily served educators, the design cycle typically followed the academic calendar. Professional learning was delivered during the academic year, which meant that the design team was able to access cumulative summative data toward the end of the school year. Prior to the COVID-19 pandemic, the team used the analysis, design, development, implementation, and evaluation (ADDIE) model to update the professional learning for the subsequent academic year. During this time, educators participated in professional learning delivered in three 4-hour face-to-face learning workshops. Due to the need to be responsive to fluctuating and emergent circumstances caused by the pandemic, the team repeated and adjusted the professional learning design to better meet the needs of the educators. As the models were adjusted, the team found that the professional learning results improved despite changing constraints.

Constraints and Results

Constraints inform the choices and methods used in the instructional design process (Weston et al., 1995). Formative methods can be used to address them (Weston et al., 1995). As Table 4.1 shows, the designers faced programmatic, contextual, and situational constraints. The results represent the observed outcomes based on a variety of data sources and the subsequent major design decisions. The original design addressed pre-pandemic program constraints and preferences. However, the professional learning evaluation data revealed that earlier delivery models required an extensive time commitment from learners, lacked scaffolding, and did not productively move educators from knowledge to on-the-job application. Changing participant behaviors is the most important aspect of any training program, and is often the most challenging (Kirkpatrick & Kirkpatrick, 2016). As a result, the designers decided to reimagine the design process and the virtual learning experience to leverage strategies based on current learning sciences research.

Rethinking the Design Process

After some reflection, the design team determined that they were engaging in process- or product-oriented design methods (Boller & Fletcher, 2020; Solomonson, 2008). Rather than centering the learner, the process was prioritizing the subject-matter expert and designer. By moving toward a process model, the design team hoped to use collaboration to refocus attention on the learner (Boller & Fletcher,
2020; Solomonson, 2008). By rethinking the process, the team hoped to address a number of challenges within the design process:
1. A pragmatic and efficient method to address emergent issues.
2. A process that prioritized research-based and proven techniques.
3. A practice that incorporated and elevated learner feedback.
4. A systematic strategy to center learner needs equitably and inclusively.
5. A method of collecting and using formative data.

Pragmatic and Efficient Designing Methods

The ADDIE model was used as a starting point for reimagining the design process. This model represents the characteristics of the instructional design process considered to be most effective (Roblyer, 2015). The ADDIE model uses a summative process to restart the design cycle, and it can be integrated with formative data and design elements to be responsive to learners’ feedback (Azukas & Gaudelli, 2020; Hokanson & Kenny, 2020; Phillips et al., 2018; Roblyer, 2015). This type of process also offers a clear method for engaging in a systematic instructional design process (Roblyer, 2015). Considered a universal model, ADDIE is often criticized for being linear, despite its recursiveness (Gawlik-Kobylinska, 2018). The model assumes a systematic process that incorporates formative and summative data to revise instruction by restarting the process (Roblyer, 2015). However, some designers found this model insufficient to address the rapidly changing learning contexts during the pandemic (Dong, 2021). Likewise, the team felt the existing design models were inadequate for rapidly fluctuating conditions. In order to remain viable and provide timely learning to support the new demands placed on educators’ talent due to COVID-19–related environmental shifts, the team was encouraged to collaboratively and reflectively learn through the design process (Exter & Ashby, 2022).

Prioritizing Research-Based and Proven Techniques

In evaluating the design process, the team consulted a variety of resources and research, as shown in Table 4.2. These resources were used to initiate critical conversations and reflections about instructional design practices and how to improve them with specific research-based techniques. The design team incorporated design thinking, agile processes, learner experience design (LXD), distance education practices, and learner-centered methods as key additions to an updated design model. First, design thinking was a major addition to the design process. This methodology consists of five iterative stages (empathize, define, ideate, prototype, and test) that were integrated into the team’s processes (Institute of Design at Stanford, 2015).

Table 4.2  Resources consulted to inform the design model

General practices
• To research practices in other fields: Google (2020a, 2020b), Interaction Design Foundation (2014) and Pressman and Maxim (2015)
• To consider effective practices for instruction and coaching: Aguilar (2016, 2020) and Knight (2018, 2020)

Instructional design practices
• To inform instructional design practices: Boller and Fletcher (2020), Kadakia and Owens (2020) and Moore (2017)
• To support learner needs: Boller and Fletcher (2020), France (2020), Digital Promise (2019), Knowles et al. (2005), Moore and Kearsley (2012) and Pullin (2009)

Specific techniques
• To refine evaluation methods: Boller and Fletcher (2020) and Kirkpatrick and Kirkpatrick (2016)
• To integrate equitable and inclusive practices: Chardin and Novak (2021), Fritzgerald (2020), Hammond (2015) and Nuri-Robins et al. (2015)

This process allows designers to engage in a design thinking process, in which a mindset is adopted to inform the development of professional learning (Schmidt & Huang, 2022; Stefaniak & Sentz, 2020). Typically used for software development, an agile process prioritizes continuous improvement with frequent updates, shorter time scales, collaboration with stakeholders, sustainable development, and ongoing design team reflection (Pressman & Maxim, 2015). The team also incorporated scrum principles, in which a backlog (professional learning requirements) was addressed via a scrum process (requirements, analysis, design, evolution, and delivery) during design sprints (Pressman & Maxim, 2015). The professional learning requirements backlog consisted of any component that needed to be updated such as learning maps, learning assets, evaluation instruments, and more. Design sprints varied in length based on the complexity of the component being created or redesigned. For example, when the design team developed empathy maps, they engaged in the following sprint structure:
• Day 1: Reviewed all available data from past and current learners.
• Day 2: Defined trends from the organizational and user perspective.
• Day 3: Ideated potential solutions to select the ones that prioritized user needs, organizational needs, and environmental constraints (Boller & Fletcher, 2020).
• Day 4: Developed an empathy map prototype.
• Day 5: Reviewed prototype with users for alignment and bias.
• Day X: Prototype was tested by the design team at the next available opportunity.
• Day X + 1: Empathy maps were updated based on field-test feedback and results.
In the case of this sprint, the empathy maps were a product for internal use by the design team, which means that members reviewed and tested each other’s learning designs. For prototypes such as learner activities, the design team solicited learner feedback and tested the activities in authentic settings before finalizing designs. Other research-based models including LXD and distance education were also instrumental to rethinking a design process for virtual professional learning. The hallmark of LXD includes human-centered, user experience, and socio-cultural elements (Boller & Fletcher, 2020; Schmidt & Huang, 2022). Meanwhile, distance education principles informed virtual design elements such as course design, media and technology selection, and learner autonomy (Moore & Kearsley, 2012).

Collectively, these iterative and agile techniques supported a formative design process for learning (Calongne et al., 2019).
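
The backlog-and-sprint workflow described above also lends itself to a simple data model. The sketch below is illustrative only; the class, step wording, and backlog items paraphrase the empathy-map sprint example rather than reproduce the team’s actual tooling.

from dataclasses import dataclass, field
from typing import List

@dataclass
class DesignSprint:
    """One backlog component (e.g., empathy maps) worked through a short sprint."""
    component: str
    steps: List[str] = field(default_factory=lambda: [
        "Review all available data from past and current learners",
        "Define trends from the organizational and user perspectives",
        "Ideate solutions that balance user, organizational, and environmental needs",
        "Develop a prototype",
        "Review the prototype with users for alignment and bias",
        "Field-test the prototype at the next available opportunity",
        "Update the prototype based on field-test feedback and results",
    ])
    completed: List[str] = field(default_factory=list)

    def run_next_step(self) -> str:
        """Pop the next step off the sprint plan and record it as complete."""
        step = self.steps.pop(0)
        self.completed.append(step)
        return step

# A hypothetical backlog with two components awaiting design sprints.
backlog = [DesignSprint("empathy maps"), DesignSprint("learner activities")]
for sprint in backlog:
    while sprint.steps:
        print(f"{sprint.component}: {sprint.run_next_step()}")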

Incorporating Learner Feedback

Within any instructional design process, specific types of feedback are collected and used to make revisions at particular points. In a general instructional design model, data is often collected in the analysis and evaluation phase (Roblyer, 2015). A formative design process also begins with data collection to inform the learning model. However, the learning model embeds ongoing data collection to inform feedback loops. Single-loop learning data is used to identify immediate instructional changes, while double-loop learning data is leveraged to adjust the learning design elements for the next iteration of the model (Argyris, 2002). The feedback loops must include key stakeholders, metrics, and various forms of data to support an agile, iterative design process when implementing professional learning to meet specific program and organizational outcomes. For instance, meeting with learning facilitators throughout and after the delivery of all the live learning sessions within a learning experience to gather qualitative and quantitative observational data allows designers to gain critical insights into how learners respond to and interact with the content. Designers can then utilize the data from these conversations to inform the content design and delivery moving forward to make improvements and affirm points of success. In recent years, there has been a growing movement toward formative design learning models. These models prioritize learner feedback and performance improvement data to continuously adjust the learning design. Traditional learning models tend to be more static and focused on delivering a predetermined instructional curriculum (Calongne et al., 2019; Food and Agriculture Organization of the United Nations, 2021; Understood, n.d.). There are several key benefits of formative design learning models. First, they are highly effective at promoting long-term learning and retention. Second, they are very flexible and can be easily adapted to meet the needs of individual learners. Finally, they have been shown to improve learner motivation and engagement. However, feedback from key internal stakeholders should also be included to ensure the model is responsive to organizational outcomes, technical platform limitations, and instructional delivery considerations (Kadakia & Owens, 2020). The design team found that learners expressed increased satisfaction with content, activities, level of choice, and opportunities to implement new strategies and reflect on their teaching practice.
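
A minimal sketch of the two feedback loops described above may help; the routing rule, dictionary keys, and example observations are our own illustration, not the team’s instruments. Single-loop items trigger immediate adjustments to delivery, while double-loop items are queued as design changes for the next iteration of the model.

def route_feedback(items):
    """Split feedback into immediate delivery adjustments (single loop) and
    design changes reserved for the next iteration of the model (double loop)."""
    immediate_adjustments = []    # single-loop learning: fix instruction now
    next_iteration_changes = []   # double-loop learning: revise the learning design
    for item in items:
        if item["scope"] == "delivery":
            immediate_adjustments.append(item["note"])
        else:
            next_iteration_changes.append(item["note"])
    return immediate_adjustments, next_iteration_changes

# Hypothetical facilitator observations, for illustration only.
observations = [
    {"scope": "delivery", "note": "Learners ran out of time in the breakout activity"},
    {"scope": "design", "note": "Self-directed module needs clearer success criteria"},
]
adjust_now, queue_for_next_iteration = route_feedback(observations)
print("Adjust now:", adjust_now)
print("Next iteration:", queue_for_next_iteration)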

Embedding Systematic Learner-Centered Strategies

Learner-centered strategies were also a major aspect of the research-based practices that the design team incorporated into their processes. As the strategies were extensive, they merit a detailed overview. The design process was grounded in
learner needs through surveying, persona development, and empathy map creation, which spurred the development of creative, innovative approaches to learning that resonated with learners, ultimately leading to increased learner engagement (Kahu, 2013). The design thinking framework ensured that the team empathized with users through all stages of the design process. Designers defined the learners’ needs and generated creative solutions to those needs during the empathy stage (Institute of Design at Stanford, 2015). Designers then engaged in a user experience technique in which hypothetical user characteristics were used to develop empathy maps based on learner personas (Ferreira et  al., 2015). These empathy maps supported an understanding of learner needs and their respective environments (Ferreira et al., 2015). Empathy maps also allowed the team to better address learner needs. By deeply understanding the variability of learners, designers were better able to address learners’ needs. According to the World of Work Project (2019), an individual has three driving motivators and basic needs: power, achievement, and affiliation. Therefore, designers must consider these motivators and characteristics of a learner by addressing them in the design to properly engage the learner (World of Work Project, 2019). For example, achievement-motivated learners need challenging projects that allow for a balanced appraisal (World of Work Project, 2019). Likewise, providing opportunities for affiliation through collaboration and power will allow goal-oriented learners to excel (World of Work Project, 2019). Providing different ways that learners could demonstrate skill acquisition and process information in the various components of the professional learning model allowed the design team to address these different motivators and increase learner engagement in the learning tasks. The concept of learner variability posits that all learners present varying needs in different contexts based on content, cognition, social-emotional learning, and learner background factors (Pape, 2018). Designers need to ensure that professional learning will support the varying needs of learners. However, this presents its own set of challenges in supporting technology integration since learners have varying technological capabilities (ATD et al., 2015).

Reimagined Formative Design Process By intentionally integrating research-based practices and techniques, the team was able to meet the organization’s dynamic needs and develop a responsive formative design process that incorporated learner feedback and embedded systematic learner-centered strategies. As shown in Fig. 4.1, the team began with a traditional ADDIE model for designing professional learning, which was later transformed into a completely new method for approaching instructional design.


Fig. 4.1  Updated formative design process

Using the New Design Process to Develop the LAM The hybridized approach of the new formative design process allowed for an agile and responsive approach to the overall instructional design process, which was used to develop the LAM, as shown in Fig.  4.2. The LAM was developed to support multimedia-based distance learning by enabling an entire range of instructional experiences to support anytime, everywhere learning (iNACOL, 2016). The model scaffolded instruction in a manner that allowed learners to be more effective and efficient in their application as they adopted recommended behavioral changes (Kadakia & Owens, 2020). Several key practices ensured that the learning model design benefited learners and designers alike. First, the model ensured that learners had the opportunity to understand the content, apply their learning, and share and receive feedback and support for their work performance. Second, designers proactively identified the types of formative data needed during the professional learning implementation to support responsive design changes. Third, the designers considered how learner needs would be centered within the process to ensure continued engagement and persistence toward performance improvement. Lastly, programmatic data was used to assess the fidelity and success of instructional delivery to ensure the model’s effectiveness. While some of these elements were part of the overall design process, the team determined that it was essential to also document how they were embedded into the LAM.


Fig. 4.2  Learner in Action Model (LAM) developed through the formative design process Note. This figure depicts the LAM developed through the formative design process, which comprises one workshop. All model and learning design components were based on feedback loops that informed the agile and iterative revision process.

As such, the design team also addressed learner variability by embedding these specific practices into the professional learning model: professional learning communities, peer coaching, and instructional coaching to ensure that learners had strong social learning supports (Aguilar, 2016, 2020; Kadakia & Owens, 2020; Knight, 2018; Knight et al., 2020, Knowles et al., 2005; Moore & Kearsley, 2012). A strengths-based approach also provided flexible pacing, regular reflective prompts, and practical applications of authentic and relevant learning (Knowles et al., 2005; Pape, 2018). The LAM included three components, or different types of learning touchpoints, designed to provide learners with multiple ways of engaging with content (Kadakia & Owens, 2020). Component 1 was delivered to allow for an “in-person” experience in a synchronous virtual setting (Kadakia & Owens, 2020). The immediate touchpoint (Kadakia & Owens, 2020) formed the basis of component 2, or the self-­ directed online learning program. Finally, professional learning communities offered learners the opportunity to engage in social learning deeply connected to their context (Kadakia & Owens, 2020).


Component 1: In-Person Virtual Learning The first component consisted of a synchronous virtual learning experience where learners interacted with the presenter and each other in real-time. This component incorporated Gagné’s nine events of instruction to help learners build foundational knowledge and optimize learning conditions (Gagné et al., 1992). The design of the virtual synchronous experiences focused on providing learners with the optimal conditions necessary for deep learning. For example, the opportunities to internalize learning through social discussions, application, observations, and responsive facilitation embedded within each experience helped learners connect the research-based content to their day-to-day work. The grounding of learning in purpose and relevance and the focus on internalizing learning increased learners’ confidence in the application and the likelihood of transferability into practice (Aguilar, 2016).

Component 2: Self-Directed Learning In the second component, learners applied the learning from the virtual synchronous component through a self-directed, on-demand course. The self-directed course offered a mediated learning opportunity in which learners exercised agency and autonomy in choosing their learning path and how they demonstrated their application (Knowles et al., 2005; McDonough, 2014; Moore & Kearsley, 2012). The agency provided in this segment empowered learners to guide their learning and validated their experiences, expertise, and skill sets, creating the conditions for learning and making learners more open to and responsible for taking in and applying the learnings.

Component 3: Professional Learning Communities On-site professional learning communities and just-in-time supports were leveraged to close out the learning loop in the final stage of the learning experience. Professional learning communities (PLCs) are groups of learners who come together consistently to share ideas and best practices by leveraging an ecological, social-­ constructivist approach to learning (Herro, 2016; Serviss, 2021; Waters, 2021; Wenger et al., 2002). Learners share and receive feedback from peers on their implementation of learning to support the ongoing transfer of knowledge to application and increase the fidelity of the implementation of professional learning (Serviss, 2021; Waters, 2021). Providing continuous support and assistance to learners as they work to resolve challenges and integrate new practices into instruction cultivates a culture of


continuous learning and a context of supportive change and builds the capacity and efficacy of educators for the ultimate benefit of K-12 student learning and achievement (Hall & Hord, 2011).

Refining the LAM During Implementation The LAM was developed to continuously integrate feedback loops into the design and delivery of professional learning. This formative process allowed an agile format in which learner feedback was collected to iterate the LAM design in ways that expanded learner opportunities (Hokanson & Kenny, 2020). This process leveraged formative data, which allowed for rapid shifts to professional learning delivery and more intentional model design changes, as shown in Fig. 4.3. By clearly outlining when and where formative data would be collected and used to engage in redesign sprints, the design team created an action map for how and when to use learner feedback. Gathering continuous feedback informed the refinement of the design, uncovered unexpected issues, and reduced learner frustration (Kirkpatrick & Kirkpatrick, 2016). This process also supported a validation and verification process, which is derived from software design to ensure designs are meeting their intended purpose (Sommerville, 2007). The design team sorted feedback into two categories: immediate action items and deferred action items for the design backlog (Pressman & Maxim, 2015). The design team defined immediate action items as those that caused critical failures in the design, accessibility issues, and any changes that could be easily implemented with minimal impact to the

Fig. 4.3  Formative feedback operationalized within the LAM


learner. Deferred action items were described as any issue that would require an extensive content redesign, demand a lengthy amount of design time, or impede learner progress if introduced during active implementation.
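The triage rule described above can be expressed as a small decision helper. This is only a sketch: the dictionary keys and example items are hypothetical stand-ins for the criteria the team describes (critical design failures, accessibility issues, low-effort changes, extensive redesigns, disruption to active learners).

```python
def triage(item):
    """Classify a feedback item as an immediate action or a deferred backlog item."""
    if item.get("critical_failure") or item.get("accessibility_issue"):
        return "immediate"
    if item.get("low_effort") and not item.get("disrupts_active_learners"):
        return "immediate"
    # Extensive redesigns, lengthy design work, or changes that would disrupt
    # learners mid-implementation go to the design backlog for a later sprint.
    return "deferred"

feedback_items = [
    {"issue": "Broken link in module 2", "low_effort": True, "disrupts_active_learners": False},
    {"issue": "Video lacks captions", "accessibility_issue": True},
    {"issue": "Restructure self-directed course sequence", "low_effort": False},
]
backlog = [i for i in feedback_items if triage(i) == "deferred"]
```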

Observed Results of the Design Process and LAM Implementation Because the designers were continually engaged in evaluating data and responding to learner feedback, they were able to identify trends throughout the process. The team noticed that changing traditional and well-known design methods caused pushback from organizational stakeholders. However, by engaging in a multistage design process to develop the LAM, the designers gained a deep understanding of the research and reasons for every aspect of the design, which allowed them to clearly articulate and justify each design decision with research and data. The design team grounded their designs in extensive research and testing, which led to deeper reflections and critical conversations when discussing how to address concerns that would impact the learner. When looking at learner data, the team discovered that learners had overwhelmingly positive experiences with the learning design. Further, the team discerned that the formative surveys were effective at capturing specific feedback that could inform a continuous improvement process of the design. Learner engagement and observational data also validated that the learning design was eliciting behavioral change in participants. Overall, the team believed that they were being effective and responsive to the needs of the organization and learners.

Applying Formative Design Processes to Your Professional Learning When implementing formative practices in your instructional design process, it is important to be clear and intentional. The first step is to ensure that your professional learning model can support an agile and formative process. The ideal scenario consists of a format in which professional learning is delivered repeatedly within a specified timeframe. When professional learning is provided frequently, there is an opportunity to collect continuous formative data that can inform small shifts in design and delivery. It is important to note that formative data enables small changes while larger changes may prompt a full redesign of professional learning. While fundamental redesigns are a needed part of the instructional design process, they will not have an immediate impact on learners engaged in the current professional learning cycle. However, a redesign of the professional learning structure can be integrated between professional learning cycles to impact current learners.


The formative design process outlined in this chapter can be implemented by using the following strategies:
• Research the methods and techniques used across industries.
• If possible, incorporate former or existing learners in the design process to provide feedback on their learning experience.
• Find opportunities to test previously untried elements, learning assets, or strategies to collect data on their efficacy during the development phase.
• Develop a formative design data collection strategy to intentionally acquire information that can be used to adjust the design.
• Schedule regular reviews of formative data and determine design changes needed (a minimal sketch of such a review cadence follows this list).
• Create a process to determine which changes to the professional learning design can be made immediately without negatively impacting learners or facilitators.
• Evaluate deeper design changes and determine which are content-based and which are design-based, and which can inform all professional learning independent of the content.
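As a simple illustration of the data-collection and review cadence suggested in the list above, the sketch below pairs each planned data source with a review point and the kind of change it can trigger; the specific sources, weeks, and actions are hypothetical examples rather than a prescribed schedule.

```python
from collections import namedtuple

Checkpoint = namedtuple("Checkpoint", ["week", "data_source", "review_action"])

# Hypothetical cadence for one professional learning cycle.
review_schedule = [
    Checkpoint(1, "facilitator debrief",         "apply quick fixes before the next session"),
    Checkpoint(3, "learner exit tickets",        "adjust activities, pacing, and supports"),
    Checkpoint(6, "implementation observations", "log deeper redesign items to the backlog"),
    Checkpoint(8, "end-of-cycle survey",         "plan between-cycle redesign work"),
]

for cp in review_schedule:
    print(f"Week {cp.week}: review {cp.data_source} -> {cp.review_action}")
```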

Summary To meet adult learners’ varied and complex needs, designers can engage in a formative design process when developing professional learning models and experiences to maximize the impact and transferability of learning into practice. The design team gathered and used data to establish, examine the effectiveness of, and refine the Learner in Action Model and its three components to address learner variability and ensure the timeliness and relevance of learning to improve performance and increase learner success. As online learning opportunities continue to grow in popularity and there is a continued need for adult learners to cultivate digital literacy skills, instructional designers must examine how learner input is infused and used within the processes to develop learning models and experiences. By taking a formative, iterative approach to designing learning models and experiences, designers can better understand the myriad individual and systemic factors that impact adult learners in order to develop learning and to leverage technology in learning in a way that is responsive to learner needs and better ensures the successful implementation of learning into practice.

References Aguilar, E. (2016). The art of coaching teams: Building resilient communities that transform schools. Jossey-Bass. Aguilar, E. (2020). Coaching for equity: Conversations that change practice. Jossey-Bass. Argyris, C. (2002). Double-loop learning, teaching, and research. Academy of Management Learning & Education, 1(2), 206–218. https://www.jstor.org/stable/40214154 ATD, IACET, & Rothwell & Associates. (2015). Skills, challenges, and trends in instructional design [White paper]. ATD Research. https://www.td.org/research/skills-­in-­instructional-­design


Azukas, M. E., & Gaudelli, W. (2020). Formative design as a framework for implementing teacher professional development on design thinking. Journal of Formative Design in Learning, 4, 22–33. https://doi.org/10.1007/S41686-­020-­00042-­6 Boller, S., & Fletcher, L. (2020). Design thinking for training and development: Creating learning journeys that get results. Association for Talent Development. Calongne, C., Stricker, A. G., Truman, B., & Arenas, F. J. (2019). Cognitive apprenticeship for teaching computer science and leadership in virtual worlds. In A. G. Stricker, C. Calongne, B. Truman, & F. J. Arenas (Eds.), Recent advances in applying identity and society awareness to virtual learning (pp. 180–200). IGI Global. https://doi.org/10.4018/978-­1-­5225-­9679-­0.CH010 Chardin, M., & Novak, K. (2021). Equity by design: Delivering on the power and promise of UDL. Corwin. Digital Promise. (2019). The learner variability navigator. https://lvp.digitalpromiseglobal.org Dong, H. (2021). Adapting during the pandemic: A case study of using the rapid prototyping instructional system design model to create online instructional content. The Journal of Academic Librarianship, 47(3), 102356. https://doi.org/10.1016/j.acalib.2021.102356 Exter, M., & Ashby, I. (2022). Lifelong learning of instructional design and educational technology professionals: A heutagogical approach. TechTrends, 66(2), 254–264. https://doi.org/10.1007/ s11528-­021-­00657-­x Ferreira, B., Silva, W., Oliveira, E., & Conte, T. (2015). Designing personas with empathy map. In Proceedings of the international conference on software engineering and knowledge engineering, SEKE, 2015-January (pp. 501–505). https://doi.org/10.18293/SEKE2015-­152 Food and Agriculture Organization of the United Nations. (2021). E-learning methodologies and good practices (2nd ed.). FAO. https://doi.org/10.4060/I2516E France, P.  E. (2020). Reclaiming personalized learning: A pedagogy for restoring equity and humanity in our classrooms. Corwin. Fritzgerald, A. (2020). Antiracism and universal design for learning: Building expressways to success. CAST. Gagné, R. M., Briggs, L. J., & Wager, W. W. (1992). Principles of instructional design (4th ed.). Harcourt Brace College Publishers. Gawlik-Kobylinska, M. (2018). Reconciling ADDIE and agile instructional design models—Case study. New Trends and Issues Proceedings on Humanities and Social Sciences, 5(3), 14–21. https://doi.org/10.18844/PROSOC.V5I3.3906 Google. (2020a). Google project management [Course]. https://coursera.org Google. (2020b). Google UX design [Course]. https://coursera.org Hall, G., & Hord, S. (2011). Implementing change: Patterns, principles, and potholes (3rd ed.). Allyn and Bacon. Hammond, Z. (2015). Culturally responsive teaching and the brain: Promoting authentic engagement and rigor among culturally and linguistically diverse students. Corwin. Herro, D. (2016). An ecological approach to learning with technology: Responding to tensions within the “wow-effect” phenomenon in teaching practices. Cultural Studies of Science Education, 11(4), 909–916. https://doi.org/10.1007/s11422-­015-­9688-­2 Hokanson, B., & Kenny, R. (2020). Creativity and critique as formative processes in design thinking. Journal of Formative Design in Learning, 4, 2–4. https://doi.org/10.1007/S41686-­020-­00047-­1 iNACOL. (2016, March 18). What is blended learning? Aurora Institute. https://aurora-­institute. org/blog/what-­is-­blended-­learning/ Institute of Design at Stanford. (2015). 
An introduction to design thinking: Process guide. https:// web.stanford.edu/~mshanks/MichaelShanks/files/509554.pdf Interaction Design Foundation. (2014). The encyclopedia of human-computer interaction (2nd ed.). https://www.interaction-­design.org/literature/book/ the-­encyclopedia-­of-­human-­computer-­interaction-­2nd-­edliterature Kadakia, C., & Owens, L.  M. D. (2020). Designing for modern learning: Beyond ADDIE and SAM. ATD Press. Kahu, E. R. (2013). Framing student engagement in higher education. Studies in Higher Education, 38(5), 758–773.


Kirkpatrick, J. D., & Kirkpatrick, W. K. (2016). Kirkpatrick’s four levels of training evaluation. ATD Press. Knight, J. (2018). The impact cycle: What instructional coaches should do to foster powerful improvements in teaching. Corwin. Knight, J., Hoffman, A., Harris, M., & Thomas, S. (2020). The instructional playbook: The missing link for translating research into practice. One Fine Bird Press. Knowles, M. S., Holton, E. F., & Swanson, R. A. (2005). The adult learner: The definitive classic in adult education and human resource development (6th ed.). Elsevier. McDonough, D. (2014). Providing deep learning through active engagement of adult learners in blended courses. Journal of Learning in Higher Education, 10(1), 9–16. https://eric. ed.gov/?id=EJ1143328 Moore, C. (2017). Map it: The hands-on guide to strategic training design. Montesa Press. Moore, M. G., & Kearsley, G. (2012). Distance education: A systems view of online learning (3rd ed.). Wadsworth. Nuri-Robins, K. J., Lindsey, D. B., Lindsey, R. B., & Terrell, R. D. (2015). Culturally proficient instruction: A guide for people who teach. Corwin. Pape, B. (2018). Learner variability is the rule, not the exception. Digital Promise Global. http:// hdl.handle.net/20.500.12265/16 Phillips, J. J., Phillips, P. P., & Nicholas, H. (2018). Measuring the return on investment (ROI) in technology-based learning. In R. A. Reiser & J. V. Dempsey (Eds.), Trends and issues in instructional design and technology (4th ed., pp. 97–103). Pearson. Pressman, R. S., & Maxim, B. R. (2015). Software engineering: A practitioner’s approach (8th ed.). McGraw Hill. Pullin, G. (2009). Design meets disability (2nd ed.). MIT Press. https://doi.org/10.3109/0743461 8.2010.532926 Roblyer, M. D. (2015). Introduction to systematic instructional design for traditional, online, and blended learning environments. Pearson Education. Schmidt, M., & Huang, R. (2022). Defining learning experience design: Voices from the field of learning design & technology. TechTrends, 66(2), 141–158. https://doi.org/10.1007/ S11528-­021-­00656-­Y/FIGURES/7 Senge, P. M. (2006). The fifth discipline: The art and practice of the learning organization. Penguin Random House. Serviss, J. (2021, May 13). 4 benefits of an active professional learning community. ISTE Blog. https://www.iste.org/explore/professional-­d evelopment/4-­b enefits-­a ctive-­p rofessional-­ learning-­community Solomonson, W. L. (2008). Toward fluent instructional design in the context of people. Performance Improvement, 47(7), 12–19. https://doi.org/10.1002/pfi.20012 Sommerville, I. (2007). Software engineering (8th ed.). Addison Wesley. Stefaniak, J., & Sentz, J. (2020). The role of needs assessment to validate contextual factors related to user experience design practices. In M. Schmidt, A. A. Tawfik, I. Jahnke, & Y. Earnshaw (Eds.), Learner and user experience research: An introduction for the field of learning design & technology. EdTech Books. https://edtechbooks.org/ux/role_of_needs_assessment Understood. (n.d.). The difference between Universal Design for Learning (UDL) and traditional education. Retrieved June 10, 2022, from https://www.understood.org/en/articles/ the-­difference-­between-­universal-­design-­for-­learning-­udl-­and-­traditional-­education Waters, S. (2021, September 13). The power of professional learning communities. Better Up. https://www.betterup.com/blog/professional-­learning-­communities Wenger, E., McDermott, R., & Snyder, W. M. (2002). 
Cultivating communities of practice: A guide to managing knowledge. Harvard Business School Press. Weston, C., McAlpine, L., & Bordonaro, T. (1995). A model for understanding formative evaluation in instructional design. Educational Technology Research and Development, 43(3), 29–48. https://www.jstor.org/stable/30221006 World of Work Project. (2019, February). McClelland’s acquired needs motivation theory. https:// worldofwork.io/2019/02/mcclellands-­motivation-­theory/

Chapter 5

Designing the Museum of Instructional Design, a 3D Learning Environment: A Learning Experience Design Case

Noah Glaser, Yvonne Earnshaw, Dana AlZoubi, Mohan Yang, and Elisa L. Shaffer

Abstract  This paper describes the iterative design of a three dimensional collaborative virtual learning environment (3D CVLE) called the Museum of Instructional Design (MID) that was developed using learning experience design processes. A detailed articulation of our three-phased learner experience design process will be outlined. Findings will provide insight into how other instructional designers can use formative learner experience design processes to create highly usable and effective 3D learning environments.

Keywords  Three dimensional collaborative virtual learning environment · Design case · Learner experience design · User experience design

N. Glaser (*) School of Information Science and Learning Technologies, University of Missouri, Columbia, MO, USA e-mail: [email protected] Y. Earnshaw Kennesaw State University, Kennesaw, GA, USA e-mail: [email protected] D. AlZoubi Mississippi State University, Mississippi State, MS, USA e-mail: [email protected] M. Yang · E. L. Shaffer Old Dominion University, Norfolk, VA, USA e-mail: [email protected]; [email protected]


Introduction Instructional design courses and technology provide learners with various opportunities to practice the use of technologies. Yet, traditional instructional design contexts are often limited in promoting and translating innovative products and processes to classroom environments (Karagiorgi & Symeou, 2005). Many instructional designers and educators face difficulties when applying practice-based tasks and using a variety of technology tools due to the lack of such learning experiences (Pellas et al., 2020). To address some of these concerns, there has been a growing interest in the use of virtual reality (VR) and related technologies such as three dimensional collaborative virtual learning environments (3D CVLE). 3D CVLEs are three dimensional, digital spaces designed to support collaborative, user-centric learning activities (Churchill & Snowdon, 1998). The affordances of these technologies for teaching and learning have long been established (Dalgarno & Lee, 2010; Shin, 2017). However, even though VR equipment has become commercially available and more affordable, there are challenges with this approach as few university students have access to VR headsets (Eriksson, 2021). In an attempt to maintain many of the same affordances of traditional, immersive VR technologies while reducing the barriers to adoption, many are turning to web-based VR. In this study, a 3D CVLE called the Museum of Instructional Design (MID) was developed as a free web-based VR platform for a doctoral-level instructional design and technology (IDT) course focusing on trends and issues of current and historical significance to the field. In this paper, we describe how a team of instructional designers used learner experience design (LXD) methodologies to formatively design, develop, and evaluate the Museum of Instructional Design (MID) to support the learning needs of instructional design and technology (IDT) doctoral students.

Project Description The MID was developed to provide online learners with a collaborative space that would also provide opportunities to engage in critical discourse and to gain essential applied design skills within 3D spaces. Students of this course were enrolled in an online doctoral IDT program at an R1 institution. The majority of students in this program worked full-time and attended night classes. The MID was designed to emulate the experience of an in-person museum with various gallery spaces for students to meet, engage in conversation, and share their own exhibits to represent the IDT field. The instructor and students created the exhibits with the intention of developing a museum gallery that would evolve over the course of the semester.


Fig. 5.1  A screenshot from the backend Mozilla spoke project

Software Used to Design the MID The MID was designed and developed in Mozilla Spoke and Mozilla Hubs. Mozilla Spoke is a free web-based 3D worlds editor that does not require external software or 3D modeling experience. Mozilla Spoke provides access to an open-source repository of images, videos, 3D models, and other tools (e.g., frames in which multimedia can be placed). Virtual environments created in Mozilla Spoke can be seamlessly integrated and accessed within Mozilla Hubs, the end-user interface. The lead author on this paper developed the architecture and underlying 3D CVLE infrastructure within a Mozilla Spoke project (see Fig. 5.1). The Mozilla Spoke project was then published to a private Mozilla Hubs space. Mozilla Hubs is a web-based 3D meeting platform that can be used with VR headsets, desktops, and mobile devices and is compatible with different technology tools (e.g., Discord; Le et al., 2020). In this private Mozilla Hubs environment, students of the class assumed the role of a virtual avatar of their choice controlled through input device configurations (e.g., keyboard and mouse). Students co-created the museum exhibits as they engaged in curriculum activities within this Hubs space (see Fig. 5.2).

Method This study describes the learning experience design (Schmidt et al., 2020), development, and evaluation of the MID. LXD uses iterative processes and is “a human-­ centric, theoretically-grounded, and socio-culturally sensitive approach to learning design, intended to propel learners towards identified learning goals, and informed


Fig. 5.2  A screenshot from the museum of instructional design of students debating learning analytics around an exhibit they co-created

by user experience design methods” (Schmidt & Huang, 2022, p. 151). The MID was developed in three phases which included: (1) front-end analysis, (2) design and development, and (3) evaluation. The front-end analysis consisted of empathy interviews, empathy mapping, and persona development. The iterative design and development process made use of rapid prototyping (Desrosier, 2011; Tripp & Bichelmeyer, 1990; Wilson et al., 1993) to revise the MID between versions. An evaluation was conducted through usability and learner experience design methods. Research activities were considered exempt by the PI’s Institutional Review Board. The research focused on the following design questions (DQ):
• DQ1: How can user experience design methods (empathy mapping and persona development) inform design principles for a 3D CVLE?
• DQ2: How can the identified design principles be incorporated into the design framework of a 3D CVLE?
• DQ3: How is the usability of the MID perceived by classroom and expert evaluator participants, and what features promoted or hindered usability?

Successive Approximation Model To analyze, design, and develop the MID, we used a modified approach to Allen’s Successive Approximation Model version 2 (Allen, 2012). SAM was used because it is an agile approach to design and development that is more flexible than traditional ID models. Furthermore, SAM was selected as it is a common rapid prototyping framework that is used in instructional design contexts (see Schmidt et al., 2020 for a detailed use case of an instructional designer using SAM). As seen in Fig. 5.3, SAM supported the highly iterative process that we used to design the MID. Given the problems with trying to create instructional systems based on the assumptions of students (Schmidt et al., 2020), this three-phase approach was couched in learning experience design (LXD) - with a particular focus on gathering the requirements


Fig. 5.3 Learner experience design framework of the museum of instructional design based on SAM2

of end-users to assess their needs (Sleezer et al., 2014). The three phases (see Fig. 5.3) of this approach include preparation (Phase 1), iterative design (Phase 2), and iterative development (Phase 3). During Phase 1, empathy and persona development methods were used (Cooper, 2004). Empathy interviews were conducted at the beginning of the project to assist the designer with developing empathy with the targeted students. Empathy maps were created based on an analysis of empathy interviews to identify a user’s behaviors and attitudes. They focused on detailing and articulating what the end-users might say, think, do, and feel (Siricharoen, 2020). Themes from these empathy maps were then used to create “personas” or fictional models of expected students of the learning space (Mashapa et al., 2013; McGinn & Kotamraju, 2008; Miaskiewicz & Kozar, 2011). In Phase 2, we used rapid prototyping to incorporate social, technological, and pedagogical considerations that were revealed from the efforts of Phase 1. This led to a design proof consisting of an initial prototype and underlying system architecture. This initial design proof would be the system used during the first week of class. In Phase 3, the MID was iteratively evaluated and developed. Throughout Phase 3, data were collected during the regularly scheduled classroom activities using a variety of quantitative and qualitative data sources to help inform the design of the MID. An expert evaluation was also conducted to further elicit feedback on the MID.
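As a concrete illustration of the Phase 1 artifacts described above, the sketch below shows one way an empathy map record could be organized before distilling themes into personas; the class, its fields, and the example entries are hypothetical and are not the actual maps created for the MID.

```python
from dataclasses import dataclass, field

@dataclass
class EmpathyMap:
    """Quadrants distilled from one empathy interview (say / think / do / feel)."""
    says: list = field(default_factory=list)
    thinks: list = field(default_factory=list)
    does: list = field(default_factory=list)
    feels: list = field(default_factory=list)
    pains: list = field(default_factory=list)   # frustrations surfaced in the interview
    gains: list = field(default_factory=list)   # hoped-for outcomes

# Hypothetical record for one interviewed student; recurring themes across
# several such maps would then be grouped into a persona.
student_map = EmpathyMap(
    says=["I can only work on coursework in the evenings"],
    thinks=["Will this help me prepare for my comprehensive exam?"],
    does=["Joins class from an older laptop after work"],
    feels=["Anxious about learning yet another tool"],
    pains=["Limited time", "Unfamiliar 3D controls"],
    gains=["Portfolio-ready design experience"],
)
```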

Data Sources A variety of quantitative and qualitative data sources were used throughout the study (see Table 5.1).


Table 5.1  List of data sources and descriptions
Empathy maps: Empathy maps are created from interviews to identify a user’s behaviors and attitudes. Representative students from the program were asked questions about their thoughts, feelings, and perspectives on the IDT class, their experiences in the program so far, technology comfort, etc. The empathy map results focus on what the user says, thinks, does, and feels. Empathy mapping distills themes which are used to create personas, or fictional models of expected students of the learning space.
System Usability Scale (SUS): The SUS is a 10-item questionnaire that is used to measure usability of hardware, software, websites, or applications (Brooke, 1996). Respondents rank each item on a 5-point Likert-type scale ranging from Strongly Agree to Strongly Disagree.
Adjectival Ease of Use Scale: The Adjectival Ease of Use Scale is a single-item questionnaire that measures user friendliness (Bangor et al., 2008). The item states, “Overall, I would rate the user-friendliness of this product as: Worst Imaginable, Awful, Poor, Ok, Good, Excellent, Best Imaginable.”
Computer System Usability Questionnaire (CSUQ): The CSUQ (Lewis, 2018) is a validated measure that examines the usability of computer systems and software. The CSUQ consists of 16 7-point Likert-type scale questions (strongly disagree to strongly agree) and two additional questions that ask participants to provide the three most negative and positive aspects of their experience.
Field notes: The instructor took notes on the classroom observations throughout the semester. The instructor also took notes on the expert reviewers’ comments during the think-aloud protocol and on the observations of the expert reviewers’ interactions regarding usability issues, bugs, navigation, memorability, and features.
Think-aloud protocol: While the participant is completing a task, they narrate what they are doing, feeling, and thinking (Nielsen, 1993). Think-aloud protocols were recorded.
Exit tickets: A number of quantitative and qualitative exit tickets were administered to students throughout the duration of the semester.

Study Procedures Learner Evaluations In Phase 3, ongoing evaluations were conducted with 15 students (n = 15; male = 6, female = 9) in a 15 week doctoral level IDT course. All participation in research activities was voluntary and anonymous. Research activities took place during regularly scheduled class activities and were typically presented as exit tickets or surveys at the end of class. All survey and exit ticket data were collected through an anonymous Google Form. Classroom observations were also documented by the instructor of the class. These data were used to iteratively design and develop the MID throughout the semester. During Week 1, students of the class were introduced to Mozilla Hubs through a training environment designed to familiarize new students with the features and interface of the platform (Advanced Learning Technologies Studio, 2022). Pilot


data were collected at the end of the class period using the CSUQ (Lewis, 2018) from students in the class (n = 12). This survey was administered at the end of the class session to measure the evaluation of the system from the students’ perspective. During Week 2, students (n = 11) provided their insights through an informal exit ticket that asked them to rate their confidence level with the technology: “On a scale from 1-5 (strongly disagree to strongly agree), I am feeling confident using Mozilla Hubs.” During Week 4, students provided their insights through another informal exit ticket. They addressed the following questions: What was the most challenging part of designing multimedia for 3D spaces? What did you learn about designing in 3D spaces? What resources, tools, etc., helped you as you designed your ID leaders exhibit? During Week 7, the System Usability Scale was administered to the students (n = 11). During Week 8, students completed the Adjectival Ease of Use, a single-item questionnaire that measures user friendliness (Bangor et al., 2008). The item states, “Overall, I would rate the user-friendliness of this product as: Worst Imaginable, Awful, Poor, Ok, Good, Excellent, Best Imaginable.” In addition, students completed an exit ticket about features and changes that hindered or promoted usability.

Expert Evaluations As suggested in Tessmer’s (1993) work on formative evaluation in instructional design, an expert review was also conducted (Phase 3). Three (n = 3) expert reviewers were recruited to provide an evaluation of the MID (see Table 5.2). These participants were purposively recruited based on their background and expertise. Participants were required to have a background relevant to the design and development of digital worlds and/or background with deploying educational technologies. They were also required to be at least 18 years of age. Informed consent was obtained by all expert reviewers prior to their participation in the study. Expert reviewers were tasked with completing a series of activities structured to mirror those that students enrolled in the class would go through during any given week. Expert evaluators began the session by completing the same training activity that students in the class completed in the first week of class. They then explored the MID to complete tasks that included engaging in a lecture, providing responses to prompts, creating museum exhibits, and placing their artifacts within the environment. Expert review participants were asked to think aloud (Nielsen, 1993) while completing these tasks. These sessions were also screen and audio recorded for later analysis. A trained researcher took field notes during these evaluations. Upon the completion of the activities within the MID, expert reviewers were asked to complete the CSUQ.

Table 5.2  Demographics and details of expert reviewers
Alexander (male): A former commercial game designer who now works as faculty at an R1 research institution. Has expertise in the design of 3D worlds and conducts research concerning learner engagement within technology-driven experiences.
Tabatha (female): A high school teacher at a public school in New England. She collaborates with research partners at universities to iteratively design and evaluate technological solutions that can be integrated into the classroom.
Link (male): A PhD student in a learning technologies program. He leads a game design studio and conducts research in the development of educational games and simulations.

Analysis Empathy Maps Empathy maps were created using information from empathy interviews conducted during Phase 1. These focused on four areas: say, think, do, feel. From the six empathy maps, four user personas were developed. Quantitative Analysis Quantitative usability data were calculated using methods provided by individual instruments. SUS results were calculated using methods outlined by Brooke (1996). Scores for each question of the SUS are converted to a new number, added together and then multiplied by 2.5 to convert the original scores to a value between 0 and 100. Though these scores fall between 0 and 100, they are not meant to be interpreted as percentages and instead should be considered only in terms of their percentile ranking (Brooke, 1996). Scores above 68 are considered to represent above-average usability. CSUQ scores were obtained by using a formula outlined by Lewis (2018) which converts the results to a 100-point scale to match the SUS. Data from the Adjectival Ease of Use Scale and exit tickets were input into a spreadsheet to calculate descriptive statistics. Qualitative Analysis Qualitative analysis focused on identifying characteristics of the 3D CVLE that promoted or hindered the MID’s ease-of-use. A deductive approach to qualitative analysis was conducted using usability heuristics and guidelines established in the field (Nielsen, 1994) that have also been adapted and applied to 3D environments (Joyce, 2021).
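To illustrate the SUS scoring procedure described under the quantitative analysis above, here is a minimal sketch; the function name and the sample responses are hypothetical, but the conversion (odd items scored as the response minus 1, even items as 5 minus the response, summed and multiplied by 2.5) follows Brooke (1996).

```python
def sus_score(responses):
    """Convert ten raw SUS responses (1-5 Likert values) to a 0-100 score."""
    if len(responses) != 10:
        raise ValueError("The SUS has exactly 10 items")
    total = 0
    for item_number, response in enumerate(responses, start=1):
        if item_number % 2 == 1:        # positively worded items 1, 3, 5, 7, 9
            total += response - 1
        else:                           # negatively worded items 2, 4, 6, 8, 10
            total += 5 - response
    return total * 2.5

# Hypothetical set of raw answers from one respondent.
print(sus_score([4, 2, 5, 1, 4, 2, 4, 2, 5, 1]))  # 85.0
```

Individual scores computed this way can then be averaged to produce an aggregate value such as the one reported later in this chapter.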


Results This study formatively evaluated a 3D CVLE called the MID. The following results articulate how learner experience design methods might inform design principles; how design principles might be incorporated into an operable design framework; how participants perceived usability; and what might be improved. The following sections will detail the results from Phase 1 (Preparation), Phase 2 (Iterative Design), and Phase 3 (Iterative Development).

Results of Phase 1: Preparation (RQ1) Empathy Maps We created six empathy maps based on the empathy interviews that were conducted in Phase 1. These empathy maps were then iteratively refined. Each empathy map includes say, think, do, and feel statements as well as a list of potential pains and gains (see examples in Fig. 5.4). The empathy maps we present below represent a stark contrast between student backgrounds, desires, and feelings as they entered into the MID. One student type has little interest in learning a new technology so late in their doctorate program and sees their comprehensive examination as the only thing that matters. The other example represents a student type who is vastly interested in using the MID and further exploring Mozilla Hubs. This student is able to learn the system quickly and frequently asks for more features and wants to push the limits of the system. Personas Four data-driven user personas were created (see example in Fig.  5.5). We constantly referenced these personas during the design and development phases to articulate and refine the social, technological, and pedagogical considerations (see Table 5.3).

Results of Phase 2: Iterative Design (RQ2) In Phase 2, we made various design decisions based on the results from Phase 1, such as selecting Mozilla Hubs over other web-based VR platforms, including a training environment for students to familiarize themselves with the features and controls, and integrating instructional strategies appropriate to the learner. From these findings, three focus areas for subsequent design emerged: (1) social considerations, (2) pedagogical considerations, and (3) technological considerations (see Table 5.3).


Fig. 5.4  Examples of empathy maps used to inform the design of the MID

Results of Phase 3: Iterative Development (RQ3) Quantitative results from student responses to the various usability measures indicate that the MID’s usability was above average. Aggregate SUS (69.1) and CSUQ (69.5) values were rated as being acceptable and the results from the Adjectival Ease of Use Scale were “Good” (5.2). Expert evaluators perceived the MID’s usability as being better (81.3). Qualitative results were used to determine features that promoted or hindered usability. These usability issues were iteratively addressed throughout the course of the semester leading to a refined system (see Table 5.4).


Fig. 5.5  Example of a persona

Table 5.3  Social, technological, and pedagogical considerations

Social considerations
Learner needs: Create a sense of connectedness to others in the program; provide opportunities to showcase work; provide ways to reduce stressful and detrimental feelings; exposure to a growing set of IDT skills.
Content: Decentralize the authority figure from class content; invite learner expertise; relevancy to impending doctoral exams.
Adult students: Work on mobile and cross-platform devices; a simple, easy-to-use system that builds off prior knowledge of technology systems; provide scaffolds and supports.

Technological considerations
System: Individual user profiles; individualized avatars; supports multiple means of representation and engagement; private and secure.
Devices & hardware: Supports mobile, tablet, and desktop devices; supports a range of input devices; supports older devices; cross-platform.
Architecture: Low bandwidth; open-source plugin, tools, and repository support; works in web browsers; supports plugin integration.

Pedagogical considerations
Community of inquiry: Sense of belonging through ownership of exhibits; social presence through customization options and gathering spaces; museum theme.
Co-creation: Editable exhibits; evolving museum space; design supports and scaffolds.
Design focused: Applied 2D and 3D design opportunities; rapid prototyping; opportunities to fail.


Table 5.4  Description of examples from feedback and changes made to the MID

Usability issues
Loading issues. Feedback: some students had issues loading the environment due to their internet speed; some had issues loading the environment due to the limitations of their device; some students created classroom artifacts that caused extraneous load on the devices of others. Changes made: split the space into separate zones; provided guides on improving load optimization; iterative design of materials; reduced high-load virtual assets.
Controls. Feedback: some students had no experience with navigating in 3D spaces before; some students said turning and navigating through halls was challenging. Changes made: provided opportunities to practice; links in presentation materials; quick nav map.
Navigating the environment. Feedback: students had difficulty remembering where different exhibits were. Changes made: links in presentation materials; quick nav map; signage.
Placing objects onto walls in 3D. Feedback: positioning and sizing 2D and 3D models in the virtual space was challenging; hard to get 2D multimedia to be flush against the wall without a media frame; objects would clip and fall “into the void”; objects sometimes snapped into the wrong media frame. Changes made: more media frames; reduced this activity; put more space between media frames.
Hearing others in the environment. Feedback: hard to hear others if too far away from them; could hear other classmates in nearby rooms, which was distracting; some students had loading issues with audio. Changes made: developed audio zones; split the space into separate zones; created user guides around audio settings; created activities that utilized different audio zones.

Features that promoted usability
Past examples. Feedback: students said they liked being able to see their work from the semester placed within the MID and that this gave them insight into what did and did not work for designing in 3D spaces. Changes made: included student-created work in most activities that would be placed on the walls of the MID.
Training environment. Feedback: students said they liked having access to a training environment. Changes made: added a permanent link within the MID to the training environment.
Multiple input options. Feedback: students said that they liked having the ability to use different control schemes and devices (e.g., mouse and keyboard). Changes made: provided documentation and training for different input devices.
User guides. Feedback: students said support materials were helpful; students said they liked having the support materials provided directly in the MID. Changes made: created more scaffolded guides throughout the semester; pinned the guides within the 3D space.
Not having to use the camera. Feedback: students said they did not want to have their camera on during class. Changes made: did not require the camera for class activities.
Relevancy. Feedback: students said they liked having applied design opportunities; students said they liked gaining 3D technology exposure. Changes made: included opportunities to do applied design work within the MID.
Access to materials. Feedback: students said they liked having access to a repository of open-source images, 3D models, gifs, and other multimedia; students said they liked being exposed to different web-based applications and design tools. Changes made: provided access to free web-based repositories and applications to conduct required work.

Discussion In this paper, a prototype 3D CVLE called the Museum of Instructional Design was presented. The goal of the MID was to provide doctoral-level students with a flexible 3D space to participate in class activities and to engage in authentic design activities. In the current research, we sought to address three goals. First, we sought to articulate how user experience design methods could inform design principles for a 3D CVLE. Second, we sought to explore how identified design principles could be incorporated into the design framework of a 3D CVLE. Last, we sought to explore how participants rated and perceived the usability of the MID. By approaching these questions we sought to reveal key design considerations and to provide precedent for how emerging web-based VR can be designed. Much of the existing research in this area focuses on outcomes rather than on documenting design decisions (e.g., Glaser & Schmidt, 2021), which can be critical in how designers go about their design process (Gray & Boling, 2016). VR and related technologies are becoming more prominent in education (Kimmons & Rosenberg, 2022). It is imperative that the field provide design cases to provide precedent (Lawson, 2004, 2019) for addressing the complexities of designing for 3D spaces (Huang & Lee, 2019). Regarding the usability of the MID, findings show that the mean usability scores are above the standard metric for a system to be considered usable (Brooke, 1996, Sauro, 2011). In addition, all students were able to complete the entirety of the semester’s activities within the 3D CVLE. While some students encountered some usability issues, the majority of these issues were remedied through the reflexive iterative design and evaluation process.

Design Implications There are several implications for using a 3D CVLE. It is more feasible than traditional VR, which requires a headset, and it provides opportunities for students to engage and collaborate. In addition, because using a 3D CVLE does not


require software development skills, students and instructors have opportunities to design in a 3D space. However, there are still some challenges with the technology. Constant modifications may be needed to improve the learner experience. There is not a one-size-fits-all template for all instructors to use. In addition, while web-­ based VR technologies certainly broaden access and potential use cases for instruction and learning, logistical barriers still hinder adoption for some (e.g., rural students with poor Internet connectivity). Given that many students may have little to no experience navigating a 3D environment, instructors may need to provide more guidance and support to students.

Limitations The findings presented in this chapter should be considered in light of the limitations detailed in the following sections. Nature of UX Research Due to the nature of user experience design research, findings from this work cannot be generalized beyond the current context. The purpose was to design, develop, evaluate, and refine a feasible and acceptable 3D CVLE for adult students enrolled in a PhD-level class. Therefore, this research used small sample sizes and specifically focused on understanding the nature of students’ experiences as they used the MID. Instead of seeking to create generalizable knowledge, the findings from this work seek to reveal insights into design decisions that can address the social, technological, and pedagogical needs of students. Same Participants and Initial Impressions Quantitative results from the students’ responses to usability measures during Phase 3 (iterative development) indicate that their perceived usability did not change throughout the course of the semester (69.1–69.5). However, this finding might be limited due to the first impression bias phenomenon (Fiske & Neuberg, 1990; Lim et al., 2000). In this case, with the students from the class acting as research participants, it is possible that their initial impressions of the system led to a reluctance to change their responses to usability measures throughout the course of the semester. In contrast to these results, qualitative findings indicate that students appreciated the revisions made to the MID and that its usability was improved. Further, expert evaluators, who tested the system closer to the end of the MID’s development (after most of the improvements had been made), rated the system higher.


References Advanced Learning Technologies Studio. (2022). Project PHoENIX virtual reality software. Copyright 2022 University of Florida. Allen, M. W. (2012). Leaving ADDIE for SAM: An agile model for developing the best learning experiences. Association for Talent Development. Bangor, A., Kortum, P. T., & Miller, J. T. (2008). An empirical evaluation of the system usability scale. International Journal of Human-Computer Interaction, 24(6), 574–594. https://doi. org/10.1080/10447310802205776 Brooke, J. (1996). SUS: A “quick and dirty” usability. In P.  W. Jordan, B.  Thomas, B. A. Weerdmeester, & I. L. McClelland (Eds.), Usability evaluation in industry (pp. 189–194). Taylor & Francis. Churchill, E.  F., & Snowdon, D. (1998). Collaborative virtual environments: An introductory review of issues and systems. Virtual Reality, 3(1), 3–15. https://doi.org/10.1007/BF01409793 Cooper, A. (2004). The inmates are running the asylum: Why high-tech products drive us crazy and how to restore the sanity. SAMS Publishing. Dalgarno, B., & Lee, M.  J. (2010). What are the learning affordances of 3-D virtual environments? British Journal of Educational Technology, 41(1), 10–32. https://doi. org/10.1111/j.1467-­8535.2009.01038.x Desrosier, J. (2011). Rapid prototyping reconsidered. The Journal of Continuing Higher Education, 59, 134–145. https://doi.org/10.1080/07377363.2011.614881 Eriksson, T. (2021). Failure and success in using Mozilla hubs for online teaching in a movie production course. In 7th international conference of the immersive learning research network (iLRN) (pp. 1–8). IEEE. https://doi.org/10.23919/iLRN52045.2021.9459321 Fiske, S. T., & Neuberg, S. L. (1990). A continuum of impression formation, from category-based to individuating processes: Influences of information and motivation on attention and interpretation. Advanced in Experimental Social Psychology, 23, 1–74. https://doi.org/10.1016/ S0065-­2601(08)60317-­2 Glaser, N., & Schmidt, M. (2021). Systematic literature review of virtual reality intervention design patterns for individuals with Autism Spectrum Disorders. International Journal of Human– Computer Interaction, 38(8), 753–788. https://doi.org/10.1080/10447318.2021.1970433 Gray, C., & Boling, E. (2016). Inscribing ethics and values in design for learning: A problematic. Educational Technology Research and Development, 64(1), 969–1001. https://edtechbooks. org/-­jcS Huang, H., & Lee, C. F. (2019). Factors affecting usability of 3D model learning in a virtual reality environment. Interactive Learning Environments, 30, 1–14. https://doi.org/10.1080/1049482 0.2019.1691605 Joyce, A. (2021). 10 usability heuristics applied to virtual reality. Nielsen Norman Group. https:// www.nngroup.com/articles/usability-­heuristics-­virtual-­reality/ Karagiorgi, Y., & Symeou, L. (2005). Translating constructivism into instructional design: Potential and limitations. Journal of Educational Technology & Society, 8(1), 17–27. Kimmons, R., & Rosenberg, J. M. (2022). Trends and topics in educational technology, 2022 edition. Tech Trends, 66, 134–140. https://doi.org/10.1007/s11528-­022-­00713-­0 Lawson, B. (2004). Schemata, gambits and precedent: Some factors in design expertise. Design Studies, 25(5), 443–457. Lawson, B. (2019). The design student’s journey: Understanding how designers think. Routledge. Le, D.  A., MacIntyre, B., & Outlaw, J. (2020). Enhancing the experience of virtual conferences in social virtual environments. 
In 2020 IEEE conference on virtual reality and 3D user interfaces abstracts and workshops (VRW) (pp.  485–494). IEEE. https://doi.org/10.1109/ VRW50115.2020.00101 Lewis, J. R. (2018). Measuring perceived usability: The CSUQ, SUS, and UMUX. International Journal of Human-Computer Interaction, 34(12), 1148–1156. https://doi.org/10.1080/1044731 8.2017.1418805


Lim, K.  H., Benbasat, I., & Ward, L.  M. (2000). The role of multimedia in changing first impression bias. Information Systems Research, 11(2), 115–136. https://doi.org/10.1287/ isre.11.2.115.11776 Mashapa, J., Chelule, E., Van Greunen, D., & Veldsman, A. (2013). Managing user experience – Managing change. In P. Kotzé, G. Marsden, G. Lindgaard, J. Wesson, & M. Winckler (Eds.), Lecture notes in computer science: Vol. 8118. INTERACT 2013 (pp.  660–677). Springer. https://doi.org/10.1007/978-­3-­642-­40480-­1_46 McGinn, J., & Kotamraju, N. (2008). Data-driven persona development. In Proceedings of the SIGCHI conference on human factors in computing systems (pp.  1521–1524). https://doi. org/10.1145/1357054.1357292 Miaskiewicz, T., & Kozar, K. A. (2011). Personas and user-centered design: How can personas benefit product design processes? Design Studies, 32(5), 417–430. https://doi.org/10.1016/j. destud.2011.03.003 Nielsen, J. (1993). Usability engineering. Morgan Kaufmann. Nielsen, J. (1994). Enhancing the explanatory power of usability heuristics. In Proceedings of the SIGCHI conference on human factors in computing systems (pp.  152–158). https://doi. org/10.1145/191666.191729 Pellas, N., Dengel, A., & Christopoulos, A. (2020). A scoping review of immersive virtual reality in STEM education. IEEE Transactions on Learning Technologies, 13(4), 748–761. https://doi. org/10.1109/TLT.2020.3019405 Sauro, J. (2011). A practical guide to the system usability scale: Background, benchmarks, & best practices. Measuring Usability LLC. Schmidt, M., & Huang, R. (2022). Defining learning experience design: Voices from the field of learning design & technology. TechTrends, 66(2), 141–158. https://doi.org/10.1007/ s11528-­021-­00656-­y Schmidt, M., Earnshaw, Y., Tawfik, A. A., & Jahnke, I. (2020). Methods of user centered design and evaluation for learning designers. In M. Schmidt, A. A. Tawfik, I. Jahnke, & Y. Earnshaw (Eds.), Learner and user experience research: An introduction for the field of learning design & technology. EdTech Books. https://edtechbooks.org/ux/ucd_methods_for_lx Shin, D. H. (2017). The role of affordance in the experience of virtual reality learning: Technological and affective affordances in virtual reality. Telematics and Informatics, 34(8), 1826–1836. https://doi.org/10.1016/j.tele.2017.05.013 Siricharoen, W. V. (2020). Using empathy mapping in design thinking process for personas discovering. In P. C. Vinh & A. Rakib (Eds.), Lecture notes of the institute for computer sciences, social informatics and telecommunications engineering: Vol. 343. Context-aware systems and applications, and nature of computation and communication (pp. 182–191). Springer. https:// doi.org/10.1007/978-­3-­030-­67101-­3_15 Sleezer, C. M., Russ-Eft, D. F., & Gupta, K. (2014). A practical guide to needs assessment (3rd ed.). Tessmer, M. (1993). Planning and conducting formative evaluations: Improving the quality of education and training. Routledge. Tripp, S.  D., & Bichelmeyer, B. (1990). Rapid prototyping: An alternative instructional design strategy. Educational Technology Research and Development, 38(1), 31–44. https://doi. org/10.1007/BF02298246 Wilson, B. G., Jonassen, D. H., & Cole, P. (1993). Cognitive approaches to instructional design. In G. M. Piskurich (Ed.), The ASTD handbook of instructional technology (pp. 11.1–21.22). McGraw-Hill.

Chapter 6

Enhancing Problem Based Learning Through Design Thinking and Storying

Robert F. Kenny and Glenda A. Gunter

Abstract  The authors define Problem Based Learning (PBL) as a student-centered, formative, pedagogical approach in which pupils acquire critical and design thinking skills along with content knowledge. The problem focus is similar to the cases found in Case-Based Learning (CBL), which has been commonplace for decades in many academic domains. Most of what is found in the literature about PBL tends to be descriptive in nature, and little can be found that assists with identifying the root causes of those problems or with sustaining the strategy in the classroom. This chapter presents a case for integrating story and design thinking to develop causal analysis and to add rigor through formative conjecture modeling.

Keywords  Problem-based learning · Storying · Design thinking · Sense of inquiry and wonder

R. F. Kenny (*)
Florida Gulf Coast University, Fort Myers, FL, USA
e-mail: [email protected]

G. A. Gunter
University of Central Florida, Orlando, FL, USA
e-mail: [email protected]

Introduction

Often, a new or contemporary instructional framework or theory is actually one that is based on or borrowed from other disciplines. Such is the case with Problem Based Learning (PBL). Having its origins in Case Based Learning in medical education, many educators and academics in other fields adopted the strategy.


They utilized several “x-based” learning nomenclatures. Such is the case in K12 settings, where the approach gained traction for a few years but then became less utilized. Recently, the field of education rebranded PBL as Project Based Learning. We suggest that this redefining of the acronym has changed the nuance and focus of the original intent of the PBL strategy and may have contributed to its lack of adoption in K12 due to a misunderstanding of its intent and a perceived difficulty in its implementation. Referring to PBL as Project Based Learning tends to shift the focus to the final, summative product or artifact that is intended to demonstrate a student’s mastery of content. While gaining content knowledge is important, this change in focus may be causing confusion and misunderstanding about the potential of PBL as an effective, mainstream instructional delivery approach (see “Project-Based Learning vs. Problem-Based Learning vs. X-BL”). The authors suggest that this refocusing on the end product of the exercise, rather than on creative problem solving, can be attributed to a lack of understanding of how design thinking and storying can help maintain a proper focus and create a sustainable instructional culture. To reiterate, we subscribe to the original connotation of PBL as “Problem Based Learning” and describe it as a student-centered, pedagogical approach in which pupils simultaneously acquire critical and design thinking skills, as well as content knowledge, through a cyclical, scientific information-gathering process whose outcome may well be a project but could also be an intervention, an artifact, or another counter measure. Whichever end product is determined, it is ancillary to demonstrating what has been learned from the process, and it must emanate from a clearly defined, measurable problem. Perhaps a compromise between the differing definitions of PBL might be to refer to the whole process as a Problem Based project.

Problem Based Learning Shares Common Goals with Many Instructional Methods

The following examples are evidence of the resilience of PBL and Design Thinking. Both share similarities with the scientific methods that originated in the hard sciences and were then extended and applied in engineering studies, as shown in Fig. 6.1. By extension, this graphic visualizes how these methods were consolidated into a single framework and integrated into standard engineering STEM frameworks (Kenny & Gunter, 2015; Zalewski et al., 2014). What may not be as obvious about the initial concept of scientific experimentation is that it is predominantly utilized in a lab setting, in which most variables and outliers are eliminated. Engineers and other applied scientists work in the natural world, which is mostly composed of a wide range of random variables that need to be reckoned with (i.e., predicted and/or controlled for) and that require iterative and formative treatment during development and implementation.


Fig. 6.1  Scientific method

Fig. 6.2  Software development process

In other words, what applied scientists have added to the basic scientific conceptual frameworks is the utilization of measurable, predictive analyses that are applied to an iterative recycling and testing (i.e., evaluation) of hypotheses, design, and development to assess and refine outcomes. In software engineering, for example, the process looks like Fig. 6.2. Instructional designers have also begun to evolve their classical legacy processes to adopt similar iterative processes and create a more holistic model for education and training. ADDIE has evolved into AGILE (see Fig. 6.3) and, more recently, the 4D model (Reigeluth & An, 2020). A properly formulated PBL project, similar to engineering thinking, starts with the formulation of a hypothesis followed by structured experiential modeling, coupled with a formative approach to predict one or more potential solutions. In our view, PBL is based on a similar-looking conceptual framework, as shown in Fig. 6.4. Although PBL follows a rigorous process, several observations can be made regarding its slow adoption into K12 classrooms, starting with the confusion and/or lack of consensus on nomenclature as to what these strategies consist of and how they integrate with learning design.


Fig. 6.3  Evolution of ADDIE

Fig. 6.4  PBL process

One of the major causes for this confusion may actually correspond to researchers’ success in developing, over more than 40 years, a plethora of instructional strategies. As far back as the 1980s, researchers were debating which strategies and model(s) were the most efficacious (Weston & Cranton, 1986).

Enhancing the Sustainability of PBL with Design Thinking

One anecdotal finding in our research is that most of what we found in the literature about PBL has tended to be descriptive in nature. We have found little that is prescriptive, nor simple processes that identify how to evoke or initiate a PBL project and provide guidance using appropriate driving and guiding questions. The first step in design thinking is to become skilled in critical thinking, using appropriate questioning techniques to put forward potential solutions to complex problems (Brown, 2019). In other words, invoking a design thinking mindset with students supports the process of problem identification (Kenny & Gunter, 2015).


Often, design thinking is incorrectly associated with random, big-idea thinking (Brown, 2019). Those who misappropriate the term often overlook the fact that, like one who becomes accomplished in PBL, a properly skilled design thinker learns how to articulate a fixed and rigorous course of action. Both PBL and design thinking are positively affected by rigorous, scientific methodologies and processes. Sometimes mentioned in the literature about both concepts is the Maastricht Seven-Jump method (Walldén & Mäkinen, 2014), which involves specific steps such as clarifying terms, defining problem(s), brainstorming, structuring and hypothesizing, and synthesis (Kenny & Gunter, 2015). With our definitions of PBL and design thinking, both follow a remarkably similar track to another well-known teaching strategy: KWL. In this process the designer or teacher identifies what is already Known (K), what one Wants to know (W), and what was Learned (L), prompting the learner to access information that leads to a more appropriate resolution (Ogle, 1986). Gunter and Gunter (2015) later extended KWL to form a conceptual framework by adding a Q that stands for further Questioning and an S for Sharing one’s findings. These principles are at the heart of the PBL process. Thus, PBL, when focused on problem identification, solving, and sharing, shares many similarities with other familiar and researched teaching strategies, making it a well-premised concept. The point is that many adopted instructional strategies like PBL are not created in a vacuum. We would argue that, while PBL is perceived to have been slow to be mainstreamed, especially in the K12 environment, it is really not a slow adoption but, perhaps, a slow re-entry into the K12 classroom, and that it likely fell out of favor when teaching to the test took over.

Enhancing One’s Knowledge Through Embracing Failure

The iterative nature of design thinking (and, therefore, also of PBL) strongly suggests that there is an appropriate place in the learning process for learning from one’s mistakes. This line of thinking is anathema to the many K12 teachers we have studied who indicate that they have been taught to avoid failure and to seek truth/perfection. Few of the teachers we interviewed recognized the effectiveness of following the engaging performance technique of interactive improv street actors, who smartly receive all offers (i.e., responses), regardless of whether they are right or wrong, and then use them as a means to work towards a more correct answer (Kenny & Gunter, 2022; Kenny & Wirth, 2009). This instructional technique fully supports the formative processes that are embedded within a problem-based learning experience (Kenny & Gunter, 2018). A second possible reason for the slow adoption of embracing mistake-making (and, therefore, also of PBL) is that it is often perceived to be very difficult to implement and can cause a time-management problem for classroom teachers, who have to watch the clock because of inappropriate scheduling on the part of their schools.


Further, parents incorrectly place pressure on teachers because they do not understand the progressive nature of the standards-based grading that is often associated with PBL, because it is vague and not granular enough to communicate “winners and losers” in a competitive school culture (Harris, n.d.; Meyrat, 2021). In fact, standards-based grading is yet another example of the misnomers associated with the PBL process; a better term might be standards-based assessment. Because they feel pressure from parents to summatively evaluate their students’ PBL projects, teachers sometimes try to compensate by spotlighting the final (rather than interim) exhibits and fail to embrace the formative progress associated with the PBL continuum, which embraces mistake-making and imprecise predicting. This view is at the core of the difference between project- and problem-based learning (Lathram, 2016). Because these progressive assessments are often communicated through, and confused with, report cards in which a standard letter grade is reported, parents and stakeholders often become frustrated at not being able to pinpoint their children’s relative academic successes. This incorrectly premised feedback is the result of implementing PBL as an add-on strategy to be used as time permits, rather than as a fully developed, systematic process that can change the entire learning culture in K12 schools.

Enhancing PBL Through Storying

One of the perceived impediments to a wider acceptance of PBL revolves around issues with communication and an inadequately articulated narrative associated with PBL. We have observed that many teachers find it problematic to teach storytelling, and students find it difficult to communicate a project’s narrative contextualized through a coherent set of driving questions. The teachers we interviewed have expressed concern about their inability to create a story around the rationale for, or the results of, their projects. As Haven (2007) aptly describes it, critical thinking, predicting, and conjuring are all shown to be most effective when developed through storying. We intentionally use storying as a verb (linguistically speaking, storying is a gerund, which connotes a verb and a noun form simultaneously) (Gunter & Kenny, 2021; Kenny & Gunter, 2020). Storying is the act of story creation that is agnostic to the medium or story provider. Stories are most effective if they follow the four principles discovered long ago by successful filmmakers (Branigan, 1992). They include:
1. establishing a time and place;
2. eliciting cause and effect through an interruption;
3. a judgment by the main character; and
4. creating an effective and credible means to communicate the story (Branigan, 1992).
Storying provides a means to infuse emotion, spark motivation and creativity, and accelerate collaboration.


Properly prescribing how storying supports PBL requires a rethinking of the generally incomplete notions that abound about a story’s construct. While many might acknowledge that they know a good story when they see or hear one, our studies have shown that few actually know how to create one effectively (Gunter et al., 2018; Kenny, 2008; Kenny & Gunter, 2006). Of the four principles of story creation listed above, the two elements that most directly correlate to the PBL process are cause and effect and making judgments (Branigan, 1992). The former equates closely with root cause analysis in problem identification; the latter with the assessment process that occurs when determining potential solutions and/or counter measures. A properly formatted story follows an engaging schema, which has been explored extensively in relation to predictive subject/predicate linguistic analyses as utilized by prominent linguists and cognitive psychologists, such as Walter Kintsch (Goldman & Varma, 1995; Kintsch & van Dijk, 1978). Predicting and storying go hand in hand, thereby creating a stronger platform for implementing PBL as an instructional model. Arriving at potential solutions (or outcomes or responses) is enhanced through structured storying using the predicate element of storying (Kintsch & van Dijk, 1978). Jean Matter Mandler (1984), in her book Stories, Scripts, and Scenes, alludes to the schemas that are implied in stories. The importance of a proper story schema is premised on the concept of causal chain analysis, which is fostered when stories follow a prescribed schema (Omanson, 1982; Trabasso & van den Broek, 1983). This analysis implies that there is a “levels effect” in stories, driven by the so-called proposition (schema) offered by a well-constructed story. Kintsch proposed that this schema accounts for a larger share of the positive variance in studies of knowledge acquisition and latent recall than the specific contents of the statements made in the story itself. Kintsch’s propositional analysis borrows from this concept and posits that the ending of a story often can be predicted. When the story ends differently from what was predicted, a surprise arises (as in a surprise ending). That surprise is the basis of what George Gilder (2013) has referred to as the knowledge that is acquired in a given situation. As with stories, many inquiries result in surprise endings that equate to what has been learned (Gunter et al., in press).

Fig. 6.5  Logic model


Using a Logic Model to Foster Design Thinking

A major enhancement that we offer for implementing design thinking in a PBL project is to first integrate the prescribed steps of a logic model (see Fig. 6.5), which enhances the ability to formalize predicting, assessing, and transforming outputs into outcomes. A logic model is an organized and visual way to display an understanding of the relationships among the resources needed to operate a program. The logic model makes assumptions explicit, allowing them to be challenged and better examined, and assists with prediction and with shared understandings of the possible solutions. In this framework, inputs are the driving questions to be answered. The activities are the interventions, counter measures, or artifacts that are iteratively designed and developed. The outputs are the direct, measurable (i.e., evaluated) results of these activities. Outcomes are the assessments/judgments made as to the relative benefits realized for the observed population served by that solution (Indiana Youth Institute, n.d.). We add the concept of conjecture to the logic model to form what we call a conjecture model. We refer to it as an extension of the logic model in that it is a framework that allows the translation of well-founded speculations into design features, creating a recipe that informs which activities, artifacts, or interventions are potentially appropriate. Because PBL is an investigation into the future and not a discovery of the past, the modeling of conjectures is enhanced by a rigorous, formalized process that empowers PBL to embrace high-level speculations that, in turn, evolve from theory (speculation) into enacted models of design and development. In short, a driving principle of design thinking is to encourage estimating and predicting. We suggest that the conjecture model formalizes the process and adds rigor to the prediction process as told through storying. Surprise endings are what we seek. As such, participants in PBL do not need to worry initially about being completely correct or precise, because they are using formative assessment to move closer to a (more) appropriate response or outcome. As mentioned earlier, the iterative nature of each step provides formative feedback that parallels one of the first principles of entrepreneurship. Entrepreneurs like David Kelley, founder of IDEO, have often quipped that the entrepreneurial process encourages failure in order to find solutions: fail early in order to succeed sooner (Brown, 2019). This perspective is what differentiates PBL from lab-based, scientific methodologies that seek precision first by controlling or eliminating almost all variables and outliers. Outliers may be dispensed with as an error term in a lab, but they need to be examined in the design thinking that occurs in the real world. Design failures should not be confused with flaws or negative outcomes that are erroneously thought to slow down the learning process; in fact, early design and development failures can actually speed up the process in the long run. Even Edison was credited with saying he succeeded simply because he ran out of things that did not work.


Fig. 6.6  Conjecture model

What we propose is to combine the logic model with conjecture theory to form a conjecture modeling map that might look something like Fig. 6.6.
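To make the structure of such a conjecture modeling map concrete, the following minimal sketch (our own illustration in Python, not an implementation described in this chapter; all class and field names are assumptions) records driving questions, conjectures translated into design features, and the activities, outputs, and outcomes produced by each formative iteration.

```python
from dataclasses import dataclass, field
from typing import List

# Illustrative sketch only: the field names are shorthand for the logic-model
# columns described above (inputs, activities, outputs, outcomes) plus the
# conjectures we add; they are not terminology taken from Fig. 6.6.

@dataclass
class Conjecture:
    speculation: str      # the well-founded "if/then" hunch
    design_feature: str   # how the hunch is translated into a design feature

@dataclass
class ConjectureMap:
    driving_questions: List[str]                            # inputs
    conjectures: List[Conjecture] = field(default_factory=list)
    activities: List[str] = field(default_factory=list)     # interventions, artifacts, counter measures
    outputs: List[str] = field(default_factory=list)        # direct, measurable results
    outcomes: List[str] = field(default_factory=list)       # judged benefits for the population served

    def add_iteration(self, activity: str, output: str, outcome: str) -> None:
        """Each formative cycle appends what was tried, what was measured,
        and what judgment was made, keeping assumptions visible."""
        self.activities.append(activity)
        self.outputs.append(output)
        self.outcomes.append(outcome)

# Hypothetical example: a classroom PBL project on reducing cafeteria food waste.
pbl = ConjectureMap(driving_questions=["Why is so much food thrown away at lunch?"])
pbl.conjectures.append(Conjecture(
    speculation="Short lunch periods cause students to discard unfinished food",
    design_feature="Timed observation of lunch lines plus a student survey"))
pbl.add_iteration(
    activity="Pilot a ten-minute longer lunch period for one week",
    output="Measured waste dropped from 40 kg/day to 31 kg/day",
    outcome="Partial support for the conjecture; a schedule change is proposed")
```

Keeping the conjectures alongside the measured outputs is what lets a class treat an imprecise prediction, or a surprise ending, as progress rather than failure.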

Root Cause Analysis

As noted previously, the essence of PBL is to support problem investigation and root cause analysis in order to identify appropriate solutions, artifacts, interventions, or counter measures, to evaluate these outputs, and then to assess how effective they were in solving or overcoming the issue. A properly constructed PBL project is, therefore, rooted in measurable problem identification articulated in a story format that is also useful for communicating the entire nature of the project. Quality questioning is a well-known process to support out-of-the-box thinking and to increase engagement in learning (Walsh & Sattes, 2011). Proponents of experience design mirror this line of thinking, which combines questioning with theory-based design to make PBL more rigorous. The Handbook on User Experience Design, for example, devotes an entire chapter to the importance of interviews, and the well-crafted checklist it presents supports one’s ability to get at the core of formulating the right question(s). Learning through questioning is a well-accepted and researched concept that can strengthen and deepen the learning experience and has been prominent in the literature for decades (Edwards & Bowman, 1996). From our research we have found that combining storying and design thinking with formalized prediction processes is at the core of an enhanced Problem Based Learning project and can increase learner acquisition and comprehension (Gunter & Kenny, 2021; Kenny & Gunter, 2020). By seeking many solutions and solving complex issues with an iterative process, students find not only multiple solutions but multiple directions for solutions and problems. More effective questioning can be achieved using processes such as the Five Whys, an iterative interrogative technique used to explore cause-and-effect relationships.


As noted by Serrat (2019), the primary goal of this technique is to determine the root cause of a problem by repeating the question five times. The assumption is that the answer to the fifth why question should reveal the root cause of the problem. The responses make up the problem’s story so that proper judgments and wiser predictions might be made. As such, the Five Whys has become an essential part of design thinking in our prescription for implementing a properly designed PBL project.
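The chapter prescribes no software for this technique, but as a minimal sketch of the iteration it describes, the following Python fragment (function name, prompts, and sample answers are all hypothetical) collects a causal chain by repeatedly asking “why” and returns the responses that make up the problem’s story.

```python
# Minimal sketch of the Five Whys as an interactive loop (illustrative only).

def five_whys(problem: str, ask=input, rounds: int = 5) -> list[str]:
    """Repeatedly ask 'why' about the most recent answer and return the chain;
    the last element is the candidate root cause."""
    chain = [problem]
    for i in range(rounds):
        answer = ask(f"Why ({i + 1}/{rounds})? {chain[-1]} -> ").strip()
        if not answer:          # stop early if the group runs out of causes
            break
        chain.append(answer)
    return chain

# Usage with canned answers instead of live input (hypothetical classroom example):
canned = iter([
    "Students rush their lab write-ups",
    "The write-up is due the same day as the lab",
    "The schedule leaves no revision time",
    "Units were planned around content coverage, not feedback",
    "",  # the group stops here; a root cause has surfaced
])
story = five_whys("Lab reports show shallow analysis", ask=lambda prompt: next(canned))
print(" -> ".join(story))
```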

Summary

A perceived discomfort and unfamiliarity with design thinking and storying, and the inability to add significant rigor, impede their effective use in PBL projects and may have contributed to slowing down the (re)adoption of PBL in schools. PBL and its related processes, such as case-based learning, have been commonplace and effective in many industries and disciplines outside of education. Its resilience as a common instructional practice appears to be founded on those same disciplines that foster rigorous, entrepreneurial thinking and on storytelling, which has yielded varying degrees of success (Almulla, 2020; Major & Palmer, 2001). Those who have embraced the problem-oriented version of PBL have reported it to be successful in meeting the needs of their students, particularly with Generation Z: “PBL employs constructivist principles to foster application of prior knowledge, collaborative learning, and active engagement” (Seibert, 2021, p. 85). Yet, many educators struggle with getting started. Seibert (2021) further indicated that one reason PBL is not commonly implemented is a general aversion toward, and discomfort with, teaching critical thinking. Many curricula are still based on direct instruction and/or a random implementation of project learning without a rigorous foundation for teaching students how to identify problems and then how to utilize probability to predict potential solutions. Seibert’s research suggests that a rigorous implementation of PBL that includes strategies such as exploring a problem through design learning and communicating/storying its results is exactly what is needed to advance PBL. In short, the two processes are symbiotic. We also submit that a well-defined, design-thinking and story-infused PBL process closely parallels the scientific method and other structured, technical frameworks. As such, one could also argue that PBL integrates into the original intent and rationale for introducing STEM/STEAM into schools. Design thinking and storying offer an effective means to balance scientific cognition with creativity while adding another dimension and further depth to the learning process, leading to a deeper understanding and retention of the content.


References

Almulla, M. A. (2020). The effectiveness of the project-based learning (PBL) approach as a way to engage students in learning. SAGE Open, 10(3). https://doi.org/10.1177/2158244020938702
Branigan, E. (1992). Narrative comprehension in film. Routledge.
Brown, T. (2019). Change by design: How design thinking transforms organizations and inspires innovation. HarperCollins.
Edwards, S., & Bowman, M. A. (1996). Promoting student learning through questioning: A study of classroom questions. Journal on Excellence in College Teaching, 7(2), 3–24.
Gilder, G. (2013). Knowledge and power: The information theory of capitalism and how it is revolutionizing our world. Regnery Publishing.
Goldman, S. R., & Varma, S. (1995). CAPping the construction-integration model of discourse comprehension. In C. Weaver, S. Mannes, & C. Fletcher (Eds.), Discourse comprehension: Essays in honor of Walter Kintsch (pp. 337–358). Erlbaum.
Gunter, G. A., & Gunter, R. E. (2015). Teachers discovering computers: Integrating technology in a changing world (8th ed.). Cengage Learning.
Gunter, G. A., & Kenny, R. F. (2021). Using design thinking and formative assessment to create an experience economy in online classrooms. Journal of Formative Design in Learning, 5, 79–88. https://doi.org/10.1007/s41686-021-00059-5
Gunter, G. A., Kenny, R. F., & Bzdun, P. (in press). Gamification in online learning. In K. Thompson (Ed.), The Sage handbook of online higher education. Sage.
Gunter, G. A., Kenny, R. F., & Junkin, S. (2018). The narrative imperative: Creating a story telling culture in the classroom. In B. Hokanson, G. Clinton, & K. Kaminski (Eds.), Educational technology and narrative: Story and instructional design (pp. 5–19). Springer. https://doi.org/10.1007/978-3-319-69914-1_2
Harris, C. (n.d.). What is standards-based grading? Iowalum. https://iowalum.com/what-is-standards-based-grading/
Haven, K. F. (2007). Story proof: The science behind the startling power of story. Libraries Unlimited.
Indiana Youth Institute. (n.d.). Logic Models: A beginner's guide. https://www.iyi.org/type/resource-guide/
Kenny, R. F. (2008). Digital narrative as change agent to teach reading to mediacentric students. International Journal of Social Sciences, 2(3), 187–195.
Kenny, R. F., & Gunter, G. A. (2006). Enhancing literacy skills through digital narrative. The Journal of Media Literacy, 53(2), 40–45.
Kenny, R. F., & Gunter, G. A. (2015). Building a competency-based STEM curriculum in non-STEM disciplines: A sySTEMic approach. In B. Hokanson, G. Clinton, & M. W. Tracey (Eds.), The design of learning experience: Creating the future of educational technology (pp. 181–198). Springer. http://link.springer.com/book/10.1007%2F978-3-319-16504-2
Kenny, R. F., & Gunter, G. A. (2018). Entrepreneur-think meets academia: Formative decision-making for instructional designers and administrators. In R. M. Branch (Ed.), Educational media and technology yearbook (Vol. 41, pp. 39–52). Springer. https://doi.org/10.1007/978-3-319-67301-1_3
Kenny, R. F., & Gunter, G. A. (2020). Wisdom and power: Using information theory to assess the transactional relationship between the learner and the knowledge provider. In B. Hokanson, M. Exter, A. Grincewicz, M. Schmidt, & A. A. Tawfik (Eds.), Intersections across disciplines: Interdisciplinarity and learning (pp. 53–61). Springer. https://doi.org/10.1007/978-3-030-53875-0_5
Kenny, R. F., & Gunter, G. A. (2022). Using live interactive improv to instill a participatory, transactional learning culture in the classroom. In B. Hokanson, M. Exter, A. Grincewicz, M. Schmidt, & A. A. Tawfik (Eds.), Learning: Design, engagement, and definition (pp. 29–40). Springer. https://doi.org/10.1007/978-3-030-85078-4_3
Kenny, R. F., & Wirth, J. (2009). Implementing constructivist learning experiences through best practices in interactive performance. Journal of Effective Teaching, 9(1), 34–47. https://uncw.edu/jet/articles/vol9_1/kenny.pdf
Kintsch, W., & van Dijk, T. A. (1978). Toward a model of text comprehension and production. Psychological Review, 85, 363–394.
Lathram, B. (2016, October 16). 35 leaders on the successes and challenges of project-based learning. Getting Smart. http://www.gettingsmart.com/2016/10/10/35-leaders-on-pbl-whats-working-what-needs-to-improve/
Major, C. H., & Palmer, B. (2001). Assessing the effectiveness of problem-based learning in higher education: Lessons from the literature. Academic Exchange Quarterly, 5(1), 4–9.
Mandler, J. M. (1984). Stories, scripts, and scenes: Aspects of schema theory. Psychology Press.
Meyrat, A. (2021). Standards based learning will ruin education. Quillette. https://quillette.com/author/auguste/
Ogle, D. M. (1986). K-W-L: A teaching model that develops active reading of expository text. Reading Teacher, 39, 564–570.
Omanson, R. C. (1982). An analysis of narratives: Identifying central, supportive, and distracting content. Discourse Processes, 5, 195–224.
Reigeluth, C., & An, Y. (2020). Merging the instructional design process with learner-centered theory: The holistic 4D model. Routledge.
Seibert, S. A. (2021). Problem-based learning: A strategy to foster generation Z's critical thinking and perseverance. Teaching and Learning in Nursing, 16(1), 85–88. https://doi.org/10.1016/j.teln.2020.09.002
Serrat, O. (2019). The five whys technique. Knowledge Solutions. https://doi.org/10.1007/978-981-10-0983-9_32
Trabasso, T., & van den Broek, P. (1983). Causal thinking and story understanding [Unpublished manuscript]. University of Chicago.
Walldén, S., & Mäkinen, E. (2014). Educational datamining and problem-based learning. Informatics in Education, 13(1), 141–156.
Walsh, J. A., & Sattes, B. D. (2011). Thinking through quality questioning: Deepening student engagement. Sage.
Weston, C., & Cranton, P. A. (1986). Selecting instructional strategies. The Journal of Higher Education, 57(3), 259–288. https://doi.org/10.1080/00221546.1986.11778771
Zalewski, J., Gonzalez, F., & Kenny, R. (2014). Small is beautiful: Embedded systems projects in an undergraduate software engineering program. In Proceedings of the E2LP workshop (pp. 35–41). https://annals-csis.org/Volume_4/pliks/668.pdf

Chapter 7

Formative Design of Authentic Scenarios for a Virtual Reality-Based Parent-Teacher Conference Training Simulation

Jeeheon Ryu, Sanghoon Park, Eunbyul Yang, and Kukhyeon Kim

Abstract  This chapter shares a case study of the formative design and evaluation process of authentic scenarios in developing a virtual reality (VR) parent-teacher conference training simulation for pre-service teachers. We used the three-phase design model (developing simulation structure, elaborating contextualized scenarios, and finalizing simulation authenticity). The simulation scenarios were structured based on the three components of the Parent-Teacher Communication Competences Model (sequencing of communication, psychological aspects of communication, and communication flow). For the formative evaluation process, we used survey findings and expert reviews for iterative revisions and created three valid and authentic scenarios to be used for the parent-teacher conference training simulation. Keywords  Parent-teacher conference · Virtual simulation · Formative design · Parent-teacher communication competence model

J. Ryu · E. Yang · K. Kim
Chonnam National University, Gwangju, South Korea
e-mail: [email protected]; [email protected]; [email protected]

S. Park (*)
University of South Florida, Tampa, FL, USA
e-mail: [email protected]

Introduction


The interactions and collaborative relationships among teachers and students’ legal guardians, including parents or caregivers, are critical as they influence students’ learning and classroom experiences (Dearing et al., 2006; De Coninck et al., 2021). Teachers are encouraged or, in some countries (such as the UK and the Netherlands), required to collaborate and communicate with parents regularly. Maintaining effective communication between teachers and parents is particularly important for parent-teacher conferences (De Coninck et al., 2018). Parent-teacher communication (PTC) refers to two-way verbal interactions between parents and teachers while both parties are engaged in parent-teacher conferences (De Coninck et al., 2021). Despite the importance of parent-teacher communication competences (PTCC), teachers often regard PTC as one of their most challenging professional tasks (Hoover-Dempsey et al., 2002; Markow & Pieters, 2012). This is because parent-teacher interactions require complex communication skills often involving evaluation, responsibility, and high-stakes decisions about student learning and overall school education (Lawrence-Lightfoot, 2003). Previous studies show that many pre-service teachers receive insufficient training to acquire these complex conferencing skills or limited opportunities to practice PTC during their teacher preparation programs, thus they often feel ill-prepared in this capacity (Dotger et al., 2008; Epstein, 2013; Evans, 2013). Consequently, novice teachers feel anxious and stressed due to the lack of preparation, which leads to decreased job satisfaction. The anxiety of communicating with parents is also a major predictor of teacher burnout (Pressley, 2021).

As an approach to improving pre-service teachers’ communication skills and efficacy in building a strong teacher-parent relationship, Virtual Reality (VR)-based training simulation can offer ample opportunities to practice their PTCC and maximize the benefits of PTCC training (Khasnabis et al., 2018; Luke & Vaughn, 2022; Thompson et al., 2019). Technology-based simulations offer pre-service teachers a virtual environment that resembles the complexity of a real classroom situation or a parent-teacher conference situation (Badiee & Kaufman, 2014; Dalinger et al., 2020). Through immersive and authentic simulations, pre-service teachers can experience various classroom situations with uniquely designed virtual students representing different psychological and behavioral issues (Park & Ryu, 2019). VR-based training simulation allows pre-service teachers to experience unlimited training opportunities in a realistic parent-teacher conference environment with various cases of simulated virtual parents.

Scenarios in VR simulations are the sequenced cases, or the situations designed to accomplish desired outcome behaviors (Luke & Vaughn, 2022). The success of VR-based training depends on the authenticity of embedded scenarios, which are critical for pre-service teachers to sensitize their PTC skills. According to Rosson and Carroll (2002), scenario design is used as a development technique to provide problems to be engaged, sequenced activities to experience the problems, and interactions/information as resources to solve the problems. Problems within a scenario need to be carefully created to reveal aspects of the authentic problem. Activities are sequenced within the scenario to create concrete ideas about the PTC that present the problem scenario. Lastly, interaction/information is scripted to create actions/dialogues between the user (pre-service teacher) and the virtual parents.
The overall goal for the parent-teacher conference scenarios is for pre-service teachers to acquire PTC skills while discussing the assessment results of a simulated student with the student’s simulated parents.


The Parent-Teacher Communication Competences Model

The goal of the design process is to ensure that the scenario content is appropriate for the intended training objectives and, further, to achieve high-quality realism of the target situations in a VR environment (Kaufman & Ireland, 2016). Since the intended purpose of the parent-teacher conference scenarios was to improve pre-service teachers’ PTC skills, the initial scenario was created based on the PTCC model (De Coninck et al., 2018). The PTCC model, as shown in Table 7.1, integrates (1) the competences for sequencing an effective parent-teacher conference and (2) the competences for adopting supportive psychological structures (De Coninck et al., 2018). The first set of competences consists of five steps that pre-service teachers need to follow. The steps are divided into the initial phase (positive opening, gathering information, sharing information) and the final phase (reaching agreement and positive closing) of a parent-teacher conference. The second set of competences is about the adoption of psychological structures (accepting parents’ emotions and maintaining a positive relationship). Throughout the parent-teacher conference session, the PTCC model emphasizes the importance of managing the flow of both the sequencing competences and the psychological structure adoption competences.

Three-Phase Design Model

We followed a three-phase design model to create the VR-based training simulation scenarios. Each phase has two dimensions for designing and developing the simulation: the scenario and the functional features. The scenario is an essential element for creating a training module that reflects conflict situations between parents and the teacher during the parent-teacher conference. While the scenario augments authenticity in the simulated stories, the functional features are also critical for emulating a natural human conversation in a VR environment. We considered various simulation components for the functional features, including virtual human design, natural voice tone, and emotional facial expressions. The three-phase design model follows (1) Developing simulation structure, (2) Elaborating contextualized scenarios, and (3) Finalizing simulation authenticity (Fig. 7.1). In the first phase, developing simulation structure, we developed a PTCC model-based scenario structure by following the recommended conversational sequence of an effective parent-teacher conference. For the functional features, we created the first parent model of a virtual human as the mother of a fictional student. To design a realistic virtual human, we developed the virtual human persona as a female in her late thirties. The challenging part of the design was to create a gentle but intense mood with non-verbal voice tone, pitch, facial expression, and gesture. The second phase, elaborating contextualized scenarios, was to present realistic yet context-rich storytelling.

Table 7.1  The PTCC model

(1) Sequencing of an effective parent-teacher conference
(1a) Initial phase
1. Positive opening: Teacher immediately establishes a context for the conversation by (1) friendly greeting the parent, and (2) clarifying the goals of the parent-teacher conference.
2. Gathering information: Teacher gathers pertinent information of the parent by (1) asking questions, and (2) listening actively.
3. Sharing information: Teacher explains the situation from his/her point of view by (1) giving concrete information and examples, and (2) communicating in an understandable way for the parent.
(1b) Final phase
4. Reaching an agreement: Teacher reaches agreement on a promising course of action by (1) suggesting potential solutions to the situation, and (2) incorporating parent's ideas if possible.
5. Positive closing: Teacher closes the conversation in a positive way by (1) agreeing on how to monitor progress and cooperate further, and (2) friendly greeting the parent.
Managing flow
6. Managing the flow of the sequencing of the parent-teacher conference: Teacher manages the structure of the parent-teacher conference to guarantee efficiency through (1) time management, and (2) using authority.

(2) Psychological structures of an effective parent-teacher conference
7. Accepting parent's emotions: Teacher accepts the emotions of parents by (1) recognizing emotions and letting parents ventilate their feelings, and (2) expressing empathy for the parent's emotional state.
8. Maintaining a positive relationship: Teacher maintains a positive interpersonal relationship with parents by (1) being friendly and encouraging, regardless of the parent's behaviors, and (2) showing interest and understanding for the situation.
Managing flow
9. Managing the flow of the psychological structures in the parent-teacher conference: Teacher manages the flow of the parent-teacher conference by (1) managing the emotions of the parent and (2) keeping the conversation focused on the subject.

Source: De Coninck et al. (2018)


Fig. 7.1  The three-phase design model of simulation

It was essential to create a scenario subtle enough to deliver realistic parent-teacher conference topics. In order to extend the first scenario to various situations, we later diversified the virtual parents with possible parent-teacher conference cases. We further elaborated the body movements used to express the virtual parents’ emotional expressions. In the third phase, finalizing simulation authenticity, we validated and checked the authenticity of the designed simulation scenarios and functional features. This chapter provides a formative design and evaluation process of authentic scenarios in a VR parent-teacher conference training simulation. Through the three-phased formative design and evaluation process, we refined the scenarios to be as realistic and authentic as real PTCC sessions. The modifications made to the scenarios over the three phases are also reported.

Evaluation Questions

1. What are the three phases of parent-teacher conference training simulation scenario development in a virtual reality environment?
2. What are users' perceptions toward each phase of the scenario development?

Formative Design and Evaluation

Phase 1 Developing Simulation Structure

Figure 7.2 shows a screen capture of the initial design of the parent-teacher conference scenario. A female virtual avatar (a student's mother) was created for interaction. Following the five PTCC sequencing competences, a pre-service teacher would play a teacher's role and verbally communicate with the parent character through a head-mounted display (HMD). The interaction between the pre-service teacher and the parent virtual avatar works through a sequenced conversation simulation. The pre-service teacher verbally initiates the conversation, and a moderator behind the VR simulation controls responses from the parent avatar to continue the conversation (Table 7.2).


Fig. 7.2  The initial design of the parent-teacher conference scenario screen capture
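To illustrate how such a moderator-driven, sequenced conversation could be organized in software, the sketch below is our own minimal Python illustration, not the authors' actual system: the phase names follow the PTCC sequence and the sample lines paraphrase Table 7.2, but the class, function names, and control flow are assumptions.

```python
from enum import Enum, auto

class Phase(Enum):
    POSITIVE_OPENING = auto()
    GATHERING_INFORMATION = auto()
    SHARING_INFORMATION = auto()
    REACHING_AGREEMENT = auto()
    POSITIVE_CLOSING = auto()

# Scripted candidate utterances for the parent avatar, indexed by PTCC phase
# (lines paraphrased from the Table 7.2 sample conversation).
SCRIPT = {
    Phase.POSITIVE_OPENING: ["Hello, very nice to meet you. I am Doa's mother."],
    Phase.GATHERING_INFORMATION: ["Is there something wrong with Doa?"],
    Phase.SHARING_INFORMATION: ["I don't think a little interruption in class is a big deal."],
    Phase.REACHING_AGREEMENT: ["I really don't know what I can do to help him.",
                               "Yes, I like that idea. Thank you."],
    Phase.POSITIVE_CLOSING: ["Thank you for meeting with me today."],
}

class ParentAvatarController:
    """The human moderator selects which scripted response the avatar speaks
    and decides when the conference advances to the next PTCC phase."""

    def __init__(self) -> None:
        self.order = list(Phase)
        self.index = 0

    @property
    def phase(self) -> Phase:
        return self.order[self.index]

    def respond(self, choice: int = 0) -> str:
        return SCRIPT[self.phase][choice]

    def advance(self) -> None:
        if self.index < len(self.order) - 1:
            self.index += 1

# Hypothetical moderator loop: speak, then advance once the pre-service
# teacher has produced the reaction expected for the current phase.
ctrl = ParentAvatarController()
print(ctrl.phase.name, "->", ctrl.respond())
ctrl.advance()
print(ctrl.phase.name, "->", ctrl.respond())
```

Keeping the moderator (rather than an automated agent) in control of the avatar's responses is what makes this a Wizard-of-Oz style setup, which matches the sequenced conversation the chapter describes.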

Phase 1 Scenario Evaluation

Participants: The phase 1 evaluation was conducted in spring 2021 to identify basic scenario components and to solicit expert reviews from in-service teachers who have experience in parent-teacher conferences. To do so, we showed the scenario to 25 in-service teachers (13 females and 12 males, ranging from ages 30–62 with a mean of 39.96; 9 high school teachers and 16 middle school teachers) and asked them to describe student behavioral issues that often require this type of parent-teacher conference. They performed the simulation through the teleconference platform Zoom, and immediately after completing the simulation, a survey and open-ended questions on the validity and realism of the given scenario were administered.

Evaluation instruments: Using the 9 formative evaluation survey items, the authors created a 7-point Likert scale and added five additional open-ended questions at the end. Teachers were asked to rate the degree of scenario validity (5 items) and scenario authenticity (4 items) presented in the initial scenario and to further describe parent-teacher conference situations based on their experiences (Table 7.3). The purpose of the validity items was to check whether the presented scenario accurately reflects the problem situations that could occur during a parent-teacher conference. The purpose of the authenticity items was to confirm the realistic expressions of the parent avatars in the parent-teacher conference.

Table 7.2  A sample scenario design between a parent and a pre-service teacher based on the PTCC model

Psychological structure (throughout the conference): accept parents' emotions; maintain a positive relationship.

Positive opening (greetings)
Parent avatar (sample conversation): "Hello, how are you? Very nice to meet you. I am Doa's mother."
Pre-service teacher's reaction training: React in a positive way to build a context for the conversation by greeting and explaining the goal of the meeting.

Gathering information
Parent avatar: "I am a little nervous and confused about this meeting. Is there something wrong with Doa?"
Pre-service teacher: Listen carefully to what the parent's concerns are. Provide some positive points about the student's behaviors.

Sharing information
Parent avatar: "Yes, Doa is very good at those. I think he has some talents." "Doa recently has been under a lot of stress. He seems to be doing fine though." "Oh, I don't know if a little interruption in class is such a big deal that we need to discuss it."
Pre-service teacher: Communicate additional information about the student's class behaviors (or grades) and ask about the parent's experiences with the student. Share the student's behavioral problems in the classroom, presenting concrete information and examples in an understandable way. Ask the parent's opinion and try to understand the student's problem from the parent's perspective.

Reaching agreement
Parent avatar: "To be honest, last year's homeroom teacher did an excellent job of taking care of Doa and encouraging him to pay attention to class activities. He didn't even have any behavioral issues. Could you possibly help him more?" "I am frustrated a lot. I really don't know what I can do to help him." "Yes, I like the idea. Thank you."
Pre-service teacher: Share some possible reasons for the behavior problem and suggest a potential course of action to solve the problem. Share and discuss how the behavioral problem can be solved through collaborative efforts between the parent and teacher. Ask if the parent has any other ideas. Incorporate the parent's ideas into the action plan.

Positive closing
Parent avatar: "I will monitor Doa's progress and talk to him about his classroom behaviors at home. Please talk to him and help him understand what he needs to do." "Sure, I understand." "Thank you for meeting with me today."
Pre-service teacher: Ask for the parent's support and provide concrete ideas about the action plan. Confirm the potential solution. Close the conference in a positive way. Friendly reply.


Table 7.3  Phase 1 scenario evaluation items (validity and authenticity)

Validity (items 1–5):
1. This parent-teacher conference scenario will be useful to increase the communication ability of pre-service teachers.
2. Using this parent-teacher conference scenario will help provide effective pre-service teacher training for communication.
3. This parent-teacher conference scenario will be useful for communication training for pre-service teachers.
4. This parent-teacher conference scenario consisted of cases worthy of communication training for pre-service teachers.
5. This parent-teacher conference scenario was suitable for pre-service teachers' communication training.

Authenticity (items 6–9):
6. Through this parent-teacher conference scenario, I was able to experience communication situations that could occur in reality.
7. This parent-teacher conference scenario was something that could happen in a real school.
8. The terms and vocabularies used in this parent-teacher conference scenario are used by parents in real situations.
9. This parent-teacher conference scenario is composed of stories that are likely to occur in reality.

Open-ended questions:
10. We intend to use this scenario to train pre-service teachers to cope with problem situations during a parent-teacher conference. What do you think needs to be improved for realistic interaction within the simulation scenario?
11. We would like to develop educational materials to utilize this simulation. What content should be provided for the training of pre-service teachers before, during, or after the simulation?
12. If you were in the presented parent-teacher conference and had to actually meet with the parents, how would you respond? Please describe your response strategies.
13. During the conversation with the virtual parent avatar, what were the factors (the parent avatar's dialogue, attitude, gestures, etc.) that interfered with the flow of verbal interaction, failed to provide immersiveness, or felt awkward? Please explain.
14. If you use this scenario, what do you expect pre-service teachers to learn?


Phase 1 evaluation results: The responses were aggregated to gain insight for the next phase of design and evaluation. The reliability of the items was 0.94 (scenario validity) and 0.76 (scenario authenticity), respectively. Reviewers rated the scenario validity as high as 6.60 (SD = 0.54) and the scenario authenticity as high as 6.52 (SD = 0.51). Expert reviews and suggestions on how to improve the scenario, drawn from the 5 open-ended questions, are summarized below. First, teachers suggested that the number of scenario cases needs to be extended to reflect different levels of challenge and seriousness in parent-teacher conferences. The initial version effectively presented a case of a typical parent who is overprotective of her son, blaming other students for her son's problem behaviors or accusing the teacher of not having enough teaching experience. Yet, the scenario cases need to include more challenging and authentic situations, such as parents who are indifferent toward their children and do not care about their children's school lives, or parents who are rude to teachers and purposely show off their authority and professional status. They also suggested categorizing the different parent types as presenting different levels of challenge. Second, teachers emphasized the importance of building trust at the beginning of the conference session, sharing common goals that both parties can agree on, and showing empathy when listening to the parent. We noted that these suggestions are strongly linked to the two main competences of the PTCC model (sequencing and psychological structure).
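For readers who want to run this kind of item analysis on their own data, the sketch below shows one way the subscale reliability (Cronbach's alpha) and rating means could be computed. It is our own illustration in Python with numpy; the ratings matrix is fabricated stand-in data, not the study's data, so the printed values will not match the figures reported above.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """items: respondents x items matrix of Likert ratings."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)        # per-item variances
    total_var = items.sum(axis=1).var(ddof=1)    # variance of respondents' sum scores
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

rng = np.random.default_rng(0)
# Fabricated stand-in data: 25 respondents, 9 items on a 7-point scale
# (items 1-5 = validity, items 6-9 = authenticity).
ratings = rng.integers(5, 8, size=(25, 9)).astype(float)

validity, authenticity = ratings[:, :5], ratings[:, 5:]
print("validity alpha:", round(cronbach_alpha(validity), 2))
print("validity mean/SD:", round(validity.mean(), 2),
      round(validity.mean(axis=1).std(ddof=1), 2))
print("authenticity alpha:", round(cronbach_alpha(authenticity), 2))
print("authenticity mean/SD:", round(authenticity.mean(), 2),
      round(authenticity.mean(axis=1).std(ddof=1), 2))
```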

Phase 2 Elaborating Contextualized Scenarios

Design revisions: Based on the findings from the phase 1 design and evaluation, we considered three changes for the next phase of design and evaluation. We decided to revise the initial scenario to include more contextual details in the voice tone and design of the parent avatar. We also decided to create two more scenarios that reflect the most common parent-teacher conference situations. Lastly, we decided to check the conversation scripts for the parent avatar to make sure they were designed by following the PTCC model. Following the feedback we received from the 25 in-service teachers during phase 1, we developed three scenarios, including a revision of the initial scenario introduced in phase 1. Considering the different types of parents that often appear in parent-teacher conferences, the three topics were parents who are indifferent toward their children, parents who ignore teachers due to their professional background, and parents who only care about their own children (the opposite of the first type). The sequencing and psychological structure competences were again reflected in each of the scenarios. After creating the three scenarios, we conducted the phase 2 formative evaluation in fall 2021.


Phase 2 Scenario Evaluation

Participants: The goal of the phase 2 evaluation was to validate the three initially created scenarios. The findings of the survey were expected to further modify the scenario situations and the sequence of activities to be realistic. The three revised scenarios were presented to 10 in-service teachers (6 females and 4 males, ranging from ages 32–51 with a mean of 37.80; 7 high school teachers and 3 middle school teachers) for their review and input. One of the 10 teachers participated in the phase 1 formative evaluation, but the other nine teachers did not.

Evaluation instruments: Eight revised formative evaluation survey items with a 5-point Likert scale and an open-ended questionnaire were used. Teachers were asked to rate the degree of scenario validity (4 items) and scenario authenticity (4 items) presented in each of the three scenarios and to further indicate the level of authenticity of each scenario in percentages (Table 7.4).

Table 7.4  Phase 2 scenario evaluation items (validity and authenticity)

Validity (items 1–4):
1. The scenario is suitable for the case of (parents who are indifferent toward their children; parents who ignore teachers due to their professional background; parents who only care about their own children) during the parent-teacher conference.
2. The parents in the presented scenario are showing the behavior of parents.
3. The conversation between the parents and the teacher well presents the behavior of the parents.
4. The scenario adequately represents the challenging situation of parents during the parent-teacher conference.

Authenticity (items 5–8):
5. The terms and vocabularies used in this parent-teacher conference scenario are used by parents in real situations during the parent-teacher conference.
6. The conversation expressions used in the parent-teacher conference are realistic for parents.
7. The content used in this parent-teacher conference scenario is realistic for (parents who are indifferent toward their children; parents who ignore teachers due to their professional background; parents who only care about their own children) during the parent-teacher conference.
8. The parents in this parent-teacher conference scenario are using expressions that match their emotional status.

Phase 2 evaluation results: The responses from the phase 2 evaluation were aggregated to gain insight for the next phase of design and evaluation. The validity of the three scenarios was rated as 3.8 (SD = 1.00) for the indifferent parent case, 4.0 (SD = 0.90) for the ignoring parent case, and 4.6 (SD = 0.60) for the only caring parent case. The authenticity of the three scenarios was rated as 4.0 (SD = 0.90) for the indifferent parent case, 3.7 (SD = 1.20) for the ignoring parent case, and 4.6 (SD = 0.50) for the only caring parent case. The first scenario (indifferent parent case) showed the lowest validity and authenticity, while the other two scenarios showed higher than 4.0 validity and authenticity. Regarding the first scenario (indifferent parent case), experts agreed that the scenario is well designed for pre-service teacher training, especially as it presented an in-depth understanding of the problem situations often encountered in the classroom. Yet, they also noted that what makes the conference even more difficult is parents who do not acknowledge their indifference and accuse teachers of being the source of their children's problems. Therefore, the scenario needs to include more guiding conversations for teachers to present information and data about the students.

Phase 3 Finalizing Simulation Authenticity

Design revisions: The goal of phase 3 was to finalize the scenarios with more details in terms of virtual parent conversations, gestures, and facial expressions. The findings from phase 2 suggested that all three scenarios accurately represent the intended problem situations in authentic parent-teacher conferences. We revised the details of the first scenario and developed the VR parent-teacher conference modules. Three in-service teachers enrolled in the educational technology program served as experts and recommended revisions to be included in the final scenarios. The conversation scripts between the parent and the teacher in the VR conference were further polished to be natural. The sequencing and the psychological structure of the PTCC model were also considered to ensure that the users (pre-service teachers) follow the five steps of the sequenced actions. Through the three phases of formative design and evaluation, we were able to develop criteria to judge the validity and authenticity of parent-teacher conference scenarios. Figures 7.3, 7.4, and 7.5 show the three developed parent-teacher conference modules with scenario descriptions.

In scenario 1, we created a virtual parent persona as a mother who is indifferent toward her daughter, Da-mi. Da-mi does not have a particular problem with friendship, but she tends to be very passive when communicating with her friends and accepting their opinions. She becomes overly anxious when experiencing conflicts with her friends, which influences her school life. Recently, she was misunderstood by her peers and has not been able to reconcile with them. To make things worse, she accidentally revealed other friends' secrets while trying to explain herself. She has been depressed about all these misunderstandings, and now she feels emotionally insecure and doesn't want to talk to anyone about her worries. In this scenario, a user (pre-service teacher) is trying to communicate with Da-mi's mother to come up with a good plan to support Da-mi, but the mother is not interested in Da-mi's problems. The pre-service teacher is expected to emphasize the unwanted consequences of Da-mi's behavioral problems and encourage her mother to share some ideas on how both teacher and parent can support Da-mi, while accepting the mother's emotions and maintaining a positive relationship.


Fig. 7.3  Scenario 1 design: a case of a mother who is indifferent toward her daughter

Fig. 7.4  Scenario 2 design: a case of parents who ignore a teacher

In scenario 2, we created a situation in which a student's parents do not want to work with a teacher and refuse to accept the teacher's suggested plan of action. The virtual parents' personas were designed as highly educated professionals who do not trust the information the teacher shares regarding their child's behavioral problems in school. The scenario begins with praise of the student, but when a user (pre-service teacher) shares the child's behavior problems, the virtual parents respond with a negative attitude and try to defend their child. The situation was intended to present aggressive parents so that the pre-service teacher can learn how to react firmly and point out the problem behavior of their child. In the scenario, the virtual parents keep denying the reported behavioral problems and accuse the pre-service teacher of a lack of teaching experience. For example, the virtual parents say, "We can hire a tutor who can dedicate herself to our son, so your accusations about his behavior are not something we should be concerned about." The virtual parents use cultured speech and speak calmly, but they cut off the pre-service teacher in the middle of the conference and continue to ignore the pre-service teacher until the meeting is over. The pre-service teacher is expected to explain possible reasons for the behavior problem and suggest a potential course of action that the virtual parents can agree upon based on the PTCC phases.

Fig. 7.5  Scenario 3 design: a case of a father who only cares about his son

In scenario 3, we created a virtual parent persona as a father who cares only about his son and does not readily admit his son's behavioral problems. His son has a tendency to dominate his friends with high aggression and impulsivity, and he uses violence whenever his classmates laugh at him. The virtual parent tends to blame friends or teachers for the cause of the problem. A user (pre-service teacher) is expected to present clear information about the causes and consequences of using violence in school and have the virtual parent accept the situation.


Conclusion

In this chapter, we presented detailed explanations of the design and evaluation process for the VR-based parent-teacher conference scenarios and the modifications we made based on the data collected during the three design phases. All three finalized simulation scenarios present realistic and valid situations a teacher would encounter in parent-teacher conference meetings. Pre-service teachers who are familiar with the PTCC model can practice their communication skills in the three simulations by applying the model-based strategies to different conference scenarios.

We also noted several challenges to consider for the successful implementation of virtual simulation scenarios. First, parents' perspectives on the virtual parent personas presented in the simulations need to be considered. In the formative design and evaluation process, we gathered data from teachers but not from parents. It would be important to understand how parents think and act when joining a parent-teacher conference and to incorporate parents' ideas into the design of simulation scenarios. Second, the cost of the technical equipment required to use the simulation might be significant for some teacher education programs. A designated physical space where pre-service teachers can access and use the simulations with HMDs, observation monitors, mobile webcams, and high-performance computers will be needed, but not all teacher education programs will be able to afford this. An alternate version of the virtual simulations with lower fidelity will also need to be designed to support those programs.


Chapter 8

Formative Learning Design in the COVID-19 Pandemic: Analysis, Synthesis, and Critique of Learning Design and Delivery Practices

Ahmed Lachheb, Jacob Fortman, Victoria Abramenka-Lachheb, Peter Arashiro, Ruth N. Le, and Hedieh Najafi

Abstract  Diverse publications have begun to report the various learning design and delivery practices of instructors during or in relation to the COVID-19 pandemic. Due to their developmental nature in facing unprecedented challenges within new learning environments/experiences, we consider these practices a form of formative design. To further clarify and advance a common notion of formative design, we offer a synthesis and grounded critique of design cases and reflective papers published in select academic/peer-reviewed journals. Our analysis of these publications was guided by two research questions: (RQ1) What learning design and delivery practices were reported by educators during or in relation to the COVID-19 pandemic? (RQ2) Taking into account contexts, what unique or new learning design and delivery practices appeared during, or in relation to, the COVID-19 pandemic? Our findings suggest that five themes characterize the reported learning design and delivery practices during the pandemic: minimizing stress, reacting to change, iterative loops, strengthening systems, and online learning alternatives. The majority of these design and delivery practices were new to instructors.

Keywords  Formative learning design · COVID-19 · Learning design

A. Lachheb (*) · V. Abramenka-Lachheb · P. Arashiro · R. N. Le · H. Najafi
University of Michigan, Ann Arbor, MI, USA
e-mail: [email protected]; [email protected]; [email protected]; [email protected]; [email protected]

J. Fortman
Grand Valley State University, Allendale, MI, USA
e-mail: [email protected]

© The Author(s), under exclusive license to Springer Nature Switzerland AG 2023
B. Hokanson et al. (eds.), Formative Design in Learning, Educational Communications and Technology: Issues and Innovations, https://doi.org/10.1007/978-3-031-41950-8_8


Introduction and Purpose of the Study

The impact of natural disasters, armed conflicts, and pandemics on education and school systems around the world is well documented (Ayebi-Arthur, 2017; Baytiyeh, 2018; Burde et al., 2015; Tarricone et al., 2021). Significant disasters from the past, including Hurricane Katrina, which struck the southern U.S. city of New Orleans in 2005, and the earthquakes that struck Canterbury and Christchurch, New Zealand, in 2010 and 2011 respectively, have provided lessons for what education systems, schools, and educators can do in the midst, and in the aftermath, of disasters to support student learning and wellbeing.

While it is premature to draw lessons learned from the COVID-19 pandemic, both because of its ongoing nature and because of the need for more thoughtful inquiry work that can yield rich lessons, publications on COVID-19 learning design and delivery practices have started to appear in the instructional/learning design and technology (I/LDT) literature. These publications document learning design and delivery practices and offer insights into the thinking of educators (faculty, instructors, and learning designers) about their professional practices during the COVID-19 pandemic.

In this study, in order to arrive at a synthesis and then offer a grounded critique of learning design and delivery practices during and in relation to the COVID-19 pandemic, we analyzed published design cases and reflective papers in select academic/peer-reviewed journals that report on these practices. While various meta-analyses (cf. Pokhrel & Chhetri, 2021; Hysa et al., 2022) have been conducted to synthesize findings based on empirical educational data, our study focuses on synthesizing practices in situ based on educators' narratives in design cases and reflective essays. We consider these learning design and delivery practices a form of formative design due to their developmental nature in facing unprecedented challenges to learning environments/experiences.

Literature Review

Teaching and Learning During the COVID-19 Pandemic—Is It "Online Learning"?

In response to the COVID-19 pandemic, many educational/training institutions were forced to pivot from in-person learning and teaching to emergency remote teaching (ERT) (Hodges et al., 2020). This rapid transition challenged educators in several ways (Bonk, 2020; Literat, 2021), to the extent that some have argued that educators faced unprecedented challenges. For instance, many teachers/trainers were required to adapt their teaching/training plans to an online format in a short period of time, without prior practice or fluency with learning design approaches in the online learning space. Others found themselves unable to maintain a sense of classroom/learning community, or engage their learners in experiences that required
access to physical spaces, despite having the capacity to meet online with their students regularly (Bonk, 2020). These challenges, which have reverberated throughout schools, students, families, and communities, underscore the sociopolitical inequalities of education during COVID-19. As Literat (2021) noted, such inequalities exist along social and material lines, as student success is often mediated by differing computer hardware, internet connectivity, digital literacy, and self-directed learning skills. It is common knowledge—and arguably a solid principle for instructional/learning design professionals and stakeholders—that designing online courses requires more than mirroring face-to-face courses in the online environment (Hodges et al., 2020; Watson et  al., 2017). Therefore, teaching and learning experiences that occurred in the spring 2020 semester—right when universities decided to close their physical facilities for learning—must be labeled as “emergency remote teaching” and not as “online teaching” to maintain clarity, and to do justice when describing two different categories of designs. Online teaching/learning and distance education have already been established as rigorous disciplines of inquiry and educational professions. Their practices have become evidence-based and the scientific judgment is clear about their validity: online learning is not inferior to traditional modes of learning (Veletsianos, 2020).

Faculty/Teachers as Designers—Are They Professionally Equal to Learning Designers?

Before the emergence of the learning designer profession, faculty/teachers had traditionally been tasked with designing, developing, and implementing learning experiences. Their professional practices have recently been studied as design practices. As the professional role of a learning/instructional designer gained momentum and importance in educational institutions and organizational training (Kim, 2018), design started to hold an important place in fields of inquiry that focus on topics of learning and teaching. For example, K-12 teachers were found to engage in design activities and processes similar to those of learning designers, and even to hold similar thought patterns (Gyabak, 2018). At the university level, and in designing online courses, faculty were found to make different design judgments and to be able to create authentic learning experiences with or without collaboration with learning/instructional designers (Abramenka-Lachheb & Ozogul, 2022).

Furthermore, several studies have investigated the relationship and collaboration between faculty/teachers and instructional/learning designers. For instance, Halupa (2019) stated that the roles of faculty and designers should be differentiated and clarified to ensure successful collaboration. Similarly, Richardson et al. (2019) argued for the importance of collaboration between instructional designers and faculty and outlined key elements of effective relationship building between them. Nevertheless, multiple scholarly arguments and valid professional positions have called for not being cavalier in attributing the term/adjective/role "designer" to non-professional learning designers, such as faculty/teachers. This argument advocates for the higher value of the skills that professionally trained instructional/learning designers hold compared to teachers and faculty trained in traditional disciplines, such as sociology or English. As Kim (2018) argued:

Of course, you hire the instructional designer who has been trained to be an instructional designer. A professional with training in the science of learning. An expert on the theoretical frameworks and research-validated methods for course design. (para. 1–3)

Theoretical Orientation—What is Design and What is Formative Design?

The questions "What is design?" and "What do designers do?" have been entertained by several design scholars, mainly in the design epistemology and cognition movement (Cross, 2018). The consensus among several notable design scholars, such as Alexander (1964), Schön (1983, 1987), and Nelson and Stolterman (2012), is that design involves an aspect of intentional creation (Roman, 2018). Schön (1983, 1987) and Nelson and Stolterman (2012) define design as a human tradition, arguably the oldest human tradition: "Humans did not discover fire—they designed it" (Nelson & Stolterman, 2012, p. 11). Design in this sense is a unique approach to life, different from other human approaches such as science and art. It is an activity in service to others—people and organizations—and a way of inquiry. Designers, in this sense, are the individuals who have the professional ability for creation: coming up with an idea and bringing that idea to life—an idea by itself is not a design. For this study, we orient our thinking about design as the intentional act of creation with the goal of introducing a change into the world. This act of creation ought to be carried out by a professional practitioner called a designer, and enacted through a process called the design process to achieve the design goal.

As Kenny (2017) remarked, despite clear definitions of design and of designers' roles, a formal definition of formative design does not yet exist in the general design literature or in the I/LDT literature. In fact, searching the term formative design yields literature that focuses on the constructs of "formative" vs. "summative" assessment—two well-established constructs with clear definitions. However, it is safe to assert, based on literature published in the Journal of Formative Design in Learning, that formative learning design is:
• a form of design that prioritizes developmental ways of designing (Parfitt & Rose, 2018)
• an approach that learning designers adopt in carrying out processes of exploring design solutions, including constructing prototypes (Bridges et al., 2018), engaging in multiple cycles of iteration (Schmidt et al., 2020), and practicing adaptation and improvisation


Significance of the Study

The need to analyze publications that report on learning design and delivery practices is crucial for understanding how educators (learning designers, teachers, faculty, etc.) faced the COVID-19 challenges, and what critical insights could be gained to inform future learning design and delivery practices in an ethical and rigorous manner. This type of inquiry aligns with the need for educational inquiry—especially in the I/LDT field—to be socially responsible (Reeves, 1995). Public health experts are in consensus that COVID-19 is not the last pandemic we will witness in the twenty-first century (McKeen, 2021). Because of unforeseen challenges and disruptions to contexts of teaching and learning, we learning design scholars and practitioners should recognize that we have considerable room for improvement. Ergo, now that we can foresee pandemics, we must be ready to help people learn in an ethical and equitable manner.

Researchers' Roles and Positionality

We (the authors) are a research team consisting of six (6) professional learning designers with extensive and rich experience in higher education, non-profit, and K-12 educational contexts. We are non-faculty educators who continuously impact the state of learning through professional design work and research. We believe in the importance of connecting research on learning design to the practice of learning design. We aspire to generate good outcomes through this connection. We approached this study with care and thoughtfulness. Our research goal is descriptive in nature, with a margin to share our intellectual thoughts in the form of a grounded critique. We do not want to insert our own biases or perspectives into the research. We came to the study with an aspiration for high-quality inquiry work that would allow us to candidly share our observations and doubts as needed.

Methods

Our study is guided by the following research questions:
• (RQ1) What learning design and delivery practices were reported by educators during or in relation to the COVID-19 pandemic?
• (RQ2) Taking into account contexts, what learning design and delivery practices appeared to be unique or new during or in relation to the COVID-19 pandemic?


Scoping of Articles

We focused our analysis on design cases and reflective papers that explicitly report on learning design and delivery practices during or in relation to the COVID-19 pandemic. We intentionally focused on design cases and reflective papers because they prioritize narrating what happened in their bounded, in situ educational contexts and what educators did. Other papers (studies, literature reviews, etc.) focus on data and findings that can be generalizable, and thus offer limited opportunity to learn about in situ design and delivery practices.

The publications were extracted from the scholarship venues summarized in Table 8.1, as these venues contain design cases and reflective papers. We chose to scope literature from these scholarly sources because of their rigor and implemented peer-review processes, and because their broad focus afforded us a holistic view of diverse contexts (i.e., these journals are not focused on specific subject areas and/or learning contexts).

Table 8.1  Summary of studies reviewed for analysis

International Journal of Designs for Learning (IJDL). Scope: Volume 12, Issue 1 (Special Section on Designing for Learning in a Pandemic). Articles published: 6. Articles selected: 6.
The Journal of Teaching and Learning with Technology (JoTLT). Scope: Volume 10, Issue 1 (Special Issue on Transitioning Teaching to Remote and Online Learning During the COVID-19 Pandemic). Articles published: 41. Articles selected: 32.
TechTrends. Scope: Articles published between 2020–2021 with keyword "covid". Articles published: 47. Articles selected: 2.
Educational Technology Research and Development (ETR&D). Scope: Articles published between 2020–2021 with keyword "covid". Articles published: 79. Articles selected: 0.
Information and Learning Sciences. Scope: Volume 121, Issue 5/6 and Volume 121, Issue 7/8 (Special Issue on Evidence-based and Pragmatic Online Teaching and Learning Approaches: A Response to Emergency Transitions to Remote Online Education in K-12, Higher Education, and Librarianship, Part 1 & 2). Articles published: 42. Articles selected: 6.
Online Learning Journal (OLJ). Scope: Articles published between 2020–2021 with keyword "covid". Articles published: 18. Articles selected: 1.
TOTAL: 6 journals. Articles published: 215. Articles selected: 47.

In scoping and extracting literature, we adopted the following inclusion/exclusion criteria:

Inclusion Criteria
A. The article must have been published between 2020–2021
B. The article must report explicitly on learning design and delivery practices during or in relation to the COVID-19 pandemic, through either a design case or a reflective essay written with an educator's/scholar's voice

Exclusion Criteria
A. Articles published before 2020 or after 2021
B. Articles/studies that deal with interventions or philosophical/theoretical discussions on what learning design and delivery practices ought to be used
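To illustrate how these criteria operate together, the following is a minimal sketch, not part of the authors' workflow, of how the inclusion/exclusion rules could be applied programmatically to article metadata. The field names, record structure, and sample articles are hypothetical and serve only to show the logic of the screening step.

```python
# Minimal sketch: screening hypothetical article records against the
# inclusion/exclusion criteria described above.
from dataclasses import dataclass

@dataclass
class Article:
    title: str
    year: int
    genre: str                     # e.g., "design case", "reflective essay", "empirical study"
    reports_covid_practices: bool  # explicitly reports design/delivery practices re: COVID-19

def meets_criteria(a: Article) -> bool:
    # Inclusion A / Exclusion A: published between 2020 and 2021
    if not (2020 <= a.year <= 2021):
        return False
    # Inclusion B / Exclusion B: design cases or reflective essays only,
    # and they must explicitly report pandemic design/delivery practices
    if a.genre not in {"design case", "reflective essay"}:
        return False
    return a.reports_covid_practices

corpus = [
    Article("Pivoting a studio course online", 2021, "design case", True),
    Article("A theory of post-pandemic pedagogy", 2021, "empirical study", True),
    Article("Remote labs before the pandemic", 2019, "design case", True),
]
selected = [a for a in corpus if meets_criteria(a)]
print([a.title for a in selected])  # only the first hypothetical article survives
```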

Analysis of Articles

The analysis of literature followed a multiple-cycle approach, consistent with methods for analyzing qualitative data (Merriam & Tisdell, 2015; Saldaña, 2016). In the first cycle, we each self-assigned articles to review that met our selection criteria in a large Excel file. We read the articles closely to answer the following guiding questions (with pre-established lists and open-ended answers):
• What is the context of the study (choosing from): Higher Ed (N = 38), Technical/Professional Education (N = 2), K-12 (N = 7)
• What are the design and delivery practices reported?
• From the author(s) and/or discipline point of view, do these design and delivery practices appear to be unique or new?
• Do you have additional comments/free thoughts on the article?

In answering these questions, we established a coding book (Appendix A) in order to ensure the clarity and consistency of our answers to the above guiding questions. We held regular meetings and asynchronous discussions at which we addressed any questions or confusion (e.g., whether a specific article met our selection criteria, and whether a statement in an article qualified to be reported as a design/delivery practice). We also discussed in detail our goal for the study, what we hoped to accomplish, and what values we wanted to bring to our inquiry efforts.

In the second cycle, we met as a team to categorize the design and delivery practices we extracted from the articles, following an inductive approach and using an axial coding method (Merriam & Tisdell, 2015; Saldaña, 2016). We each first took a set of data and independently generated preliminary descriptions that characterized a group of design and delivery practices. We shared these descriptions with each other, using examples from our analysis sheet. We talked through what we saw as common in the data, and how each of our descriptions might differ or overlap. We asked questions, raised doubts, and kept our thoughts open to change. We relied on a digital board with sticky notes to draw these categories and iterate on them as our discussion progressed (Fig. 8.1). Through meaning negotiation and consensus (Tomita et al., 2021), we arrived at 25 distinct categories that describe the design and delivery practices we extracted from the literature (see Appendix B for categories and examples).

In the third cycle, we met as a team to find commonalities among the 25 categories generated in the second cycle. We asked ourselves, "What, if any, are the common threads between these categories?" Knowing that 25 categories would be too broad to generate meaningful findings, and moving toward further abstraction of the data to increase the transferability and validity of our inquiry (Merriam & Tisdell, 2015; Saldaña, 2016), our goal was to generate a broader set of themes that could describe the richness of the design and delivery practices (Fig. 8.2). Again, through meaning negotiation and consensus (Tomita et al., 2021), we arrived at five (5) distinct themes that describe the 25 categories of design and delivery practices extracted from the literature (see Table 8.2 for themes and categories).

To answer the second research question, we looked at our notes in the analysis sheet where we had answered, "From the author(s) and/or discipline point of view, do these design and delivery practices appear to be unique or new?" These notes allowed us to count how many times we found either an implicit or explicit mention in the papers of whether design and delivery practices appeared to be unique or new.
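As a schematic illustration of the third-cycle grouping, the following is a minimal sketch, not the authors' analysis code, of a category-to-theme mapping with a per-theme tally. Only a few of the 25 category labels from Table 8.2 are shown, and any counts it prints are illustrative rather than study results.

```python
# Minimal sketch: grouping categories under themes and tallying them.
# The mapping below is a partial, illustrative representation of Table 8.2.
from collections import defaultdict

category_to_theme = {
    "Regular and frequent communication": "Minimizing stress",
    "Student well-being": "Minimizing stress",
    "Digital content creation": "Reacting to change",
    "Asynchronous learning activities": "Reacting to change",
    "Seeking continuous feedback": "Iterative loops",
    "Embracing flexibility": "Iterative loops",
    "Collaboration structures": "Strengthening systems",
    "Gamification": "Online learning alternatives",
    # ... remaining categories omitted for brevity
}

themes = defaultdict(list)
for category, theme in category_to_theme.items():
    themes[theme].append(category)

for theme, categories in themes.items():
    print(f"{theme}: {len(categories)} categories")
```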

Fig. 8.1  Final categories of design and delivery practices as they appear in collaborative digital board


Fig. 8.2  Final themes based on categories of design and delivery practices as they appear in collaborative digital board

Table 8.2  Categories and themes of design and delivery practices

Theme: Minimizing stress. Definition: Design and delivery practices that seek to make the ERT shift less stressful on students and their families. Categories: Regular and frequent communication; Equity; Establishing rules for online space; Community-building among peers; Simplifying modes of communications; Student well-being; Centering personal connection; Reducing cognitive load; Alternative means of collaboration in class; Reflective and metacognitive learning strategies.

Theme: Reacting to change. Definition: Immediate design and delivery practices that allowed schools to react to the swift change to ERT, common across all contexts. Categories: Acknowledging transition and loss; Focusing on the most important skills/competencies to teach; Re-evaluation of instructional strategies; Digital content creation; Asynchronous learning activities; Synchronous meetings to mirror face-to-face classes.

Theme: Iterative loops. Definition: Design and delivery practices that aimed to introduce continuous improvement. Categories: Seeking continuous feedback; Embracing flexibility.

Theme: Strengthening systems. Definition: Design and delivery practices that allow schools to make their educational systems and the experiences they offer strong in facing the rapid shift to ERT. Categories: Community-sourced online pedagogy resources; Technical and instructional support for teachers/faculty/students; Collaboration structures.

Theme: Online learning alternatives. Definition: Design and delivery practices that aimed to introduce different forms of designs for learning by taking advantage of online learning affordances. Categories: Gamification; Repurposing openly available content as educational content; New types of assessments/grading; Adopting web tools to support collaboration.

Study Limitations

Like our design work, our inquiry effort had constraints and did not seek to provide a generalizable truth that could be applicable everywhere. Our study's limitations arise mainly from the sources of our data—the articles (design cases and reflective essays) we scoped as reporting on learning design and delivery practices during and/or in relation to the COVID-19 pandemic.

First, the articles we analyzed are what is reported "in print" in the academic journals we scoped. Therefore, there are likely other learning design and delivery practices that are either not reported or have been reported in different venues, such as blogs and articles in trade magazines/websites. Second, the articles we analyzed are in the English language, and mostly report on educational contexts in the United States, Canada, and a handful of international contexts where the English language is widely used. Third, the time frame of the articles we analyzed is 2020–2021, because we conducted this study in the 2021–2022 academic year and before the end of the 2022 calendar year. Therefore, articles published in 2022 could include further learning design and delivery practices during and/or in relation to the COVID-19 pandemic and provide the opportunity for a replication study.

Findings

(RQ1) What Learning Design and Delivery Practices Were Reported by Educators During or in Relation to the COVID-19 Pandemic?

Based on the articles we analyzed, we found diverse design and delivery practices recurring in multiple contexts, which allows us to offer the following themes to describe and quantify these practices:


Practices for Minimizing Stress

Given the stressful nature of the abrupt change caused by the pandemic, several design and delivery practices sought to make the shift to ERT less stressful on students and their families. These are the most frequently reported practices, comprising ten (10) of the categories illustrated earlier in Table 8.2. They include adjusting the duration of live one-day events to multi-day activities, providing appropriate scaffolding and clear guidance, reviewing past work, and completing make-up work to maintain student engagement while students dealt with the stresses of the pandemic (e.g., child care).

Practices for Reacting to Change

Given the sense of urgency, educators implemented several immediate design and delivery practices to react to the swift change to ERT. These practices were common across all contexts and are the second most frequently reported, comprising six (6) of the categories illustrated earlier in Table 8.2. Practitioners leveraged regular and frequent communication through phone calls, student feedback, regular meetings, and weekly posted videos. These allowed a sense of structure amid the ever-changing landscape of health policies, closures, and their impacts.

Practices for Online Learning Alternatives

Despite the ERT nature of learning experiences during the pandemic, several reported design and delivery practices aimed to introduce different forms of designs for learning by taking advantage of online learning affordances. These are the third most frequently reported practices, comprising four (4) of the categories illustrated earlier in Table 8.2. They included gamification, repurposing openly available content as educational content, introducing new types of assessments/grading, and adopting web tools to support collaboration.

Practices for Strengthening Systems

Once educators reacted to change through immediate design and delivery practices, a second set of reactive practices was implemented to allow schools to make their educational systems and the experiences they offer strong in facing the rapid shift to ERT. These practices were ongoing and comprise three (3) categories: community-sourced online pedagogy resources, technical and instructional support for teachers/faculty and students, and collaboration structures. Examples included inviting guest lecturers to class, creating support resources for learning management systems, and forming small cohorts to share lesson plans and new ideas from emergency remote teaching.

Practices for Iterative Loops

Realizing the need to improve, educators reported several design and delivery practices that aimed to introduce continuous improvement. These comprise two (2) categories, as illustrated earlier in Table 8.2: seeking continuous feedback and embracing flexibility. They are the least reported practices. Examples included soliciting student feedback through discussion threads, rapidly implementing and discussing student feedback, and embracing asynchronous methods of learning to increase flexibility.

(RQ2) Taking Into Account Contexts, What Unique or New Learning Design and Delivery Practices Appeared During, or in Relation to, the COVID-19 Pandemic?

We found consistent evidence that most learning design and delivery practices, as reported by educators in the articles we analyzed, were either new or unique to them. For example, Barwick (2021) reflected on how [they] tried a new teaching strategy in their course—asking students to play an active role in shaping the course content and their learning experience:

I decided to try something new. Rather than overloading the course with more of my own content, I would invite students to play an active role in reimagining the course. After all, the students were now at the point where they knew coffee beyond its basics, and some of the most impactful moments of the semester had already resulted from their contributions. How could we now use the technology available to us to continue to learn and connect in meaningful ways? (p. 47)

Similarly, Scott-Harmon (2021) reported how [they] experimented with new technologies and the feeling they experienced:

My resistance to collaboration was nearly as monumental as my resistance to experimenting with new aspects of technology. I had a basic grasp of Canvas, so why should I need more? As it turned out, I was not entirely fearless. I had to confront fears that ranked right up there with the virus itself. I was afraid of the time it would take to learn and feel comfortable using new technological tools. I was afraid of failing at using them. I was afraid of looking like an amateur in front of my savvy students. (p. 187)

We also found multiple instances where educators reported, from their point of view or discipline, unique learning design and delivery practices. For example, in the context of technical and professional education, Breman et al. (2021) reported that most if not all practices deployed in March of 2020 were new to the institution, given its focus on technical education for the future workforce in the electric industry. Their use of outdoor spaces for hands-on training while complying with COVID-19 safety guidelines is unique to this school, given its affordances to do so.

In a few instances, we found evidence that educators did not report new design and delivery practices because of their previous online learning design and teaching experience. For example, Dietz et al. (2021) mentioned their 20 years of experience teaching online and emphasized that this experience allowed them to use strategies applicable to an online format and to leverage technologies that students had available to them while still maintaining student learning outcomes.


Discussion & Critique

Learning Design and Delivery Practices Reported by Educators During or in Relation to the COVID-19 Pandemic

Looking at the variety of learning design and delivery practices during and in relation to the COVID-19 pandemic as reported by educators, it is worth noting the multitude of practices for minimizing stress, the most frequently reported practices, which comprise ten (10) of the categories illustrated earlier in Table 8.2. These practices speak to the heavy impact the pandemic had on students, their families, and educators. They also speak to how educational institutions acknowledged this heavy impact and sought to do something about it. As educational institutions—and educators by extension—continue to ponder their roles within the communities they belong to and the benefit they provide to society, it is important to remember that educational institutions can play a vital role in the well-being of those they serve, and they ought to re-think their practices now that they are thinking of "going back to normal." Minimizing and eliminating stress should not be temporary practices adopted solely in relation to the COVID-19 pandemic. Lastly, while the practices for minimizing stress are not directly linked to formative design, they indicate the factors that sparked the need for formative design practices.

Learning Design and Delivery Practices That Appeared to Be Unique or New During or in Relation to the COVID-19 Pandemic

Looking closely at the immediate design and delivery practices that allowed schools to react to the swift change to ERT, and at the second wave of practices to strengthen the systems, it is interesting how common they were across all contexts: virtually every educational institution used a web-conferencing system and similar online tools to meet and conduct lessons synchronously, and ramped up its technical and instructional support for teachers/faculty and students. Interestingly, synchronous meetings and discussions are used less frequently in online learning environments (i.e., learning experiences that are well designed, ahead of time, to be taught online), especially in higher education. Also, most learning design professionals, mainly in higher education and K-12 settings, have called for extended support systems, citing being overworked and not having sufficient time to do all the work they needed to do (Moore et al., 2021). This begs the question of the disparity of learning design practices and the needed systems among and within educational institutions: How is it that several programs/courses/universities hold a rich tradition of high-quality online learning, yet during the pandemic the most frequent "solution" was meeting synchronously through web-conferencing tools?

The answer to the last question is certainly complex and nuanced. Yet, it is safe to assert three factors that led to this situation: (1) thoughtful, rigorous, and slow online learning design processes cannot be scaled rapidly; (2) several educators had not had the chance to experiment with online learning because they were disadvantaged by their institutions (e.g., adjunct faculty who are paid disproportionately less than their colleagues); (3) several educators did not experiment with online learning as a matter of power and privilege. The last point appears to be valid, based on the frequent mention of "doing [X practice] for the first time" and the handful of learning design and delivery practices we found for online learning alternatives.

The handful of learning design and delivery practices aimed at introducing continuous improvement and taking advantage of online learning affordances points to: (1) the fact that most learning experiences should be appropriately described as ERT, not online learning, and (2) the fact that educators in the COVID-19 pandemic found themselves in a formative design space that required mostly adaptation and improvisation, and less innovation, as "we were all in survival mode." This last point is certainly understandable given time constraints, lack of resources, and several other personal factors that ought to be understood with empathy and sympathy. Additionally, it is not reasonable to expect faculty/teachers to independently design high-quality (online or other) learning experiences. As evidenced in the articles we analyzed, the four categories of practices for online learning alternatives (gamification, repurposing openly available content as educational content, new types of assessments/grading, adopting web tools to support collaboration) have been heavily relied upon by learning designers for the last two decades. Professional learning designers (with diverse job titles ranging from instructional designers to learning experience designers) are the ones most qualified to deliver professional design work for online learning.

Future Work

Given the ongoing nature of the pandemic and the previously mentioned limitations of our research, we see several directions for expanding research related to formative learning design during COVID-19. First, while our research drew on published, public-facing design cases and reflective papers, other empirical data sources such as interviews and personal memos would likely provide rich data for understanding instructor sense-making during the pandemic. Second, while some new design practices were reported to have been introduced during COVID-19, it is not clear how long these practices will persist into the future. Future researchers may find it beneficial to trace a longer historical narrative of learning design practices before, during, and after the pandemic, to more fully grasp how COVID-19 has catalyzed new practices and ideas in learning design. Finally, while our research primarily centered on English-speaking countries, future research should attend more specifically to other populations, including non-English-speaking populations, racial minority populations, and lower-income communities. In this way, we can begin to conceptualize the plurality of context-specific ideas and practices that are constituted within an understanding of formative learning design.


Conclusion

To arrive at a synthesis and then offer a grounded critique of learning design and delivery practices during and in relation to the COVID-19 pandemic, we analyzed published design cases and reflective papers in select academic/peer-reviewed journals that reported on these practices. Our findings, keeping the study limitations in mind, point to diverse learning design and delivery practices and offer a view into the thinking of educators (faculty, instructors, and learning designers) on their professional practices during the challenging times of the COVID-19 pandemic. These practices were aligned, for the most part, with formative design thinking, and brought to light a tension regarding professional design responsibility.

Appendices

Appendix A: Coding Book

Code: Design and delivery practices are reported
Definition: Design and delivery practices are what educator(s) (faculty, instructors, learning designers, etc.) have done to shift the mode of learning and teaching from F2F to ERT/L. The design practices are goal-oriented for a specific purpose.
Examples:
  Recording a video lecture using screencasting tools and sharing it with students asynchronously, by adapting slides that were meant to be shared in class
  Lecturing students via a web-conferencing app, and improvising student engagement methods while lecturing (polls, pausing for questions, breakout rooms, etc.)
  Designing assessments that include multiple choice and/or open-ended questions to be taken on the LMS instead of a class presentation or using in-lab equipment
  Designing online simulations as an alternative to in-class/lab experiences of manipulating tools/equipment

Code: Design and delivery practices appear to be unique or new?
Definition: From the point of view of the author(s), the design and delivery practices appear to be new to them, i.e., this is the first time they engaged with these practices in their teaching profession (e.g., giving an exam online or recording a lecture at home), and/or appear to be unique to them, i.e., they engaged in these practices because their discipline requires them to do so (e.g., remote internship/apprenticeship engagement with organizations to fulfill an academic course requirement). New and/or unique could be inferred directly if the author said so explicitly in the article, or indirectly if there is a statement or a hint to that effect.
Examples:
  "This was my first time recording a video lecture using screencasting tools at home and talking to a black screen, without seeing my students." >> New practice for the educator
  "Because all the students had started their internship projects with industry partners in January, we asked them to continue working with them remotely and switch their focus to projects that have applications in the online space." >> Unique practice for the educator


Appendix B: Categories of Design and Delivery Practices With Examples

Each category is listed with one example of a design/delivery practice and the source it was drawn from.

Regular and frequent communication: Require all staff to meet with administrators on Zoom every morning (Justis et al., 2020)
Equity: Technology coaches continually work with learning support specialists on identifying accessibility tools (closed captioning, text-to-speech, and speech-to-text) and making sure teachers use them (Peterson et al., 2020)
Establishing rules for online space: Mutually constituting expectations and commitments between the teacher and the teacher-learners on what concerns they had and how she could best support their learning (Jung and Brady, 2020)
Community building among peers: Students' discussion topics changed, as they shifted to share more about their daily life and show peer support and empathy to each other (Muhonen and Räsänen, 2021)
Simplifying modes of communications: Coordinate responses to faculty and instructor requests and questions (Kessler et al., 2020)
Student well-being: Revising syllabi to include information about health and safety due to the pandemic (Young and Román-Lagunas, 2021)
Centering personal connection: Virtual breakfast, in which each student has a cup of coffee, tea, or any other beverage and a simple meal that they could eat (Abas, 2021)
Reducing cognitive load: Duration of talks was reduced to prevent fatigue, and asynchronous communications were added to allow participants to discuss the talks (Arnold and Kumar, 2021)
Alternative means of collaboration in class: Creating "breakout rooms" that allowed students to resume their group work (Quintana and Quintana, 2020)
Reflective and metacognitive learning strategies: Students documented their creative activities in a logbook (Winters, 2021)
Acknowledging transition and loss: Faculty used a flipped classroom approach and taught students how to successfully learn in online courses through skills training, after acknowledging struggles in adapting to a new normal (Kurz et al., 2021)
Focusing on the most important skills/competencies to teach: Tossing anything cumbersome out and minimally adapting application activities to work through Zoom; if an application activity required a significant change to fit the online format, leadership told the project team to omit it from the slide deck (Breman et al., 2021)
Re-evaluate instructional strategies: Re-evaluated best practices from Flower Darby's (2019) advice on scaffolding, beginning with low-stress assignments to make sure students get the hang of the technology and continuing low-stakes work before major assignments (Scott-Harmon, 2021)
Digital content creation: Re-designing a nursing simulation lab that used mannequins and medical equipment into online modules with video demos and activities (Ross et al., 2021)
Asynchronous learning activities: Icebreaker session was implemented in asynchronous, LMS-based combinations and in synchronous breakout rooms (Arnold and Kumar, 2021)
Synchronous meetings to mirror face-to-face classes: Face-to-face classes continued as synchronous weekly meetings in Zoom (60–90 min) featuring a brief social-emotional sharing time (Moore-Beyioku, 2021)
Seeking continuous feedback: Ask for student feedback and invite them to express their concerns and preferences for course modifications (Dietz et al., 2021)
Embracing flexibility: On-the-fly pivots to adjust to changing and previously unknown conditions and circumstances (Cole Harmon et al., 2021)
Community-sourced online pedagogy resources: Invited teachers to share insights and skills in morning meetings and elsewhere so that others might appropriate these ideas and incorporate them into their own instructional design work (Justis et al., 2020)
Technical and instructional support for teachers/faculty and students: A moderator for every class on Zoom to "support the instructor by solving technical problems for students, posting poll questions, monitoring the chat for students' questions, and summarizing students' answers to activities and questions that the instructor introduced" (Breman et al., 2021)
Collaboration structures: Google Calendar for Teams on Slack to keep track of meetings (Shrestha and Rogers, 2021)
Gamification: Canvas navigation was designed to emphasize the gamified elements and strengthen students' perception of the course as a game (Neumann et al., 2021)
Repurposing openly available content as educational content: Using Google Street View and virtual 3D environments to create an opportunity for learning through a cultural immersion experience (Kolovou, 2021)
New types of assessments/grading: Redesigning students' assignments to include short walks in their neighborhood or science experiments with kitchen supplies (Justis et al., 2020)
Adopting web tools to support collaboration: Used Gather Town to recreate the office, foster hallway conversations, and add unconventional elements such as a virtual frisbee field (Shrestha and Rogers, 2021)

References Abas, S. (2021). Teaching and learning in COVID times: A reflective critique of a pedagogical seminar course. Journal of Teaching and Learning with Technology, 10(1), 34–43. https:// scholarworks.iu.edu/journals/index.php/jotlt/article/view/31392 Abramenka-Lachheb, V., & Ozogul, G. (2022). Faculty as designers of authentic learning projects in online courses. Online Learning Journal, 26(4). https://doi.org/10.24059/olj.v26i4.2826 Alexander, C. (1964). Notes on the synthesis of form. Harvard University Press. Arnold, P., & Kumar, S. (2021). Designing “virtual social Europe days”—An international collaborative seminar across closed borders. International Journal of Designs for Learning, 12(1), 125–139. https://doi.org/10.14434/ijdl.v12i1.31290 Ayebi-Arthur, K. (2017). E-learning, resilience and change in higher education: Helping a university cope after a natural disaster. E-Learning and Digital Media, 14(5), 259–274. https://doi. org/10.1177/2042753017751712 Barwick, C. (2021). Coffee over Zoom: Teaching food studies over the internet during a pandemic. Journal of Teaching and Learning with Technology, 10(1), 44–49. https://scholarworks.iu.edu/ journals/index.php/jotlt/article/view/31566 Baytiyeh, H. (2018). Online learning during post-earthquake school closures. Disaster Prevention and Management, 27(2). https://doi.org/10.1108/DPM-­07-­2017-­0173 Bonk, C.  J. (2020). Pandemic ponderings, 30 years to today: Synchronous signals, saviors, or survivors? Distance Education, 41(4), 589–599. https://doi.org/10.1080/0158791 9.2020.1821610 Breman, J., Oostra, K., Bosch, C., & Kaiser, J. (2021). Emergency remote delivery—Rapid resilience at a trade school in the utility industry. International Journal of Designs for Learning, 12(1), 102–111. https://doi.org/10.14434/ijdl.v12i1.31302 Bridges, J., Stefaniak, J., & Baaki, J. (2018). A formative design examining the effects of elaboration and question strategy with video instruction in medical education. Journal of Formative Design in Learning, 2(2), 129–143. https://doi.org/10.1007/s41686-­018-­0025-­5 Burde, D., Guven, O., Kelcey, J., Lahmann, H., & Al-Abbadi, K. (2015). What works to promote children’s educational access, quality of learning, and wellbeing in crisis-affected contexts. Education Rigorous Literature Review, Department for International Development. Department for International Development. https://assets.publishing.service.gov.uk/government/uploads/system/uploads/attachment_data/file/470773/Education-­emergencies-­rigorous- ­ review2.pdf Cole Harmon, R., Hospelhorn, M., Gutierrez, E., Velarde, C., Fetrow, M., & Svihla, V. (2021). Mission to Mars amidst a pandemic. International Journal of Designs for Learning, 12(1), 158–170. https://doi.org/10.14434/ijdl.v12i1.31295 Cross, N. (2018). Developing design as a discipline. Journal of Engineering Design, 29(12), 691–708. https://doi.org/10.1080/09544828.2018.1537481 Dietz, B., Petonito, G., Mann, K., Oswald, B., McMahon, C., & Linthicum, M. (2021). Pivoting during a pandemic: Creating presence for all students. Journal of Teaching and Learning with Technology, 10(1) https://scholarworks.iu.edu/journals/index.php/jotlt/ article/view/31376 Gyabak, K. (2018). Making do: A multiple case study on the designerly thinking and action of primary school teachers in under-resourced schools of Bhutan. Doctoral dissertation, Indiana University. ProQuest Dissertations & Theses Global. https://www.proquest.com/ docview/2131415382 Halupa, C. (2019). 
Differentiation of roles: Instructional designers and faculty in the creation of online courses. International Journal of Higher Education, 8(1), 55–68. https://doi. org/10.5430/ijhe.v8n1p55


Hodges, C. B., Moore, S., Lockee, B. B., Trust, T., & Bond, M. A. (2020). The difference between emergency remote teaching and online learning. Educause Review. https://er.educause.edu/ articles/2020/3/the-­difference-­between-­emergency-­remote-­teaching-­and-­online-­learning Hysa, E., Akbar, A., Yasmin, F., & ur Rahman, A., & Li, S. (2022). Virtual learning during the COVID-19 pandemic: A bibliometric review and future research agenda. Risk Management and Healthcare Policy, 15, 1353. https://doi.org/10.2147/RMHP.S355895 Jung, H., & Brady, C. (2020). Maintaining rich dialogic interactions in the transition to synchronous online learning. Information and Learning Sciences, 121(5/6), 391–400. https://doi. org/10.1108/ILS-­04-­2020-­0096 Justis, N., Litts, B.  K., Reina, L., & Rhodes, S. (2020). Cultivating staff culture online: How Edith Bowen Laboratory School responded to COVID-19. Information and Learning Sciences, 121(5/6), 453–460. https://doi.org/10.1108/ILS-­04-­2020-­0136 Kenny, R. (2017). Introducing journal of formative design in learning. Journal of Formative Design in Learning, 1(1), 1–2. https://doi.org/10.1007/s41686-­017-­0006-­0 Kessler, A., Barnes, S., Rajagopal, K., Rankin, J., Pouchak, L., Silis, M., & Esser, W. (2020). Saving a semester of learning: MIT’s emergency transition to online instruction. Information and Learning Sciences, 121(7/8), 587–597. https://doi.org/10.1108/ILS-­04-­2020-­0097 Kim, J. (2018, June 6). A traditional Ph.D. does not an instructional designer make. Inside Higher Ed Digital Learning Blog. https://www.insidehighered.com/digital-­learning/blogs/ technology-­and-­learning/traditional-­phd-­does-­not-­instructional-­designer-­make Kolovou, T.  A. (2021). Recreating cultural immersion in an online environment. Journal of Teaching and Learning with Technology, 10(1) https://scholarworks.iu.edu/journals/index.php/ jotlt/article/view/31236 Kurz, L., Metzler, E. T., & Ryan, K. C. (2021). Teaching in the time of COVID-19: Reconceptualizing faculty identities in a global pandemic. Journal of Teaching and Learning with Technology, 10(1), 172–184. https://scholarworks.iu.edu/journals/index.php/jotlt/article/view/31444 Literat, I. (2021). “Teachers act like we’re robots”: TikTok as a window into youth experiences of online learning during COVID-19. AERA Open. https://doi.org/10.1177/2332858421995537 McKeen, M. (Ed.) (2021). Drawing light from the pandemic: A new strategy for health and sustainable development. In A review of the evidence for the Pan-European Commission on Health and Sustainable Development. European Observatory on Health Systems and Policies. https:// apps.who.int/iris/bitstream/handle/10665/345027/9789289051798-­eng.pdf?sequence=1 Merriam, S. B., & Tisdell, E. J. (2015). Qualitative research: A guide to design and implementation. Wiley. Moore, S., Trust, T., Lockee, B. B., Bond, A., & Hodges, C. (2021). One year later... and counting: Reflections on emergency remote teaching and online learning. Educause Review. https:// er.educause.edu/articles/2021/11/one-­year-­later-­and-­counting-­reflections-­on-­emergency-­ remote-­teaching-­and-­online-­learning Moore-Beyioku, C. (2021). COVID-19 transition to online: Quick! Bring the fun! Journal of Teaching and Learning with Technology, 10(1) https://scholarworks.iu.edu/journals/index.php/ jotlt/article/view/31107 Muhonen, A., & Räsänen, E. (2021). “Kaikille okei” – Everyone alright? Shifting topics and practices in language students’ chat during the global covid-19 pandemic. 
Journal of Teaching and Learning with Technology, 10(1), 258–278. https://doi.org/10.14434/jotlt.v10i1.31565 Nelson, H. G., & Stolterman, E. (2012). The design way: Intentional change in an unpredictable world (2nd ed.). The MIT Press. Neumann, K. L., Stansberry, S. L., Del Rosso, C. L., Welch, S. S., & Ivey, T. A. (2021). Moonshot: Redesigning NASA’s high school aerospace scholars experience at Johnson space center for online delivery. International Journal of Designs for Learning, 12(1), 140–157. https://doi. org/10.14434/ijdl.v12i1.31303

112

A. Lachheb et al.

Parfitt, C. M., & Rose, A. L. (2018). Collaborating to meet the needs of alternative certification teachers using formative design. Journal of Formative Design in Learning, 2(1), 49–55. https:// doi.org/10.1007/s41686-­018-­0017-­5 Peterson, L., Scharber, C., Thuesen, A., & Baskin, K. (2020). A rapid response to COVID-19: One district’s pivot from technology integration to distance learning. Information and Learning Sciences, 121(5/6), 461–469. https://doi.org/10.1108/ILS-­04-­2020-­0131 Pokhrel, S., & Chhetri, R. (2021). A literature review on impact of COVID-19 pandemic on teaching and learning. Higher Education for the Future, 8(1), 133–141. https://doi. org/10.1177/2347631120983481 Quintana, R., & Quintana, C. (2020). When classroom interactions have to go online: The move to specifications grading in a project-based design course. Information and Learning Sciences, 121(7/8), 525–532. https://doi.org/10.1108/ILS-­04-­2020-­0119 Reeves, T.  C. (1995). Questioning the questions of instructional technology research. In M. Simonson & M. Anderson (Eds.), Proceedings of the 1995 annual national convention of the Association for Educational Communications and Technology (pp. 459–470). Association for Educational Communications and Technology. Richardson, J. C., Ashby, I., Alshammari, A. N., Cheng, Z., Johnson, B. S., Krause, T. S., Lee, D., Randoph, A. E., & Wang, H. (2019). Faculty and instructional designers on building successful collaborative relationships. Educational Technology Research and Development, 67(4), 855–880. https://doi.org/10.1007/s11423-­018-­9636-­4 Roman, T.  A. (2018). Design education at the secondary level in the U.S.: Instructional practices and perspectives of teachers. Doctoral dissertation, Indiana University. ProQuest Dissertations and Theses Global. https://search.proquest.com/docview/2092229260/81733 3C064EF4A1APQ Ross, J. M., Wright, L., & Arikawa, A. Y. (2021). Adapting a classroom simulation experience to an online escape room in nutrition education. Online Learning, 25(1), 238–244. https://doi. org/10.24059/olj.v25i1.2469 Saldaña, J. (2016). The coding manual for qualitative researchers. Sage. Schmidt, M., Cheng, L., Raj, S., & Wade, S. (2020). Formative design and evaluation of a responsive eHealth/mHealth intervention for positive family adaptation following pediatric traumatic brain injury. Journal of Formative Design in Learning, 4(2), 88–106. https://doi.org/10.1007/ s41686-­020-­00049-­z Schön, D. (1983). The reflective practitioner: How professionals think in action. Temple Smith. Schön, D. A. (1987). Educating the reflective practitioner: Toward a new design for teaching and learning in the professions. Wiley. Scott-Harmon, S. (2021). Rolling the die, learning to teach. Journal of Teaching and Learning with Technology, 10(1), 185–193. https://scholarworks.iu.edu/journals/index.php/jotlt/article/ view/31259 Shrestha, D. M., & Rogers, C. (2021). Recreating the experience of an in-person summer internship program remotely. International Journal of Designs for Learning, 12(1), 112–124. https:// doi.org/10.14434/ijdl.v12i1.31351 Tarricone, P., Mestan, K., & Teo, I. (2021). Building resilient education systems: A rapid review of the education in emergencies literature. Australian Council for Educational Research. https:// doi.org/10.37517/978-­1-­74286-­639-­0 Tomita, K., Alangari, H., Zhu, M., Ergulec, F., Lachheb, A., & Boling, E. (2021). Challenges implementing qualitative research methods in a study of instructional design practice. 
TechTrends, 65(2), 144–151. https://doi.org/10.1007/s11528-­020-­00569-­2 Veletsianos, V. (2020, August 31). The 7 elements of a good online course. George Veletsianos, Ph.D.  Blog. https://www.veletsianos.com/2020/08/31/the-­7-­elements-­of-­a-­good-­online­course/

8  Formative Learning Design in the COVID-19 Pandemic: Analysis, Synthesis…

113

Watson, F.  F., Bishop, M.  C., & Ferdinand-James, D. (2017). Instructional strategies to help online students learn: Feedback from online students. TechTrends, 61(5), 420–427. https://doi. org/10.1007/s11528-­017-­0216-­y305 Winters, T. (2021). Emergency remote studio teaching: Notes from the field. Journal of Teaching and Learning with Technology, 10(1), 117–126. https://doi.org/10.14434/jotlt.v10i1.31580 Young, C. J., & Román-Lagunas, V. (2021). Team-building and the re-prioritizing of the teaching and learning experience in the midst of ambiguity. Journal of Teaching and Learning with Technology, 10(1), 247–257. https://scholarworks.iu.edu/journals/index.php/jotlt/article/ view/31895

Chapter 9

Formative Learning Design Within Project Evaluation: Case of a Food Bank Disaster Planning and Recovery Tool

Susie L. Gronseth and Ioannis A. Kakadiaris

Abstract  During disasters, food banks, alongside disaster response partners, work to provide food and water supplies promptly to affected individuals. Opportunities have emerged for researchers and civic partnerships to innovate the processes and resources for food distribution efforts during disasters. This paper describes how a formative evaluation process was utilized to inform the development of an artificial intelligence decision-making support tool (AI-SERVE). An immersive tabletop exercise strategy enabled the team to simulate situations in which the tool could be applied to identify how well the tool functions in relation to target user needs.

Keywords  Artificial intelligence · Formative evaluation · Food insecurity · Disaster planning · Iterative design

This material is based upon work supported by the National Science Foundation under Grant Nos. 2043988 and 2133352. Any opinions, findings, and conclusions or recommendations expressed in this material are those of the author(s) and do not necessarily reflect the views of the National Science Foundation.

Food insecurity was an issue for approximately 1 out of 10 families in the United States pre-COVID; with the additional challenges brought on by the pandemic, this figure doubled in 2021 (Feeding America, 2022; Food and Agriculture Organization of the United Nations, 2003). In Houston, Texas, even higher rates of food insecurity have been observed, with nearly 25% of children in the region experiencing food insecurity (Correa & ACE Coalition Food Insecurity Workgroup, 2017). Natural disasters, such as hurricanes, floods, and freezes, exacerbate the problem (Herskovitz, 2015). During disasters, food banks, alongside disaster response partners, work to provide food and water supplies promptly to affected individuals, which may include previously food-secure individuals who are experiencing disaster-caused food-related needs. Efforts by food banks to improve their performance in meeting the food needs of their service areas include supply chain optimization, food insecurity tracking, and social capital growth. In relation to "fast disasters" (such as hurricanes) compounded with "slow disasters" (such as the COVID-19 pandemic), opportunities have emerged for researcher and civic partnerships to foster strategic planning processes for maximizing food bank efforts in meeting the immediate food insecurity needs of their service areas in robust ways.

In this research space, an interdisciplinary team of researchers and civic partners formed in the south-central US to engage in a project that aimed to incorporate the inputs from critical stakeholders of the food bank and its network of cooperating organizations to foster collaborative, data-driven planning and decision-making pre-, during, and post-disaster. This paper describes the evaluation design approach and lessons learned through the project. First, the context of this design case will be detailed as part of framing the need for the project. Next, critical components of the project evaluation plan will be highlighted and situated alongside formative learning considerations. Then, the immersive tabletop structure applied as part of formative evaluation activities will be described. Finally, key findings from the evaluation will be explored as potential design precedents.

Background

Within the greater Houston area, the Houston Food Bank (HFB) is a crucial organization working to address food insecurity needs in the 18-county region. HFB serves approximately 150 million meals annually, with about 93,000 households depending on HFB's support each week (Houston Food Bank, n.d.). HFB works within a network of over 1600 partnering organizations in the region, including grocery suppliers, food pantries, shelters, community centers, churches, and other service groups. For food banks such as HFB, the competing demands of belonging to a network with multiple food source networks can present their own set of challenges that limit the potential effectiveness and efficiency of the hierarchical hub-and-spoke distribution system, reduce supply and fulfillment choices, and increase paperwork (Bouek, 2018). Further, distribution complexities include the variabilities related to the food bank's largely volunteer-based labor force and logistical considerations for allocating, transporting, storing, and distributing food at partner site locations around the metropolitan area. HFB could also benefit from assessing current food insecurity levels in the communities it serves and expanding the adoption of data-driven decision-making processes.

Utilizing artificial intelligence (AI) to create an integrated dashboard of data sources on food security needs, electrical power infrastructure vulnerabilities, anticipated flooding impacts, partner organization operational statuses, and pre-staged food supply locations could yield valuable time and labor savings for HFB when the food bank is called into disaster response. Pairing the AI technology with supportive data-driven decision-making processes could help HFB more readily anticipate demand, match food supplies with targeted needs, and reduce waste. Coordinating and sharing real-time data between the food bank and its organizational partners can also support essential inter-organizational communications for quick and effective response amidst disaster-prompted requests.

While measurements of the success of a food bank network are often determined by the tons of food delivered in response to community needs, food security metrics can include a myriad of other elements. For example, food improvements could be targeted in quantity, nutritional quality, cultural acceptability, safety, certainty, timeliness, and stability (Coates, 2013). Enhancing community access, communication/outreach, and social capital through friends, family, and neighbors may be another area that could be tracked and used to inform goal-setting for food banks.

Food bank response during disasters could likewise be optimized through a data-informed approach for food pre-placement and distribution at anticipated geographic areas of need. Strategic positioning of disaster-related supplies before disasters, such as food disaster relief boxes containing non-perishable ready-to-eat meals and snacks, can support more timely distribution when a disaster strikes (Campbell & Jones, 2011). Ideally, pre-staged supply hosting partners would be able to provide adequate coverage of anticipated food insecurity needs while also maintaining their operations through potential weather impacts of wind damage, regional flooding, and power outages, as well as offer a volunteer base for carrying out the food distribution activities. For example, partner organization A may be ideally located in a region of high food insecurity and anticipated flooding risk but may not be accessible via roadways during a flooding event or operational due to disaster impacts limiting volunteer availability.

In addition to planning for additional food security needs during weather-related disasters, food banks have in recent years experienced increased demand and supply chain disruptions during the COVID-19 pandemic (Blessley & Mudambi, 2022). Through the pandemic, HFB has experienced decreases in food donations due to canceled food drives, food price inflation, quarantine limitations, and financial impacts to donors while also trying to meet higher food security needs across the region. Therefore, the design of data-driven decision-making supportive tools now needs to incorporate features that attend to the layered complexity introduced by the pandemic. In the case of HFB, the COVID-19 pandemic spurred recognition of the need for innovation to address constraints to the food bank and its partners' funding and volunteer base through exploration of how decision-making within disaster planning and response could be more strategic, efficient, and equitable. One of their current disaster response approaches, for example, is to set up large distribution locations, termed "neighborhood supersites," to distribute truckloads of food supplies to the food-insecure population en masse. For these distributions, they utilize sites such as the city's professional football team stadium parking lots, which provide ample space and traffic flow for conducting a mobile drive-through food distribution. However, there may be food access challenges for vulnerable populations not located proximally to a supersite (Chakraborty et al., 2022).


Considerations for the Formative Evaluation Design

An interdisciplinary project team was formed between the food bank and partnering universities, consisting of experts in food distribution, computer science, supply chain management, public health, education, civil and environmental engineering, and sustainability. Utilizing a team science approach (National Research Council, 2015) suitably matched the complex challenge of addressing food support needs during multi-layered disaster scenarios. Through an iterative and community-engaged process, the researchers conducted focus group sessions in 2021 that engaged HFB personnel and partner stakeholders in discussions around the varied needs that could be addressed by developing a shared data dashboard tool. After this information gathering, the team began developing a prototype of the tool that utilized AI technology and real-time data for partner locations, power infrastructure, flood risk, and the needs of food-insecure individuals in the Houston area (Fig. 9.1).

Fig. 9.1  Excerpt from geographic planning of food distribution tool mockup

The team envisioned that the resulting tool would impact HFB and its partners' organizational decision-making processes so that distribution coverage could be optimized for the anticipated food demand during weather-related disasters while also considering COVID-related supply chain stressors. The shared goal was to enable HFB to more adequately allocate and position disaster response food resources pre-disaster while still allowing manageable adjustments during a disaster based on actual impacts and system disruptions. The Houston metro area is ideal for this work due to its prevalence of food insecurity and vulnerability to tropical weather events (i.e., hurricanes).

To support the evaluation of the project, the team articulated a logic model that identified the project's assumptions, processes, outcomes, and external factors. At the process level, project inputs involved the multidisciplinary research team, the HFB civic partner, and other stakeholders of the HFB network of partners who facilitate food storage, distribution, and advocacy for the end-users who receive food assistance. Across the project activities of team capacity building, development of food distribution equity indicators, the definition of a food bank network organizational resilience index, incorporation of infrastructure vulnerability metrics, and creation of the decision-supportive technology tool, data would need to be systematically collected and analyzed to determine the extent of outputs obtained through the project's objectives and address areas of sustainability, transferability, and scalability. The project was funded through the National Science Foundation Civic Innovation Challenge program, which facilitated workshops, networking events, and a community of practice structure for all challenge grant awardees during the year-long project period (National Science Foundation, 2022).

The tool prototype programming was completed in 2022 using an iterative design approach across multiple cycles of module creation, testing, refinement, and modification. The tool was named "AI-SERVE," an acronym for AI Support for Equitable and Resilient Food Distribution During Extreme Weather Events. The tool home screen consists of a food distribution dashboard that provides a high-level view of how much demand is being met by current food distribution activities, the operational status of distribution hubs, calculated total food supply capacities, and the average amount of food distributed in pounds and pallets via disaster partners in the Houston region (Fig. 9.2).

Fig. 9.2  Excerpt of tool prototype distribution dashboard

The tool also offers components, referred to as "modules," for pre-disaster planning and disaster response. For example, the Planning module enables users to run hypothetical weather event scenarios and compute current coverage capabilities through their established network of disaster partner organizations. Users can manipulate variables such as setting a maximum for food supply recipients' travel time to distribution sites via private car, public transportation, or walking; accounting for additional constraints on the supply chain due to COVID; or adjusting the target percentages of the pre-disaster food-insecure and food-secure populations that they would like to meet through their distribution activities. The tool produces map visualizations that illustrate the projected impacts of a weather event (such as a Category 2 hurricane with a south-to-north trajectory through Houston) as integrated with current disaster partner locations and their associated population, flood risk, power grid vulnerabilities, and disaster pallet inventory. The tool can also suggest mobile disaster distribution locations for servicing anticipated areas of need that would likely not be covered by the current disaster partner network. Disaster response-related functionality includes monitoring the operational status of disaster partners, impacts on disaster pallet supplies, and emergency food distribution coverage.

As this paper discusses the project's evaluation strategy, the activities of primary focus here involve how criteria, metrics, and methods were developed and used as formative learning opportunities for the project. Through these activities, the evaluation design centered on outputs determining the accuracy, usability, viability for frequent updating, and security of the developed AI-powered decision-making tool. It also analyzed how the project fostered the food bank network's organizational resilience, foundational to their engagement with community partners during the food supply disaster response. The expected impacts of these outcomes were organized across short-term, intermediate, and long-term timeframes, with the project positioned as a scalable innovation for enabling equitable, timely, and appropriate food distribution across HFB's service area. The evaluation also delved into the project's sustainability beyond the grant funding period and its transferability for application by food banks and partners in other communities.
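The chapter does not describe AI-SERVE's internal implementation, so the following is only an illustrative sketch of how the Planning module's user-adjustable inputs and a simple demand-coverage check might be represented. All class names, fields, and the coverage rule below are hypothetical and are not the project team's actual code.

```python
# Illustrative sketch only: AI-SERVE's implementation is not described in this
# chapter. Every name, field, and rule here is a hypothetical stand-in.
from dataclasses import dataclass


@dataclass
class PlanningScenario:
    pandemic_factor: float        # e.g., 0.25 for a +25% supply-chain constraint
    target_food_insecure: float   # share of food-insecure population to cover (0-1)
    target_food_secure: float     # share of food-secure population to cover (0-1)
    max_minutes_private: int      # travel-time thresholds to a distribution site
    max_minutes_public: int
    max_minutes_walking: int


@dataclass
class Tract:
    food_insecure_pop: int
    food_secure_pop: int
    minutes_to_nearest_site: dict  # {"private": ..., "public": ..., "walking": ...}


def tract_is_covered(tract: Tract, s: PlanningScenario) -> bool:
    """Hypothetical rule: a tract counts as covered if any travel mode reaches an
    operational distribution site within that mode's threshold."""
    thresholds = {"private": s.max_minutes_private,
                  "public": s.max_minutes_public,
                  "walking": s.max_minutes_walking}
    return any(tract.minutes_to_nearest_site.get(mode, float("inf")) <= limit
               for mode, limit in thresholds.items())


def demand_coverage(tracts: list[Tract], s: PlanningScenario) -> float:
    """Fraction of the targeted demand (in people) reachable under the scenario.
    The pandemic factor would scale supply-side capacity in a fuller model and is
    carried here only as a parameter."""
    target = sum(t.food_insecure_pop * s.target_food_insecure +
                 t.food_secure_pop * s.target_food_secure for t in tracts)
    served = sum(t.food_insecure_pop * s.target_food_insecure +
                 t.food_secure_pop * s.target_food_secure
                 for t in tracts if tract_is_covered(t, s))
    return served / target if target else 0.0
```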

Immersive Tabletop Exercise Strategy

Building on Design Precedents

As the tool was being developed, the team conducted three two-hour usability assessment sessions virtually through Microsoft Teams with HFB staff from emergency operations, distribution services, and agency services. These sessions focused on a single module and explored the functionality and interface design with the target users. A six-hour in-person session at the HFB headquarters was also conducted with HFB staff, representatives from the county's Office of Emergency Management, tool programmers, and research team members, in which all modules were utilized and discussed.

The design of these sessions was informed by the tabletop exercise (TTX) methodology from the emergency management training sector (Mason & Verner, 2008). TTXs are used extensively in emergency management to help identify strengths (capabilities), weaknesses (needs), and gaps in preparedness for disaster planning/response (Dausey et al., 2007; Fowkes et al., 2010; High et al., 2010). TTXs are discussion-based exercises in which individuals involved in emergency response can dialogue through discussion scenarios, sharing their views on response steps and relating current routines with potential modifications (High et al., 2010; van Laere & Lindblom, 2019). TTXs are intended to support opportunities for participants to collaboratively learn and identify needed revisions to current organizational processes and procedures (Smith & Elliott, 2007). Concluding sessions with debriefing conversations can further enhance the connections for participants between theoretical and real-life applications to future emergencies (Roud et al., 2020).

The team reviewed TTX design precedents (Boling, 2021). The design precedents were toolkits and programs from leading organizations about designing, facilitating, and evaluating disaster response tabletop exercises, including the Harvard University School of Public Health (2013) Public Health Emergency Preparedness Exercise Evaluation Toolkit, the Federal Emergency Management Agency's (FEMA, 2021) Preparedness Toolkit (PrepToolkit), and the Homeland Security Exercise and Evaluation Program (HSEEP) (FEMA, 2020). The design precedents offered insights into session structure, exercise design and development, and the creation of supportive documents such as note-taking sheets for observers. The team envisioned that these immersive, constructivist-based sessions would delve into the usability and utility of the decision support tool. A protocol was developed to guide the three planned usability assessment sessions.

Conducting the Sessions

The protocols were constructed similarly to a semi-structured focus group protocol, with scripted statements and branching questions. The semi-structured protocols made it more likely that the scope of the planned discussions would align with the aims articulated in the project evaluation logic model, namely, to gather user perceptions on the intuitiveness of the tool interface, appropriateness of visualization outputs for various scenario parameter inputs, and suitability of feature preferences. Each session contained four key components:

1. Session participant introductions (to promote relationship-building)
2. Module overview (by tool developer)
3. Scenario-based explorations (with think-aloud and screen sharing by HFB staff)
4. Debriefing discussion on applicability (whole group)

Participants were provided individual user logins and encouraged to express their thoughts and reactions as they navigated the tool through the simulated scenarios. The branching questions covered the range of Bloom's taxonomy of cognitive skills to elicit participant perspectives at levels of knowledge (recall of information), comprehension (understanding of data), application (applying data to solve problems), analysis (identifying patterns and trends), synthesis (using data for prediction), and evaluation (comparing and judging ideas) (Adams, 2015). Figure 9.3 provides an excerpt from a session protocol with Bloom's levels specified.

Let us see the coverage estimated for this year during a flooding event. For this scenario, let us estimate that the pandemic during that flood event is like the status today, which I just looked at on the Harris County dashboard this morning, and we are still in Level 3-Moderate (yellow) out of four levels. Let us interpret that as a +25% pandemic factor.
  ● Participants should set the pandemic factor at 25%.
Let us say that we estimate covering 85% of the food insecure and about 20% of the food secure population.
  ● Participants should set food insecurity to 85% and food security to 20%.
And let us see if we can get the food distribution quite close to where the need is. How about estimating transportation time thresholds of 10 minutes for private transportation, 30 minutes for public transportation, and 20 minutes for walking? And then generate the visualization to see the demand coverage for this scenario.
  ● Participants should set transportation time thresholds to 10 minutes (private), 30 minutes (public), and 20 minutes (walking).
  ● Participants should then generate visualization output.
Questions:
1. What percentage would be covered by the current disaster partners for this scenario? (Knowledge level)
2. Do you find anything in this output surprising or unexpected? (Evaluation level)
3. In what areas of Harris County is the currently planned coverage the lowest? (Comprehension level)
4. Where are there potentially underserved regions? (Analysis level)
5. Why might these areas not have disaster response food distribution sites located nearby? (Analysis level)
6. How would you adjust the scenario parameters to reflect the current goals of HFB's demand coverage more accurately? (Application level)
7. What would the demand coverage be for these adjusted scenario parameters? (Synthesis level)

Fig. 9.3  Excerpt from demand coverage session protocol

For the longer in-person session, an exercise situation manual was developed that defined the responsibilities of the various roles of the team members in the session, detailed the exercise scenarios, and provided note-taking scaffolds. Two exercise scenarios were conducted in the session involving disaster response food distribution during a hypothetical hurricane weather event. In the first exercise, "player" participants did not use the AI-SERVE tool but instead applied their current processes and data resources to navigate the exercise scenario. The AI-SERVE tool was introduced before the second exercise. Participants completed the second exercise using the information from the tool, such as predicted impacts of weather event scenarios on partner availability and anticipated emergency food demands. Following the exercises, semi-structured debriefing discussions guided participants to reflect upon their exercise experiences with and without using the AI-SERVE tool.
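For concreteness, the values the facilitator asks participants to enter in the Fig. 9.3 excerpt can be collected in one place. The field names below mirror the hypothetical sketch given earlier for the Planning module; they are not AI-SERVE's actual parameter names or units.

```python
# Hypothetical mapping of the Fig. 9.3 walkthrough onto the earlier sketch.
fig_9_3_scenario = {
    "pandemic_factor": 0.25,       # "+25% pandemic factor" (Level 3-Moderate, yellow)
    "target_food_insecure": 0.85,  # cover 85% of the food-insecure population
    "target_food_secure": 0.20,    # cover 20% of the food-secure population
    "max_minutes_private": 10,     # travel-time thresholds to a distribution site
    "max_minutes_public": 30,
    "max_minutes_walking": 20,
}
# Generating the coverage visualization for these settings is what the
# knowledge-level question ("What percentage would be covered ...?") asks
# participants to read off.
```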

Lessons Learned

Through this formative evaluation strategy, the decision-making tool has undergone iterative modifications based on the data gathered through the usability assessment and tabletop exercise sessions. User feedback has driven parameter input and navigation adjustments, consolidation of features, and added functionality. For example, simulating the tool in the tabletop exercise resulted in discussions in which the participants saw the tool's utility for re-opening food bank operations post-disaster. Observing users as they attempted to try out the tool in weather event response simulations revealed a high-priority need for greater intuitiveness in the tool's dashboard design.

Preparing for and addressing the multitude of needs created during disaster crises is complicated and dependent on the intersection of many different contingent factors. Throughout the sessions, the team has continued to gain a deeper understanding of the steps the food bank currently takes in its disaster planning, response, and partner communications processes, in which decisions may often be made at higher levels in the organization on an as-needed, just-in-time basis. A key challenge the project seeks to address is bringing in real-time data and coordinated communications to prompt positive changes in these processes. Further, the qualitative discussions during the sessions among the research team, programmers, and civic partners have helped inform how the tool navigation could be better integrated into the food bank's disaster response procedures. The sessions have also supported relationship-building between the research team and the civic partner personnel, fostering productive advances in other project dimensions, such as supporting the development of the project's food bank network organizational resilience index.

The team has learned that designing evaluation sessions utilizing this immersive strategy necessitates coordination between those involved in tool programming and those responsible for evaluation. For instance, discussing and agreeing upon tool modules' release dates in advance of each usability session can provide ample time for the evaluators to develop scenario-based explorations and semi-structured session protocols. Further, holding some of the sessions virtually afforded greater flexibility and involvement of civic partners and research team members to connect, as some research team and HFB staff members are physically located in other states. The virtual format can also enable efficient screen sharing during the hands-on portions and support integrated recording for later re-review by the programmers and evaluators to inform tool modifications and understandings of project progress.


In sum, four key design insights emerged through this project:

1. Intentionally embedded formative learning mechanisms within the project timeline supported iterative and innovative tool development that attended to feasibility considerations through simulation of potential application scenarios.
2. Bringing together representatives from diverse sectors through usability-focused sessions fostered opportunities for shared discussions between designers and users about the extent to which tool prototypes addressed community partner needs.
3. Pairing usability dialog with constructivist learning approaches fostered formative evaluation data collection in interactive and community-building ways.
4. Session planning was supported by developing semi-structured protocol documents that specified session agendas, activity steps, and targeted questions for user feedback across cognitive levels.

The formative learning design in the evaluation approach of this project prioritizes team science to garner more efficient and equitable ways of meeting food security needs amidst the challenges of fast and slow disasters. By bringing together expertise from varied research backgrounds and civic representatives, the project has generated a technology that curates real-time data useful for guiding disaster planning and response decision-making by the food bank and its partners. The immersive tabletop exercise strategy enabled the team to simulate situations in which the tool could be applied to identify how well the tool functions in relation to target user needs. Further, the strategy fostered shared experiences and meaningful dialogue across sectors that can strengthen the network of partner relationships and support greater resiliency during disaster times and beyond.

Acknowledgements  The authors would like to thank Dr. Elizabeth Fletcher, tabletop exercise lead, other members of the project research team (Drs. Hiba Baroud, Casey Durand, Aron Laszka, and Bruce Race), and Erica Banks of the Houston Food Bank for their partnership in this project.

References

Adams, N. E. (2015). Bloom's taxonomy of cognitive learning objectives. Journal of the Medical Library Association, 103(3), 152–154. https://doi.org/10.3163/1536-5050.103.3.010
Blessley, M., & Mudambi, S. M. (2022). A trade war and a pandemic: Disruption and resilience in the food bank supply chain. Industrial Marketing Management, 102, 58–73. https://doi.org/10.1016/j.indmarman.2022.01.002
Boling, E. (2021). The nature and use of precedent in designing. In J. K. McDonald & R. E. West (Eds.), Design for learning. EdTech Books. https://edtechbooks.org/id/precedent
Bouek, J. W. (2018). Navigating networks: How nonprofit network membership shapes response to resource scarcity. Social Problems, 65(1), 11–32. https://doi.org/10.1093/socpro/spw048
Campbell, A. M., & Jones, P. C. (2011). Prepositioning supplies in preparation for disasters. European Journal of Operational Research, 209(2), 156–165. https://doi.org/10.1016/j.ejor.2010.08.029
Chakraborty, J., Aun, J. J., & Schober, G. S. (2022). Assessing the relationship between emergency food assistance and social vulnerability during the COVID-19 pandemic. Applied Spatial Analysis and Policy, 16, 259–276. https://doi.org/10.1007/s12061-022-09478-8
Coates, J. (2013). Build it back better: Deconstructing food security for improved measurement and action. Global Food Security, 2(3), 188–194. https://doi.org/10.1016/j.gfs.2013.05.002
Correa, N. P., & ACE Coalition Food Insecurity Workgroup. (2017). Food insecurity screening in Houston and Harris County: A guide for healthcare professionals (p. 31). Baylor College of Medicine and Texas Children's Hospital. https://www.texaschildrens.org/sites/default/files/uploads/Food%20Insecurity%20Report%20Final.pdf
Dausey, D. J., Buehler, J. W., & Lurie, N. (2007). Designing and conducting tabletop exercises to assess public health preparedness for manmade and naturally occurring biological threats. BMC Public Health, 7(1), 92. https://doi.org/10.1186/1471-2458-7-92
Feeding America. (2022). Facts about hunger and poverty in America. https://www.feedingamerica.org/hunger-in-america/facts
FEMA. (2020, February). Homeland security exercise and evaluation program. https://www.fema.gov/emergency-managers/national-preparedness/exercises/hseep
FEMA. (2021, November 9). Exercise and preparedness tools. https://www.fema.gov/emergency-managers/national-preparedness/exercises/tools
Food and Agriculture Organization of the United Nations. (2003). Trade reforms and food security: Conceptualizing the linkages. http://www.fao.org/3/y4671e/y4671e.pdf
Fowkes, V., Blossom, H. J., Sandrock, C., Mitchell, B., & Brandstein, K. (2010). Exercises in emergency preparedness for health professionals in community clinics. Journal of Community Health, 35(5), 512–518. https://doi.org/10.1007/s10900-010-9221-1
Harvard University School of Public Health. (2013). Public health emergency preparedness exercise evaluation toolkit manual. Harvard School of Public Health. https://asprtracie.hhs.gov/technical-resources/resource/3626/public-health-emncy-preparedness-exercise-evaluation-toolkit-manual
Herskovitz, J. (2015, September 14). America's city rankings set for Texas-sized shake up; Houston to edge past Chicago. Reuters. https://www.reuters.com/article/us-usa-houston-idUSKCN0RD0E420150914
High, E. H., Lovelace, K. A., Gansneder, B. M., Strack, R. W., Callahan, B., & Benson, P. (2010). Promoting community preparedness: Lessons learned from the implementation of a chemical disaster tabletop exercise. Health Promotion Practice, 11(3), 310–319. https://doi.org/10.1177/1524839908325063
Houston Food Bank. (n.d.). About us. https://www.houstonfoodbank.org/about-us/
Mason, D., & Verner, D. (2008). Tabletop exercises drive home the importance of drought planning. Journal AWWA, 100(8), 48–50. https://doi.org/10.1002/j.1551-8833.2008.tb09697.x
National Research Council. (2015). Enhancing the effectiveness of team science. The National Academies Press. https://doi.org/10.17226/19007
National Science Foundation. (2022). Civic innovation challenge. https://nsfcivicinnovation.org/
Roud, E., Gausdal, A. H., Asgary, A., & Carlström, E. (2020). Outcome of collaborative emergency exercises: Differences between full-scale and tabletop exercises. Journal of Contingencies and Crisis Management, 29(2), 170–184. https://doi.org/10.1111/1468-5973.12339
Smith, D., & Elliott, D. (2007). Exploring the barriers to learning from crisis: Organizational learning and crisis. Management Learning, 38(5), 519–538. https://doi.org/10.1177/1350507607083205
van Laere, J., & Lindblom, J. (2019). Cultivating a longitudinal learning process through recurring crisis management training exercises in twelve Swedish municipalities. Journal of Contingencies and Crisis Management, 27(1), 38–49. https://doi.org/10.1111/1468-5973.12230

Chapter 10

How a Novice Instructional Designer Embraced a Design Thinking Mindset Through a Learning Design Course

Jing Song and Wanju Huang

Abstract  A design thinking mindset is crucial to instructional designers' success. Using the collaborative autoethnography method, this study investigated how a student instructional designer developed a design thinking mindset in a graduate-level learning design course at a land grant university in the Midwest. The findings suggested that the student incorporated design thinking when creating the learning module – a final project for the learning design course. Findings also revealed that the student transformed herself from a novice designer into a designer who understood and embraced a design thinking mindset purposefully and intentionally throughout the course.

Keywords  Design thinking · Design thinking mindset · Instructional designer · Collaborative autoethnography · Learning design course

Design thinking refers to how designers see the world and think (Liu, 1996). It is a practical approach to problem solving (Dam & Siang, 2022). Design thinking practitioners strive to use a human-centered approach instead of a technology-centered or organization-centered approach to address complex problems (Brown, 2008; Kimbell, 2015; Koh et al., 2015). Developing one's design thinking skills is crucial to success in the modern world (Razzouk & Shute, 2012). Design thinking has gained popularity in many contexts, such as healthcare, the automotive industry, and business settings (Melles et al., 2012; Shé et al., 2022). It has become an integral part of the academic environment, and many institutions require students to read and reason critically to solve complex problems (Razzouk & Shute, 2012). Therefore, it can be considered an instrumental tool for teaching and learning twenty-first-century skills (Henriksen et al., 2017; Luka, 2020).


Design thinking has been incorporated into school curricula in various ways and in different contexts (Melles et al., 2012). Koh et al. (2015) showcased a framework to integrate design thinking into Singaporean and Taiwanese school curricula to improve traditional educational systems. McKilligan et al. (2017) suggested using design thinking as the fundamental strategy to transform a traditional engineering department into an agile department that sustains new ways of thinking. Henriksen et al. (2017) argued design thinking could offer an approachable structure to help teachers and educators address complex educational problems creatively. In a case study, Shé et al. (2022) indicated that design thinking skills helped instructional designers to develop engaging and effective online courses for learners.

Design thinking is a non-linear, iterative, systematic process, mindset, and toolbox (Brenner et al., 2016). It combines divergent and convergent thinking skills (Brenner et al., 2016). It has been used to signal an individual designer's knowledge and approach to their own work (Kimbell, 2015). There are five stages in the design thinking process: empathize, define, ideate, prototype, and test (Dam & Siang, 2022; Shé et al., 2022). Empathize is the first stage in the process, in which one develops an empathic understanding of the problem and the end-users (Dam & Siang, 2022; Kimbell, 2015; Shé et al., 2022). During the define stage, information gathered in the empathize phase is analyzed and synthesized to identify the core problems (Dam & Siang, 2022). The third stage is to ideate: designers start to look for alternative perspectives to view the problem and generate ideas to solve it (Dam & Siang, 2022; Shé et al., 2022). The final two steps are to prototype designers' ideas and run tests to evaluate the possible solutions.

Integrating the five design thinking stages into a term course and helping students, especially novice instructional designers, to develop and hone this skill is essential (Shé et al., 2022). In this study, we investigated the development of one student's design thinking skills in a learning design course. Our research questions were:

• To what extent were design thinking skills integrated into the learning module that the student created?
• How did this novice instructional designer grow through the course?

Context

The context of this research is a graduate-level course on learning design in a Learning Design and Technology program at a land grant university in the Midwest. The course examines the processes of instructional design within a project-based context and focuses on designing learning strategies that are motivating, efficient, and effective. The primary learning goal of this course was to describe and apply the components of a systematic design process, its rationales, and its uses. Specifically, students were expected to use a systematic design process and instructional psychology principles in performing an analysis of the learning context and learning tasks, in the design and development of assessments, and in the design of the formative and summative evaluation of the instruction itself.


Most of the learners in this class were first-year doctoral students in the program. While most of them had teaching experience in K-12, higher education, or the private sector, none of them had taken any instructional design classes prior to this class. The major assignments of the course were three design documents developed in three stages. These assignments guided students through the steps of a systematic learning design model developed by Dick et al. (2015) and culminated in a learning module.

Data

This study reviews one student's growth in design thinking by evaluating the learning module she created and the reflections she shared along with the course assignments. This student's artifacts and assignments were chosen because she had a deeper conversation with the instructor a few weeks after the class ended, reflecting on her experiences and growth in the class. The instructor suggested conducting a study using collaborative autoethnography to evaluate and capture those experiences. The Design Thinking Disposition Scale (DTDS) designed by Tsai and Wang (2020) was utilized to evaluate and measure this student's design thinking skills under four subscales: empathize, define, ideate, and prototype. The test stage was excluded from the analysis because it was not required by the course.
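The DTDS subscale totals reported later in Tables 10.1, 10.2, 10.3, and 10.4 are simple sums of 5-point Likert item ratings (five items each for Ideate, Prototype, and Define, and three for Empathize). As a minimal illustration of that arithmetic only (not part of Tsai and Wang's published instrument or of the authors' analysis procedure), the sketch below recomputes two of the Table 10.1 subscale totals from their item ratings.

```python
# Minimal sketch of how the subscale totals in Tables 10.1-10.4 follow from
# the 5-point item ratings; this is illustrative, not the authors' code.
from collections import defaultdict


def score_dtds(ratings: dict[str, int]) -> dict[str, str]:
    """Return 'total/max' per subscale, keyed by item prefix (ID, PR, EM, DE)."""
    totals, counts = defaultdict(int), defaultdict(int)
    for item, rating in ratings.items():
        subscale = item.split()[0]      # "ID 1" -> "ID"
        totals[subscale] += rating
        counts[subscale] += 1
    return {s: f"{totals[s]}/{counts[s] * 5}" for s in totals}


# Ideate and Empathize item ratings as reported in Table 10.1.
example = {"ID 1": 3, "ID 2": 4, "ID 3": 3, "ID 4": 3, "ID 5": 4,
           "EM 1": 5, "EM 2": 4, "EM 3": 5}
print(score_dtds(example))   # {'ID': '17/25', 'EM': '14/15'}
```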

Data Analysis

We utilized the collaborative autoethnography method to analyze the data and answer the research questions (Chang et al., 2012; McDonald et al., 2021; Roy & Uekusa, 2020). Chang et al. (2012) explained that "collaborative autoethnographers adopt a team research model in which a group of researchers conduct research and produce reports together... [They] turn their interrogative tools on themselves, generating and utilizing their autobiographical data to understand social phenomena" (p. 37). In this paper, the second author was brought into the research to examine the first author's artifacts and reflections and worked with the first author to capture her growth and development through the project. This is a research method similar to that implemented in McDonald et al. (2021). Chang et al. (2012) named this method a "'partial' mode of autoethnographic collaboration" (as cited in McDonald et al., 2021, p. 92). It "provides the benefits of sitting at the intersection of the personal, lived experiences of some team members, with the more detached, 'interpretive' focus provided by others" (McDonald et al., 2021, p. 92).


Results

Evidence of the Implementation of the Design Thinking Skills in the First Design Document

The Ideate Stage
Jane showed fewer design thinking skills in this stage than in the "Define" stage. Her score is 17/25. She was comfortable coming up with various solutions and different problem-solving approaches but had difficulty proposing new and innovative solutions. In addition, generating solutions through brainstorming was not her typical process.

The Prototype Stage
Jane's score is 17/25. She demonstrated design thinking skills by providing examples for showing an idea and designing a preliminary model. However, she scored poorly in drawing a picture or making a design model. In the reflection of Design Document 1, she wrote: "I have a hard time differentiating the relationship between the instructional goal of this training program and the purpose of the online English course for children." It seemed challenging for her to switch her role from being an online instructor to an instructional designer. It would be interesting to find out how she dealt with this challenge later in the class.

The Empathize Stage
Jane's score is high (14/15). This is not a surprise since she is a subject matter expert for this project and has been working with the potential learners of this project for 4 years. In fact, her reflection confirmed this result: "If I must choose one part that's going well so far, I would pick the gap analysis section, which is relatively straightforward because I have been teaching this online course to first-graders for several years."

The Define Stage
Jane scored 22/25 in this stage. In designing this learning module, she was clear about the learning goal – to provide strategies and resources to the parents so that they can guide their children's English learning at home. She also found out the criteria for successful work and clarified the problem she was facing (Table 10.1).

Table 10.1  Jane's implementation of the design thinking skills in the first design document

Subscale    No.    Retained DTDS item ("When I design a work, ...")               Score (1-5)   Total
Ideate      ID 1   I usually generate solutions via brainstorming.                3
            ID 2   I usually come up with various solutions.                      4
            ID 3   I usually come up with innovative solutions.                   3
            ID 4   I usually come up with new solutions.                          3
            ID 5   I usually think of different problem-solving approaches.       4             17/25
Prototype   PR 1   I usually make a model of the design.                          2
            PR 2   I usually draw a picture of the design.                        2
            PR 3   I usually provide an example for showing an idea.              5
            PR 4   I usually provide an idea by assembling objects.               3
            PR 5   I usually try to design a preliminary model.                   5             17/25
Empathize   EM 1   I usually try to imagine the feelings of the users.            5
            EM 2   I usually try to consider the practice of the work.            4
            EM 3   I usually try to feel the needs of the users.                  5             14/15
Define      DE 1   I usually make it clear about the problem that I am facing.    5
            DE 2   I usually try to clarify the problems that must be solved.     4
            DE 3   I usually try to find out the criteria of successful works.    5
            DE 4   I usually see the gap between the current work and the goal.   3
            DE 5   I usually try to clearly know the goal to achieve.             5             22/25

Evidence of the Implementation of the Design Thinking Skills in the Second Design Document

The Ideate Stage
Jane scored 19/25. Compared to her design thinking skills in the first design document, she seemed more confident in coming up with various solutions. In the reflection of her Design Document II, she wrote: "I also considered the possibilities of making my training module asynchronous. However, considering the workload of knowing Canvas and creating the materials at this time of the semester, I chose synchronous online as my option."

The Prototype Stage
Jane's score is 18/25. She demonstrated design thinking skills by providing examples for showing an idea and designing a preliminary model. However, she still struggled with making a model or drawing a picture of the design.

The Empathize Stage
Jane's score is high (14/15), the same as in the first design document. As mentioned previously, Jane is a subject matter expert for this project and has been working with the potential learners of this project for 4 years.

The Define Stage
Jane scored 24/25 in this stage. The score shows an increase in Jane's skill in the Define stage, and her reflection confirmed this change (Table 10.2): "Overall, I feel Design Document II is less intimidating compared to Design Document I. It seems like when you have set a clear instructional goal and made an explicit diagram of the steps, the following parts come naturally."

Table 10.2  Jane's implementation of the design thinking skills in the second design document

Subscale    No.    Retained DTDS item ("When I design a work, ...")               Score (1-5)   Total
Ideate      ID 1   I usually generate solutions via brainstorming.                4
            ID 2   I usually come up with various solutions.                      5
            ID 3   I usually come up with innovative solutions.                   3
            ID 4   I usually come up with new solutions.                          3
            ID 5   I usually think of different problem-solving approaches.       4             19/25
Prototype   PR 1   I usually make a model of the design.                          3
            PR 2   I usually draw a picture of the design.                        3
            PR 3   I usually provide an example for showing an idea.              5
            PR 4   I usually provide an idea by assembling objects.               3
            PR 5   I usually try to design a preliminary model.                   5             18/25
Empathize   EM 1   I usually try to imagine the feelings of the users.            5
            EM 2   I usually try to consider the practice of the work.            4
            EM 3   I usually try to feel the needs of the users.                  5             14/15
Define      DE 1   I usually make it clear about the problem that I am facing.    5
            DE 2   I usually try to clarify the problems that must be solved.     5
            DE 3   I usually try to find out the criteria of successful works.    5
            DE 4   I usually see the gap between the current work and the goal.   4
            DE 5   I usually try to clearly know the goal to achieve.             5             24/25

Evidence of the Implementation of the Design Thinking Skills in the Third Design Document

The Ideate Stage
Jane's score is 18/25, which is lower than in the previous design document. In her reflection, she mentioned that "the challenging part of design document III is the design evaluation chart"; she wracked her mind to come up with home-learning strategies.

The Prototype Stage
Jane's score is 19/25. She seemed comfortable designing a preliminary model when developing Design Document III. Though she still struggled with making a model or drawing a picture of the design, she liked using examples to show ideas. Noticeably, in the reflection, she wrote: "I enjoyed working on the implementation plan and the evaluation plan. I am lucky that I have three target learners [who] participate in the pilot workshop, and my target learners like to talk with me."

The Empathize Stage
Jane's score remains the same as in the previous two design documents.

The Define Stage
Jane scored 23/25 in this stage. Compared to Design Document II, her score dropped one point because there was a miscommunication between her and her peers in defining the problem. In her reflection, she mentioned (Table 10.3): "I sought peer feedback from Mary and Alex for this project. Mary was very helpful to guide me through the steps. However, she complicated the project. Therefore, I was overwhelmed by designing all the appendixes which were not required at this stage."


Table 10.3  Jane's implementation of the design thinking skills in the third design document

Subscale    No.    Retained DTDS item ("When I design a work, ...")               Score (1-5)   Total
Ideate      ID 1   I usually generate solutions via brainstorming.                4
            ID 2   I usually come up with various solutions.                      4
            ID 3   I usually come up with innovative solutions.                   3
            ID 4   I usually come up with new solutions.                          4
            ID 5   I usually think of different problem-solving approaches.       4             18/25
Prototype   PR 1   I usually make a model of the design.                          3
            PR 2   I usually draw a picture of the design.                        3
            PR 3   I usually provide an example for showing an idea.              5
            PR 4   I usually provide an idea by assembling objects.               3
            PR 5   I usually try to design a preliminary model.                   5             19/25
Empathize   EM 1   I usually try to imagine the feelings of the users.            5
            EM 2   I usually try to consider the practice of the work.            4
            EM 3   I usually try to feel the needs of the users.                  5             14/15
Define      DE 1   I usually make it clear about the problem that I am facing.    5
            DE 2   I usually try to clarify the problems that must be solved.     4
            DE 3   I usually try to find out the criteria of successful works.    5
            DE 4   I usually see the gap between the current work and the goal.   4
            DE 5   I usually try to clearly know the goal to achieve.             5             23/25

Evidence of the Implementation of the Design Thinking Skills in the Final Design Document

The Ideate Stage
Jane's score is 18/25. This score is lower than that in the first design document. She is good at brainstorming solutions and trying to address problems with various solutions. However, coming up with new and innovative solutions seemed to be challenging for her.


The Prototype Stage
Jane's score is 22/25. She was doing much better in demonstrating ideas with examples. In addition, she was also accustomed to designing a preliminary model and providing an idea by assembling objects. However, the score still indicates that she did not rely on drawing to help her visualize her design.

The Empathize Stage
Jane scored full marks (15/15). In addition to relying on her previous work experience with potential learners, she interviewed and observed the target learners to enhance her understanding of their needs when working on the second design document.

The Define Stage
Jane scored 23/25 in this stage. She was clear about the problem she was facing and was able to further clarify it after the interviews and observations with potential learners. Also, the learner analysis, performance, and learning context analysis conducted in the second design document helped her to further specify the performance goals (Table 10.4).

Table 10.4  Jane's implementation of the design thinking skills in the final design document

Retained items in the design thinking disposition scale. Guidance: When I design a work, ...

Ideate
  ID 1  I usually generate solutions via brainstorming.
  ID 2  I usually come up with various solutions.
  ID 3  I usually come up with innovative solutions.
  ID 4  I usually come up with new solutions.
  ID 5  I usually think of different problem-solving approaches.
  DTDS scores (5-point Likert scale): 4, 4, 3, 3, 4. Total DTDS score: 18/25

Prototype
  PR 1  I usually make a model of the design.
  PR 2  I usually draw a picture of the design.
  PR 3  I usually provide an example for showing an idea.
  PR 4  I usually provide an idea by assembling objects.
  PR 5  I usually try to design a preliminary model.
  DTDS scores (5-point Likert scale): 5, 3, 5, 3, 5. Total DTDS score: 22/25

Empathize
  EM 1  I usually try to imagine the feelings of the users.
  EM 2  I usually try to consider the practice of the work.
  EM 3  I usually try to feel the needs of the users.
  DTDS scores (5-point Likert scale): 5, 5, 5. Total DTDS score: 15/15

Define
  DE 1  I usually make it clear about the problem that I am facing.
  DE 2  I usually try to clarify the problems that must be solved.
  DE 3  I usually try to find out the criteria of successful works.
  DE 4  I usually see the gap between the current work and the goal.
  DE 5  I usually try to clearly know the goal to achieve.
  DTDS scores (5-point Likert scale): 5, 5, 4, 4, 5. Total DTDS score: 23/25
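To illustrate the arithmetic behind the subscale totals reported in Tables 10.3 and 10.4, the short sketch below shows how per-item ratings on a 5-point Likert scale can be summed into DTDS subscale scores (e.g., 18/25 or 15/15). The ratings and variable names are illustrative only; they are not Jane's actual responses or a tool used in the study.

```python
# Illustrative sketch only: aggregating per-item DTDS ratings into subscale totals.
# The ratings below are hypothetical examples, not data from the study.

dtds_ratings = {
    "Ideate":    {"ID1": 4, "ID2": 4, "ID3": 3, "ID4": 4, "ID5": 4},
    "Prototype": {"PR1": 3, "PR2": 4, "PR3": 5, "PR4": 3, "PR5": 5},
    "Empathize": {"EM1": 5, "EM2": 4, "EM3": 5},
    "Define":    {"DE1": 5, "DE2": 4, "DE3": 5, "DE4": 4, "DE5": 5},
}

MAX_RATING = 5  # each retained item is rated on a 5-point Likert scale

for subscale, items in dtds_ratings.items():
    total = sum(items.values())        # observed subscale score
    maximum = MAX_RATING * len(items)  # 25 for 5-item subscales, 15 for Empathize
    print(f"{subscale}: {total}/{maximum}")
```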

Evidence of the Novice Instructional Designer's Growth

Jane's reflections were analyzed four times during the semester. At first, the tone suggested that she felt challenged and intimidated, but it later switched to a more positive tone indicating that she felt relieved and successful. Like many students new to the program, she thought that the course project was challenging. It was difficult for her to specify the instructional goal and align the scope of her design project with the assignment requirement: a one-hour learning module. In the beginning, she wanted to design an online English course for Chinese first grade students, which was entirely outside of the project's scope. She included four goals (guide, supervise, participate, and evaluate) in her first design. After reviewing the course examples and receiving peer feedback, Jane narrowed her focus to the "guide" element. The finalized instructional goal was: "parents will use the suggested home-schooling strategies and available resources to guide Chinese first grade students' English learning at home." Thus, the course changed from a general English course to a course that educates parents or caregivers on how to provide after-class guidance to children who take online English courses.

The tone in the second reflection indicated Jane's sense of achievement in working on the second design document. She started the work 3 weeks prior to the assignment deadline and was able to receive feedback from her peers and instructor to improve her design document before submitting it. The positive feedback from her peers and the instructor bolstered her confidence. However, it was still challenging to analyze the performance and learning contexts when the available examples were not suitable for her design project. Overall, she felt that once the instructional goal was clearly set, the rest of the design became less intimidating.

The success that Jane felt when developing her second design document continued when she was working on her third design document. She enjoyed working on


the project even though, at one point, she felt overwhelmed by unclear peer feedback. Finding potential learners to pilot the learning module and conducting post-­ pilot interviews was easy for her because of her prior working experiences. Jane’s final reflection showed a sense of relief and triumph as she was able to develop her learning module successfully. Additionally, her reflection also revealed major growth in her instructional design knowledge and skills. She “spoke” the instructional design language fluently and was able to point out critical elements of design thinking skills, such as systematic thinking, the dynamic and iterative design process, and communication. Below is a direct quote from her final reflection: Systematic thinking is fundamental to instructional designers. Also, instructional design should be dynamic, and the design process is iterative. As the design proceeds, one should look back and revise the work as necessary.

Conclusion & Future Study

This paper documents the growth of a novice student's design thinking skills by evaluating the reflections she submitted for a course on learning design. The delay between reflection submission and data analysis limits the validity of the study. However, this paper presents a method for a student and an instructor to revisit the course and the student's experience. Through this process, they identified a design skill assessment scale for the student to quantify her design thinking level. Future studies should examine whether incorporating this self-evaluation scale as a tool for students to assess their own design thinking would help them develop design thinking skills incrementally.


Chapter 11

How has Virtual Reality Technology Been Used to Deliver Interventions That Support Mental Well-Being? A Systematic Literature Review

Minyoung Lee, Matthew Schmidt, and Jie Lu

M. Lee (*) · J. Lu
University of Florida, Gainesville, FL, USA
e-mail: [email protected]; [email protected]

M. Schmidt
University of Georgia, Athens, GA, USA
e-mail: [email protected]

Abstract  In this paper, the authors present findings from a systematic literature review that interrogates the degree to which virtual reality (VR) technology has been used to deliver interventions that support mental well-being. Following PRISMA guidelines, a total of n = 853 articles were returned. After removing duplicates and applying inclusion and exclusion criteria during abstract and title screening, a total of n = 12 articles remained. The full text of these 12 articles was analyzed and the results were summarized and synthesized. Based on our analysis, we assert that a better articulation of what constitutes VR in the field of mental well-being is needed. Further, findings suggest that research methodologies, study outcomes, target populations, etc., are remarkably heterogeneous, which could be explained by the complex nature of the concept of mental well-being. As a result, findings may lack reliability or generalizability due to disparate research approaches. Implications and directions for further research are provided.

Keywords  Virtual reality · Mental well-being · Intervention · Immersive technology

This chapter presents findings from a PRISMA-based systematic literature review that interrogates how virtual reality technology has been used as an intervention modality to support mental well-being in clinical patients, non-clinical patients, and the general population. The purpose of this research was to explore how researchers, designers, practitioners, providers, and therapists have incorporated virtual reality


(VR) technologies into interventions to address mental health issues such as stress, anxiety, depression, etc. In the following sections, the authors first introduce VR technologies and the concept of mental well-being. Next, the authors discuss how VR has been used in wellness contexts. Finally, the authors identify a gap in the literature regarding the relationship between VR and mental well-being and present the significance and rationale for the study.

Virtual Reality Technology Virtual reality (VR) belongs to a family of digital tools known as immersive technologies. Interest in immersive technologies, which include VR as well as augmented reality (AR) and extended reality (XR), is increasing. These technologies are quickly becoming pervasive, leading to a range of efforts to define immersive technology. Extant definitions illustrate a range of perspectives. For example, Slater (2009) defines immersive technology as a type of technology that provides high-­ quality sensory information to the user. Others define immersive technologies as being able to provide an environment in which humans and virtual objects interact and integrate with one another by creating a sense of immersion and increasing the realism of their own virtual world experience (Desai et al., 2014; Soliman et al., 2017). In practical terms, immersive technology can allow users to experience immersion in a digital environment in which real-life objects and the virtual environment blur together (Di Serio et al., 2013; Lee et al., 2013; Zeng & Richardson, 2016). More specifically, in VR, users can control their movements and actions in a virtual world that may represent or resemble a real-life experience for them, leading to perceptions of actually being present in the virtual world. The psychology of VR is rooted in the concept of immersion, which is a quality of the system’s technology itself. Immersion can be considered an objective measure of the technology system’s ability to present the virtual environment while excluding real-life environmental elements (Sadowski & Stanney, 2002). Related to this, the level of immersion afforded by a VR system can promote perceptions of presence, that is, the psychological perception of actually being present in the digitally-­rendered environment (Cummings & Bailenson, 2016). The interplay of immersion and presence must be confronted by designers of VR systems as a complex, interconnected, and interdependent relationship, rendering the design of such systems remarkably challenging (Schmidt, 2014). For the purposes of this systematic review, we adopt the definition of VR proposed by Bamodu and Ye (2013). These scholars highlight the interactive components of computer-based simulation and interaction in the VR environment and how these promote sensations of being immersed through different audio, visual, and haptic stimuli (Craig et al., 2009; Kutz, 1998; Mihelj et al., 2014). These sensations can be cultivated using a range of different VR hardware systems. For example, desktop-based VR systems allow users to interact in a 3D virtual environment using a keyboard and mouse as input devices and a computer monitor as a display (Isdale,


1998). A more sophisticated device is the head-mounted display (HMD), a wearable VR system. HMDs use optical lenses coupled with display technologies to place a view of the virtual environment directly in front of the user’s eyes, fully blocking out views of the natural environment (Bamodu & Ye, 2013). A further VR technology is the Cave Automatic Virtual Environment (CAVE), which is similar to desktop-­based VR in that the user does not use a wearable device, but differs in that CAVE systems use digital projector systems to present the virtual environment on screens that surround the user in the real world, allowing users to move somewhat freely while experiencing the digital environment (Barrett et  al., 2011). More recently, mobile-based VR systems have emerged, which allow users to engage with VR environments using smartphone affordances such as gyroscopes and gesture controls, so long as an internet connection is present (Burdea & Coiffet, 2003).

Mental Well-Being and Virtual Reality Technologies Research seeking to investigate different VR technologies has been increasing steadily, not only in the entertainment industry, but also in scientific fields such as manufacturing, engineering, biology, education, and healthcare (Pears et al., 2020; Pribeanu et al., 2017). Of particular interest for the current study is research in the context of mental health and mental well-being. Several studies have explored how VR technologies can be used to influence mental well-being issues such as anxiety and depression (Clemmensen et al., 2020; Rizzo & Wiederhold, 2006; Yen & Chiu, 2021). Literature reviews have begun to emerge focusing on VR technology as a potential treatment tool to address mental health disturbances such as psychological distress and psychological burden (Hatta et al., 2022a, b; Kelson et al., 2021; Riva et al., 2020; Valmaggia et al., 2016). Within the field of positive psychology, it is acknowledged that mental well-being is a complex construct (Rose et al., 2017), encompassing positive emotions, engagement, positive relationships, meaning, and accomplishment (i.e., “PERMA”; Seligman, 2012). Mental well-being can include emotional well-being, social well-being, and psychological well-being, which involves positive individual functioning and positive or negative affect. In the current climate of COVID-19, mental well-being has become particularly relevant due to increasingly exacerbated negative outcomes such as stress, depression, anxiety, and fear/phobia (Siani & Marley, 2021). Virtual reality is seen as a potentially promising advanced technology for addressing negative mental well-being outcomes. This has led to an increase in research interest in this area, the majority of which is situated in healthcare contexts. VR has been widely implemented in healthcare, with a predominant focus on interventions for physical issues and problems, for example, chronic stroke (Mazzini et al., 2019) and functional upper limb activities (Kalron et al., 2020). However, less focus has been placed on studies that use VR for psychological challenges (Lee et al., 2018; Lu et al., 2022; Park et al., 2019), and research supporting the use of VR for mental wellness or for mental health-related purposes is particularly limited.


Examples of such research include Kelson et al.'s (2021) study focusing on using VR to help adolescents manage psychological distress, and Hatta and colleagues' use of VR as a psychological intervention for mental health disturbances during the COVID-19 pandemic (Hatta et al., 2022a). Although examples in the peer-reviewed literature of investigations of VR applications for mental health are now beginning to emerge, research in this area is still nascent (Frewen et al., 2020). Researchers increasingly are acknowledging the promise of VR as an intervention modality; however, empirical research that reports outcomes of VR-based studies to support mental well-being is limited. Hence, there is a need to explore more intentionally the relationship between VR technology and mental well-being interventions. Therefore, this systematic literature review aimed to rigorously examine the literature reporting how VR technologies have been used to support mental well-being. The research question that guided this study was: How has virtual reality technology been used to deliver interventions that support mental well-being?

Methods

To identify relevant articles, the authors followed the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) standards (Moher et al., 2009). A total of 853 articles were identified initially. After duplicates were removed, title and abstract screening of the remaining 645 articles returned n = 14 articles for full-text review, and a total of n = 12 articles were retained after the full-text review process (Fig. 11.1). These articles were analyzed and synthesized, and the results are reported here. As per PRISMA guidelines, our research protocol was registered at the Open Science Framework (OSF) and given the following registration number: 10.17605/OSF.IO/KZUXE.

Information Source and Search Strategy

Five electronic databases were searched: (1) ACM Digital Library, (2) IEEE Xplore, (3) Web of Science, (4) PubMed, and (5) SCOPUS (see Table 11.1). ACM Digital Library was selected because it is a full-text database of computing machinery research that is accessible to the general public as well as to academic professionals. IEEE Xplore was selected because it publishes a large number of research articles on advanced immersive technologies. The Web of Science database was used because it encompasses a variety of topics, including psychoeducation and technology, that align with the terms in the current study's research question. In addition, PubMed was selected because a great deal of VR research related to mental well-being is indexed in this database and because it focuses on biomedical and life sciences literature and related disciplines such as mental health and mental well-being. Lastly, the authors used SCOPUS because it covers more than 30,000 peer-reviewed journals in top-level subject fields such as the social sciences and health sciences. All database searches were conducted between August and September of 2022, and search results were verified by a second researcher on a bi-weekly basis during this period. Across all databases, the search strings shown in Table 11.1 were used.

Fig. 11.1  PRISMA flow diagram illustrating the search and selection process

Table 11.1  Search strings used for each database

ACM Digital Library, IEEE Xplore:
("mental wellbeing" OR "mental well-being" OR "mental wellness" OR "social wellness" OR "emotional wellness" OR "mental health" OR "psychological wellbeing" OR "psychological well-being" OR "quality of life") AND "intervention" AND ("virtual reality" OR "virtual environment" OR "VR")

Web of Science, SCOPUS, PubMed:
("mental wellbeing" OR "mental well-being" OR "mental wellness" OR "social wellness" OR "emotional wellness" OR "mental health" OR "psychological wellbeing" OR "psychological well-being") AND "intervention" AND ("virtual reality" OR "virtual environment" OR "VR")
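As an illustration of how the same query logic can be applied consistently across databases, the sketch below assembles the Boolean strings in Table 11.1 from the three term groups (well-being terms, the intervention term, and VR terms). It is a reconstruction for illustration only; the function and variable names are assumptions, not a tool the authors report using.

```python
# Illustrative sketch: assembling the Boolean search strings shown in Table 11.1.

wellbeing_terms = [
    "mental wellbeing", "mental well-being", "mental wellness",
    "social wellness", "emotional wellness", "mental health",
    "psychological wellbeing", "psychological well-being",
]
vr_terms = ["virtual reality", "virtual environment", "VR"]

def build_query(include_quality_of_life: bool) -> str:
    """OR the terms within each group, then AND the groups together."""
    def group(terms):
        return "(" + " OR ".join(f'"{t}"' for t in terms) + ")"
    wellbeing = wellbeing_terms + (["quality of life"] if include_quality_of_life else [])
    return f'{group(wellbeing)} AND "intervention" AND {group(vr_terms)}'

print(build_query(include_quality_of_life=True))   # ACM Digital Library, IEEE Xplore
print(build_query(include_quality_of_life=False))  # Web of Science, SCOPUS, PubMed
```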

Inclusion and Exclusion Criteria

Inclusion and exclusion criteria were developed across several iterations by two researchers under the advisement of a university professor. The following inclusion criteria were used:

1. Manuscript is peer-reviewed,
2. Manuscript is a primary source,
3. Manuscript is published in a peer-reviewed academic journal,
4. Manuscript is in English,
5. Technology used in the project is VR technology,
6. VR intervention is designed specifically for mental wellness/well-being,
7. Manuscript reports outcomes of the VR intervention.

The following exclusion criteria were used:

1. Manuscript is an introduction for a special issue, book, book chapter, conference proceeding, or workshop proceeding, and/or a non-empirical study,
2. VR was not used in the study,
3. The design or implementation context for the technology used does not support mental well-being,
4. Empirical evidence/results are not provided,
5. Full text could not be located by the authors.
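A minimal sketch of how such criteria can be operationalized during screening is shown below. The record fields and function names are hypothetical and simply mirror the inclusion and exclusion rules listed above; they are not taken from the authors' Covidence workflow.

```python
# Illustrative sketch: applying the inclusion/exclusion criteria to one screening record.
# Field names are hypothetical, not from the authors' screening export.

from dataclasses import dataclass

@dataclass
class ScreeningRecord:
    peer_reviewed_journal: bool     # inclusion criteria 1 and 3
    primary_source: bool            # inclusion criterion 2
    in_english: bool                # inclusion criterion 4
    uses_vr: bool                   # inclusion criterion 5
    targets_mental_wellbeing: bool  # inclusion criterion 6
    reports_outcomes: bool          # inclusion criterion 7
    full_text_available: bool       # exclusion criterion 5

def include(record: ScreeningRecord) -> bool:
    """True only if every inclusion criterion holds and no exclusion applies."""
    return all([
        record.peer_reviewed_journal,
        record.primary_source,
        record.in_english,
        record.uses_vr,
        record.targets_mental_wellbeing,
        record.reports_outcomes,
        record.full_text_available,
    ])

example = ScreeningRecord(True, True, True, True, False, True, True)
print(include(example))  # False: the intervention does not target mental well-being
```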

Screening Procedures

Screening was facilitated using the web-based systematic review management software tool Covidence (Veritas Health Innovation, n.d.). A total of n = 853 articles were retrieved initially. After duplicates were removed (n = 208), the remaining articles totaled n = 645. A title and abstract review was then performed, and the inclusion and exclusion criteria were applied, resulting in the removal of n = 631 articles from the corpus. This left a total of n = 14 articles for full-text analysis. During that analysis, one article was removed because it was not a full manuscript but only a study proposal, and a second article was removed because it only reported a research protocol (Fig. 11.1). The final corpus of included articles, therefore, was n = 12. These articles were read in full by two graduate student researchers, and data were extracted for summary and synthesis, including: authors, year, disciplines,


publication type and journal name, research purpose, technology type(s), research methodology, and a summary of key findings, as discussed below.
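The screening arithmetic reported in this section can be summarized as a simple running count, as in the sketch below (a minimal illustration using the counts described above, not part of the original protocol).

```python
# Illustrative sketch: reproducing the PRISMA screening counts reported in this section.

identified = 853                          # records retrieved from the five databases
after_dedup = identified - 208            # duplicates removed
after_title_abstract = after_dedup - 631  # excluded at title/abstract screening
final_corpus = after_title_abstract - 2   # study proposal and protocol-only article removed

assert after_dedup == 645
assert after_title_abstract == 14
assert final_corpus == 12
print(identified, after_dedup, after_title_abstract, final_corpus)
```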

Results

Included articles were published between 2013 and 2021. Journals in which these articles were published included the Journal of Clinical Medicine, European Journal of Oncology Nursing, Behaviour Research and Therapy, Rehabilitation Psychology, Arts in Psychotherapy, Virtual Reality, Frontiers in Psychiatry, and International Journal of Environmental Research and Public Health, among others. Target audiences for the included articles comprised veterans with post-traumatic stress disorder, senior citizens or older adults with depression and anxiety, adult patients suffering from social anxiety disorder, adolescent oncology patients with negative physical and emotional mood states and their caregivers, non-clinical university students with stress-related issues, women with disabilities who were struggling with self-esteem issues, adult patients with metastatic cancer, people with spider phobia, and people who had experienced at least two months of strict social distancing measures due to COVID-19. In the summary provided in Table 11.2, we report our findings regarding the research question of how VR technology has been used to deliver interventions that support mental well-being.

Discussion and Significance

Our screening process unveiled general trends in the literature on mental well-being and VR. Many of the articles ultimately ended up being excluded because, although the outcomes of many of these articles were in support of mental well-being, they utilized cognitive behavioral therapy as an intervention rather than a VR intervention (Ahmad et al., 2020). Studies that mentioned web-based or computer-assisted psychological therapy approaches as an intervention for mental well-being (Parr et al., 2020) were also excluded, as were studies that utilized VR interventions but targeted subjects with mental well-being issues that were outside the scope of this study, such as people who had schizophrenia (Jo et al., 2018), autism (Bozgeyikli et al., 2017), physical pain (Georgescu et al., 2020), body image issues (Kruzan & Won, 2019), obsessive-compulsive disorder or substance disorder (Kim et al., 2009; Lebiecka et al., 2021), dementia (Oliveira et al., 2021), and brain injury (Stasolla et al., 2021). Ultimately, the number of articles in which virtual reality technology was utilized for improving or promoting mental well-being proved to be relatively limited. For this study, distinguishing mental well-being from physical well-being was essential since many retrieved articles reporting on well-being were focused on physical well-being.


Table 11.2  Summary of extracted data (total of 12 articles) from full-text review data analysis

Nosek et al. (2016)
Research purpose: To examine the feasibility of an online self-esteem enhancement group program for women with disabilities.
VR technology for intervention: A 3-D, immersive, virtual environment.
Research methodology: A 7-session interactive group intervention in the 3-D, immersive, virtual environment of SecondLife. Criteria for determining feasibility were (a) enrollment, (b) engagement, (c) acceptability, and (d) improvement on measures of self-esteem, depression, self-efficacy, and social support.
Summary of findings: An intervention to enhance self-esteem may have a corollary benefit on depressive symptomatology.

van Gelderen et al. (2020)
Research purpose: 3MDR for veterans with treatment-resistant PTSD.
VR technology for intervention: A novel virtual reality and motion-assisted exposure therapy, called 3MDR (Multi-modular Motion-Assisted Memory Desensitization and Reconsolidation), provides treatment in an immersive, personalized, and activating context.
Research methodology: In a randomized controlled trial, the 3MDR was compared to a non-specific treatment component control group.
Summary of findings: Data shows emerging evidence for 3MDR and its potential to progress PTSD treatment for veterans.

Szczepańska-Gieracha et al. (2021)
Research purpose: To evaluate the effectiveness of virtual therapy in the elderly for whom the previous multimodal, biopsychosocial therapeutic program had not brought the expected results.
VR technology for intervention: The VRTierOne (Stolgraf®) device was used as the virtual reality source.
Research methodology: Twenty-five elderly women with depressive symptoms were randomly divided into a virtual reality group and a control group. The therapeutic cycle consisted of eight virtual therapy sessions, twice a week for 4 weeks.
Summary of findings: The virtual reality therapy significantly lowered the intensity of depressive symptoms, as well as stress and anxiety levels, in older women taking part in the group-based multimodal therapeutic program.

Tennant et al. (2020)
Research purpose: To investigate whether immersive virtual reality (VR) has a greater positive influence on oncology patients' physical and emotional mood states.
VR technology for intervention: Immersive VR experiences were provided using a smartphone (Galaxy S7®; Samsung) and VR headset (Samsung Gear VR® first-generation mobile head-mounted display).
Research methodology: Using a randomized controlled study design, patients were allocated to VR (three content groups) or an iPad control condition. Pre- and post-intervention self-report state measures were collected using visual analog scales and an objective measure of physiological arousal (pulse rate). Post-intervention, patients reported levels of immersion, enjoyment, and simulator sickness.
Summary of findings: Patients benefited from both the VR and novel iPad interventions, with no statistically significant differences found between conditions on child outcomes. However, patients accessing immersive VR consistently reported greater positive shifts in mood state and reductions in negative symptoms.

Brito et al. (2022)
Research purpose: To determine the immersive virtual reality-based sensorimotor rehabilitation (IVR-SRB) effect on mental health (global mental health, depression, anxiety, and well-being) in older adults.
VR technology for intervention: An immersive virtual reality-based sensorimotor rehabilitation (IVR-SRB).
Research methodology: Four types of immersive virtual environment scenarios were implemented, and only the experimental group participated, for a period of 6 weeks, three times a week, with 25 min per session (2 weeks less than other programs due to the increased stimulus timing).
Summary of findings: A positive net effect of the IVR-SRB was found in the reduction of symptoms of global mental health and depression. The anxiety scores showed moderation at the beginning, meaning that the greater the anxiety symptoms presented, the greater the effect of the IVR-SRB in symptom reduction.

Modrego-Alarcon et al. (2021)
Research purpose: To evaluate the efficacy of a mindfulness-based program (MBP) for reducing stress in university students and its action mechanisms, and to explore the capacity of virtual reality (VR) exposure to enhance adherence to the intervention.
VR technology for intervention: The VR kit comprised a set of Samsung GEAR VR goggles, a Samsung Galaxy S6 phone, and optional headphones from Amelia virtual care.
Research methodology: A randomized controlled trial with post-treatment and 6-month follow-up assessments. A total of 280 students from two Spanish universities were randomly assigned to three study conditions: 'MBP', 'MBP with VR', and 'Relaxation' (active control).
Summary of findings: Both MBP and MBP with VR support were superior to the relaxation intervention in terms of improving stress.

Baños et al. (2013)
Research purpose: To investigate the feasibility and possible benefits of a psychological intervention that uses virtual reality to induce positive emotions in adult hospitalized patients with metastatic cancer.
VR technology for intervention: Virtual reality environments were shown on a 32-in. LCD television connected to a computer. A keyboard and mouse were used as interaction devices, and participants used headphones.
Research methodology: The intervention consisted of four 30-min sessions during 1 week in which patients navigated through virtual environments designed to induce joy or relaxation. Mood and patient satisfaction were assessed, along with open-ended questions as qualitative data.
Summary of findings: There were adequate levels of pleasantness and perceived utility of the intervention. The main perceived benefits were distraction, entertainment, and promotion of relaxation states. Regarding mood changes, an increase in positive emotions and a decrease in negative emotions were also detected.

Richesin et al. (2021)
Research purpose: To compare 2-dimensional and 3-dimensional art-making VR interventions on measures of stress, anxiety, and mood.
VR technology for intervention: The VR simulation was run on an HTC Vive (Vive and Valve Corporation) headset connected to a Gigabyte Aero 15 (Gigabyte Technology) laptop computer. The 3D art-making group entered the Google Tilt Brush (Google) application, a popular VR program used for art and design; this simulation was run on an Oculus Quest (Oculus) headset.
Research methodology: Both physiological (heart rate, skin conductance, and alpha amylase) and self-report measures (stress, anxiety, and mood) were recorded before and after the interventions.
Summary of findings: Participants demonstrated a similar ability to reduce anxiety and enhance mood across conditions. More specifically, the virtual reality art-making group displayed the greatest decrease in heart rate.

Chan et al. (2021)
Research purpose: To examine the effects of virtual nature on affect and stress.
VR technology for intervention: HTC's VIVE Pro head-mounted device with full auditory and visual immersion. The authors developed nature and urban VR environments using the Unity platform (version 2017.2.0f3).
Research methodology: There were two studies in total, with young adults and senior citizens. Both participant groups experienced the nature and urban environments with one week in between, and the order was counter-balanced. During each session, participants completed a pre-test survey, a VR task, and a post-test survey.
Summary of findings: Walking in a virtual forest reduced negative affect due to enhanced nature connectedness and reduced stress as measured by heart rate. It also improved positive affect due to enhanced nature connectedness after the virtual nature walk.

Lloyd and Haraldsdottir (2021)
Research purpose: To trial the VR technology and consider what benefits may emerge for hospice inpatients.
VR technology for intervention: A virtual world, using 'room-scale' VR technology that tracks the user's movement, and headsets.
Research methodology: VR sessions were observed by a researcher and followed by qualitative semi-structured interviews to discuss the experience. Interviews were audio recorded, transcribed, and thematically analyzed.
Summary of findings: The VR sessions were acceptable for people within the hospice environment. The majority of participants enjoyed the experience, and many expressed joy and delight at the process. VR holds possibilities for relieving symptoms such as pain and anxiety frequently experienced by people in hospices.

Lindner et al. (2021)
Research purpose: To test the hypothesis that, after being exposed to VR exposure therapy in the treatment of anxiety disorders and fear reduction, there can be continued improvement after discontinuing VR use.
VR technology for intervention: VRET (virtual reality exposure therapy) with gamification components; VR scenarios.
Research methodology: Used data from a recent trial on automated VR exposure therapy for spider phobia, in which participants were followed for 1 year, completing assessments at 1 week, 3 months, and 12 months post-treatment. The assessments included a validated self-report of phobia symptoms, a standardized behavioral approach test featuring a real spider, and a questionnaire for self-reporting the frequency of in-vivo exposures since the last assessment.
Summary of findings: VR exposure interventions may benefit from including explicit in-virtuo to in-vivo transitioning components.

Riva et al. (2021)
Research purpose: To investigate the effectiveness of a novel self-administered, at-home, daily virtual reality (VR) based intervention (COVID Feel Good) for reducing the psychological burden experienced during the COVID-19 lockdown.
VR technology for intervention: A 360-degree virtual reality scenario-based environment. It can be displayed in an immersive way (i.e., through a head-mounted display or low-cost cardboard connected with a smartphone) or in a non-immersive way (for instance, YouTube supports 360° video formats both in its Android app and on its website).
Research methodology: Participants took part in the intervention between June and July 2020 for one week. Primary outcome measures were depression, anxiety, stress symptoms, perceived stress levels, and hopelessness. Secondary outcomes were the experienced social connectedness and the level of fear experienced during the pandemic.
Summary of findings: The study shows that the intervention was associated with good clinical outcomes, low to no risks for the treatment, and no adverse effects or risks.

Defining mental well-being proved challenging, as the term is complex to articulate and can be confused with mental disorder symptoms or personality disorders, which are heavily researched in the psychiatric and psychological sciences. For this study, the focus of mental well-being VR interventions included people with anxiety, depression, stress, and phobia. However, further work may be needed to better articulate what terms and concepts encompass mental well-being from the perspective of VR interventions. As a result of this lack of semantic clarity, the search terms for this study captured studies focusing on social wellness, emotional wellness, mental health, psychological wellbeing, psychological well-being, and quality of life; therefore, articles that indirectly measured mental health in their studies may have been included. For example, one study seemed to indirectly measure mental health because self-esteem is one of the contributing factors to mental health disorders. However, as this study also showed statistically significant improvement in depressive symptomatology using VR technology, it was ultimately included in the full-text analysis. Although analysis suggests that findings from the included studies seem promising, there is little conformity among studies, leading to a tapestry of research findings that are somewhat piecemeal. This could potentially frustrate efforts to connect future studies with the methods and findings from the extant literature and present challenges in terms of generalization of findings. This is exacerbated by the great variety in methodological quality between research studies. For example, one study


did not mention research limitations, some studies used very small sample sizes, and still others were vague in their reporting. Further confounding efforts to synthesize research in this area, different authors have vastly differing perceptions of what constitutes VR, as others have also found (Schmidt & Glaser, 2021). Some consider using desktop monitors to be VR, others use smartphones, and still others use fully immersive head-mounted displays. Better articulation of what constitutes VR in the field of mental well-being is needed. In the future, authors are encouraged to define more explicitly what is meant by VR in their studies and to take a more nuanced approach that considers specific technological affordances and how those affordances might promote or detract from intervention efficacy.

Limitations

Our research should be considered in light of the following limitations. First, only five databases were searched. Although these databases were selected because their indices seemed most germane to our research question, including other databases could potentially have returned additional included articles. Second, our selection criteria were intentionally narrow and focused only on peer-reviewed journal publications. Including conference proceedings, dissertations, or so-called "gray" literature could have increased the diversity of studies retrieved for this study. Third, although the analysis of the literature suggested that the majority of research in this area has been performed within the 2013–2021 range, there could have been studies prior to 2013 that met the inclusion criteria; therefore, some articles may have been missed in the search process. Lastly, only publications written in English were considered. The exclusion of non-English publications is known as the "Tower of Babel bias" (Jackson & Kuriyama, 2019), and this bias could have influenced the generalizability of our findings.

Conclusion This systematic and comprehensive literature review focuses on how VR has been used to deliver interventions that support mental well-being. To date, reporting on findings from VR technology interventions that focus on mental health topics is limited. Interventions that involve advanced learning technologies may need several technological and design iterations and refinements in a process that can be described as formative learning design. This review, which focuses on VR interventions for mental wellbeing, may contribute to the body of knowledge of formative learning design for VR designers and educators. The findings of this study on the design of virtual reality interventions for mental well-being will inform researchers in the


field of educational technology and will allow them to better understand the relationship between the current state of VR technology, psychoeducation, and general education, and to establish productive, relevant directions forward.

References Ahmad, F., El Morr, C., Ritvo, P., Othman, N., Moineddin, R., & MVC Team. (2020). An eight-­ week, web-based mindfulness virtual community intervention for students’ mental health: Randomized controlled trial. JMIR Mental Health, 7(2), e15520. https://doi.org/10.2196/15520 Bamodu, O., & Ye, X. M. (2013). Virtual reality and virtual reality system components. Advanced Materials Research, 765-767, 1169–1172. https://doi.org/10.4028/www.scientific.net/ amr.76-­767.1169 Baños, R. M., Espinoza, M., García-Palacios, A., Cervera, J. M., Esquerdo, G., Barrajón, E., & Botella, C. (2013). A positive psychological intervention using virtual reality for patients with advanced cancer in a hospital setting: A pilot study to assess feasibility. Supportive Care in Cancer, 21(1), 263–270. Barrett, M., Blackledge, J., & Coyle, E. (2011). Using virtual reality to enhance electrical safety and design in the built environment. ISAST Transactions on Computers and Intelligent Systems, 3(1), 1–9. https://doi.org/10.21427/D7790G Bozgeyikli, L., Bozgeyikli, E., Raij, A., Alqasemi, R., Katkoori, S., & Dubey, R. (2017). Vocational rehabilitation of individuals with autism spectrum disorder with virtual reality. ACM Transactions on Accessible Computing (TACCESS), 10(2), 1–25. https://doi. org/10.1145/3046786 Brito, H., Pham, T., & Vicente, B. (2022). Effect of sensorimotor rehabilitation based on an immersive virtual reality model on mental health. International Journal of Geriatric Psychiatry, 37(1), 1–9. Burdea, G. C., & Coiffet, P. (2003). Virtual reality technology. Wiley. Chan, S. H. M., Qiu, L., Esposito, G., Mai, K. P., Tam, K. P., & Cui, J. (2021). Nature in virtual reality improves mood and reduces stress: Evidence from young adults and senior citizens. Virtual Reality, 1–16. Clemmensen, L., Bouchard, S., Rasmussen, J., Holmberg, T. T., Nielsen, J. H., Jepsen, J. R. M., & Lichtenstein, M.  B. (2020). Study protocol: Exposure in virtual reality for social anxiety disorder  - a randomized controlled superiority trial comparing cognitive behavioral therapy with virtual reality based exposure to cognitive behavioral therapy with in vivo exposure. BMC Psychiatry, 20(1), 1–9. https://doi.org/10.1186/s12888-­020-­2453-­4 Craig, A.  B., Sherman, W.  R., & Will, J.  D. (2009). Developing virtual reality applications: Foundations of effective design. Morgan Kaufmann. Cummings, J. J., & Bailenson, J. N. (2016). How immersive is enough? A meta-analysis of the effect of immersive technology on user presence. Media Psychology, 19(2), 272–309. https:// doi.org/10.1080/15213269.2015.1015740 Desai, P. R., Desai, P. N., Ajmera, K. D., & Mehta, K. (2014). A review paper on oculus rift-a virtual reality headset. arXiv preprint arXiv:1408.1173. https://doi.org/10.48550/arXiv.14077.1173 Di Serio, Á., Ibáñez, M. B., & Kloos, C. D. (2013). Impact of an augmented reality system on students’ motivation for a visual art course. Computers & Education, 68, 586–596. https://doi. org/10.1016/j.compedu.2012.03.002 Frewen, P., Mistry, D., Zhu, J., Kielt, T., Wekerle, C., Lanius, R.  A., & Jetly, R. (2020). Proof of concept of an eclectic, integrative therapeutic approach to mental health and well-being through virtual reality technology. Frontiers in Psychology, 11, 858. https://doi.org/10.3389/ fpsyg.2020.00858


Georgescu, R., Fodor, L.  A., Dobrean, A., & Cristea, I.  A. (2020). Psychological interventions using virtual reality for pain associated with medical procedures: A systematic review and meta-analysis. Psychological Medicine, 50(11), 1795–1807. https://doi.org/10.1017/ S0033291719001855 Hatta, M. H., Sidi, H., Sharip, S., Das, S., & Saini, S. M. (2022a). The role of virtual reality as a psychological intervention for mental health disturbances during the COVID-19 pandemic: A narrative review. International Journal of Environmental Research and Public Health, 19(4), 2390. https://doi.org/10.3390/ijerph19042390 Hatta, M. H., Sidi, H., Siew Koon, C., Che Roos, N. A., Sharip, S., Abdul Samad, F. D., et al. (2022b). Virtual reality (VR) technology for treatment of mental health problems during COVID-19: A systematic review. International Journal of Environmental Research and Public Health, 19(9), 5389. https://doi.org/10.3390/ijerph19095389 Isdale, J. (1998). What is virtual reality? A web-based introduction. http://isdale.com/jerry/VR/ WhatIsVR/noframes/WhatIsVR4.1.html Jackson, J.  L., & Kuriyama, A. (2019). How often do systematic reviews exclude articles not published in English? Journal of General Internal Medicine, 34(8), 1388–1389. https://doi. org/10.1007/s11606-­019-­04976-­x Jo, G., Rossow-Kimball, B., Park, G., & Lee, Y. (2018). Effects of virtual reality exercise for Korean adults with schizophrenia in a closed ward. Journal of Exercise Rehabilitation, 14(1), 39. https://doi.org/10.12965/jer.1835168.584 Kalron, A., Achiron, A., Pau, M., & Cocco, E. (2020). The effect of a telerehabilitation virtual reality intervention on functional upper limb activities in people with multiple sclerosis: A study protocol for the TEAMS pilot randomized controlled trial. Trials, 21, 713. https://doi. org/10.1186/s13063-­020-­04650-­2 Kelson, J.  N., Ridout, B., Steinbeck, K., & Campbell, A.  J. (2021). The use of virtual reality for managing psychological distress in adolescents: Systematic review. Cyberpsychology, Behavior and Social Networking, 24(10), 633–641. https://doi.org/10.1089/cyber.2021.0090 Kim, K., Kim, C.  H., Kim, S.  Y., Roh, D., & Kim, S.  I. (2009). Virtual reality for obsessive-­ compulsive disorder: Past and the future. Psychiatry Investigation, 6(3), 115. https://doi. org/10.4306/pi.2009.6.3.115 Kruzan, K.  P., & Won, A.  S. (2019). Embodied well-being through two media technologies: Virtual reality and social media. New Media & Society, 21(8), 1734–1749. https://doi. org/10.1177/1461444819829873 Kutz, M. (Ed.). (1998). Mechanical engineers’ handbook (2nd ed.). Wiley. Lebiecka, Z., Skoneczny, T., Tyburski, E., Samochowiec, J., & Kucharska-Mazur, J. (2021). Is virtual reality cue exposure a promising adjunctive treatment for alcohol use disorder? Journal of Clinical Medicine, 10(13), 2972. https://doi.org/10.3390/jcm10132972 Lee, H. G., Chung, S., & Lee, W. H. (2013). Presence in virtual golf simulators: The effects of presence on perceived enjoyment, perceived value, and behavioral intention. New Media & Society, 15(6), 930–946. https://doi.org/10.1177/1461444812464033 Lee, S. H., Lee, J. Y., Kim, M. Y., Jeon, Y. J., Kim, S., & Shin, J. H. (2018). Virtual reality rehabilitation with functional electrical stimulation improves upper extremity function in patients with chronic stroke: A pilot randomized controlled study. Archives of Physical Medicine and Rehabilitation, 99(8), 1447–1453. 
https://doi.org/10.1016/j.apmr.2018.01.030 Lindner, P., Dafgård, P., Miloff, A., Andersson, G., Reuterskiöld, L., Hamilton, W., & Carlbring, P. (2021). Is continued improvement after automated virtual reality exposure therapy for spider phobia explained by subsequent in-vivo exposure? A first test of the lowered threshold hypothesis. Frontiers in Psychiatry, 12, 654. Lloyd, A., & Haraldsdottir, E. (2021). Virtual reality in hospice: Improved patient well-being. BMJ Supportive & Palliative Care, 11(3), 344–350. Lu, Y., Ge, Y., Chen, W., Xing, W., Wei, L., Zhang, C., & Yang, Y. (2022). The effectiveness of virtual reality for rehabilitation of Parkinson disease: An overview of systematic reviews with meta-analyses. Systematic Reviews, 11(50), 1–14. https://doi.org/10.1186/s13643-­022-­01924-­5


Mazzini, N. A., Almeida, M. G. R., Pompeu, J. E., Polese, J. C., & Torriani-Pasin, C. (2019). A combination of multimodal physical exercises in real and virtual environments for individuals after chronic stroke: Study protocol for a randomized controlled trial. Trials, 20(1), 436. https:// doi.org/10.1186/s13063-­019-­3396-­2 Mihelj, M., Novak, D., & Beguš, S. (2014). Virtual reality technology and applications. Springer. https://doi.org/10.1007/978-­94-­007-­6910-­6 Modrego-Alarcon, M., Lopez-del-Hoyo, Y., Garcia-Campayo, J., Perez-Aranda, A., Navarro-Gil, M., Beltran-Ruiz, M., & Montero-Marin, J. (2021). Efficacy of a mindfulness-based program with and without virtual reality support to reduce stress in university students: A randomized controlled trial. Behavior Research and Therapy, 142, 103866. Moher, D., Liberati, A., Tetzlaff, J., & Altman, D. G. (2009). Preferred reporting items for systematic reviews and meta-analyses: The PRISMA statement. BMJ. https://doi.org/10.1136/ bmj.b2535 Nosek, M. A., Robinson-Whelen, S., Hughes, R. B., & Nosek, T. M. (2016). An internet-based virtual reality intervention for enhancing self-esteem in women with disabilities: Results of a feasibility study. Rehabilitation Psychology, 61(4), 358. Oliveira, J., Gamito, P., Souto, T., Conde, R., Ferreira, M., Corotnean, T., Fernandes, A., Silva, H., & Neto, T. (2021). Virtual reality-based cognitive stimulation on people with mild to moderate dementia due to Alzheimer’s disease: A pilot randomized controlled trial. International Journal of Environmental Research and Public Health, 18(10), 5290. https://doi.org/10.3390/ ijerph18105290 Park, M., Ko, M.-H., Oh, S.-W., Lee, J.-Y., Ham, Y., Yi, H., Choi, Y., Ha, D., & Shin, J.-H. (2019). Effects of virtual reality-based planar motion exercises on upper extremity function, range of motion, and health-related quality of life: A multicenter, single-blinded, randomized, controlled pilot study. Journal of Neuroengineering and Rehabilitation, 16(1), 1–13. https://doi. org/10.1186/s12984-­019-­0595-­8 Parr, J. R., Brice, S., Welsh, P., Ingham, B., Le Couteur, A., Evans, G., Monaco, A., Freeston, M., & Rodgers, J. (2020). Treating anxiety in autistic adults: Study protocol for the Personalised Anxiety Treatment–Autism (PAT-A©) pilot randomized controlled feasibility trial. Trials, 21(1), 1–14. https://doi.org/10.1186/s13063-­020-­4161-­2 Pears, M., Yiasemidou, M., Ismail, M. A., Veneziano, D., & Biyani, C. S. (2020). Role of immersive technologies in healthcare education during the COVID-19 epidemic. Scottish Medical Journal, 65(4), 112–119. https://doi.org/10.1177/0036933020956317 Pribeanu, C., Balog, A., & Iordache, D. D. (2017). Measuring the perceived quality of an AR-based learning application: A multidimensional model. Interactive Learning Environments, 25(4), 482–495. https://doi.org/10.1080/10494820.2016.1143375 Richesin, M. T., Baldwin, D. R., & Wicks, L. A. (2021). Art making and virtual reality: A comparison study of physiological and psychological outcomes. The Arts in Psychotherapy, 75, 101823. Riva, G., Bernardelli, L., Browning, M.  H., Castelnuovo, G., Cavedoni, S., Chirico, A., et  al. (2020). COVID feel good—an easy self-help virtual reality protocol to overcome the psychological burden of coronavirus. Frontiers in Psychiatry, 11, 563319. https://doi.org/10.3389/ fpsyt.2020.563319 Riva, G., Bernardelli, L., Castelnuovo, G., Di Lernia, D., Tuena, C., Clementi, A., et al. (2021). 
A virtual reality-based self-help intervention for dealing with the psychological distress associated with the COVID-19 lockdown: An effectiveness study with a two-week follow-up. International Journal of Environmental Research and Public Health, 18(15), 8188. Rizzo, A., & Wiederhold, B. K. (2006, March). Virtual reality technology for psychological/neuropsychological/motor assessment and rehabilitation: Applications and issues. In IEEE virtual reality conference (VR 2006) (pp. 308–308). IEEE Computer Society. https://doi.org/10.1109/ VR.2006.144 Rose, T., Joe, S., Williams, A., Harris, R., Betz, G., & Stewart-Brown, S. (2017). Measuring mental wellbeing among adolescents: A systematic review of instruments. Journal of Child and Family Studies, 26(9), 2349–2362. https://doi.org/10.1007/s10826-­017-­0754-­0


Sadowski, W., & Stanney, K. (2002). Presence in virtual environments. In K. M. Stanney (Ed.), Handbook of virtual environments (pp. 831–846). CRC Press. Schmidt, M. M. (2014). Designing for learning in a three-dimensional virtual learning environment: A design-based research approach. Journal of Special Education Technology, 29(4), 59–71. https://doi.org/10.1177/016264341402900405 Schmidt, M., & Glaser, N. (2021). Investigating the usability and learner experience of a virtual reality adaptive skills intervention for adults with autism spectrum disorder. Educational Technology Research and Development, 69(3), 1665–1699. https://doi.org/10.1007/ s11423-­021-­10005-­8 Seligman, M. E. (2012). Flourish: A visionary new understanding of happiness and well-being. Simon and Schuster. Siani, A., & Marley, S.  A. (2021). Impact of the recreational use of virtual reality on physical and mental wellbeing during the Covid-19 lockdown. Health and Technology, 11(2), 425–435. https://doi.org/10.1007/s12553-­021-­00528-­8 Slater, M. (2009). Place illusion and plausibility can lead to realistic behaviour in immersive virtual environments. Philosophical Transactions of the Royal Society B: Biological Sciences, 364(1535), 3549–3557. https://doi.org/10.1098/rstb.2009.0138 Soliman, M., Peetz, J., & Davydenko, M. (2017). The impact of immersive technology on nature relatedness and pro-environmental behavior. Journal of Media Psychology, 29(1), 8–17. https:// doi.org/10.1027/1864-­1105/a000213 Stasolla, F., Matamala-Gomez, M., Bernini, S., Caffò, A. O., & Bottiroli, S. (2021). Virtual reality as a technological-aided solution to support communication in persons with neurodegenerative diseases and acquired brain injury during COVID-19 pandemic. Frontiers in Public Health, 8, 635426. https://doi.org/10.3389/fpubh.2020.635426 Szczepańska-Gieracha, J., Cieślik, B., Serweta, A., & Klajs, K. (2021). Virtual therapeutic garden: A promising method supporting the treatment of depressive symptoms in late-life: A randomized pilot study. Journal of Clinical Medicine, 10(9), 1942. Tennant, M., Youssef, G. J., McGillivray, J., Clark, T. J., McMillan, L., & McCarthy, M. C. (2020). Exploring the use of immersive virtual reality to enhance psychological well-being in pediatric oncology: A pilot randomized controlled trial. European Journal of Oncology Nursing, 48, 101804. Valmaggia, L.  R., Latif, L., Kempton, M.  J., & Rus-Calafell, M. (2016). Virtual reality in the psychological treatment for mental health problems: A systematic review of recent evidence. Psychiatry Research, 236, 189–195. https://doi.org/10.1016/j.psychres.2016.01.015 van Gelderen, M. J., Nijdam, M. J., Haagen, J. F., & Vermetten, E. (2020). Interactive motion-­ assisted exposure therapy for veterans with treatment-resistant posttraumatic stress disorder: A randomized controlled trial. Psychotherapy and Psychosomatics, 89(4), 215–227. Veritas Health Innovation. (n.d.). Covidence systematic review software [Computer software]. https://www.covidence.org Yen, H. Y., & Chiu, H. L. (2021). Virtual reality exergames for improving older adults’ cognition and depression: A systematic review and meta-analysis of randomized control trials. Journal of the American Medical Directors Association, 22(5), 995–1002. https://doi.org/10.1016/j. jamda.2021.03.009 Zeng, W., & Richardson, A. (2016). Adding dimension to content: Immersive virtual reality for e-Commerce. ACIS 2016 Proceedings. https://aisel.aisnet.org/acis2016/24

Chapter 12

Layering Views of Experience to Inform Design Signe E. Kastberg, Amber Simpson, and Caro Williams-Pierce

S. E. Kastberg (*)
Purdue University, West Lafayette, IN, USA
e-mail: [email protected]

A. Simpson
Binghamton University, Binghamton, NY, USA
e-mail: [email protected]

C. Williams-Pierce
University of Maryland, College Park, MD, USA
e-mail: [email protected]

Abstract  The use of video data in mathematics education has allowed mathematics educators to conduct in-depth analyses of interactive communication. New camera technology such as GoPro cameras affords researchers multiple views on a single event, not possible from a stand-alone camera. In this paper, we argue that the use of multiple GoPro cameras and analysis of individual views allowed us as researchers to "see" more of an event with each new "layer" of video data. The resulting layered view illustrates the complexity of interpreting student experience with designed objects and contexts, and provides a new perspective on iterating design. We provide examples of how the use of multiple GoPro cameras within a group of fifth grade students afforded us as a research team the opportunity to layer views to form a possibility space for student experience of one event. We contend that the significance of this approach lies in interpreting students' experiences in makerspaces, as well as consequences of classroom and technological design features hidden within a single camera view.

Keywords  GoPro camera · Makerspace · Perspective

The use of video data is widespread (e.g., Hamel & Viau-Guay, 2019; Nassauer & Legewie, 2021), and camera technology such as a GoPro is becoming more prevalent (Burbank et al., 2018; Simpson et al., 2020a; Simpson & Feyerabend, 2022). Video data from GoPro cameras provides evidence of a participant's potential view (perspective) of an event in contrast with a stand-alone camera that provides an

© The Author(s), under exclusive license to Springer Nature Switzerland AG 2023 B. Hokanson et al. (eds.), Formative Design in Learning, Educational Communications and Technology: Issues and Innovations, https://doi.org/10.1007/978-3-031-41950-8_12

157

158

S. E. Kastberg et al.

outsider-looking-in view of the same event. Participant perspectives enable a more subjective, humanistic, collaborative, and participatory approach (Harwood & Collier, 2019; Lahlou, 2011; Pink, 2015) to data analysis. We argue that using multiple GoPro cameras and layering our researcher interpretations of collaborative group member views in a makerspace resulted in a possibility space for the participants’ experiences of an event (i.e., perspective taking). This possibility space allowed us to “see” consequences of makerspace design features. We conceptualize perspective taking as viewing a situation or event from another’s point of view (Gasiorek & Ebesu Hubbard, 2017). We utilized third-space as a framework to guide this method of collecting and analyzing data (Gutiérrez et al., 1999). Specifically, we conceptualized the third-space as a possibility space that allowed us to frame participants’ views (first-space) and researchers’ perspectives (second-space) as complements rather than in opposition to one another (Pink, 2015) and reduce “othering” in research with children (Lahman, 2008). In this third-space we as researchers attempted to enter the particulars of an event (e.g., human movement) through the perspectives of participants wearing GoPro cameras (Hulme et al., 2009; Simpson & Feyerabend, 2022). The third-space informed hypotheses about instructional activity design features, including assumptions about tools and interactions in school-situated makerspaces.

Formative Design

Formative design of instructional contexts and activities rests on gathering information about users’ experiences within learning spaces and activities, with the goal of guiding “improvements in the ongoing teaching and learning context” (Kenny, 2017, p. 2). In this paper the instructional activity designer was a teacher, who created a collection of activities for students to engage in making. Video of the instructional activities provided views of students’ experience. Layers of interpretations of these views led to questions about the activity design and assumptions about tools (e.g., iPad) and collaboration. In this approach to formative design, hypothesizing about students’ experience from video provides a starting point for problem identification and possibilities for redesign. Video analysis allows us to identify design challenges and informs new iterations of instructional design.

Use of Video in Generating Truth Spaces for Formative Design

Designing makerspace instructional activities regularly benefits from a wide variety of formative approaches, from using user-centered design practices (e.g., Steele et al., 2018) to action research (e.g., Carucci & Toyama, 2019) to design-based research (e.g., Becker & Jacobsen, 2021). Yet an alternative pathway to design possibilities is developed from video evidence of users’ experiences. Video draws
attention to consequences of makerspace design features and highlights design limitations such as assumptions about tools and collaborative processes. For example, video evidence of student makerspace activity revealed limited use of rulers, highlighting designers’ assumption that rulers are an important measuring tool in makerspaces (Simpson & Kastberg, 2022). While student experience in makerspaces is not directly accessible, video gives researchers/designers more ways to hypothesize about experience. Designers can then use those hypothesized interpretations of teachers and learners’ experiences to inform formative redesign of instructional activities. A key philosophical underpinning of our research, however, is that video evidence cannot perfectly capture the unique lived experiences of students as they interact with the designed space and their peers. Additional ambiguity in understanding student experiences results from stand-alone camera video recordings that capture outsider-looking-in views of events. Our methodological approach hypothesizes a possibility space of student experience by interpreting and coordinating multiple student views as captured through their GoPro cameras. By tracing each view through the activity across time, we can better interpret a student’s experience. As a result, in this paper, we use the term “truth” as a subjective experiential term that both respects the unique lived experience of our participants, and acknowledges that their experiences are filtered through our own unique lived experiences. In particular, we do not consider a “truth” to exclude other possible simultaneous “truths” – instead, we seek to identify possible truth spaces that inform formative design decisions without assuming students will approach the activity in one way.

Experiences in School-Based Makerspaces

Makerspaces in the context of schools are liminal spaces, neither fully structured by formal school norms such as teacher oversight nor entirely non-formal spaces for open play (Simpson & Feyerabend, 2022). Teachers and students are often left to their own devices in creating normative ways of operating. For example, teachers are left to wonder how much oversight to provide to students, while students wonder when to ask the teacher for help. This level of freedom is a relief from the pressure to achieve in many schools. Yet how makerspaces are experienced by teachers and students as they engage with one another, tools, processes, and activities is less clear and may lead to opposing forces at play, such as “doing” school or being agentic learners (Kajamaa & Kumpulainen, 2019; Kumpulainen & Kajamaa, 2020). In this paper we focus on a group of fifth graders engaged in a school makerspace to provide examples of ways researchers can use video evidence of student views to create a possibility space of “truth” of one event (i.e., third-space).

Methods

In our project, observations of a makerspace were collected over three days during free periods in June 2019. The second author engaged with the students as an observer during the teacher-designed and facilitated activities. At the outset, the goal for the second author was to gain insight into mathematical play during making. This study is a secondary analysis of the video data collected on Day 2 of the observation, motivated by unexpected events in one group’s collaboration. Video data used to highlight our approach was collected from one group of fifth-grade students (two female students, Olive and Gina, and four male students, Simon, Ben, Timothy, and Harry; all pseudonyms) tasked by their teacher to first construct a masking-tape path for a robot, Dash, to traverse (Phase 1). Next, the group used the app Blockly (Pasternak et al., 2017) to program Dash to stay on a path created by another student group (Phase 2). The two phases took 20 min, with a minute between for the teacher to distribute supplies. Three students in our group volunteered to wear GoPro cameras on their chests, while we also recorded the overall group interaction using one stand-alone camera. The purpose of the data collection was to hypothesize about users’ experiences in makerspace environments. However, as we engaged in analysis we began to question our assumptions about collaboration, tools, and teacher practices in school-based makerspaces. The data was analyzed using dialogue (Guilfoyle et al., 2004) in interaction analysis (Jordan & Henderson, 1995), with the added complexity of layering perspectives using the GoPro. Collectively, we silently watched 2- to 4-min video clips, then discussed our observations citing evidence from the data. Our videotaped discussions were informed by and influenced our views, language, and understandings of the event as it unfolded temporally and with each new layer of student video. There was active accountability as we requested video support for observations and interpretive statements and interrogated one another’s assumptions. We further discussed how we moved past individual views of an event to a layered view, maintaining areas of doubt where video data was only available from a single participant.

Examples

Using two examples, we highlight how we approached the video and created a layered view maintaining ambiguity regarding possibilities for truth. We feature portions of the video analysis from Phases 1 and 2 of the making activity.

Example One From Phase One

We began by assuming Olive’s perspective – watching her GoPro video data through the phases of the activity. Olive took control of the masking tape at the beginning of Phase 1, saying “I’m good at this stuff because I do Odyssey of the Mind,” referring
to extracurricular design activities she participated in. We perceived this as a way for Olive to establish her power within the group. This perspective “colored” how we viewed and described Olive’s position throughout the first phase of the activity. In another instance, Olive remarked as Ben added a strip of tape to the end of the path, “make it jagged. That’s way too long. I’ll position it in a better way.” She pulled up the strip of tape and ripped it in half. The resulting two strips of tape were then added to the start of the path. We discussed Olive’s actions as illustrative of her assertion of dominance as to how the path should be laid. When the teacher shouted – “You have about three min.” – we observed what seemed like a chaotic bustle. From Olive’s perspective, the students hurriedly placed the tape, and there seemed to be little collaboration or order to their activity. We noted how the time limit seemed to provoke the students to individually work towards the goal of finishing laying the tape path, instead of cooperating towards that shared goal. Assuming Simon’s perspective revealed additional information that expanded the possibility space for interpretations of this phase. For instance, we noticed that Simon tried to take the tape twice from two different peers, Gina and Timothy, and was rebuffed both times. Consequently, he separated himself from the group for a short time. His physical distance provided a new perspective on activity around the masking tape. While Olive continued to monopolize moments in this phase (e.g., ignoring bids from Gina for a piece of tape), other members of the group were attending to the tape placement and discussing placement patterns. We characterized this as ‘budding collaboration’ – the group was developing their collaborative structure within the activity in a way that was not visible when watching Olive’s perspective. As one author noted, “It is like an attempt at trying to collaborate – talk about what the path should look like, everyone gets a turn. So it looks even more organized than when I first saw it.” In addition, our initial perspective through Olive’s video when the teacher introduced a time limit was that the action seemed chaotic, but Simon’s perspective provided evidence for another interpretation. The chaotic feeling of the group activity as we watched Olive’s video was due to her embeddedness in all the activity. As one author said, “Olive is constantly involved. That first view really influenced my view of the group dynamics just because she was so close the whole time.”

Example Two From Phase Two

When Phase 2 began, students were initially given only a measuring tape and a pencil to begin marking length measurements on the path created by another group. Through Olive’s GoPro video data, we discussed how Olive gained control of the tools and again stated: “I do Odyssey of the Mind.” She began measuring strips of tape on the path while peers watched and occasionally held the tape measure. After a few minutes, the teacher introduced the final two tools for Phase 2, Dash and the iPad with the Blockly app. We initially saw little of those two tools, as Olive’s view was focused on her activity of measurement. Occasionally, a student with the iPad
(possession seemed to change frequently among the boys) would ask Olive for a measurement, and she would read off the length of the path she had just measured. Yet the two activities (measuring and programming) seemed siloed. As we followed Simon’s perspective in Phase 2, we gained a detailed view of the group clustered around the iPad. As soon as Dash and the iPad were introduced, the boys clustered around the iPad – we noticed Simon grabbing for the iPad when someone else had it, and others grabbing for it when Simon had it. In addition, we saw a new view of the requests for measurement lengths from Olive: whoever was controlling the iPad would occasionally ask Olive for measurements, and as noted above, she would read off the measurement she had just completed. However, we could see more details of iterations of coding and subsequent running of Dash from Simon’s perspective. Coordinating the video from Olive’s perspective and from Simon’s led us to realize that the programmer holding the iPad often asked Olive for a different measurement than the one she gave. In fact, requests for a measurement did not include units, so sometimes the programmer would be focusing on angles, while Olive was providing centimeter lengths, since no protractor was available. The participants provided no evidence that they were aware of the misalignment in requested measures and those provided. Seeing different participant perspectives resulted in our identification of this mismatch. Sequential viewing of participant video revealed new interpretations with each new perspective, yet areas of ambiguity remained regarding the participants’ experiences (e.g., whether participants knew that they were not communicating about the lengths and angle measures to program Dash). As one author said during a meeting, “With our data, we are seeing different images and I am struck by how they make me feel. So one view makes me feel one way and another doesn’t seem that bad… that’s the beauty of having different perspectives. All the times that I have video data that does not follow different perspectives seems shallow now …we are having more faithful interactions with the data.”
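
The mismatch described above lends itself to a concrete illustration. The following is a minimal, hypothetical Python sketch of the situation, not the students’ Blockly program and not the Blockly interface; the path segments, values, and function names are invented, and it only shows how an unqualified request for “a measurement” can return a centimeter length when an angle in degrees was needed.

```python
# Hypothetical sketch of the measurement exchange; values and names are invented.
# Straight segments of the tape path are in centimeters; turns are in degrees.
path = [
    ("drive", 40.0),  # cm, a length Olive measured with the tape measure
    ("turn", 90.0),   # degrees, what the programmer actually needs next
    ("drive", 25.5),  # cm
]

# Olive can only supply lengths: no protractor was available for angles.
measured_lengths = [value for kind, value in path if kind == "drive"]

def next_measurement(index):
    """Answer a request for 'a measurement' with a length, no unit attached."""
    return measured_lengths[index]

# The programmer is coding the turn at path[1] but cannot say so without
# showing Olive the iPad, so the unqualified request returns a length instead.
needed_kind, needed_value = path[1]
received = next_measurement(0)
print(f"Needed: {needed_kind} of {needed_value} degrees; received: {received} cm.")
```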

Discussion and Conclusion

Sequential viewing of different perspectives on an activity can reveal new evidence to form a larger possibility space of “truth” of an event. For example, our realization that Olive was offering programmers measurements for parts of the path they were not asking about was made possible through seeing both Olive’s perspective and Simon’s perspective. The broad camera view on the group, while useful for tracking overall movements of the group, did not include sufficient evidence of either Olive’s or the programmer’s activity in the moment to interpret their use of tools and interactions during the event. From the broad camera view, if the programmer asked for a measurement, only the GoPro video of the programmer allowed us to see the referent for the measurement (and unit) through the programmer’s view of the iPad. This led us to identify the iPad (and consequently Blockly) as an insular tool that required substantial redesign for successful collaboration (Simpson et al., 2020b):
that is, Olive had no way of knowing which measurement the programmer was requesting without being able to see the iPad. In addition, it was only through Olive’s view of measurement in coordination with Simon’s that we were able to see misalignment between the programmer’s request for measurement information and Olive’s response. Furthermore, insights gained from interactive analysis of the GoPro video data from different perspectives suggest that the lines of human movement (Harwood & Collier, 2019) afforded by the GoPro views from a single wearer were subjective and deeply personal perspectives on the activity. Seeing Olive’s perspective on laying a path afforded us a view of her controlling the activity. Shifting to Simon’s video afforded us another view of how the team worked together in spite of Olive’s positioning for power. Layering such perspectives created intersubjective views of events such as Olive’s “control” of laying the path, and her later “marginalized” participation in programming Dash. These differing interpretations of video evidence allowed the research team to create a third-space of “truth” by framing each participant view and our researcher views as complements (Pink, 2015). Students were positioned as having perspectives of an event that were beneficial to us as outsiders to the event. Evidence gathered from student views and interaction analysis resulted in questions about the “truths” of events that have design implications. For example, layering Olive’s and Simon’s views shifted our interpretation of their experiences and opened design possibilities that could not be identified using the stand-alone camera. Through each viewing, hypotheses about the truth of events were replaced with new hypotheses, thereby highlighting the subjective nature of our interpretations of the students’ experience. We contend that collecting, analyzing, and layering multiple student perspectives using GoPro cameras enhances interpretations of students’ engagement and design thinking in makerspaces. These interpretations yield insights into consequences of design features hidden within a single camera view. Activity initially identified as chaotic came to be viewed as orderly in relation to images of efforts to control the iPad. Aligned with Lahlou (2011) and Pink (2015), video data from a collection of perspectives allowed us to maintain a student-centered view of events and view the GoPro evidence as subjective. Intersubjectivity and a story line developed as we called into question interpretations of evidence gathered from individual GoPro views and stitched together a collaborative view of events in which no one student’s experience was absolutely understood or could be separated from the others. Instead, taken as a whole, the collective became a possibility space of “truths” of temporal events and experiences for which we had multiple perspectives, such as laying the path and controlling the iPad. Events not captured by multiple videos were conditional and created pockets of subjectivity within the dominant intersubjective narrative. We assume that the participants’ experience of reality is inaccessible (see inner rectangle in Fig. 12.1). In addition, the GoPro video from each participant and the stationary camera constitute a view of lived events in the makerspace (see inner oval in Fig. 12.1). So, when we viewed the videos through our lenses (see outer rectangle
in Fig. 12.1) we created layers of researchers’ views on participants’ experiences of events (see outer oval in Fig. 12.1).

Fig. 12.1  Relationships between researcher and participant reality

Viewing the video data from different perspectives resulted in a collection of versions of events, some divergent and others convergent, as we hypothesized about the experiences and intents of the participants. Our subjective versions of events drew from our histories as cultural, gendered, racialized, and complex beings. We used information derived from our lived experiences to interpret what we saw. We described and discussed our interpretations of the students’ experiences, settling on a space of possibilities for the ‘truth’ of the events in the video captures. The possibility space we construct informs formative design by maintaining designers’ focus on intended and unintended consequences of design. Conducting this research has impacted how we think of designing makerspaces in schools and instructional activities for students, particularly regarding access to tools, outputs of tool use, and the impact on interactions. In formative design, building possibilities for truth has implications for options in makerspace design. In considering ways that the space and activity were understood differently from the perspectives of different students and from our perspectives on the activities of the different students, we wondered how the design of research in conjunction with teachers’ development as facilitators in makerspaces can support formative design processes. When teachers try to operate as if makerspaces embedded in schools were informal spaces, rather than spaces in which their roles and those of the students are at times ambiguous and at other times aligned with school norms, the confusion teachers and students face in these non-formal spaces becomes visible. So too, the perspectives provided by students in collaborative groups illustrate how uncertainty about which versions of collaboration are sanctioned can give rise to unproductive and unevaluated behaviors, as in the case of Olive and Simon’s
mismatch. To address these differences means taking seriously the cultures and values of the actors and their understandings of the expectations in this space. When is the space identified and acted upon as formal, and when as non-formal? In this paper we illustrate that interpreting consequences of design for learners is not a straightforward activity. While design itself is complex, so too is interpreting experience. Designers who plan to gain awareness of possible truths of student experiences in the makerspace open up design options that will in turn have new consequences for learners. We have shown that interpretation of design consequences from views of student experience can inform assumptions regarding design. The implication of these findings is that makerspace design requires attention to a range of possibilities, including the roles of teachers, the cultures of those interacting, access to tools, and opportunities to learn ways of interacting.

References

Becker, S., & Jacobsen, M. (2021). A year at the improv: The evolution of teacher and student identity in an elementary school makerspace. Teaching Education, 34, 1–18. https://doi.org/10.1080/10476210.2021.1978968
Burbank, B., McGregor, D., & Wild, M. (2018). ‘My special, my special thing, and my camera!’ Using GoPro™ as a complementary research tool to investigate young children’s museum experiences. Museum and Society, 16(3), 311–333.
Carucci, K., & Toyama, K. (2019). Making well-being: Exploring the role of makerspaces in long term care facilities. Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems (pp. 1–12). https://doi.org/10.1145/3290605.3300699
Gasiorek, J., & Ebesu Hubbard, A. S. (2017). Perspectives on perspective-taking in communication research. Review of Communication, 17(2), 87–105. https://doi.org/10.1080/15358593.2017.1293837
Guilfoyle, K., Hamilton, M. L., Pinnegar, S., & Placier, P. (2004). The epistemological dimensions and dynamics of professional dialogue in self-study. In J. J. Loughran, M. L. Hamilton, V. K. LaBoskey, & T. Russell (Eds.), International handbook of self-study of teaching and teacher education practices (pp. 1109–1167). Springer. https://doi.org/10.1007/978-1-4020-6545-3_28
Gutiérrez, K. D., Baquedano-López, P., Alvarez, H., & Chiu, M. M. (1999). Building a culture of collaboration through hybrid language practices. Theory Into Practice, 38, 87–93.
Hamel, C., & Viau-Guay, A. (2019). Using video data to support teachers’ reflective practices: A literature review. Cogent Education, 6(1), 1673689. https://doi.org/10.1080/2331186X.2019.1673689
Harwood, D., & Collier, D. R. (2019). “Talk into my gopro, I’m making a movie!” Using digital ethnographic methods to explore children’s sociomaterial experiences in the woods. In N. Kucirkova, J. Rowsell, & G. Falloon (Eds.), The Routledge international handbook of learning with technology in early childhood (pp. 49–61). Routledge.
Hulme, R., Cracknell, D., & Owens, A. (2009). Learning in third space: Developing trans-professional understanding through practitioner enquiry. Educational Action Research, 17(4), 537–550. https://doi.org/10.1080/09650790903309391
Jordan, B., & Henderson, A. (1995). Interaction analysis: Foundations and practice. Journal of the Learning Sciences, 4(1), 39–103.
Kajamaa, A., & Kumpulainen, K. (2019). Agency in the making: Analyzing students’ transformative agency. Mind, Culture, and Activity, 26(3), 266–281. https://doi.org/10.1080/10749039.2019.1647547
Kenny, R. (2017). Introducing journal of formative design in learning. Journal of Formative Design in Learning, 1(1), 1–2. https://doi.org/10.1007/s41686-017-0006-0
Kumpulainen, K., & Kajamaa, A. (2020). Sociomaterial movements of students’ engagement in a school’s makerspace. British Journal of Educational Technology, 51(4), 1292–1307. https://doi.org/10.1111/bjet.12932
Lahlou, S. (2011). How can we capture the subject’s perspective? An evidence-based approach for the social scientist. Social Science Information, 50(3–4), 607–655. https://doi.org/10.1177/0539018411411033
Lahman, M. K. (2008). Always othered: Ethical research with children. Journal of Early Childhood Research, 6(3), 281–300. https://doi.org/10.1177/1476718X08094451
Nassauer, A., & Legewie, N. M. (2021). Video data analysis: A methodological frame for a novel research trend. Sociological Methods & Research, 50(1), 135–174. https://doi.org/10.1177/0049124118769093
Pasternak, E., Fenichel, R., & Marshall, A. N. (2017). Tips for creating a block language with Blockly. In 2017 IEEE blocks and beyond workshop (B&B) (pp. 21–24). IEEE. https://doi.org/10.1109/BLOCKS.2017.8120404
Pink, S. (2015). Going forward through the world: Thinking theoretically about first person perspective digital ethnography. Integrative Psychological and Behavioral Science, 49, 239–252. https://doi.org/10.1007/s12124-014-9292-0
Simpson, A., & Feyerabend, M. (2022). Tug-of-war: The pull of formal institutional practices and structures and the desire for personal change. International Journal of Science and Mathematics Education, 20, 149–168. https://doi.org/10.1007/s10763-020-10139-w
Simpson, A., & Kastberg, S. (2022). Makers do math! Legitimizing informal mathematical practices within making contexts. Journal of Humanistic Mathematics, 12(1), 40–75. https://doi.org/10.5642/jhummath.202201.05
Simpson, A., Burris, A., & Maltese, A. V. (2020a). Youth’s engagement as scientists and engineers in an after-school tinkering program. Research in Science Education, 50(1), 1–22. https://doi.org/10.1007/s11165-017-9678-3
Simpson, A., Williams-Pierce, C., & Kastberg, S. (2020b). When figured worlds fracture: A collaborative environment splintered by a non-collaborative tool. In M. Gresalfi & I. S. Horn (Eds.), Proceedings of the 14th International Conference of the Learning Sciences (Vol. 3, pp. 1673–1676). International Society of the Learning Sciences.
Steele, K. M., Blaser, B., & Cakmak, M. (2018). Accessible making: Designing makerspaces for accessibility. International Journal of Designs for Learning, 9(1), 114–121. https://doi.org/10.14434/ijdl.v9i1.22648

Chapter 13

Making a Framework for Formative Inquiry Within Integrated STEM Learning Environments

Stuart Kent White

Abstract  Emerging instructional design decisions impact development of instructional material in unique ways. Initial development of Integrated Chemistry Physics (ICP) course curriculum modules originated from the designer’s experience with student-centered formative inquiry-based modeling within K-12 science classrooms. Feedback from pilot schools regarding the initial authentic STEM education framework facilitated refinement of a design, make, investigate activity template. Decision-making processes based on making meaning, target audience assumptions, and connecting theory to practice resulted in a formative learning process integrated into the developed framework. The resulting formative STEM education framework merges scientific inquiry with engineering design processes. The developed framework facilitates an integrated STEM approach to science education where inquiry-based learning and engineering design processes happen simultaneously. Preliminary results favor improvements in student engagement while simultaneously illuminating implementation strategy issues.

Keywords  STEM · Integrated STEM · Inquiry · Formative learning process

S. K. White (*)
Purdue University, West Lafayette, IN, USA
e-mail: [email protected]

Introduction

An Authentic STEM Instructional Design

As a K-12 science teacher I am familiar with prepackaged curriculum requiring the purchase of specialized equipment and consumables. These curricular materials come ready-made to provide learners with opportunities for hands-on “experimentation” into textbook content. Publishers of these learning modules make every
effort to convince K-12 educators their products are student-centered, yet very seldom do packaged resources allow students to deviate from scripted learning experiences. Additionally, each one claims a standards-based approach where assessments are consistent with diverse learning communities. Finally, publishers claim to provide learners with learning experiences mimicking the experience of real-life professionals (authentic science education) and fitting the needs of every classroom student better than their competitors. Unfortunately, once purchased, teachers are required to either modify these resources to meet actual classroom needs or force students into a one-size-fits-all science instruction program. As twenty-first century academic standards require learners to become conversant with ever-increasing science content, instructors often turn to time-saving media-rich lectures (e.g., PowerPoint, Prezi, Google Slides) (Kraus, 2008), textbook reading assignments with electronic worksheets (Webb, 2008), and cookbook-style experimentation. Instructional designers responsible for K-12 science instructional material within this environment must recognize “an instructional method cannot guarantee that the desired learning will occur; it can only increase the probability that it will occur” (Reigeluth, 1997, p. 43). This subtle distinction between instructional material addressing how elements work together in meeting learning outcomes and how elements accomplish learning outcomes is especially important when developing formative authentic STEM education resources. Formative resources are defined as educational content/context that informs knowledge/skill development within subsequent learning activities (Fukuda et al., 2022). Over time, attempts by titan educational textbook companies to “improve” K-12 science laboratory experiments for every classroom resulted in recipe-like procedures. These cookbook-style labs take learners through a step-by-step process, allowing all students to collect homogeneous data and requiring very little mental input on the part of the learner (Brownell et al., 2012). Most of these experimental activities obtain the prescribed data by means of precisely engineered laboratory equipment purchased from specialty companies. Each laboratory experience is meant to “school” learners in scientific knowledge rather than engage them in authentic scientific inquiry (Lee & Butler, 2003). Efforts to break from these packaged laboratory experiences, formulated to provide ideal data for answering prescribed questions, necessitate the creation of a laboratory framework where learners design and build their own scientific investigation.

Abridged History of Inquiry-Based Learning

Today’s K-12 science classrooms have a complicated history of educational practices that familiarize students with the accumulation of scientific knowledge rather than science “as a method of thinking, an attitude of mind” (Dewey, 1910, p. 122). In the 1960s inquiry-based learning was promoted as an effective strategy to encourage learners to make “reasonable explanations of phenomena about which they are curious” and allow learners to “investigate problems via the mind [and] senses”
(Novak, 1964, p. 26). Later, researchers began campaigning for inquiry-based learning “initiated and controlled by the learner” (Suchman, 1965, p. 290) who faced “cognitive discomfort while attempting to acquire higher level thinking skills” (Costenson & Lawson, 1986, p. 155). Costenson and Lawson (1986) went on to advocate for the use of inquiry-based educational models constructed from materials “purchased at local stores or brought in from students’ homes” (p. 158) within the learning process. Further research into the use of everyday objects to construct functional experimental models shined a light on their contribution to improved student self-confidence, memory formation, and critical thinking development (Uno, 1990). Additional research identified advantages from inquiry-based learning in areas of question refinement, visualization, analysis and reporting of quantitative data, pattern recognition, application of arithmetic and statistical techniques, and others (Edelson et al., 1999, p. 397). Edelson (2001) went on to state that inquiry-based modeling pushed learners “beyond simple investigation use [to] provide support for artifact construction, expression, and record keeping” (p. 380). Eventual incorporation of computer technology into inquiry-based learning provided learners with “authentic experience[s] by engaging learners in a knowledge construction process” (Löhner et al., 2005, p. 442). Löhner et al. (2005) went on to clarify that built-in modeling supports must guide learners through “a systematic strategy, as well as to specific reasoning activities” (p. 458). Extensive research has been done into authentic hands-on inquiry-based learning experiences that allow students to engage with science content through a discovery process (Costenson & Lawson, 1986; Edelson, 2001; Edelson et al., 1999; Suchman, 1961, 1965). Experimentations and/or investigations learners engage with during inquiry-based approaches are said to combine inductive and deductive learning processes that “emphasizes [the] learner’s responsibility for discovering knowledge that is new to the learner” (Pedaste et al., 2015, p. 48). The distinguishing feature of inquiry-based learner engagement is active blending of new and prior knowledge. This type of knowledge construction requires learners to take a retrospective look at prior learning and adapt subsequent learning to meet future needs, a process Black and Wiliam (2010) identify as a formative learning process. Additionally, Wiliam and Thompson (2008) propose this formative retrospection take place over short cycles as the lesson progresses or between a series of short learning activities.

Science Education Through an Integrated STEM Context

The conception of formative resources engaging learners in authentic inquiry-based educational activities best fits an integrated STEM learning experience (Kelley & Knowles, 2016) where learners interact with industry-specific tasks and practices as they learn science. The promise of visualizing difficult science content through the use of authentic scientific tools and equipment is juxtaposed with learners who do not possess the skills necessary to interact with those tools and equipment in a meaningful way (Edelson et al., 1999). Edelson et al. (1999) pointed to inherent challenges in developing inquiry-based
learning in K-12 settings due to student inability to navigate authentic learning situations. Sanders (2008) went on to suggest current science education methods are failing to prepare today’s technology-savvy learners to face STEM challenges and prepare for anticipated STEM careers. Today’s conception of science is a “transdisciplinary subject in schools that integrate[s] the disciplines of science, technology, engineering, and mathematics into a single course of study” (Mitts, 2016, p. 30). However, finding an example of a K-12 integrated STEM classroom curriculum is difficult, mainly due to a lack of training needed to successfully teach integrated STEM (Ejiwale, 2013; Wang et al., 2011). Integrated STEM Education describes the process of combining educational approaches from two or more of the STEM subjects within a single learning experience (Shernoff et al., 2017). Successful implementation of instructional resources that address an integrated STEM approach is thought to rely on “… the foundations, pedagogies, curriculum, research, and contemporary issues of each of the STEM education disciplines, and to new integrative ideas, approaches, instructional materials, and curriculum” associated with each STEM subject (Sanders, 2008, p. 22). As science content educators focus on their area of expertise and attempt to bridge gaps between isolated subject matter content and the integrated nature of authentic learning experiences within STEM disciplines, science courses will become authentic formative learning settings (Hallström & Schönborn, 2019). This formative learning experience is a core tenet of today’s emerging research into making as an educational methodology, with research identifying matches between the tools, applications, and practices found in makerspaces and K-12 inquiry-based educational settings (Hynes & Hynes, 2018; Trust et al., 2018). This formative nature of making is also foundational to the design, make, investigate framework at the heart of maker STEM education. The purpose of this article is to illuminate efforts to incorporate making into inquiry-based science classrooms. The inquiry nature of science education combined with making asks learners to formatively design, construct, evaluate, and problem-solve as they focus on variables within an experiment (Löhner et al., 2005). Furthermore, researchers found that when inquiry-based STEM learning activities within a makerspace environment are leveraged properly, each step of the making process (i.e., design, construction, and evaluation of the experimental model) informs subsequent steps, leading to improved learner knowledge transfer compared to traditional inquiry-based learning dependent on purchased cookbook experimental models (Bevan et al., 2015; Hughes et al., 2018; Martin, 2015).

Design Problem

In 2016, a team of individuals at Purdue University concerned with student disinterest in STEM education and entry-level blue-collar career fields identified an educational bottleneck within Indiana science courses. All Indiana high school students are required to take a biology course followed by a series of additional science
content courses. Within this setting, approximately 60% of all Indiana high school graduates complete an integrated chemistry physics (ICP) course taught as an accumulation of oversimplified facts, in an effort to meet the needs of struggling students needing to earn credits towards graduation. Analysis of student disinterest in STEM from an engineering problem-solving mindset identified a learning gap between science content and science in application. Bridging this gap requires highly engaging science investigations that merge scientific methods with engineering principles and practices. My association with another Purdue educational program resulted in my joining the team as the instructional designer responsible for development of classroom resources to meet these goals. Designing classroom resources for others to use was a new experience, having only previously developed instructional material for my specific classroom. My prime responsibility was designing and developing a series of authentic STEM investigation activities. I began the only way I knew how: with a lengthy discussion with my supervisor on content expectations and my personal classroom experience. We established a conceptual framework based on analyzing the learning context, the learners, and the learning tasks (Smith & Ragan, 2004). The lens through which each task was viewed during the instructional design was “science is focused on simplifying processes to obtain a deep understanding of just one phenomena [while] engineering embraces the complexity of real-world devices and processes” (Caruthers & White, 2019). The mindset from which classroom resources were born was application of an authentic STEM setting – where science attempts to gain understanding of a single phenomenon, engineering attempts to describe two or more scientific principles occurring simultaneously, mathematics is used to communicate knowledge, and technology deals with devices used for data collection and analysis (Caruthers & White, 2019). In addition, my classroom experience and educational training have shown me that science, mathematics, and engineering are informed learning processes where prior learning is leveraged during subsequent activities to answer one or more overarching questions (i.e., a formative learning process). The resulting ICP modules were dubbed the Hardware Store Science (HSS) curriculum and utilize a makerspace classroom setting. Each developed HSS investigation was required to connect scientific methodologies with engineering design principles. In addition, each planned investigation apparatus had to be scalable, affordable, and sustainable in a manner that acknowledged the constraints on teachers and the school system. A secondary goal was preparing high school graduates for blue-collar STEM careers associated with manufacturing and the skilled trades.

First Drafts

I was hired shortly after first draft iterations of instructional material were developed around three investigations (energy storage, friction, material stiffness). These proposed K-12 classroom resources were developed based on experiential
expectations of Purdue University incoming undergraduate science students. Each investigation was a 10-page text-heavy document that detailed important science content, step-by-step build instructions, an initial investigation protocol, and a series of exploration questions. This document was expected to be shared with students to guide them through each phase of the investigation. Each investigation was built on a four-part framework (Fig. 13.1) which began with a short introduction to the purpose of the investigation and guiding question(s) for students to answer. This was followed by a set of key terms and a “Build It Yourself” section where students were given a set of general instructions for constructing a preliminary testing apparatus. Students used a “List of Supplies” and “Required Tools and Equipment” for construction and experimental data collection. Students were then given background information on the science content related to the investigation, including mathematical formulas for describing various aspects of the observable phenomena. Finally, students were provided a series of investigation steps for collecting data followed by a section labeled “Additional Explorations.”

Fig. 13.1  Initial draft framework for student progression through HSS investigation module (Purpose & Guiding Question → Construction of testing apparatus → Science content → Investigation Procedures)

Development of Formative Framework

During my time as a K-12 science teacher I had numerous opportunities to evaluate the instructional content of my classroom to determine whether it provided the learning experience(s) I intended for my students. Often, I made periodic judgements based on what I observed happening during the learning activity. Other times I reflected on what I remembered or relied on test performance to make those determinations. Over time I came to realize that I could apply these same principles to the instructional content I developed for my students. Based on these prior experiences, I began developing smaller classroom activities that informed subsequent activities in a learning unit, such that students could think back to previous learning activities to guide their thinking and understanding in “current” learning activities. Working through this series of learning experiences built toward a final assessment over the learning module or project content. I applied skills learned from these classroom experiences to the project, and the goal became to purposefully blend a series of learning activities to create an integrated STEM learning framework. The developed framework would assist students to build understanding by drawing on prior experience and understanding to promote learning during subsequent activities. It was assumed that when learners evaluate previous learning activities, this evaluation can be tied to vivid elaboration taking place during
the rehearsal process and we can expect students to construct robust memories (Karpicke, 2017). Memory is therefore defined as a persistent representation of prior knowledge, experience, or an event that is “reflected in thought or behavior” (Moscovitch, 2007, p. 17). It can then be argued that the act of retrieving, reflecting on, and applying the knowledge and understanding gained from a preceding learning activity (memory retrieval) to a current learning activity constitutes a formative learning process on the part of learners (Bennett, 2011). This constructed knowledge will then be tied to the interwoven nature of STEM applications.

Design Decisions

As I analyzed each initial HSS investigation draft document I recognized a dependence on large blocks of text and a lack of any unifying formative processes taking place. Integration of maker STEM content into existing K-12 science curriculum necessitates the creation of a document more closely resembling the cookie-cutter lab packets teachers and students are accustomed to using within the classroom. In addition, each step in the investigation process should provide key elements necessary for completion of subsequent processes and procedures. This document could then be shared with students to guide them through the learning activity. Feedback from piloting classrooms indicated students needed assistance in two key aspects of the learning activity: organizing data in a meaningful manner and identifying patterns once data is organized. During the fabrication process it is important to lay out each identified component, prior to assembly or modification, on the source material you are working with. The same is true for data collection. Students need to know the pieces of data they are working with and how they fit together before they can successfully construct meaning out of a scientific investigation. Providing students with an initial data table for recording expected measurements provides a starting point for subsequent explorations. The addition of guiding questions following the initial experimental procedures helps students identify how pieces of data fit together. Feedback from teachers piloting the suggested build processes indicated construction of the initially planned experimental apparatus was difficult for teachers to facilitate and for students to navigate. I came to realize not everyone is at my comfort level when it comes to the use of hand tools and power tools. It is easy to fall into the trap of thinking that if I can do it, anybody can. Not everyone grows up on a farm where fixing equipment is an everyday routine. Not every preservice teacher worked their way through school as a residential contractor. Not every in-service teacher is comfortable with supervising students as they use a hand saw to cut a board or a cordless drill to drill a hole. This necessitated the addition of content prior to the guiding question, elaborating the skill and knowledge expectations related to each activity.

Making Meaning

Implementation of these design decisions necessitated modification to the initial HSS investigation framework (Fig. 13.2).

Fig. 13.2  Final framework for student progression through HSS investigation module (Investigation Purpose → Design it Yourself → Build and Investigate → Data Analysis)

The first step in redesigning the investigation framework was to place an emphasis on the engineering design process before students build their experimental apparatus. All engineering projects include some sort of design of the intended final product. Often these take the form of sketches or CAD (computer-aided design) models. In addition, both scientific processes and engineering projects utilize data analysis to refine the experimental process or engineered product, ensuring goals and objectives are met. Addition of the what, how, and why of the investigation to the purpose paragraph allowed these critical components to inform initial design plans for addressing the investigation’s guiding question. Within the Design it Yourself section, students are encouraged to generate multiple design plans for their investigation apparatus. Students choose to either sketch these out on paper or use modeling software (i.e., Tinkercad) to generate 3D models. These models are then used to inform the build process during the next stage of the investigation process. Within the Build and Investigate portion of the framework, students are provided with a “List of Supplies” and “Required Tools and Equipment” dependent on apparatus build processes. Students utilize their design documentation to inform the construction of their experimental test apparatus. Available resources, tools, and measurement technology must be paired with the designed apparatus. Students make modifications to their design as they decide which resources will meet the demands of their design and the tools available to bring their design to life. This is followed by a “Background Information” section where students are provided with connections between science content and anticipated data collection. Here students are given information on the science content and mathematical formulas for describing various aspects of the observable phenomena. Students combine the background information with “Investigation Objectives” to identify appropriate technology for collecting required measurements. This could be as simple as using a tape measure, scale, and stopwatch, or it may involve the use of multimeters, recording devices, or oscilloscopes. As students make their way through the investigation they accumulate important details about science content, apparatus functioning, and data collection tools to inform decisions made during the Data Analysis section. In order to facilitate pattern identification and connections between observations and science content, students are provided a series of guiding questions. Each question addresses the
overarching question expressed in the Investigation Purpose section of each HSS investigation activity.

Target Audience Assumptions

Multiple assumptions were made during the development of the final HSS investigation framework. The first of these was the notion that documentation needs to resemble existing resources available to classroom teachers, and familiar to students. Providing teachers with a four- to six-page document with less text and the addition of graphics, data tables, and guiding questions will better facilitate progress through investigation stages in an understandable manner compared to text-heavy documentation. When this resource is organized within a formative framework it contributes to improved learning in an inquiry-based integrated STEM learning environment. High school science laboratory investigations are predominantly cookbook in nature, where students proceed through a series of steps leading to every group collecting the exact same data, identifying the same patterns, and drawing the same conclusions. A second assumption was that providing learners with generalized testing apparatus build instructions would assist all learners in constructing their experimental setup. Build instructions became a set of tips for connecting parts together, informing design decisions, and troubleshooting common construction errors related to foundational science concepts such as friction, equilibrium, and energy conservation. Because the target audience consists of high school science courses, a third assumption was that learners would have experience identifying dependent and independent variables, so these were simply alluded to rather than spelled out within the investigation documentation. In addition, high school science students are also familiar with filling in predefined data tables and creating graphs. For this reason, students are provided with a table for collecting expected measurements (i.e., mass, distance, time) and then tasked with determining how to represent the data graphically. This allowed students to evaluate quantitative data mathematically while simultaneously identifying patterns and relationships.
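
As one concrete, entirely hypothetical illustration of this assumption, the sketch below uses Python’s csv module to lay out the kind of predefined measurement table described here, with mass, distance, and time columns and blank rows for trials; the actual HSS handouts are paper documents and may be organized differently.

```python
# Minimal sketch of a blank, predefined data table; column names and the number of
# trial rows are assumptions for illustration, not the actual HSS handout layout.
import csv

columns = ["Trial", "Mass (g)", "Distance (cm)", "Time (s)"]

with open("hss_data_table.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(columns)          # header row students see before collecting data
    for trial in range(1, 6):         # five blank rows, one per trial
        writer.writerow([trial, "", "", ""])

print("Blank data table written to hss_data_table.csv")
```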

Connecting Practice to Theory

Principles of inquiry-based learning were seen as integral to engaging learners in authentic science exploration. Each investigation includes follow-up exploratory scenarios meant to guide learners in conducting additional research on a topic of interest. Scenarios were stated in a manner requiring the use of knowledge gained during an initial investigation in order to develop an experimental protocol and variables of subsequent investigations.


Pilot testing feedback indicated resistance to venturing beyond cookbook-style confirmatory laboratory experiments and assigned textbook reading. Improvement of investigation documentation and procedures focused on making experimental protocols more user-friendly for both teachers and students. Implementation of feedback became a balancing act of addressing teacher desires and managing evidence-based research related to methodology, pedagogy, and practices associated with integrated STEM approaches. Decisions such as replacements for the marbles used when collecting associated mass measurements, restructuring written resources so that content was age-appropriate, addressing classroom time and equipment constraints, and overcoming roadblocks due to teacher-specific educational approaches were paramount. The addition of the design step in the HSS investigation framework necessitated instruction in model representation. The initial thought was for students to generate hand-drawn sketches. However, it quickly became apparent that not all students possessed the same skill level. Subsequently, hand-drawn designs were replaced with modeling software for creating 3D images of models, adding to the engineering and technology components of STEM literacy. The use of modeling software came with the added bonus of allowing students the opportunity to mimic assembly of their testing apparatus prior to the build process. Mathematics principles are presented within each lesson as a means of communicating results and predicting outcomes (e.g., measurement, graphing, data collection and analysis methods, etc.). Engineering principles and associated technology are introduced as students become more comfortable building successive prototypes of their testing models. More advanced forms of technology, modeling software, data collection tools, and 3D printing could follow. Guiding questions associated with the analysis of collected data ask students to analyze the data based on targeted scientific concepts, using questions like “Are the results linear?”, “Does it take twice the weight to stretch two rubber bands as far as one?”, and “How much more mass is required to…?” Students are then asked to relate these patterns to the stated experimental purpose and justify their response to the guiding question(s).
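
To make the intent of such guiding questions concrete, here is a minimal sketch, assuming Python with NumPy and entirely invented measurements (it is not part of the HSS materials), of how mass-versus-stretch data might be checked for linearity and how one rubber band might be compared with two side by side.

```python
# Minimal sketch of the analysis the guiding questions point toward; all data invented.
import numpy as np

mass_g = np.array([50, 100, 150, 200, 250])            # hanging mass in grams (hypothetical)
stretch_one_cm = np.array([1.1, 2.0, 3.2, 4.1, 4.9])   # one rubber band (hypothetical)
stretch_two_cm = np.array([0.6, 1.1, 1.5, 2.1, 2.6])   # two bands side by side (hypothetical)

def linear_fit(x, y):
    """Fit y = slope*x + intercept and report how well a line describes the data."""
    slope, intercept = np.polyfit(x, y, 1)
    predicted = slope * x + intercept
    r_squared = 1 - np.sum((y - predicted) ** 2) / np.sum((y - np.mean(y)) ** 2)
    return slope, r_squared

slope_one, r2_one = linear_fit(mass_g, stretch_one_cm)
slope_two, r2_two = linear_fit(mass_g, stretch_two_cm)

print(f"One band:  {slope_one:.4f} cm/g, R^2 = {r2_one:.3f}")
print(f"Two bands: {slope_two:.4f} cm/g, R^2 = {r2_two:.3f}")
# If the two-band slope is about half the one-band slope, roughly twice the mass is
# needed to produce the same stretch, which is one way to answer the guiding question.
```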

Conclusion

Every effort was made to determine the most engaging way of connecting learners with STEM content in a formative inquiry-based authentic manner. Authentic STEM education where making plays a central role comes with a unique set of challenges that must be addressed during the development stage of instructional design. Because of time and budgetary constraints associated with K-12 education, understanding what will be required by the educator and from the educator must take the highest priority. Next, attention focused on educator needs must transition to the learners’ needs and clearly defined learning goals. Once consideration has been given to learning objectives and outcomes, each developed instructional tool must
reflect the resources available within the classroom/school itself. Attention can then be turned to formative processes providing the right learning environment for each activity – what prior knowledge does the learner need to successfully engage with the subsequent learning experiences? Educators must provide learners with opportunities to engage with STEM subject matter in a formative inquiry-based manner that prepares them for the highly specialized STEM careers of the future. Consideration must therefore be given to how career specific industry training and formative practices can be replicated within the K-12 learning environment.


Chapter 14

Measuring Informal Learning: Formative Feedback Towards the Validity of the Informal SOM-SCI

Andrew A. Tawfik, Linda Payne, and Carolyn R. Kaldon
University of Memphis, Memphis, TN, USA

Abstract  This article seeks to describe the role of formative feedback in the development of an informal learning survey instrument (“Informal SOM-Science”). Whereas many prior instruments measure elements of formal problem-solving, they are often situated within traditional K-12 contexts. As educators explore other avenues to offer access to instruction, it is important that instruments are developed that capture how learning occurs within these unique settings. The proposed Informal SOM-Science includes the following: Classroom Organization, Instructional Orientation, Instructional Strategies, Student Activities, and Technology Use. Formative feedback highlights unique distinctions around the role of educators and the context of learning within libraries.

Keywords  Library · Informal learning · Problem-solving · Face validity

Libraries as Addressing STEM Equity Issues in Underserved Urban Settings

To date, studies show there is a considerable need to address equity gaps in education, especially for STEM students in underserved and marginalized settings (Shtivelband et al., 2017). Recent data published by the National Assessment of Educational Progress (NAEP) suggest that only 34% of eighth-graders are “proficient” or “advanced” in science. Related literature suggests that marginalized populations in urban settings encounter disparities at disproportionate rates, which perpetuate extant and future inequalities (Darling-Hammond, 2014; Gómez & Suárez, 2020). Hence, there is a pressing need not just to consider adoption of effective instructional strategies, but to revisit the broader systems and settings that support education for all learners.


The National Research Council argues for a more holistic view to address challenges in educational settings, including museums, community centers, and other venues. Also included are libraries, which are uniquely suited to meet the educational needs of learners within their communities. Indeed, libraries already serve as rich knowledge repositories and support twenty-first century skills such as inquiry, information seeking, and digital literacy. When extending learning beyond just the K-12 classroom perspective, urban libraries act as “connected learning hubs” that address issues of equity because they are free, readily accessible, and situated within a community (Houghton et al., 2013). Tawfik et al. (2021) further argue that libraries are distinctively suited to be part of the STEM ecosystem for communities given their (a) collaborative learning spaces, (b) access portals to open-educational resources and diverse digital materials, and (c) opportunities to develop research skills. As the library innovates its role within communities, studies detail how this setting can support informal learning (Hassinger-Das et al., 2020). Specifically, the literature describes how informal learning supports domains such as STEM (Roberson, 2015; Smith & Tyler-Wood, 2020; Weintrop et al., 2021; Yang et al., 2021), digital literacy (Brown & Kasper, 2013; Hassinger-Das et al., 2020; Subramaniam et al., 2021), and civic engagement (Hollett, 2016; Hollett & Ehret, 2017). In addition to providing access to digital resources, research finds gains in cognitive (Brown & Kasper, 2013; Smith & Tyler-Wood, 2020; Yang et al., 2021) and affective learning outcomes (Hollett & Ehret, 2017; Weintrop et al., 2021). Collectively, these data underscore the diversity of ways learning is supported as libraries serve as informal learning portals (Fig. 14.1).

Another area of interest relates to the instructional strategies employed within informal learning programs, especially in libraries. These informal programs often employ a form of inquiry-based learning (IBL) for STEM instruction (Kim et al., 2019) that includes a case-structured curriculum, collaborative learning, self-directed learning, educator facilitation, and reflection. It is argued that inquiry-based learning represents a more complete introduction to the culture of STEM and collaborative learning than traditional teaching methods, especially given its emphasis on contextualized problem-solving (Lazonder & Harmsen, 2016). However, data underscore the difficulty of administering this instructional strategy in libraries because these settings often employ K-12 standardized curricula even though librarians are not trained on the conceptual aspects of the curriculum (Shtivelband et al., 2017). As such, there is a growing call for further empirical validation of the strategies that work within informal learning (Feder & Jolly, 2017; National Research Council, 2015).


Fig. 14.1  Learning ecosystem. (Adapted From Feder and Jolly (2017))

Goal of the Project – Instrument Development for Informal STEM Learning in Libraries

Libraries are an important aspect of the learning ecosystem given their placement within communities and access to various resources. Libraries exhibit diversity because the learning programs vary in terms of time (after school, weekend), personnel (external community member, in-house librarian), and levels of expertise (domain expert or information-seeking expert). Feder and Jolly (2017) thus conclude that learning in these settings “is an increasingly important component of our educational system, but one that has been difficult to measure since it is so diverse.” Whereas multiple instruments have been developed to measure learning within formal settings, valid and reliable assessments are needed to accurately measure the impact of libraries on informal learning. This would allow researchers to empirically validate the conceptual claims of libraries as informal learning centers, while also identifying issues that should be addressed in subsequent research. Given this gap in available instruments, the research team sought to develop an instrument that measures aspects of informal STEM learning, especially for K-12 audiences whose learning takes place during “out-of-school time” (OST). The project began the face validity process with an established, validated instrument employed in school observation (the School Observation Measure (SOM)), which was later tailored to capture characteristics of out-of-school time learning. While the original SOM has established reliability and validity (Lowther et al., 2003, 2012; Ross et al., 2004; Sterbinsky & Ross, 2003), the goal of the project was to support the psychometric process of face validity for the “Informal SOM Science.”

School Observation Measure (SOM)

The School Observation Measure (SOM) was developed at the Center for Research in Educational Policy at the University of Memphis to determine the extent to which different common and alternative teaching practices are used throughout an entire school (Ross et al., 2004). The goal of the SOM is to record, through direct classroom observations, the presence and frequency of 24 instructional practices for a particular program, in one or more grades, or at the whole-school level (Lowther et al., 2003, 2012; Ross et al., 2004; Sterbinsky & Ross, 2003). The SOM strategies include traditional practices (e.g., direct instruction and independent seatwork) and alternative, predominantly student-centered methods associated with educational reforms (e.g., cooperative learning, project-based learning, inquiry, discussion, and using technology as a learning tool). The strategies were identified through surveys and discussions involving policy makers, researchers, administrators, and teachers as those most useful for providing indicators of schools’ instructional philosophies and implementations of commonly used reform designs.

The standard, or whole-school, SOM procedure involves observers visiting 10–12 randomly selected classrooms, for 15 min each, during a 3-hour visitation period. The SOM can also be used to conduct targeted observations in which one teacher is observed for 45–60 min. The observer examines classroom events and activities descriptively, not judgmentally. Notes are taken on the use or non-use of the 24 target strategies, with the notes form completed every 15 min of the lesson. At the conclusion of the 3-hour visit, the observer summarizes on a SOM Data Summary Form the frequency with which each strategy was observed across all classes. The frequency is recorded via a 5-point rubric that ranges from (0) Not Observed to (4) Extensively Observed. Two global items are used to rate the level of academically focused instructional time and the degree of student attention and interest.

In a reliability study, pairs of trained observers selected the identical overall response on the five-category rubric on 67% of the items and were within one category on 95% of the items. Further results establishing the reliability and validity of the SOM are provided in the Lewis et al. (1999) report. In a reliability study using Generalizability Theory, Sterbinsky and Ross (2003) found reliability at the 0.74 level for five SOMs conducted at a school; reliability increased to 0.82 with 8 SOMs and to 0.85 with 10 SOMs conducted at a school. To date, the SOM has been used to measure classroom instructional methods across different goals and settings, such as the impact of inquiry-based instructional practices on student learning within four secondary STEM classes (Hall & Miro, 2016) and direct observations of laptop implementation and professional development in more than 400 classrooms in 50 K-12 schools within 11 districts in Florida (Dawson et al., 2008).
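To make the agreement figures concrete, the short sketch below shows how exact and within-one-category agreement between two observers could be computed on the 0–4 rubric. The paired ratings are hypothetical and are not data from the SOM reliability studies; this is only a minimal illustration of the calculation, not the study’s procedure.

```python
# Hypothetical ratings from two trained observers on the same ten observation items,
# using the SOM rubric: 0 = Not Observed ... 4 = Extensively Observed.
observer_a = [0, 2, 4, 1, 3, 0, 2, 2, 4, 1]
observer_b = [0, 3, 4, 1, 2, 0, 2, 1, 4, 1]

pairs = list(zip(observer_a, observer_b))
exact_agreement = sum(a == b for a, b in pairs) / len(pairs)
within_one = sum(abs(a - b) <= 1 for a, b in pairs) / len(pairs)

print(f"Exact agreement: {exact_agreement:.0%}")   # 70% for this invented sample
print(f"Within one category: {within_one:.0%}")    # 100% for this invented sample
```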


As research progressed, other domain-specific derivatives of the instrument emerged. For example, the SOM Science (SOM-SCI) was adapted from the original SOM and created specifically to observe science classes. The SOM-SCI consists of the following five constructs: classroom organization for science instruction, instructional orientation, instructional strategies, student activities, and individual or group inquiry or research. Tables 14.1, 14.2, 14.3, 14.4 and 14.5 list examples of the specific elements observed within the classroom for each of the aforementioned SOM constructs.

Table 14.1  Example items for classroom organization for science instruction within the SOM-SCI (each item rated: Not observed, Rarely, Occasionally, Frequently, Extensively)
1.A. Theatre-style seating (rows, horseshoe, or U-shaped)
1.B. Science work centers in the regular classroom
1.C. Dedicated science laboratory

Table 14.2  Example items for instructional orientation within the SOM-SCI (each item rated: Not observed, Rarely, Occasionally, Frequently, Extensively)
2.A. Teacher-driven or controlled (e.g., lecture, Q&A, formative assessment)
2.B. Student-driven (inquiry-based learning, cooperative learning)

Table 14.3  Example items for instructional strategies within the SOM-SCI (each item rated: Not observed, Rarely, Occasionally, Frequently, Extensively)
3.A. Direct instruction
3.B. Modeling or demonstration
3.C. Cooperative/Collaborative learning

Table 14.4  Example items for student activities within the SOM-SCI (each item rated: Not observed, Rarely, Occasionally, Frequently, Extensively)
4.A. Independent seatwork
4.B. Organizational writing
4.C. Sustained writing/composition


Table 14.5  Example items for individual or group inquiry or research within the SOM-SCI (each item rated: Not observed, Rarely, Occasionally, Frequently, Extensively)
5.A. Students making predictions or hypothesizing
5.B. Students recording evidence
5.C. Students reporting out

Content Validity: Additional Measures of Classroom Problem-Solving

While the SOM and SOM-SCI provide feedback about instructional strategies, they were originally developed for formal approaches and classroom settings. To better understand broader instructional contexts, we explored instruments that focus on specific elements of problem-solving that are inherent within informal learning settings. For example, the domains of the Dimensions of Success (DoS; Shah et al., 2018) instrument describe constructs that are important for informal and problem-solving STEM learning design objectives (i.e., features of the learning environment, activity engagement, STEM knowledge and practices, and youth development in STEM). That said, other elements unique to informal learning within libraries are not described, such as information seeking and digital literacy. Furthermore, the DoS assessment tool did not address collaboration, which many argue is a key element of informal STEM learning design (Tawfik et al., 2021; Yang et al., 2021). Other instruments, such as the Science Instructional Practices Survey (SIPS; Hayes et al., 2016), measure some aspects of informal K-12 STEM learning (e.g., empirical investigation, evaluation and explanation, science discourse and communication, and engaging prior knowledge), but also include more traditional, teacher-centered forms of teaching than was appropriate for our instrument needs.

The research team also considered the Stanford NGSS Assessment Project (SNAP) assessment instruments (Stanford University, 2018) developed for NGSS Performance Expectations (PEs) in the areas of Earth and Space Science, Life Science, Physical Science, and Engineering. The SNAP assessments include Short Response Items (SRIs), Short Performance Assessments (SPAs), and Instructionally-Embedded Assessments (IEAs). SRIs call upon students to choose from a list of options (selected-response) or to write the answer rather than selecting it (constructed-response). They are designed to be administered to individuals or groups and cover items connected to the NGSS PEs. Essential to SRI development is administering problem-solving activities with diverse groups of learners to test whether the short-response items evoke the reasoning among students that they are intended to assess. SPAs are designed to last about 30 min and to be taken individually at the end of a lesson central to the NGSS PE. Alternatively, IEAs are designed to assess the three dimensions of a particular PE, along with other dimensions, and to be embedded into longer-term projects or curricula or woven into a short-term, stand-alone series of lessons.


While these are valuable assessment tools, they may not be the right choice for middle-school learners in an afterschool informal learning program designed to last a week per topic, especially considering the attrition level reported by the librarians interviewed and the amount of time needed to perform the assessments (e.g., 30 min per lesson for SPAs and 3–5 days for the IEAs). Finally, the team considered adoption of the SRIs, but felt they were too formal for our OST and informal learning library setting.

Another assessment instrument the team reviewed was the Systematic Characterization of Inquiry Instruction in Early learning Classroom Environments (SCIIENCE), which has been used in various settings. Although this instrument was developed to capture characteristics applicable to our project (e.g., teacher behaviors, promotion of scientific learning, instructional practices, higher-level thinking, and student engagement), it was designed specifically for teacher performance within PK-3 classrooms (Kaderavek et al., 2015). The team considered adapting the SCIIENCE instrument for our grade 5–7 learners, but ultimately determined that a new “Informal SOM Science” would be better aligned with the specific assessment needs, especially considering the target age group, the emphasis on teacher performance, and the method of collection (in-person survey versus video) involved in the SCIIENCE instrument.

Towards Validity of the Informal SOM Science: Development of the Informal School Observational Measurement (SOM)-Science

While the aforementioned instruments align with STEM learning goals and help capture aspects of problem-solving within formal classrooms, our goal was to design an assessment instrument that measures inquiry-based STEM education within informal learning environments. Given the increasing role of libraries as locations for informal learning, the research team also wanted to ensure that the instrument simultaneously considered the goals of the Next Generation Science Standards (NGSS) and the American Association of School Librarians (AASL) standards. The research team thus developed an instrument that was primarily derived from the original SOM and designed to capture aspects of inquiry-based activities within informal learning contexts. In addition, the Informal SOM-SCI integrated other aspects of the prior instruments described earlier, especially from the SIPS, including questions related to seating for group work, STEM literacy supports, independent and collaborative student work, hands-on learning, forming a hypothesis (e.g., “what if” or “if/then” questions), explaining reasoning behind an idea, gathering evidence to support or disprove one’s own hypothesis, critically synthesizing data, designing and testing a model/prototype, using the technology provided (e.g., during collaborations, simulations, research, experiment/prototype designs, reflections, and presentations), and reflecting on reasons for revisions (e.g., “what happened,” “why did it happen,” and “what does it mean”).


The final proposed constructs/categories of the Informal SOM-SCI items are as follows: (1) classroom organization, (2) instructional orientation, (3) instructional strategies, (4) student activities, and (5) technology use. Below is a further description of each construct, along with the specific questions that instantiate that aspect of problem-solving within informal learning environments.

1. The Classroom Organization questions assessed whether student seating was supportive of group work; whether STEM materials and supports were readily accessible to students; and whether students used STEM literacy supports/library materials (Table 14.6).
2. The Instructional Orientation questions examined whether teachers/educators led the instruction and/or provided guiding questions or a framework. Questions also looked at how students worked (e.g., collaboratively, independently) (Table 14.7).
3. The Instructional Strategies category included questions about the level of direct instruction to the whole class, the levels of modeling, feedback, and support provided, and the use of higher-level questioning strategies (Table 14.8).
4. The Student Activities category included the most questions of all the categories. These included the level of engagement, self-directed learning, hypothesizing, making predictions, gathering evidence, designing models, giving/receiving critiques, and reflecting (Table 14.9).
5. The Technology Use questions assessed the amount of collaborating, model/prototype designing, simulating, and gathering of internet reference materials (Table 14.10).

Table 14.6  Items for classroom organization (each item rated: Never, Rarely, Occasionally, Frequently, Extensively)
1.A. How often is student seating supportive of group work (i.e., tables, physical spaces, etc.)?
1.B. How often are STEM materials/supports readily accessible for students?
1.C. How often do students use STEM literacy supports (e.g., notebooks, books, posters, library materials)?

Table 14.7  Items for instructional orientation (each item rated: Never, Rarely, Occasionally, Frequently, Extensively)
2.A. How often do educators/facilitators support instruction (e.g., Q&A)?
2.B. How often do students work independently?
2.C. How often do students work collaboratively?
2.D. How often does the educator/facilitator provide guiding questions or a framework?

Table 14.8  Items for instructional strategies (each item rated: Never, Rarely, Occasionally, Frequently, Extensively)
3.A. How often does the educator/facilitator provide direct instruction to the whole class?
3.B. How often does the educator/facilitator provide high-level feedback?
3.C. How often does the educator/facilitator provide modeling or demonstration?
3.D. How often does the educator/facilitator use higher-level questioning strategies (e.g., “why,” “explain,” “how,” or “what if”)?
3.E. How often does the educator/facilitator provide direct support to students (e.g., scaffolding information seeking)?

Table 14.9  Items for student activities (each item rated: Never, Rarely, Occasionally, Frequently, Extensively)
4.A. How often do students engage in hands-on learning or investigations?
4.B. How often do students direct their own learning?
4.C. How often do students make predictions regarding the outcomes of experiments, tasks, or prototypes/models?
4.D. How often do students hypothesize either individually or in groups (e.g., ask “what if” or “if/then” questions)?
4.E. How often do students design an experiment or model/prototype?
4.F. How often do students gather evidence to prove or disprove their own hypothesis?
4.G. How often do students ask higher-order questions?
4.H. How often do students critique the ideas/work of other students?
4.I. How often do students receive critiques on their ideas/work from other students?
4.J. How often do students explain their reasoning behind an idea/claim?
4.K. How often do students make their own decisions on what they do?
4.L. How often do students make real-time adaptations to their project?
4.M. How often do students critically synthesize data/evidence?
4.N. How often do students reflect on reasons for revisions (e.g., “what happened,” “why did it happen,” and “what does that mean”)?
4.O. How often do students share among the group what they found?


Table 14.10  Items for technology use (each item rated: Never, Rarely, Occasionally, Frequently, Extensively)
5.A. How often do students use technology to collaborate?
5.B. How often do students use technology to design an experiment, model, or prototype?
5.C. How often do students use technology to apply their knowledge and view outcomes?
5.D. How often do students use technology to find internet-based reference materials (e.g., YouTube, Google, Wikipedia, etc.)?
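For readers who plan to score or summarize observations, the sketch below shows one possible, unofficial way to encode the rating scale and construct/item structure of the Informal SOM-SCI for analysis. The item codes simply mirror Tables 14.6–14.10 with their wording abbreviated; the function and its name are illustrative assumptions, not part of the instrument.

```python
# A minimal, unofficial sketch of encoding Informal SOM-SCI observations for analysis.
# Item wording is abbreviated; see Tables 14.6-14.10 for the full items.

SCALE = {"Never": 0, "Rarely": 1, "Occasionally": 2, "Frequently": 3, "Extensively": 4}

CONSTRUCTS = {
    "Classroom Organization": ["1.A", "1.B", "1.C"],
    "Instructional Orientation": ["2.A", "2.B", "2.C", "2.D"],
    "Instructional Strategies": ["3.A", "3.B", "3.C", "3.D", "3.E"],
    "Student Activities": [f"4.{c}" for c in "ABCDEFGHIJKLMNO"],
    "Technology Use": ["5.A", "5.B", "5.C", "5.D"],
}

def construct_means(observation):
    """observation: dict mapping an item code (e.g., '1.A') to the rating label chosen."""
    means = {}
    for construct, items in CONSTRUCTS.items():
        scores = [SCALE[observation[item]] for item in items if item in observation]
        means[construct] = sum(scores) / len(scores) if scores else None
    return means

# Example with a partially completed, hypothetical observation:
example = {"1.A": "Frequently", "1.B": "Occasionally", "2.B": "Rarely", "2.C": "Extensively"}
print(construct_means(example))
```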

Face Validity: Formative Feedback

A critical element of instrument development is conducting validity checks with multiple sources and stakeholders to ensure the instrument achieves its intended goal. To that end, after the preliminary Informal SOM-SCI was fully drafted, the research team met with multiple librarians who are experts in informal learning. Specifically, the subject-matter experts who provided formative feedback included two librarians, one from a rural K-12 context and one from an urban context. The formative feedback elicited critical assumptions and potential threats to validity, which primarily focused on the following: the role of the educator, learner activities, and assumptions about the physical space. The formative feedback for each is described below, along with the changes made to the instrument.

Formative feedback highlighted the role of the educator within informal learning, especially the dynamic between teacher and librarian. For questions within the “Instructional Orientation” and “Instructional Strategies” sections, the librarians found the items unclear because the instrument used the term “teacher.” The stakeholders needed clarification as to whether this was in reference to themselves, or whether it alluded to a co-teaching model often employed within formal learning environments. When asked to elaborate, the librarians described how they saw themselves as essential to learning, but predominantly in a supportive role because they were not technically the instructor of record. Rather, they often provide a space for guided inquiry around the topics introduced by the primary teacher. The librarians also detailed how they frequently rely on volunteers who take on an educator role and who may be external to their learning context. Rather than solely using the term “teachers,” the formative feedback led the research team to adopt broader, more inclusive terminology, such as “educator.”

Another area of formative feedback related to the instructional strategies, especially in Question 2. For example, there was considerable discussion around the question “How often does the educator/facilitator provide high-level feedback?” The references to “lead instruction” or “provide guiding questions” within the instrument were also subject to debate. The librarians noted that a unique aspect of their role is to simultaneously support guided inquiry and interest among the students, while also being mindful of the primary teachers’ curricular goals. At the same time, the librarians wanted to enact their own vision of the space in terms of reading programs or digital literacy.


The librarians underscored that a major aspect of inquiry is the redirection that should take place as students engage in information-seeking. The instrument was thus amended to use verbiage such as “facilitate,” which better encompassed their role as educator. Similarly, other questions discussed included “How often are STEM materials/supports visibly available for students?” and “How often does the educator/facilitator provide direct support to students (e.g., hands-on coaching)?” The librarians again noted that their expertise is in information-seeking with digital resources, so they wanted to see more of an emphasis on redirecting and iterative inquiry.

Additional formative feedback focused on the questions related to the learners. Question 4.O (originally worded as “Report their findings to the group”) was one issue addressed during the interviews with the librarians. The librarians noted that the terminology seemingly implies compulsory accountability structures that are more appropriate within formal education. In addition, the flexibility of libraries affords options for individual learning experiences (e.g., Coding Club, reading program), which require varying degrees of collaboration within the curriculum. They instead suggested that the question be reframed as “Share among the group what they found.”

The participants also provided formative feedback regarding the physical space and how it differed from traditional classrooms. In the early version of the instrument, the terminology included verbiage such as “desks” and other references to more structured learning spaces (e.g., Question 1.A). However, libraries allow for flexible seating, openness, and access to multimodal resources (e.g., Makerspaces, 3D printers). Similarly, the question “How often are STEM materials/supports visibly available for students?” is arguably more appropriate for K-12 classroom settings. The librarians noted that the informal nature of libraries allows for various environments and is thus especially focused on inclusivity. Beyond overlooking digital resources, the “visibly available” aspect of the question may overlook students who experience sight challenges or have difficulty viewing resources from a wheelchair. As such, they suggested changing the term from “visibly available” to “readily accessible.”

Another notable aspect of the feedback related to the technology resources. Although Question 5 references many of the tools that are available within the library, the instrument omitted technologies that may be unique to the library setting. The librarians highlighted examples such as search databases, mobile technology, and Makerspaces. In doing so, they wanted to ensure that technology was not restrictive, but rather supported open inquiry and artifact creation within the informal learning setting. In another example, the librarians discussed Question 5.C, “How often do students use technology to simulate?” They provided feedback that the term “simulate” may be difficult to interpret and questioned whether this referred to a technological simulation or some other learning activity. This also raised the question of how simulation might play out with other problem-solving activities found in the library, such as Makerspaces and after-school programs (e.g., a digital literacy curriculum).

Future Directions

When creating or adapting an existing instrument, there are multiple steps or stages that researchers should go through to ensure the newly developed assessment is both reliable and valid. Any time an instrument is employed within a new population, it is necessary to assess its reliability in some form for that particular sample.


One way is to calculate measures of internal consistency (e.g., coefficient alpha) to see whether the instrument performs similarly to how it performs in other populations and meets acceptable internal consistency thresholds. If an instrument is designed to measure growth, researchers may also need to conduct test-retest reliability to ensure the instrument performs consistently over time.

The steps one takes in terms of validity will depend on the psychometric rigor desired and the purpose of a given instrument. For example, in some instances it is sufficient to obtain face validity and content validity by working with experts in the content area of the instrument. For other types of instruments, researchers may need to establish convergent validity to illustrate that the instrument measures the same constructs as other, similar validated instruments, or divergent validity to show that the instrument measures constructs different from those of other similar measures. Finally, if item-level characteristics are desired, or the instrument is proposing to measure underlying constructs, researchers may need to conduct exploratory factor analysis or item-level analyses to examine these aspects of the instrument. Overall, as the Informal SOM-SCI moves forward into field testing, the information we learn from experts, users, and learners will guide any revisions made to the structure and content of the instrument, which in turn will guide the strategies that will be appropriate for formally establishing its initial reliability and validity.
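As one illustration of such a reliability check, the sketch below computes coefficient (Cronbach’s) alpha for a single construct from a respondent-by-item matrix using the standard formula. The ratings are hypothetical and are not Informal SOM-SCI data; the use of NumPy is an assumption for illustration.

```python
import numpy as np

# Hypothetical ratings: rows are observations/respondents, columns are the items
# of one construct, scored 0-4 on the Never...Extensively scale.
ratings = np.array([
    [4, 3, 4, 3],
    [2, 2, 1, 2],
    [3, 3, 3, 4],
    [1, 0, 1, 1],
    [4, 4, 3, 4],
])

k = ratings.shape[1]                              # number of items
item_variances = ratings.var(axis=0, ddof=1)      # variance of each item
total_variance = ratings.sum(axis=1).var(ddof=1)  # variance of the summed scores

alpha = (k / (k - 1)) * (1 - item_variances.sum() / total_variance)
print(f"Cronbach's alpha = {alpha:.2f}")
```

Resulting values are commonly judged against thresholds around 0.70 and above, although acceptable levels depend on the purpose of the instrument.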

Conclusion

As the field of education continues to focus on equity and ways to support all learners, researchers are exploring ways to support learners in diverse settings. This has given rise to informal learning that takes place in settings such as libraries, after-school programs, museums, community centers, and others. The National Research Council suggests these programs are characterized by the following: (a) they engage learners in ways that address cognitive and socio-emotional needs; (b) they connect to learners’ interests and cultures; and (c) they connect with formal learning and home environments. While there is a rich body of instruments that capture learning within formal settings, researchers have noted that few instruments capture the nuance of informal learning settings. This is important in multiple respects. Because of the differences between informal and K-12 contexts, it is important to employ protocols that capture the nuances of students’ experience. If no validated instruments exist to measure learning within informal learning contexts, it may be difficult to empirically assess the degree to which learning takes place and how it may be an important component of the broader learning ecosystem. This therefore impacts researchers, policymakers, and practitioners who wish to provide broader and more equitable learning experiences in diverse settings.

This chapter provides the context for establishing the Informal SOM-SCI and highlights the important role of formative feedback for face validity in instrument development. The formative data especially identified potential biases about education and context, which allowed the researchers to apply the formative feedback towards future versions of the instrument.


The areas specifically focused on were the following: the physical space, the role of the educator, and learner activities. The findings also underscore the differential nature of formal and informal learning environments. Rather than making a one-to-one comparison, it is important to understand the underlying differences that make informal learning environments potentially powerful. For example, the librarians noted that the physical space affords more exploration. There were also differences in terms of how the educators and their instructional goals are perceived: rather than seeing themselves as disseminators of domain concepts, the librarians focused on the iteration of information-seeking and inquiry. Finally, the learner activities centered on unique inquiry activities, especially in spaces that emphasize access to and interaction with various learning resources.

References Brown, R.  T., & Kasper, T. (2013). The fusion of literacy and games: A case study in assessing the goals of a library video game program. Library Trends, 61(4), 755–778. https://doi. org/10.1353/lib.2013.0012 Darling-Hammond, L. (2014). What can PISA tell us about US education policy? New England Journal of Public Policy, 26(1), 4. https://scholarworks.umb.edu/nejpp/vol26/iss1/4 Dawson, K., Cavanaugh, C., & Ritzhaupt, A. D. (2008). Florida’s EETT leveraging laptops initiative and its impact on teaching practices. International Journal of Information and Communication Technology Education, 41(2), 143–159. https://doi.org/10.1080/15391523.2008/10782526 Feder, M., & Jolly, E. (2017). What do we know about STEM in out-of-school settings? A National Research Council Report. Gómez, R. L., & Suárez, A. M. (2020). Do inquiry-based teaching and school climate influence science achievement and critical thinking? Evidence from PISA 2015. International Journal of STEM Education, 7(1), 1–11. https://doi.org/10.1186/s40594-­020-­00240-­5 Hall, A., & Miro, D. (2016). A study of student engagement in project-based learning across multiple approaches to STEM education programs. School Science and Mathematics, 116(6), 310–319. https://doi.org/10.1111/ssm.12182 Hassinger-Das, B., Zosh, J.  M., Hansen, N., Talarowski, M., Zmich, K., Golinkoff, R.  M., & Hirsh-Pasek, K. (2020). Play-and-learn spaces: Leveraging library spaces to promote caregiver and child interaction. Library & Information Science Research, 42(1), 101002. https:// doi.org/10.1016/j.lisr.2020.101002 Hayes, K.  N., Lee, C.  S., DiStefano, R., O’Connor, D., & Seitz, J.  C. (2016). Measuring science instructional practice: A survey tool for the age of NGSS. Journal of Science Teacher Education, 27(2), 137–164. https://doi.org/10.1007/s10972-­016-­9448-­5 Hollett, T. (2016). Interests-in-motion in an informal, media-rich learning setting. Digital Culture & Education, 8(1), 1–19. https://www.digitalcultureandeducation.com/volume-­8-­papers/ interests-­in-­motion-­in-­an-­informal-­mediarich-­learning-­setting?rq=Hollett Hollett, T., & Ehret, C. (2017). Civic rhythms in an informal, media-rich learning program. Learning, Media and Technology, 42(4), 483–499. https://doi.org/10.1080/17439884.2016.1182926 Houghton, K., Foth, M., & Miller, E. (2013). The continuing relevance of the library as a third place for users and non-users of IT: The case of Canada Bay. The Australian Library Journal, 62(1), 27–39. https://doi.org/10.1080/00049670.2013.771764 Kaderavek, J.  N., North, T., Rotshtein, R., Dao, H., Liber, N., Milewski, G., Molitor, S.  C., & Czerniak, C. M. (2015). SCIIENCE: The creation and pilot implementation of an NGSS-based instrument to evaluate early childhood science teaching. Studies in Educational Evaluation, 45, 27–36. https://doi.org/10.1016/j.stueduc.2015.03.003


Kim, N. J., Belland, B. R., Lefler, M., Andreasen, L., Walker, A., & Axelrod, D. (2019). Computer-­ based scaffolding targeting individual versus groups in problem-centered instruction for STEM Education: Meta-analysis. Educational Psychology Review, 32, 415–461. https://doi. org/10.1007/s10648-­019-­09502-­3 Lazonder, A., & Harmsen, R. (2016). Meta-analysis of inquiry-based learning: effects of guidance. Review of Educational Research, 87(4), 1–38. https://doi.org/10.3102/0034654315627366 Lewis, E. M., Ross, S. M., & Alberg, M. (1999). School observation measure: Reliability analysis. Memphis, TN: Center for Research in Educational. Lowther, D. L., Ross, S. M., & Morrison, G. M. (2003). When each one has one: The influences on teaching strategies and student achievement of using laptops in the classroom. Educational Technology Research and Development, 51(3), 23–44. https://doi.org/10.1007/BF02504551 Lowther, D. L., Inan, F. A., Strahl, J. D., & Ross, S. M. (2012). Do one-to-one initiatives bridge the way to 21st century knowledge and skills? Journal of Educational Computing Research, 46(1), 1–30. https://doi.org/10.2190/EC.46.1.a National Research Council. (2015). Identifying and supporting productive STEM programs in out-­ of-­school settings. The National Academies Press. Roberson, T.  L. (2015). “STEM”-ulating young minds: Creating science-based programming @ your library. Journal of Library Administration, 55(3), 192–201. https://doi.org/10.108 0/01930826.2015.1034041 Ross, S. M., Smith, L. J., Alberg, M., & Lowther, D. (2004). Using classroom observations as a research and formative evaluation tool in educational reform: The school observation measure. In H. C. Waxman, R. G. Tharp, & R. S. Hilberg (Eds.), Observational research in U.S. classrooms: New approaches for understanding cultural and linguistic diversity (pp.  144–173). Cambridge University Press. https://doi.org/10.1017/CBO9780511616419.007 Shah, A. M., Wylie, C., Gitomer, D., & Noam, G. (2018). Improving STEM program quality in out-of-school-time: Tool development and validation. Science Education, 102(2), 238–259. https://doi.org/10.1002/sce.21327 Shtivelband, A., Riendeau, L., & Jakubowski, R. (2017). Building upon the STEM movement: Programming recommendations for library professionals. Children and Libraries, 15, 23–26. https://doi.org/10.5860/cal.15.4.23 Smith, D. L., & Tyler-Wood, T. L. (2020). STEM academic achievement and perceptions of family support: A gender analysis. Library Hi Tech, 39(1), 205–219. https://doi.org/10.1108/ LHT-­07-­2019-­0147 Stanford University. (2018). SNAP. Stanford NGSS Assessment Project. https://scienceeducation. stanford.edu/assessments Sterbinsky, A., & Ross, S.  M. (2003). School observation measure reliability study. Center for Research in Educational Policy (CREP). Subramaniam, M., Hoffman, K. M., Davis, K., & Pitt, C. (2021). Designing a connected learning toolkit for public library staff serving youth through the design-based implementation research method. Library & Information Science Research, 43(1), 101074. https://doi.org/10.1016/j. lisr.2021.101074 Tawfik, A.  A., Haggerty, K., Vann, S., & Johnson, B.  T. (2021). Rethinking the role of the library in an era of inquiry-based learning: Opportunities for interdisciplinary approaches. In B. Hokanson, M. Exter, A. Grincewicz, M. Schmidt, & A. A. Tawfik (Eds.), Intersections across disciplines: Interdisciplinarity and learning (pp. 1–11). Springer International Publishing. Weintrop, D., Morehouse, S., & Subramaniam, M. (2021). 
Assessing computational thinking in libraries. Computer Science Education, 31(2), 290–311. https://doi.org/10.1080/0899340 8.2021.1874229 Yang, H., Codding, D., Mouza, C., & Pollock, L. (2021). Broadening participation in computing: Promoting affective and cognitive learning in informal spaces. TechTrends, 65(2), 196–212. https://doi.org/10.1007/s11528-­020-­00562-­9

Chapter 15

Multipurpose Practicum: Feeding a Hunger for Justice via a Mainstream Academic Requirement

Amy C. Bradshaw
University of Oklahoma, Norman, OK, USA

Abstract  A summer practicum experience was designed to meet the academic goals and requirements of a learning design and technology (LDT)-focused graduate program, in a way that would simultaneously center a social justice issue in the local community. Three students would work together to learn about food insecurity in their college town and create an online hub to address immediate hunger needs, provide public education on the topic, elicit support among community members not vulnerable to food insecurity, and do it in a non-stigmatizing way.

Keywords  Learning design · Diversity, equity, inclusion, belonging, and justice · Practicum · Internship · Design precedent

Normalizing Productive Struggle Toward Justice

Calls to shift professional practice and academic preparation for work in learning design and technology (LDT) contexts to more overtly address issues of diversity, equity, inclusion, belonging, and justice (DEIBJ) are becoming more common (e.g., Benson, 2018; Benson et al., 2017; Bradshaw, 2017, 2018a; Clark-Stallkamp et al., 2021; Kopcha et al., 2021; Moore, 2021). As a growing population of LDT scholars and practitioners recognize the ethical imperative of addressing DEIBJ concerns in our learning design and technology work, the need for design cases and precedents that may inspire and inform this shift becomes increasingly relevant. Design cases provide transparency and insights regarding how and why particular design decisions were made, and how designers and participants experienced and responded to the challenges they encountered along the way.


Design cases and the precedents they may convey are not formulas or templates to be followed or replicated, but contextually-bound solution examples that may be considered in the decisions and design moves of future designers working in other settings with different contexts and constraints (see Boling, 2010; Howard, 2011). Design cases and precedents that overtly engage DEIBJ-related issues in LDT contexts are particularly compelling because their use can help normalize iterative approaches, perspective taking, willingness to embrace and benefit from notions of productive failure, and openness to grappling with complex socio-cultural issues.

Through my work in LDT for nearly three decades, I have come to perceive academic preparation for LDT careers as a location of profound responsibility. As we enculturate students into the LDT field, we convey assertions about the nature of reality, what and how we can learn, and which values we should prioritize. This is an inherently formative space where we reinforce or disrupt the philosophical norms that fuel and constrain our work in LDT. This preparatory space holds possibilities for creative problem solving and transformation around DEIBJ issues that permeate our work and our lives. If we want our students to meaningfully and responsibly address DEIBJ issues in their post-academic careers, we must support and normalize productive grappling and critical engagement with these issues during their academic preparation. By “productive grappling” or “productive struggle,” I mean: choosing to remain open in the face of uncertainty about how to live up to the ethical imperative we have been slow to recognize, and mostly still don’t know how to go about; resisting seemingly automatic impulses toward efficiency and closure that limit our understandings of how our uninterrogated practices can contribute to harm; and living with the discomfort of struggling and striving to do better – not to simply dismiss DEIBJ concerns and move on, but to be vulnerable and open enough that both we and our students have chances to fail, learn, and grow. Finding ways to invite both faculty and students into productive struggle with DEIBJ issues during academic preparation is crucial if we expect them to be able to design and facilitate equitable and just learning environments in the future.

Possible entry points for reimagining and altering academic preparation in LDT to better support productive grappling with socio-cultural issues are endless. Among my prior efforts to integrate DEIBJ issues with LDT academic preparation, critical design precedents foundational to this chapter’s focus include iterative curriculum and content changes to required courses, such as to syllabi, reading selections, and conceptual foundations (e.g., Bradshaw, 2021a); efforts to make philosophical assumptions, tensions, and limitations more visible for my students and thereby more open to interrogation (e.g., Bradshaw et al., 2012; Bradshaw, 2021b); and modification of particular assignments, such as a team-based video editing project, without changing the overall course structure (e.g., Bradshaw, 2018b, 2022). In whatever ways we engage this challenge, each of us is a critical factor in our own contexts. Despite positive intent, we may both enable and constrain the kinds of transformations we seek.
For this reason, it is important to engage this work with critical humility (see European-American Collaborative Challenging Whiteness, 2005). Critical humility is the paradoxical practice of “remaining open to discovering that our knowledge is partial and evolving while at the same time being committed and confident about our knowledge and action in the world” (p. 121). Practicing critical humility helps us be receptive to feedback when we get it wrong, and remain committed to designing and facilitating equitable and just learning environments despite contextual challenges. Modeling critical humility with our students helps us create the trusting and supportive learning environments that are necessary for collective grappling with challenging socio-cultural issues that are always present in LDT work, but that for too long have been overlooked or ignored.

Mainstream LDT Practicum Purposes/Goals

Practicum and internship requirements in academic programs that prepare students for work in learning design and technology typically provide students with opportunities to apply topics from formal coursework to authentic contexts similar to their target careers (e.g., Johari & Bradshaw, 2008; Johari et al., 2002). The practicum requirement in the master’s program of my context provided students with opportunities to learn and practice mainstream LDT skills in authentic ways and in supported contexts to help prepare them for more formal internships later in the program. Most practicum students are still early in the program but have taken at least one instructional design course. Still, many of the skills required for specific practicum projects may be learned or substantially developed during the practicum experience itself. In completing the one-credit practicum, students are expected to spend roughly 50 hours on the project, although the quality of the practicum experience is emphasized more than the number of hours completed.

Practica are typically arranged on a one-to-one basis, matching the interests and needs of supportive potential supervising faculty with students who want to learn or refine particular skills. Practicum supervisors are typically faculty within the program area, or in the college of education more broadly, who understand the formative priority and purpose of the arrangement. In contrast to internships, which are more formally evaluated by a third party external to the college and which often implicitly serve as try-out periods for potential future employees, practicum projects are lower stakes. They are selected and designed to assist both learners and supervisors without negatively impacting either party. There is a fair amount of flexibility regarding the topics that serve as the focus of the practicum, as long as the projects involve practical and authentic application of processes related to LDT. Faculty work with students to identify projects relevant to the students’ future goals. For example, a student with extensive K-12 experience who wants to prepare for a future career in corporate training and development would be encouraged to engage in a project that helps prepare them for work in that setting, while a student without much classroom experience who wants a future career in a K-12 setting might seek out a project focused on K-12 teaching, topics, and audiences.


A Multipurpose Practicum Design

The remainder of this chapter discusses the context, impetus, design constraints, and planning of an authentic project-based practicum experience designed to center DEIBJ in the academic preparation of students pursuing careers in the learning design and technology field. In addition to serving mainstream LDT practicum goals, the project was designed to facilitate student engagement and intellectual grappling with a social justice issue in the local community. As an intentionally designed solution to a real, multi-layered problem in my local context, this design case may serve as a design precedent for others interested in designing projects through which faculty and students can engage in productive struggle to meaningfully integrate attention to issues of equity, inclusion, and justice with the mainstream demands of LDT.

Impetus and Context

At the end of a spring semester at the research-intensive, predominantly white public institution in the south central United States where I am employed, I was approached, separately, by three students in my learning design and technology focused master's program about whether I would be willing to supervise them in practicum projects during the summer. Doing so would enable two of the three to subsequently complete internships and comprehensive examinations in the fall, and thus to be able to graduate by the end of fall semester, and would help the third student move more efficiently through his own separate program timeline. Their requests were compelling because administrative decisions had previously disrupted course sequencing and regularity, threatening to extend graduation timelines. However, if I agreed to their requests, it would mean relinquishing a large proportion of my summer break, without compensation for the time and effort required to properly facilitate their projects, since strictly enforced minimum credit-hour enrollment requirements would not be met. In considering their requests, factors in my decision process included my awareness of the difficult scheduling situation these students faced through no fault of their own, as well as a desire to mitigate that damage, if possible. For the students as well as myself, a summer practicum had extrinsic value as a way to move them forward toward their graduation goals on a reasonable timeline. Eventually I was also able to frame the request for myself as an opportunity to design a solution that would have intrinsic value to me, if I could find in it an opportunity to address my own need/commitment to developing and exploring ways to center DEIBJ in my academic work. Could I design a group practicum experience for the three students together that would engage them and me in addressing authentic issues of equity and justice in our local community? Could I design a project that would meaningfully facilitate reflective dialogue among us about the ontological, epistemological, and axiological tensions that permeate the problems we seek to address, as well as the potential solutions we propose?

Each of the students had previously taken courses with me, and all knew I had been modifying my courses to integrate issues of diversity, equity, inclusion, belonging, and justice with the mainstream LDT curriculum. I was intrigued by the possibility of engaging with these particular students again on DEIBJ-related issues before they graduated from the program and settled into their professional careers since, even at their most rushed, academic settings can allow for more contemplation, flexibility, exploration, and revision than is generally afforded in most non-academic LDT career environments. The students had different experiences and strengths in terms of web development experience, instructional design confidence, project management skills, and formal training context exposure (Table 15.1). The combination of experiences and perspectives among the three held promise for meaningful collaboration and peer support, if scaffolded appropriately, as well as for supporting individual needs for autonomy and confidence in being able to make meaningful contributions to the team effort.

Topic and Approach

Around the time of the initial request, I had been meeting with a small group of faculty from different colleges in my university to discuss ways to support a newly developed food pantry on campus. We were also interested in other ways to help meet the needs of people experiencing food insecurity – both on campus and in the wider community. After careful consideration of practical and contextual factors, and extended discussions with the students, I agreed to supervise the three students in a single, combined, team-based, eight-week summer practicum, if they were willing to use the practicum as a means of engaging an authentic DEIBJ issue in our local community, specifically, food insecurity. The team-based practicum would allow space for collaboration, problem solving, creative exploration of digital tools, and critical reflection and discussion of the practical and philosophical issues they would encounter. We agreed that I would provide an initial outline of the project, which they would modify and help shape when we started formally meeting in June. I wanted to explore what would theoretically work in this situation, given the stated goals and theoretical touchstones of our program emphasis (Johari & Bradshaw, 2008, p. 330), combined with my personal commitment to centering equity and inclusion in my professional work.

Table 15.1  Students requesting a summer practicum

Student    Age       Relevant prior experience/Stage in master's program
Daniel*    Late 30s  Several years of experience in design and training in a corporate environment; Nearing end of the master's program
Jeannie*   Mid 20s   Prior experience in a training environment, prior work with 508 accessibility compliance; Nearing end of the master's program
Carlo*     Late 20s  Prior experience with web development and coding, no ID experience; Finishing first semester of the master's program

*Pseudonyms

The practicum experience was designed to provide students with an opportunity to grapple with authentic social issues in a supported and relatively unrushed way. They would not only collaboratively grapple with issues and understandings but also would take reflective action to make a meaningful contribution. The project was specifically designed to scaffold students to engage in praxis around an authentic social justice issue in their local community.

Structure and Support

I would meet with the students at least weekly for 8 weeks during the summer. At our first meeting I would help them establish a timeline and management plan. Together, they would negotiate how they could collaborate to accomplish the team's goals. I would connect them with initial resources, including some people and structural supports related to food scarcity and free food sources in the community. Thereafter, I would meet with them weekly, and more frequently as requested, to discuss progress on the project, questions or concerns they may have had, and their experiences and thoughts as they grappled with issues that would arise as the project progressed. This arrangement would provide scaffolding and other guidance, while also allowing the students to exercise autonomy and control over particular aspects of the project, as well as to adapt and modify the general plan as it progressed.

Project Deliverables

The practicum project would be to collaboratively design and develop a food insecurity information hub for the university campus and surrounding community. The online or digital app-based information hub would be accessible and relevant to multiple groups of people, including people experiencing food insecurity, people with resources (food, time, money, ideas, etc.), and people who want or need to learn more about the issue. At the end of the practicum period, students would submit project planning documents including the primary project design document, meeting notes, asynchronous discussion threads, and time and activity logs. They would also submit a fully developed proof of concept of the digital/online information hub, and a collaboratively produced culminating project report. Each student would individually submit a metacognitive essay detailing their experiences working as a team. Reflection prompts for the metacognitive essay would include (a) the issues and dilemmas they engaged, (b) their observations about any changes in their own understandings of the project topic and its related issues, and (c) reflections on their roles as designers and developers within the scope of the practicum project, in the LDT field, and in the larger society.

Design Decisions and Challenges During the Practicum

During the first team meeting, the students agreed to use only real and accurate data in developing the project so their final proof of concept would have potential for adoption and actual use. They would not create any hypothetical or stand-in data or information in order to finish the proof of concept faster, and they would make the information hub as complete, accurate, and accessible as possible in the 8 weeks they were allowed for the practicum. Through discussions involving scaffolded perspective taking, the students determined that the site they would produce would have to include more than just the name and basic contact information of the resource providers. It would also need to specify what kinds of resources were available from each source, what requirements had to be met (e.g., residency requirements, government ID) for someone to receive the resources, and when and how often they could obtain resources from particular providers. Enough information would need to be provided that users without reliable transportation would not waste their time and limited resources going to a location listed only to find it closed with no way to contact providers. We also discussed how they might make the site relevant to people not directly experiencing food insecurity to help inform the broader public about this issue and invite others to help address it, without stigmatizing the need for food. These decisions necessitated additional responsibilities, including

• Devoting an immense amount of time to verifying, correcting, and augmenting the initial list of food insecurity resources I had provided. This task was much more challenging than initially expected. As Daniel expressed in his metacognitive essay, "One of the first things that surprised me about this project was how hard it was to gather actual information from the locations that provided services for the food insecure. While we were able to add many locations to our database, from churches to food pantries, information on them was scarce and phone calls and emails more often than not went unanswered."
• Achieving 508 accessibility compliance. The team worked to ensure the produced information and resource hub would comply with accessibility standards, and function well whether accessed via computers, tablets, or phones. This commitment impacted decisions about development tools that might be used to construct the site.
• Learning how to combine the corrected resource data with Google's maps and directions functions (a brief sketch of this integration appears after the next paragraph).

Students assessed their individual and collective strengths and determined which tasks to divide among themselves and which to tackle together.
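To make the mapping work mentioned in the list above more concrete, the sketch below shows one way a verified provider list could be plotted with the Google Maps JavaScript API and linked to turn-by-turn directions. It is a minimal illustration only: the chapter does not describe the team's actual code, so the data shape, function names, and map options here are assumptions.

```typescript
// Minimal sketch (not the practicum team's actual implementation).
// Assumes the Google Maps JavaScript API script has already been loaded on the page.
declare const google: any;

interface ProviderLocation {
  name: string;
  lat: number;
  lng: number;
  hours: string; // e.g., "Tue/Thu 10:00-14:00"
}

// Plot each verified provider as a marker on a shared map.
function renderProviderMap(containerId: string, providers: ProviderLocation[]): void {
  const map = new google.maps.Map(document.getElementById(containerId), {
    center: { lat: providers[0].lat, lng: providers[0].lng },
    zoom: 12,
  });
  for (const p of providers) {
    new google.maps.Marker({
      position: { lat: p.lat, lng: p.lng },
      map,
      title: `${p.name} (${p.hours})`,
    });
  }
}

// Build a plain directions link (Google Maps URLs), usable on phones without the embedded map.
function directionsUrl(p: ProviderLocation): string {
  return `https://www.google.com/maps/dir/?api=1&destination=${p.lat},${p.lng}`;
}
```

Keeping the directions link as a plain URL, rather than only an embedded map, may also help with the team's stated accessibility goal, since it degrades gracefully on low-end devices.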

Outcomes and Students' Perspectives on the Experience

Culminating Report

The students' culminating report summarized the project purpose and scope, their accomplishments, suggestions for future work related to the project, and information related to the website they produced, including technical requirements, 508 compliance information with recommendations, website design rationale for the proof of concept, educational information related to the topic, and a questionnaire to assist current and future resource providers.

Proof of Concept

Although the team initially intended to build a simple proof of concept that could be handed off to another (unidentified) group for full development, they instead produced a fully functional "Food Insecurity Hub" website accessible and relevant to people with food insecurity, people with resources who want to help those in need, people who want to learn more about food insecurity, and people who want to contribute resources. The primary goal of the hub was to assist people in need of food and other resources. The secondary goals were to educate the community about food insecurity and provide information about how to provide resources and assistance to those in need. The website included accurate resource information, a conditional search feature, and integrated Google map and directions functions. The landing page invited users to select among needed resource categories (Ready-to-eat Meals, Canned/Dry Goods, Toiletries, and Clothing) and to search for organizations that provide these items without cost. Searches returned a Google map displaying the locations of all the organizations in the database that could provide the selected options. That map was immediately followed by a text-based listing of the organizations indicated on the map. For each resource provider in the list of search results, accurate details were provided including address, a link for directions, and all available contact information (e.g., phone, email, website URL, social media links); a list of the requested categories of items available (Ready-to-eat Meals, Canned/Dry Goods, Toiletries, Clothing, Shelters, Spanish speaking, etc.); any special requirements necessary to obtain the goods (e.g., Proof of residency, Government ID); and days/hours of operation. The site also included an education section with information about food insecurity in the United States, food security within our state, and how to donate resources (Fig. 15.1).


Fig. 15.1  “Free food hub” food and resources search feature
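The conditional search described above is, at heart, a filter over structured provider records. The sketch below illustrates one plausible data shape and filter; the field names and categories are inferred from the description in this section, not taken from the team's code.

```typescript
// Illustrative only; the actual hub's data model is not documented in this chapter.
type Category = "Ready-to-eat Meals" | "Canned/Dry Goods" | "Toiletries" | "Clothing";

interface Provider {
  name: string;
  address: string;
  contact: { phone?: string; email?: string; website?: string };
  categories: Category[];  // what the provider offers at no cost
  requirements: string[];  // e.g., ["Proof of residency", "Government ID"]
  hours: string;           // days/hours of operation
}

// Return providers that offer every category the user selected on the landing page.
function searchProviders(all: Provider[], selected: Category[]): Provider[] {
  return all.filter((p) => selected.every((c) => p.categories.includes(c)));
}

// Example: find organizations offering both ready-to-eat meals and toiletries.
// const matches = searchProviders(providers, ["Ready-to-eat Meals", "Toiletries"]);
```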

Critical Considerations in the Design and Support of This Case

Authentic Context and Need

This practicum experience was specifically designed to engage a social justice issue in the local community, while simultaneously supporting the students seeking a practicum project and the faculty member they asked to facilitate it. The project was designed around an authentic DEIBJ issue in the local community with the hope that learners would become invested in the reality of the topic – that it would mean something to them at a deeply human level because they would be engaging real and unexpected issues, along with barriers and surprises that may be overlooked and unconsidered in simulated contexts. Student reflection via the metacognitive essays submitted at the end of the practicum indicates this goal was achieved. Even the most stoic and ID-experienced student, Daniel, indicated he was shocked by the reality of the situation when he shared,

"I am not sure that I learned much related to instructional design process with this project, it was a fairly simple process and well within the scope of my previous experience. I did however learn a great deal about the overall [city] community, and about the topic of the project. While I had some basic awareness of the concept of food insecurity, my knowledge of how deeply it impacted [state] was very low. While I now have a much better understanding of its impact, and of some of the resources available in the area to help, it was shocking to realize how little support there actually is."

This reality of food insecurity was all the more meaningful because they had directly experienced how difficult it was to find and verify information about where to find free food. Jeannie shared,

"If I was in need of food, finding information to know where to get help would be extremely challenging. It was difficult and annoying to me, so I can only imagine how trying it would be for someone who genuinely needed food. This project has brought me the awareness of food insecurity in our state. I did not realize how much of a problem it was in [state]. And not being able to easily access information about where to find food makes the problem even more troublesome for those in need."

Thoughtful and Supported Team Formation

The project was designed around, and specifically for, the three individuals who requested a summer practicum. Supervising Jeannie and Daniel together was more feasible for me while off contract and would provide them both with additional scaffolding and peer support. Including Carlo had potential to make the practicum experience even stronger, even though he was early in the master's program and his need for a practicum was not urgent. Grouping him with two advanced peers could provide an opportunity for him to make valuable contributions through his unique technical skills, while also benefitting from their guidance and support. In considering each of their individual skills, prior experiences, progress through the program, positionalities, and personalities, I saw an opportunity to make the most of a team-based collaboration in which all participants could benefit from a configuration of maximizing individual differences, while minimizing potential harms related to those differences. Each of the three had skills and experience the other two lacked but, together, with support, they could make a strong team.

Relationship, Belonging, Competence, and Trust

This practicum experience reflects a cooperative team development strategy. I started with some general expectations, such as suggesting the topic and the eight-week format. Other decisions were made by the four of us through exploratory conversation, and the team had the freedom to make other decisions without my input. I intentionally supported students' needs for relationship, competence, and autonomy, since a prior study found these were key elements of success in instructional technology internships (Johari & Bradshaw, 2008). I particularly wanted the team to develop a shared sense of mutual support, belonging, and trust, as critical aspects of relationship, and I supported this through regular scaffolded dialogue with them. As indicated in students' metacognitive reflections, these goals were achieved, despite some initial concerns. Carlo stated,

"I was nervous of what the summer practicum would entail at first. Instructional Design is something still very new to me. I had concerns of what I would be able to contribute to the task… The task was to design and model a website that would help food insecure students and [city] residents. I had experience in creating websites from my work as aET Developer. This immediately made me excited because I found out there was a way for me to contribute in something I was positive I could accomplish. The site had to be done in a programming language I have never used before but I was confident that I could learn the language."

Jeannie also reflected, I was a little nervous coming into this practicum, mainly because I did not know what to expect or what we would be doing. I was glad to be working on it with Carlo and Daniel, since I already knew both of them. I work with Carlo and have had a class with Daniel, so it was nice not having to get to know them in addition to working on the project. It was nice and helpful to already know Dr. Bradshaw as well.

My hope was that each could feel valued for what they could contribute, while also seeing how they could benefit from each other's strengths. I was confident that by collaborating on a group practicum, these three individuals working together could accomplish a stronger project than any of them could produce on their own. Therefore, the project also offered an opportunity for participants to develop skills with, and appreciation for, working collaboratively. Excerpts from students' metacognitive reflections indicate this goal was reached, with each of the three students feeling highly valued for what they contributed, while also seeing how they benefited from each other's strengths. Carlo stated,

"I feel like this project fully utilized each of our own sets of skills. Without any one of us, the final product would have been unpresentable… Designing the site and gathering the required information for listing food locations could not have been possible without everyone's involvement."

Daniel noted, We each have a different level of experience in the field of instructional design, from myself with roughly six years to Carlo who is in his first semester in the instructional design program with limited experience in direct instructional design. Despite this, our other skills have allowed all members to be equally involved in the projects development. For instance without Carlo’s programming skill we would have had to stop merely at the design phase rather than having a working proof of concept. I provided a depth of experience that helped us ensure we were asking the right questions and helped keep us on track, while Jeannie’s knowledge of 508 compliance and organizational skills helped make the design process very efficient.

Jeannie stated,

"I am glad Carlo was in the group with his knowledge on web development. If it was just Daniel and myself, or if Carlo did not have the background in web development like he does, I do not think this project would have been as completed as it is now. We would only have a design document or an idea of what we wanted the website to have and look like. With Carlo's skills, we were able to create a fully functioning website. Daniel and I were able to work on the website content and finding out information about the locations. We were all able to use our own skills to help on this project. I feel like we worked well together and we worked as a team, discussing our thoughts and opinions collectively and constructively… Since I already had some knowledge in 508 Compliance Standards, we thought it would be a good idea to implement some of the 508 Standards into the website. I wanted to make sure that the people using the website did not run into any problems using it because of any disability that they could have. It is already hard to find resources, so I did not want to make the website a problem for them to use."

Formative Process Matters

Each of the students in this practicum had taken previous classes with me in which we had grappled with DEIBJ issues, and those prior formative experiences primed us to design and engage this practicum experience. Academic preparation for LDT careers is formative even when we do not specifically intend it to be. As liminal space between academic coursework and professional practice, practicum experiences have particular formative power to help students develop their professional self-identities. Designing practical experiences through which we and our students can engage with critical humility around authentic DEIBJ issues while practicing LDT processes helps normalize the productive grappling necessary to transform our field. If the learning environments we design for LDT students do not facilitate productive grappling with socio-cultural issues, taking risks to stretch our understandings, and embracing opportunities for productive failure, we are miseducating our students about what it means to be responsible LDT professionals.

References

Benson, A. D. (2018). A typology for conducting research in culture, learning, and technology. TechTrends, 62, 329–335.
Benson, A. D., Joseph, R., & Moore, J. L. (Eds.). (2017). Culture, learning, and technology. Routledge.
Boling, E. (2010). The need for design cases: Disseminating design knowledge. International Journal of Designs for Learning, 1(1), 1–8.
Bradshaw, A. C. (2017). Critical pedagogy and education technology. In A. D. Benson, R. Joseph, & J. L. Moore (Eds.), Culture, learning, and technology (pp. 8–27). Routledge.
Bradshaw, A. C. (2018a). Reconsidering the instructional design and technology timeline through a lens of social justice. TechTrends, 62(4), 336–344.
Bradshaw, A. C. (2018b). Minding the stories we tell: Acknowledging implicit narratives in IDT. In B. Hokanson, G. Clinton, & K. Kaminski (Eds.), Educational technology and narrative: Story and instructional design (pp. 231–247). Springer.
Bradshaw, A. C. (2021a). Changing course: Finding a path toward equity and inclusion through an introductory instructional technology course. Presentation at the Annual Conference of the Association for Educational Communications and Technology (AECT), Chicago [Virtual Presentation].
Bradshaw, A. C. (2021b). Reframing interdisciplinarity toward equity and inclusion. In B. Hokanson, M. Exter, A. Grincewicz, M. Schmidt, & A. A. Tawfik (Eds.), Intersections across disciplines: Interdisciplinarity and learning design (pp. 197–208). Springer.
Bradshaw, A. C. (2022). Redesigning a video editing project to facilitate learning about social justice. Presentation at the National Conference on Race and Ethnicity in American Higher Education (NCORE), Portland.

Bradshaw, A. C., Ge, X., & Eseryel, D. (2012). Supporting students' philosophical development as a necessity in IDT. Presentation & Professors Forum at the annual conference of the Association for Educational Communications and Technology, Louisville.
Clark-Stallkamp, R., Herman, K., Marcelle, P., Walters, K., & Yan, L. (2021). The critical theories we need now: A perspective from the CLT Graduate Student Working Group. TechTrends, 65(5), 689–691.
European-American Collaborative Challenging Whiteness. (2005). Critical humility in transformative learning when self-identity is at stake. In D. Vlosak, G. Kielbaso, & J. Radford (Eds.), Appreciating the best of what is: Envisioning what could be. Proceedings of the Sixth International Conference on Transformative Learning (pp. 121–126). Michigan State University.
Howard, C. D. (2011). Writing and rewriting the instructional design case: A view from two sides. International Journal of Designs for Learning, 2(1), 40–55.
Johari, A., & Bradshaw, A. C. (2008). Project-based learning in an internship program: A qualitative study of related roles and their motivational attributes. Educational Technology Research and Development, 56(3), 329–359.
Johari, A., Bradshaw, A. C., & Aguilar, D. (2002). Making the most of instructional technology internships. Performance Improvement, 41(1), 19–25.
Kopcha, T. J., Asino, T., Giacumo, L. A., & Waters, K. (Eds.) (2021). Attending to Issues of Social Justice through Learning Design [Special issue]. The Journal of Applied Instructional Design, 21. Available: https://edtechbooks.org/jaid:10_4/preface_preface_to_t
Moore, S. L. (2021). The design models we have are not the design models we need. Journal of Applied Instructional Design, 10(4). https://edtechbooks.org/jaid:10_4/the_design_models_we

Chapter 16

Practicing 360-Degree Innovation: Experiencing Design Thinking, Exhibiting Growth Mindset, and Engaging Community in a French Business School Graduate-Level Intensive Course

Dennis Cheek

Abstract  A truncated case of a graduate course on practical innovation within a French business school is presented with attention to the role of design thinking, growth mindset, and student-created, embedded practical innovation designs within a local community. The chapter takes the reader inside the "mind of the designer," in terms of sample learning sciences insights that inform course design and implementation. Select student comments highlight design features of the course and student-perceived impacts.

Keywords  Design · Business education · Role playing · Learning sciences · Community-based learning

Introduction

This course taught me to open up to innovative techniques that I did not know. I'm not necessarily comfortable getting out of a formal setting with people I work with, but I've really seen the value here… I had a peak of creativity… I felt stimulated. Maybe due to the context that we were forced to have ideas, my brain generated a lot and ideas came on a flow… The main learning lies in the fact that all our creative ideas developed during this course provide a foundation of postures and practices to develop a creative mindset. – Student X

D. Cheek (*) IÉSEG School of Management, Lille, France e-mail: [email protected] © The Author(s), under exclusive license to Springer Nature Switzerland AG 2023 B. Hokanson et al. (eds.), Formative Design in Learning, Educational Communications and Technology: Issues and Innovations, https://doi.org/10.1007/978-3-031-41950-8_16


I have always associated the concept of innovation with creativity, improving something and thinking outside of the box without questioning this idea I had. This is not a false definition but rather an incomplete one… innovation is above all born from the interaction between constraint and vision. It was like a trigger for me to take this perspective… This course taught me that innovation is not a solitary process. Learning from others and questioning yourself allows you to innovate. During the project we had several debates with my team before arriving at our final idea and then our final presentation which was our optimal result within the given time frame. It is by sharing doubts, our fragments of ideas, our misunderstandings, and by arguing our points of view that we arrived there, while thinking 'inside the box.'… Mistakes are not to be banished or demonized. On the contrary, it is to be exploited and has often led to great discoveries/innovations… so it's best to turn it to one's advantage or simply welcome it. – Student F

These two sample insights and many more emerged from a four-day intensive graduate course, comprising 16 h of instruction, at IÉSEG School of Management (hereafter IÉSEG) in Paris and Lille, France. The course Practicing 360-Degree Innovation was taught in English to two cohorts each year between 2019 and 2022. Students came from 98+ countries. The course gave individuals and teams practical exposure and experience in identifying opportunities for innovation within a local environment and responding effectively to select learning opportunities.

Determining Course Focus and Learning Outcomes

The course was created in response to the department chair's request for a new course that provided practical experiences for graduate students regarding innovation. The course placed innovation within the real-world contexts where it lives – a world of constraints, risks, and challenges – all of which require creative and innovative approaches and the exercise of skills that go well beyond traditional business school education. The course addresses recent goals and directions for student learning at IÉSEG in terms of capabilities every graduate should possess, specifically:

1. Successfully collaborate within an intercultural team
2. Communicate effectively in English
3. Propose creative solutions within an organization
4. Appraise the performance of a team
5. Compose constructive personal feedback and guidance
6. Thoroughly examine a complex business situation and
7. Combine different skills and management disciplines in support of interdisciplinary responsibilities.

The Situational Context of the Course

IÉSEG is one of the leading French business schools with 7000+ students. It attracts over 1000 international students from every inhabited continent who enroll in the 5-year Grande École program. Another 1500 students a year are international exchange students who come for a semester or are double degree or joint degree students. These students are from other business schools around the world. The diverse linguistic profile of the student body, and the fact that English is the global lingua franca for commercial transactions, mean that the course was designed both to demand and to facilitate the students' use of English discourse, writing, and reading. Communicating in any language other than English during learning activities is forbidden, with minor exceptions. Students are oriented on the first day to the fact that they will communicate in international English throughout the course. The instructor is explicit about the implications of such a focus and what it requires of each and every student – from native English speakers to those for whom it is a second, third, or fourth language. An intentional focus throughout the course is on students' communication skills, including attention to body posture, use of the hands, modulation of the voice, eye contact, etc. As Student B noted, "…this course has taught us not only the innovation process but also how to present and communicate our purpose to customers and the company. Even if innovations and techniques are discovered, it is difficult to describe them as promising innovations if they are difficult to explain and do not apply beneficially to our lives. As a result, we learned that the way we speak, our actions, and our procedures can significantly impact our listeners and that they are a yardstick for our outcomes."

The Specific Design of the Course and Learning Sciences Rationales

Reigeluth and An (2021) recently sought to merge the instructional design process with learning theory, or what I will here call learning sciences. Learning sciences draws upon many disciplines and subspecialities including education, psychology, neurosciences, sociology, and anthropology, to derive research-based insights regarding how humans learn across the human lifespan. This paper serves to explain the various sequential "learning moves" within this graduate course and then, using boxed text, takes the reader briefly inside the mind of the course developer/instructor to understand sample research insights consistent with the design of student learning experiences. Space limitations make this exercise illustrative rather than exhaustive; it is also broadly consistent with the intent if not the actual suggestions of Reigeluth and An (2021).

Learning is enhanced when a "relational, situated, and systems approach" is taken as the starting point for the design of a course (Tassone et al., 2022, p. 255ff). Since IÉSEG prepares students for global international business, it is important that students learn to work well in teams. This course deliberately assigns students randomly, a practice simulating the intentional assignments of employees to teams that occur within multinational corporations. This results in a situation where employees often work with peers they have never met before. Students are told that they are mimicking the way multinationals operate as well as deriving value from deliberately working with people they do not already know. It is also pointed out that human beings actually want to help others and they should capitalize on this phenomenon within their team and mentally place themselves in an international business/work setting where the norms of home and country need not hold. Quintana-García et al. (2022) have shown that innovation within companies is enhanced when there is significant gender and ethnic diversity. Sulik et al. (2022) argue that diversity matters for particular kinds of knowledge to be optimally realized. In other situations, it seems to matter less – at least in terms of content-oriented tasks. Diversity in this course is also enhanced by the kaleidoscope of educational and cultural experiences as well as work-related experiences students have undertaken in internships and/or consulting projects. Zhao and Epley (2022) and others have pointed out that humans underestimate both how much others would help them and how positively others feel about doing so, while overestimating how much these "helpers" will feel inconvenienced by it.

Every team figures out during the week how to get the best performance out of each member of their team. Success requires not just knowledge of facts, figures, logic, and clear presentation and expression, but also “soft skills” such as active listening, moderation, charm and grace, negotiation and compromise, agility and adaptation, cross-cultural awareness, effective teamwork, and an entrepreneurial outlook. There is extensive evidence that explicit attention to soft skills within regular classroom settings can increase students’ skills in those areas, especially if opportunities are given to directly reflect and discuss those skills within the context of the subject at hand (e.g., Wójcik-Karpacz et al., 2022). At the same time, thinking about complex ideas and problems within teams for extended or intensive periods enables students to develop their own “cognitive endurance” (cf. Brown et al., 2022). Discussion within teams about how they are approaching tasks and their individual intellectual struggles increases their personal insights into themselves as learners while also building social bonds within the groups (e.g., Elder et al., 2022; Eisenbruch & Krasnow, 2022).

The course is deliberately designed to produce stress (pressure) within students and within teams over the four days. This is because: (1) stress is a normal part of the work environment, (2) stress can increase the seriousness with which work is approached and the general worth of what is created, and (3) successively dealing with stress results in an unexpectedly large amount of quality work. The interaction between random teams and stress was expressed in the following way by Student I: "Working with a random team allowed me to step out of my comfort zone. Indeed, this year I didn't often have this opportunity. The groups were often made by affinity. I was used to working with the same group of friends, and that was a bad habit. In this class, I got to work with people I didn't know or at least had never worked with before. So, I had more pressure because I didn't want to disappoint them. So, I was much more productive than I usually am and I think they were too. We worked hand in hand, and we got a more than satisfactory result from our last presentation."

Psychological research (e.g., Teoh & Hutcherson, 2022; Woolley & Fishbach, 2022) suggests that human beings, when placed within time pressure situations, become much more focused on what context information is of most worth. This initial response appears in this course to trigger much richer student exchanges on the topic at hand. The added pressure of a deadline and the short rapid performance responses required of randomly assigned teams seem to lead to more focused individual motivation, team cohesiveness, and higher quality products than what one might expect for such short durations – a result also reinforced by the various types of diversity in play within and across the teams (Larson et al., 2020; Sulik et al., 2022).

Day One

From the very first day, the exceptional linguistic diversity within each cohort is consciously capitalized on within the course in terms of sharing examples of words from one or more of each student's languages that have come into widespread use outside of their original "linguistic" habitat. These are frequently associated with food, inventions, music, etiquette, and customs that have spread far beyond their points of origin. This drives home at least two points for every student: (1) human language is itself an innovation found everywhere in the world and continuously evolving over time to meet human needs through ingenuity and creativity and, (2) every human culture has contributed in various ways to human innovations, including common words fit for purpose that have spread across many languages. Since innovation has many different definitions, we purposefully spent time on the first day trying to create out of students' own minds, and within their teams, working definitions of innovation. These definitions are shared and defended before the class with the usual result being an enhanced working definition that all students recognize as adequate to proceed, but still insufficient to fully cover every situation where innovation occurs. The Socratic method frequently used in the course helps students realize and articulate that they can create and share knowledge and influence the thinking of others.

Because abstract concepts produce more uncertainty within individuals, they prompt consultation with others to obtain more information and to understand the possible meaning(s) of the concept in question (Borghi, 2022). While much remains to be discovered, it seems reasonable to posit that interpersonal transmission of information and experiences between individual brains ("brain-to-brain coupling") also promotes social learning that goes beyond just content and enhances problem solving (Pan et al., 2022).

We next moved on to the idea that innovation as a concept never lives alone. The professor suggested to the class that there is plenty of needful interaction between the concept of innovation and five other important concepts: creativity, design, imagination, invention, and entrepreneurship. Student teams are requested to do an initial draft diagram with or without words to visualize the possible relationships among these various concepts. They revisit their diagram near the end of the course and produce a final version which they submit as a team. This visualization task helps them increase their understanding of each of these concepts and understand why all of them are important contributors to any significant human innovations. The first day, we started a task that was to be completed by the end of the course and that further broadened their understanding of innovation by way of team-supplied examples that they generated from their existing knowledge and life experiences. The exercises underscored that innovations: (1) can be easily found in every contemporary country of the world, (2) can involve ideas, products, processes, and services, (3) often are incremental in nature rather than disruptive or highly transformative and, (4) can be found across all the main subfields of business including marketing, strategy, finance, operations, human resources, products, and services.

The cumulative impact of the diverse approaches taken to the identification and articulation of innovation examples across domains, geographies, languages, cultures, time, and applications is an example of how building from incidental exposures to previously unrecognized innovations that students already know fosters the category learning of the concept of "innovation" in a way that ends up, in this course, quickly becoming strong and deep in students' minds. This is consistent with psychological work regarding the impact of incidental exposure in category learning (Unger & Sloutsky, 2022).

The fact that innovation is all around us in the wider world is briefly reinforced at the end of class on the first day by a mini-field trip in the local community when the instructor deliberately stops the group when walking. At the stop each team takes in a 360-degree view of the area and the first person in a team identifies an innovation that could be made to something within the field of view. The second person has to extend or alter the original idea in some manner. The third member of the team needs to further modify the original idea, etc., until the original idea has traveled through the entire team (teams usually are between 3 and 5 persons). This cascading technique is used to make students more consciously alert to the possibilities for innovation in the daily "world" they inhabit. One student commented afterwards, "From the beginning of the course, I learned something I really enjoyed. The innovation technique of walking, forcing oneself to stop, look around and search for "clues" is one I find very interesting. I felt like the whole activity made sense, it resonated with what I expected from an innovation 360 course." (Student E) This also reinforces for students that incremental innovation is every bit as valuable to human and societal endeavors as disruptive or large-scale innovations, which students often solely associate with the concept.

O'Brien (2022) documents how humans tend to downplay the importance of the small yet collectively important enhancements to their local environment or a given project or endeavor because they are too stuck on seeking massive change. Yet incremental change is far more pervasive and at least as influential in the world as large-scale, disruptive change of the type explored by the economist Joseph Schumpeter (McGraw, 2007). There is also accumulating evidence (e.g., Selznick et al., 2022) that in-class and out-of-class learning that is iterative is associated with the development of an innovation growth trajectory among university students. Designing more opportunities for students to work outside of the school within their local community, in a manner targeted to achieve school-desired learning outcomes and appropriately constrained to challenge students to derive their own solutions, can significantly enhance students' own self-awareness of both the need for and the realization of innovation growth.

After class formally ends on day one, student teams are requested to go into a predefined, large nearby area of the local community, and identify a business, government agency, nonprofit (NGO), or some other entity that exists for which, for the next 48 h, they will become the “proprietor/leaders” (aka Clients) of that particular entity. They use visual inspection of the entity, and anything publicly available over the internet and social media regarding the entity, to master their roles. They are forbidden to interview anyone at the chosen site, whether employees or visitors. As a team they complete an innovation design template (a brief) for their role as the Client. They identify a needed innovation for this entity that they will request from another team within their Triad (three teams total) who will become, in the role play simulation, their “consulting” team at the end of day two.


Day Two

A visually appealing presentation concerning the history of materials innovations launches the second day. The topic was explicitly chosen since students in business schools often think that innovations lie almost exclusively within the world of business. This purposefully also attacks the widespread myths of modernity that innovations are all about informational and media technologies as well as pointing out that many innovations from the past are ubiquitous in the present to the point where humans don't even "see" them for what they are – amazing innovations that altered the entire human-designed world. The examples include pottery, glass, metals, concrete, microscopy, the periodic table, x-ray diffraction, and transistors. All of these material precursors survive in new forms and with even wider applications in our contemporary world. They are reminders that most of us live in a human-designed world where virtually everything we touch, see, and experience was designed by humans for other humans. This means it can also be continuously redesigned and improved (i.e., further innovated). The interplay between physical objects and the world of ideas (which often also includes transformative innovations of the mind) is also made clear when discussing the role that innovative ideas play in the world. The concepts of tradeoffs, innovations as agents of social change (both positive and negative), risks associated with all innovations, and systems thinking as the matrix for many modern innovations are briefly explored.

The sequential ordering of innovation examples drawn from business, the surrounding community, everyday life, history, and students' own attempts at innovating results in successive relearning regarding the depth, breadth, and potential for innovation within the world at large and is an invaluable way to gain insights that will last long after this course. There is pretty clear evidence that enabling successive relearning of information while shifting contexts and applications for it can greatly enhance not only the acquisition of knowledge, but also its retention and reuse (e.g., Rawson & Dunlosky, 2022). The many mutually reinforcing instances for literally a hundred or more innovations to emerge during this course help students develop a very robust view of the length, breadth, depth, and utility of innovations for human societies that goes well beyond merely business settings. Students become sensitized that they live in a world overwhelmingly comprised of innovations from the past that are now so taken for granted that they are not even recognized by most of us in our day to day lives.

The second day also introduces students to the notion that most innovations take place "within the box" and not through the proverbial "outside of the box" thinking that is widely proclaimed. Constraints are the norm and most innovations occur within the context of constraints – some of them quite severe. The instructor suggests that thinking inside the box stimulates far more human imagination and creativity than "blue sky" thinking. Students are assigned a deliberately vague task with very few constraints. Each team is to design a "learning space" to learn anything of their choosing that is optimized for learning the particular thing that is desired. They must create 2–3 visuals that illustrate their ideas and may use words to clarify why the designed innovation optimizes learning. They struggle with this experience. As Student O explained: "I found the project on the learning environment particularly interesting because we had no specific instructions or framework. It was a difficult exercise because we usually work with a precise framework that allows us to orient our thinking more easily and quickly."

The last task in the class on the second day is for each Client (team) within a Triad to share their innovation brief with a Consulting Group (team) while the third team within the Triad observes the interchange. Due to the Triad arrangement, every team ends up being both a client for one team and a consulting group for another team. No class is held on day three. Instead, each consulting team visits their client's chosen entity and gathers whatever information from the internet and social media is helpful in creating a solution for the client's posed innovation "problem." The consulting team prepares an innovation design brief in its role as the consulting group, as well as an audiovisual presentation and anything else requested by their client (e.g., a written report, comparative table of different approaches, budget). This culminating exercise centers on both the individual learner and team agency and is integral to how ideas "come together" for students regarding innovation by the end of this designed learning experience.

There can be little doubt that developing learner agency in their own learning (e.g., Hase & Blaschke, 2021) is one of the most valuable things that can be actualized by students for their present and future development as thinkers, doers, and innovators. Role playing can be a powerful means by which students can experientially see themselves in new ways, test their abilities in real-world contexts and via complex problem settings, and work together with others to achieve things they initially doubted they were capable of achieving.
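The Triad rotation described above can be stated compactly: with three teams, each team presents a brief to one neighbor and receives a solution from the other, while the remaining team observes. The sketch below is only one possible reading of that structure; the type and function names are illustrative and are not part of the course materials.

```typescript
// Illustrative model of the Triad role rotation (an assumed reading of the arrangement above).
interface Exchange {
  client: string;      // team whose innovation brief is being addressed
  consultant: string;  // team preparing and presenting the solution
  observer: string;    // team watching the interchange
}

// One possible assignment for a triad [A, B, C]: B consults for A, C for B, A for C.
function triadExchanges(teams: [string, string, string]): Exchange[] {
  return teams.map((client, i) => ({
    client,
    consultant: teams[(i + 1) % 3],
    observer: teams[(i + 2) % 3],
  }));
}

// Example: triadExchanges(["Team A", "Team B", "Team C"]) yields three exchanges, so every
// team is a client exactly once, a consultant exactly once, and an observer exactly once.
```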

Day Four

The final day of class begins with each consulting team in a Triad presenting their solution to their respective client with the third team observing the exchange. The process proceeds until every team has provided their innovation to their respective client. The entire class debriefs about the experience in terms of things they observed that went well during the exchanges. Then we move to naming and explaining things that would be improved if they had to do this task again.


We have very good psychological evidence that learning from failure is hard and not automatic. It will only occur if it is intentionally surfaced, explored, understood, and learning from it is expressed in terms of future action and/or understandings (e.g., Eskreis-Winkler & Fishbach, 2022). This is why we debrief often throughout the course at individual, team, and class levels. Another overview of contemporary innovation that is truly transformative and disruptive is presented: the case of the X Prize Foundation and the first X Prize which was for the successful design of a space vehicle and two successful civilian space flights with a payload equivalent to two passengers plus the pilot. The manner in which the competition was created, implemented, judged, and the resulting changes that have happened in civilian aerospace developments are highlighted.

Assessment as Part of Course Design

The course takes a purposeful approach to assessment and evaluation of student learning consistent with a growing movement to involve university students more fully in assessing their own and peers' practices and accomplishments (McArthur, 2021). Eighty-five percent of the course grade is composed of the various tasks student teams complete, along with a final assignment that consists of: (1) assigning grades and providing a 3–4 sentence justification for the grade you give for each member of your own team for their performance throughout the entire course, including all work assignments; (2) providing a single grade and a 3–4 sentence justification for each of the teams in their triad (larger group of three teams) for their combined innovation task, presentation of their work to their respective contractor (company), as well as taking account of the team's role playing as a company (contractor) during the simulation; and (3) writing a single 500-word self-reflective essay on the following topic: "What I learned through this course about: (1) Innovation, (2) myself, and (3) working in a randomly assigned team." The students collectively hold half of the 85% of the course grade for team projects. The instructor holds the other half of the 85% and then adds an additional 15% based on each individual's final submitted assignment. This final assignment was designed to force students to grade human performance – something often required from employees who work in large, multinational companies that use 360-degree employee evaluation schemes. Secondly, the grading exercise is used to increase the seriousness with which students undertake their work in teams as well as individuals. Retaining 50% of the 85% in the hands of the professor allows for adjustments being made due to special circumstances to which other students may not be privy, as well as to make accommodations for factors related to cultural differences, personal observations, and private conversations during the course between the professor and various students.
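Read literally, the split above works out to roughly 42.5% peer-assigned team credit, 42.5% instructor-assigned team credit, and 15% for the individual final assignment. The sketch below simply shows that arithmetic; it is one plausible reading of the scheme, since the chapter does not state the weights this explicitly.

```typescript
// One plausible reading of the grading split described above (weights are inferred, not quoted).
// All inputs are assumed to be on a 0-100 scale.
function finalCourseGrade(
  peerTeamGrade: number,        // students' collective grading of team work (half of the 85%)
  instructorTeamGrade: number,  // instructor's grading of the same team work (other half of the 85%)
  individualAssignment: number  // instructor's grading of the individual final assignment (15%)
): number {
  return 0.425 * peerTeamGrade + 0.425 * instructorTeamGrade + 0.15 * individualAssignment;
}

// Example: finalCourseGrade(80, 90, 95) === 0.425 * 80 + 0.425 * 90 + 0.15 * 95 = 86.5
```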


Conclusion

The experience with this intensive course has demonstrated that students can be placed, with appropriate supports, into highly dynamic, challenging environments that present different innovation challenges and highlight different innovation impacts and issues. Course design is made explicit throughout the class and the entire class follows a written format of this course document so that the design intentions are transparent and linked in their minds. A holistic approach to student development is intentional. It illustrates how business school education can be more intentional and become far more interactive, motivating, and challenging for students than traditional approaches, such as the widely deployed case study method. As Student P observed, "A class about innovation was a new challenge I never faced before. It was not about theory anymore, but about exercising. We needed to 'touch' the subject. The simulation was really interesting and I learned a lot about innovation, myself, and teamwork. Getting out of my comfort zone never felt that good!"

References

Borghi, A. M. (2022). Concepts for which we need others: The case for abstract concepts. Current Directions in Psychological Science, 31(3), 238–246. https://doi.org/10.1177/09637214221079625
Brown, C. L., Kaur, S., Kingdon, G., & Schofield, H. (2022). Cognitive endurance as human capital. NBER Working Paper 20133. National Bureau of Economic Research. https://doi.org/10.3386/w30133
Eisenbruch, A. B., & Krasnow, M. M. (2022). Why warmth matters more than competence: A new evolutionary approach. Perspectives on Psychological Science, 17(6), 1604–1623. https://doi.org/10.1177/17456916211071087
Elder, J., Davis, T., & Hughes, B. L. (2022). Learning about the self: Motives for coherence and positivity constrain learning from self-relevant social feedback. Psychological Science, 33(4), 629–647. https://doi.org/10.1177/09567976211045934
Eskreis-Winkler, L., & Fishbach, A. (2022). You think failure is hard? So is learning from it. Perspectives on Psychological Science, 17(6), 1511–1524. https://doi.org/10.1177/17456916211059817
Hase, S., & Blaschke, L. M. (2021). Unleashing the power of learner agency. EdTech Books. https://edtechbooks.org/up
Larson, N. L., McLarnon, M. J. W., & O'Neill, T. A. (2020). Challenging the 'static' quo: Trajectories of engagement in team processes toward a deadline. Journal of Applied Psychology, 105(10), 1145–1163. https://doi.org/10.1037/apl0000479
McArthur, J. (2021). Rethinking student involvement in assessment. Centre for Global Higher Education Working Paper Series, Paper No. 58, Department of Education, University of Oxford.
McGraw, T. K. (2007). Prophet of innovation: Joseph Schumpeter and creative destruction. Belknap Press of Harvard University Press.
O'Brien, E. (2022). Losing sight of piecemeal progress: People lump and dismiss improvement efforts that fall short of categorical change – despite improving. Psychological Science, 33(8), 1278–1299. https://doi.org/10.1177/09567976221075302


Pan, Y., Novembre, G., & Olsson, A. (2022). The interpersonal neuroscience of social learning. Perspectives on Psychological Science, 17(3), 680–695. https://doi.org/10.1177/17456916211008429
Quintana-García, C., Marchante-Lara, M., & Benavides-Chicón, C. G. (2022). Boosting innovation through gender and ethnic diversity in management teams. Journal of Organizational Change Management, 35(8), 54–67. https://doi.org/10.1108/JOCM-05-2021-0137
Rawson, K. A., & Dunlosky, J. (2022). Successive relearning: An underexplored but potent technique for obtaining and maintaining knowledge. Current Directions in Psychological Science, 31(4), 362–368. https://doi.org/10.1177/09637214221100484
Reigeluth, C. M., & An, Y. (2021). Merging the instructional design process with learner-centered theory: The holistic 4D model. Routledge.
Selznick, B. S., Mayhew, M. J., Winkler, C. E., & McChesney, E. T. (2022). Developing innovators: A longitudinal analysis over four college years. Frontiers in Education, 7, 854436. https://doi.org/10.3389/feduc.2022.854436
Sulik, J., Bahrami, B., & Deroy, O. (2022). The diversity gap: When diversity matters for knowledge. Perspectives on Psychological Science, 17(3), 752–767. https://doi.org/10.1177/17456916211006070
Tassone, V. C., den Brok, P., Tho, C. W. S., & Wels, A. E. J. (2022). Cultivating students' sustainability-oriented learning at the interface of science and society: A configuration of interrelated enablers. International Journal of Sustainability in Higher Education, 23(8), 255–271. https://doi.org/10.1108/IJSHE-01-2022-0014
Teoh, Y. Y., & Hutcherson, C. A. (2022). The games we play: Prosocial choices under time pressure reflect context-sensitive information priorities. Psychological Science, 33(9), 1541–1556. https://doi.org/10.1177/09567976221094782
Unger, L., & Sloutsky, V. M. (2022). Ready to learn: Incidental exposure fosters category learning. Psychological Science, 33(6), 999–1019. https://doi.org/10.1177/09567976211061470
Wójcik-Karpacz, A., Kraus, S., & Karpacz, J. (2022). Examining the relationship between team-level entrepreneurial orientation and team performance. International Journal of Entrepreneurial Behavior & Research, 28(9), 1–30. https://doi.org/10.1108/IJEBR-05-2021-0388
Woolley, K., & Fishbach, A. (2022). Motivating personal growth by seeking discomfort. Psychological Science, 33(4), 510–523. https://doi.org/10.1177/09567976211044685
Zhao, X., & Epley, N. (2022). Surprisingly happy to have helped: Underestimating prosociability creates a misplaced barrier to asking for help. Psychological Science, 33(10), 1708–1731. https://doi.org/10.1177/09567976221097615

Chapter 17

Preparing Elementary Teachers to Design Learning Environments That Foster STEM Sensemaking and Identity

Kim A. Cheek

Abstract  Elementary STEM teachers must be able to formatively design learning to facilitate sensemaking and a STEM identity. This chapter describes the author's formative design process for a professional learning sequence to help elementary teachers improve their sensemaking and STEM identities, while modeling how they can use formative design to help their students develop in those areas. Sensemaking and a STEM identity are defined and their relationship to formative design is discussed. Examples of activities that could be used by classroom teachers and purveyors of professional development to foster sensemaking and a STEM identity are described.

Keywords  Sensemaking · STEM identity · Formative design · Professional learning

Elementary STEM teachers must be able to formatively design learning environments and experiences that enable students to construct explanations about how the natural and human-engineered worlds operate. Teachers need many types of knowledge and skills to do this effectively. One is teachers' own conceptual understanding, or sensemaking, of the topics they teach. A second is their STEM identity, or whether they see themselves as capable doers and teachers of STEM. A third is their ability to engage in iterative formative design. While these three do not represent everything an elementary teacher needs to know to be able to engage in formative design for science and engineering, they are the focus of this chapter. Herein, I briefly explain what sensemaking and a STEM identity are and why they are important in elementary science and engineering education. I then describe my formative design process and implementation of a two-semester long professional learning (PL) sequence for in-service elementary teachers to help them improve their sensemaking, develop their STEM identities, and engage in formative design.

K. A. Cheek (*) University of North Florida, Jacksonville, FL, USA e-mail: [email protected]
© The Author(s), under exclusive license to Springer Nature Switzerland AG 2023 B. Hokanson et al. (eds.), Formative Design in Learning, Educational Communications and Technology: Issues and Innovations, https://doi.org/10.1007/978-3-031-41950-8_17


The learning activities described in this chapter can be used or adapted by both elementary teachers and purveyors of PL for those teachers.

What Is Sensemaking?

Science is more than a body of knowledge. It is also a set of practices and attitudes toward knowledge generation. This means students should be engaged in the types of practices professional scientists and engineers use, albeit at age-appropriate levels. Science education is not fundamentally about memorizing and regurgitating current scientific understanding. Rather, science education is about understanding how the world works, which is akin to what scientists do daily. Engineers develop solutions to human-perceived problems. They cannot develop those solutions without also making sense of the properties of the components of the system they are designing and the interactions among them. Thus, sensemaking is just as critical for engineers as it is for scientists.

A focus on sensemaking, or at least aspects of it, has been a hallmark of science education reform efforts for several decades. It is the linchpin of the Next Generation Science Standards (NGSS, 2013) in the U.S. and was arguably a central feature of earlier reform efforts as well. Many non-STEM disciplines engage in sensemaking, though they do not all use the term or emphasize it in the way STEM educators do. Despite its centrality in science education, one can find multiple definitions in the literature: an approach or stance toward learning, a cognitive process, or a means of discourse (Odden & Russ, 2019). Each definition captures aspects of sensemaking, but none encompasses all its facets. To synthesize these three definitions, Odden and Russ (2019) suggest that sensemaking is best described as a process by which learners attempt to "figure something out" because their current understanding doesn't explain the phenomenon they are trying to understand. Typical science activities and skills such as hypothesizing and experimenting can lead to student sensemaking, but they may not result in meaningful learning if teachers are not purposeful about how students process those activities.

Sensemaking is both dynamic and iterative as learners build and revise their explanations over time, based on formal and informal learning (Odden & Russ, 2019). Sensemaking could be considered an essential component of formative design, since instructional designers also engage in a dynamic, iterative process to figure out how best to design environments and activities to maximize student learning. Sensemaking requires learners to engage in the hard work of realizing they don't understand something, caring enough to be motivated to address the gaps in their understanding, and engaging in the cognitive processes necessary to generate and evaluate possible explanations until they find one that adequately explains the phenomenon. The willingness to do the cognitive work necessary for sensemaking requires persistence because it rarely happens in a single instructional episode. Sensemaking involves causal reasoning, as the goal is not merely to describe what is happening but to ultimately explain how and why a phenomenon occurs.


Processes and phenomena in the natural and human-designed worlds do not occur in isolation. They are parts of systems that function because their components work together in various ways. Part of engaging in sensemaking is exploring cause-and-effect relationships within systems (e.g., how changes in the wolf population in a region impact the local beaver population). Sensemaking is not a linear process. There is no single set of learning tasks that guarantees student sensemaking in STEM, but some practices appear to facilitate it. When learners generate and investigate their own questions, develop and use models to test explanations, and engage in collaborative discourse to build shared explanations, they are more likely to engage in sensemaking. Additionally, those practices help learners view themselves as competent to engage in the type of thinking that will enable them to construct explanations for real-world phenomena. Elementary STEM instruction focused on sensemaking stands in contrast to what occurs in many elementary schools, where science instruction is often confined to reading an informational text during the reading block. When scientific investigations do occur, students are expected to follow a rigid set of steps usually referred to as "the scientific method," which has led to the misconception that scientists always follow the same steps in the same order (Meeteren, 2018). A quick Google search yields multiple examples of similarly prescriptive engineering design processes. If taught as lock-step procedures, they can be just as detrimental to sensemaking as "the scientific method."

Development of a STEM Identify Students’ self-efficacy in STEM and how they perceive others see their efficacy are important components of a STEM identity (Galanti & Holincheck, 2022) and enable them to engage in problem-solving in real-world situations (Costa & Kallick, 2008). Sensemaking is an iterative process, so students’ first attempts at explanations or solutions may prove inadequate. The dispositions students utilize when faced with challenges to construct an explanation or design a solution are known as mindsets. Along with mentoring, students’ mindsets have been found to impact whether students from historically underrepresented groups see themselves as capable of pursuing STEM careers (Kricorian et al., 2020). One of those dispositions or mindsets relates to how a learner perceives failure. Embracing failure and a willingness to engage in productive struggle in the face of challenge are essential in STEM. When students engage in productive struggle they are permitted to wrestle with their ideas, explore multiple solutions, and persist in the face of challenges. For teachers this means allowing students to work out a problem on their own rather than immediately stepping in to assist at the first sign of difficulty to spare students from feelings of frustration. While well-intentioned, stepping in to provide support too quickly can short-circuit student sensemaking and their sense of personal competence.


Elementary teachers can help students develop a science or engineering identity; in other words, help them see themselves as future scientists and/or engineers. Students, especially those from historically underrepresented groups, will not develop a science or engineering identity simply because someone tells them they can be successful. For elementary learners, identity is related to awareness. Students cannot see themselves as potential engineers if they don't know what engineers do (Reisslein et al., 2017). What and how students learn are shaped by the multiple cultural groups in which they participate (National Academies of Sciences, Engineering, and Medicine, 2021). Learners need to develop a sense of belonging as part of a community of learners. This occurs when multiple ways of engaging in sensemaking are valued. Does the elementary science and engineering classroom only value turn taking, asking and answering questions to which the answers are already known, and quiet or expository explanations? If so, students may come to view their discourse patterns and ways of investigating as unacceptable among scientists and engineers.

As noted above, sensemaking can be viewed as a component of formative design. To engage in formative design for effective instruction in science and mathematics, elementary teachers need to be able to engage in sensemaking themselves (related to STEM and formative design) and see themselves and their students as successful doers of science and engineering. They also need to possess a STEM teacher identity, which is shaped by someone's overall teacher identity as well as how they see themselves as a science learner (Galanti & Holincheck, 2022). It is the second element, how teachers see themselves as science learners, that causes the greatest challenge for elementary STEM teachers. When surveyed, elementary teachers recall few examples of authentic science inquiry in their own education that would enable them to engage in sensemaking. They describe themselves as poorly prepared to teach science and even less well-prepared to teach engineering (Banilower et al., 2018). Their own experiences as science learners impact their science teaching effectiveness (Wenner, 2017), both in terms of their ability to facilitate sensemaking and to help students develop a STEM identity as doers of science and engineering.

Teachers' prior experiences and low self-efficacy for learning and teaching science and engineering partially explain why science remains on the back burner in many elementary classrooms despite decades of reform efforts. Teaching engineering is even less of a priority. Both are especially true in schools serving high numbers of underrepresented students (National Academies of Sciences, Engineering, and Medicine, 2021). According to Banilower et al. (2018), only 35% of 4th–6th grade teachers and 17% of K–3rd grade teachers report teaching science all or most days, every week. Science is taught some weeks, but not every week, in 29% of 4th–6th grade classrooms and 43% of K–3rd grade classrooms. Thus, almost half of primary-age children and close to one-third of children in the intermediate grades rarely learn science. At the elementary level, engineering is typically taught during science instructional time, so there is no reason to assume the frequency of engineering instruction is higher than that for science.


Professional Learning (PL) for elementary teachers of science and engineering must address teachers’ sensemaking and STEM identity and provide tools they can use when engaging in their own formative design.

Professional Learning (PL) to Encourage Formative Design Thinking

Several factors impact teachers' use of formative design for instruction, especially for those in schools serving high percentages of historically underrepresented populations. Those schools must often adhere to prescriptive district curriculum pacing guides that limit a teacher's ability to use formative design. The same schools usually have fewer resources to supplement standard curriculum materials. Teachers' own STEM identities as learners and teachers contribute to whether they engage in formative instructional design. Thus, any PL for those teachers must give them the tools to use formative design within their existing parameters. It must also work to address their own identities. With those considerations in mind, I asked myself two questions as I began to design the PL described in this chapter: (1) How do elementary STEM teachers engage in sensemaking and develop their own identities as STEM learners and teachers? (2) What experiences help elementary STEM teachers facilitate their students' sensemaking and the development of their STEM identities? The second question relates more closely to teachers' formative design thinking than the first. Yet, as I have attempted to show, teachers will struggle to effectively teach for sensemaking and a STEM identity if we do not address the first question.

Currently, there is far more literature on PL in science than in engineering education (Loucks-Horsley et al., 2010). Standards for PL in science (Council of Chief State Science Supervisors, 2015) emphasize that teachers need to ask students to explain their thinking, analyze those explanations, and use them to inform their instructional design, emphases that should result in student and teacher sensemaking and more impactful formative design. Teachers in most states are required to produce evidence of continuing education to maintain licensure. Unlike many PL options that last a few days at most, graduate coursework is of sufficient duration to result in changes in teacher practice. In many states graduate coursework can lead to a salary increase while also meeting continuing education requirements, thereby making it an attractive option for teachers. The two-semester science and engineering sequence referenced in this chapter is part of a 15-credit graduate certificate in elementary STEM. One course focuses on elementary science teaching and learning while the other deals with engineering integration in the elementary classroom. Each course is offered once annually. It is not possible in this chapter to discuss all the ways the PL strives to address the two questions posed at the beginning of this section. A few examples illustrate my overall approach.


Project-Based Learning for Sensemaking

As noted above, sensemaking arises from students' desire to "figure something out." Learners are motivated to "figure something out" when what they want to learn appears relevant and is something they care about. Project-based learning (PBL) is an instructional method that capitalizes on students' interests. PBL instructional units are highly collaborative, student-centered, and framed around a question or authentic problem. A PBL unit starts with a driving question from which other questions can be generated and investigated. Driving questions are open-ended and challenging. They provide a rationale for the practical significance of state standards for the discipline. They help foster student curiosity and serve as a springboard for sensemaking in science. PBL is highly relevant to what engineering design looks like in elementary school classrooms because it is linked to an authentic problem, often one that addresses equity issues. Curricula in the local school districts from which teachers in the PL come are not currently framed around PBL, so it is new to many of them. Hence, I decided to model a PBL unit for teachers and require them to modify one of their district science units to give it a PBL focus. One of my goals was to facilitate teacher sensemaking related to selected concepts taught in elementary science. The second was to model how teachers could plan and facilitate PBL in their own classrooms.

Early in the semester, teachers in the science course generate questions they have about a local scientific issue. The issue varies depending upon the teachers in the course. In a recent iteration they generated questions related to how water changes the physical landscape and the ways in which humans impact those processes and are impacted by them. This is a highly relevant topic for teachers and students in the local area. Our city sits just to the west of the Atlantic Ocean and is bisected by a major river that empties into the ocean. Hurricanes regularly impact the local area, and the water and wind from those storms change the landscape. Once teachers have generated individual questions, they agree on a driving question that will frame the activities and investigations needed to answer it. The course has a specific structure, but there is flexibility in the activities and investigations based on the questions teachers generate and their performance on course tasks. In the example above, teachers decided they needed to know more about how water and wind impact weathering, erosion, and deposition. They also wanted to know how the composition of various Earth materials, like rocks, affects those processes. Collaborative discourse is a regular part of PBL, and teachers work together to answer their questions.

I have used formative design to vary my approach to PBL based upon teachers' performance on course tasks. In early iterations, teachers had difficulty thinking from a unit perspective. Instead of writing driving questions that would guide learning over a series of lessons, they wrote questions that could be answered after a single lesson. They also found it challenging to plan a series of related lessons. As a result, I revised course activities to include more practice writing driving questions and modeled the use of a driving question board. The latter was done to introduce teachers to a way to help students see how related lessons could build upon one another to increase sensemaking.


A driving question board serves as a class record of students' sub-questions, the results of the investigations they conduct to answer those questions, and their evolving ideas about how to answer the driving question. Understanding and describing relationships among components of a system has proven to be challenging for teachers, too. It is important, though, because elementary students are expected to grasp basic concepts related to systems and their interactions (NGSS, 2013). I have recently begun introducing connection circles to enable teachers to explore how changing conditions in a system impact other components of the system. A connection circle enables students to identify cause-and-effect relationships among components within a system, such as how the volume of water striking the shore impacts erosion rates (a minimal sketch of one way to record such links appears at the end of this section).

The in-class PBL unit is designed to foster teacher sensemaking and their own STEM identity. We collaboratively work through how to plan a PBL unit, using the driving question teachers generated. To address the second goal related to PBL, teachers then take one of their existing science units and revise it twice, first to give it a PBL focus, and second to integrate engineering design. Teachers are expected to identify how their revised unit design provides equitable experiences for all learners and honors their discourse patterns. They are also required to indicate how the PBL design is informed by students' current conceptions. To learn about the latter, teachers practice using a protocol for examining student work and thinking.
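The connection-circle idea above can be captured in a very small data structure. The sketch below is illustrative only: the components, polarities, and helper function are hypothetical and are not drawn from the course materials.

```python
# Hypothetical sketch of a connection circle recorded as signed
# cause-and-effect links between system components.

connection_circle = [
    # (cause, effect, direction of influence: "+" increases, "-" decreases)
    ("storm frequency", "volume of water striking the shore", "+"),
    ("volume of water striking the shore", "erosion rate", "+"),
    ("erosion rate", "beach width", "-"),
    ("dune vegetation", "erosion rate", "-"),
]

def effects_of(component, circle):
    """Return the components a given component influences, with polarity."""
    return [(effect, sign) for cause, effect, sign in circle if cause == component]

print(effects_of("volume of water striking the shore", connection_circle))
# [('erosion rate', '+')]
```

Tracing chains of links through such a structure mirrors what students do on paper when they follow arrows around the circle.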

Examining Student Work and Thinking

Teachers are regularly asked to use assessment data to formatively design instruction. Typically, the goal is to increase student scores on state assessments. This means that the formative assessments used for analysis are primarily responses on multiple choice tests, which provide minimal insight into student thinking. A more fruitful activity is for teachers to collaboratively use artifacts such as students' written responses or drawings to examine their thinking, a strategy known as examining student work and thinking. The strategy is not new. It is one of the science PL standards identified by the Council of Chief State Science Supervisors (2015). Several states have protocols for the practice, though the state in which the PL took place does not. While teachers could do this individually, engaging in this practice with others helps them develop a collective understanding of what constitutes an exemplary response and how to use student responses to formatively design instruction (Loucks-Horsley et al., 2010).

The strategy can help foster a STEM teacher identity. Teachers not only come to see themselves as capable of delivering impactful STEM lessons; they can also come to view themselves as able to make instructional decisions based upon student responses. Schools that use this strategy often use grade-level or discipline-specific professional learning communities as the place where teachers engage in collaborative examination of student work and thinking. Because it is teacher-directed, the impetus for using the strategy does not have to come from the school or district administration.


Teachers can organize their own group with one or more colleagues. When the group meets, one teacher acts as the facilitator. Another brings several samples of student work on a particular topic. Teachers choose samples from multiple students that capture the range of student responses on the task. The work samples should be ones that demonstrate student thinking, not forced-choice test responses. Group members use a protocol to analyze the work samples, consider what a teacher can learn about student thinking from them, and determine how they could be used to design future instruction (see the sketch below). I provide student work examples and a protocol teachers use to analyze the work samples in small groups. This task addresses both questions I asked myself when planning the PL. The initial step in analyzing the work samples is to determine what characterizes a high-, medium-, or low-quality response. Identifying those criteria requires solid teacher understanding of the content being assessed (evidence of teacher sensemaking of the content). In a recent iteration of the course, teachers subsequently worked in small groups outside of class to examine student work samples from their own classrooms and consider how to formatively design instruction as a result. As I reflect on its use in the PL, I do not have data on how, or if, teachers use this strategy after the course is over. I see this as a weakness that needs to be addressed. In future iterations of the PL, I intend to require teachers to take the practice back to their school and report on the results.
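A group working through the protocol described above produces two kinds of information: how responses distribute across quality bands and which gaps recur. The sketch below shows one hypothetical way to tally that record; the sample data, category labels, and field names are invented for illustration and are not part of the protocol itself.

```python
# Hypothetical tally of a work-sample analysis session: count quality bands
# and surface the most common gap to address in upcoming instruction.
from collections import Counter

work_samples = [
    {"student": "A", "quality": "high",   "gap": None},
    {"student": "B", "quality": "medium", "gap": "describes erosion but not its cause"},
    {"student": "C", "quality": "low",    "gap": "confuses weathering with erosion"},
    {"student": "D", "quality": "low",    "gap": "confuses weathering with erosion"},
]

quality_counts = Counter(sample["quality"] for sample in work_samples)
common_gaps = Counter(sample["gap"] for sample in work_samples if sample["gap"])

print(quality_counts)              # e.g. Counter({'low': 2, 'high': 1, 'medium': 1})
print(common_gaps.most_common(1))  # [('confuses weathering with erosion', 2)]
```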

Teacher-Created Books to Foster an Engineering Identity

My second major goal for the PL was to help teachers develop a STEM identity and learn how to encourage their students' STEM identities. This is a more acute issue in engineering than it is in science. Many elementary students have little knowledge of the types of engineering or what engineers do. The capacity to "see" themselves as someone who could be successful in a particular field affects student motivation to engage in that discipline. Yet, it is difficult for students to imagine themselves in a career if they don't know what people in that field do. It is also difficult to imagine themselves in a field if they lack role models who "look like them." A common approach to foster students' science and engineering identity is to expose them to role models with whom they can identify, especially those from historically underrepresented groups whose contributions may have been ignored or undervalued at the time they were made.

One way teachers can achieve this aim is by writing an e-book biography of an engineer from a historically underrepresented group. Teachers in the PL work with a free, online program called StoryJumper (n.d.) to create age-appropriate biographies for the students in their classrooms. Some teachers choose famous engineers, but a significant number choose engineers within their extended family or social networks. The task has two purposes. The first is to introduce teachers to successful engineers who "look like them" or their students. The second is to enable them to leave the course with a resource they can share with the students they teach.


Because the online program is free, teachers can also have their students create biographies of other engineers or scientists for the classroom library. Teachers are provided with guiding questions when either researching or interviewing the subject of their biography. Some of the questions focus on challenges and failures experienced by the engineer throughout their career and what they learned from those challenges and failures. Reading about those challenges can help normalize productive struggle and failure for students reading the books. Teachers write the text of the biography, which is then formatively assessed by a classmate for required elements, age-appropriateness, flow, and language mechanics. Teachers use the formative feedback they receive to revise the text and add images in StoryJumper to create their picture book. They also describe their plan for using the book with their students. In past years, the plans for use have varied in quality and detail. As I engage in my own formative design process, I see the need to revise the instructions for that portion of the task to require teachers to report back on how they used the book.

Conclusion

Science and engineering PL for elementary teachers that will enable them to design instruction that fosters sensemaking and a science and engineering identity should promote those same qualities in teachers. It should also model the types of activities teachers can use with their own students and should emphasize the importance of designing equitable learning experiences that honor what students bring to the learning process. Those themes should permeate the entire PL experience. The activities described in this chapter and my own formative design provide examples of how that can be accomplished. Teachers reading this chapter can use each of these activities with their students. As they reflect on student learning, they can use formative design to revise those activities to better meet student needs.

References

Banilower, E. R., Smith, P. S., Malzahn, K. A., Plumley, C. L., Gordon, E. M., & Haye, M. L. (2018). Report of the 2018 NSSME+. Horizon Research.
Costa, A. L., & Kallick, B. (2008). Learning and leading with habits of mind. Association for Supervision and Curriculum Development.
Council of Chief State Science Supervisors. (2015). Science professional learning standards. https://cosss.wildapricot.org/Professional-Learning
Galanti, T. M., & Holincheck, N. (2022). Beyond content and curriculum in elementary classrooms: Conceptualizing the cultivation of integrated STEM teacher identity. International Journal of STEM Education, 9(1), 1–10. https://doi.org/10.1186/s40594-022-00358-8


Kricorian, K., Seu, M., Lopez, D., Ureta, E., & Equils, O. (2020). Factors influencing participation of underrepresented students in STEM fields: Matched mentors and mindsets. International Journal of STEM Education, 7(1), 1–9. https://doi.org/10.1186/s40594-020-00219-2
Loucks-Horsley, S., Stiles, K. E., Mundry, S., Love, N., & Hewson, P. W. (2010). Designing professional development for teachers of science and mathematics (3rd ed.). Corwin Press.
Meeteren, B. V. (2018). Elementary engineering: What is the focus? (Guest commentary). Science and Children, 55(7), 6–8.
National Academies of Sciences, Engineering, and Medicine. (2021). Science and engineering in preschool through elementary grades: The brilliance of children and the strengths of educators. The National Academies Press.
NGSS Lead States. (2013). Next generation science standards: For states, by states. The National Academies Press.
Odden, T. O. B., & Russ, R. S. (2019). Defining sensemaking: Bringing clarity to a fragmented theoretical construct. Science Education, 103(1), 187–205. https://doi.org/10.1002/sce.21452
Reisslein, M., Miller, C. F., & Ozogul, G. (2017). Latinx and Caucasian elementary school children's knowledge of and interest in engineering activities. Journal of Pre-College Engineering Education Research (J-PEER), 7(2), 15. https://doi.org/10.7771/2157-9288.1122
StoryJumper. (n.d.). Create a brighter future for your students one storybook at a time. https://www.storyjumper.com/
Wenner, J. A. (2017). Urban elementary science teacher leaders: Responsibilities, supports, and needs. Science Educator, 25(2), 117.

Chapter 18

Profound Learning for Formative Learning Design and Technology

Davin J. Carr-Chellman, Alison A. Carr-Chellman, Carol Rogers-Shaw, Michael Kroth, and Corinne Brion

Abstract  This paper explores the adult learning theory related to Profound Learning (PL). A discussion of the ways in which PL is linked to Formative Design processes and traditional teaching/learning models is presented. PL is defined, including PL practices, components, and an image of the application of PL to Formative Design processes. A linkage between PL, Formative Design, design thinking, and community is briefly made before reviewing research already accomplished.

Keywords  Profound learning · Learning theories · Formative design · Design thinking

The role of learning theory in Learning Design and Technology (LDT) has traditionally been an essential and integral part of research and design practice; it challenges passive acceptance of common assumptions and offers alternative perspectives. From Skinnerian Behaviorism through Cognitivism and Constructivism, the best thinking in the field harnesses learning theories to better inform learning experience designs. Using theory to create stronger learner-centered solutions focuses on the interplay between technology to support those designs and the processes of human learning (Reigeluth & Carr-Chellman, 2009).

Recent iterations of learning theory which inform learning design practices include inquiry-based learning, problem-solving models, diverse learner models, contextualized knowledge development, cognitive load theory, collaborative learning, and pragmatic theory, each highlighted in the recent Educational Technology Research & Development (ETRD) special issue focused on theory (West et al., 2020). Continuing to expand theoretical horizons to inform our design processes and products should be a priority. LDT has repeatedly called for more attention to theory rather than media comparison studies (Amiel & Reeves, 2008; Jameson, 2019; Warr et al., 2020). In this chapter, we explore a theory emerging from adult education, Profound Learning Theory.

D. J. Carr-Chellman (*) · A. A. Carr-Chellman · C. Rogers-Shaw · C. Brion University of Dayton, Dayton, OH, USA e-mail: [email protected]; [email protected]; [email protected]; [email protected] M. Kroth University of Idaho, Moscow, ID, USA e-mail: [email protected] © The Author(s), under exclusive license to Springer Nature Switzerland AG 2023 B. Hokanson et al. (eds.), Formative Design in Learning, Educational Communications and Technology: Issues and Innovations, https://doi.org/10.1007/978-3-031-41950-8_18


As disciplines, adult education and LDT have always been closely adjacent, based on their shared focus on non-traditional learners and instructional development. Non-traditional students, in this case, include learners whose roles and responsibilities outside the learning environment provide additional obstacles as well as opportunities inside the learning environment. The continuum of these students ranges from returning adult students, to traditional college-age students employed full-time and perhaps raising a sibling or child or caring for a parent, to those who do not hold a high school diploma. Non-traditional learners can include those who are in traditional programming as well as learners who are in non-traditional programming. Non-traditional programming includes a range of learning opportunities, such as: experiential learning options that incorporate work placements with classroom learning; seminar, webinar, and travel opportunities for knowledge acquisition; community education programs; digital badge credentialing; self-paced learning modules; and portfolios for life experience. Creating programming for these learners, who may be single parents, veterans, college non-completers, caregivers, or have other roles and responsibilities typical of adult learners, is one of the important overlaps between instructional development and adult learning.

Both LDT and adult education are integrally related to lifelong learning. LDT has spanned PK-12 and continues to broaden its scope toward lifelong learning. Adult education is grounded in adult and lifelong learning theory, and includes corporate training, military instruction, professional development, and continuing education for non-traditional learners. Because of this history, there are many shared initiatives, with LDT specialists serving to assist with adult learning projects, particularly in adult literacy. The LINCS federal initiative (https://lincs.ed.gov/state-resources/federal-initiatives/teal/guide/technology) is a good example of the overlap and shared initiatives between these two fields.

Formative Learning Design as a structure within LDT is a natural connection for Profound Learning (PL). Formative design remains undefined, with few concrete specifications or examples. However, we define formative design as the ongoing accounting for input from the design process to inform emerging design decisions. An example might be the use of formative data in an online class, gathered through weekly surveys, to improve the design of the course (a minimal sketch follows this paragraph). While PL is still in its exploratory stages, the essential nature of PL promises to alter the way we think about learning moments, placing them in the context of a lifelong formation framework. Learner experiences and intentional, continuing practices are then complementary and equal partners with content.
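As a concrete illustration of the weekly-survey example above, here is a minimal sketch assuming hypothetical rating data and an arbitrary cutoff; the element names, scale, and threshold are invented and are not part of the authors' definition of formative design.

```python
# Hypothetical weekly-survey tally: flag course elements whose average rating
# falls below a cutoff so they can be revised before the next week.
from statistics import mean

week_3_ratings = {                     # 1-5 learner ratings collected this week
    "discussion prompts": [4, 5, 3, 4],
    "video lectures":     [2, 3, 2, 3],
    "group project":      [5, 4, 4, 5],
}

REVISION_CUTOFF = 3.5                  # arbitrary threshold for this sketch

flagged = {element: round(mean(ratings), 2)
           for element, ratings in week_3_ratings.items()
           if mean(ratings) < REVISION_CUTOFF}

print(flagged)  # {'video lectures': 2.5} -> candidate for redesign next week
```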
PL is built on the qualities and characteristics of continuous and deliberate formation, which integrates meaningful experiences and continuing practices into an explicitly designed, cultivated, developmental process occurring over time, resulting in continuing, durable, deep learning and development. These particular qualities of PL offer a rich arena for complementary research, theory development, and practice with LDT.

What Is Profound Learning?

Profound Learning (PL) is the process of ever-deepening learning and development through learner practices and attributes whereby insights are gained, substantial knowledge and meaningful learning are accomplished, perspectives evolve, relationships flower, values are questioned, and paradigms are tested (Carr-Chellman & Kroth, 2019; Kroth & Carr-Chellman, 2020a). Key to PL is the development and application of practices, meta-practices, and meta-learning skills that, consistently actualized, frame the learning transaction as formation over a lifetime (Kroth et al., 2022). The focus on formation situates learning as more than a series of events, activities, developmental stages, or types of instruction. Instead, learning is a process of continuous formation, opening the door to diverse, yet complementary, perspectives and theories of learning which, in turn, offer increased opportunities for human flourishing.

Learning Processes of Deformation, Reformation, and Transformation

The sub-structures of continual formation, always operating in the background of learning processes, include deformation, reformation, and transformation. Since continual formation in the direction of profound learning is not linear, these sub-structures are recursive, operating less as discrete categories and more as overlapping and mutually influential patterns. Living out these sub-structures finds the learner's identity constantly developing through the engagement of meta-learning skills such as agency, reflection, and virtue development.

Meta-Learning and Meta-Practices

In turn, meta-learning skills are actualized through meta-practices such as spirituality, cognition, body, and relationship. Kroth et al. (2022) describe meta-practices and meta-skills in this way:

Within our framework, meta-learning skills such as agency, reflection, and virtue development undergird the ability to build the meta-practices of lifelong profound learning and human flourishing. Meta-practices undertaken over time, we suggest, are those disciplines and practices which develop the qualities of profound learning and living. They have no end-point, require regular activity, and may involve a range of exercises to build rich, generative learning. Meta-practices are important in every part of learning and living… [Meta-learning skills, meta-practices, and practices]… cumulatively lead to lifelong profound learning and human flourishing, also known as eudaemonia. (p. 33)

[Fig. 18.1 depicts intentionality (agency, purpose, volition, and virtue development) and meta-learning undergirding the meta-practices of spirituality, cognition, body, and relationships, each enacted through related practices, habits, routines, and disciplines, leading to profound, lifelong learning and human flourishing (eudaemonia).]

Fig. 18.1  Meta-learning leading to human flourishing. (Used with permission figure from Kroth et al., 2022)

Meta-practices are life-long, serve to anchor individual practices, and contribute over time to human flourishing (see Fig. 18.1). Intentionality, reinforced by the meta-learning skills of agency, purpose, volition, and virtue development, provides direction, continuing impetus, and persistence, which impel the continuing enactment of meta-practices, practices, and regular habits, routines, and actions leading to deep, purposeful, fulfilling, lifelong learning and living. The daily work of engaging meta-practices requires the select practices of each individual profound learner and might include good nutrition and exercise, meditation, active listening, and collaborating. The list of practices is extensive and powerful.

These elements of PL provide a foundation for lifelong learning that is holistic, purposive, proactive, synergistic, and generative, rather than disjointed and passive. While it is not a panacea, PL begins to integrate theoretical perspectives. This should compound the depth of learning over time and enable significantly improved learning experiences. The effective application of PL to learning design is intended to correlate with human flourishing. It is important to acknowledge that some of the most significant developments in theoretical frames for LDT have come from similar re-examinations of the fundamentals of learning, such as Flow (Csikszentmihalyi, 1990), Constructivism (Jonassen, 1994), and Systems (Banathy, 1993).

Profound Learning and Human Flourishing

Profound learning serves as the learning process which leads to human flourishing. Human flourishing conceptualizes how we experience happiness, self-actualization, depth of human growth, and fulfillment. Grounded in the ancient work of Aristotle (Ross & Brown, 2009) and Aquinas (1274/1981), and incorporating the contemporary view of human flourishing proposed by the recent work of Nussbaum (2011), human flourishing is not seen in PL as an end-state, but as a socially constituted and intermittent experience that can be maintained, enhanced, and accentuated through practices, meta-practices, and meta-skills. Importantly, PL, as the learning delivery process, goes beyond self-learning and includes "improving" others and the world. By moving away from a singular focus on content and subsequent testing, PL creates a learning delivery system that reduces opportunities for cheating by engaging substantive, experiential, student-centered learning (Carr-Chellman & Carr-Chellman, 2022). Designers can embrace this approach to understanding learning and allow for using media in the service of larger, higher, and longer-term learning goals as well as socially just classrooms.

Developing a Disposition for Depth

The compelling characteristic of PL is the clear, simple recognition that deep learning can become engrained, perhaps even to the point of becoming a disposition (Goleman & Davidson, 2017), that is, a part of who we are and not just what we do. Deep learning can be developed over time; it can be practiced, learned, and taught. PL emphasizes substantive changes occurring incrementally and occasionally through experiences and ongoing explorations that seek insight, depth, and breadth through practices or disciplines. PL recognizes that people can learn to be profound learners. PL connects with the construct of "deepening," which is defined as depth of growth, complex thinking, and multiperspectivity over time. Because the experience of PL can be practiced, learned, and taught, it can be designed. However, there is a danger in this simple statement. We have seen similar hazards in the history of constructivist theory implementation. In that instance, the actual practice of constructivism in classrooms too often translated to completing worksheets and other antithetical design choices. To avoid this, designing for PL is particularly appropriate within formative learning design, in which the difficult process of PL design is continuously approximated rather than accomplished once and for all.


Importantly, designs for PL must be as true to the lofty goals of profundity as possible. To ensure adoption of PL with high fidelity, we must guard against diminishing the complexity, challenges, and intuitive shifts, which often are minimized or undervalued and not interwoven into traditional ID models. For example, PL values, but is less focused on, driven by, or dependent on, accountability measures, ROI metrics, units of instruction, measurable objectives, and the traditional hierarchy of designers, instructors, and learners. Respecting this difference, and discovering processes which honor the need for both proximate results and distal depth creation, will be essential to the future of PL in formative design as applied to LDT.

Profound Learning as a Formative Process

The qualities of profound learning, profound learners, and profound living have been explored, and themes associated with each have moved beyond theoretical ideas toward design blueprints. Profundity is particularly associated with heavy, provocative, and substantive qualities. Profound learners are marked by reflective, consequential, and change-oriented processes, while profound learning is associated with concentrated reflection, meaningful processing, and consequential learning. Profound living is understood as living meaningfully, practicing ongoing reflection, working toward richer understandings, and being intentional, authentic, and integrative (Carr-Chellman & Kroth, 2019; Kroth & Carr-Chellman, 2020b). While LDT professionals might aspire to design for the components of PL in the abstract, the process of traditional ID tends to focus on singular activities, specificity in learning goals, limited content engagement, and a lack of connection with lifelong learning, agency, and human flourishing. As design models incorporate the fullness of the human condition and a focus on human flourishing, PL will become a reality rather than an aspiration.

There are several useful instructional approaches to profound learning that resonate with formative learning design. These approaches are characterized by Kroth et al. (2022) as emerging from formation as the engine of profound learning:

First, formation is a holistic approach, which incorporates the whole person, body, mind, and spirit. Second, …formation is identity development actualized through enacted virtues, implicating our social roles in the life of the community. Third, formation is considered autopoietic and incorporates self-perpetuating, purposeful, volitional processes that are meant to develop the individual's ability to learn-to-learn, self-reflect and correct, with motivation and increasing depth over a lifetime. Finally, educational approaches to formation would deliberately incorporate multiperspectivity practices as part of the learner's essential approach to learning. These approaches continually add depth and breadth to a profound learner's stock of knowledge, ability to look at multiple ideas, come to thoughtful insights, and take provisional positions about them. (pp. 30–31)

To connect these profound learning approaches more directly to instructional design, we might say that lectures and highly structured discussions are less oriented toward profound formation and formative learning design.


By contrast, deep, open, unstructured discussions, alongside discovery-based learning such as experiential or inquiry-based designs, are more closely aligned with both. Problem-based and simulation-based learning likewise are more likely to fall into the category of approaches that align with profound learning and formative design.

What Does LDT Formative Design for PL Look Like?

The instructional approaches highlighted above serve as potential points of departure for a formative learning design process in the direction of profound learning. The essential steps in a basic instructional design model are associated with ADDIE: Analysis, Design, Development, Implementation, and Evaluation. Like fundamental problem-solving models and formative design processes, ADDIE has matured into a model that too often, unfortunately, has narrowed its scope in practice to a point where more expansive goals, like those represented by PL, are ignored. The focus on accountability, ROI, and achievement has subordinated and even supplanted the opportunity and potential for designing for the development of agency, engagement, and human flourishing. The long-term and iterative nature of ADDIE does not preclude, and actually invites, the creation of a process that can guide those who wish to create PL experiences. The creation of profound learning experiences seems an obvious evolutionary development.

Similarly, the use of Universal Design for Learning (UDL) has become a checklist of accommodations, such as multiple options for content access through text, audio, and video, rather than the hoped-for epistemological shift that expands design features to facilitate knowledge acquisition for all learners (Rogers-Shaw et al., 2018). It is possible to apply the concepts and practices of PL to UDL principles in design to support the development of deep, meaningful online learning (Rogers-Shaw et al., 2022). For example, UDL maintains the importance of the learning process and highlights the effort exerted toward improving, rather than merely checking evaluation metrics and spurring competition (CAST, 2018). The notion within PL that there is no single truth, but rather a deepening of knowledge through a continuous journey of discovery that can lead to a growth mindset, is reflective of this UDL position (Rogers-Shaw et al., 2022). While UDL encourages the use of scaffolding to enhance information processing and provides multiple content entry points and alternative pathways (CAST, 2018), PL seeks breadth and depth of complex learning through the practice of varied roles and exploring different perspectives within multiple contexts (Rogers-Shaw et al., 2022). PL's response to UDL's checklist of embedded reflection prompts and modeled think-alouds (CAST, 2018) is the incorporation of meta-cognition activities that test assumptions and promote meaningful and critical reflection (Rogers-Shaw et al., 2022).


When UDL guidelines suggest cooperative learning groups with specific member assignments (CAST, 2018), PL calls for building trust within learning communities through safe discussion environments that lead to deep and meaningful exploration and sharing (Rogers-Shaw et al., 2022). As with ADDIE, UDL invites profound learning experiences, and the conscious application of PL concepts to learning design moves the application of UDL away from the superficial adoption of guidelines toward the development of deeper learning.

The process of creating a profound learning experience is primarily focused on experiencing phases and closer approximations toward agency and flourishing. Creating the basic conditions for more profound learning experiences begins by looking at what already exists in the learning experience, stepping away from it, and figuring out how to rethink the traditional learning goal, objectives, and alignment. Connecting with what makes learning profound for ourselves can lead to imagining how to create the pre-conditions for profound experiences for others. Talking to learners who are similar to the target learners can increase understanding of profound learning experiences and practices. Once we, as designers, understand how to create profound experiences for our learners, we would then work to create closer approximations toward ideal practices of deep learning, reflection, change, processing, meaning making, intentionality, authenticity, and integrated learning across all boundaries, and set out to make an intentional move away from a finite, specific learning goal. This process is marked by both tumult and calm. The experiences of change and depth can cause internal conflict within a learning space where the calm and soothing nature of practices such as reflection, authenticity, and integration are present.

The actual learning experience asks the learners, designers, and leaders to connect over the profound learning experience. Rather than elevating the traditional notion of "instruction" and "instructors," there are those (guides) whose previous work can help, but not direct, the experience. Designers are necessary in the learning experience to effect true ongoing formative design. The separation of the designer from the learning moment is artificial and does not serve the formative nature of design or PL well. The learning experience itself is an opportunity for productivity, renewal, and growth for learners, leaders, designers, and guides.

The experience of profound learning, particularly within the frame of formative design, requires all learners to extend through and beyond the experience. Learners should constantly be moving beyond what is easy, simplistic, obvious, non-reflective, surface, limited, or narrow. This is a challenge for the learners and is ultimately their responsibility. The extension of the profound learning moment is an essential part of the process and considers what is beyond that moment. For example, if, as we begin, we understand our own and our learners' profound past experiences, we must extend beyond those understandings. It is insufficient to simply comprehend those alone and not attempt to go beyond that which is obvious. It is antithetical to the nature of profound learning to gather profound experiences or perspectives and stop. It is in this shared foundation, which impacts both PL and formative design, that designers discover the most powerful tool for creating profound learning experiences.
This extension process asks learners to deepen their insights by extending, reviewing, summarizing, crystallizing, prioritizing, and taking a broader and higher perspective on the learning experience.

Linking Profound Learning to Formative Learning Design, Design Thinking and Community

Similar to constructivism's dramatic change in viewing technology's role in learning by shifting us from a teacher-driven, information-imparting model to a learner-driven, internal creation model with teaching facilitation and scaffolding (Jonassen, 1994; Tobias & Duffy, 2009), PL is a fundamental reorientation in our understanding of the teaching/learning process. Constructivism spawned an expansion of LDT research. We believe PL offers a similar augmentation by integrating depth, learning transfer, intentionality, longer-term practices, and authenticity into learning designs. This repositioning should translate into opportunities for future research that integrate and reinforce formative LDT and which incorporate and support profound learning.

Precisely because PL is focused on the reflective unfolding of learning experiences, the sense that one is never completely done with learning, and that learning is formative by its nature, PL is well-aligned with the formative design process. Formative design, with heavy reliance on iteration, early evaluation, continuous refinement, and testing (Calongne et al., 2019), can be dramatically influenced by PL, which shares a foundational understanding of iterative, reflective, continuously-improving practice. While a formal definition of formative design remains unclear, there is some clarity that profound learning embraces several of the fundamental characteristics of formative design, such as reflection, revision cycles, lifelong learning, ongoing improvement, attention to learning transfer, and recursive cycles of design itself.

As PL research and theory become increasingly connected to human flourishing, they are becoming more grounded in justice-oriented, community-based contexts while building a shared symbolic universe through virtue-oriented growth. In future research and theorizing, PL would be well-informed by design thinking and better understandings of formative design, creating additional depth and value through this two-way communication, conceptualization, and inter-disciplinary research opportunity.

Research Accomplished

Several empirical studies have been conducted and more are in process. The published studies include a Delphi study (Kroth & Carr-Chellman, 2020b) and an investigation into teachers' experiences of PL (Carr-Chellman & Kroth, 2019). Empirical investigations are ongoing in PL and leadership, end-of-life caregivers, and practices of living. Preliminary results from these studies have been presented and published in conference proceedings (Carr-Chellman et al., 2022a, b; Threet et al., 2022). Additional publications include an integrative literature review examining leadership practices grounded in PL (Scott et al., 2020) and three theoretical explorations of PL (Carr-Chellman & Kroth, 2020; Kroth & Carr-Chellman, 2018, 2020a). The development of a "Profoundabilities Starburst Model" of human potentiality is being explored with the intention of shifting the educational lens to learner growth, expanding capabilities, and replacing a deficit model (Rogers-Shaw et al., 2021). Next steps for research include connections with other fields, including design thinking, community building, and organizational capacity building.

Conclusion

Investigation into the potential for PL design processes is just developing as early research findings emerge (Rogers-Shaw et al., 2022). Because the work is, like the learning, never complete, this exploratory account of the design processes associated with PL is necessarily rudimentary, and we will doubtless modify our thinking as we learn more. Nevertheless, the integration of profound learning and instructional design principles and practices cannot occur without the spark of an idea and the continual building of knowledge upon that foundation. To move the ideas and theory forward, they should be tested to see whether they make sense to others who understand PL, formative design, and LDT.

References

Amiel, T., & Reeves, T. C. (2008). Design-based research and educational technology: Rethinking technology and the research agenda. Journal of Educational Technology & Society, 11(4), 29–40. https://www.jstor.org/stable/10.2307/jeductechsoci.11.4.29
Aquinas, T. (1981). Summa theologica (Fathers of the English Dominican Province, Trans.). Christian Classics. (Original work published 1274)
Banathy, B. H. (1993). Comprehensive systems design in education: Designing education around the learning experience level. Educational Technology, 33(1), 33–35. https://www.learntechlib.org/p/170915/
Calongne, C., Stricker, A. G., Truman, B., & Arenas, F. J. (2019). Cognitive apprenticeship for teaching computer science and leadership in virtual worlds. In Recent advances in applying identity and society awareness to virtual learning. https://doi.org/10.4018/978-1-5225-9679-0.ch010
Carr-Chellman, D. J., & Kroth, M. (2019). Public school teachers' experiences of profound learning. Studies in Adult Education and Learning, 25(3), 107–123. https://doi.org/10.4312/as.25.3.107.123
Carr-Chellman, D. J., & Kroth, M. (2020). The spiritual disciplines as practices of transformation. In I. Management Association (Ed.), Religion and theology: Breakthroughs in research and practice (pp. 293–307). IGI Global. https://doi.org/10.4018/978-1-7998-2457-2.ch018
Carr-Chellman, J. F., & Carr-Chellman, A. A. (2022). The learning opportunity of our lifetimes: Here's why we urgently need alternative assessment. Interactions. https://interactions.aect.org/learning-opportunity-of-our-lifetimes/
Carr-Chellman, D. J., Kroth, M., Daniels, D., Brion, C., & Manzanares, L. (2022a). End of life caregivers and profound learning: A grounded theory study. Proceedings of the Adult Education Research Conference. https://newprairiepress.org/aerc/2022/papers/6
Carr-Chellman, D. J., Kroth, M., & Rogers-Shaw, C. (2022b). Human flourishing and adult education. Proceedings of the Adult Education Research Conference. https://newprairiepress.org/aerc/2022/papers/2
CAST. (2018). Universal design for learning guidelines version 2.2. http://udlguidelines.cast.org
Csikszentmihalyi, M. (1990). Flow: The psychology of optimal experience. Harper Perennial.
Goleman, D., & Davidson, R. J. (2017). Altered traits: Science reveals how meditation changes your mind, brain, and body. Avery.
Jameson, J. (2019). Developing critical and theoretical approaches to educational technology research and practice. British Journal of Educational Technology, 50(3), 951–955. https://doi.org/10.1111/bjet.12775
Jonassen, D. H. (1994). Thinking technology: Toward a constructivist design model. Educational Technology, 34(4), 34–37. https://www.learntechlib.org/p/171050/
Kroth, M., & Carr-Chellman, D. J. (2018). Preparing profound learners. New Horizons in Human Resource Development and Adult Education, 30(3), 64–71. https://doi.org/10.1002/nha3.20224
Kroth, M., & Carr-Chellman, D. J. (2020a). Conceptualizing profundity through metaphor. New Horizons in Adult Education and Human Resource Development, 32(1), 5–16. https://doi.org/10.1002/nha3.20269
Kroth, M., & Carr-Chellman, D. J. (2020b). Profound learning: An exploratory Delphi study. International Journal of Adult Education and Technology (IJAET), 11(2), 14–23. https://doi.org/10.4018/IJAET.2020040102
Kroth, M. K., Carr-Chellman, D. J., & Rogers-Shaw, C. (2022). Formation as an organizing framework for the processes of lifelong learning. New Horizons in Adult Education and Human Resource Development, 34(1), 26–36. https://doi.org/10.1002/nha3.20348
Nussbaum, M. C. (2011). Creating capabilities: The human development approach. Harvard University Press.
Reigeluth, C. M., & Carr-Chellman, A. A. (Eds.). (2009). Instructional-design theory, vol. III: Building a common knowledge base. Lawrence Erlbaum Associates.
Rogers-Shaw, C., Carr-Chellman, D. J., & Choi, J. (2018). Universal design for learning: Guidelines for accessible online instruction. Adult Learning, 29(1), 20–31. https://doi.org/10.1177/1045159517735530
Rogers-Shaw, C., Carr-Chellman, D., & Kroth, M. (2021). Discussing profound disability and profoundability [Paper presentation]. American Association of Adult and Continuing Education Conference, Miramar.
Rogers-Shaw, C., Kroth, M., Carr-Chellman, D., & Choi, J. (2022). Profound learning through universal design. eLearn, 2022(11). https://doi.org/10.1145/3576936.3569094
Ross, W. D., & Brown, L. (Eds.). (2009). Oxford world's classics: Aristotle: The Nicomachean ethics (Revised ed.). Oxford University Press. https://doi.org/10.1093/actrade/9780199213610.book.1
Scott, H., Carr-Chellman, D. J., & Hammes, L. (2020). Profound leadership: An integrative literature review. The Journal of Values-Based Leadership, 13(1). https://doi.org/10.22543/0733.131.1293
Threet, A., Kroth, M., & Carr-Chellman, D. J. (2022). Conceptualizing an abundance mentality and its relationship to lifelong learning, human flourishing, and profound learning. Proceedings of the Adult Education Research Conference. https://newprairiepress.org/aerc/2022/papers/7
Tobias, S., & Duffy, T. M. (Eds.). (2009). Constructivist instruction: Success or failure? Routledge/Taylor & Francis Group.
Warr, M., Mishra, P., & Scragg, B. (2020). Designing theory. Educational Technology Research and Development, 68(2), 601–632. https://doi.org/10.1007/s11423-020-09746-9
West, R. E., Ertmer, P., & McKenney, S. (2020). Special issue: The role of theory in learning design and technology research and practice. Educational Technology Research and Development, 68(2).

Chapter 19

Tapping Into How We Teach What We Teach: A Journey in Explicit and Implicit Reflection
Monica W. Tracey and John Baaki

Abstract  Designing iteratively, evaluating early, and continuing to refine a design are solid tenets of formative design. As designers and design researchers, we continuously apply an ongoing formative lens to our design process and the design courses we teach. In this chapter, we take you on our formative design journey where we engaged in implicit and explicit reflection to iteratively design, evaluate, and refine our courses in an effort to improve our instruction while cultivating our designer professional identity. As a result of our formative journey, three immediate design improvements are described, as well as additional improvements to be implemented in future courses.

Keywords: Formative design · Designer identity · Reflection

M. W. Tracey (*), Wayne State University, Detroit, MI, USA, e-mail: [email protected]
J. Baaki, Old Dominion University, Norfolk, VA, USA, e-mail: [email protected]
© The Author(s), under exclusive license to Springer Nature Switzerland AG 2023
B. Hokanson et al. (eds.), Formative Design in Learning, Educational Communications and Technology: Issues and Innovations, https://doi.org/10.1007/978-3-031-41950-8_19

Designing iteratively, evaluating early, and continuing to refine a design are solid tenets of formative design. Kenny (2017) notes that formative assessment occurs when designers gather data to adjust research activities while the activities are in process. When designers take a formative lens to a learning process, they may find themselves observing, questioning, journaling, and reflecting (Kenny, 2017). Connecting formative assessment and formative design means designers gain feedback that both instructors and students use to drive improvements in the ongoing teaching and learning context. As designers and design researchers, we continuously apply an ongoing formative lens to our design process and the design courses we teach. As such, in this chapter, we take you on our formative design journey where we engaged in implicit and explicit reflection to iteratively design, evaluate, and refine our courses in an effort to improve our instruction while cultivating our designer professional identity.

Reflection and Designer Professional Identity

Reflection becomes an important tool for the formation of our designer professional identity when we expand reflection to encompass an effort to define and redefine our beliefs, values, and perspectives (Tracey et al., 2014). Simply put, to continuously develop designer professional identity, we need to reflect. Regardless of what stage a designer is at in their career, there are many reflection techniques and methods to support continuous designer identity development. For example, designers can use a reflective diary, portfolio, or reflective writing if reflecting alone, and design story sharing, peer coaching, and mentoring if working with other designers. These reflective activities support a designer's personal and internal knowledge construction through recursive considerations and interpretations of their experiences and beliefs. Knowledge construction occurs through the practice of explicit and implicit reflection (Tracey & Baaki, 2022a).

Explicit reflection  Explicit reflection occurs when we have reflective conversations about the design problem and partial solutions to better grasp the relationship between the two (Tracey & Hutchinson, 2018). Designer professional identity includes an understanding of yourself in a design situation and what is expected of you in the design space (Hutchinson & Tracey, 2015). Who you are as a designer is influenced by your personal traits, habits, talents, and limitations. Studying reflection done by language teachers, Gerlach (2021) concludes that teachers first need a philosophy of teaching that is shaped by who the teacher is. Also necessary are teaching principles that guide a teacher through the language teaching process. When a teacher acts, justified by teaching principles, they reflect on their practice while they engage in teaching. In this way, reflection becomes recursive. Tracey and Hutchinson (2018) investigated the use of reflective writing in an introductory design course to help students better understand their design beliefs, experiences, and perceptions of self-awareness. An implication of their study was that the complex problems designers face, and their evolving designer professional identity, are embedded in the design students' design thinking, design decisions, and design outcomes. In other words, there is a close connection between reflecting during design (explicit reflection) and reflection on designer identity (implicit reflection).

Implicit reflection  When we reflect implicitly, we gain knowledge of and become familiar with concepts, ideas, and theories outside of our design context (Wackerhausen, 2009). Our personal traits, habits, talents, and limitations impact who we are as designers (Tracey & Baaki, 2022a). Gerlach (2021) proposes that teachers may reach a level of implicit reflection through narrative prompts and suggests that, in reflective narratives, what is most important is how teachers write about their practice and actions. The challenge for the teaching profession is to understand what is happening through a dialogic process. Gerlach (2021) concludes that implicit reflection is collaborative, involving discussion and dialogue with a colleague or critical professional friend. Focusing on instructional designers, Tracey et al. (2014) provide support to Gerlach, noting that reflection is a crucial tool for cultivating professional identity. One mechanism to support implicit reflection is the sharing of design stories. It is in the listening and telling of helpful design stories that designers continue to learn and cultivate their own designer professional identity. Engaging in explicit and implicit reflection helps us as instructors improve ongoing teaching and learning environments.

Our Design Story: What We Teach

Moment of Use  A design team was tasked with creating a course to teach math ratio concepts to adult audiences preparing to take their high school equivalency exam (Baaki & Tracey, 2022). The team began designing the Understanding Ratio course starting with course goals, outcomes, and assessments. Identifying 'must have' course content, reflecting on their audience's challenges working with ratios, and wondering when the audience would actually need to use ratios, the team determined that a ratio moment of use arises when the audience is employed in certain jobs. One possible employer of their audience might be a coffee shop, so the team designed a ratio activity using illustrations of a cup of coffee: black, with milk, and with milk and sugar. The activity was designed for the moment the audience might need to use ratios: 1/3, 2/3, and whole (a worked sketch of these fractions appears at the end of this section). This design example illustrates how the team designed for the moment their audience would actually use what they learned in the instruction.

As design researchers and practitioners, we recognize that context is integral in every design situation. In an effort to better understand context, for ourselves and our teaching, we recently researched design context in an attempt to define it (Baaki & Tracey, 2019). We failed to achieve our goal of providing a clear and concise definition of context for ID. We learned that the word "context" appears to be defined by the use of context (Duranti & Goodwin, 1992; Melonçon, 2017). As often happens in research, our attempt to define context actually resulted in our discovery of the moment of use. A moment of use approach emphasizes specific moments where context is scaled back to what is needed to succeed in a situation or moment (Baaki & Tracey, 2019; Herman et al., 2023; Tracey & Baaki, 2022a). The context, therefore, is actually focused on the learner: the moment when the learner will need to use what they have learned in the designed intervention in order to perform it. It is the learner's moment of use that defines the context of the design. But how do designers design these moments? Our research in empathic design supported this effort, but we needed to go deeper. We discovered another missing piece during a research study in 2021: the 3I's (Herman et al., 2023).
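As a concrete illustration of the fractions the coffee activity targets (a sketch only; the chapter does not specify the mixing proportions shown in the illustrations, so the 2:1 mix below is an assumption):

\[
\text{coffee} : \text{milk} = 2 : 1 \quad\Rightarrow\quad \text{coffee} = \tfrac{2}{3}\ \text{of the cup}, \qquad \text{milk} = \tfrac{1}{3}\ \text{of the cup},
\]

while a black coffee is the whole cup, \( \tfrac{3}{3} = 1 \).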


The 3I's: Introspection, Interaction, Intention

In 2021, in a course on non-instructional interventions, John, the second author, charged his students to address an issue of inequity in their organizations or communities via a moment of use approach. This approach allowed students to design for the large, often systemic, issues surrounding diversity, equity, and inclusion (DEI) while acknowledging the bias they bring to the design (Herman et al., 2023). John employed the words of Wilkerson (2020), who suggested a guiding question for our continued moment of use approach to DEI interventions: "The central question about human behavior is not why do those people do this or act in that way, now or in ages past, but what is it that human beings do when faced with a given circumstance?" (p. 387). How should designers approach a moment of use context? Wilkerson provided the answer: through "radical empathy" (p. 386). Radical empathy begins with reflection on our individual perspectives as part of the empathic design process, recognizing and attempting to set aside the privilege and bias we may subconsciously carry (Wilkerson, 2020). This led us to uncover research on cultural design through designer introspection, interaction, and intention, the 3I's (Thomas et al., 2002).

Introspection  Introspection begins when designers engage in critical reflection with themselves and their design team, examining their inherent biases and how those biases may impact the design decisions they make. Designers also reflect on the moment(s) the audience will be using what is learned in the intervention. Focusing the design on the learners' specific moment of use allows designers to identify the critical forces that can affect the understanding and use of the instruction (Melonçon, 2017).

Interaction  Beyond collaborative introspection with self and peers, interacting with the audience of use includes a co-determination of specific use cases. When designers interact with the audience, they gain an understanding of how the audience will use their design and the numerous ways the audience may act as a result of the design.

Intention  Designing with intention equals action, meaning the intention of the design must be to act. Acting means designers determine the specific moment(s) of use for the design and design for that moment. Intention makes the design real and usable.

Our Journey: Formative Design in Our Moment of Use (How We Teach)

Not only do we teach the moment of use approach to design, we design our course activities for our students to learn through a moment of use approach. As such, in the fall 2021 and spring 2022 semesters, at two different universities, we taught introductory and advanced ID graduate courses focused on taking a moment of use approach to design. Students worked in teams to design meaningful interventions that would be implemented by their client partners. Teaching the concept of a moment of use approach using the 3I's (what we teach), we then enacted the moment of use approach (how we teach) by setting up an environment for us to engage in explicit and implicit reflection. In other words, we wanted to look at how we were designing for our students' moment(s) of use: when they would actually use the design principles we were teaching. We believe that designer professional identity development is a continuous process. Although we have years of experience as designers and instructors, we recognize that a key way to continuously cultivate our designer professional identity and improve how we teach is through ongoing examinations of our beliefs, values, and perspectives. By engaging in explicit and implicit reflections on our current courses for review and feedback, we became the research participants, challenged to improve our courses and further cultivate our professional identity.

Journey Participants

Our designer professional identity journeys began as practitioners: each of us worked in the "business" of ID for over 15 years before completing our doctoral degrees and entering academe. As professors of ID, we consciously bring a practitioner approach to our ID courses by requiring our students to work with authentic clients in design teams to produce deliverables that will be used in their clients' organizations. We incorporate the process described in Real-World Instructional Design (Cennamo & Kalk, 2018), layered with empathic design, moment of use, and the 3I's. By engaging in formative design of our courses, we chose to be participants on a journey of explicit and implicit reflection. We shared our design stories and how we see ourselves as instructors and reflective practitioners. Acknowledging the extensive research on novice designer practice and the limited research on expert designer practice, we hoped documenting our journey and experiences could add to expert designer research.

Documenting Our Journey

Using current courses we were teaching, we decided to share our experiences in real time, reflect on how we were teaching and what we were teaching, and consider how we could improve or adjust our approaches. We agreed to communicate on a weekly basis during the 15-week semester. We documented our reflections after each class: what worked, what did not, and how our students were learning or not learning the design principles we were attempting to impart. We documented our reflections in emails and text messages, a shared Google document, and recordings of monthly meetings.


Emails and text messages  In-the-moment text messages and emails were raw, transparent and honest. Text messages illustrated how we were cultivating our designer professional identity through what we were doing in our courses. We were in the thick of implicit reflection where we could not get enough of listening to each other’s design stories. These stories sparked additional ideas, thoughts and reflections on each other’s design ideas. When a thought or idea surfaced, we felt compelled to share it with each other in the moment, and a quick text message or email documented our continual reflections. For example, in week 12 of the fall semester, Monica, the first author texted: Maybe we need to find out the design process that works best with the 3I’s, they can’t seem to go there without some type of foundational map. Is it as easy as the design process, empathize, define, ideate, prototype, test? I hate the thought of a model, but maybe we need to give them something to hang on to as they delve in for the first or second time. I think that we have them in the deep end which is OK but maybe we have to give them a life preserver.

In the moment, John immediately responded: Our students are not a blank slate. They come to our programs with design precedents, often bad ones. We need to tap into what they are doing now, see how moment of use fits. We had the discussion around helping them take one less step and opening new doors to show them a design space. Let’s have them first open a door to show us their space. We are both talented to see how moment of use fits that space.

Google document  For our explicit reflection, we created a shared Google Doc wherein on the day after our scheduled class meetings each week we individually reflected on how the student designers were moving forward with applying the moment of use approach. Texts and emails were where we documented the design stories as they happened, while our Google Doc provided a space in which we sat back, took stock of what was happening, and journaled our thoughts and reflections for that week. Some weeks required pages of reflection; in other weeks, a few sentences sufficed. We then read each other’s story, discovering that we immediately wanted to comment or expand on initial reflections. Two weeks after the above text exchange, John wrote: As the semester ends, I think that we have an idea of where we need to go next. The idea of having students reflect on their actual design process… Not what they thought it should be, not what they would like it to be, not what they learned, but what it is like in the swamp! It would seem to me that there is a lot of good happening with their process. So, how does a moment of use lens approach fit in? For Spring 2022, before the semester starts, before we meet in Week 1, they will create a Google Doc and invite me in. I will ask them to “Share your actual design process. Be honest. Don’t share what you have learned. How do you design? Choose how to share it. Choose 1: a 250-word reflection (250 words exactly), a 90-second (exactly) video or draw it.”

Monthly meetings  While telling our stories to each other and reflecting on and documenting our ideas, we also met once a month to discuss what was happening during our classes, what we intended to change and improve immediately in the semester, and which change ideas would be better suited for the next semester. In a phone call between the above Week 12 texts and John's Week 14 reflection, Monica described a seminar she had just conducted with instructional designers from a car company. The purpose of the seminar was to help the designers find different approaches to instructional design problems and opportunities. Monica began the seminar by asking the designers, "What is your actual, not ideal, design approach?" This question sparked John's idea to have students share their design process.

Resulting Improvements

Schön (1983) introduced reflection-in-action, in which a designer has an ongoing internal dialogue to make design decisions. Reflection-for-action occurs when designers draw on their design stories and others' design stories and design for what could happen (McAlpine & Weston, 2000; Tracey et al., 2014). Through our explicit and implicit reflections, we experienced both reflection-in-action and reflection-for-action as we made improvements during the fall and spring semesters. Through our reflective journey, we identified three formative design outcomes to implement immediately.

Combining the 3I's  We began the fall semester teaching the 3I's, one each week, having students reflect on each individually in a writing assignment, for a total of 3 weeks. During this time, we were also analyzing previous design student data for another research project. While looking at those design team meeting transcripts, we realized that in practice, the 3I's do not work linearly (Baaki et al., 2023). We discovered that the 3I's have constant interactions with one another. Design students generally engage in all three, even within a single sentence during a design meeting. We realized that we were unknowingly teaching students to view the 3I's separately, as individual steps in a three-step process; in reality, the 3I's embody an intersecting design process. We observed that teaching the 3I's as separate steps caused confusion and made the 3I's more difficult for students to understand and implement. Engaging in formative design, we changed class activities to more accurately reflect how the 3I's actually work during design. By making this change, our reflective journey was informing our practice in real time. For one of John's students, the coming together of the 3I's showed through in a project she was working on at her community college to onboard new adjunct faculty. The student shared with John in an email:

I was looking at my outcome feedback, and I think I finally may have moment of use, but I wanted to run it past you. So, when we identified that they need to know the college's mission, we did so from a this-is-training-you-should-know viewpoint. But in reality, as I dig down, they don't need to know the mission or even how students onboard into the college because their moment of use with students starts in the classroom. It doesn't matter if the student came in through the adult ed. or the dual enrollment door. They are their students. So, my module needs to shift to be focused on their moment of use, which means I need them to know how to access the wrap-around services to assist struggling students in their courses by referring them to the service.


Meeting designers where they are  Designing for the moment of use seemed to us to be intuitive, something graduate design students should readily comprehend. This, however, was not the case. We realized that our students do not come to our classes as blank slates. They are instructional designers who have a way of designing, some with processes learned through education and some from actual practice. As shared in our text message, Google Doc reflection, and monthly meeting examples, for the spring semester we met students where they are as designers by understanding how they currently design. Now, our first activity is to have students describe how they design, in real, understandable terms, through either written or graphic form. We begin our instruction with where they are presently as designers, then illustrate how and where the moment of use and the 3I's align with, or need to be added into, their current design processes.

Measuring design deliverables  Finally, we are intrigued by how to measure whether a design deliverable created using a moment of use approach is meaningful. We had just completed a study which found that design teams' use of empathic design did not necessarily result in the completion of a meaningful deliverable (Tracey & Baaki, 2022b). The results of this research led us to seek out a rubric to measure creativity (Henriksen et al., 2015). Utilizing the rubric has not only been a game changer for our courses, but has also changed how we have moved forward with our research on studying the application of a moment of use in design. Thus, formative design has put us in a position to continue to improve and evolve our classrooms and our research.

Additional Results from Implicit and Explicit Reflection

In addition to the three formative design outcomes described above, we recognized and discussed that, even though we are teaching similar courses and a moment of use approach to design, our learning contexts and our students have distinct values, beliefs, and experiences. We view this as a strength. As a result, our explicit and implicit reflections led both of us to make improvements to our own specific courses. For example, John realized that if he wanted to have his students think differently about the 3I's, he needed to innovate his graduate course format. The course has a "Late Show" television show theme that encourages students to think beyond being just an instructional designer. The Late Show began as a playful way to share that the course began at 7:10 P.M. and lasted until 9:50 P.M. After a few weeks, students joined in the play and suggested that the Late Show needed a house band. John began a weekly musical theme where a song was played to open the class, at break, and then to end the class. The Late Show environment helped to turn a late-evening graduate classroom into an organic design studio where students embraced a design vibe.

Reflecting on her students' need for "design war stories," Monica created a "Water Cooler Forum" in Canvas where she regularly shares audio clips of design stories that illustrate applications of the assigned readings for a given week. These real-life design stories provide the moment of use context for what students are learning. For example, when students are learning the principles of behaviorism in design, the design story is of a client who required the design of a step-by-step process for shaping employee behavior in order to improve job performance. As part of the forum discussion, students comment, ask questions, and identify instances of the 3I's.

The Journey Continues

The purpose of this design journey was to engage in reflection in order to iteratively design, evaluate, and refine our course designs, in an effort to improve our instruction and cultivate our designer professional identity. While adjusting our courses in real time, we were also cultivating our professional identity.

Monica  As a designer for over 35 years, design is a visceral part of who I am. When learning something new, through research and practice, I alter my design schema and immediately connect it to my design approach. Working with my students, I find that at times, in my effort to teach everything I want to teach, wanting them to cultivate their designer identity, I may forget where they currently are. They are making design mistakes, and learning from those mistakes cultivates their designer identity. My students cultivate their designer identity working with me, and I cultivate my designer identity working with them.

John  I have always viewed myself as a designer first. Just like an architect is a designer who designs buildings, and an engineer is a designer who designs automobiles, I am a designer who designs instruction. I am fascinated with how chefs, musicians, watchmakers, bladesmiths, and others design what they design. What can I learn from the wonderful designers around me? How can I become a better designer of instruction? Working with students helps me cultivate my designer professional identity. When I am always designing, I am designing better.

As designers and design researchers, we continuously apply an ongoing formative lens to our design process, the design courses we teach, and the research we conduct. We have illustrated our designer journey in this chapter because journeys like this cultivate our designer identity and improve our design practice. Likewise, our research is constantly informing our practice. On this journey, we referenced four additional studies that have altered how we study design as well as how we teach design (Baaki & Tracey, 2019; Baaki et al., 2023; Herman et al., 2023; Tracey & Baaki, 2022b). We choose not to lean solely on traditional design research and practice, but to continuously challenge current research and practice in an attempt to move both forward. As a result, our research informs our practice and cultivates our professional identity.


References

Baaki, J., & Tracey, M. W. (2019). Weaving a localized context of use: What it means for instructional design. Journal of Applied Instructional Design, 8(1), 1–13. https://www.jaid.pub/_files/ugd/c9b0ce_d055972f6fb942098860aac6071cc11e.pdf
Baaki, J., & Tracey, M. W. (2022). Empathy for action in instructional design. In J. E. Stefaniak & R. M. Reese (Eds.), Instructional practices and considerations for training educational technology and instructional design professionals (pp. 58–66). Routledge.
Baaki, J., Tracey, M. W., & Bailey, E. (2023). Exploring the two sides of a moment of use approach to design. TechTrends. https://doi.org/10.1007/s11528-022-00828-4
Cennamo, K., & Kalk, D. (2018). Real world instructional design: An iterative approach to designing learning experiences (2nd ed.). Routledge.
Duranti, A., & Goodwin, C. (Eds.). (1992). Rethinking context: Language as an interactive phenomenon. Cambridge University Press.
Gerlach, D. (2021). Making knowledge work: Fostering implicit reflection in a digital era of language teacher education. Language Education and Multilingualism - The Langscape Journal, 3, 29–51. https://doi.org/10.18452/22340
Henriksen, D., Mishra, P., & Mehta, R. (2015). Novel, effective, whole: Toward a NEW framework for evaluations of creative products. Journal of Technology and Teacher Education, 23(3), 455–478. https://www.learntechlib.org/p/151574/
Herman, K., Baaki, J., & Tracey, M. W. (2023). "Faced with given circumstances": A localized context of use approach. In B. Hokanson, M. Exter, M. Schmidt, & A. Tawfik (Eds.), Toward inclusive learning design: Social justice, equity, and community. Springer.
Hutchinson, A., & Tracey, M. W. (2015). Design ideas, reflection, and professional identity: How graduate students explore the idea generation process. Instructional Science, 43(5), 527–544. https://doi.org/10.1007/s11251-015-9354-9
Kenny, R. (2017). Introducing journal of formative design in learning. Journal of Formative Design in Learning, 1(1), 1–2. https://doi.org/10.1007/s41686-017-0006-0
McAlpine, L., & Weston, C. (2000). Reflection: Issues related to improving professors' teaching and students' learning. Instructional Science, 28(5), 363–385. https://doi.org/10.1023/A:1026583208230
Melonçon, L. K. (2017). Patient experience design: Expanding usability methodologies for healthcare. Communication Design Quarterly, 5(2), 19–28. https://doi.org/10.1145/3131201.3131203
Schön, D. A. (1983). The reflective practitioner: How professionals think in action. Basic Books, Inc.
Thomas, M., Mitchell, M., & Joseph, R. (2002). The third dimension of ADDIE: A cultural embrace. TechTrends, 46(2), 40–45. https://doi.org/10.1007/BF02772075
Tracey, M. W., & Baaki, J. (2022a). Cultivating professional identity in design: Empathy, creativity, collaboration, and seven more cross-disciplinary skills. Routledge. https://doi.org/10.4324/9781003255154
Tracey, M. W., & Baaki, J. (2022b). Empathy and empathic design for meaningful deliverables. Educational Technology Research and Development, 70, 2091–2116. https://doi.org/10.1007/s11423-022-10146-4
Tracey, M. W., & Hutchinson, A. (2018). Reflection and professional identity development in design education. International Journal of Technology and Design Education, 28, 263–285. https://doi.org/10.1007/s10798-016-9380-1
Tracey, M. W., Hutchinson, A., & Grzebyk, T. Q. (2014). Instructional designers as reflective practitioners: Developing professional identity through reflection. Educational Technology Research and Development, 62, 315–334. https://doi.org/10.1007/s11423-014-9334-9
Wackerhausen, S. (2009). Collaboration, professional identity and reflection across boundaries. Journal of Interprofessional Care, 23(5), 455–473. https://doi.org/10.1080/13561820902921720
Wilkerson, I. (2020). Caste: The origins of our discontents. Random House.

Chapter 20

The Formative Design of the SRL-OnRAMP: A Reflective Self-Regulated Learning Intervention
Alexis Guethler and William A. Sadera

Abstract  The online learning instructional environment is rapidly growing to meet the needs of an expanding population of adult learners. However, online courses require students to self-regulate their learning experience more than traditional face-to-face courses. This chapter presents the stepwise design of the Self-Regulation Online Learning Reflection and Mentoring Protocol (SRL-OnRAMP) from its problem definition through two revisions. Theoretical grounding, empirical study, and literature review guide the revision process described in this chapter. The chapter presents both a promising online student success practice and a case study of its development.

Keywords: Self-regulated learning · Online learning · Reflection · Instructional intervention design

A. Guethler (*) · W. A. Sadera, Towson University, Towson, MD, USA, e-mail: [email protected]; [email protected]
© The Author(s), under exclusive license to Springer Nature Switzerland AG 2023
B. Hokanson et al. (eds.), Formative Design in Learning, Educational Communications and Technology: Issues and Innovations, https://doi.org/10.1007/978-3-031-41950-8_20

On April 16, 2020, a student taking online classes at a college in Atlanta had a Tweet go viral when she posted, "these online classes are emotionally and mentally draining. I don't feel I'm learning; nothing is sticking" (Jones, 2020). Students worldwide joined her in venting their frustrations with online classes, including their workload, communication with their teachers, and doubts about whether they were learning enough. While it is tempting to believe that student frustrations can be attributed to the inadequacy of remote learning during a pandemic, students taking online courses often worry about the self-reliance required in online learning environments (e.g., Huss & Eastep, 2013; Murders, 2017). Long before the pandemic, Murders (2017) reported through a phenomenological inquiry the following student perspective: "The professor is not there, and you are teaching yourself, and it is harder to do. It is your ability to teach yourself something that you don't know. If the instructor is there, they help you learn" (p. 121). Through this research, the students allude to the necessity of self-regulating their online learning experience (Murders, 2017). Students' success in the online environment depends on their ability to autonomously develop, deploy, and assess their self-regulated learning (SRL) skills to take on roles that would be co-regulated in the face-to-face classroom (Broadbent, 2017). These roles include organizing the learning environment, managing time, determining effective learning strategies, and using formative assessment to identify when reteaching is needed (Ben-Eliyahu & Bernacki, 2015; Broadbent & Poon, 2015). Strengthening and honing these SRL skills is essential to students' success as online learners. Many studies have recommended that faculty support students in developing these skills by scaffolding SRL practices and prompting skills while decreasing students' worries about being on their own (Gašević et al., 2015; Hadwin et al., 2018; Zheng, 2016).

This chapter follows the formative process of designing, evaluating, and redesigning a SRL intervention. The intervention is theoretically grounded in Zimmerman's Cyclic SRL model (2000, 2013) and responds to the need for scaffolded SRL practices in online higher education courses. The intervention produced is flexible, sensitive to the needs of online learners and instructors, and adaptable to any online higher education course. As online learning practitioners, we share this research in hopes that other faculty, designers, and researchers adapt this intervention or the design process to continue the development of supportive practices for students in online environments.

Problem: Self-Regulation Requirements for Distance Learning

The Self-Regulated Learning Online Reflection and Mentoring Protocol (SRL-OnRAMP) was initially developed in response to student achievement concerns in the designer's online biology course. Students who demonstrated high competency levels in early assignments sometimes withdrew from or failed the course. Among other issues, the designer identified that students were having difficulties with time management, selecting low-value and passive learning strategies, relying on methods such as flashcards, delaying help-seeking, exhibiting patterns of decreasing effort, and showing a lack of self-awareness. The designer set a goal to find a non-content-based intervention that could support students' study skills and engagement. The students' SRL was identified as a likely obstacle to student success.

Theoretical Background: The Zimmerman Self-Regulation of Learning Model

There is a broad theoretical and empirical consensus that SRL skills are critical to students' ability to cope with the online learning environment (Broadbent & Poon, 2015; Kuo et al., 2014). Schunk and Greene (2018) define SRL as the multiple skills learners use to engage and maintain their behaviors, cognitive resources, motivation, and affect toward a learning goal. Students use SRL skills in all learning environments; however, face-to-face environments are characterized by frequent contact with instructors and peers who provide verbal and non-verbal feedback to co-regulate the learning experience. To be successful as a distance learner, SRL skills developed during face-to-face experiences must be transformed and transferred to the online learning environment, with the student taking on some roles previously performed by the instructor or peers (Broadbent & Poon, 2015; Kuo et al., 2014; Zimmerman, 2009).

While SRL skills are critical for online course success, not all learners are prepared to exercise such skills without structured interventions (Broadbent et al., 2020; McCaslin & Hickey, 2009). Schunk and Greene (2018) strongly advocated for the teaching of SRL strategies to fulfill the educator's role in guiding students to be independent learners. Enriching courses to support learners in employing and developing their SRL skills is a powerful way to improve short- and long-term student retention in online courses (Serdyukov & Hill, 2013).

Zimmerman's (2013) cyclic self-regulation of learning (CSRL) model provides a lens for the design of the current SRL intervention. According to Zimmerman, SRL generally has three phases: forethought (the processes and beliefs that occur before a student attempts to learn), performance (the processes that occur during behavioral implementation), and self-reflection (the process that occurs after each learning effort). Zimmerman describes a SRL feedback loop in which learners who monitor their performance against their goals can change their behavior and self-perception to enhance their achievement and self-concept. Several studies have shown that learning outcome attainment improves when instructional environments are designed to prompt learners' self-regulatory practice (Broadbent & Poon, 2015; Guo, 2022). Among his recommendations for future research, Zimmerman (2013) suggests that instructors provide students with ways to calibrate their self-efficacy and other self-regulation skills through self-reflection training.

The intervention described in this chapter focuses on the self-reflection phase of the self-regulation process. In this phase, students measure themselves against a set standard (typically a formative or summative assessment), their own goals, or the performance of other learners in collaborative activities. The self-reflection phase also includes causal attributions, in which learners determine whether their actions have had positive or negative effects on their success, impacting their planning, goal setting, and self-efficacy for the next instructional activity (Masui & de Corte, 2005). Student self-observations in the self-reflection phase may impact future forethought and performance strategies for the next learning task, forming an iterative practice of SRL (Zimmerman, 2013).

Formative Design Process

The initial design of the SRL-OnRAMP intervention was carried out as part of a Scholarship of Teaching and Learning faculty cohort program at a Mid-Atlantic community college. This cohort program supported faculty in planning, informally evaluating, and sharing course-based student success initiatives. This 1-year program provided an opportunity to develop, prototype, and pilot test the original scaffolded reflective intervention with one group of online students. At the end of the program, the designer received institutional review board approval to evaluate the intervention through successive studies (Guethler, 2023). The formative design pathway (Fig. 20.1) details the development of the SRL-OnRAMP from its initial design through a second revision and evaluation, in which the intervention is deployed by other faculty serving as implementation agents.

Fig. 20.1  The formative design process of the SRL-OnRAMP intervention from theoretical grounding through its second revision

SRL-OnRAMP-v1 Development, Prototyping, and Piloting

The designer's initial research identified reflective prompting as an effective practice to scaffold SRL use in computer-based learning environments (Guo, 2022; Zheng, 2016).

While many reflective prompting studies used intelligent computer systems as prompting agents, Azevedo et al. (2011) provided evidence that a human tutor could provide similar benefits. The design constraints of the current project required that the intervention occur within the institutional learning management system (LMS). In addition, the workload created by the intervention had to be manageable for both students and faculty. These constraints are commonly echoed as implementation concerns in the SRL intervention literature (Broadbent et al., 2020; Masui & de Corte, 2005). Several pedagogical theorists have suggested scaffolded prompts for reflection as a practice likely to yield long-term increases in SRL skill usage (Kramarski, 2018; McCaslin & Hickey, 2009; Nilson, 2013). Prompting students to use SRL as part of a written reflection allows learners to consider their own behavioral, cognitive, and affective responses to learning activities while also including instructor guidance to increase the formative development of their SRL skills. By externalizing the self-regulation process in writing, instructors can act as mentors, helping students to consider new strategies and suggesting when students' self-evaluations and goals may not be realistic.

The prototype of the SRL-OnRAMP activity utilized two different opportunities for reflection. The first reflection opportunity was designed as a supplement to a weekly skill check or homework assignment. The prototype used prompts and interaction methods adapted from a webinar by Simpson (2017). Students were to select one of the seven suggested prompts each week and to include their reflection in the assignment submission comment box provided in the LMS. Simpson provided a choice of prompts to encourage students to think deeply about their work, selecting the prompt most valuable to them in a given week. Nilson (2013) suggested a second reflection strategy in which students were asked to evaluate their progress and plan for future units after each exam. The prototype was piloted during the LEARN cohort program and continued to be adapted over two additional terms. During this time, the assignment instructions were adapted and clarified. Prompts were rewritten and aligned with SRL behaviors linked to learner success in online courses, including goal setting, attribution, help-seeking, strategic monitoring, task interest assessments, and self-evaluation (Broadbent & Poon, 2015; Hamm et al., 2017; Masui & de Corte, 2005).

Scholarship of Teaching and Learning Study

A qualitative-driven mixed methods study was conducted within the designer's asynchronous online biology course across three terms. Both weekly and post-exam reflections were carried out using the revised weekly prompts and instructions. Students typically responded with discussions of two to five sentences that often encompassed more than one SRL skill. Students were recruited to participate in the study, sharing their perspectives and archived reflections after the grading period had been completed. Content analysis was used to categorize students' use of SRL prompts. Coding a series of 12 student reflections (n = 11) revealed that students used the prompts to make attributions, set goals, self-evaluate, plan for future lessons, describe metacognitive monitoring, and reach out to the instructor for content and learning process feedback (Guethler, 2023). In addition, a survey (n = 37) gathered students' perceptions of the value of the reflective activity through Likert scale and open-ended questions. In general, students valued the SRL-OnRAMP intervention and stated that it helped improve their course experience, as shown in Fig. 20.2. In describing the impact of the SRL-OnRAMP-v1 intervention, one student wrote: "It allowed me to evaluate my performance and be honest with myself (and professor) so that I could identify areas of improvement while noting things that helped, things that didn't, and areas where I could have added more effort." Additionally, feedback provided in response to the reflections emerged as an unexpected avenue of communication. Both the students and the designer found this interaction to be valuable. In describing the impact of the intervention on interaction, one student wrote: "The reflection questions and surveys allowed me to better communicate my needs to be successful in the course. I was able to have a closer relationship with my instructor as a result of the reflection questions and surveys."

While most students found the intervention valuable, additional questions were generated while analyzing the student reflections. It was found that students often had low-quality reflections at the beginning of the course. While many students increased their SRL skill usage during the course, other students completed reflections that were typically non-actionable and unfocused. The student survey drew from both the traditional 15-week term and the 7-week summer term. The designer noted that it was more common for students in the summer term to skip the reflection question or provide shorter, repetitive reflections. The summer term followed a compressed schedule, which increased the intervention frequency to twice per week. In addition to these concerns, a reviewer questioned whether the short length of student reflective responses limited their impact.

Fig. 20.2  Student perspectives on the SRL-OnRAMP-v1 intervention (n = 37)


Literature Review of SRL Interventions

Analyzing the features and impacts of interventions similar to the SRL-OnRAMP was essential in revising and enhancing the intervention for deployment outside of the designer's classroom. The following scoping literature review was conducted to briefly summarize current practices that contributed to or undermined the value of similar reflective interventions. Broadbent and Poon (2015) found that findings from lab-based and face-to-face SRL interventions did not generalize well to online courses. In keeping with their recommendation and the design constraints, interventions that were not carried out in fully online higher education courses were excluded.

To conduct this literature review, the following databases were queried: Academic Search Ultimate, APA PsycArticles, Education Research Complete, ERIC, and OpenDissertations. The main search strings were "reflection or reflective or reflective practice or co-reflection" and "SRL or self-regulated learning or self-regulated learning." Additional inclusion criteria used synonym strings and Boolean operators for online learning and higher education. Selected articles were limited to peer-reviewed articles published from 2004 to 2021. The multiple definitions of reflection within the literature and a tendency to augment face-to-face courses with online tools were stumbling blocks in the query process. A review of abstracts excluded articles in which reflective prompts were focused on content analysis or the development of professional perspectives. After applying exclusionary criteria, ten peer-reviewed articles and one dissertation remained. Table 20.1 provides an overview of the intervention features in each study and the documented impact of those interventions as reported by the study authors.
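To make the query construction described above concrete, the sketch below shows one way such a Boolean search string could be assembled before being entered into a database interface. It is an illustration only: the chapter gives the two main search strings but not the exact synonym strings for online learning and higher education, so those term lists are hypothetical placeholders rather than the authors' actual query.

# A minimal sketch (not the authors' actual search protocol) of assembling
# the Boolean query used to search databases such as ERIC or APA PsycArticles.
reflection_terms = ["reflection", "reflective", "reflective practice", "co-reflection"]
srl_terms = ["SRL", "self-regulated learning"]            # from the chapter's main strings
online_terms = ["online learning", "distance education"]  # hypothetical synonym string
higher_ed_terms = ["higher education", "college", "university"]  # hypothetical synonym string

def or_block(terms):
    # Quote multi-word phrases and join the terms with OR inside parentheses.
    quoted = ['"{}"'.format(t) if " " in t else t for t in terms]
    return "(" + " OR ".join(quoted) + ")"

# AND the four OR-blocks together to form the full query string.
query = " AND ".join(or_block(block) for block in
                     (reflection_terms, srl_terms, online_terms, higher_ed_terms))
print(query)
# -> (reflection OR reflective OR "reflective practice" OR co-reflection) AND (SRL OR "self-regulated learning") AND ...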

Finding 1: Embedded SRL Instruction

Due to time constraints, college courses do not typically include self-regulation and study skills instruction. Only half of the studies described in the literature review provided students with information on enacting SRL or reflective processes. However, based on the study descriptions, the time commitment required to train students in the intervention was minimal (e.g., Bigenho, 2011; Chang, 2005; Chang & Lin, 2014; Kim et al., 2014). In each example, short faculty-directed instruction provided familiarization with SRL and self-reflection without distracting from the curricular content. Students are better able to engage in effective reflections when their purpose and expected benefits are explicitly identified. Additionally, awareness of SRL strategies allows learners to consider how they may transfer self-regulation to other coursework (Barak et al., 2016).

Application: The SRL-OnRAMP-v1 design did not directly explain the importance of the reflective prompts or SRL practice. While coding the data, we found that students may not have initially understood the goal of the reflections.

Table 20.1  Overview of the reflective SRL interventions in online courses literature from 2004 to 2021. For each study, the table summarizes the intervention description (including sensitivity to time constraints), the type of course, whether SRL instruction was embedded, the frequency of deployment, whether a feedback cycle was included, and the positive intervention impacts reported (on outcomes or SRL retention/usage, and on affective measures). The studies reviewed are Blau et al. (2020), Burner (2019)a, Kim et al. (2014), Fung et al. (2019), Yilmaz and Keser (2016), Chang and Lin (2014), Hughes et al. (2014), Bigenho (2011), Van Den Boom et al. (2007), Chang (2005), and Whipp and Chiarelli (2004). Notes: a Hybrid course. b Marginal results (i.e., very small effect sizes or significance). c Positive results occurred only when reflection was combined with feedback.

260

A. Guethler and W. A. Sadera

Additionally, a few students did not begin completing the prompts until several weeks into the term. In response to the observations and the literature findings, a video was created to explain the value of SRL and its connection to the intervention for the subsequent design iterations.

Finding 2: Frequency of Deployment

An analysis of the characteristics and relative impact of the eleven reviewed interventions strongly suggests that a weekly practice of reflection is more beneficial than a less frequent practice. Several impactful studies utilized a schedule of reflective practice that occurred weekly or with each assignment (i.e., Blau et al., 2020; Chang & Lin, 2014; Yilmaz & Keser, 2016). Burner (2019) found that infrequent reflections on SRL limited the intervention's value. On the other hand, Cho (2004) found a potential upper limit to the effectiveness of reflective interventions: in that study, students reported reduced motivation along with frustration and feelings of confinement when asked to reflect on SRL practices three times each week.

Application: Students were observed to complete the intervention with more fidelity when it was scheduled once per week. The literature review concurs, suggesting that a weekly reflection frequency is optimal for this type of intervention. Reflection should therefore be completed weekly in future iterations, regardless of curriculum pace.

Finding 3: Sensitivity to Time Constraints

SRL interventions are often critiqued as being too time intensive for already strained student and faculty workloads (Araka et al., 2020; Broadbent et al., 2020). Even effective educational interventions often fail to gain traction because they draw time away from content-based time on task. Within this literature review, there was no indication that extended reflections were more valuable than shorter ones in increasing student success or engagement. Burner (2019) found reduced student effort with longer and more frequent postings, while Hughes et al. (2014) found that even brief reflections raised student motivation and confidence. Two studies speculated that extended reflections increased the burden on students and instructors and should be avoided if shorter, paragraph-length reflections have similar impacts (see Fung et al., 2019; van den Boom et al., 2007).

Application: These findings support the value of the short reflections produced through the SRL-OnRAMP intervention. The reflections, as delivered in the SRL-OnRAMP-v1, require a student workload of 5–15 min per week, and the designer estimated that faculty should anticipate 5 min per student per week for reading, reflecting, and providing feedback (for a 30-student section, for example, that estimate amounts to roughly two and a half hours of instructor time per week). The intervention is therefore sensitive to the time constraints typical of online learning.


Finding 4: Emphasis on Feedback Cycle

Fewer than half of the studies surveyed included feedback dialogue based on student reflections, and feedback loops were especially rare in the most recent studies. Reflections offer instructors a window into individual students' learning processes, enabling personalized feedback and instruction. Students had strongly positive perceptions of reflective dialogue when tutors provided feedback based on their reflections (Hughes et al., 2014; van den Boom et al., 2007). In interviews about their reflective experience, students said that the feedback helped them process and plan for their areas of weakness. Additionally, students found the increased interactions with tutors to be motivational and engaging (Hughes et al., 2014). In a paired study, only those who received feedback from tutors increased their scores on learning outcomes (van den Boom et al., 2007). Feedback helps students learn about, assess, and feel empowered to utilize SRL skills while encouraging them to view their instructors as partners in the learning process.

Application: Students in the SRL-OnRAMP-v1 study greatly valued the feedback provided in response to their reflections. While the student feedback cycle was always a part of this intervention, giving advice and encouragement based on student reflections must be emphasized in future implementations.

SRL-OnRAMP-v2 Revision

In addition to the design revisions extending from the literature review, preparing the intervention for high-fidelity deployment by other faculty revealed the need for additional changes to the SRL-OnRAMP-v1 design. Blending the intervention into the curricula of eight different online courses required that the delivery method be modified to fit the existing course structures. Instructors worked with the designer to implement the intervention in three different ways: (1) as an essay question on a weekly quiz, (2) through responses collected via the assignment submission "comment" box, or (3) through an independent weekly wrap-up assignment using the LMS quiz tool. Instructors favored a discrete weekly wrap-up in eight-week compressed schedules. Additionally, instructors chose to implement the intervention as a low-point-value graded assignment to encourage students to allocate time to reflections. The post-exam reflection was discontinued to simplify implementation, and the goal-setting and attribution question was reworded and added to the weekly reflective prompt options. The wording of the reflective prompts was further refined based on a close analysis of student responses during the qualitative phase of the previous study. Additionally, one question was removed from the list of prompts due to low use. The final version of the reflective prompts and their alignment to SRL skills is shared in Table 20.2.


Table 20.2  Prompts used in the SRL-OnRAMP-v2. Definitions of these skills are drawn from Zimmerman (2013)

SRL-OnRAMP-v2 weekly reflection prompts:
1. What aspect of this module or homework assignment proved most challenging? What specific strategies did you utilize to work through and overcome the challenge?
2. Was there a tool or strategy that helped you to understand the material in the module better? How can you use a similar tool to help you learn the next topic?
3. What do you feel is the strongest component of your submitted work? What connection can you make between your effort and the strength(s) you identified?
4. Given additional time, what would you improve in your work? Describe the elements that you would want to add or change.
5. Did you connect with the material in this module? How was it relevant to you?
6. What challenged a belief, opinion, or perspective? How are you choosing to process and work through that challenge?
7. Are you meeting your own goals for this course? Why have you or why have you not been successful? What can you do to ensure your next assignment meets your goal?

Each prompt targets one or more SRL skills: self-evaluation, strategic monitoring, metacognition, attribution, effort management, time management, task interest, critical thinking, and goal setting.

Mixed Methods Evaluation Study

While SoTL studies are a critical method of exploring promising practices in higher education, a scaled study in which external instructors implement a design is crucial in determining whether the technique is generalizable (de Leeuw et al., 2020). To that end, a mixed-methods study is currently being conducted to describe students' perceptions of this intervention's influence on SRL skill enactment and transactional distance. The SRL-OnRAMP-v2 has been implemented within eight asynchronous online courses offered in an eight-week term, including introductory-level courses in history, business, sociology, social geography, and physical geography, as well as a drone technology course. Of the seventy-four students who completed the intervention and a follow-up survey, roughly three-fourths reported a positive impact on their work in the class, while 27% reported no impact. When considering their communication with their instructor, 70% of students reported a positive impact, while 30% had a neutral reaction to the experience. Only one student reported a negative experience with the intervention.

Instructors and students who have used this intervention found that their weekly interaction increased communication about the course. Instructors indicated they


could identify stumbling blocks that were not previously evident from their perspective. At the same time, students felt they had an opportunity to seek help and advocate for changes when the course activities were not meeting their learning needs. The communication opportunity created by the reflection and feedback process helped to alleviate students’ feelings of teaching themselves in the online classroom. In addition to collecting quantitative and qualitative information about SRL use and communication, the study provided opportunities for students and faculty to suggest future revisions. Though data analysis has only begun, one revision suggestion has emerged from open-ended survey questions. Students felt that the questions were somewhat redundant. While providing a choice of prompts was popular, their suggestion was to provide a sub-set of two or three prompts each week. The faculty serving as implementation agents agreed that providing different options each week would encourage students to exercise several self-regulation skills throughout the course. As with all formative designs, this intervention will continue to be refined over time.

Implications

The SRL-OnRAMP intervention addresses several limitations noted in previous research by incorporating weekly reflective practice, brief learner instruction in SRL practices, and instructor feedback cycles. These features are combined in a model that is sensitive to time constraints and utilizes the LMS instead of external technologies. Additionally, its questions are flexible enough to be relevant across most higher education content areas. The intervention also encourages instructors and students to engage in mentoring dialogues about the student's learning process in the online course. Though studies of its effectiveness in promoting SRL skill development are still pending, the intervention has received positive feedback from students and instructors in science, history, sociology, and geography courses. The SRL-OnRAMP is presented with the hope that practitioners will continue to adapt it to new learning environments to improve students' development of SRL skills and their online learning opportunities.

Additionally, this chapter described the stepwise process of formative design used to develop this intervention. The SRL-OnRAMP has evolved over time as we have more closely approximated students' understanding and use of the reflective prompts, and we expect that it will continue developing as other instructors adapt and blend it with their own curricula. We hope that this overview of the design process encourages others undertaking the design and sharing of student success interventions.


References Araka, E., Maina, E., Gitonga, R., & Oboko, R. (2020). Research trends in measurement and intervention tools for self-regulated learning for e-learning environments—systematic review (2008–2018). Research and practice in technology enhanced learning, 15(6). https://doi. org/10.1186/s41039-­020-­00129-­5 Azevedo, R., Cromley, J. G., Moos, D. C., Greene, J. A., & Winters, F. I. (2011). Adaptive content and process scaffolding: A key to facilitating students’ self-regulated learning with hypermedia. Psychological Test and Assessment Modeling, 53(1), 106–140. https://www.researchgate. net/publication/50864534 Barak, M., Hussein-Farraj, R., & Dori, Y. J. (2016). On-campus or online: Examining self-regulation and cognitive transfer skills in different learning settings. International Journal of Educational Technology in Higher Education, 13(1), 35. https://doi.org/10.1186/s41239-­016-­0035-­9 Ben-Eliyahu, A., & Bernacki, M.  L. (2015). Addressing complexities in self-regulated learning: A focus on contextual factors, contingencies, and dynamic relations. Metacognition and Learning, 10(1), 1–13. https://doi.org/10.1007/s11409-­015-­9134-­6 Bigenho, C. W. (2011). Student reflections as artifacts of self-regulatory behaviors for learning: A tale of two courses. University of North Texas. Blau, I., Shamir-Inbal, T., & Avdiel, O. (2020). How does the pedagogical design of a technology-­ enhanced collaborative academic course promote digital literacies, self-regulation, and perceived learning of students? Internet and Higher Education, 45(June 2018), 100722. https:// doi.org/10.1016/j.iheduc.2019.100722 Broadbent, J. (2017). Comparing online and blended learner’s self-regulated learning strategies and academic performance. Internet and Higher Education, 33, 24–32. https://doi.org/10.1016/j. iheduc.2017.01.004 Broadbent, J., Panadero, E., Lodge, J. M., & de Barba, P. (2020). Technologies to enhance self-­ regulated learning in online and computer-mediated learning environments. In M. J. Bishop, E. Boling, J. Elen, & V. Svihla (Eds.), Handbook of research in educational communications and technology (pp. 37–52). Springer. https://doi.org/10.1007/978-­3-­030-­36119-­8_3 Broadbent, J., & Poon, W. L. (2015). Self-regulated learning strategies & academic achievement in online higher education learning environments: A systematic review. Internet and Higher Education, 27, 1–13. https://doi.org/10.1016/j.iheduc.2015.04.007 Burner, K. J. (2019). Journaling to elicit self-regulation and academic performance in a preservice teacher technology education course. Technology, Instruction, Cognition, and Learning, 11, 219–245. Chang, M.  M. (2005). Applying self-regulated learning strategies in a web-based instruction  An investigation of motivation perception. Computer Assisted Language Learning, 18(3), 217–230. https://doi.org/10.1080/09588220500178939 Chang, M. M., & Lin, M.-C. (2014). The effect of reflective learning e-journals on reading comprehension and communication in language learning. Computers & Education, 71, 124–132. https://doi.org/10.1016/j.compedu.2013.09.023 Cho, M. H. (2004). The effects of design strategies for promoting students ‘self-regulated learning skills on students’ self-regulation and achievements in online learning environments. Association for Educational Communications and Technology, 27(1999), 19–23. http://www. acousticslab.org/dots_sample/module4/Cho2004_SelfRegulatedLearning.pdf de Leeuw, R. R., de Boer, A. A., & Minnaert, A. E. M. G. (2020). 
The proof of the intervention is in the implementation; a systematic review about implementation fidelity of classroom-based interventions facilitating social participation of students with social-emotional problems or behavioural difficulties. International Journal of Educational Research Open, 1(June), 100002. https://doi.org/10.1016/j.ijedro.2020.100002 Fung, C. Y., Abdullah, M. N. L. Y., & Hashim, S. (2019). Improving self-regulated learning through personalized weekly e-learning journals: A time series quasi-experimental study. Journal of Business Education & Scholarship of Teaching, 13(1), 30–45.


Gašević, D., Adesope, O., Joksimović, S., & Kovanović, V. (2015). Externally-facilitated regulation scaffolding and role assignment to develop cognitive presence in asynchronous online discussions. Internet and Higher Education, 24, 53–65. https://doi.org/10.1016/j.iheduc.2014.09.006 Guethler, A. (2023). One intervention, two benefits: A qualitative analysis of students’ use of reflective prompting for selfregulated learning in an online course. Education and Information Technologies, 1–23. https://doi.org/10.1007/s10639-023-12016-9 Guo, L. (2022). Using metacognitive prompts to enhance self-regulated learning and learning outcomes: A meta-analysis of experimental studies in computer-based learning environments. In Journal of Computer Assisted Learning (Vol. 38, Issue 3, pp.  811–832). Wiley. https://doi. org/10.1111/jcal.12650 Hadwin, A., Jarvela, S., & Miller, M. (2018). Self-regulation, co-regulation, and shared regulation in collaborative learning environments. In D. H. Schunk & J. A. Greene (Eds.), Handbook of self-regulation of learning and performance (2nd ed., pp. 83–106). Routledge. Hamm, J. M., Perry, R. P., Chipperfield, J. G., Parker, P. C., & Heckhausen, J. (2017). A motivation treatment to enhance goal engagement in online learning environments: Assisting failure-prone college students with low optimism. Motivation Science. https://doi.org/10.1037/mot0000107 Hughes, G., Wood, E., & Kitagawa, K. (2014). Use of self-referential (ipsative) feedback to motivate and guide distance learners. Open Learning, 29(1), 31–44. https://doi.org/10.108 0/02680513.2014.921612 Huss, J. A., & Eastep, S. (2013). The perceptions of students toward online learning at a midwestern university: What are students telling us and what are we doing about it. I.e.: Inquiry in Education, 4(2), 11–18. https://digitalcommons.nl.edu/ie/vol4/iss2/5 Jones, K. [@kamarienyausha] (2020, April 16). These online classes are emotionally and mentally draining. i dont feel im learning, nothing is sticking. im so unmotivated and yet [Tweet]. Twitter. https://twitter.com/kamarienyausha/status/1250889680129798146 Kim, R., Olfman, L., Ryan, T., & Eryilmaz, E. (2014). Leveraging a personalized system to improve self-directed learning in online educational environments. Computers and Education, 70, 150–160. https://doi.org/10.1016/j.compedu.2013.08.006 Kramarski. (2018). Teachers as agents in promoting students’ SRL and performance: Applications for teachers’ dual-role training program. In D. H. Schunk & J. A. Greene (Eds.), Handbook of self regulation learning and performance (2nd ed.). Routledge. Kuo, Y.  C., Walker, A.  E., Schroder, K.  E. E., & Belland, B.  R. (2014). Interaction, internet self-­efficacy, and self-regulated learning as predictors of student satisfaction in online education courses. Internet and Higher Education, 20, 35–50. https://doi.org/10.1016/j. iheduc.2013.10.001 Masui, C., & de Corte, E. (2005). Learning to reflect and to attribute constructively as basic components of self-regulated learning. British Journal of Educational Psychology, 75(3), 351–372. https://doi.org/10.1348/000709905X25030 McCaslin, M., & Hickey, D.  T. (2009). Self-regulated learning and academic achievement: A Vygotskian view. In B.  J. Zimmerman & D.  H. Schunk (Eds.), Self-regulated learning and academic achievement (2nd ed.). Routledge. Murders, M. (2017). A phenomenological study of the online education experiences of college students with learning disabilities [Dissertation, University of Arkansas]. https://scholarworks. 
uark.edu/etd/2518 Nilson, L.  B. (2013). Creating self-regulated learners: Strategies to strengthen students’ self awareness and learning skills (1st ed.). Stylus Publishing. Schunk, D. H., & Greene, J. A. (2018). Historical, contemporary and future perspectives on self-­ regulated learning and performance. In D. H. Schunk & J. A. Greene (Eds.), Handbook of self-­ regulation of learning and performance (2nd ed.). Routledge. Secdyukov, P., & Hill, R. A. (2013). Flying with clipped wings: Are students independent in online college classes? Journal of Research in Innovative Teaching, 6(1), 54–67. Simpson, S. [CanvasLIVE] (2017). Coach metacognition via assignment submission comments. YouTube. https://youtu.be/IpS-­jsEYxh8


van den Boom, G., Paas, F., & van Merriënboer, J. J. G. (2007). Effects of elicited reflections combined with tutor or peer feedback on self-regulated learning and learning outcomes. Learning and Instruction, 17, 532–548. https://doi.org/10.1016/j.learninstruc.2007.09.003 Whipp, J. L., & Chiarelli, S. (2004). Self-regulation in a web-based course: A case study. Educational Technology Research and Development, 52(4), 5–22. https://doi.org/10.1007/BF02504714 Yilmaz, F. G. K., & Keser, H. (2016). The impact of reflective thinking activities in e-learning: A critical review of the empirical research. Computers and Education, 95, 163–173. https://doi.org/10.1016/j.compedu.2016.01.006 Zheng, L. (2016). The effectiveness of self-regulated learning scaffolds on academic performance in computer-based learning environments: A meta-analysis. Asia Pacific Education Review, 17(2), 187–202. https://doi.org/10.1007/s12564-016-9426-9 Zimmerman, B. J. (2000). Attaining self-regulation: A social cognitive perspective. In M. Boekaerts, P. Pintrich, & M. Zeidner (Eds.), Handbook of self-regulation (Electronic, pp. 13–40). Academic. Zimmerman, B. J. (2009). Theories of self-regulated learning and academic achievement: An overview and analysis. In B. J. Zimmerman & D. H. Schunk (Eds.), Self-regulated learning and academic achievement (2nd ed.). Routledge. Zimmerman, B. J. (2013). From cognitive modeling to self-regulation: A social cognitive career path. Educational Psychologist, 48(3), 135–147. https://doi.org/10.1080/00461520.2013.794676

Chapter 21

The Many Hats – Accidental Roles on an Interdisciplinary Research and Implementation Project: A Collaborative Autoethnography

Deepti Tagare, Marisa E. Exter, and Iryna Ashby

Abstract  This chapter is a collaborative autoethnography on the first and second authors' experience as educational researchers in an interdisciplinary multi-institutional project. The multidisciplinary nature of both the larger team and the project led to the emergence of new formative roles as instructional designers and instructional design consultants to guide the larger team towards competency-based education. We present individual and joint reflections on the tensions arising from these emergent formative roles and discuss how interdisciplinarity contributes to them. This chapter illustrates how our multiple identities impacted our project work and how the project work changed our perception of our identities.

Keywords  Collaborative autoethnography · Interdisciplinary team · Instructional design · Educational researcher · Accidental instructional designer

Interdisciplinary research is undertaken by scholars from two or more disciplines with the purpose of solving a large complex problem to produce meaning, explanation, and new knowledge (Aboelela et al., 2007; Rhoten & Pfirman, 2007; Rijnsoever & Hessels, 2011). It allows multidisciplinary team members to explore a complex phenomenon from multiple perspectives (Rijnsoever & Hessels, 2011). The primary benefit of interdisciplinary research is the participants' varied set of skills and knowledge, which provides opportunities for deeper analysis and synthesis (Aboelela et al., 2007). Although such work is encouraged by funding agencies like the National Science Foundation (NSF), research suggests that scholars involved in interdisciplinary projects need to overcome both philosophical and practical differences among the various disciplines (Campbell, 2005; Rhoten, 2004).

D. Tagare (*) · M. E. Exter · I. Ashby Purdue University, West Lafayette, IN, USA e-mail: [email protected]; [email protected]; [email protected] © The Author(s), under exclusive license to Springer Nature Switzerland AG 2023 B. Hokanson et al. (eds.), Formative Design in Learning, Educational Communications and Technology: Issues and Innovations, https://doi.org/10.1007/978-3-031-41950-8_21



There might be fundamental differences in how each discipline defines the problem being tackled. If these go unaddressed, or worse, unnoticed, they may hinder the team's harmonious functioning (Campbell, 2005). Campbell (2005) reports that scholars may hold unconscious, assumed expectations of the objectives and outcomes of the research. If not communicated early, such unspoken assumptions may cause conflicts and a lack of a unified approach, and ultimately become barriers to the co-creation of knowledge, thus obstructing the success of the project (Rhoten, 2004). Differences may also relate to areas including publishing and authorship, perception and treatment of bias, power dynamics, and differences in use of terminologies, resulting in unforeseen misunderstandings among members of the group (Campbell, 2005).

The role of a researcher may be defined at the outset of the project. However, complex dynamics due to disciplinary differences result in researchers positioning themselves strategically with respect to other team members (Freeth & Vilsmaier, 2020). This researcher position shifts dynamically as and when interpersonal encounters demand it (O'Boyle, 2018). This is necessary to transcend disciplinary boundaries and create a shared understanding of the research problem so as to develop research approaches that are mutually beneficial and agreed upon (O'Boyle, 2018). Yet, negotiating this malleable position/role can be a challenge when complex circumstances require the researcher to play more than one role whose boundaries may not be clearly defined.

In her book The Accidental Instructional Designer, Bean (2014) discussed how we can stumble (often by accident) into a situation where instruction is needed, when we discover existing gaps in knowledge and skills that went unnoticed before. In the project discussed in this chapter, our only defined role was that of educational researchers. Although some team members had a background in instructional design (ID), we did "fall" into the role of instructional designer (IDer) with no prior planning around the need, the scope of work, or the funding of our time on that work.

This chapter includes individual reflections and joint sensemaking on the first and second authors' experience in an interdisciplinary multi-institutional project. We balanced our original role as educational researchers and our emergent formative roles as IDers to help shape common understanding and provide other team members with the competencies required to meet the project's goals. While it is not unusual for grant participants to engage in educating stakeholders, our take on the events also reflects the tradeoffs and benefits we had to balance when interweaving these roles, especially as they were unplanned at the outset.

Brief Description of the Project

The first two authors are part of a multi-institutional interdisciplinary team in an NSF-funded project. The project aims to support systemic change of undergraduate computing programs towards scalable competency-based curricula. As originally conceived, the project included a research team; a pilot team with representatives from computing programs in three institutions; and a team focused on raising awareness and desire to move towards a research-based competency model in the larger computing education community, with support from a change management expert. The majority of team members are computing educators with backgrounds in computer science or similar fields.

Methods

Collaborative autoethnography (CAE) is a qualitative method of inquiry that investigates the individual and collaborative experiences of a group through collective reflection, interrogation, and analysis (Chang et al., 2013). This reflective process is informed by accompanying formative research (Freeth & Vilsmaier, 2020), occurring in parallel with the experience described. This helps us gain a better understanding of how our multiple positionalities impact the project work and how this work changes our perception of our positionality. The demographic information of all team members is included in Table 21.1. The first two authors had significant emergent roles during the process of this CAE. Other team members remained focused on research work.

Trustworthiness in CAE is increased through the inclusion of a non-autoethnographer, who is not directly involved in the experience and does not contribute an individual reflection. They contribute to the collaborative process of sensemaking through analysis and interpretation (Chang et al., 2013). The third author served as our non-autoethnographer, drawing on her own interdisciplinary educational and professional background (see Table 21.1).

Procedure

Our procedure for documenting our reflections and meta-reflections is shown in Fig. 21.1. Triggering events were defined as events that led to evolution of our perception of our role within the bigger team, for example, participating in larger team meetings, conducting workshops, or chasing deadlines for conference proposals and journal article submissions. Meeting notes and other relevant documents were reviewed to refresh our memory prior to writing reflections. To trigger additional reflection, the third author conducted individual and joint interviews with the other authors. The autoethnographers went through multiple phases of re-writing the reflections as part of identifying and summarizing the themes. Upon finishing the thematic segments, we recorded a final dialogue about our reflections across the entire year, which serves as the basis for the "Joint Sensemaking Discussion" section.

Table 21.1  Demographic information of team members & authors

Iryna Ashby. Educational background: Translation Study (BA, MA), Psychology (BA), and ID (MSEd, PhD). Professional background and relevant service: translator (20+ years); ID consultant in higher education and corporate settings (10+ years); project manager (10+ years); department manager (5+ years); general manager (5+ years). Role on grant project team: none. Role on CAE: third author (non-autoethnographer).

Suzhen Duan. Educational background: Automation (BS), Engineering Physics (MS), Learning Design and Technology (MS, PhD). Role on grant project team: research assistant, educational research. Role on CAE: none.

Shamila Janakiraman. Educational background: Electronics & Communication Engineering (BE), Marketing Management (MBA), Learning Design and Technology (MS, PhD). Professional background and relevant service: engineer (2 years); computer hardware instructor (2 years); middle school teacher and university lecturer (3 years); content writer (9 years); faculty in the Purdue LDT program after completing the MS and PhD (1 year, 8 months). Research interests: CBE, game-based learning, gamification, online teaching and learning, attitudinal learning, and education for sustainable development. Role on grant project team: post-doc researcher, project manager, educational researcher. Role on CAE: none.

Marisa Exter. Educational background: Computer Science (BS, MS), Instructional Systems Technology (PhD). Professional background and relevant service: software designer and developer (14 years, combined industry and higher education); manager/day-to-day project manager (5 years); faculty member (9 years); member of the ACM/IEEE Computer Society Computing Curriculum 2020 taskforce. Research interests: CBE, interdisciplinary education, and computing education. Role on grant project team: Purdue PI and co-lead of the competency team, project manager, study designer, educational researcher, and IDer. Role on CAE: second author (autoethnographer).

Mihaela Sabin. Educational background: Computer Science (BS, MS, MST, PhD). Professional background and relevant service: 15 years as faculty member and department chair of the computing program at the University of New Hampshire; chaired the ACM/IEEE Computer Society IT2017 curriculum guidelines task force; ACM Education Board and ACM SIGITE Vice-Chair; founding member of the Computer Science Teachers Association NH Chapter and CS4NH; ABET program evaluator. Research interests: computing education, curriculum development, and AI. Role on grant project team: UNH PI and co-lead of the competency team, subject matter expert, and educational researcher. Role on CAE: none.

Deepti Tagare. Educational background: Electronics and Telecommunication Engineering (BE), Publishing (MA). Professional background and relevant service: five years of ID experience in industry; 2+ years of experience in designing and facilitating teacher professional development in computational thinking skills. Research interests: computational thinking skills, teacher professional development, and CBE. Role on grant project team: graduate research assistant, educational research, and IDer. Role on CAE: first author (autoethnographer).


Fig. 21.1  Autoethnographic process

Findings

Distinct themes emerged from individual reflections. The emergence and evolution of our roles is represented in the thematic summary of individual reflections. Tensions around the emergent formative roles are presented through the analysis of our individual reflections and joint sensemaking conversations.

Thematic Summary of Individual Reflections

Theme 1: Starting the Project as Educational Researchers

Marisa

I co-lead the research team with Dr. Mihaela Sabin, a Computer Science professor and department chair at the University of New Hampshire. Other than Mihaela, the research team is all at Purdue, under my supervision. According to the proposal, I was to "co-design the research plan with Dr. Sabin with assistance from other research team members and train and mentor all members of the research team in applicable research methods, in addition to taking part in all aspects of implementing the research plan, working on the competency model, and (co)authoring presentations and papers." I have taken on a larger share of "implementing the research plan" than initially anticipated, including co-conducting interviews, taking part in the systematic literature review process, and analyzing large amounts of qualitative data. I also spend significant amounts of time in coordination with other teams during leadership team and large-team meetings, small group planning meetings, etc. These duties and other project responsibilities competed for my attention along with teaching, service, and other research projects as Purdue faculty.


Deepti

As a graduate research assistant, I came in with a broad understanding that this grant project involves educational research work in competency-based education for computing degree programs. Starting in August 2021, I joined the postdoc in the systematic literature review work, since the interview research was still in the planning stage. Gradually, I became involved in planning for and conducting interviews of computing professionals. Both of these research methods were quite new to me as a PhD student, resulting in a steep learning curve.

Theme 2: Challenges Within the Interdisciplinary Team

Marisa

We realized early on that we needed to do some foundational work to define what competencies mean for this project, and the model we would like to promote to computing educators. This was more work than I expected – planning and leading monthly meetings that included our team and representatives of other teams, writing and re-writing documents, (co)creating diagrams and visualizations, etc. Although this began in August 2021, we aren't fully satisfied with our proposed common terminology because of the difference in disciplinary terminology and the different bodies of literature we reference. As we planned for and proceeded with the educational research strands, we recognized that, although I have a computer science background and Deepti has a background in a related field, neither of us has been a computing educator. We relied on Mihaela and other team members in areas we did not anticipate. For example, they wanted us to include the "big umbrella" of computing fields in the study. We did not know what academic programs or job roles should be included. In turn, although other team members may have done computing education research, they may not have had a formal background in educational research methods. Therefore, we spent much time going back and forth on how many different job roles under "the big umbrella" we could reasonably include before reaching saturation, or how long it would take to do a constant comparative analysis of 30 interviews and 50 articles.

Deepti

This inter-reliance between teams was frequently discussed in the larger team meetings. However, it caused delays in the intended timeline due to a lack of understanding across the various disciplines. We could not fully grasp the complexity and multilayered nature of the computing field, something we needed to understand in order to make progress in our research. There was a gap in understanding between the computing educators and our team as to how long the research-based competency list generation would take. Soon we realized that members within the larger team had a different understanding of the project scope, timeline, and deliverables. The larger team was waiting for our team's research outcomes.

Theme 3: Formative Emergence of the ID Role

Marisa

In Fall 2021, the pilot team members increasingly expressed a concern that they were unsure what they could work on before we provided the research-based competency list to them. We started creating materials about CBE for their use. In late December, the pilot team lead, the change management expert, and I had a very general idea of offering two short workshops. My vision was to quickly put these together as another way to introduce concepts and answer questions about the materials we had shared. We believed this would help each pilot institution set their own goals. I didn't really think of this as a significant dedication to ID work, but by the end of these workshops I realized we would probably be doing more of this work, and that we were starting to slide into the role of "accidental instructional designers."

Deepti

This led to an offshoot of my original role as I transitioned into the IDer's role. CBE itself was very new to me. I needed to learn about CBE to be able to successfully fulfill my role as an IDer. This required me to invest more time in reading and developing an in-depth understanding. Marisa and I engaged in designing and creating instructional material. I leveraged my prior experience in ID to develop the material. All this occurred while we continued to conduct interviews for research and co-author conference proposals.

Theme 4: Evolving Nature of the ID Role

Marisa

After the initial workshops, the pilot team indicated that they would like to engage in activities to directly help them redesign their courses. We agreed to plan a three-afternoon workshop in June 2022 with the systemic change team. Deepti and I had some long discussions about what to include. We planned interactive activities to help each institution's faculty to customize their approach. This planning took much more time and thought than simply putting together some PowerPoints. We spent most of the third day of the workshop working with faculty on individual course modifications. They used a template we provided and brainstormed


ways to add competencies and new pedagogical approaches into one of their courses. We met with each instructor twice during the session. Several were excited about making changes to their courses. We offered to be available to 'consult' but hadn't really thought through what this would entail.

Deepti

My consultation with one faculty member involved helping her think through course objectives, brainstorming ideas to address concerns with the existing course structure, suggesting competency-based pedagogies to resolve some of these concerns, and ideating on pedagogical and assessment techniques for integrating dispositions within this course. This was done over a single call. While I didn't create material for her course, I helped her rethink and redesign it. It felt like I was taking on another role – that of an 'instructional design consultant.'

Theme 5: Multiple Dimensions of the Researcher Role

Marisa

Being in academia, we are under pressure (me as a faculty member and Deepti and other team members as future job seekers) to publish as much as we can. At the same time, I want to derive materials from the same data that are useful and accessible for the target audience: educators. So, I know we are going to make different versions of reports and augment the competency list with pedagogical and assessment recommendations at a level of detail and writing style that is quite different from those research articles. While I love every bit of what we are doing, I recognize my bad tendency to somehow believe we will be able to compress it all into the constantly dwindling time remaining on the grant.

Deepti

The week before the June workshop, I had paused my work on interview coding since I didn't have the bandwidth to juggle paper writing, ID, coding, an intensive summer course, and other unrelated research projects. We were behind on compiling the competency list from the two research strands, mainly because I was still coding the interviews. I felt the pressure. Just as I was picking up the pace in coding, the deadline for a conference paper came up and I had to switch gears again. At this point it felt like the educational research role was in conflict with itself!


Joint Sensemaking Discussions

After completing our thematic analysis and summarizing our reflections in the previous section, we had a joint sensemaking discussion. We spoke at length about our experiences after each triggering event. In some cases, we had unique perceptions because of the different focus and responsibilities of each of our roles. For example, Deepti's initial perception of the need for two-page conceptual introductions was that it was pre-planned by Marisa. However, Marisa shared that was not quite the case.

Marisa: You went to India during winter break, and it was not very practical for you to participate in conducting interviews. Meanwhile, the pilot team was unsure what to do while waiting for our competency list, which was very delayed because of the interview timeline. I had just assumed they would start thinking about how to incorporate CBE practices into their classes, but I realized that they didn't know much (if anything) about CBE and might benefit from background material. This seemed like a good task for you to work on offline.

Deepti: I knew before I traveled that there was a need, but I did not know the history of the request. I don't know if you remember, but when you first told me about the need for two-pagers, I asked you, 'OK, what is the final use of this? Where is this going? Who's the audience?' When you said that this would be for the pilot team, I thought that this was pre-planned because it looked like it aligned with a document that you had already created summarizing our framework and some competency-based pedagogical and assessment approaches.

Marisa: It wasn't really pre-planned. I met with the pilot team lead just a few days before you left. She was concerned that the pilot team didn't have anything to work on yet. I had some ideas of things they could do. So, I suggested we could do a brief workshop. I really wasn't sure what that workshop would be like, and I thought 'let's just start making materials to share first.' At that time, I thought the materials would give them something to work with and give us time to finish the research steps.

The IDer role emerged as we continued to plan workshops, and it began to demand increasing time and resources. We experienced tensions between our primary role as researchers and the emergent formative role as IDers, as is apparent in our joint reflection.

Deepti: These roles had become very dynamic. When a workshop was scheduled, most of my work hours were spent designing and developing training material. Other times, my primary role as a research assistant took precedence. There were times when I juggled both roles as we conducted interviews and offered workshops in the same week.

Marisa: Yeah, I felt torn about the amount of work you were doing, while we were also trying to find and interview as many people as possible to catch up with our research timeline.

This experience demonstrated how intense ID work actually is, and the amount of time that we needed to dedicate to this ‘accidental’ role. Yet, in this process, we found ourselves struggling to believe in our ability to truly identify the pilot team’s needs.


Deepti: The ideation process for the June workshop took longer. You and I had to plan carefully for optimum use of our time with the local team. I felt that you had a better understanding of and insight into the pilot team's needs since you interact with them more closely and more often. I relied on you to inform our ID decisions and choices.

Marisa: I feel that you underestimate yourself and overestimate me! Haha. As I reflect, I keep realizing that I am not sure exactly what the pilot team does need – or what they have the resources and power to change. We focused on adjusting to participants' needs during the workshop itself, but never really did a needs analysis.

Each of our roles benefited the others. One of our conversations reflects this perspective.

Deepti: As we move closer to completing our data analysis and have preliminary findings, I see the boundaries of the two roles blurring. In our meetings with the pilot team, I draw from my role as a researcher to make suggestions based on my research findings and from my IDer's role to tailor how it is presented to them. My ID work is now informed by the competencies emerging from the data and the quotes from interviewees.

Marisa: Those quotes were really helpful at the workshop! I think they said more to the pilot team faculty than a million words in our own voice. I believe the ID work helps the research effort. It helps us understand what the audience needs to know – even things like, how many competencies can we really have? That impacts the data analysis – we have to compress codes to reduce the number.

In the current state of the project, we have taken on full responsibility to conduct ongoing workshops for the pilot teams and provide ID consultation to the faculty members as needed. However, due to the time and resource tensions, we are unable to truly offer what we believe an IDer should offer. As Marisa points out, there is also a gap in awareness regarding what we potentially could offer.

Marisa: I don't think they knew what an IDer is. The three that revised their courses were motivated and willing to spend a lot of their own time. Because they made changes to their own courses, we did not go through a process of identifying needs or learner characteristics, or even helping them design or develop anything. I feel like a poor IDer for not really explaining what IDers do or planning in a way that we could offer those "services." On the other hand, this offer of "consulting" with them hasn't really taken much of our time so far.

While our assigned task (as per the grant) was educational research, our emergent identity as IDers led us to believe we could (and should) take on other roles to address the discovered gaps. However, we now need to make judgements about balancing priorities. Currently, we are trying to balance four main areas: generating a competency list based on Year 1 data; generating research presentations and publications; ID work; designing and developing a survey (originally devised as a primary priority within the grant). Other challenges arose (in areas beyond the scope of this paper) to push us further off the timeline or complicate the research work. At this point, we would like to do it all – but need to make a priority call. This is difficult within the larger interdisciplinary team, because other team members may not understand what these elements entail, and there is a desire for collaborative decision making.


Non-Autoethnographer's Meta-Reflection by Iryna

In some cases, instruction can be pre-planned when the gaps are clear at the beginning of the project. However, such a need often emerges sporadically once the team starts working together, making sense of expectations, building common language, and uncovering gaps. This is particularly true for interdisciplinary teams such as the one described by my co-authors. The larger team had people with similar levels of education. Yet, there were differences in disciplinary cultures and backgrounds that shape professional terminology, approaches to teaching, and competencies. For an effective interdisciplinary collaboration, it is important to create opportunities to develop a common basis for all members. As a result, we have accidental IDers.

The co-authors' experiences align with the research on multi-institutional interdisciplinary research teams, which often face geographical and conceptual distance. While geographical distance may be mediated using technology, conceptual distance requires ongoing effort and engagement (Axelrod & Cohen, 1999; Cilliers, 1998; Wolf-Branigin, 2013). This refers not only to building a clear joint vision and goals for the project, but also to addressing (accidentally) uncovered needs for common language and terminology that would allow all members to operate on the same level (Holbrook, 2013; Holley, 2017; Klein, 2005; Lattuca, 2001). This is evident in the emergence of the IDer role with the intention of quickly bringing team members to a common understanding.

The question may arise as to why such accidents happened even though some team members were IDers and could potentially have predicted the need. Again, the challenge is presented by the interdisciplinary nature of the team. Most, if not all, team members were faculty with advanced degrees in their discipline and years of experience in academia. Most freely operated with the terminology described in the grant, thus creating the perception of common conceptual understanding. The gaps were uncovered once the real work started, which again is common for interdisciplinary teams (e.g., Klein, 1996, 2005; Laursen & O'Rourke, 2019). However, the real challenge is for all members to realize and accept the need to learn and to be open to other members taking a lead role, which is difficult and may be resisted due to a lack of understanding of why it is needed, who the IDers are, and how they can help.

Discussion & Conclusion

This chapter discusses the benefits and challenges, for the project and for us individually, of the emergence of an "accidental IDer role" in addition to our research roles. While our role as researchers informed our ID work, the two roles competed for limited time and resources.

The need for an IDer became apparent not because anyone requested it, but because we recognized that others' work was blocked as they lacked the necessary skills, knowledge, and dispositions, and it seemed only we could help them develop them (though this may have been hubris on our part!). The need for an ID role in this project could have been predicted when conceptualizing the grant proposal, yet this rarely happens. We have spoken with several other faculty members and IDers who had similar experiences. Their role in grant proposals was not well defined or was conceived of as being very limited; they were only brought in after many design decisions had been made, resulting in less than ideal design of instruction for the project. Therefore, there appears to be a general need for ID work to be identified early on in education-oriented interdisciplinary projects. The scope of time and resources needs to be defined at the proposal stage, based on the realistic needs of the larger, diverse, interdisciplinary team as well as the need for knowledge sharing across sub-teams.

This implies a requirement for a needs analysis at the grant proposal conceptualization stage. Yet, this can be challenging in an interdisciplinary team, as team members are often not familiar with what IDers and members of other disciplines (can) do. There is also a lack of familiarity with educational research processes and the time they require (even if team members have conducted education-related research within their own domains). This makes it difficult to identify and argue for these needs ahead of time. Furthermore, there may be budget constraints on doing this work prior to or during the funded period. An Agile or rapid prototyping model might be a possible approach to keep costs lower and help other team members understand where this work can offer benefit. However, this may be risky, as time and budget may be unknown ahead of time, and the value of early prototypes may be unclear due to a lack of understanding of, or respect for, skillsets by other team members.

We pose a call to action for our field: we need to find a way to better describe and justify the ID role using language that works for other disciplines, and to plan for that role in interdisciplinary projects to support team members, improve project and secondary outcomes, and make the teaming experience beneficial for all.

Acknowledgement  This material is based upon work supported by the National Science Foundation under Grant 2111097.

References Aboelela, S.  W., Larson, E., Bakken, S., Carrasquillo, O., Formicola, A., Glied, S.  A., Haas, J., & Gebbie, K.  M. (2007). Defining interdisciplinary research: Conclusions from a critical review of the literature. Health Services Research, 42(1p1), 329–346. https://doi. org/10.1111/j.1475-­6773.2006.00621.x Axelrod, R., & Cohen, M. (1999). Harnessing complexity: Organizational implications of a scientific frontier. The Free Press. Bean, C. (2014). The accidental instructional designer: Learning design for the digital age. American Society for Training and Development. Campbell, L.  M. (2005). Overcoming obstacles to interdisciplinary research. Conservation Biology, 19(2), 574–577. https://doi.org/10.1111/j.1523-­1739.2005.00058.x Chang, H., Ngunjiri, F., & Hernandez, K.  A. C. (2013). Collaborative autoethnography. Left Coast Press.


Cilliers, P. (1998). Complexity and postmodernism: Understanding complex systems. Routledge. Freeth, R., & Vilsmaier, U. (2020). Researching collaborative interdisciplinary teams: Practices and principles for navigating researcher positionality. Science & Technology Studies, 33(3), 57. https://doi.org/10.23987/sts.73060 Holbrook, J. B. (2013). What is interdisciplinary communication? Reflections on the very idea of disciplinary integration. Synthese (Dordrecht), 190(11), 1865–1879. https://doi.org/10.1007/ s11229-­012-­0179-­7 Holley, K.  A. (2017). Interdisciplinary curriculum and learning in higher education. In Oxford research encyclopedia of education. Oxford University Press. https://doi.org/10.1093/ acrefore/9780190264093.013.138 Klein, J. (1996). Crossing boundaries: Knowledge, disciplinarities, and interdisciplinarities. University of Virginia Press. Klein, J. (2005). Interdisciplinary teamwork: The dynamics of collaboration and integration. In S. Derry, C. Schunn, & M. Gernscacher (Eds.), Interdisciplinary collaboration: An emerging cognitive science (pp. 23–50). Taylor & Francis Group. Lattuca, L. (2001). Creating interdisciplinarity: Interdisciplinary research and teaching among college and university faculty. Vanderbilt University Press. Laursen, B., & O’Rourke, M. (2019). Thinking with Klein about integration. Issues in Interdisciplinary Studies, 37(2), 33–61. O’Boyle, A. (2018). Encounters with identity: Reflexivity and positioning in an interdisciplinary research project. International Journal of Research & Method in Education, 41(3), 353–366. https://doi.org/10.1080/1743727X.2017.1310835 Rhoten, D. (2004). Interdisciplinary research: Trend or transition. Items and Issues, 5(1–2), 6–11. Rhoten, D., & Pfirman, S. (2007). Women in interdisciplinary science: Exploring preferences and consequences. Research Policy, 36(1), 56–75. https://doi.org/10.1016/j.respol.2006.08.001 van Rijnsoever, F. J., & Hessels, L. K. (2011). Factors associated with disciplinary and interdisciplinary research collaboration. Research Policy, 40(3), 463–472. https://doi.org/10.1016/j. respol.2010.11.001 Wolf-Branigin, M. (2013). Using complexity theory for research and program evaluation. Oxford University Press.

Chapter 22

Using Personas to Leverage Students' Understanding of Human Performance Technology to Support Their Instructional Design Practice

Jill E. Stefaniak, Marisa E. Exter, and T. Logan Arrington

Abstract  Scholars recommend that Human Performance Technology be incorporated within instructional design coursework. HPT embraces a systems-thinking approach that relies heavily on analysis methods. However, many of our students struggle with this mindset. In this chapter we will describe how we used UX techniques (personas and journey maps) to understand a variety of students' journeys and struggles during an HPT class and identify areas that can be adapted and customized to meet students' varied needs. Recommendations for how this can be used in course design activities will be offered.

Keywords  Human performance technology · Instructional design · Personas · Instructional design pedagogy

J. E. Stefaniak (*)
University of Georgia, Athens, GA, USA

M. E. Exter
Purdue University, West Lafayette, IN, USA

T. L. Arrington
University of West Georgia, Carrollton, GA, USA


Introduction

Human performance technology (HPT) can be defined as "the study and ethical practice of improving productivity in organizations by designing and developing effective interventions that are results-oriented, comprehensive, and systemic" (Pershing, 2006, p. 6). HPT focuses on embracing a systems-thinking view of an organization to support performance improvement initiatives within organizations. By implementing strategies involved in HPT, instructional designers can engage in systemic instructional design by utilizing performance analysis models to understand systemic relationships (Stefaniak, 2020).

HPT has been an area of emphasis that scholars have noted should be incorporated within instructional design coursework (Dick & Wager, 1998; Foshay et al., 2014; Fox & Klein, 2002; Klein & Fox, 2004; Reiser, 2002; Sims & Koszalka, 2008). Scholarship exploring how instructional designers approach HPT activities such as needs assessments and program evaluation (Arrington et al., 2022; DeVaughn & Stefaniak, 2021; Giacumo & Asino, 2022; Moller et al., 2000; Stefaniak et al., 2018) has reported that instructional designers often face challenges with gaining access to information within an organization, identifying the types of data needed to inform their decisions, and understanding the bigger picture surrounding the project. Discourse in the field suggests that instructional designers would benefit from being taught strategies to navigate these challenges in their coursework.

It is also important to note that the majority of the literature and scholarship involving HPT does not address performance improvement in K-12 settings. This is not to say that HPT or its elements are not applied in these educational settings (Kang et al., 2020; Moore, 2004); such work is just not reported in the scholarship outlets in our field. Most of the studies that are published in HPT venues such as Performance Improvement Quarterly or Performance Improvement are focused on workplace learning. Due to the lack of representation of K-12 and other non-workplace education examples in the literature, we have found that students sometimes struggle to resonate with the concepts and strategies often presented in introductory performance improvement courses. Many of these struggles are rooted in students' lack of understanding or consideration of the systemic facets of their organization. Additionally, students often struggle with the change in focus from traditional instructional design courses, where they may have presumed that the problems they were addressing were instructional in nature.

Use of Personas and Journeys to Support Curricular Design

The overarching goal of this chapter is to share how faculty at three different institutions designed personas to leverage their students' understanding of HPT to support their instructional design practice. Emphasis will be placed on strategies we have employed in our courses to support instructional design students' abilities to


embrace a systemic approach to design using HPT strategies. We also describe how the use of personas can be a formative tool for faculty to modify and customize their instruction. Each member of our team had several years’ experience teaching human performance technology courses in higher education. Additionally, we had experience leading performance improvement initiatives in higher education, K-12, and healthcare settings. We drew upon our collective experiences leading and teaching HPT to develop personas to support our curricular designs. We used two UX techniques to explore students’ varied experiences through our courses  – personas and journey maps. Personas are fictitious representations of “typical” users in the context where a product or service is being designed (Baek et al., 2007). Descriptions were written in narrative form to convey the users’ needs and wants (Williams van Rooij, 2012). Personas help designers customize their products or services by extending beyond traditional descriptors of their audience’s demographics (Baaki et al., 2017). “A journey map is a visualization of the process that a person goes through in order to accomplish a goal” (Nielsen Norman Group, 2018). A journey map displays a user’s (or in our case, a learner’s) progress through an experience in a graphical form, illustrating “user’s goals, pain points, and emotions, as well as processes that the user interacts with and ideas for improving that process” (Yale University, n.d.). In our chapter, we discuss what we learned from this process and how we are considering modifying each of our courses based on this activity. Additionally, we offer recommendations for how personas and journey maps can be developed and used to support learners’ growth and self-efficacy throughout their experience of a course. We also discuss how HPT concepts can be woven into instructional design coursework and adapted to the needs of the variety of students we tend to see in our instructional design programs.

Our Process

Development of HPT Student Personas

The three of us collaboratively discussed activities and instructional experiences we included in our human performance technology classes, after which we crafted eight personas that represent instructional design students who typically enroll in a human performance technology class. We relied on our expertise in teaching human performance technology courses at three different universities as we brainstormed challenges and behaviors we have observed among instructional design students. Our goal was to create a number of personas to represent a range of student backgrounds, motivations, and challenges similar to what we might encounter in our courses.


We then used the personas to help guide our discussions as we worked to identify different examples and opportunities for students to see HPT applied in different contexts. This section outlines the process we used to develop the personas.

We used a whiteboard application called Miro that allowed our design team to collaborate synchronously online to draft characteristics that we have encountered while teaching HPT. This application allowed the three of us to enter information simultaneously during our brainstorming session. We could then review everyone's entries and discuss where there were similarities or differences that warranted further attention. In this initial phase of persona development, we tasked ourselves with writing characteristics that described some of the students we have taught, their aspirations and goals, and comments that students have shared with us about HPT at the beginning of the semester (see Fig. 22.1).

Traits we considered during our brainstorming included: personal demographics; current profession; motivation for obtaining a degree in the field of instructional design (e.g., "accidental instructional designers" promoted within a current instructional design position; K-12 teachers planning to transition to a new career); prior knowledge; and current ways of approaching instructional design (e.g., having a systemic understanding of their role within their current organization and organizational culture; a solution-oriented problem-solving approach, leading them to immediately begin developing a solution, which in turn leads them to take shortcuts or largely disregard the findings of analysis activities). We also included perceptions of how they might apply HPT in their current or future position (e.g., feelings of powerlessness as they believe they will not be able to instigate change within their organization; realization that they can gradually change perceptions of the role of instructional designer within their organization).

Figure 22.1 provides an overview of the various comments and characteristics we brainstormed. We reflected on comments that had been shared in course

Fig. 22.1  Initial brainstorming session for persona development


Fig. 22.2  Close up of a cluster from initial brainstorming session

evaluations, class discussions, and personal conversations we have had with students over the years teaching HPT. Figure 22.2 provides a sample of one of the clusters reviewed during our brainstorming session. There is no significance in the colors of the post-its used in Fig. 22.2. After our initial brainstorming meeting, we took some time to individually reflect on the statements. At the beginning of our next design meeting, we discussed whether any additional statements needed to be included in the list. Once we were in agreement, we began discussing the different characteristics and comments we had posted to the Miro board. During this phase, we removed duplicate statements. We asked each other for clarity if a member of our team did not understand a statement that had been posted. We reflected on examples in our classes where students presented different needs regarding HPT. These discussions ensured that the three of us understood and agreed with the statements that had been posted. We then began bundling characteristics that we thought would belong in a persona.


Fig. 22.3  Clustering of potential personas

This activity was completed online in situ, but we did not engage in conversation during the initial sort. Once we sorted the posts on the Miro board, we began discussing how the clusterings would represent a persona in an HPT class. Figure 22.3 demonstrates how the personas were clustered. We spent several meetings discussing the categories, thinking through them, and moving posts from one cluster to another based on our group discussions. By the end of our discussions, we arrived at six personas. For each, we've included a profile similar to Fig. 22.4:

• Dan (Software developer learning instructional design skills)
• Jamie (K-12 teacher who wants to stay in K-12)
• Sarah (K-12 teacher who wants to move into instructional design)
• Michele (K-12 instructional coach without formal instructional design training)
• Maria (Instructional designer working in a corporate setting)
• Bryan (Instructional designer working in higher education)

We assigned each persona a name with accompanying demographic information and provided a nickname for each persona as it would relate to their perceptions and goals in an HPT class. Appendix A provides details about each persona. It is important to note that the personas that were developed as a result of this exercise were created to assist our design team with developing empathy for our learners and to assist with making connections between HPT and their individual contexts (Baaki et al., 2017). Personas are not meant to be a complete reflection of one particular individual. To avoid overburdening the design team with too many personas, we worked to construct personas that would encompass various characteristics we have observed in our students.


Fig. 22.4  Jose’s profile

To avoid stereotyping our students, we presented our personas in narrative form to emphasize their personal goals and stories as they relate to HPT, similar to other instructional projects that have used personas (e.g., Baaki et al., 2017; Stefaniak, 2020).

Journey Mapping Our HPT Personas

To begin our process, one author first outlined the progression and schedule of his HPT course at the week and module level, based on his current course design (Table 22.1). In this course, students work through a performance systems analysis for an organization and a problem or opportunity of their choosing.


Table 22.1  Class process overview (week, module, and what students are doing)

Wk. 1 – Orientation/Start Here
• Getting oriented to the course
• Beginning to meet their peers through introductory discussions and reviewing the course survey (a survey that all students complete, which includes their backgrounds, work experiences, goals, and group work preferences)

Wks. 2–4 – Module 1: What is HPT
• Attend Module 1 course meeting (includes HPT overview and discussion about performance problems)
• Begin establishing groups if working together on the HPT project; if working in a group, establish a group contract
• Brainstorm or discuss ideas for projects
• Review HPT case study examples from Fundamentals of HPT (textbook) – story of an HPT scenario end-to-end (Assignment [A] 1)
• Identify a performance problem, with adult workers as the performers, occurring within a specific organization to serve as the context for the remaining assignments

Wks. 5–7 – Module 2: Planning to Analyze Performance Problems
• Attend Module 2 meeting (discusses data collection and our various purposes for collecting data [environmental, gap, and cause analysis])
• Review new HPT case study examples, focusing on the analysis phases (environmental, gap, and cause)
• Review information regarding data collection methods to support HPT projects
• Consult with the instructor (individually or in groups) on integrating feedback from Assignment 1 as well as plans for data collection (A2)
• Finalize and submit a data collection plan (A2), including the collection instruments

Wks. 8–13 – Module 3: Data Collection for HPT
• Attend Module 3 meeting (discusses data analysis techniques and how to be prepared to solicit and push for organizations/individuals to give you data)
• Report out on where you're at in your HPT project to your classmates via video (Problem/Org and Data Collection Plan)
• Begin collecting data
• Continue collecting data
• Review reports from other groups or individuals as they report out on their HPT projects
• Continue collecting data

Wk. 14 – Module 4: Identifying Causes to Performance Problems
• Attend Module 4 meeting (discusses how to take the analyzed data and convert it into an analysis report)
• Finalize data collection processes
• Work on Analysis Report (A3), which includes environmental, gap, and cause analyses
• Submit A3

Wks. 15–17 (Final) – Module 5: Selecting Interventions
• Attend Module 5 meeting (discusses how to appropriately select interventions)
• Begin working on Intervention Recommendations (A4)
• Review examples of non-instructional interventions
• Report out on where you're at in your HPT project to your classmates via video (A1–A4)
• Continue working on Intervention Recommendations (A4)
• Review reports from other groups or individuals as they report out on their HPT projects
• Submit a completed HPT report (A1–A4) document
• If working as a group, complete the Project Contribution Report [anonymous peer review of group members' performance]

We utilized this table to discuss all three of our approaches within our HPT courses. After discussion, we revised some aspects of this table to better capture elements of all our courses. We then created a journey map to "walk" the personas through the activities in the semester. We selected events from Table 22.1 that we believed had the most impact on students' systems thinking ability and feelings about HPT and the course. One author had engaged in a similar activity in the past, using a whiteboard and in-person role-play, during which each of several co-designers took the part of the personas. In that case, a graph-like diagram was created to track how learners experienced the sequence of course activities (Fig. 22.5). Though it looked like a graph, a very qualitative process was used to determine learners' comfort levels with various activities.

For our HPT course, we decided to create two visualizations to represent students' emotional state (from very anxious to very comfortable) and apparent competence in systems thinking ability as it applies to HPT (from confused/inadequate to strong). We wanted to acknowledge the emotional state to help us make an empathetic connection as we made modifications to HPT examples and activities that would be presented in our classes (Kouprie & Visser, 2009; Williams van Rooij, 2012). However, we needed to find a way to do this together online. After trying out various tools and templates, we decided to use Google Sheets as our tool (see Fig. 22.6). First, we mapped out the process (see rows 1 and 2 in Fig. 22.6). Underneath this, we created two sets of "data" to represent the comfort level of each persona (rows 3–12) and the competence in systems thinking (rows 14–22). At each stage, we each spoke for the personas we had primarily created and said what level that persona was at and why. If one of us disagreed or had a different idea, we discussed until we came to a consensus. We wrote notes about our thoughts (see row 23). Note that although we filled in numerical values, they were really used to induce the tool to show different relative levels of comfort or apparent systems thinking ability – we did not intend this to be a true quantitative representation.


Fig. 22.5  Journey map exercise on a physical whiteboard. (From one author’s prior course design experience)

Fig. 22.6  Creating the journey map in spreadsheet form


Fig. 22.7  Journey map of comfort level. (From very anxious at the bottom to very comfortable at the top, with those who felt neutral at the midpoint)

Fig. 22.8  Journey map of systems thinking in HPT competence. (From confused/inadequate at the bottom to very strong at the top) Note. During our exercise, we placed people who could mechanically “fill in the blanks” in assignments without clear evidence of true understanding at the midpoint

Finally, we created graphs in Google Spreadsheet to display the journey of the personas throughout the semester (Figs. 22.7 and 22.8).
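For readers who prefer to build this kind of journey map outside of a spreadsheet, the sketch below shows one possible way to reproduce the same structure in code. It is only an illustration of the general approach described above, not the authors' actual workbook: the course-event labels and all rating values are hypothetical, and the 1–5 scale simply stands in for the relative, non-quantitative levels of comfort discussed earlier.

```python
# A minimal sketch (hypothetical data, not the authors' workbook): plotting
# persona "comfort level" ratings across selected course events, mirroring the
# spreadsheet rows and line graphs described above.
import matplotlib.pyplot as plt

# Course events loosely drawn from the semester schedule (labels are illustrative)
events = ["Orientation", "Module 1 meeting", "Pick problem (A1)",
          "Data collection plan (A2)", "Collect data",
          "Analysis report (A3)", "Interventions (A4)"]

# Relative comfort levels per persona (1 = very anxious, 5 = very comfortable).
# All values are invented for demonstration only.
comfort = {
    "Dan":     [3, 2, 2, 3, 3, 4, 4],
    "Jamie":   [4, 3, 2, 2, 3, 3, 4],
    "Sarah":   [3, 3, 3, 2, 2, 3, 4],
    "Michele": [2, 2, 3, 3, 3, 3, 3],
    "Maria":   [4, 4, 3, 3, 4, 4, 5],
    "Bryan":   [3, 3, 3, 3, 3, 4, 4],
}

fig, ax = plt.subplots(figsize=(10, 4))
for persona, levels in comfort.items():
    # One line per persona, one point per course event
    ax.plot(range(len(events)), levels, marker="o", label=persona)

ax.set_xticks(range(len(events)))
ax.set_xticklabels(events, rotation=30, ha="right")
ax.set_yticks([1, 2, 3, 4, 5])
ax.set_yticklabels(["very anxious", "", "neutral", "", "very comfortable"])
ax.set_title("Journey map of comfort level (hypothetical data)")
ax.legend(loc="upper left", fontsize="small")
fig.tight_layout()
plt.show()
```

The same structure (one row of ratings per persona, one column per course event) could be duplicated for the systems-thinking competence ratings to mirror the second visualization.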

Implications for Designing Instruction

The development of personas can be beneficial to designing instruction because it helps instructors and instructional designers pause and reflect upon the extent to which they are making empathetic connections with their learning audience (Williams van Rooij, 2012).


Constructing personas can help individuals responsible for designing a course put themselves in their learners' shoes and think about how they will engage with the content. Persona construction can be a very useful team design exercise to help facilitate discussions across courses and to think through students' possible trajectories throughout an instructional program. As trends emerge and change in the field of instructional design, instructors can use personas as they revisit course content to update examples and activities accordingly. This is particularly helpful for instructors who are teaching students representing a variety of disciplines in instructional design (i.e., business and industry, government, healthcare, higher education, and K-12).

This activity was helpful to our design team as we reflected on ways in which we could diversify HPT examples in our respective HPT courses. By talking through the personas, we could share with one another different scenarios where students may have struggled or flourished with incorporating HPT strategies in their design roles. It provided a means for us to reflect on and question how much of a connection we were making with our students to help them contextualize HPT in their own lives.

References

Arrington, T. L., Moore, A. L., Steele, K., & Klein, J. D. (2022). The value of human performance improvement in instructional design and technology. In J. E. Stefaniak & R. M. Reese (Eds.), The instructional design trainer's guide: Authentic practices and considerations for mentoring ID and ed tech professionals (pp. 161–169). Routledge.
Baaki, J., Maddrell, J., & Stauffer, E. (2017). Designing authentic and engaging personas for open education resources designers. International Journal of Designs for Learning, 8(2), 110–122.
Baek, E. O., Cagiltay, K., Boling, E., & Frick, T. (2007). User-centered design and development. In J. M. Spector, M. D. Merrill, J. J. van Merrienboer, & M. F. Driscoll (Eds.), Handbook of research on educational communications and technology (3rd ed., pp. 659–670). Routledge.
DeVaughn, P., & Stefaniak, J. (2021). An exploration of the challenges instructional designers encounter while conducting evaluations. Performance Improvement Quarterly, 33(4), 443–470. https://doi.org/10.1007/s11423-020-09823-z
Dick, W., & Wager, W. (1998). Preparing performance technologists: The role of a university. In P. J. Dean & D. E. Ripley (Eds.), Performance improvement interventions: Performance technologies in the workplace (pp. 239–251). The International Society for Performance Improvement.
Foshay, W. R., Villachica, S. W., & Stepich, D. A. (2014). Cousins but not twins: Instructional design and human performance technology in the workplace. In J. M. Spector, M. D. Merrill, J. Elen, & M. J. Bishop (Eds.), Handbook of research on educational communications and technology (4th ed., pp. 39–49). Springer.
Fox, E. J., & Klein, J. D. (2002). What should instructional technologists know about human performance technology? The 25th Annual Proceedings of Selected Research and Development Papers Presented at the National Convention of the Association of Educational Communications and Technology (pp. 233–240).
Giacumo, L., & Asino, T. (2022). Preparing instructional designers to apply human performance technology in global context. In J. E. Stefaniak & R. M. Reese (Eds.), The instructional design trainer's guide: Authentic practices and considerations for mentoring ID and ed tech professionals (pp. 170–179). Routledge.


Kang, S. P., Huh, Y., & Kim, J. K. (2020). Application of human performance technology (HPT) standards for school teachers in Korea. Asia-Pacific Collaborative Education Journal, 16(1), 1–15.
Klein, J. D., & Fox, E. J. (2004). Performance improvement competencies for instructional technologists. TechTrends, 48(2), 22–25. https://doi.org/10.1007/BF02762539
Kouprie, M., & Visser, F. S. (2009). A framework for empathy in design: Stepping into and out of the user's life. Journal of Engineering Design, 20(5), 437–448. https://doi.org/10.1080/09544820902875033
Moller, L., Benscoter, B., & Rohrer-Murphy, L. (2000). Utilizing performance technology to improve evaluative practices of instructional designers. Performance Improvement Quarterly, 13(1), 84–95.
Moore, J. (2004). Designing and implementing performance technology for teachers. Canadian Journal of Learning and Technology/La revue canadienne de l'apprentissage et de la technologie, 30(2).
Nielsen Norman Group. (2018). Journey mapping 101. https://www.nngroup.com/articles/journey-mapping-101/
Pershing, J. A. (2006). Human performance technology fundamentals. In J. A. Pershing (Ed.), Handbook of human performance technology (3rd ed., pp. 5–34). Pfeiffer.
Reiser, R. A. (2002). What field did you say you were in? Defining and naming our field. In R. Reiser & J. V. Dempsey (Eds.), Trends and issues in instructional design and technology (pp. 5–15). Merrill/Prentice-Hall.
Sims, R. C., & Koszalka, T. A. (2008). Competencies for the new-age instructional designer. In J. M. Spector (Ed.), Handbook of research on educational communications and technology (pp. 569–575). Lawrence Erlbaum Associates.
Stefaniak, J. (2020). Needs assessment for learning and performance: Theory, process, and practice. Routledge.
Stefaniak, J., Baaki, J., Hoard, B., & Stapleton, L. (2018). The influence of perceived constraints during needs assessment on design conjecture. Journal of Computing in Higher Education, 30(1), 55–71. https://doi.org/10.1007/s12528-018-9173-5
Williams van Rooij, S. W. (2012). Research-based personas: Teaching empathy in professional education. Journal of Effective Teaching, 12(3), 77–86.
Yale University. (n.d.). Journey maps and personas. https://usability.yale.edu/understanding-your-user/journey-maps-personas

Chapter 23

Using Photo-Journals to Formatively Evaluate Elementary Student Robotic Construction

Anna V. Blake, Lauren Harter, and Jason McKenna

Abstract  Educational robotics has become increasingly common in elementary classrooms and provides many benefits to young students, including improving STEM learning, positive perceptions of STEM topics, and critical twenty-first century skills such as collaboration and troubleshooting. The construction of robots themselves requires students to interpret instructions as well as apply spatial reasoning skills, creativity, and problem solving. Yet it can often be challenging for teachers to determine how students are progressing in these skills, as robotics is a formative, project-based learning activity. In this chapter, a photo-journal and evaluation rubric is proposed as a method for elementary teachers to monitor student learning of robotic construction skills in a formative manner. A classroom case study is used as an example for this formative method, with results and future research discussed.

Keywords  Educational robotics · Hands-on · Spatial learning

A. V. Blake
Elizabeth Forward School District, Elizabeth, PA, USA

L. Harter · J. McKenna (*)
VEX Robotics, Pittsburgh, PA, USA

Introduction

The focus on science, technology, engineering, and math (STEM) education has increased in the past decade, as the United States falls further behind other countries in test scores. The National Science Foundation (2015) put forth that it is increasingly vital for students to have access to high-quality STEM education, as students must be prepared for a technology-intensive global economy.


Federal strategies have been produced for STEM education, and funding for STEM learning is at an all-time high. However, as critical as STEM education is, research has shown that student attitudes toward STEM subjects become more negative as they age, beginning as early as fourth grade (Unfried et al., 2014). This presents a tension between the increased pressure to raise student achievement in STEM subjects and the increasingly negative student perceptions of STEM subjects as they get older.

Introducing educational robotics in an interdisciplinary STEM curriculum – especially to young students – has been shown to give students early success in STEM and to improve student attitudes toward STEM subjects (McClure et al., 2017; Ching et al., 2019). Educational robotics combines hands-on construction, programming, and interdisciplinary STEM lessons with project-based learning that engages students in teamwork, problem-solving, and critical thinking. Researchers have found many benefits in incorporating robotics into school curriculum, including the development and application of STEM knowledge, increased computational thinking and problem-solving skills, and improved social and teamwork skills (Altin & Pedaste, 2013; Bers et al., 2014; Kandlhofer & Steinbauer, 2015; Taylor, 2016).

While the benefits of educational robotics have been well documented, there is still a gap remaining on how to evaluate students with this type of learning. Once again, we see a tension between this fun and engaging robotics curriculum improving student perceptions of STEM, but also needing to find a way to monitor progress that does not diminish the gains in creating positive attitudes. Traditional summative assessments – such as quizzes and tests – could reduce the benefits of educational robotics, while also being poorly aligned to the types of skills students are practicing. Traditional assessments often demotivate students and can stop the learning process (McKenna, 2023), both of which are detrimental effects for student engagement with robotics. Instead, we should investigate ways of designing a formative evaluation of educational robotics curriculum that can fully assess the skills students are learning in a way that doesn't negate the authenticity of the experience and early feelings of success that are so crucial for elementary students. Unlike traditional high-stakes summative assessments (quizzes and tests), formative assessment is a no- or low-stakes form of evaluation where students are not graded on their performance. Formative evaluation offers teachers a way to monitor student learning while not impeding student learning or perceptions.

In this chapter, we propose a photo-journal approach to evaluating student learning with educational robotics over time. This method was applied by a classroom teacher (also the first author) to third, fourth, and fifth graders at a Pittsburgh-area elementary school. The teacher used a grant to obtain a classroom set of VEX GO robotics kits and to teach a year-long robotics curriculum to students. To track student progress over time, she created a photo-journal online for each student to record the results of their robot builds. This approach allows students to continue to focus on the robotic STEM labs and enjoy the activities while providing the teacher with a way to evaluate their progress. The photo-journal provides one part of the solution. The second is a mastery-based rubric that can be used to evaluate student robot builds on aspects such as construction, spatial reasoning, problem solving, etc.


The rubric provides a starting place for future iterations as teachers begin to apply this in their classrooms. The primary goals of this chapter are to:
• Describe the photo-journal through a classroom case study
• Discuss the formative evaluation rubric
• Address future applications of this method and continued discussion on formative evaluation of educational robotics

Methods

A teacher in Southwest Pennsylvania decided to add a photo-journal approach to capture student learning with educational robots over the course of the school year. This school is in a rural area where most students go into vocational or 2-year degree programs rather than 4-year college programs. Students in this district are predominantly Caucasian, with about a third of students considered economically disadvantaged (Future Ready PA Index, 2022). A grant from Remake Learning, a nonprofit in Southwestern Pennsylvania, was awarded to the teacher and used to purchase a classroom bundle of VEX GO robotics. Having obtained 10 VEX GO robotics kits, the teacher used the VEX GO STEM labs during her photo-journal assessment approach throughout the year. The VEX GO robotic kits were chosen because the plastic construction components were designed specifically for this age group, and the robot kits were also paired with extensive curriculum options. The STEM labs were developed with educational standards to fit the existing curriculum needs and provide students with a structure for engaging in collaborative, project-based learning. Less formal activities also give students more open-ended creative options for applying their skills. The teacher is also provided with instruction and professional development on how to teach the curriculum and facilitate optimal student learning. This teacher selected a combination of STEM labs and activities that would increase in difficulty over the course of the year and appropriately scaffold student ability.

Working with students from third through fifth grade, the teacher used a photo-journal and a rubric to capture STEM successes in her classroom. This allowed her to track student progress over time and have students record the results of their builds and share their ideas on each STEM lesson. Using the Canvas learning management system, students had access to their journals in an online platform that was easy to use and was previously established in their general education classes. Through the teacher-directed online photo-journal, students filled in pictures of their STEM builds during each lesson. While the journals were online, students could also save their photo portfolio to their iPad. With this saving feature, the photo-journal could be used in the future as a portfolio from year to year. This photo-journal approach allowed students to continue to focus on the robotic STEM labs and enjoy the activities while providing the teacher with a way to evaluate progress.


Rubrics

Using the photo-journal approach, the teacher could easily evaluate 75–100 students per grade level in their emerging STEM skills. The teacher used both the informal and mastery rubrics during the photo-journal approach. The informal rubric, shown in Fig. 23.1, allowed the teacher to quickly evaluate students who needed scaffolding in STEM labs. A 3-point scale helps to quickly assess how students are performing in a range of areas. Level 1 is described by the statement, "I am beginning to understand but I need help." Level 2 is, "I understand and can do this by myself," and level 3 is, "I understand and I can teach someone else." This rubric is the first step to assessing students. The informal rubric was specifically helpful for areas such as using a 2D image to construct a 3D object and communicating and developing problem-solving skills with partners. For example, in Fig. 23.1, the student on the right is at a 2 on the scale whereas the student on the left is at a 1. Identifying informally how students are progressing in a STEM lab allows the teacher to then scaffold learning for the team so that both students can reach a 3 in understanding. By using the informal rubric, the teacher identified students who needed more scaffolding and differentiated instruction during the regular classroom lesson. This rubric is a stepping stone to the mastery rubric.

The mastery rubric is a more in-depth rubric that evaluates student robot builds on aspects such as construction, spatial reasoning, problem solving, and communication. A 4-point rubric (Fig. 23.2) was developed to provide the teacher with more detail on how to evaluate student skill levels. The first category is construction, as the physical creation of the robot build for the lab or activity is a foundational skill. This construction category evaluates how well students followed build instructions or plans to create a successful robot build. The spatial reasoning category evaluates how well students solve problems related to spatial reasoning skills. For example, students may try to attach a rotating beam only to find that it hits the robot body when rotated. Spatial reasoning skills help students determine if the orientation of

Fig. 23.1  The informal rubric with a photo of two students being evaluated as they worked through a lesson


Fig. 23.2  Mastery rubric and photo-journal example from the fun frogs STEM lab

their connection is correct or how the pieces could be reconstructed to solve the problem. The problem-solving category evaluates how active students are in overcoming obstacles and helping others solve problems. Lastly, the communication category evaluates how students engage with their team, critical to the teamwork component of the STEM labs and activities. Overall, the mastery rubric looks more closely at how the team worked together and identifies clearly how each student advanced throughout the year. For example, Fig. 23.2 shows a team working on the Fun Frogs STEM Lab using the VEX GO classroom kit. Using pictures from the STEM lesson, the teacher evaluated each student’s ability to show construction, spatial reasoning, problem solving, and communication skills. Identifying the gaps in skills is crucial for the teacher to adapt the next STEM lab and assess student learning for the curriculum for the year.
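To make the record-keeping side of this approach concrete, the sketch below illustrates one way photo-journal entries and their mastery-rubric scores could be captured and summarized digitally. This is a hypothetical illustration of the idea, not part of the VEX GO materials or the teacher's actual Canvas setup; the field names, category labels, and example scores are invented for demonstration.

```python
# A minimal sketch (hypothetical, not part of the VEX GO curriculum or the
# teacher's Canvas setup): recording mastery-rubric scores for photo-journal
# entries and summarizing a student's progress across STEM labs.
from dataclasses import dataclass, field
from statistics import mean

# The four mastery-rubric categories described above, each scored 1-4
CATEGORIES = ("construction", "spatial_reasoning", "problem_solving", "communication")

@dataclass
class JournalEntry:
    student: str
    stem_lab: str                                # e.g., "Fun Frogs"
    photo_file: str                              # filename or link to the build photo
    scores: dict = field(default_factory=dict)   # category -> score from 1 to 4

    def average(self) -> float:
        # Mean score across the four rubric categories (missing categories count as 0)
        return mean(self.scores.get(c, 0) for c in CATEGORIES)

def progress(entries: list[JournalEntry], student: str) -> list[tuple[str, float]]:
    """Average mastery score per STEM lab, in the order entries were recorded."""
    return [(e.stem_lab, round(e.average(), 2))
            for e in entries if e.student == student]

# Example usage with invented data
journal = [
    JournalEntry("Student A", "Fun Frogs", "fun_frogs.jpg",
                 {"construction": 3, "spatial_reasoning": 2,
                  "problem_solving": 3, "communication": 3}),
    JournalEntry("Student A", "Lunar Rover (free build)", "rover.jpg",
                 {"construction": 3, "spatial_reasoning": 3,
                  "problem_solving": 4, "communication": 3}),
]
print(progress(journal, "Student A"))
# [('Fun Frogs', 2.75), ('Lunar Rover (free build)', 3.25)]
```

A structure like this would also support the kind of whole-class, over-time summaries suggested later in the chapter, since per-lab averages can be compared across students and across the year.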

Build Types

When the word "build" is used in robotics, it carries multiple meanings, and there are multiple types of builds. Some builds are mobile and can be coded to move or be driven using a Controller. Some builds are not powered by a robot brain or battery, but by different mechanisms like a pulley system, magnets, or rubber bands. Some builds may not move at all, but are used to explore concepts like the life cycle of a frog, or how to conceptualize fractions using different-sized pieces.


In addition to the multiple different types of builds, there are also different ways that builds can be constructed. Some builds are constructed using build instructions. These are step-by-step instructions that walk a student through each step of building a particular model. Build instructions are one of the highest forms of scaffolding in building. This is valuable for a student who has never built anything of this nature before, as they are guided through which pieces are used in certain ways and how to physically attach pieces together. However, there are still certain skills that are needed even to successfully follow build instructions. One of these skills is spatial reasoning, in the form of following and adhering to a particular orientation of certain pieces, as seen in Fig. 23.3. Another skill is being able to follow steps in sequential order, and not jump around the build instruction steps.

The next scaffolded progression after build instructions is build modifications. Once a model is built, small modifications can be made to the design. This allows for more creativity and autonomy, without the un-scaffolded nature of free building. Modifications can range from small, such as swapping out types of wheels, to larger, such as adding a passive manipulator like a scoop or plow to gather and push objects.

The last type of build in the progression is the free build. Free building does not have any build instructions or steps to follow; the build is designed and then brought to life by the user. Since there are no discrete steps to follow, some students may struggle to know how pieces fit together naturally, which is why it is a less scaffolded approach compared to build instructions or modifying an existing build.

Fig. 23.3  Step 15 from the Codebase build instructions. This particular step is showing how the orientation of the chassis, shaft, and wheel should be aligned in order to properly attach the wheel in the correct place


Examples: Evaluating Robotic Builds

In evaluating the photo-journals, some differences emerged between the use of STEM Labs that included step-by-step build instructions and those where students created free builds. For example, during the VEX GO activity "The Wheel and Axle Lunar Rover," students did not have build instructions and so had to tinker and iterate with the free build. Creating a vehicle to move on the moon gave students a challenge that was an extension of a STEM Lab challenge. Iteration was crucial in this activity: test, deconstruct, and then test again. Working on an extension of skills, such as problem solving and construction in a more authentic setting, allowed students' thinking to become visible in the photo-journals. Similarly, in most of the STEM Labs, students who have finished building their task for the day have the option to choose a coordinating activity from the "ChoiceBoard".

During free building, the teacher used the informal rubric to help identify students who could spatially reason and problem solve without instructions. Free builds seemed to be more challenging for some students, especially those who did not meet the mastery scale in a previous STEM lab. Similarly, photo-journals allowed the identification of those students who had difficulty with spatial reasoning. Using both free builds and STEM lab instructions for builds allowed students to ebb and flow through the four skills: construction, spatial reasoning, problem solving, and communication. In the end, the option of challenging students strengthened the teacher's ability to find authenticity in the rubrics to help students increase their STEM skills as the year progressed (Fig. 23.4).

Fig. 23.4  Wheel and Axle Lunar Rover free build. In this example, students are building a lunar rover without instructions


Evaluation Over Time

At the beginning of the year, the teacher planned the VEX curriculum to scaffold student skills over time with a sequence of STEM labs and activities. By completing different STEM labs focused on spatial reasoning and communicating as a team, students gained fluency, supported by the continual use of the rubrics. At the end of the year, the class competed in a VEX GO competition by constructing a VEX GO mobile robot, called a Code Base, and then using game strategy to overcome different challenges for points. By using the informal and mastery rubrics with the photo-journal approach, the teacher tracked her students' progress, allowing her to identify each student's needs in differentiating spatial reasoning or developing communication skills in order to compete in the final competition.

The VEX GO robotics competition brought together the insight from the informal rubric and all four components of the mastery rubric. In order to be ready to compete in the VEX GO competition, the teacher used the informal and mastery rubrics during STEM Labs such as Fun Frogs to see how students were continuously progressing throughout the year. The final VEX GO competition replicated the authentic problem-solving skills needed for future twenty-first century jobs. With continued support using the informal and mastery-level rubrics, students felt confident in their ability to compete in the competition. The assessment became a tool for the teacher to give her students that experience of skill success (Fig. 23.5).

Fig. 23.5  Scaffolded learning from beginning to end of year. In this figure, the skills progress from building a 3D object at the start of the year to competing in a 3D object competition at the end, using construction and spatial learning skills and continued problem solving in a timed arena


Conclusion

There is a widespread push for STEM instruction in schools across America, yet little focus on formative evaluation methods to help teachers see and evaluate what students are learning and support their decision-making on instruction. Educational robotics allows students to think outside the box and gives them an authentic STEM learning experience that improves their attitudes toward STEM subjects. Yet teachers need a way to monitor student progress in a formative approach that does not diminish the benefits of educational robotics but provides guidance and support to student learning. The need for STEM solutions will only increase, so it is also critical for teachers to share solutions for formative learning and evaluation from real classroom experiences. Our hope is that this photo-journal and rubric method will allow other teachers to create similar formative evaluation approaches for their students. The ability to evaluate both the STEM labs with construction directions and the free builds categorically helps teachers support students learning both STEM and twenty-first century skills. Formative assessment allows learning to progress uninterrupted and doesn't dampen student interest or positive attitudes. With this in mind, we hope STEM robotics education continues to progress with methods for formative evaluation.

In this chapter, a single teacher's experiences applying a photo-journal and rubric evaluation approach for robotics education were outlined with examples. For these third through fifth grade students, the photo-journals revealed student progress from the beginning of the year, as they began creating basic robotic builds, to the end of the year, when they were constructing complex competition builds. An informal rubric helped the teacher quickly evaluate how students were doing with their STEM labs and activities in real time and quickly intervene with students who were struggling. The teacher was also able to use the photo-journals with the mastery rubric to categorically evaluate students on key metrics: construction, spatial reasoning, problem solving, and communication. The mastery rubric was beneficial for scaffolding individual students on individual skills.

Since this is only one example of how the photo-journal approach can be used in the classroom, future research should investigate how this method is used and adapted by other educators. Each teaching and learning context is unique. As more teachers adopt this method and report findings, this formative evaluation approach can be refined further. Additionally, future research could focus on quantifying learning via the mastery rubric for an entire class over time. It would also be beneficial to investigate how the informal rubric might support student self-evaluation in real time during the robotic activities. As this method for formative evaluation evolves, there are many avenues for future research on related topics.


Acknowledgements  We would like to thank the school administration, teachers, and parents for their contributions to this educational robotics initiative. We would also like to thank the VEX educational team for their support and collaboration. Finally, this project was only possible through the grant funding from Remake Learning in Pittsburgh, which allowed for the purchase of the VEX Robotics kits.

References

Altin, H., & Pedaste, M. (2013). Learning approaches to applying robotics in science education. Journal of Baltic Science Education, 12(3), 365–378.
Bers, M. U., Flannery, L., Kazakoff, E. R., & Sullivan, A. (2014). Computational thinking and tinkering: Exploration of an early childhood robotics curriculum. Computers & Education, 72, 145–157. https://doi.org/10.1016/j.compedu.2013.10.020
Ching, Y. H., Yang, D., Wang, S., Baek, Y., Swanson, S., & Chittoori, B. (2019). Elementary school student development of STEM attitudes and perceived learning in a STEM integrated robotics curriculum. TechTrends, 63(5), 590–601. https://doi.org/10.1007/s11528-019-00388-0
Future Ready PA Index. (2022). https://futurereadypa.org/School/FastFacts?id=232084136004214230012131204081228254105093100117#
Kandlhofer, M., & Steinbauer, G. (2015). Evaluating the impact of educational robotics on pupils' technical- and social-skills and science related attitudes. Robotics and Autonomous Systems, 75, 679–685. https://doi.org/10.1016/j.robot.2015.09.007
McClure, E. R., Guernsey, L., Clements, D. H., Bales, S. N., Nichols, J., Kendall-Taylor, N., & Levine, M. H. (2017). STEM starts early: Grounding science, technology, engineering, and math education in early childhood. Joan Ganz Cooney Center at Sesame Workshop. http://joanganzcooneycenter.org/publication/stem-starts-early/
McKenna, J. (2023). https://www.amazon.com/What-STEM-Your-Classroom-Collaboration/dp/1954631456
Taylor, K. (2016). Collaborative robotics, more than just working in groups: Effects of student collaboration on learning motivation, collaborative problem solving, and science process skills in robotic activities [Doctoral dissertation]. https://scholarworks.boisestate.edu/cgi/viewcontent.cgi?article=2179&context=td
Unfried, A., Faber, M., & Wiebe, E. (2014). Gender and student attitudes toward science, technology, engineering, and mathematics (pp. 1–26). American Educational Research Association. https://www.researchgate.net/publication/261387698

Chapter 24

What Is Is Not What Has to Be: The Five Spaces Framework as a Lens for (Re)design in Education

Melissa Warr, Kevin Close, and Punya Mishra

Abstract  Design is everywhere. Recognizing how everything in education is designed, including systems and cultures, increases our agency to make changes to those designs. In this chapter, we introduce the five spaces framework, which provides an analytical tool for understanding the relationships among designed entities, shifting perspectives and offering new possibilities for (re)design. To illustrate the framework, we analyze three technologies in education: the teacher's desk, the PISA test, and learning management systems.

Keywords  Design · Systems thinking · Educational systems

M. Warr (*)
New Mexico State University, Las Cruces, NM, USA

K. Close
Spencer Foundation, Chicago, IL, USA

P. Mishra
Arizona State University, Tempe, AZ, USA

Lenses – both physical lenses, which might amplify or color our vision, and metaphorical lenses, shaped by our beliefs and perspectives – alter how we see and interact with the world. The lens we use reveals some aspects of a situation and hides others, "suggest[ing] a different set of practices and solutions" (Ancona et al., 2001, p. 645). In this chapter, we describe how a lens that highlights the artificial nature of education can enable innovative approaches to redesigning education.

The lens we apply here reveals that most everything around us is made up: created, whether intentionally or unintentionally, by other humans. This includes things that we often take to be natural, such as the foods we eat or animals we keep as pets.
As it turns out, most of the vegetables we eat and the pets we love to spend time with have been "designed" by artificial selection over years, decades, and even centuries. From this perspective, an apple (or an Aussie Doodle, for that matter) is "designed" as much as a pencil or a college application. To be clear, this does not mean that there are no natural kinds, such as oceans, trees, or galaxies, out there in the world. But increasingly, we humans have managed to insulate ourselves from the natural world and often are engaging almost entirely with the world of the artificial.

Recognizing that we live, for the most part, in an artificial, human-created world can change how we are in the world, how we perceive it, interact with it, and, more importantly, how we can change it. For Herb Simon (1969), this artificial world calls for a "science of the artificial," which is both a recognition of the designed nature of existence and a call to create a new form of knowledge, distinct from the natural sciences and the humanities. It points to the fact that there is nothing essential about much of what and how we interact with the artificial world. This provides us with agency to change the world, since there is nothing inherently "natural" about these artifacts or processes or systems. One could argue that it also provides us with a moral imperative to do so, because we know that much of the world around us is unfair, often disadvantaging and marginalizing huge swaths of people and communities.

This is where the sciences of the artificial meet the idea of design. Since design is "concerned not with how things are but with how they might be" (Simon, 1969, p. 111), designers are adept at seeing what could or "might" be; they recognize that what is is not what has to be.

Included in this artificial world is education. Almost every aspect of what makes up today's educational system – classes, schools, credit hours, universities, degrees, even the very idea of receiving an "education" – has been invented by humans. The current design of education does not work for many, particularly the groups that have been historically marginalized. If schools are not fun, if they do not support play and creativity, it is because they were designed to be this way. Because these are creations of humans, they can be reimagined and redesigned for better outcomes. Although changing the educational system might be incredibly complex, it is worth recognizing that it is designed and so can be re-designed.

In our work, we have found that expanding what we see as artificial, particularly the artificial nature of education and schooling, can enable powerful change. It is enabling in two ways. First, it allows us to interrogate everything around us, not taking it as a given, but rather as something that was created and thus can be re-created, re-imagined, and re-designed. Second, it provides a response to those who resist change by making an essentialist argument – "this is just how things are." Acknowledging the artificiality of the system suggests that this is how things may be, but they don't have to be this way.

Another important aspect of seeing the world as artificial is expanding what we mean by the "world." For too long we (and the field of design) have conceived of the designed world as constituted of physical artifacts and other technological tools. Although these are important, we argue that there are many intangible aspects to the designed world.
They may include processes (such as the process of registering for school), systems (such as the K-20 educational system), or even culture (such as the culture of high-school football). Although design in some spheres (such as systems


Fig. 24.1  One representation of the five spaces for design in education

and culture) might be more complex than others, applying a wide-angled design lens can increase agency, empowering change makers. In order to do so, we need a frame, a way of categorizing or classifying the different kinds of “designed things” that are out there in the world. We have created a framework that supports applying this type of design lens to education. The Five Spaces for Design in Education framework presents design as occurring across five interactive spaces: artifacts, processes, experiences, systems, and culture (see Fig. 24.1 and Table 24.1). The framework provides an analytical tool for understanding the relationships among designed entities, shifting perspectives, and offering new possibilities for (re)design. In this chapter, we will use the five spaces framework to analyze three technologies in education: the teacher’s desk, PISA test, and learning management systems (LMS).

Case 1: The Teacher's Desk

We start by considering something that seems like a given in education – the physical elements of a classroom, specifically the teacher's desk (an artifact).1 We illustrate how physical 'designed' elements of schooling like the desk intertwine with and reflect the processes, experiences, systems, and cultures of schooling. We do this to make visible the relationships among designed entities, offering new possibilities for thinking about (re)design.

As an artifact, a teacher's desk is a flat workspace that can hold papers, a computer, writing utensils, etc. It often also stores things, presumably the things that teachers need to do their work.

1 Our thinking about the role of the teacher desk in educational processes, experiences, and systems originated from a blog post written by Shawn Loescher, which can be found here: https://sloescher.com/onleskine/f/on-desks


Table 24.1  Descriptions of the five spaces for design in education

Artifacts – Definition: Stable objects that can be perceived through the senses. Examples: Curricular materials, tools, software, manipulatives, videos.
Processes – Definition: A procedure or directions that can be used outside of the context within which it was created to achieve a goal. Examples: Lesson plans, curricula, schedules.
Experiences – Definition: A piece of time with associated sights, sounds, feelings, and thoughts. Examples: Activities, celebrations (graduation), learning communities.
Systems – Definition: An organized and purposeful structure of interrelated and interdependent elements. Examples: Registration, certification system, degree program, evaluation systems.
Culture – Definition: A pattern of shared basic assumptions that allows groups to perceive and interpret the world in similar ways, develop and communicate meaning, and transmit values to new group members. Examples: Perceptions of technology, schools, or education broadly; classroom culture; school culture.

These desks usually support a workspace for a single person, with a leg barrier on one side that partitions off the individual workspace. The design suggests that a teacher’s main job at the desk is to independently work with papers, books, office supplies, and – more recently – computers.

The placement of the desk in the room affects the potential processes and experiences of the classroom. For example, the teacher’s desk supports certain processes: gathering papers, grading, reviewing curricular materials, and planning lessons. Papers are often stacked on the desk, evaluated, and recorded in a gradebook before being returned to students. A desk in the back of a classroom, with the user facing the room, might be used by a teacher working while monitoring a class, whereas a desk at the front of the room suggests more direct observation. It is also informative to consider the processes desks do not support: the physical design of some desks makes collaborative work between teacher and student difficult, affecting the experiences (the next space we consider) students and teachers have around desks.

The experiences a desk affords are shaped by its physical design, its placement, and the processes it is used to support. For example, a teacher’s desk in the back of a room facing a wall might suggest a space for a teacher to engage in activities separate from the classroom, perhaps used mostly during periods when students are not present. A desk in the back facing the main classroom might be a side area to work while still being a part of classroom activity. Or, if the items on the desk are less personal and the space is made available for others to use, a desk might suggest shared ownership of classroom roles, with all participants operating as teachers and learners. For example, Getzels (1974) connects desk position to conceptualizations of students as learners. He stated that when a desk was moved from the front of a classroom, “The vision of the learner as an empty organism was transformed into a vision of the learner as an active organism” (p. 532; see also Woolner et al., 2012). A desk at the front of a room is often a sign of authority: the teacher is in front and in charge, and the work they do is what directs the learning.


If we move to a systems level, we see that the processes enabled by desks – collecting artifacts of “learning” such as homework and tests and evaluating them – work with the larger system of schooling. This system is based on creating evidence of learning that can be objectively evaluated, the results stored in units such as credit hours and degrees. As credit hours or degrees, these pieces of learning can be purchased through tuition. They represent approval or permission for acting in certain roles in society, such as certain professions.

The cultural space related to the desk can be seen in the desk images in a sixth-grade math teacher’s drawings. David (pseudonym) drew two pictures of desks in response to the question “What does it mean to be a teacher?” (see Fig. 24.2). These pictures were drawn as part of a study on teacher identity and design (Warr, 2021). David explained that the top image is a teacher working at night on a weekend. The teacher has a “gap in the lesson plans” because “the state added a new standard and dropped one,” and he is trying to find material to address the new standard so that he can go to bed.

Fig. 24.2  “What it means to be a teacher” created by a sixth-grade math teacher. The artist described the top image as what a teacher might do during a weekend night. The bottom image takes place during lunch on a school day


David connected this event – and the desk it is centered on – directly to the systems and culture surrounding teachers in the United States. He explained, “It’s a narrative on the societal expectation of teachers to work outside of their work hours. They don’t get weekend nights to themselves.” In other words, because of the structure of the teaching system – with shifting expectations (standards) and limited contracted time for planning or designing – the individual work of a teacher might be relegated to unpaid hours. This practice is generally accepted as part of the culture of teaching and schooling.

In the bottom picture, David again connected work at a desk with the experiences, systems, and culture of teachers. He explained that the teacher is being “[taken] advantage of… He’s on his short lunch break. He can’t afford to eat out, he’s got a brown paper bag there and he’s also grading papers… It’s just like a commentary, not on the classroom side but on the contract side of the teaching.” What is happening at the desk – the brown-bag lunch and the grading – is a result of teacher contracts (systems) and expectations (culture).

Case 2: The PISA Test

The PISA test, short for the Programme for International Student Assessment, is, like the teacher’s desk, also an artifact. However, we present it as a second example precisely because the artifact (i.e., the computer-based test) is deeply and conspicuously intertwined with wider systems and cultures of testing, merit, and knowledge. In fact, in some ways the PISA test, which seeks to compare students across countries in math, science, and reading, is an exemplar of how an artifact reflects wider cultures and systems.

Culturally, the PISA test reflects and reproduces a certain meritocratic social paradigm that emphasizes the measurement and standardization of academic success. The design is intentional and shared with other large-scale standardized tests like the SAT. As Dixon-Román (2017) stated, “The SAT is an apparatus that continues to enact and reconfigure what is possible and what is excluded from mattering for ability, merit, and college admissions” (p. 119). In the case of the PISA test, the artifact reflects a culture of decision-making designed to focus economic decisions on international comparisons, on 15-year-old students, and on performance in math, science, and reading.

If we move to a systems level, we see that the processes enabled by the PISA test (collecting comparative scores in math, science, and reading across countries) reflect wider systems of educational decision making. Recall that the teacher’s desk was part of a wider educational system based on creating evidence of learning that can be objectively evaluated. Ultimately, the PISA test was designed to fit within current global economic systems and current decision-making systems at the government level. The test results, which do not provide individual student scores or any sort of formative feedback, are more like research results.


The test developers seek to “accurately describ[e] the proficiencies of nationally representative samples of 15-year-olds in each country” (OECD, 2018). Hence, while the PISA test seems like an educational artifact, it is in fact designed to drive political decision making. Low PISA scores are often accompanied by cries to “close the gap” and, more importantly, by big line items in national budgets (Goldstein, 2019).

The experience space reveals an important point about the design of the PISA test: the person experiencing the test can be thought of as the person taking it or as the person using its scores. For the person taking the test, the experience, like that of most standardized tests, is isolated, quiet, and often stressful. For the person using the test scores, the experience can be thought of as simple and clear, because the results come out in comparative numbers per subject. Just as the experience related to the teacher’s desk was constrained by its physical affordances, the experience of the PISA test is influenced (or constrained) by the culture and the systems in which it is embedded. We illustrate this to show the interconnected nature of the five spaces.

There are more critiques, more reflections, and more observations that arise when thinking about the design of the PISA test as an artifact and as a part of global socio-economic-political systems. However, the goal of this chapter is to consider how the five spaces framework can make visible the relationships among designed entities, offering new possibilities for thinking about the (re)design of educational testing systems. For one, the framework reveals the complexity of doing any type of (re)design work in education. Redesigning the PISA test entails redesigning some of the cultural designs pervasive in education, such as thinking about students as economic resources instead of as learners. It also means reflecting on the complex relationships among interrelated systems. Yet naming those tensions and areas of negotiation may also set the stage for divergent thinking. Could the PISA test be designed for the good of students instead of for political decision making? Why focus on these subjects and not a more holistic measure of human development? Why quantitative comparisons? Why select “country” as a grain size, given possible within-country variations? How much is this a function of the rise of the idea of the “nation-state” – itself a complex and historically contingent idea? These questions and others can surface when thinking about “design” at multiple levels and in abstract areas.

In the next section, we shift from thinking about the PISA test as a designed thing at many levels to thinking about another prevalent educational artifact: learning management systems (LMS).

Case 3: The Learning Management System (LMS)

In our discussion of the design of the teacher’s desk and the PISA test, certain characteristics of how much of society views education become evident. With the teacher’s desk, we highlighted how, as designed (across the five spaces for design), the desk becomes a space for collecting and grading evidence of learning.


With the PISA test, we emphasized the practice of standardizing and measuring academic achievement to drive political decisions. Considering these aspects of education as designs emphasizes that what is is not what has to be. The teacher’s desk and the PISA test both reflect and affect educational artifacts, processes, experiences, systems, and culture. In this section, we apply the five spaces to an analysis of the learning management system (LMS). Such systems are ubiquitous in education today, particularly after the significant move to online/virtual learning that was pushed on all of us during the COVID-19 pandemic.

The LMS developed out of the application of computers to education, as described by terms such as computer-based instruction (CBI), computer-assisted learning (CAL), and integrated learning system (ILS) (Watson & Watson, 2007). Whereas these terms refer to educational computer programs, including the management and tracking of learning, what distinguishes the LMS is its systemic nature: it brings together not only content and learning processes, but also human resources, registration, tracking, and more (Watson & Watson, 2007). An LMS can automate these systems while at the same time supporting content creation and delivery (Ellis, 2009).

In the systems space, then, an LMS integrates with other educational systems. For example, it supports administrative tasks such as course registration, assigning instructors, and awarding course credits (Correia, 2018). It integrates with curricular systems such as content development, assessment, and learning standards (Ellis, 2009). Human resources departments can use the LMS for assigning and monitoring employee training (Ellis, 2009). Effective LMSs also integrate with various tools and systems external to the educational institution. For example, they might integrate with other software such as Google Docs and work on a variety of operating systems and in various formats (such as desktop and mobile). These external connections can support a more open and connected learning experience, supporting the development of knowledge networks and social learning (Stone & Zheng, 2014).

The artifact, process, and experience spaces of the LMS work together to support this system integration. For example, on the artifact level, an LMS needs to have the properties that support integration across various systems and tools. This artifact facilitates processes that need to be accomplished by the various systems, such as registration and assessment. The user experience of an LMS depends on its ability to fluidly support these processes. An effective LMS can reduce workload and provide smoother course management (Correia, 2018).

The most common features of an LMS both reflect and sustain certain cultural views of education and learning. This is most evident in the name itself – learning management system: it is a system for managing learning, not for fostering it, encouraging it, or playing with it. It is a system that allows us to manage learning, particularly within existing systems of education. It is no surprise, therefore, that the structures of most LMSs are based on existing educational practices: separate courses, and within those courses separate modules that hold content centered on specific topics.


Student work is submitted to and evaluated by the instructor, then stored in credit hours. These features match the acquisition metaphor of learning – the idea that learning is about gaining knowledge (Sfard, 1998). The acquisition metaphor is at the center of current educational culture and systems (Sfard, 1998).

Thus, LMSs have certain key characteristics. First, they homogenize the learning experience. The focus on managing learning (defined generically and broadly) means that the nature of the content itself or broader educational goals are not seen as critical to the design of these systems. Second, LMSs are quite fundamentally based on the idea of blocks (whether they are called degree programs, courses, or modules). These blocks (at least their underlying architectures) are identical: a course in biochemistry is treated as having the same needs as a course on art or education.

More recent research has emphasized other models of learning, such as the participation metaphor (Sfard, 1998), and concepts such as Communities of Inquiry (Garrison, 2007). Some features of the LMS, such as discussion forums and communication channels, can encourage social and participation-based learning. Additionally, mobile applications can increase contextual learning and connectedness. But at the end of the day, these systems are about managing learning in terms of interoperability, credit transfer, tracking participation, and grading. Thus, the design of a particular artifact, the LMS, constrains the kinds of educational processes that are “allowed” and therefore the very nature of student experience. These structures and architectures are driven by the broader systems within which these tools are designed to be used and by a broader educational culture that values consistency and interoperability over creativity and divergence.

As society changes and becomes more networked and open (Kali et al., 2019; Voogt et al., 2013), reflecting on the design of the LMS can help update the educational system to meet the needs of learners in a networked society. For example, although some features of an LMS might support participation-centered learning, these features are generally kept within the boundaries of a single course, in a closed group of students, and available only within a limited time frame. Redesigning the LMS could expand learning possibilities. New features might include cross-course forums, social tagging or bookmarking, and tools for connecting with external experts and for opening the learning environment to those outside the course (Stone & Zheng, 2014). Instead of centering on information storage and activity tracking, an LMS for a networked society could be anchored by relationships and form a hub for learning connections across various contexts. Such a change might require a new name, as less emphasis would be put on managing learning and more on expanding it.

Of course, redesigning an LMS in this way is not easy. Because the LMS is integrated into the educational system, attempts to move away from an emphasis on closed courses and activity tracking will be difficult. However, moving in this direction has the potential to change not only the LMS but also educational processes, experiences, systems, and culture.


Conclusion

By considering these artifacts through the lens of the five spaces, we can better understand how design can happen in multiple ways and in multiple spheres. Understanding design in abstract areas like processes, experiences, systems, and cultures can help us think critically and divergently about what can be designed. This understanding – that almost everything is designed and therefore can be redesigned – becomes all the more important as we think about how to change our education systems in deep, complex, and sustainable ways.

We present this framework as a first step towards re-envisioning deeply embedded, seemingly invisible, designed spaces in education. We recognize that seeing what could or might be is a critical first step towards change in education. We believe the five spaces framework provides both a platform to interrogate everything around us and a response to those who resist change by arguing that “this is just how things are.” Moreover, seeing the designed nature of education, combined with our knowledge that most educational systems are not equitable spaces, provides us with a moral imperative to change them for the better.

This is why we have started to interrogate the very artifact you are reading now: the conclusion section of a book chapter. The conclusion is itself an artifact that reflects a wider set of cultural norms within an academic system. Yet, at its core, our goal in this section is to summarize the chapter, emphasize key points, and possibly leave the reader with something memorable. We feel our abstract already accomplishes most of this. So, instead of a proper conclusion, let us end with our abstract, the place where we first started:

Design is everywhere. Recognizing how everything in education is designed, including systems and cultures, increases our agency to make changes to those designs. In this chapter, we introduce the five spaces framework, which provides an analytical tool for understanding the relationships among designed entities, shifting perspectives, and offering new possibilities for (re)design. To illustrate the framework, we analyze three technologies in education: the teacher desk, PISA test, and learning management systems.

References

Ancona, D. G., Goodman, P. S., Lawrence, B. S., & Tushman, M. L. (2001). Time: A new research lens. Academy of Management Review, 26(4), 645–663. https://doi.org/10.5465/amr.2001.5393903
Correia, A.-P. (2018). The evolution and diffusion of learning management systems: The case of Canvas LMS. In A.-P. Correia (Ed.), Driving educational change: Innovations in action. The Ohio State University. https://ohiostate.pressbooks.pub/drivechange/chapter/the-evolution-and-diffusion-of-learning-management-systems-the-case-of-canvas-lms/
Dixon-Román, E. J. (2017). Inheriting possibility: Social reproduction and quantification in education. University of Minnesota Press.


Ellis, R. K. (2009). Learning management systems. In A field guide to learning management systems. American Society for Training & Development. https://home.csulb.edu/~arezaei/ETEC551/web/LMS_fieldguide_20091.pdf
Garrison, D. R. (2007). Online community of inquiry review: Social, cognitive, and teaching presence issues. Journal of Asynchronous Learning Networks, 11, 61–72.
Getzels, J. W. (1974). Images of the classroom and visions of the learner. The School Review, 82(4), 527–540. https://doi.org/10.1086/443148
Goldstein, D. (2019, December 5). ‘It Just Isn’t Working’: PISA test scores cast doubt on U.S. education efforts. The New York Times. https://www.nytimes.com/2019/12/03/us/us-students-international-test-scores.html
Kali, Y., Baram-Tsabari, A., & Schejter, A. (2019). Learning in a networked society. Springer International Publishing. https://doi.org/10.1007/978-3-030-14610-8
OECD. (2018). PISA 2018 technical report. https://www.oecd.org/pisa/data/pisa2018technicalreport/
Sfard, A. (1998). On two metaphors for learning and the dangers of choosing just one. Educational Researcher, 27(2), 4–13. https://www.jstor.org/stable/1176193
Simon, H. A. (1969). The sciences of the artificial. MIT Press.
Stone, D. E., & Zheng, G. (2014). Learning management systems in a changing environment. In Handbook of research on education and technology in a changing society (pp. 756–767). IGI Global. https://doi.org/10.4018/978-1-4666-6046-5.ch056
Voogt, J., Erstad, O., Dede, C., & Mishra, P. (2013). Challenges to learning and schooling in the digital networked world of the 21st century. Journal of Computer Assisted Learning, 29(5), 403–413. https://doi.org/10.1111/jcal.12029
Warr, M. (2021). Teachers as designers: Epistemic diversity and sensemaking amidst indeterminacy (Doctoral dissertation, Arizona State University). https://www.proquest.com/docview/2532139792
Watson, W., & Watson, S. L. (2007). An argument for clarity: What are learning management systems, what are they not, and what should they become? TechTrends: For Leaders in Education & Training, 51(2), 28–34. https://doi.org/10.1007/s11528-007-0023-y
Woolner, P., McCarter, S., Wall, K., & Higgins, S. (2012). Changed learning through changed space: When can a participatory approach to the learning environment challenge preconceptions and alter practice? Improving Schools, 15(1), 45–60. https://doi.org/10.1177/1365480211434796

Chapter 25

Brad Hokanson and the Summer Research Symposium: The Quiet Force Behind a Signature Event

Elizabeth Boling

A design colleague of mine once observed during a meeting, “When I am not sure who the designer is on a team, I listen for the person who is offering concrete ideas about solving problems, and I know that’s the designer.” If the field of educational communications and technology were that meeting, and if you were listening, you would most assuredly recognize Brad Hokanson as one of the designers. While he carries himself with quiet self-assurance (as befits a dedicated tango dancer!), he has never been one to brag about his accomplishments. What he has done throughout his career to date is look for concrete ways to support research and practice around design in this field. His academic home is in graphic design at the University of Minnesota, where Brad has not only established himself as a world-class teacher of design and creativity to undergraduate students campus-wide, conducted an active research agenda in design and creativity, and been recognized as an exceptional scholar, but has simultaneously offered generously of his time to AECT and the entire field of educational communications and technology. From the pre-internet showcase of instructional designs that he and Simon Hooper distributed via CD-ROM decades ago, to serving recently as President of AECT, he has stepped up over the years in multiple ways for the good of practitioners, scholars, and students.

From my perspective, one of Brad’s most significant and enduring contributions to the field has been his leadership of the AECT Summer Research Symposium. When he took the reins of this special event, he brought with him his philosophy and his presence. His philosophy of inclusiveness, support, and community development is signaled by the World Café protocol in which all participating authors both give and receive feedback on the accepted papers. The whole group of authors lift each other up in performance and in spirit.


I have witnessed more than once how Brad gently guides experienced authors toward the discussion tables of new authors. As symposium leader, he makes it his business to get to know each participant and welcome each one into the community – as warmly at a third or fourth symposium as at the first. Each time I have participated in the symposium, Brad has greeted me with a Diet Coke he has set aside for me (once from his sport coat pocket!), remembering that this is my not-so-secret vice and making sure I will get one. As he moves around the symposium venue, I notice that he connects with each participant individually, drawing everyone into the event.

Brad’s attention to both inclusiveness and rigor has led to high quality and broad representation in the publications resulting from the Symposium. Furthermore, he has reached out to others in the field to collaborate on organizing and leading the Symposium, and he has made space for student participants to engage in the event each year. He has brought in speakers and activities from outside the traditional boundaries of educational technology and applied his designer’s perspective to creating a valuable, signature event that has steadily elevated the research profile of AECT as an organization. Brad has, without a lot of fanfare, established – designed – the AECT Summer Research Symposium as a generator of rigorous scholarship in the field while at the same time making every participant feel welcomed, valued, and included in the effort.