ADVANCED INSTRUCTIONAL DESIGN TECHNIQUES
Advanced Instructional Design Techniques provides comprehensive coverage of advanced topics in instructional design and development. This ideal resource for upper-level graduate coursework presents a thorough overview of theoretical foundations that support learning design beyond basic information processing and behaviorist principles, along with innovative strategies and problem-solving techniques to support designing for complex situations. Twelve wide-ranging chapters cover challenging topics such as needs assessment, sustainability, ethics, cognitive load, and more. Emphasizing reflective practice and decision-making in design environments, the book attends to the models and constructs that support context-specific instructional design across learning and training, from higher education and K-12 schooling to business and industry training to health care and public-sector services.

Jill E. Stefaniak is Associate Professor in the Learning, Design, and Technology program in the Department of Workforce Education and Instructional Technology at the University of Georgia, USA. Her research interests focus on the professional development of instructional designers, designer decision-making processes, and contextual factors influencing design in situated environments.
ADVANCED INSTRUCTIONAL DESIGN TECHNIQUES Theories and Strategies for Complex Learning
Jill E. Stefaniak
Designed cover image: © Getty Images

First published 2024 by Routledge, 605 Third Avenue, New York, NY 10158, and by Routledge, 4 Park Square, Milton Park, Abingdon, Oxon, OX14 4RN

Routledge is an imprint of the Taylor & Francis Group, an informa business

© 2024 Jill E. Stefaniak

The right of Jill E. Stefaniak to be identified as author of this work has been asserted in accordance with sections 77 and 78 of the Copyright, Designs and Patents Act 1988.

All rights reserved. No part of this book may be reprinted or reproduced or utilised in any form or by any electronic, mechanical, or other means, now known or hereafter invented, including photocopying and recording, or in any information storage or retrieval system, without permission in writing from the publishers.

Trademark notice: Product or corporate names may be trademarks or registered trademarks, and are used only for identification and explanation without intent to infringe.

Library of Congress Cataloging-in-Publication Data
Names: Stefaniak, Jill E., 1984- author.
Title: Advanced instructional design techniques : theories and strategies for complex learning / Jill E. Stefaniak.
Description: New York, NY : Routledge, 2024. | Includes bibliographical references and index.
Identifiers: LCCN 2023020015 (print) | LCCN 2023020016 (ebook) | ISBN 9781032261836 (hardback) | ISBN 9781032262031 (paperback) | ISBN 9781003287049 (ebook)
Subjects: LCSH: Instructional systems--Design. | Needs assessment. | Problem solving. | Decision making.
Classification: LCC LB1028.38 .S74 2024 (print) | LCC LB1028.38 (ebook) | DDC 371.3--dc23/eng/20230517
LC record available at https://lccn.loc.gov/2023020015
LC ebook record available at https://lccn.loc.gov/2023020016

ISBN: 978-1-032-26183-6 (hbk)
ISBN: 978-1-032-26203-1 (pbk)
ISBN: 978-1-003-28704-9 (ebk)
DOI: 10.4324/9781003287049

Typeset in Garamond by SPi Technologies India Pvt Ltd (Straive)
To Elvis—the heart and soul of rock ’n’ roll.
CONTENTS
Preface viii
Acknowledgments x
Glossary xi
1 Toward the Development of Expertise in Instructional Design 1
2 Decision-Making Strategies to Support Managing the Design Space 14
3 Reconsidering Needs Assessment in Instructional Design 32
4 Leveraging Learner Analysis to Foster Critical Consciousness 48
5 Developing a Localization of Context to Support Transfer of Learning 70
6 Fostering Knowledge Acquisition Through Authentic Learning 91
7 Instructional Strategies that Promote Generative Learning 116
8 Scaffolding Instruction to Support Self-Regulation of Learners 132
9 Motivational Design Practices to Support Knowledge Acquisition 145
10 Attending to Infrastructure for Implementation 159
11 Assessing the Sustainability of Instruction 174
12 Ethical Considerations in Instructional Design 187
Index 199
PREFACE
The purpose of this book is to provide individuals with a comprehensive textbook to assist them with advanced instructional design coursework. While there are several introductory instructional design textbooks, there is a need for books that address advanced instructional design topics. The main objectives of this book are to present a thorough overview of the theoretical foundations that support learning design beyond basic information processing and behaviorist theories, to present instructional strategies and problem-solving techniques that support designing for complex situations, and to emphasize the role that reflective practice and awareness of one's decision-making can play in design environments.

Most graduate programs that offer advanced instructional design courses rely on journal articles or course packs to provide a comprehensive overview of advanced practices. This book offers a comprehensive look at advanced instructional design techniques, building on the traditional ADDIE (analyze, design, develop, implement, and evaluate) process most commonly conveyed in introductory texts (e.g., Brown & Green, 2016; Cennamo & Kalk, 2019; Dick et al., 2022; Larson & Lockee, 2020). A major goal of this book is to provide readers with the theoretical constructs that support the different strategies instructional designers employ in a variety of instructional contexts (i.e., business and industry, health care, higher education, and K-12 instruction).

The target audience for this book consists primarily of educators, students, and practitioners in the field of instructional design. To date, most instructional design books on the market focus on introductory design practices. This book is a follow-up to those introductory books (many of which have been published by Routledge), delving into advanced instructional design topics.
Strategies for engaging in decision-making and problem-solving as they relate to solving complex instructional design problems are offered in each chapter. An emphasis is placed on integrating systems thinking with instructional design practice. Systems thinking and performance improvement strategies are often treated as stand-alone concepts. This book attempts to demonstrate the synergy between the two to support instructional designers as they design sustainable solutions.
Overview of Book

This book includes 12 chapters. Each chapter follows a similar format, including the following:

• Guiding questions to assist the reader with topics they should reflect on throughout the chapter reading and activities
• Identification of key terms and concepts
• Chapter overview
• Content addressing theory and processes related to the chapter
• Case studies and examples of instructional design in practice
• Tools to support instructional designers' decision-making and reflective practice
• A section entitled Connecting Process to Practice Activities (questions and activities to forward discussion related to content in the chapter)
• A section entitled Bridging Research and Practice, which lists research articles related to the chapter topic
• Strategies to promote inclusive design, addressed as they relate to individual chapter topics
• Glossary
References

Brown, A.H., & Green, T.D. (2016). The essentials of instructional design: Connecting fundamental principles with process and practice (3rd ed.). Routledge.
Cennamo, K., & Kalk, D. (2019). Real world instructional design: An iterative approach to designing learning experiences (2nd ed.). Routledge.
Dick, W., Carey, L., & Carey, J.O. (2022). The systematic design of instruction (11th ed.). Pearson.
Larson, M.B., & Lockee, B.B. (2020). Streamlined ID: A practical guide to instructional design (2nd ed.). Routledge.
ACKNOWLEDGMENTS
I would like to thank everyone at Routledge for their help and guidance throughout this journey. I would like to thank Daniel Schwartz for seeing the need for this book and providing support along the way. Lastly, I would like to thank all the instructional designers and students whom I get to interact with regularly and who inspire me every day to embrace the messiness that is instructional design!
GLOSSARY
Affordances: Perceivable actions possible (Norman, 2013).
Attainment Value: Refers to the importance that an individual associates with successfully completing a task.
Authentic Learning: A pedagogical approach that promotes learners' abilities to engage in problem-solving by having them complete tasks in real-world environments.
Autonomy: Accounts for an individual's desire to be in control of their life.
Case-based Learning: A pedagogical approach where learners apply their knowledge to real-world scenarios that have already occurred.
Cognitive Apprenticeship: An apprenticeship process that utilizes cognitive and metacognitive skills and processes to guide learning (Dennen & Burner, 2008).
Communication Planning: Plans that detail how new initiatives or information is communicated to employees. Examples may include listservs, company newsletters, training announcements, performance reviews, and employee feedback.
Competence: Accounts for one's ability to be effective when performing in the environment (White, 1959).
Conceptual Knowledge: A knowledge domain that pertains to facts, theories, and key terms related to subject matter (Anderson et al., 2001).
Conditional Knowledge: A knowledge domain that encompasses one's ability to apply concepts and procedures under a set of conditions (Stark et al., 2011).
Confirmative Evaluation: A systematic process to measure long-term performance beyond the scope of formative and summative evaluation (Giberson et al., 2006).
Conjecture: An educated guess.
Conjecturing: The ability for a designer to hypothesize different solutions based on their prior knowledge and experience (Murty et al., 2010).
Context: The circumstances that contribute to a situation.
Contextual Analysis: "The process of identifying factors that inhibit or promote the ability to design instruction that is relevant to what occurs in real-world settings" (Stefaniak, 2020, p. 59).
Culture: "A fuzzy set of basic assumptions and values, orientations to life, beliefs, policies, procedures, and behavioral conventions that are shared by a group of people, and that influence (but do not determine) each member's behavior and his/her interpretations of the 'meaning' of other people's behavior" (Spencer-Oatey, 2008, p. 8).
Decision: A choice.
Decision-making: The process of choosing the best option among several.
Design Judgment: A judgment that addresses why a decision has been made (Lachheb & Boling, 2021).
Dynamic Decision-making: The process of making a decision given time constraints.
Empathetic Design: A design strategy whereby the instructional designer puts themselves in their learners' shoes to better understand their perspectives and perceptions related to topics that may be covered during instruction.
Environmental Structuring: Learner efforts to select and arrange the physical or technical setting to make learning easier.
Ethics: The rules or standards that govern the conduct of the members of a group (Dean, 1993, p. 7).
Evaluation: A systematic investigation to determine the worth or merit of a solution (Scriven, 1967).
Expertise: To have expert skill and knowledge in a particular field or area of study.
Extraneous Load: Load that is imposed on the learner during instruction as a result of the materials being presented illogically.
Extrinsic Motivation: Occurs when an individual is driven to achieve or complete a task as a result of external pressures or rewards (Ryan & Deci, 2000).
Feedback Systems: Detailed plans to provide employees with feedback on their work performance. This information may be used to identify individual training needs and opportunities for promotion.
Fluency: A term used to describe the transition from completing tasks in a very deliberate and linear fashion to automaticity (Swan et al., 2020).
Formative Evaluation: "A judgment of the strengths and weaknesses of instruction in its developing stages, for purposes of revising the instruction to improve its effectiveness and appeal" (Tessmer, 1993, p. 11).
Generative Learning Strategies: Strategies where the learner is given more autonomy in generating their understanding of the topic.
Germane Load: The amount of effort required for the learner to process the information from the instructional materials to organize information and construct schemas to support their abilities to apply what they are learning to a variety of tasks and settings.
Goal Setting: Learner efforts to establish goals and subgoals to help plan the sequencing, timing, and completion of academic tasks.
Help Seeking: Learner efforts to secure additional task information from a variety of sources, such as an instructor, classmate, or outside resource.
Human Performance Technology: "The study and ethical practice of improving productivity in organizations by designing and developing effective interventions that are results-oriented, comprehensive, and systemic" (Pershing, 2006, p. 6).
Inclusive Design: Instructional design that takes into account learner differences. Instructional designers who employ inclusive design strive to customize their instruction so that their learners see reflections of themselves in the content and feel a sense of belonging to the learning community.
Instructional Context: The context that considers the environment where instruction will be delivered.
Instructional Design: "The science and art of creating detailed specifications for the development, evaluation, and maintenance of situations which facilitate learning and performance" (Richey et al., 2011, p. 3).
Intrinsic Load: Load that is imposed on the learner during a learning task because of the complexity of the content.
Intrinsic Motivation: Occurs when an individual is driven to accomplish a task because of internal rewards instead of external pressures or influences (Ryan & Deci, 2000).
Intrinsic Value: The enjoyment an individual feels while completing a task.
Job Analysis: Up-to-date job descriptions with complete task analyses that provide a detailed account of how to perform tasks conveyed in training.
Knowledge Management: Installation of learning management systems to track learning initiatives throughout the organization. Electronic performance support systems are used to provide just-in-time resources to employees.
Learner Analysis: The process of gathering information about the learning audience that will receive instruction.
Learner Characteristics: Descriptors of learners who may participate in the instructional designer's instructional activities. Learner characteristics are gathered during the learner analysis.
Learning: "The relatively permanent change in a person's knowledge or behavior due to experience" (Mayer, 1982, p. 1040).
Learning Space: The space that consists of the problem space and the conceptual space (Hmelo-Silver, 2013).
Localization of Context: The process by which emphasis is placed on identifying contextual factors that directly impact an individual as they go through whatever intervention is being provided.
Modelling: The demonstration of performance.
Morals: Personal judgments, standards, and rules of conduct based on fundamental duties of promoting human welfare (beneficence), acknowledging human equality (justice), and honoring individual freedom (respect for persons), and so forth (Dean, 1993, p. 7).
Need: The gap (or difference) between the current state of affairs and the desired state of affairs (Altschuld & Kumar, 2010).
Needs Analysis: The process of determining what is contributing to the discrepancies in performance identified in a needs assessment.
Needs Assessment: "The process of identifying a gap in performance by comparing the actual state of performance to the desired state of performance" (Stefaniak, 2020, p. 16).
Non-instructional Intervention: An intervention that does not have an instructional component.
Organizational Design: A plan that outlines the organizational infrastructure of a company. Details are provided to demonstrate how different units interact and function with one another in the organization.
Orienting Context: The context that addresses factors concerning learners as they relate to their participation in the learning experience.
Personas: Fictional characters created on the basis of information that instructional designers gather during a learner analysis. These characteristics help the instructional designer envision how a learner may experience instruction.
Primary Needs: Needs of the individuals who receive services or products to fulfill and resolve their needs (Altschuld & Kumar, 2010).
Prior Knowledge: The information and knowledge that a learner possesses prior to instruction.
Problem-based Learning: A pedagogical approach where learners explore solving problems in the real world.
Problem Finding: The process of articulating "a clear and concise representation of the problem(s) in a particular situation" (Ertmer & Stepich, 2005, p. 39).
Problem Framing: The process by which designers navigate from the problem-finding space to the problem-solving space of any given project.
Problem Solving: "Developing a clear and relevant solution plan that explicitly describes how the proposed solutions address the issues that have been identified in the previous step" (Ertmer & Stepich, 2005, p. 41).
Procedural Knowledge: A knowledge domain that addresses the processes and procedures related to completing a task (Anderson et al., 2001; Burgin, 2017).
Productive Failure: An instructional approach that suggests that instructors should not intervene if learners make mistakes while engaged in ill-structured problem-solving (Kapur, 2008).
Rational Decision-making: An approach to decision-making that follows a systematic and linear process where the decision-maker considers a variety of alternative solutions, analyzes the advantages and disadvantages of each, and selects a solution that is optimal (De Martino et al., 2006; Klein, 1998).
Relatedness: One's willingness to interact with others and feel a sense of belonging to others (Baumeister & Leary, 1995).
Scaffolding: The process by which an expert provides just-in-time support to a learner or novice (Wood et al., 1976).
Secondary Needs: Needs of the individuals responsible for delivering the products and services to address the primary level of needs (Altschuld & Kumar, 2010).
Self-Evaluation: Learner efforts to gauge the progress and quality of their work toward desired goals.
Self-regulated Learning: The actions that a learner undertakes to plan, enact, and monitor progress made toward the attainment of their learning goals (Zimmerman, 1998).
Situated Cognition: A theory that posits that learners learn from doing.
Strategic Knowledge: A knowledge domain that addresses problem-solving strategies and heuristics.
Summative Evaluation: The process of determining whether an intervention was successful.
Supplantive Learning Strategies: Strategies that "supplant, facilitate, or scaffold most of the information processing for the learner by providing elaborations that supply all or part of the educational goal, organization, elaboration, sequencing and emphasis of content, monitoring of understanding, and suggestions for transfer to other contexts" (Smith & Ragan, 2005, p. 142).
Systematic Process: A prescriptive process that follows a series of steps to complete a task. Steps are typically completed in a prescribed order.
Systemic Process: A process that considers the implications that one aspect of a process may pose for subsequent steps.
Task Strategies: Learner efforts to actively utilize specific strategies to achieve desired goals.
Tertiary Needs: Needs that are centered on the resources and infrastructure needed to support the primary and secondary needs of an organization or group (Altschuld & Kumar, 2010).
Time Management: Learner efforts to consider what must be done and devote an appropriate amount of time to each task.
Transfer Context: The context that considers how and where the information acquired in the instructional context will be applied upon completion of training.
Utility Value: The perception an individual has that the task will be useful to them in the future.
Valence: The degree to which we place value on a task.
Values: The core beliefs or desires, such as honesty, integrity, and fairness, that guide or motivate an individual's attitudes and actions (Dean, 1993, p. 7).
Zone of Proximal Development: "The distance between the actual development level as determined by independent problem solving and the level of potential development as determined through problem solving under adult guidance, or in collaboration with more capable peers" (Vygotsky, 1978, p. 86).
1 TOWARD THE DEVELOPMENT OF EXPERTISE IN INSTRUCTIONAL DESIGN
Chapter Overview This introductory chapter presents the goals of the book. A summary is provided of the instructional design processes found in most introductory textbooks, along with guidance for how readers can use this book to expand their instructional design repertoire. Various types of expertise found in instructional design, such as routine expertise and adaptive expertise, are differentiated. Emphasis is placed on classifying different instructional design tasks and activities according to procedural knowledge, conditional knowledge, conceptual knowledge, and knowledge generation. A goal of this chapter is to reposition instructional designers' views on the use of models in instructional design as they elaborate on different design techniques throughout the book and work on their own professional development.
Guiding Questions

1. What is the difference between conceptual, conditional, and procedural knowledge in instructional design?
2. What does expertise look like in instructional design?
3. How can I develop expertise in instructional design?
4. What is the difference between systematic design and systemic design?
The Science, Craft, and Art of Instructional Design Instructional design does not occur in a vacuum! While instructional design has processes and tools to support learning and performance, most, if not all, instructional designers will be quick to admit that instructional
design is messy! Every situation presents different learners, environments, and contextual factors that will influence, and sometimes hinder, how an instructional designer can approach a project.

Most instructional design models are often interpreted as linear processes. While many models and processes (e.g., Dick et al., 2009; Morrison et al., 2013; Smith & Ragan, 2005) suggest a systematic process to guide instructional design, they do not necessarily imply linearity. While there are recommended steps and phases that instructional designers will find themselves engaged in during a project, experienced instructional designers will be the first to tell you that the instructional design process is recursive in nature. Oftentimes, these phases are revisited during several iterations of design as the instructional designer is provided with new information that impacts the outcomes of their project.

Instructional design practices are systemic. Every decision that an instructional designer makes during a project has positive or negative systemic implications for the environment. It is important for instructional designers to become comfortable with anticipating the needs of their learners and constituents and the implications that various design activities may have on the overall success of a project. Instructional designers should be continually scanning the environment to determine what is needed to support the learning audience: technological platforms, the delivery of instruction, and evaluative methods (both short-term and long-term) to measure the success of the instruction.

To date, several scholars have offered different variations to define instructional design. I particularly subscribe to Richey and colleagues' (2011) definition that "instructional design is the science and art of creating detailed specifications for the development, evaluation, and maintenance of situations which facilitate learning and performance" (p. 3).
This definition implies the systemic nature of instructional design by acknowledging the need for regular maintenance of situations that facilitate learning. Their definition recognizes that instructional design is a balance between science and craft. The instructional designer considers and integrates learning theories and systems theories while adjusting prescriptive processes to meet the nuanced needs of the learners, environment, and subject matter. When we refer to instructional models and processes to guide various aspects of our projects, we are addressing the "science" of instructional design.

Our craftsmanship is something that is refined over time. This is not achieved after the completion of one or two projects; rather, it is something that we refine with the completion of each new project and design experience. We add to our expertise toolbox every time we find ourselves deviating from the original plan to accommodate our learners or achieve the goals within imposed design constraints. We build upon our craftsmanship when projects go well and as we learn from our mistakes.
I believe that our craftsmanship is fostered through practical approaches and reflective approaches. When we engage in formative design and obtain feedback on our design work at various phases throughout a project, we are provided with opportunities to embrace a recursive approach to instructional design. This enables us to continually assess the situation to ensure that we are addressing the needs of our constituents (i.e., learners, teachers, clients, and society). Recommendations for revisions that may be brought to our attention during beta-evaluations and usability testing help us to make the necessary adjustments and modifications to our work. This continual refinement enables us to cultivate our craftsmanship.

We also promote the growth of our craftsmanship through reflective means. Reflective practice is a term that is used regularly among scholars in instructional design to support the development of our awareness of the design decisions and activities we engage in regularly (Boling et al., 2022; Parrish, 2012; Tracey & Hutchinson, 2016). Reflective exercises help us to step away from the project and consider how we are approaching design (McDonald, 2022). We may find ourselves reflecting on the following:

• What went well on this project?
• How will this benefit my learners?
• I know my client is happy with what I am doing, but is it going to address long-term needs?
• What is contributing to the challenges I am experiencing with this project?
• What information do I still need to complete the project?

Regardless of the design project, a common goal inherent in all instructional design work is to facilitate learning and improve performance. Every time an instructional designer works on a new project, they are honing their craft. The continual scanning of the design environment helps instructional designers take stock of the environment as a system and their role within the system.
In doing so, the designer is able to see the interrelationships and interconnections that may exist or may be needed to support individuals in the system (i.e., learners, teachers, parents, etc.). Approaching design through a systemic lens gives credence to the non-linearity of instructional design (Bannan-Ritland, 2003; Gibbons, 2014; Nelson, 2020; Stefaniak & Xu, 2020). As instructional designers work to build upon their expertise, they contribute to their designer mindset. Their mindset encompasses what they know about instructional design and the means by which they apply that information (Boling et al., 2022). Nelson and Stolterman (2012) purport that “the ability to use a systemic design approach is not dependent on
the mastery of a set of theories, methods, and facts” (p. 64). Rather, they suggest it is how we adjust our position as designers to view the situation. We balance ourselves between taking on an evaluative role versus a creative role. This balancing act helps us to frame our design space as we prepare our stance for approaching the project. The adjustments we make as instructional designers to frame the design problem and develop solutions embody the creative role we play within the system.
Balancing Roles in Instructional Design

Think about the last project you completed. What was the nature of the project? Who was involved? What are examples of instructional design activities that informed your evaluative role in the project? In what ways did you espouse creativity? How did these roles inform one another?
When expert instructional designers engage in design, their processes and results take on an art form where they are applying systematized design principles, coordinating multiple tasks, and leveraging resources within their environment to facilitate learning. B.F. Skinner (1954) advocated that there is a science and art to teaching and designing instruction. While educators should be knowledgeable of subject matter and how individuals learn, there is a sentiment that they must be intuitive, attuned to the needs of their learning audience, and willing to be flexible to support their learners to achieve instructional outcomes. The artfulness inherent in instructional design is that there is no one-size-fits-all approach. We demonstrate artfulness in instructional design as we design, develop, and implement experiences within an environment that accounts for the needs of the situation. We demonstrate artfulness when we recommend solutions that consider the composition of the system. We demonstrate artfulness when we are forced to get creative with how we can allocate limited budgetary resources for a project or brainstorm multiple ways to make learning accessible to audiences with limited technological resources. The adjustments and refinements we make to our designs to account for these systemic factors are the very art of what we do.
What Does Expertise in Instructional Design Look Like? An expert is defined as “one with the special skill or knowledge representing mastery of a particular subject” (Webster’s Dictionary, 2020). We know an experienced instructional designer when we see one. We cannot always pinpoint specific characteristics or behaviors that make them an expert, but we often notice differences in their approach to instructional design processes, decision-making prowess, and their project outcomes (Ertmer & Stepich, 2005; Hardré et al., 2005).
There are three assumptions that can be made about the development of one's expertise:

1. that the constituent skills can be identified
2. that the skills can be transmitted to prospective practitioners
3. that they can be appropriately drawn upon in practice. (Kennedy, 1987, p. 135)

This is evident in the way that expert instructional designers approach design projects. Ertmer and Stepich (2005) recognize that novice instructional designers tend to engage in problem-finding strategies whereas experts engage in problem solving. Problem finding encompasses "being able to articulate a clear and concise representation of the problem(s) in a particular situation" (Ertmer & Stepich, 2005, p. 39). While expert instructional designers will certainly engage in some degree of problem finding, novices tend to become fixated on retrieving as much information as possible related to the problem. This sometimes results in paralysis by analysis. Problem solving entails developing a plan to address the situation and explicitly describing how the solution addresses the issues that were uncovered during the problem-finding phase (Ertmer & Stepich, 2005). This phase involves understanding the relationships among solutions, the implications of each decision the designer must make along with their solutions, and the ability to be flexible and adjust the solutions in situ.

Boling and colleagues (2022) suggest that designers do not solve problems; rather, they engage with the design space. Their engagement with the design space entails framing the problem. Problem framing is described as one's ability "to take ownership of and iteratively define what the problem really is, decide what should be included and excluded, and decide how to proceed in solving it" (Svihla, 2020, para. 2). I do not necessarily view these as synonymous.
I tend to think that problem framing is the process by which designers navigate from the problem-finding space to the problem-solving space of any given project. The focus of Chapter 2 in this book is to examine decision-making strategies that support managing the design space. Research examining the practices of expert instructional designers (e.g., Hardré et al., 2005; Perez & Emery, 1995; Rowland, 1992; Stepich & Ertmer, 2009; York & Ertmer, 2016) describes their design process as recursive, taking a zig-zag, non-linear approach to design. It is a constant back-and-forth between problem finding and problem solving in which they revisit different aspects of the design process as they gain more information pertaining to the situation. It is not the intention of this book to introduce another instructional design model or process; rather, it is meant to support further exploration into the messy components that comprise instructional design.
Instructional designers continue to develop their expertise with every project they complete. The topics included in this book are meant to provide strategies to support instructional designers as they navigate between the problem-finding and problem-solving spaces. Heuristics are provided to show how instructional designers can engage in problem framing to manage their design space while adhering to the time limitations and other constraints associated with their projects.
A Trajectory Toward the Development of Expertise in Instructional Design

To develop a designer mindset in instructional design, we need to think about what contributes to developing a stance in design. Boling and colleagues (2022) describe it as the instructional designer’s ability to use their knowledge and identify appropriate means to apply that knowledge to different situations. The acquisition of such knowledge spans three domains: procedural, conditional, and conceptual (Anderson et al., 2001; Bransford et al., 2000; Paris et al., 1983; Swan et al., 2020).

Procedural knowledge addresses the processes and procedures related to completing a task (Anderson et al., 2001; Burgin, 2017). Examples of procedural tasks commonly recognized in instructional design include conducting needs assessments, learner analyses, task analyses, formative evaluations, usability testing, and summative evaluations. Each of these tasks includes a prescriptive set of procedures to guide the instructional designer.

Conceptual knowledge pertains to facts, theories, and key terms related to subject matter (Anderson et al., 2001). Examples of conceptual knowledge prevalent in instructional design activities include referencing learning theories and other theoretical constructs (e.g., behaviorism, cognitivism, general systems, self-regulated learning) to guide design decisions and instructional interventions. Conceptual knowledge also includes the vocabulary common to the field of instructional design. Instructional designers’ abilities to differentiate among instructional strategies, instructional models, and methods of evaluation are demonstrations of their conceptual knowledge.

Conditional knowledge encompasses one’s ability to apply concepts and procedures (procedural and conceptual knowledge) under a set of conditions (Stark et al., 2011).
Instructional designers demonstrate their acquisition of conditional knowledge when they adjust their design practices depending on various contextual factors influencing the system. Perhaps a designer finds out that their students have only a certain amount of time allocated for the activities or that there are challenges with students accessing materials online from home. These conditions may require the instructional designer to make decisions on how to compartmentalize their content. They may need to devise alternative solutions for students
to access the materials and complete learning activities if they must complete the work outside of work or school.

Oftentimes, literature on expertise focuses on the procedural, conceptual, and conditional knowledge domains. Paris and colleagues (1983) proposed strategic knowledge as a knowledge domain independent of conditional knowledge. Strategic knowledge comprises knowledge about problem-solving strategies and heuristics. Examples of strategic knowledge evident in instructional design practices include rational and dynamic decision-making processes and approaches to well-structured and ill-structured problem-solving. I personally like acknowledging strategic knowledge as its own domain because it places greater emphasis on the processes employed by an instructional designer as they navigate their design space (i.e., problem finding, problem framing, problem solving). While some will think of conditional knowledge as encompassing strategy, I think it is particularly important to recognize strategic knowledge as an independent domain when teaching instructional design students about their design practices.

Instructional designers routinely apply strategic knowledge when they are required to select appropriate data sources to support needs assessment and evaluation efforts. Determining how to couple instructional and non-instructional interventions elicits strategic knowledge. We also use strategic knowledge when we apply soft skills to our instructional design projects. Communicating with project constituents, managing client relationships, and overall project management require us to be strategic as we navigate solutions that are appropriate for unique situations (Sentz & Stefaniak, 2019). Strategic knowledge often requires us to take a thirty-thousand-foot view of the situation and be proactive with our design decisions as we anticipate other issues that may arise.
Studies have been conducted to explore the development of expertise in instructional design pedagogy. To date, an overwhelming majority of these studies have been conducted in introductory courses (Stefaniak & Hwang, 2021). Few studies (e.g., Hardré & Kollmann, 2013; Hartt & Rossett, 2000; McDonald et al., 2019, 2022; Vu & Fadde, 2013) have examined how instructional design students navigate through advanced instructional design courses. While we could assume that most students will demonstrate acquisition of the four knowledge domains previously mentioned, it is difficult to discern the trajectory of how that expertise is cultivated over time. While there is not a chapter dedicated to reflective practice in this book, it is a theme communicated throughout the various chapters. As you work on developing your expertise as an instructional designer, think about your own trajectory. How have you adjusted your design practices to support your acquisition of conditional and strategic knowledge?
TABLE 1.1 Documenting Areas for Your Professional Development

For each knowledge domain (Procedural Knowledge, Conceptual Knowledge, Conditional Knowledge, and Strategic Knowledge), consider:
• What instructional design skills do you feel comfortable applying in your projects?
• What instructional design skills would you like to improve?
Developing One’s Expertise

If you are reading this book, there is a good chance that you are an instructional designer. As you continue to explore the messiness that is instructional design and refine your craftsmanship, I implore you to take some time and reflect on your instructional design experience and knowledge across the procedural, conceptual, conditional, and strategic knowledge domains. What areas do you want to work on improving? Can you trace the challenges you experience on your design projects to a particular domain? What are your goals for enhancing your craftsmanship in the years to come? Each chapter in this book is geared toward exploring a topic relevant to instructional design work across these four domains. As you continue through the book, I encourage you to document areas that you would like to explore and work on across the domains to support the development of your expertise (see Table 1.1).
Development of Instructional Design Expertise as a Continuum

Researchers who study the development of expertise recognize that it is enhanced in phases. Alexander (2003) suggests that the development of one’s expertise goes through three phases: acclimation, competence, and proficiency. During the acclimation phase, the learner is introduced to unfamiliar territory. Think about the first course you ever took as an instructional designer. Many students new to instructional design will describe the experience as learning a new language. There are so many vocabulary terms and nuanced phrases that are unique to our field. Think about the first instructional design project you completed. What did you know about instructional design at that time? Whom did you rely on to get the information you needed to complete your work? While the development of instructional design expertise is not formulaic, we can anticipate that an individual going through the acclimation phase as a new or aspiring instructional designer will more than likely
focus their attention on conceptual knowledge related to the course or project they are completing. As they engage in the second phase, competence, they will begin to think more deeply in terms of the processes they are employing as they complete their tasks. During this phase, we can anticipate that the individual will focus more attention on the procedural domain. As they continue to demonstrate competence, we will see them build upon their conditional and strategic knowledge domains. As an instructional designer accrues more design experiences in which to practice applying their skills in different situated contexts, they will have more opportunities to see the implications of their design decisions. This becomes particularly evident when an instructional designer must adjust project expectations, guidelines, and tasks as different contextual factors and affordances contribute to or hinder the instructional environment, the design space, or both.

The last phase of expertise development is proficiency. By the time an individual reaches this phase, they can demonstrate synergy between their ability to acclimate to a situation and demonstrate competence (Alexander, 2004). This coincides with the way expert instructional designers are described as they assess their design space when working on a project. Swan and colleagues (2020) described this level of expertise as being adaptive. When an individual achieves proficiency or adaptive expertise, there is a degree of automaticity that occurs when they execute tasks. Swan and colleagues (2020) proposed the term fluency to describe the transition from completing tasks in a very deliberate and linear fashion to automaticity. At this phase, the individual is able to draw across all knowledge domains simultaneously to support their actions. They are more apt to identify patterns of performance, anticipate needs, and understand the complexities inherent in each solution they propose.
So, what would adaptive expertise look like in instructional design? Several studies that have examined instructional design expertise report that experts consider multiple facets of a project at one time (Ertmer et al., 2009; LeMaistre, 1998; Perez et al., 1995; Rowland, 1992; Rowley, 2005). Rowland (1992) describes expert instructional designers as demonstrating a zig-zag, non-linear approach. Within this approach, they continually assess the situation and make the necessary modifications to their designs in reaction to unique circumstances associated with the project.
Striving to Avoid Linearity

One way to think about the development of our instructional design expertise is to consider how we approach our design projects. Research has shown that novice instructional designers typically subscribe to a
more linear approach, taking a step-by-step account of analysis, design, development, implementation, and evaluation. They do not tend to revisit phases once a decision has been made (Ertmer & Stepich, 2005; Rowland, 1992; Stefaniak et al., 2022). The expert instructional designer embraces the non-linearity of design. They constantly surveil the situation and use an iterative approach. Phases are revisited frequently in a manner that does not hinder progress or timelines for completing the work. The topics in the remaining chapters of this book are structured to help you identify ways you can adopt a recursive and iterative process as you hone your craft and contribute to the development of your instructional design expertise.
Connecting Process to Practice Activities

1. Create a list of the various activities you engage in as an instructional designer. What challenges have you faced with your projects? What strategies have you employed to complete the tasks when information and resources are limited?
2. Instructional design is a culmination of science, craft, and art. Use Table 1.2 to outline how you are approaching your current instructional design projects.
3. The term fluency is often associated with language acquisition to describe when an individual is able to comprehend a language and reproduce words according to its grammar (Swan et al., 2020). How does this translate to instructional design? What does fluency look like for an instructional designer? When would someone be able to say they are fluent in instructional design rhetoric?
4. Reflective practice is inherent in instructional design. Take a few moments to reflect on your professional development and learning goals to support your design capabilities. What do you want to learn as an instructional designer to enhance your knowledge base? How do these items relate to procedural knowledge, conditional knowledge, and conceptual knowledge?

TABLE 1.2 Reflecting on the Science, Craft, and Art of Instructional Design

Science: What learning theories and research are guiding your work? (e.g., models, particular instructional strategies, multimedia principles, etc.)
Craft: What is your experience completing the required tasks for your current instructional design project? What are you applying from those experiences? What are you doing differently?
Art: In what ways are you able to demonstrate creativity in your design work? How does this project differ from previous projects?
Bridging Research and Practice

Chartier, K. J. (2021). Investigating instructional design expertise: A 25-year review of literature. Performance Improvement Quarterly, 34(2), 111–130. https://doi.org/10.1002/piq.21345
Ertmer, P. A., & Stepich, D. A. (2005). Instructional design expertise: How will we know it when we see it? Educational Technology, 45(6), 38–43.
Hardré, P. L., Ge, X., & Thomas, M. K. (2005). Toward a model of development for instructional design expertise. Educational Technology, 45(1), 53–57.
Hardré, P. L., Ge, X., & Thomas, M. K. (2006). An investigation of development toward instructional design expertise. Performance Improvement Quarterly, 19(4), 63–90. https://doi.org/10.1111/j.1937-8327.2006.tb00385.x
Rowland, G. (1992). What do instructional designers actually do? An initial investigation of expert practice. Performance Improvement Quarterly, 5(2), 65–86. https://doi.org/10.1111/j.1937-8327.1992.tb00546.x
Stefaniak, J. E., & Hwang, H. (2021). A systematic review of how expertise is cultivated in instructional design coursework. Educational Technology Research and Development, 69(6), 3331–3366. https://doi.org/10.1007/s11423-021-10064-x
References

Alexander, P. A. (2003). The development of expertise: The journey from acclimation to proficiency. Educational Researcher, 32(8), 10–14. https://doi.org/10.3102/0013189X032008010
Alexander, P. A. (2004). A model of domain learning: Reinterpreting expertise as a multidimensional, multistage process. In D. Y. Dai & R. J. Sternberg (Eds.), Motivation, emotion, and cognition: Integrative perspectives on intellectual functioning and development (pp. 273–298). Routledge.
Anderson, L. W., Krathwohl, D. R., Airasian, P. W., Cruikshank, K. A., Mayer, R. E., Pintrich, P. R., Raths, J., & Wittrock, M. C. (2001). A taxonomy for learning, teaching, and assessing: A revision of Bloom’s taxonomy of educational objectives (abridged edition). Longman.
Bannan-Ritland, B. (2003). The role of design in research: The integrative learning design framework. Educational Researcher, 32(1), 21–24. https://doi.org/10.3102/0013189X032001021
Boling, E., Gray, C., & Lachheb, A. (2022). Inscribing a designer mindset to instructional design students. In J. E. Stefaniak & R. M. Reese (Eds.), The instructional design trainer’s guide: Authentic practices and considerations for mentoring ID and ed tech professionals (pp. 18–28). Routledge.
Bransford, J., Brown, A., & Cocking, R. (2000). How people learn: Brain, mind, experience, and school (2nd ed.). National Academy Press.
Burgin, M. S. (2017). Theory of knowledge: Structures and processes. World Scientific.
Definition of Expert. (2020). Retrieved May 8, 2020, from https://www.merriam-webster.com/dictionary/expert
Dick, W., Carey, L., & Carey, J. (2009). The systematic design of instruction (7th ed.). Allyn & Bacon.
Ertmer, P. A., & Stepich, D. A. (2005). Instructional design expertise: How will we know it when we see it? Educational Technology, 45(6), 38–43.
Ertmer, P. A., York, C. S., & Gedik, N. (2009). Learning from the pros: How experienced designers translate instructional design models into practice. Educational Technology, 49(1), 19–27.
Gibbons, A. S. (2014). Eight views of instructional design and what they should mean to instructional designers. In B. Hokanson & A. S. Gibbons (Eds.), Design in educational technology (pp. 15–36). Springer.
Hardré, P. L., Ge, X., & Thomas, M. K. (2005). Toward a model of development for instructional design expertise. Educational Technology, 45(1), 53–57.
Hardré, P. L., & Kollmann, S. (2013). Dynamics of instructional and perceptual factors in instructional design competence development. Journal of Learning Design, 6(1), 46–60.
Hartt, D. C., & Rossett, A. (2000). When instructional design students consult with the real world. Performance Improvement, 39(7), 36–43. https://doi.org/10.1002/pfi.4140390712
Kennedy, M. M. (1987). Inexact sciences: Professional education and the development of expertise. Review of Research in Education, 14(1), 133–167.
LeMaistre, C. (1998). What is an expert instructional designer? Evidence of expert performance during formative evaluation. Educational Technology Research and Development, 46(3), 21–36. https://doi.org/10.1007/BF02299759
McDonald, J. (2022). Preparing instructional design students for reflective practice. In J. E. Stefaniak & R. M. Reese (Eds.), The instructional design trainer’s guide: Authentic practices and considerations for mentoring ID and ed tech professionals (pp. 29–37). Routledge.
McDonald, J. K., Rich, P. J., & Gubler, N. B. (2019). The perceived value of informal, peer critique in the instructional design studio. TechTrends, 63(2), 149–159. https://doi.org/10.1007/s11528-018-0302-9
McDonald, J. K., Stefaniak, J., & Rich, P. J. (2022). Expecting the unexpected: A collaborative autoethnography of instructors’ experiences teaching advanced instructional design. TechTrends, 66(1), 90–101. https://doi.org/10.1007/s11528-021-00677-7
Morrison, G. R., Ross, S. M., Kalman, H. K., & Kemp, J. E. (2013). Designing effective instruction (7th ed.). Wiley.
Nelson, H. G. (2020). The promise of systemic designing: Giving form to water. In M. J. Spector, B. B. Lockee, & M. D. Childress (Eds.), Learning, design, and technology: An international compendium of theory, research, practice, and policy (pp. 1–49). Springer.
Nelson, H. G., & Stolterman, E. (2012). The design way: Intentional change in an unpredictable world (2nd ed.). Routledge.
Paris, S. G., Lipson, M. Y., & Wixson, K. K. (1983). Becoming a strategic reader. Contemporary Educational Psychology, 8, 293–316. https://doi.org/10.1016/0361-476X(83)90018-8
Parrish, P. (2012). What does a connoisseur connaît? Lessons for appreciating learning experiences. In S. B. Fee & B. R. Belland (Eds.), The role of criticism in understanding problem solving: Honoring the work of John C. Belland (pp. 43–53). Springer.
Perez, R. S., & Emery, C. D. (1995). Designer thinking: How novices and experts think about instructional design. Performance Improvement Quarterly, 8(3), 80–95. https://doi.org/10.1111/j.1937-8327.1995.tb00688.x
Perez, R. S., Johnson, J. F., & Emery, C. D. (1995). Instructional design expertise: A cognitive model of design. Instructional Science, 23(5–6), 321–349.
Richey, R. C., Klein, J. D., & Tracey, M. W. (2011). The instructional design knowledge base: Theory, research, and practice. Routledge.
Rowland, G. (1992). What do instructional designers actually do? An initial investigation of expert practice. Performance Improvement Quarterly, 5(2), 65–86. https://doi.org/10.1111/j.1937-8327.1992.tb00546.x
Rowley, K. (2005). Inquiry into the practices of expert courseware designers: A pragmatic method for the design of effective instructional systems. Journal of Educational Computing Research, 33(4), 419–450. https://doi.org/10.2190/9MLR-ARTQ-BD1P-KET
Sentz, J., & Stefaniak, J. (2019). Instructional heuristics for the use of worked examples to manage instructional designers’ cognitive load while problem-solving. TechTrends, 63(2), 209–225. https://doi.org/10.1007/s11528-018-0348-8
Skinner, B. F. (1954). The science of learning and the art of teaching. Harvard Educational Review, 24, 86–97.
Smith, P. L., & Ragan, T. J. (2005). Instructional design (3rd ed.). Jossey-Bass.
Stark, R., Kopp, V., & Fischer, M. R. (2011). Case-based learning with worked examples in complex domains: Two experimental studies in undergraduate medical education. Learning and Instruction, 21(1), 22–33. https://doi.org/10.1016/j.learninstruc.2009.10.001
Stefaniak, J., Baaki, J., & Stapleton, L. (2022). An exploration of conjecture strategies used by instructional design students to support design decision-making. Educational Technology Research and Development, 70(2), 595–613. https://doi.org/10.1007/s11423-022-10092-1
Stefaniak, J., & Xu, M. (2020). An examination of the systemic reach of instructional design models: A systematic review. TechTrends, 64, 710–719. https://doi.org/10.1007/s11528-020-00539-8
Stepich, D. A., & Ertmer, P. A. (2009). “Teaching” instructional design expertise: Strategies to support students’ problem-finding skills. Technology, Instruction, Cognition & Learning, 7(2).
Svihla, V. (2020). Problem framing. In J. K. McDonald & R. E. West (Eds.), Design for learning: Principles, processes, and praxis. EdTech Books. https://edtechbooks.org/id/problem_framing
Swan, R. H., Plummer, K. J., & West, R. E. (2020). Toward functional expertise through formal education: Identifying an opportunity for higher education. Educational Technology Research and Development, 68(5), 2551–2568. https://doi.org/10.1007/s11423-020-09778-1
Tracey, M. W., & Hutchinson, A. (2016). Uncertainty, reflection, and designer identity development. Design Studies, 42, 86–109. https://doi.org/10.1016/j.destud.2015.10.004
Vu, P., & Fadde, P. J. (2013). When to talk, when to chat: Student interactions in live virtual classrooms. Journal of Interactive Online Learning, 12(2), 41–52.
York, C. S., & Ertmer, P. A. (2016). Examining instructional design principles applied by experienced designers in practice. Performance Improvement Quarterly, 29(2), 169–192. https://doi.org/10.1002/piq.21220
2
DECISION-MAKING STRATEGIES TO SUPPORT MANAGING THE DESIGN SPACE
Chapter Overview

This chapter provides an overview of decision-making strategies employed by instructional designers in the field. Rational and dynamic decision-making processes will be discussed. Common terms such as design decisions, decision-making processes, judgments, and conjectures will be differentiated. Tools will be provided to support instructional designers’ awareness of their own decision-making practices through the various phases of their instructional design projects.
Guiding Questions

1. What are the differences between rational and dynamic decision-making processes?
2. What is the design space?
3. What strategies can instructional designers employ to manage the design space?
4. What types of design judgments do you frequently make as an instructional designer?
Fundamentals of Decision-Making in Instructional Design

If we were to engage in an experiment where we attempted to track how many decisions we made during a design project, we would find out very quickly that we would lose count. More than likely, there would be several decisions that we would simply forget to track or include in our list. We might engage in actions that we would not necessarily recognize as being decisions. As someone who studies decision-making, I think it would be a pointless exercise to count every single decision we make during a project. I do, however, think that developing an understanding of how we approach decisions and the challenges we may encounter while making
decisions can significantly help us improve our instructional design practices and save our sanity! As mentioned in Chapter 1, instructional design is messy. An experienced designer will attest that as they gain experience over time, they are able to make decisions in a way that appears seamless. They are able to draw from their knowledge of instructional design, previous experience, and awareness of the current situation and make informed decisions that help move the project along.

A decision is a choice. An individual makes a decision when they are presented with more than one option. According to Skyttner (2001), four things must be present in a situation in order for a decision to be made:

1. A problem exists.
2. At least two alternatives for action remain.
3. Knowledge exists of the objective and the relationship to the problem.
4. The consequences of the decision can be estimated and sometimes quantified. (p. 340)
Considering Options for a Faculty Development Program

Patrick is an instructional designer in the Center for Teaching and Learning at a local university. He was recently approached by the Office of Faculty Affairs, which asked him to develop a workshop to help faculty identify ways they can promote interaction among students in asynchronous online courses. The Office of Faculty Affairs shared with Patrick that it was up to him to decide whether the workshop is delivered face-to-face on campus or offered online. Table 2.1 provides an example of how a decision may be dissected in an instructional design context as Patrick considers his options for delivery format.
As you look at the various options that Patrick may consider in Table 2.1, you may notice a couple of things. While two options were provided for Patrick to consider in terms of course delivery, there may have been other options we could have added to complicate the situation. We could have offered asynchronous online workshops to allow faculty to complete the activities independently on their own time. We could have offered a hybrid approach that would have provided faculty with a choice of attending face-to-face or online. We also could have brainstormed additional options, such as offering the workshop over the course of a couple of sessions blending face-to-face and online activities.
TABLE 2.1 Dissection of an Instructional Design Decision

Characteristics Present Warranting a Decision (Skyttner, 2001), paired with the corresponding Course Delivery Decision:

A problem exists.
A faculty development workshop is needed to help faculty identify ways they can promote interaction among students in asynchronous online courses.

At least two alternatives for action remain.
Option 1: Deliver the workshop in a face-to-face setting on campus.
Option 2: Deliver the workshop in an online synchronous format.

Knowledge exists of the objective and the relationship to the problem.
Faculty will benefit from examples of how they can promote interaction among their students in online courses. There are a variety of instructional strategies they can integrate into their courses to accomplish this task.

The consequences of the decision can be estimated and sometimes quantified.
Option 1: Face-to-Face Workshop.
• Patrick can show faculty activities they can incorporate in their classes, but they may not be able to experience those activities the way they would if they were online.
• Faculty may not have an opportunity to integrate activities during the workshop if they do not bring a laptop.
• Patrick may have to schedule follow-up meetings with faculty to assist them with integrating these activities in their courses if they ask for additional help.
• Fewer faculty may attend the face-to-face session if they are not already scheduled to be on campus the day of the workshop.
Option 2: Online Workshop.
• Patrick can have the faculty engage in interactive activities online during the workshop to see what the students may encounter in their classes.
• Faculty may have an opportunity to integrate activities in their course management systems during the workshop.
• Patrick may have an easier time assisting faculty with their courses if they show him their classes during the online workshop.
• An online workshop may attract more faculty who tend to work remotely.
Regardless of which option Patrick may choose to take for this project, Table 2.1 also shows examples of some of the consequences that accompany each option. This could be further magnified if we added more alternatives for him to consider for course delivery. Depending on the consequence, Patrick may now have to make additional decisions as he weighs the advantages and disadvantages of each option. It is also important to recognize that Table 2.1 provides an overly simplified example. We have not even begun to consider Patrick’s previous experiences delivering workshops, either face-to-face or online, at his institution. He is probably going to rely on what he knows about the success of delivering workshops in different formats to the faculty as he makes his decision. He is also going to consider various contextual factors that may support or hinder the workshop. Factors such as the technological capabilities of a room will be considered. What platform might he consider using if he delivered the workshop online? How much time would it take to deliver the workshop and provide adequate time for the faculty to attempt different instructional strategies in their courses?

This scenario is an example of what most decisions look like in instructional design. It is not a one-size-fits-all situation. What may work for one institution may not work as well for another. What may have worked well for other face-to-face workshops that Patrick has delivered in the past may not be optimal for this particular situation given the nature of the topic. Because instructional design presents situations that warrant many decisions and options, it is important that instructional designers develop an awareness of the types of decisions they regularly make and their approach to such decisions. Yates (2003) describes decision-making as the process by which an individual commits to a decision that yields optimal results.
During this process, an individual will consider the nature of the problem and what they know about the problem. They will identify alternative approaches to solving the problem, weighing the advantages and disadvantages of each possible solution. After considering each alternative, they will move forward with a resolution that is optimal for the situation. As you are reading this, you may think that this sounds like a lot of work. You may be thinking, “I don’t do that when I’m making decisions for my projects.” Unless you are randomly picking decisions out of a hat, I can argue that you are, in actuality, going through this process with every decision you make.
Decision Typologies

There are a variety of types of decisions an instructional designer may encounter. Yates and Tschirhart (2006) explain that decisions can be categorized as choices, acceptances/rejections, and constructions. Choices require us to select a solution from a larger set of solutions. As we make
choices, we will consider the advantages and disadvantages of each option before proceeding with one. Acceptances and rejections are a type of binary decision where we decide to accept the solution that has been presented or reject it. Think of these examples as a yes-or-no option. Evaluative decisions involve assigning worth to a possible option (Fitzpatrick et al., 2011). When we engage in evaluative decisions, we not only consider the worth that proceeding with a particular solution may have for the situation but also need to consider the level of commitment required for implementing the solution. It is important that we be realistic in acknowledging the level of commitment needed and the level of commitment we (or our team) are willing to provide to the solution. Constructive decisions involve the process of considering solutions that take into account the resources we have available at the time of decision-making (Jonassen, 2012). Table 2.2 provides an overview of each type along with examples adapted from Stefaniak (2020a).

When engaged in making dynamic decisions, instructional designers must have a complete awareness of factors including their knowledge, contextual factors, and time constraints. Lachheb and Boling (2021) differentiate between design decision-making and design judgments by explaining that design decisions are the "'what and how,' of design, whereas design judgments have to do with the 'why' a design decision has been made" (para. 2). Within each of these typologies of decisions, instructional designers invoke design judgments. Judgment is based on knowledge derived from a unique situation (Nelson & Stolterman, 2012). As instructional designers engage in design, they may invoke several judgments to make decisions (Lachheb & Boling, 2021). In their book, The Design Way: Intentional Change in an Unpredictable World, Nelson and Stolterman (2012) distinguish design judgment as being independent from decision-making but equally necessary.
They suggest:

[Judgment] is not dependent on rules of logic founded within rational systems of inquiry. Judgment is not founded on strict rules of reasoning. It is more likely to be dependent on the accumulation of the experience of consequences from choices made in complex situations. (p. 139)

Nelson and Stolterman (2012) have identified 11 design judgments commonly invoked by designers (Table 2.3). Researchers in instructional design have explored the prevalence of these judgments in different instructional design settings (e.g., Boling et al., 2017; Demiral-Uzan, 2015; Gray et al., 2015; Zhu et al., 2020). Various types of design judgments may include, but are not limited to, core judgments based on an instructional designer's predispositions, values, and beliefs; navigational judgments where an instructional designer may determine the best path for presenting content in a
TABLE 2.2 Types of Decisions and Examples in Instructional Design

Choices. Selecting a solution from a larger set of options. Example: An instructional designer has been asked to help a local museum with developing learning materials for their patrons. During their brainstorming meeting with the museum staff, they discuss the possibility of using audio headsets, mobile learning, QR (Quick Response) codes, online learning modules, and face-to-face training programs as training options.

Acceptances/Rejections. A binary decision where the individual accepts or denies a solution. Example: An instructional designer submits a proposal to present their project at a national instructional design conference. Reviewers responsible for reading the proposal must decide to accept or reject the conference proposal.

Evaluation. Determining the worth and level of commitment associated with proceeding with a particular solution. Example: An instructional design firm in a metropolitan city meets with a not-for-profit organization to discuss their training needs. During a few of the initial conversations, the firm realizes that their client would not be able to pay the typical fees they charge for their instructional design services. The CEO of the instructional design firm sees the impact that the not-for-profit has made in the local community and decides that they can offer a few of their services pro bono.

Constructions. The process of selecting the most ideal solution on the basis of available resources. Example: An instructional design program discusses the options for offering two special topics courses to their students in the upcoming year. Program faculty discuss possible topics and determine which ones might be of the most interest to their students. During their discussions, they identify potential instructors for the courses and look to see how this might impact regular course offerings and instructor assignments.
TABLE 2.3 Design Judgments Commonly Invoked by Designers (Nelson & Stolterman, 2012)

Core: Draws from designers' beliefs and values that contribute to their approach.
Instrumental: Identifying appropriate tools to assist with decision-making and the project.
Framing: Establishing parameters around a project.
Default: Making judgments without deliberation. These judgments have a degree of automaticity where the designer does not have much hesitation.
Deliberated Offhand (DOH): Relies on recall of previous default judgments that were successful.
Appreciative: Assigning importance to a particular aspect of the project.
Quality: Judgments made within the boundaries of the project that give credence to craftsmanship.
Appearance: Focusing on the temporal aspects of the design.
Navigational: Considering the path required to complete the design project.
Connective: Making interconnections between objects to form functional assemblies.
Compositional: Bringing parts of a design together to form a whole.
course or learning experience; and quality judgments where the instructional designer is comparing an existing design against standards expected by the organization, the profession, or both.

A central theme in each of the instructional design studies that have been conducted is that instructional designers often find themselves making several judgments simultaneously. These studies have also noted several challenges with studying design judgments (Gray et al., 2015). It is difficult to study design judgment because instructional designers may not be conveying everything that they are thinking about while they engage in design. Furthermore, most studies on judgment are a matter of interpretation (Tomita et al., 2021). While researchers may observe a designer or analyze their descriptions and reflections of design activities, these accounts may not accurately reflect what is actually occurring. Although we know the challenges associated with studying decision-making and design judgments, these studies can help us develop an understanding of how different contextual factors and experiences may influence instructional designers' abilities to make design decisions.

This description of judgment builds upon our discussions of the acquisition of expertise in instructional design in Chapter 1. As instructional designers participate in design projects and gain familiarity with designing for different situations and contexts, they will be able to make design judgments on the basis of their knowledge of consequences resulting from previous design situations.
Lachheb and Boling (2021) suggest that while designers will engage in all of the previously mentioned design judgments, three judgments in particular play a critical role in instructional design: core, instrumental, and framing. Nelson and Stolterman (2012) argue that core judgments are often made by the designer unconsciously, whereas designers can consciously invoke framing and instrumental judgments. Instructional design requires a balancing act where the instructional designer is pulling from the unconscious to inform the conscious and vice versa. We could imagine this process visually as a teeter-totter we would find on a playground. Core judgments would be on one side, and framing and instrumental judgments on the other. As the instructional designer works on the project, their core judgments are influencing their approach and abilities to make framing and instrumental judgments. Likewise, the success that they have with their framing and instrumental judgments will influence their core judgments over time.

I agree with Lachheb and Boling's (2021) sentiment that core, framing, and instrumental judgments are most prevalent in instructional design, as they perhaps occur more frequently than some of the other judgment types identified by Nelson and Stolterman (2012). Drawing from our core values, establishing parameters for a project, and identifying the tools needed to make design decisions are present in every design project. I would argue, however, that a fourth judgment should be weighted with the same importance as core, framing, and instrumental judgments: compositional judgment, which is also critical to the success of any instructional design process and should be inherent in every design project. Compositional judgments involve the instructional designer bringing components (parts) of a system together to form a whole.
Morrison and colleagues (2013) state that "the goal of instructional design is to make learning more efficient, more effective, and less difficult" (p. 2). When instructional designers engage in design work, they should be asking themselves three questions:

1) Am I contributing to effectiveness?
2) Am I contributing to efficiency?
3) Am I contributing to ease of learning?

When an instructional designer answers "no" to any of those three questions while working on a project, it is a good indication that the systemic implications of their decisions were not considered. It is important that instructional designers be able to embrace a systems view of their environment and invoke the necessary judgments to bring various components of the system together (Gibbons, 2014; Kowch, 2019; Stefaniak, 2020b; Stefaniak & Xu, 2021).
Decision-Making Processes

Oftentimes, we make decisions intuitively based on our knowledge of instructional design and prior experience. We do not typically reflect on the specific steps we are completing to make a decision. Regardless of whether we are aware of the process we are following in the decision-making moment, our instructional design decision-making processes can be categorized as rational or dynamic (Jonassen, 2012).

Rational decision-making processes are often followed when we have the gift of time. There are no time pressures influencing our abilities to analyze a situation. A rational decision-making process follows a systematic and linear process where the decision-maker considers a variety of alternative solutions, analyzes the advantages and disadvantages of each, and selects a solution that is optimal (De Martino et al., 2006; Klein, 1998). Rational decision-making processes are typically followed when we engage in long-term planning. Usually, a deadline has not been immediately imposed upon us for making a decision. We have time to reflect, discuss, and collaborate with others. The following are examples of rational decision-making processes that an instructional designer may engage in:

• A director of training and development has tasked a group of instructional designers to examine alternatives to their current learning management system (LMS). The contract with their existing LMS will expire in a year, and before renewing the license, the director would like to know what other options may exist.
• The Center for Teaching and Learning at a local university has included a desire to increase faculty development programs in their latest strategic plan over the course of the next three years. The instructional design team begins meeting to discuss potential programming they can add to their existing faculty development offerings.
• A K-12 school district recently asked their instructional coaches and instructional technologists to provide some additional professional development opportunities for teachers about online learning tools. Since the Covid-19 pandemic, teachers have expressed that it would be helpful if they could learn more about the learning tools they currently have available with existing software and content management systems to engage their students.

If we were to look for similarities among these three examples, we would see that there doesn't appear to be a sense of urgency for arriving at a solution. The first two examples imply that a deadline for a decision exists; however, the instructional designers in both examples have an extended period of time to engage in decision-making. The first example acknowledges that they have a year to collect information about different LMSs they may want to consider in the future. The second example notes
that the Center for Teaching and Learning has three years to work on increasing their program offerings. The third example does not offer any sort of time line. We can look at this example from the instructional coaches' lens as well as the teachers' lens. The instructional coaches have been asked to identify topics for future professional development sessions; however, the example implies that there is a very loose deadline because no reference to time is provided. Furthermore, this example suggests that no time line is being imposed on the teachers to implement any of the new technologies they may learn about in the professional development sessions.

In each of these examples, the individuals that have been tasked with making decisions have sufficient time for planning how they will approach their decision-making. They have time to consult with others, ask questions, and make determinations on things such as how many LMSs they should explore in the first example or how many new programs and topics they should select in the second and third examples.

While instructional designers do get to engage in rational decision-making processes, these opportunities appear to be few and far between. More often than not, instructional designers are engaged in dynamic decision-making processes where they are required to make prompt decisions given time constraints (Klein, 2008). These types of decisions do not always allow for sufficient time to consider alternative options. Instead, the instructional designers are required to leverage their prior knowledge, experience, and awareness of contextual factors influencing the situation that warrants a prompt decision (Jonassen, 2012; Stefaniak & Xu, 2020). The following are examples of dynamic decision-making processes an instructional designer may engage in:

• An instructional design team is reviewing materials for an e-learning course with their client.
At the meeting, the client informs them that they will have to reduce their budget for the educational videos that the instructional design company is going to produce. Instead, they will need to implement a cheaper solution within the next four days to meet the final project deadline.
• An instructional designer is delivering a workshop related to a new software package that is to be used to manage customer service calls. He soon realizes that the staff in attendance are confused about where to find the information needed to fill out one of the forms. During the workshop, the instructional designer adapts his workshop to address their immediate needs.
• Instructional designers at a local university have been tasked with assisting faculty in transitioning their face-to-face learning materials to online platforms in the midst of the Covid-19 pandemic. Students at the school are away on spring break. All materials need to be accessible online in five days.
Each of these examples demonstrates that urgency is a key factor prompting the instructional designers' decisions. In the first example, the instructional design team may be required to brainstorm an alternative solution immediately while they are in the meeting with their client so they can obtain client approval and finalize their project within the remaining four days. They do not necessarily have the time to do extensive research on several different alternatives. They will need to consider what they have access to immediately and what is feasible to complete within a four-day time constraint.

The second example illuminates what instructional designers and teachers encounter on a regular basis. Think about a time when you were delivering instruction. Did you ever have to adjust your curriculum because your learners were confused about the content? Did a learning activity take much longer than what you had initially thought it would? These types of experiences require us to make quick decisions in situ as we adjust, modify, or eliminate content in an attempt to support our learning audience.

The third example shows how greatly time constraints influence our dynamic decision-making. Many educators and instructional designers had to make swift adjustments to their instructional materials within a very short period of time when the pandemic first emerged. Many K-12 schools and higher education institutions had to resort to what I call good-enough design. They did not have the luxury of time to make significant improvements to curriculum and learning activities as they transitioned materials from face-to-face classrooms to online learning environments. Faculty who had never taught online did not have opportunities to learn about optimal instructional strategies to facilitate learning in online environments.
Many were recording lectures and posting materials that followed a format similar to what they were doing in their face-to-face classes. The goal at the beginning of the pandemic was simply to ensure that students had something to access in an online format. Improvements could be made afterwards. Imagine the difference if faculty had been provided four months to transition their materials as opposed to one week. Think about what types of resources and training workshops could have been developed to support the faculty who had never taught online courses prior to the pandemic. Think about the modifications that could have been made to the content if faculty had opportunities to consult with instructional designers to change learning activities to be conducive to online discussions, depending on whether the classes were being offered in synchronous or asynchronous formats.
Getting Comfortable with the Unknown

Dynamic decision-making requires instructional designers to pull from their strategic and conditional knowledge bases. Situations requiring prompt decisions task us with figuring out a strategy we can employ that addresses the needs of the situation and accounts for the environmental factors. When an instructional designer engages in dynamic decision-making, they need to become comfortable with the unknown.
Most situations that warrant dynamic decision-making will also include a degree of ambiguity. The instructional designer may not have access to all the information they would ideally like to have before deciding. Instead, they must rely heavily on their knowledge of instructional design, their prior experience, and their awareness of particular contextual factors that need to be accounted for in the solution. I do not think that many, if any, instructional designers enjoy having to design amidst ambiguity, but it is par for the course. One of the most exciting things about instructional design is that it is a field that does not produce a one-size-fits-all solution to instructional situations. You can ask students enrolled in instructional design programs, and often one of the most frustrating responses they receive from their instructors when asking a question is "It all depends."

Studies that examined decision-making processes of instructional designers have found that less experienced designers may enter a state of design paralysis where they are unable to move forward with a design decision. They may feel overwhelmed by the situation and by not having all the information they would like to have. Depending on their experience, they may feel that, owing to the ambiguity and uncertainty present, they are unable to make a decision. They may question their ability to rely on what they know about instructional design and the situation to make an informed decision promptly. The time pressures that accompany dynamic decision-making may sometimes impose stress on a novice or less experienced designer (Ertmer & Stepich, 2005). Studies comparing novice and expert instructional designers report novices exhibiting a hesitation in making prompt decisions (e.g., Ertmer et al., 2008; Hoard et al., 2019; Rowland, 1992; Rowley, 2005; Stefaniak et al., 2018).
They tend to feel uncomfortable that they do not have time to explore their options and consult with others (Stefaniak et al., 2022). These studies have found that they often underestimate their capabilities and second-guess themselves as they engage in dynamic decision-making. Studies examining differences between novice and expert instructional designers have found that experts are quicker to offer conjectures to make prompt decisions. A conjecture is an educated guess. Murty and colleagues (2010) describe the process of conjecturing as the ability of a designer to hypothesize different solutions on the basis of their prior knowledge and experience.
Understanding the Relationship Between Conjectures, Judgments, and Decision-Making

This chapter has introduced several components that contribute to instructional design decision-making: decision typologies, design judgments, design conjectures, and decision-making processes. While these are distinct concepts, they come together to
FIGURE 2.1 Relationship between Conjectures, Design Judgments, and Decision-Making Processes. [Figure elements: Design Problem; Prior Knowledge; Instructional Designer; Judgment(s); Decision-Making Process (Rational Decisions or Dynamic Decisions); conjecture cycles; Design Decision.]
support the instructional designer's decision-making. Lachheb and Boling (2021) suggest that a good way to envision the relationship between these components is to "think about design decisions as the 'what and how,' of design, whereas design judgments have to do with the 'why' a design decision has been made" (para. 2). If we think of design judgments as tools to help us understand why a decision has been made, we can add that contextual factors present in a unique situation contribute to the rationale for a given design judgment. In addition, I see conjecturing as bridging the decision-making process and design judgments (Figure 2.1).

Depending on the urgency for a decision to be made, the instructional designer will engage in either a rational or dynamic decision-making process, during which they will make a series of design judgments dependent on contextual factors influencing the situation. While engaging in the decision-making process, the instructional designer will engage in a cyclical process of conjecturing where they draw from their repertoire of instructional design knowledge, prior experience, and what they know about the current situation to make educated guesses that move their decision-making process forward. As more information is made available to the designer, they will continue to engage in multiple iterations of conjecturing until they arrive at a decision or time constraints halt the process. This iterative process of conjecturing helps the instructional designer manage the design space allowed for their project.
Managing the Design Space

Many of the design judgments an instructional designer will invoke while making decisions are associated with managing the affordances provided in a learning space. Learning space has been described by Hmelo-Silver (2013) as consisting of the problem space and the conceptual space. Ertmer
and Stepich (2005) have divided activities associated with problem space into two categories: problem-finding space and problem-solving space. Examples of problem-finding space include identifying key stakeholders associated with a problem, determining the role of the instructional designer in the project, identifying project constraints, and describing key relationships. Counter to problem-finding space, problem-solving space activities consist of devising solutions to bring stakeholders together, identifying viable solutions, and predicting potential consequences of proposed solutions (Ertmer & Stepich, 2005).

Research exploring the extent to which instructional designers engage in design decision-making and judgment during the co-evolution process is minimal (Boling et al., 2017; Stefaniak, 2021; Tomita et al., 2021). This coincides with other design scholars recognizing that additional empirical studies are needed to better understand how designers engage in co-evolution, where they negotiate between problem space and solution space (Crilly & Morosanu Firth, 2019; Dorst, 2019; Wiltschnig et al., 2013).

Each design project we work on brings its own unique set of challenges and constraints. During projects, instructional designers must balance the internal and external factors that influence the learning environment to maintain oversight of the design space (Ertmer & Koehler, 2014). In Chapter 1, I suggested that problem framing could be used to assist instructional designers with transitioning from the problem-finding space to the problem-solving space. Table 2.4 can be used to help support your design conjectures as you consider what you know about your design situation and what information is still needed.

There are different tools that can be used to help support your instructional design decision-making. Common tools used in our field include design documents, external representations, group repositories, rapid prototyping, and reflection journals.
Each of these tools is great for recordkeeping as you engage in multiple iterations of design during a project.

TABLE 2.4 Supporting Your Conjectures
Rows: Knowledge related to the situation; Prior experience making similar design decisions; Contextual factors influencing the situation
Columns to complete: Additional Questions Warranting Clarification for the Design Space; Notes to Guide Decision-Making

A design document, also known as a design plan, is a document that serves as a blueprint for a design project. The information typically appearing in a design document includes course goals, learning objectives, instructional platforms (i.e., face-to-face, online, etc.), instructional activities and strategies, assessments, project time lines, and budgets. Design documents typically include a list of all individuals involved in the project, such as instructional designers, graphic designers, developers, and project managers. Detailed time lines are typically documented to notify the team about expectations related to completing tasks. Establishing parameters around our design space for each project alerts us and others to how much room we have to stretch when accommodating learners, teachers, other stakeholders, and contextual affordances that may influence the needs of the project. It puts boundaries around the project with clear rules for accepting or rejecting changes.

When we make design decisions for our projects, three factors influence those decisions: time, quality, and money. We could also think of the relationship between these three factors as an equation:

Quality = Time × Money

If we are making design decisions that prioritize quality, we can assume that high quality will come at the expense of time and money to ensure that the necessary resources are in place to produce a high-quality product. If we need to prioritize designing within a limited budget, quality may be hindered because there may not be enough money to account for resources and the extra time needed. If we are in a situation that requires us to produce a product in a time frame much faster than what it typically takes to complete a project, there are two things to consider. Quality may be significantly impacted if we are asked to design something quickly. On the other hand, quality may not be impacted if additional funds are provided to supply the extra resources that will be needed to produce the project within the limited time frame. The latter may end up being a very expensive option.
Table 2.5 provides a chart to assist you with establishing boundaries around your design space as they relate to quality, time, and money.

TABLE 2.5 Prioritizing Design Tasks
Columns to complete: Design Tasks; Timeline to Complete; Individual(s) Needed; Potential Resources Needed; Costs Allocated for Work
Summary

This chapter provided an overview of decision-making strategies and design judgments that instructional designers may engage in during their projects. It is important for instructional designers to be aware of the ongoing decisions they make during their projects. Taking the time to reflect on how you evaluate the design situation and consider your options can be helpful to your professional development and future design work.
Connecting Process to Practice Activities

1. Think about the most recent instructional design project you have worked on. Start from the very beginning of your involvement with the project and make a list of decisions that you made during the first two days of the project. What were the decisions? What options did you consider?
2. Review Table 2.3, which lists the design judgments commonly invoked by designers (Nelson & Stolterman, 2012). Do you experience difficulty invoking certain judgments over others? What do you think contributes to this difficulty?
3. This chapter discussed the equation Quality = Time × Money. Think about a design project you have completed. What types of decisions did you make? How did you balance quality, time, and money?
4. It is quite common for instructional designers to be required to make decisions with missing information. What types of strategies do you employ to help you conjecture?
5. If you were about to start a new design project, what questions might you ask to manage your design space? Make a list of five questions that would help you to establish some boundaries around your project.
Bridging Research and Practice
Baaki, J., & Luo, T. (2019). Instructional designers guided by external representations in a design process. International Journal of Technology and Design Education, 29(3), 513–541. https://doi.org/10.1007/s10798-018-09493-2
Bannan-Ritland, B. (2001). Teaching instructional design: An action learning approach. Performance Improvement Quarterly, 14(2), 37–52. https://doi.org/10.1111/j.1937-8327.2001.tb00208.x
Dorst, K., & Cross, N. (2001). Creativity in the design process: Co-evolution of problem–solution. Design Studies, 22(5), 425–437. https://doi.org/10.1016/S0142-694X(01)00009-6
Piskurich, G. M. (2015). Rapid instructional design: Learning ID fast and right. John Wiley & Sons, Inc.
Tripp, S. D., & Bichelmeyer, B. (1990). Rapid prototyping: An alternative instructional design strategy. Educational Technology Research and Development, 38(1), 31–44. https://doi.org/10.1007/BF02298246
References
Boling, E., Alangari, H., Hajdu, I. M., Guo, M., Gyabak, K., Kizilboga, R., Tomita, K., Alsaif, M., Lachheb, A., Bae, H., Ergulec, F., Zhu, M., Basdogan, M., Buggs, C., Sari, A., & Techawitthayachinda, R. (2017). Core judgments of instructional designers in practice. Performance Improvement Quarterly, 30(3), 199–219. https://doi.org/10.1002/piq.21250
Crilly, N., & Firth, R. M. (2019). Creativity and fixation in the real world: Three case studies of invention, design and innovation. Design Studies, 64, 169–212.
De Martino, B., Kumaran, D., Seymour, B., & Dolan, R. J. (2006). Frames, biases, and rational decision-making in the human brain. Science, 313(5787), 684–687. https://doi.org/10.1126/science.1128356
Demiral-Uzan, M. (2015). Instructional design students' design judgment in action. Performance Improvement Quarterly, 28(3), 7–23. https://doi.org/10.1002/piq.21195
Dorst, K. (2019). Co-evolution and emergence in design. Design Studies, 65, 60–77. https://doi.org/10.1016/j.destud.2019.10.005
Ertmer, P. A., & Koehler, A. A. (2014). Online case-based discussions: Examining coverage of the afforded problem space. Educational Technology Research and Development, 62(5), 617–636. https://doi.org/10.1007/s11423-014-9350-9
Ertmer, P. A., & Stepich, D. A. (2005). Instructional design expertise: How will we know it when we see it? Educational Technology, 45(6), 38–43.
Ertmer, P. A., Stepich, D. A., York, C. S., Stickman, A., Wu, X., Zurek, S., & Goktas, Y. (2008). How instructional design experts use knowledge and experience to solve ill-structured problems. Performance Improvement Quarterly, 21(1), 17–42. https://doi.org/10.1002/piq.20013
Fitzpatrick, J. L., Sanders, J. R., & Worthen, B. R. (2011). Program evaluation: Alternative approaches and practical guidelines (4th ed.). Pearson.
Gibbons, A. S. (2014). Eight views of instructional design and what they should mean to instructional designers. In B. Hokanson & A. S. Gibbons (Eds.), Design in educational technology (pp. 15–36). Springer.
Gray, C. M., Dagli, C., Demiral-Uzan, M., Ergulec, F., Tan, V., Altuwaijri, A. A., Gyabak, K., Hilligoss, M., Kizilboga, R., Tomita, K., & Boling, E. (2015). Judgment and instructional design: How ID practitioners work in practice. Performance Improvement Quarterly, 28(3), 25–49. https://doi.org/10.1002/piq.21198
Hmelo-Silver, C. (2013). Creating a learning space in problem-based learning. Interdisciplinary Journal of Problem-Based Learning, 7(1), 1–15.
Hoard, B., Stefaniak, J., Baaki, J., & Draper, D. (2019). The influence of multimedia development knowledge and workplace pressures on the design decisions of the instructional designer. Educational Technology Research and Development, 67(6), 1479–1505. https://doi.org/10.1007/s11423-019-09687-y
Jonassen, D. H. (2012). Designing for decision making. Educational Technology Research and Development, 60(2), 341–359. https://doi.org/10.1007/s11423-011-9230-5
Klein, G. A. (1998). Sources of power: How people make decisions. MIT Press.
Klein, G. (2008). Naturalistic decision making. Human Factors, 50(3), 456–460.
Kowch, E. G. (2019). Introduction to systems thinking and change. In M. J. Spector, B. B. Lockee, & M. D. Childress (Eds.), Learning, design, and technology: An international compendium of theory, research, practice, and policy (pp. 1–14). Springer.
Lachheb, A., & Boling, E. (2021). The role of design judgment and reflection in instructional design. In J. K. McDonald & R. West (Eds.), Design for learning: Principles, processes, and praxis. EdTech Books. https://edtechbooks.org/id/design_judgment
Morrison, G. R., Ross, S. M., Kalman, H. K., & Kemp, J. E. (2013). Designing effective instruction (7th ed.). Wiley.
Murty, P., Paulini, M., & Maher, M. L. (2010). Collective intelligence and design thinking. In DTRS'10: Design Thinking Research Symposium (pp. 309–315). Sydney University of Technology.
Nelson, H. G., & Stolterman, E. (2012). The design way: Intentional change in an unpredictable world (2nd ed.). The MIT Press.
Rowland, G. (1992). What do instructional designers actually do? An initial investigation of expert practice. Performance Improvement Quarterly, 5(2), 65–86. https://doi.org/10.1111/j.1937-8327.1992.tb00546.x
Rowley, K. (2005). Inquiry into the practices of expert courseware designers: A pragmatic method for the design of effective instructional systems. Journal of Educational Computing Research, 33(4), 419–450. https://doi.org/10.2190/9MLR-ARTQ-BD1P-KET
Skyttner, L. (2001). General systems theory: Ideas and applications. World Scientific Publishing.
Stefaniak, J. (2020a). Documenting instructional design decisions. In J. K. McDonald & R. E. West (Eds.), Design for learning: Principles, processes, and praxis. EdTech Books. https://edtechbooks.org/id/documenting_decisions
Stefaniak, J. (2020b). The utility of design thinking to promote systemic instructional design practices in the workplace. TechTrends, 64(2), 202–210. https://doi.org/10.1007/s11528-019-00453-8
Stefaniak, J. (2021). Leveraging failure-based learning to support decision-making and creative risk in instructional design pedagogy. TechTrends, 1–7. https://doi.org/10.1007/s11528-021-00608-6
Stefaniak, J., & Xu, M. (2020). Leveraging dynamic decision-making and environmental analysis to support authentic learning experiences in digital environments. Revista De Educación a Distancia (RED), 20(64), 1–21.
Stefaniak, J., & Xu, M. (2021). An examination of the systemic reach of instructional design models: A systematic review. TechTrends, 64(5), 710–719. https://doi.org/10.1007/s11528-020-00539-8
Stefaniak, J., Baaki, J., Hoard, B., & Stapleton, L. (2018). The influence of perceived constraints during needs assessment on design conjecture. Journal of Computing in Higher Education, 30(1), 55–71. https://doi.org/10.1007/s12528-018-9173-5
Stefaniak, J., Baaki, J., & Stapleton, L. (2022). An exploration of conjecture strategies used by instructional design students to support design decision-making. Educational Technology Research and Development, 1–29. https://doi.org/10.1007/s11423-022-10092-1
Tomita, K., Alangari, H., Zhu, M., Ergulec, F., Lachheb, A., & Boling, E. (2021). Challenges implementing qualitative research methods in a study of instructional design practice. TechTrends, 65(2), 144–151. https://doi.org/10.1007/s11528-020-00569-2
Wiltschnig, S., Christensen, B. T., & Ball, L. J. (2013). Collaborative problem–solution co-evolution in creative design. Design Studies, 34(5), 515–542. https://doi.org/10.1016/j.destud.2013.01.002
Yates, J. F. (2003). Decision management. Jossey-Bass.
Yates, J. F., & Tschirhart, M. D. (2006). Decision-making expertise. In K. A. Ericsson, N. Charness, P. J. Feltovich, & R. R. Hoffman (Eds.), The Cambridge handbook of expertise and expert performance (pp. 421–438). Cambridge University Press.
Zhu, M., Basdogan, M., & Bonk, C. J. (2020). A case study of the design practices and judgments of novice instructional designers. Contemporary Educational Technology, 12(2), 1–19.
3
RECONSIDERING NEEDS ASSESSMENT IN INSTRUCTIONAL DESIGN
Chapter Overview In existing instructional design literature, needs analysis is often limited to conducting a learner analysis, even when needs extend beyond the learner. This chapter will provide an overview of how instructional designers can set up a needs assessment that takes a systems view to identify environmental affordances that contribute to and inhibit learner performance. Emphasis will be placed on why needs assessments should extend beyond traditional learner analyses to address multiple layers of an organization and system. Strategies for how instructional designers can scale needs assessments to fit the scope of their problems will be provided.
Guiding Questions 1. What is a needs assessment? 2. How can an instructional designer scale needs assessment activities? 3. What role does needs assessment play in the instructional design process?
4. How can instructional designers advocate for needs assessments when meeting with their clients?
The Purpose of Needs Assessment Needs assessments are often conducted when an organization, team, or individual recognizes that there is an area that could benefit from improvement. Individuals may recognize that errors continue to happen, resulting in extra work to correct a situation. Employees may request training to enhance their knowledge and skills in a particular area to support their job. Organizations may adopt a new technological platform to conduct business and recognize that their staff need to be trained in how to use the new tools. While each of the examples is unique, a
commonality among them is that someone recognizes that performance could be improved upon. Needs assessment is the process of determining any discrepancies between a current state of affairs and a desired state of affairs. A need exists when a gap has been identified between the current state and the desired state. Needs assessments are conducted primarily for three reasons:
1. To address a recurring problem that has been identified within an organization
2. To identify strategies to improve the quality of existing organizational practices
3. To identify opportunities for growth and expansion (Stefaniak, 2020, p. 5)
During a needs assessment, the individual conducting the assessment, often referred to as the needs assessor, will use a systematic process to gather data that can reveal what gaps in performance may exist for an organization. Determining whether a gap in performance exists is the task of a needs assessment and should not be confused with a needs analysis. A needs analysis is the process of determining why that gap exists. Data should be collected to indicate what is contributing to the gap existing in the first place. While there are a variety of different needs assessment models that may offer slight variations of processes to guide individuals through the necessary steps to identify problems and causes, the common steps that are employed by all include the following:
• Identification of problem(s)
• Identification of needs
• Identification of data sources
• Data collection
• Data analysis
• Recommendations to client
Figure 3.1 provides an overview of these steps. In the subsequent sections, I will describe each of these steps in terms of how an instructional designer may encounter them.
Identification of Problem(s) In an ideal situation, an instructional designer will be brought in at the very beginning of a project. The client may have a hunch regarding some problems that may be occurring within the organization or may have identified some opportunities for growth and would like to gather more information
FIGURE 3.1 Overview of Needs Assessment Process. Source: Stefaniak, J. E. (2020). Figure 2.1. Overview of a needs assessment process (p. 17). Used with permission.
to determine what types of solutions are needed. These early conversations with a client typically involve them presenting the situation as needs statements. "We need training." "We need to transition to e-learning." "We need to offer additional services to maintain our customer base." Usually, when an instructional designer is brought onto a project that requires the design of instructional materials, a problem has already been identified. Other times, an instructional designer may be asked to conduct a needs assessment to determine whether instructional solutions are warranted. Regardless, it is important that you understand the role that needs assessment plays in instructional practices. There are four times you may be brought onto a project:
• Before a needs assessment has been conducted
• While a needs assessment is being conducted
• After a needs assessment has been conducted
• When no needs assessment has been conducted, but problems have been identified
Knowing when you are being brought on board for a project will help you identify questions you may need to ask. For instance, if a needs assessment was conducted and you were asked to design instruction as a result of those findings, you can assume that data was gathered and analyzed indicating that instructional solutions were needed to address specific problems within the organization. It would be beneficial for you to ensure that the objectives of your materials align with and support the needs of the organization. If a needs assessment has not been conducted but you have still been asked to design instruction, there are several questions you may want to ask:
• What needs do you want to be addressed through the instruction?
• Who determined that training was needed?
• Has training been done in the past?
• Are there any major changes since the last time training was conducted? If so, what are those changes?
While I am a big proponent of needs assessment, there are times when a robust needs assessment is not necessary. Sometimes, a client or the organization where you work may ask you to update training that they are currently offering. Depending on the changes being requested, a needs assessment may not be necessary. Sometimes, we are asked to update training to address slight variations in industry standards. Other times, it might be to transition face-to-face materials to an online platform to address a growing customer base or employees who are dispersed geographically. These types of changes do not require
you to create a data collection plan or analyze data to verify that those changes are actually needed.
Identification of Needs Conversations with clients can go a few different ways when discussing or identifying needs. Sometimes, clients will have a solid understanding of their organization's needs and will be able to clearly articulate those needs to you, the instructional designer. In situations like this, they will have some data indicating the needs are valid. Other times, they may be able to share the problems they are experiencing, but they may not be able to articulate the actual needs. In most cases, it is easy to determine when performance gaps exist, but it's not always as easy to determine what is contributing to those gaps. When you encounter these types of situations, it is important to start thinking about what types of data you may need to gather to determine the causes. Another thing to remember during this phase is that it is quite common for additional needs to be recognized during a needs assessment. Once data has been collected, we may find that the needs that were presented to us at the beginning of the project may not actually be the needs. Data collection may also present additional needs or problems that were occurring in the background that your client may or may not have been aware of during those initial discussions (Altschuld & Watkins, 2014). Regardless of whether or not these needs have been validated, they serve as a starting point for the individual or team responsible for conducting a needs assessment. These are needs that have been identified or felt by your client. The way we can validate those needs is through collecting sufficient data to help us answer questions about our projects. You, as the instructional designer, will benefit greatly from being able to differentiate between the levels of needs. Needs, regardless of the industry, can be classified according to three levels: primary, secondary, and tertiary (Altschuld & Kumar, 2010).
Primary needs are the needs of the individuals who will receive services or products to fulfill and resolve their needs. These individuals may consist of the employees who will complete your instructional courses, students in your K-12 or higher education classes, or other constituents who are directly on the receiving end of your interventions. Secondary needs involve the needs of the individuals responsible for delivering the products and services to address the primary level of needs. These may include instructional designers, trainers, supervisors, and so on. The third level addresses tertiary needs. Tertiary needs are centered on the resources and infrastructure needed to support the primary and secondary needs of an organization or group. Figure 3.2 provides an example of the types of questions an instructional designer may ask according to the three levels of needs while designing instruction for a client.
Tertiary Needs
• What learning management system will be used to track learner progress?
• To what extent is training and instruction supported by the organization?
• How will training and instruction support the needs of the organization?
Secondary Needs
• Does the organization have knowledgeable trainers?
• Will the instructors have direct access to their learning audience?
• What are the trainers' levels of familiarity with the instructional technologies being used?
• Are there enough trainers/teachers/educators?
Primary Needs
• What content do the learners need to know to be proficient?
• To what extent are learners familiar with the content?
• What are learners' perceptions regarding training/instruction?
FIGURE 3.2 Types of Questions Pertaining to Level of Needs in Instructional Design.
By anticipating what needs may exist across the primary, secondary, and tertiary levels, the instructional designer can tailor the necessary conversations with their clients to plan for what infrastructure will be needed by the organization to support implementation and maintenance of the instructional activities and materials being designed. As you progress through the needs assessment process and identify data sources to inform your knowledge of the organization, you can begin to pinpoint the sources that will provide insight into the various levels of needs that may exist.
Identification of Data Sources If the decision has been made to conduct a needs assessment, the needs assessor will begin to identify potential data sources that can provide additional information about each need that has been identified. The client should be able to assist you with identifying data sources. I recommend making an exhaustive list of possible data sources prior to meeting with your client. Sometimes, certain data sources may not be available for you to access. Other times, the client may be aware of data sources that could yield valuable information that you may not have known existed. It is good practice to gather data from multiple sources for each need you are
exploring. We may be provided with an inaccurate picture of the situation if we gather data from only one source or group of individuals. Different data sources will present different perspectives related to the need. I also recommend having a conversation with your client to ask them whether they perceive or anticipate any challenges with data collection (Stefaniak, 2020). For example, they may recommend that you avoid conducting focus groups if conflict exists between a few employees in the group. Individual interviews might be more useful in preventing disagreements from arising in a focus group. If direct observation of employees completing tasks has been identified as a potential data source, the client may recommend that you spend time with certain employees who do exemplary work so you can see what best practices look like. They may also identify some employees who are not performing well and may recommend that you observe them to pinpoint performance gaps. One frustration that a lot of needs assessors experience is not being provided with access to all data sources that have been identified or requested. Very rarely will we be provided with access to everything that we need to sufficiently analyze performance gaps and make recommendations about possible solutions. When this happens, do not panic. You'll scale your needs assessment recommendations on the basis of the data that is available. Chapter 2 discussed establishing your design space for a project. You can think of it the same way for a needs assessment. Time limitations, availability of resources, and access to data sources can be used to help you establish the boundaries for your needs assessment space.
Data Collection Once data sources have been identified, the needs assessor can begin collecting data. It is important to establish a realistic time line for gathering data. A common mistake that I observe novice needs assessors make regularly is not allocating a sufficient amount of time for data collection. They forget to account for hiccups that may occur, resulting in delays. Perhaps certain individuals are not available at the time you initially planned to meet with them. New project deadlines may require your client to reach out to you and postpone your meetings. You may have asked your client to send out a survey on your behalf only to find out that they forgot or that the employees ignored the request. Obtaining permissions may also contribute to delays with collecting data. It is important to determine what levels of approval are needed to obtain permission before data collection begins. If you are planning to conduct a needs assessment and publish the results in a book or academic publication, approval from an institutional review board will most likely be needed to protect the rights and privacy of your participants.
Other organizations may have internal committees that will need to review and approve your data collection materials. I recommend building cushion into your data collection time line. Very rarely have I heard someone say that they were given too much time to collect data; it's often the opposite. By adding time to your data collection plan, you have some flexibility if you need to adjust meetings, send out additional reminders and requests, or identify new data sources.
Data Analysis You can begin data analysis once you have finished data collection. This can quickly become a very overwhelming and time-consuming task if you are exploring multiple needs. I also find that novice needs assessors underestimate the amount of time it will take them to analyze data. They often assume that the recommendations will become glaringly obvious in this phase. Usually what happens at the beginning of this phase is that we can confirm or deny that a performance gap or need exists. The data analysis phase is where some of the most important work takes place for a needs assessment. It is during this phase that we should be able to start to figure out what is causing those performance gaps or needs to exist. There are different ways you can approach data analysis depending on the types of data sources you used to collect data in the previous phase. If you are analyzing quantitative data, you may use descriptive statistics (mean, median, and mode) to report on trends. You may need to run more advanced statistical tests if you are comparing groups or performance over an extended period of time. It can be very easy to get lost in the data once you begin analyzing it. Because certain data sources may be used to inform more than one need associated with the project, I recommend that you review each need one at a time. As you look at your various data sources, you can review the data to answer questions you may have about each specific need. Table 3.1 can be used to help you organize your notes. When you have begun to make sense of the performance gaps and possible factors contributing to the issues, you should ask the following questions:
• Is there the potential to fix the problem?
• What solutions might be possible to address the problem or gap in performance?
• Is it worth the cost to fix the problem? (Stefaniak, 2020, p. 132)
The data analysis phase will also help you to prioritize what needs should be addressed first.
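As a minimal illustration of the descriptive statistics mentioned above, the trends for a single need could be summarized as follows. The survey scores here are invented for illustration only; they do not come from any real needs assessment.

```python
# Hypothetical example: summarizing Likert-scale survey responses (1-5)
# gathered for one need, e.g., "I feel prepared to use the new platform."
# The scores below are made up for demonstration purposes.
from statistics import mean, median, mode

survey_scores = [4, 5, 3, 4, 4, 2, 5, 4, 3, 4]

print(f"Mean:   {mean(survey_scores):.2f}")  # average rating -> 3.80
print(f"Median: {median(survey_scores)}")    # middle rating  -> 4.0
print(f"Mode:   {mode(survey_scores)}")      # most common rating -> 4
```

A summary like this can be produced per need and per data source, which makes it easier to spot consistencies and discrepancies across sources before moving on to recommendations.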
Some needs may need to be addressed immediately if they are related to health and safety. Others may have a moderate or low level of urgency. Sometimes, it may cost more money to fix a problem than to just leave it as is. There may be instances where your client will have to implement an expensive solution to avoid legal liabilities if something were to go wrong in the future. As you begin to consider your options for possible solutions, you will want to consider the level of urgency of each need and the costs associated with each proposed solution. The needs assessor can work with the client to determine the costs associated with implementing solutions versus the costs associated with doing nothing.

TABLE 3.1 Organizing Findings from Data Collection to Inform Recommendations
Need | Data Source | Key Findings | Consistency with Other Data Sources | Discrepancy with Other Data Sources | Preliminary Recommendations
Need 1:
Need 2:
Need 3:
Need 4:
Need 'n':
Recommendations to Client The last stage of the needs assessment process is to meet with your client and present your recommendations. It is helpful to make a list of short- and long-term recommendations. You can also present information that you have that may influence the level of urgency for addressing each need and possible costs for implementing solutions or doing nothing. While you may make recommendations to your client, it will be up to them to decide what solutions they may or may not choose to implement.
Moving Beyond the 'A' in ADDIE Needs assessment plays a pivotal role in the instructional design process. In Chapter 2, we discussed that there are three questions we should always be asking ourselves when engaging in instructional design:
1. Am I contributing to effectiveness?
2. Am I contributing to efficiency?
3. Am I contributing to ease of learning?
When we engage in needs assessment, no matter how big or small, we should be asking questions that will support our ability to contribute to effectiveness, efficiency, and ease of learning. If we are brought in to conduct a needs assessment to inform future training opportunities, we should be identifying data sources that will
help us answer those three questions. We should be thinking about what resources are needed for training to be considered efficient and effective. What infrastructure needs to be in place to contribute to our learners' abilities to engage with the content? One of my biggest frustrations is that needs assessment tends to be approached with a minimalist view in instructional design. A lot of instructional designers will treat needs assessment and the 'A' in the ADDIE (Analyze, Design, Develop, Implement, and Evaluate) process as the same thing. More often than not, they are different. While the 'A' in ADDIE should encompass needs assessment and analysis, the focus tends to be placed on learner analysis. As you ask yourself how you are addressing efficiency, effectiveness, and ease of learning, I implore you to think about the systemic implications of the design strategies you may implement. One criticism in our field is that few instructional design models actually address the systemic impact of our designs. In a systematic review examining the systemic reach of instructional design models (Stefaniak & Xu, 2020), we found that only a few models explicitly addressed the systemic nature of instructional design (Cennamo, 2003; Gibbons, 2014; Tessmer & Wedman, 1990). While other designers who have developed instructional design models may argue that their model could be used to address the systemic implications of a project, the reality is that the majority of models suggest a linear process that is focused on one aspect of the design project. Very few models address the larger systems view of a project and the iterative process that accompanies most design solutions. This is important to note because when we look at needs assessment, we do not typically see the relationship between needs assessment and the instructional design process.
The other challenge that we must consider is that very rarely will we have the opportunity to conduct a robust needs assessment to support our instructional design efforts. It is important that you learn how to scale your needs assessment activities to keep within the time and resource constraints that accompany your projects. As you begin to determine the types of information you need (or have access to), you should be identifying questions and data sources that will support your dynamic decision-making efforts discussed in Chapter 2.
Needs Assessment to Inform Instructional Design In Chapter 1, I discussed my preference for Richey and colleagues’ (2011) definition of instructional design: instructional design is the science and art of creating detailed specifications for the development, evaluation, and maintenance of situations which facilitate learning and performance. (p. 3)
One of the reasons why I gravitate toward this definition is that I see synergy between the maintenance of situations which facilitate learning and performance and needs assessment. If an expectation of instructional designers is to engage in maintenance of situations, we should always be scanning our environments to ensure that we are meeting the needs of our learners and considering the environmental affordances that will impact learning. A thorough needs assessment can provide rich data that can inform instructional design practices. If we are in a position to gather data that provides a detailed account of the situation (e.g., politics, stakeholder perspectives, instructional needs, and technological affordances), we are better positioned to customize instructional solutions that will be sustainable. Data yielded from a needs assessment can be beneficial in that it can provide insights and increase awareness of what non-instructional interventions may be needed to support instruction. Taking this into account, we should address two realities of needs assessment in instructional design:
1. Rarely will an instructional designer be provided the opportunity to conduct a thorough needs assessment at the beginning of an instructional project.
2. Limited attention is given to non-instructional interventions when discussing instructional design projects with clients.
Some individuals will say that needs assessments are never conducted in our field, and I tend to disagree with that statement. Robust needs assessments may not be carried out on a regular basis, but instructional designers are often engaged in needs assessment activities, albeit intuitively. If we think about instructional design as a recursive and iterative process, we can think of needs assessment as an iterative process as well. As instructional designers, we should task ourselves with continually scanning the environment when we are working on design projects.
We should take note when new information emerges from talking with clients, team members, and others in the organization. We should make note of any information that has the potential to directly or indirectly impact the instruction. Even if something poses no challenges within the next couple of months, we should be thinking forward and considering how it might affect what we are designing in the future. It is particularly important to adhere to Richey and colleagues' (2011) definition of instructional design, which tasks us with both facilitating and maintaining situations that facilitate learning and improve performance. So, how do we engage in this "constant scanning" mode while remaining efficient in our design activities? We should not let needs assessment get in the way of making progress on a project. Oftentimes, the needs have already been determined by the time an instructional design project is pitched
to us. Most clients do not want to invest additional time and money in a robust needs assessment when they already have a good indication of what the problems may be. They have already determined that they need instruction. We can work around this by scaling our needs assessment activities.

Think about the formal training you have acquired as an instructional designer. Think about the variety of learning and instructional theories you may be knowledgeable in, the educational technologies you are familiar with for designing activities, and the vocabulary and jargon that accompany our field. Think about how you discuss instructional design practices among your instructional design peers versus with your client or family and friends. The same can be said about needs assessment. If you have been hired to design a series of e-learning modules or update an existing training program for an organization, you do not have to tell your client that you need to halt progress on the project until a thorough needs assessment has been completed. You do not have to overwhelm your client and the team with needs assessment terms, arguing that it is impossible to begin design work on the e-learning materials. Instead, you can simultaneously gather data through different mechanisms that will assist you as you begin designing instruction.

When we consider the scalability of needs assessment activities, we should think about the types of needs outlined in Figure 3.2. By identifying the realities that exist in the organization, you will be better able to address the affordances that have a direct impact on your instructional materials. Even if you are not responsible for conducting a full-scale needs assessment, you can still glean information that will be indicative of things you need to consider as you design instruction. When we think about what types of data can be gathered to support the design of instruction, we can think about it in terms of affordances.
Regardless of where you may be designing and implementing instruction (i.e., K-12 classrooms, business and industry, higher education, health-care settings, etc.), affordances are present and greatly impact the successful delivery of instruction and transfer of knowledge. Norman (2013) describes affordances as the actions that are perceivable as possible. For the purposes of instructional design, we can categorize affordances across four domains: learner affordances, instructor/instructional designer affordances, technological affordances, and environmental affordances. Learner affordances are the perceivable actions possible for how learners may interact with instruction and apply concepts covered during instruction upon completion of training. Instructor/instructional designer affordances are the perceivable actions possible for the individuals responsible for designing or delivering instruction (or both); this may involve more than one person, depending on how instructional design efforts are carried out within an organization. Technological affordances are the capabilities of the technology itself. This includes the advantages
and limitations when delivering instruction via different platforms. Lastly, environmental affordances are an extension of the technological affordances and consider the environment where instruction will be delivered and then transferred (applied). Environmental affordances consider organizational infrastructure that may or may not be directly related to technological affordances. By sorting affordances into multiple categories, we can make better decisions about possible non-instructional interventions that need to be considered. Table 3.2 provides an overview of the types of questions that an instructional designer may consider when gathering information pertaining to the four types of affordances previously mentioned. The relationship between these affordances will be discussed in more detail in Chapter 5, when we explore how instructional designers can localize context related to their projects.

TABLE 3.2 Questions Exploring Affordances

Learner
• What is the learners' familiarity with the topic to be explored?
• What are the learners' perceptions related to participating in instructional activities/trainings?
• What are the learners' capabilities using different educational technologies to support their learning?
• What is the likelihood of the learners' adopting the instruction and applying it afterwards?

Instructor/Instructional designer
• What training have the instructors/instructional designers had regarding the topic?
• How comfortable are the instructors with delivering content through different delivery platforms (i.e., face-to-face, online, blended, mobile, etc.)?
• To what extent will the instructor have the ability to monitor learner progress upon completion of training?
• To what extent will the instructor have the authority to monitor learners' application of content in real-world settings?

Technology
• How will instructional materials be accessed by the learners?
• What are the limitations to the instructional delivery platform that has been chosen for this project?
• How will learners interact with the instructors and their peers during the instruction?
• Will technological tools be used to facilitate learner-to-instructor and peer-to-peer communication?

Environment
• What resources are needed to monitor and maintain the instruction upon completion of the project?
• What support mechanisms are in place to support learners during instruction?
• What support mechanisms are in place to support learners upon completion of the instruction?
• Does the organizational culture support training and development?

Taking into account how the affordances impact the varying needs prevalent in your design situation, you can begin to have conversations with your client about non-instructional interventions to ensure that you have adequate infrastructure in place to support instruction. Examples of non-instructional interventions may include, but are not limited to, organizational development practices, job design, communication, feedback mechanisms to monitor performance, and rewards and incentives for employee/learner engagement. Examples of non-instructional interventions will be discussed in additional chapters throughout this book. Needs assessment and analysis can be used to identify where discrepancies and gaps in performance exist in the organization. Once these gaps have been identified, the instructional designer can determine which needs warrant instruction and what additional resources and support are needed for the instructional efforts that have been deemed necessary.

TABLE 3.3 Scalability of Instructional Design Needs Assessment Activities

Low (1–2 weeks)
• Review existing training materials.
• Review documents explaining job processes.
• Meet with a subject matter expert (in the organization) to provide guidance on content that should be emphasized in the instructional product.
• Obtain an overview of the learning audience from the client.

Medium (1 month)
• Review existing training materials.
• Conduct observations of employees performing job tasks.
• Update existing task analyses.
• Meet with individuals that represent multiple levels of authority within the organization related to the instructional project.
• Obtain an overview of the learning audience from the client.

High (several months)
• Review existing training materials.
• Review strategic planning documents.
• Meet with individuals that represent multiple levels of authority within the organization.
• Conduct observations of employees performing job tasks.
• Update existing task analyses.
• Conduct interviews and/or focus groups to understand factors that are inhibiting the transfer of learning.
• Triangulate information from multiple sources to understand patterns contributing to or inhibiting employee/learner performance on the job.
If you remember only one thing from reading this chapter, it should be that needs assessment activities are scalable. There is much that we, as instructional designers, can do to scan and gather information to support our design tasks, whether we have been given the authority to conduct a needs assessment or must gather information while we design instruction. Table 3.3 provides an overview of how instructional designers can scale different needs assessment activities during a project (Stefaniak, 2021).
Summary

This chapter provided an overview of how instructional designers can set up a needs assessment that takes a systems view to identify environmental affordances that contribute to or inhibit learner performance. Needs assessment provides another way to support instructional designers' ability to manage their design space, as discussed in Chapter 2. Strategies for how instructional designers can scale needs assessments to fit the scope and constraints of their projects were also provided. Chapters 4 and 5 will provide additional resources for conducting learner and contextual analyses.
Connecting Process to Practice Activities

1. Make a list of situations where the word "need" was used to describe an issue. The situation can be your current job as an instructional designer, or it could be conversations you have participated in that discussed the potential need for instructional design work. How was the word "need" used? Was it used as a noun or a verb? What rationale did individuals provide to support the need being discussed?
2. Think about instructional design projects you have completed. What types of information were missing? Did your client have an adequate understanding of the needs of the organization? What instruction was needed? What information did you need as the instructional designer?
3. Think about a time when you worked for an organization and were required to complete training. If you were to analyze that organization by looking at the four affordances, where would you see alignment? Where would you see discrepancies suggesting needs?
4. If you were hired by a client to design instruction and started identifying a lack of infrastructure to support the instruction, what would you say to your client? What are three things you could say to explain the value that non-instructional interventions offer instruction?
Bridging Research and Practice

Barker Steege, L. M., Marra, R. M., & Jones, K. (2012). Meeting needs assessment challenges: Applying the performance pyramid in the US Army. Performance Improvement, 51(10), 32–41. https://doi.org/10.1002/pfi.21313
Breman, J., Giacumo, L. A., & Griffith-Boyes, R. (2019). A needs analysis to inform global humanitarian capacity building. TechTrends, 63(3), 294–303. https://doi.org/10.1007/s11528-019-00390-6
Rummler, G. A., & Brache, A. P. (2013). Improving performance: How to manage the white space on the organizational chart (3rd ed.). Jossey-Bass.
Stefaniak, J. (2020). Needs assessment for learning and performance: Theory, process, and practice. Routledge.
Watkins, R., Meiers, M. W., & Visser, Y. L. (2012). A guide to assessing needs: Essential tools for collecting information, making decisions, and achieving development results. The World Bank.
References

Altschuld, J. W., & Kumar, D. D. (2010). Needs assessment: An overview. Sage.
Altschuld, J. W., & Watkins, R. (2014). A primer on needs assessment: More than 40 years of research and practice. New Directions for Evaluation, 2014(144), 5–18. https://doi.org/10.1002/ev.20099
Cennamo, K. S. (2003). Design as knowledge construction: Constructing knowledge of design. Computers in the Schools, 20(4), 13–35. https://doi.org/10.1300/J025v20n04_03
Gibbons, A. S. (2014). Eight views of instructional design and what they should mean to instructional designers. In B. Hokanson & A. S. Gibbons (Eds.), Design in educational technology (pp. 15–36). Springer.
Norman, D. (2013). The design of everyday things. Basic Books.
Richey, R. C., Klein, J. D., & Tracey, M. W. (2011). The instructional design knowledge base: Theory, research, and practice. Routledge.
Stefaniak, J. (2020). Needs assessment for learning and performance: Theory, process, and practice. Routledge.
Stefaniak, J. (2021). Determining environmental and contextual needs. In J. K. McDonald & R. E. West (Eds.), Design for learning: Principles, processes, and praxis. EdTech Books. https://edtechbooks.org/id/needs_analysis
Stefaniak, J., & Xu, M. (2020). An examination of the systemic reach of instructional design models: A systematic review. TechTrends, 64(5), 710–719. https://doi.org/10.1007/s11528-020-00539-8
Tessmer, M., & Wedman, J. F. (1990). A layers-of-necessity instructional development model. Educational Technology Research and Development, 38(2), 77–85. https://doi.org/10.1007/BF02298271
4
LEVERAGING LEARNER ANALYSIS TO FOSTER CRITICAL CONSCIOUSNESS
Chapter Overview

This chapter re-examines learner analysis strategies and provides tools to support instructional designers' abilities to engage their learners in the learning design process. Empathic design, inclusive design, and persona development are discussed as they relate to learner analysis and meeting the needs of learners.
Guiding Questions

1. What is a learner analysis?
2. What is the relationship between the learner analysis and the design space?
3. What tensions must instructional designers balance when conducting a learner analysis?
4. What strategies can we employ to promote inclusivity in our learner analyses?
Considering Our Learners in Instructional Design

It would almost seem silly or absurd to think that we need to remind instructional designers of the importance of remembering their learners when designing instruction. Many artists and musicians will credit someone with being their muse, a source of inspiration, for a project. Should we not think of our learners as our muses? A motto commonly used by a health-care system I worked at years ago was "the patient is the center of all that we do." Every employee was expected to know that motto if they were ever asked about it. Whether it was the physicians and nurses, receptionists, cafeteria staff, or custodians, everyone was working to support the functions of the health-care system with the patient in mind.
DOI: 10.4324/9781003287049-4
I like to use that analogy when designing instruction. As instructional designers, we should be thinking about the learner as the center of all that we do. It can become very easy to get lost in the design process, especially after several rounds of client feedback and design iterations, and forget that the learner should be the focal point of the project. With every design decision we make, we should be considering how we are contributing to ease of learning.

Our involvement with our learners will vary from project to project. Sometimes we are very familiar with our learners and will have opportunities to interact with them as they experience the instruction we design and afterwards. In these situations, we have an easier time understanding the unique nuances of their jobs and roles in the learning experience. We are better positioned to customize instruction that is reflective of how it will be used upon completion of training. These environments also provide us with opportunities to evaluate the success of training after an extended period of time.

In other instances, we may find ourselves designing instruction for people we know absolutely nothing about. There are times when an instructional designer may be responsible for designing or modifying shelf courses for a training company. Shelf courses are packaged instructional materials that provide generalized instruction on a particular topic. These courses are sold to organizations looking to purchase a series of training materials for their employees. A common challenge that instructional designers face when developing shelf courses is that the content needs to be kept somewhat general so that the courses can be sold to more companies. If the content were customized to meet the needs of a group of learners at one company, it might not be as transferable to others.
These types of design challenges require instructional designers to make lots of assumptions about their learning audience. Another type of design challenge we may face is accommodating the needs of our learners. We typically design instruction to be used by a group. Even if the instruction is meant to be used by individuals, we often assume that a group of individuals will be accessing the content. We often find ourselves asking the following questions:

• Will this content be of value to my learners?
• Will every learner be able to use this information?
• Have I addressed every learner's needs? Have I forgotten anyone?
• Have I promoted inclusivity in my content?
The process of learner analysis is one approach to assist instructional designers with answering these questions. The following sections will provide an overview of learner analysis as well as different strategies to gather data about our learning audiences.
What is Learner Analysis?

Throughout the past few decades, instructional design has relied on a number of prescriptive processes to help us gather the data that we need to design instruction. To inform our design work, we can conduct a variety of analyses, such as needs analyses, contextual analyses, environmental analyses, task analyses, and learner analyses. These different forms of analyses are typically conducted to help us meet our instructional design goals, ensuring efficiency, effectiveness, and ease of learning (Morrison et al., 2013).

A learner analysis is the process of gathering information about the learning audience that is to receive instruction. This may not be as daunting a task if you are familiar with the audience; however, it can prove to be quite challenging if you do not have direct contact with the individuals who will be on the receiving end of the instruction you are designing. Smith and Ragan (2005) warn that novice instructional designers have a tendency to describe the learner characteristics that they would like their learners to possess rather than the actual characteristics of their learning audience. The goal of a learner analysis is to gather sufficient information to understand the learners' needs as they relate to the instruction. What do they already know about the topic? What are their attitudes and predispositions related to the topic? What is expected of them after completing instruction?

There are several categories that instructional designers should consider when conducting a learner analysis:

1. General characteristics
2. Prior knowledge and prerequisite skills
3. Predispositions pertaining to the topic
4. Attitude and motivation toward training
5. Group characteristics

General characteristics. This information typically consists of basic demographic information about your learning audience. Factors such as age, gender, ethnicity, education, and prior work experience are collected.
While this data alone will not suffice, it does help to provide an overall picture of who may be in attendance for the instruction. It is typically useful at the beginning of a project when you are determining whether instruction should be targeted toward novices or experts. If a training program were being delivered for mid-level supervisors of a manufacturing plant, you could begin instruction with the assumption that they are already familiar with certain tasks and responsibilities in the organization and possess a certain level of prerequisite skills and knowledge. If collected data revealed that your learners would be a combination of novices and experts, you would have to reconsider the
types of examples you may include in your instruction. This may also influence how you choose to group learners if you plan on incorporating any collaborative exercises. Would it make sense to separate novices from experts and provide them with activities based on their prior experience? Or would another option be to strategically pair novices with experts in groups so that the novices could rely on the experts for support? These are all viable options that should be considered, but general characteristics alone should not be relied on to inform your learner analysis.

Prior knowledge and prerequisite skills. One of the ways to increase efficiency and effectiveness is to take into account what our learners already know about a topic. We would not want to require learners to sit through an introductory course if we later found out that they had over 15 years of experience related to the instruction. Likewise, we would not want to develop an advanced course in computer programming only to find out afterwards that all of our learners had very limited experience with the topic and the programming language. It is important to gain an understanding of our learners' prior knowledge. This helps us to determine appropriate prerequisite skills for our instruction.

By identifying prerequisite knowledge and skills upon entry into instruction, we are able to impose parameters around our design space. This is particularly helpful when we find ourselves having to work within the time allowed for the project and make decisions about which learner needs should be prioritized over others. By communicating and establishing prerequisite skills that are expected of learners prior to beginning instruction, we gain flexibility in where we can begin with our content. If you were delivering training that required the learners to demonstrate physical strength, you might want to include that information in your prerequisite skills section.
This would help inform the learners of expectations regarding the training. You could also gather information from your potential learning audience to ensure that they meet the physical standards required for the training activities.

Predispositions pertaining to the topic. Every learner comes to an instructional experience with predispositions. In an ideal world, we would have 100% engagement and participation every time we delivered instruction. Delivering training would be much easier if we knew that every one of our learners was excited to be in attendance and learning the material. In reality, we know that is not always the case. If you have ever delivered instruction or training, you can attest to the challenges that can occur when you have one or two individuals who are miserable and do not want to participate. Perhaps training is being mandated by their employer. Perhaps they have been told they must participate in additional training as a result of a negative annual evaluation. Perhaps you have a student who is taking your class only because it is a required course for their plan of study.
The more we are able to glean about their experiences, attitudes, and opinions toward the content, the more we can customize the instruction and help them see its relevancy and utility. While we should always attempt to address utility and relevancy when designing instruction, this becomes especially important when we have a select group of individuals who do not want to participate and who can drastically impact the morale and learning environment. While all learners benefit from understanding the relevancy of instruction and recognizing how they can use it afterwards, it is most helpful to convey this information to individuals who are not as enthusiastic at the start of training. Fulgencio and Asino (2021) remind us that "it is important to remember that learners are not empty containers in which knowledge can simply be poured. They have experiences through which they understand the world and through which they will understand or evaluate the instruction" (para 2).

One way to obtain additional information through the learner analysis is to inquire about learners' expectations regarding the content (Dick et al., 2009). It is also beneficial to inquire about the learners' experience with the delivery platform that is to be used during instruction. If learners have never participated in online instruction, you may consider including a brief orientation to guide them through what to expect and how to navigate the platform. If learners already complete most of their instruction through e-learning modules, this type of orientation would not be necessary.

Motivation toward training.
It can be beneficial to obtain information not only about learners' predispositions related to the content but also about their motivation regarding participation in the training activities. It is important to distinguish between learner attitude and learner motivation toward training. Morrison and colleagues (2013) note that a learner may appear to be very interested in enrolling in a course but, owing to the complexity of the content and their prior experience, may not feel confident that they will pass it. If the results of your learner analysis showed that several of your learners were not confident with the material to be covered in your course, you might want to consider incorporating additional practice exercises. This not only provides an opportunity to slowly increase the complexity of the content but also can build learners' confidence by easing them into the difficult content.

In his ARCS model, Keller (1987, 2010) proposed four motivational constructs that should be addressed to support learning: attention, relevance, confidence, and satisfaction. Attention is focused on identifying strategies to gain the attention of the learners and stimulate their curiosity at the beginning of the learning experience. Relevance entails emphasizing and reiterating how the instructional content and experience will be of benefit
to the learners. Imparting this information can help learners see the utility in what is being taught. If learners understand that they will be able to directly use the information being taught on their job, or that they need to master it as a prerequisite for a subsequent course, they may be more apt to participate and pay attention. Confidence reflects the idea that learning is successful when learners feel confident that they can demonstrate mastery of the content. Instructional designers can foster confidence by incorporating strategies and activities that will help learners believe they can succeed. You may want to gather data from your learner analysis that indicates how prepared learners currently feel completing particular tasks. If they have experienced challenges or have low confidence, follow-up questions about why they feel less confident may yield valuable information to consider as you design learning activities. Lastly, satisfaction is focused on ensuring that learners feel a sense of accomplishment while engaging in instruction. This may include addressing rewards (internal and external). Internal rewards may be a sense of pride at having completed a lesson or a major milestone in training. External rewards could be a certificate of completion or consideration for a promotion at work.

Keller (2010) recommends that instructional designers reflect on the following four questions during the design process:

• How can I make this learning experience stimulating and interesting?
• In what ways will this learning experience be valuable for my students?
• How can I, via instruction, help the students succeed and allow them to control their success?
• What can I do to help the students feel good about their experience and desire to continue learning? (p. 45)

The ARCS model works as an overlay model to complement other instructional design models and frameworks that an instructional designer may use to guide their design on a project.
As an instructional designer progresses through the instructional design process, they should take an iterative approach, asking themselves how they are addressing these motivational constructs throughout the instruction. While each of the four motivational constructs is helpful, I tend to prioritize relevance when designing instruction. If we think back to Chapter 3 and the importance of ensuring alignment between needs and solutions, we are ultimately addressing relevance. When instructional solutions align with organizational needs and minimize or eliminate gaps in performance, the instruction is relevant. When we can explain that to our learning audience, they are more apt to pay attention; anytime someone can explain the value of something and how it can benefit us, we are more apt to pay attention.
Group characteristics. While a learner analysis can yield valuable information about our learners' individual needs, it can also provide a good overview of the group characteristics of the learning audience as a whole (Dick et al., 2009; Smith & Ragan, 2005). Dick and colleagues (2009) note that the learner analysis can provide information related to the degree of heterogeneity within the learning audience. Looking at learner responses individually and then collectively will help determine how much diversity exists among the group. It can help identify areas where additional information is needed to promote diversity. Lastly, it can be a good indicator to the instructional designer of whether any tensions exist among members of the learning audience.
But Who Are Our Learners?

Even if we are fortunate enough to receive information from our clients about our learning audience, do we ever really feel that we have been given sufficient information? Most often, we are provided with general characteristics and maybe some information about prerequisite skills and knowledge. But in terms of reaching our learners and helping them make connections with the instructional content, we do not always have what we need to address their attention, explain relevancy, and increase their confidence and satisfaction.

Rothwell and Kazanas (2004) have suggested that instructional designers could benefit from learning from organizations that do advertising. Just think about some of the products you own in your home. For example, did any of us know that we absolutely needed a smartphone 30 years ago? If someone had told us back then that one day we would be tethered to an electronic device to send messages, search the internet, and manage many of our day-to-day tasks and routines through a phone, we would have been skeptical. If we were to look at our smartphones, we would observe how they fit in our hands. Look at how we are able to stretch our fingers to navigate the screen. Someone (or some company) thought about this before we did. They anticipated our needs before we knew they were our needs.

Just as a product development department might anticipate the needs of its users as it develops and refines a product, we too must work to anticipate our learners' needs before they are aware of them. Thinking about the marketing messages that are conveyed when a company is selling a product, we can customize our messaging to gain the attention of our audience and sell the relevancy of our instruction. Translating this to a learner analysis, we can begin to look at what the information is telling us about our learners. One way to do this is to classify the information we obtain into two categories: (1) What are our learners? (2) Who are our learners?
(Stefaniak & Baaki, 2013). Table 4.1 provides an outline that you can use to organize data gathered from your
LEVERAGING LEARNER ANALYSIS TO FOSTER CRITICAL CONSCIOUSNESS

TABLE 4.1 Identifying Learner Characteristics Outline

What Are Our Learners?
• General Characteristics: Gender; Age; Work experience; Education; Ethnicity
• Specific Characteristics: Prerequisite skills (knowledge, physical capabilities, know-how, adaptability to conditions); Attitudes (What attitudes do individuals need to have?)
• Learning Preferences: How do individuals approach learning tasks and process information?

Who Are Our Learners?
• Motivation and Attitude: What are individuals’ motivations to take the training? What are the individuals’ attitudes about the subject?
• Expectations and Vocational Aspirations: What expectations do individuals have? How does this training align with individual aspirations?
• What Are the Stories and Circumstances? What makes individuals “tick”? What are their hopes, dreams, and fears?

Source: Stefaniak, J. E., & Baaki, J. (2013). Table 4.1. Identifying Learner Characteristics Outline (p. 8). Used with permission.
learner analysis. By separating data into the what and the who, we are more apt to gather richer data that help our learners make connections with the instructional content. By understanding their motivations for taking a course or participating in training, we can tailor our examples to address relevance and help increase their perception of utility.
Identifying Appropriate Data Sources for a Learner Analysis There are a variety of sources that an instructional designer may consider when conducting a learner analysis. Much like a needs assessment, the types of sources you use depend on a number of factors, such as the amount of time allocated for you to conduct a learner analysis, the extent to which your client is familiar with the problems or needs warranting instruction, the availability of sources, and budget. A design challenge that many instructional designers face is that they are not provided with direct access to their learning audience prior to designing instruction. Often, they have to rely on information conveyed by the client. Sometimes the information is accurate, but this is not a guarantee. Other times, a lot of important information is left out. Much like the needs assessment discussed in Chapter 3, learner analyses can be scaled to meet the constraints imposed on the project.
What was missing? Think about the last instructional design project you completed. Reflect on the following questions:
• What information was provided to you at the beginning of the project?
• Was a learner analysis conducted, or were you required to rely on second-hand information you received from your client or supervisor?
• What did you know about your learners?
• Do you know what data sources were used to inform the information that was provided to you by your client or supervisor?
• What information do you wish you had?
Much like needs assessment, learner analyses can benefit from gathering data from multiple sources. If time allows, instructional designers may survey their upcoming learning audience, review feedback from previous training programs, conduct interviews or focus groups (or both), conduct direct observations, and engage in document analysis. When you select appropriate data sources, it is important to consider which sources will help you understand your learners’ predispositions and needs and the complexities associated with the tasks your learners are expected to know and perform upon completion of training. It can be incredibly helpful to directly observe individuals correctly performing a task or procedure in the actual environment where they will be expected to perform. This provides you with an opportunity to observe other environmental factors that may pose distractions to your learners, such as time constraints, noise, temperature, and reliance on others. Direct observations can also provide opportunities for you to collect video footage of an individual performing the task that you can share during your instruction. Such footage can give your learning audience a realistic view of the environment so that they have a more concrete visualization of what is being taught. There are different observer roles that an individual may adopt when conducting direct observations. Spradley (1980) has identified five types: non-participatory, passive participation, moderate participation, active participation, and complete participation. Table 4.2 provides an overview of these types of participation along with the advantages and disadvantages of using them during a learner analysis. If you are provided an opportunity to conduct a learner analysis, it is important that you consider what information different data sources will yield. How will the information inform your understanding of your learners?
Table 4.3 can be used to help you organize and rationalize what data sources you use for your analysis.
TABLE 4.2 Overview of the Types of Participant Observations

Non-Participatory
Description: The observer has no direct contact with the individual(s) being observed.
Strengths: This can be useful for social situations that do not allow for direct participation or interaction with the population.
Limitations: The observer is unable to ask follow-up questions to the population as new information emerges.

Passive Participation
Description: The observer serves as a bystander and does not directly interact with anyone in the environment.
Strengths: This allows the observer to avoid interfering with individuals performing particular tasks or interacting with others in the environment (or both).
Limitations: As in a non-participatory role, the observer is unable to build a rapport or ask individuals for follow-up information.

Moderate Participation
Description: The observer balances between being an insider and outsider.
Strengths: Provides the observer with the ability to engage directly with the environment while still being able to detach from the environment in order to remain objective.
Limitations: While this is not specifically a limitation, it is important that the observer understand their roles as an insider and an outsider to be objective.

Active Participation
Description: The observer becomes a member of the group being observed and actively participates to gain an understanding of the environment.
Strengths: Provides the observer with an opportunity to perform tasks first-hand in order to see how an individual interacts with others within an environment.
Limitations: Challenges may arise if the observer becomes so entrenched in the environment that they are unable to detach and make objective observations.

Complete Participation
Description: The observer is a member of the group being observed prior to the needs assessment (or study).
Strengths: The observer is familiar with the culture of the environment being observed.
Limitations: There are a number of risks of the observer not being able to maintain objectivity during the project.

Source: Stefaniak, J. E. (2020). Figure 8.2. Overview of the Types of Participant Observation (p. 113). Used with permission.
TABLE 4.3 Identifying Potential Data Sources
Columns: Information Needed; Data Source; Time to Assess; Individual(s) Needed; Additional Resources Needed
Where Does Learner Analysis Fit in the Design Process? When we think back to the three questions we should ask ourselves during every instructional design process (Am I contributing to efficiency? Am I contributing to effectiveness? Am I contributing to my learners’ ease of learning?), we need to be mindful of how much time we spend on our learner analysis. Learner analyses and needs assessments are similar in that clients do not want us to spend unnecessary amounts of time collecting data when we can be designing. It is important that we communicate our needs as designers when designing instruction and advocate for how additional information pertaining to our learners can contribute to the efficiency, effectiveness, and ease of the project. Cennamo and Kalk (2019) note that while a number of instructional design textbooks (Dick et al., 2009; Larson & Lockee, 2020; Morrison et al., 2013; Smith & Ragan, 2005) offer slight variations in the different types of learner characteristics an instructional designer may gather, it is important to determine what information will be essential to your design. It is important to remind yourself why you need particular information about your learners. Cennamo and Kalk (2019) recommend considering the relationship of learner characteristics (see Figure 4.1). Your learners’ motivation, prior knowledge, and cognitive and physical abilities inform and influence their understanding (Cennamo & Kalk, 2019). Demographic data provide general data that can help you make assumptions about your learning audience and engage in design conjecture. Chapter 2 discussed the role that conjecturing plays in the design process. Even when we make conjectures, we should have a rationale for why we are making particular assumptions. As we make assumptions
FIGURE 4.1 Relationship of Learner Characteristics (motivation, prior knowledge, and cognitive and physical abilities). Source: Cennamo, K., & Kalk, D. (2019). Real world instructional design: An iterative approach to designing learning experiences (2nd ed.). Routledge. Figure 2.2 – Relationship of learner characteristics (p. 36). Used with permission.
FIGURE 4.2 Relationship Between Learner Analysis and Design Space (design space and time allocated for project, framed by the questions: Who is my primary audience? What is the relationship between the learner characteristics? What data do I need to balance motivation with relevancy and transfer? What data can I collect within the time allocated for the project?).
about our learning audience, it is important to consider what information we have pertaining to our primary learning audience. Earlier in this chapter, I asked that you reflect on your most recent instructional design project. I asked what information you wish you had been provided about your learners. Chapter 2 discussed establishing parameters around your design space. The same can be done for visualizing where learner analysis fits into your design space (see Figure 4.2). If we think of our design space as a box, we should be asking ourselves what information is essential to the project and can be reasonably collected during the duration of the project. This is where we need to be able to differentiate between information that is essential versus information that
TABLE 4.4 Learner Analysis Conjectures
Columns: Information Available; Assumptions to be Made; Design Decision (Action)
Rows (guiding questions): Who is my primary audience? What is the relationship between the learner characteristics? What data do I need to balance motivation with relevancy and transfer? What data can I collect within the time allocated for the project?
would be nice to have. Once we are able to determine what information falls within our design space, we may begin to conjecture and make design decisions based on the information given to us. Table 4.4 provides a learner analysis form that you may use to help inform your design conjectures. Chapter 5 will provide additional information for how we can localize context to further establish parameters around our design space. Additional information will be provided to support aligning learners’ predispositions with instructional and transfer contexts.
Balancing Tensions Between Time, Quality, and Inclusivity It can become very overwhelming to think about how best to map out our instruction when we are presented with a large number of learners, each bringing unique needs and predispositions to instruction. In an ideal world, we would accommodate every learner to ensure that they leave the instructional experience with 100% satisfaction. When we are dealing with designing instruction for multiple learners, we need to be prepared that it will be quite difficult to accommodate every learner’s unique needs and requests. To do so could be at the expense of the quality of instruction, time allocated to design the instruction, and overall expectations expressed by the client. When a learner analysis presents multiple and competing learner needs, designers are faced with the challenge of determining whose needs should be prioritized. Smith and Ragan (2005) suggest identifying the primary audience and the secondary audience. The primary audience is prioritized in the instruction. For example, you may be tasked with designing a workshop to teach new faculty how to use the learning management system at a local higher education institution. Most of the faculty who will be in attendance are assistant professors who are starting their first job as a professor. You have been informed that a few may already have experience with the learning management system platform.
If your primary audience is the majority of faculty who have minimal experience, you can focus your instruction on ensuring that you are covering the basics of using the learning management system. While there may be a few participants who have previous experience, you can focus on the fact that your primary audience (and the majority of participants) does not. This helps you put parameters around your design space for what is expected of you to meet your learners’ needs. Learner analysis is typically mentioned as a component of the A in the ADDIE (Analyze, Design, Develop, Implement, and Evaluate) process. While the bulk of information about our learning audience should be gathered at the beginning of the project, it should be revisited continually through each iteration of design. One way we can further understand who our learners are is by putting ourselves in their shoes. This can be accomplished by engaging in empathetic design.
The Role of Empathy in Learner Analysis Learner analyses have the potential to provide invaluable information to instructional designers who want to customize instruction to meet the needs of their learners. Through empathy, we can put ourselves in our learners’ shoes and think about how they might navigate our instructional materials and interpret different aspects of them. Design thinking is a philosophy that many instructional designers have gravitated toward in the past two decades. The design thinking philosophy consists of five phases to transform critical needs into viable solutions: empathize, define, ideate, prototype, and test (Brown & Wyatt, 2010). The empathy phase is at the forefront of the design thinking process and requires us to put ourselves in someone else’s position (Köppen & Meinel, 2015; Roberts et al., 2016). This enables us to better understand the needs of the project from their perspective. This initial phase of empathy remains prominent as a designer defines and frames the problem; ideates a variety of plausible solutions; designs and prototypes; and tests their effectiveness. Kouprie and Visser (2009) describe empathy as occurring through four phases in design practice: discovery, immersion, connection, and detachment. Table 4.5 provides an overview of how an instructional designer may engage in Kouprie and Visser’s (2009) phases of empathy.
The Use of Personas Once we have gathered data on our learning audience, we can use that information to develop personas. Personas can be used “to make the user ‘real’, so that [designers] can develop empathy for the user and use that empathetic connection to view design decisions from the persona’s perspective” (Williams van Rooij, 2012, p. 79). Personas provide us with an opportunity to put ourselves in our learners’ shoes. We can think about
TABLE 4.5 Phases of Empathy as Applied in Instructional Design

Discovery
Descriptors (Kouprie & Visser, 2009): Entering the user’s world; achieving willingness
Examples in Learner Analysis Design:
• Meeting with the learner to hear their stories and perspectives that can inform needs assessment and learner and contextual analyses

Immersion
Descriptors (Kouprie & Visser, 2009): Wandering around in the user’s world; taking the user’s point of reference
Examples in Learner Analysis Design:
• Engage in direct observations and interactions with the learners to develop an understanding of their world as it relates to the project
• Discuss directly with the learner the challenges they experience
• Practice cultural humility

Connection
Descriptors (Kouprie & Visser, 2009): Resonating with the user; achieving emotional resonance
Examples in Learner Analysis Design:
• Listen with an open mind as the learner shares their perceptions
• Attempt to understand the learner’s motivational drivers
• Reflect on your own experiences to better understand your learner’s perspective
• Seek to understand the feelings and meanings of what the learner is experiencing

Detachment
Descriptors (Kouprie & Visser, 2009): Leaving the user’s world; designing with the user’s perspective
Examples in Learner Analysis Design:
• Return to your design space to make sense of what you now understand of your learner’s world
• Reflect upon how you can help support your learner’s goals
how a particular person may approach and interpret instruction. Personas can be helpful as we try to envision how a learner will engage with the instruction from start to finish. Effective personas:
• Represent a major user group
• Express and focus on the major needs and expectations of the most important user groups
• Give a clear picture of the user’s expectations and how they’re likely to use the content
• Aid in uncovering universal features and functionality
• Describe real people with backgrounds, goals, and values (U.S. Department of Health and Human Services, 2020, para 1)
Personas are a strategy of empathic design that keeps the learner at the focal point throughout the design and development process (Williams van
Rooij, 2012). When we design personas, it is important to create a narrative that describes a learner that we, as designers, can visualize throughout our design process. As we design instruction and make decisions about different activities and assessments, we can revisit those personas to think about how that individual would feel before, during, and after instruction. It is important to visualize our personas. What do they look like? When developing personas, instructional designers will often rely on stock images to present an image of a potential learner. Personas should be given names so we can refer to them as actual people when discussing them. Personas can provide insights into what our learners want to accomplish and their experiences with the content, work, and others (Cooper, 2004; Kouprie & Visser, 2009). Personas provide an opportunity to build upon information that we may have about our learners’ predispositions and motivations. Crafting personas helps us consider the conversations we might have if we were trying to convince them that instruction is valuable. Figure 4.3 provides an example of a persona that was used for a training and development program. When we develop personas, it is important to be mindful of the realities of our learning audience. We do not craft personas that are reflective of
FIGURE 4.3 Example of a Learner Persona. Source: Stefaniak, J., Shah, S., Mills, E., & Luo, T. (2020). Keeping the learner at the focal point: The use of needs assessment and persona construction to develop an instructional resource center for instructional designers. International Journal of Designs for Learning, 11(2), 142–155. Figure 2 – Persona of Jamie Jones. (p. 145). Used with permission.
the learners that we want to have participating in instruction; they must be reflective of the primary audience. This may involve developing multiple personas with conflicting goals, attitudes, and experiences. By developing multiple personas, we can begin to think about how different learners will perceive instruction. The following steps are adapted from Williams van Rooij (2012) to develop personas. This could be done as an exercise with other members of your design team. Each designer would complete the activity individually and then compare their notes:
1. Recall a learner who is memorable.
2. List the learner’s characteristics (one characteristic per note or post-it note).
3. Place all notes on a board for your design team to see.
4. Review and discuss the learner characteristics. Are there similarities? What major differences exist?
5. Cluster similar characteristics together.
6. Review the learner characteristics that are left unclustered.
7. Use clusters to form a mental model of a single person.
8. Draft a description of the learner. Provide descriptors of what they may look like.
9. Compare and consolidate drafts.
Fostering Critical Consciousness in Learning Design Practice While learner analysis has been included as a primary instructional design task in the majority of instructional design models, the instructional designer’s role in the learner analysis process has tended to follow a paternalistic approach, in which the instructional designer decides on behalf of their learning audience what learner needs are worth addressing. Recently, scholars have begun to question the utility and appropriateness of the instructional design models used by many in the field (Moore, 2021). Several have criticized the inconsistent emphasis on, or total disregard for, culture in instructional design practices (Asino et al., 2017; Bradshaw, 2018; Giacumo et al., 2021; Subramony, 2019). I particularly like Spencer-Oatey’s (2008) definition of culture: Culture is a fuzzy set of basic assumptions and values, orientations to life, beliefs, policies, procedures, and behavioral conventions that are shared by a group of people, and that influence (but do not determine) each member’s behavior and his/her interpretations of the “meaning” of other people’s behavior. (p. 8) A common phrase used by most instructional designers when asked a question is “it all depends.” Spencer-Oatey’s (2008) definition of culture embraces that ambiguity by acknowledging that while culture influences
each member of the group, it does not necessarily determine their behavior or their own interpretation and meaning-making of the world around them. The fuzziness of the definition accounts for the varying degrees to which individuals within a group may subscribe to a particular culture (Benson et al., 2017). Every learner has their own perceptions of the world around them, and it is important to recognize that those perceptions are real to the learner. I am not entirely convinced that we need new design models, but I do think that we need to strengthen our efforts in how instructional designers are prepared to approach various tasks within the instructional design process. Recognizing that the emphasis of this chapter is on learner analyses, I do think that adopting an empathetic approach to design can help us connect with our learning audience and inform our design. The third phase of Kouprie and Visser’s (2009) empathy framework suggests that during the connection phase, where a designer is resonating with the user, “both affective and cognitive components are important; the affective to understand feelings, the cognitive to understand meanings” (p. 445). By adopting an empathetic mindset throughout our design process, we are better positioned to understand our learners’ perceptions of the world, particularly as they directly or indirectly relate to the topic for instruction. Immersing ourselves in their world provides us with additional information to help us understand our learners’ perspectives and how they approach different tasks in their environment. We talk a lot in our field about the importance of instructional designers engaging in reflective practice; however, emphasis is placed more on cultivating designers’ responses to contextual saliences (McDonald, 2022). Sockman and Kieselbach (2022) implore instructors to practice cultural humility.
This further supports Kouprie and Visser’s (2009) framework, which emphasizes the importance of resonating with the learner and listening to understand their perspectives. The practice of cultural humility can support our abilities to engage in reflection-in-action during practice to make the necessary adjustments in response to contextual saliences. Hammond (2015) provides an analogy of the depths of the ocean to describe the depths of culture. She proposes that there are three cultural depth levels. The surface level consists of elements that are generally observable. Examples may include holidays, language, clothing, art, and food. Shallow culture encompasses the unspoken rules of social norms that exist within a group of people. Examples may include expectations regarding tone of voice, eye contact, and personal space. Deep culture embodies unconscious assumptions that govern our worldview. Hammond (2015) argues that deep culture is the most important form because it has an emotional impact on establishing trust and greatly influences our abilities to engage in meaning making and making sense of our world. Examples of deep culture include decision-making practices, spirituality, ethics, and concepts of self. Sockman and Kieselbach (2022) proposed heuristics to support instructional designers’ integration of culture in their design practices (Table 4.6).
TABLE 4.6 Levels of Culture with Instructional Design Considerations

Surface
Description: Observable and concrete elements; what we see; generally acknowledged.
Instructional Design Questions for Consideration:
• Media: Does media represent various cultures, ethnicity, and race in equal status through position, dress, and intellect?
• Greetings: Do opening messages welcome diverse perspectives?
• Calendar: How do calendars and timelines acknowledge holidays of cultures other than the dominant or privileged culture?

Shallow
Description: Unspoken rules around everyday social interactions and norms; non-verbal.
Instructional Design Questions for Consideration:
• Social Interactions: What is communication like among different genders and identities in different learning spaces (i.e., in-person, discussion boards, video conferencing)? What does communication with instructors look like (eye contact, questioning, and comfort)?
• Course norms of time, sequence, and effort: How are assignment due dates viewed? What process is expected when learning the course content, and how is that expectation conveyed? How much time and effort are learners expected to devote to assignments?

Deep
Description: Unconscious assumptions that govern the worldview of good or bad that guide ethics, spirituality, health, and theories of group harmony.
Instructional Design Questions for Consideration:
• Success: What does success look like from the perspective of the learner and of the professor, and why?
• Ethics: What are the ethical embodiments of communication (i.e., netiquette, speech)? What are the expectations and enactments of copyright, Creative Commons, and plagiarism?
• Group Harmony (Individualistic or Collectivist): How are learners expected to generate thought, in groups or individually, and how do learners perceive the difference? Is the course collaborative or competitive (or both) in its orientation?
• Exemplars: What are “good” exemplars? Do they represent a diversity of perspectives with various cultures in equal position?
• Aesthetics: Is the course aesthetically pleasing (for the content in its layout and design) to the beholder? What is the basis for the aesthetic preference that governs acceptable qualities?

Source: Sockman, B., & Kieselbach, L. (2022). Table 14.1. Levels of Culture (Hammond, 2015, pp. 22–24) with Instructional Design Considerations (p. 136). Used with permission.
The questions in Table 4.6 are intended to guide the instructional designer as they reflect on their learning audience. This coincides with Kouprie and Visser’s (2009) fourth phase of empathy, detachment, where the designer must leave the user’s world and return to the design space. Sockman and Kieselbach’s (2022) questions serve as heuristics to aid the instructional designer in reviewing their design materials through a cultural lens.
Summary This chapter provided an overview of strategies to support instructional designers as they engage in learner analysis. Emphasis was placed on employing empathetic design approaches to develop a deeper understanding of our learners. Chapter 5 expands on our discussions of needs assessment and analyses by examining the role that context plays in our design practices. We will discuss how context can be localized to enable instructional designers to design efficiently and effectively.
Connecting Process to Practice Activities
1. What challenges have you experienced as a designer conducting learner analyses? What types of information are typically shared with you? Have you had an opportunity to directly gather data?
2. If you worked for a company that did not provide you the opportunity to conduct learner analyses, what are five questions you would want to ask your clients about your learners for every project?
3. How might you advocate the importance of conducting a learner analysis to a client? How could a learner analysis support the efficiency, effectiveness, and ease of instruction?
4. If you were asked to design a workshop on learner analysis for an introductory course in instructional design, who would be your learners? Create two personas that reflect your learners. For each persona, write a two- to three-paragraph description that addresses the following: What’s their backstory? Why are they in your workshop? Are they happy to be there? What are their aspirations? How do they plan to use the information provided in your workshop?
5. Using Table 4.6 (Sockman & Kieselbach, 2022), review instructional materials that you have recently developed. Take a moment to use the guiding questions to audit your work. How have you addressed culture in your materials?
Bridging Research and Practice Avgerinou, M. D., & Andersson, C. (2007). E-moderating personas. The Quarterly Review of Distance Education, 8(4), 353–364. Baaki, J., Tracey, M. W., Bailey, E., & Shah, S. (2021). Graduate instructional design students using empathy as a means to an end. Journal of Design Research, 19(4–6), 290–307.
Dudek, J., & Heiser, R. (2017). Elements, principles, and critical inquiry for identity-centered design of online environments. International Journal of E-Learning & Distance Education, 32(2), 1–18.
Matthews, M. T., Williams, G. S., Yanchar, S. C., & McDonald, J. K. (2017). Empathy in distance learning design practice. TechTrends, 61(5), 486–493. https://doi.org/10.1007/s11528-017-0212-2
Parrish, P. (2006). Design as storytelling. TechTrends, 50(4), 72–82.
Quintana, R. M., Haley, S. R., Magyar, N., & Tan, Y. (2020). Integrating learner and user experience design: A bidirectional approach. In M. Schmidt, A. A. Tawfik, I. Jahnke, & Y. Earnshaw (Eds.), Learner and user experience research: An introduction for the field of learning design & technology. EdTech Books. https://edtechbooks.org/ux/integrating_lxd_and_uxd
References
Asino, T. I., Giacumo, L. A., & Chen, V. (2017). Culture as a design “next”: Theoretical frameworks to guide new design, development, and research of learning environments. The Design Journal, 20(sup1), S875–S885. https://doi.org/10.1080/14606925.2017.1353033
Benson, A. D., Joseph, R., & Moore, J. L. (2017). Introduction to culture, learning, and technology: Research and practice. In A. D. Benson, R. Joseph, & J. L. Moore (Eds.), Culture, learning, and technology: Research and practice (pp. 1–7). Routledge.
Bradshaw, A. C. (2018). Reconsidering the instructional design and technology timeline through a lens of social justice. TechTrends, 62(4), 336–344. https://doi.org/10.1007/s11528-018-0269-6
Brown, T., & Wyatt, J. (2010). Design thinking for social innovation. Development Outreach, 12(1), 29–43.
Cennamo, K., & Kalk, D. (2019). Real world instructional design: An iterative approach to designing learning experiences (2nd ed.). Routledge.
Cooper, A. (2004). The inmates are running the asylum: Why high-tech products drive us crazy and how to restore the sanity. Sams Publishing.
Dick, W., Carey, L., & Carey, J. (2009). The systematic design of instruction (7th ed.). Allyn & Bacon.
Fulgencio, J., & Asino, T. (2021). Conducting a learner analysis. In J. K. McDonald & R. E. West (Eds.), Design for learning: Principles, processes, and praxis. EdTech Books.
Giacumo, L. A., MacDonald, M., & Peters, D. T. (2021). Promoting organizational justice in cross-cultural data collection, analysis, and interpretation: Towards an emerging conceptual model. Journal of Applied Instructional Design, 10(4). https://doi.org/10.51869/104/lgi
Hammond, Z. (2015). Culturally responsive teaching & the brain: Promoting authentic engagement and rigor among culturally and linguistically diverse students. Corwin Press.
Keller, J. M. (1987). Development and use of the ARCS model of instructional design. Journal of Instructional Development, 10(3), 2–10. https://doi.org/10.1007/BF02905780
Keller, J. M. (2010). Motivational design for learning and performance. Springer.
Köppen, E., & Meinel, C. (2015). Empathy via design thinking: Creation of sense and knowledge. In H. Plattner, C. Meinel, & L. Leifer (Eds.), Design thinking research: Building innovators (pp. 15–28). Springer.
Kouprie, M., & Visser, F. S. (2009). A framework for empathy in design: Stepping into and out of the user’s life. Journal of Engineering Design, 20(5), 437–448. https://doi.org/10.1080/09544820902875033
Larson, M. B., & Lockee, B. B. (2020). Streamlined ID: A practical guide to instructional design (2nd ed.). Routledge.
McDonald, J. (2022). Preparing instructional design students for reflective practice. In J. E. Stefaniak & R. M. Reese (Eds.), The instructional design trainer’s guide: Authentic practices and considerations for mentoring ID and ed tech professionals (pp. 29–37). Routledge.
Moore, S. L. (2021). The design models we have are not the design models we need. The Journal of Applied Instructional Design, 10(4).
Morrison, G. R., Ross, S. M., Kalman, H. K., & Kemp, J. E. (2013). Designing effective instruction (7th ed.). Wiley.
Roberts, J. P., Fisher, T. R., Trowbridge, M. J., & Bent, C. (2016). A design thinking framework for healthcare management and innovation. Healthcare, 4(1), 11–14. https://doi.org/10.1016/j.hjdsi.2015.12.002
Rothwell, W., & Kazanas, H. (2004). Mastering the instructional design process: A systematic approach (3rd ed.). Pfeiffer.
Smith, P. L., & Ragan, T. J. (2005). Instructional design (3rd ed.). Jossey-Bass.
Sockman, B., & Kieselbach, L. (2022). Instructional design embedded in culture. In J. E. Stefaniak & R. M. Reese (Eds.), The instructional design trainer’s guide: Authentic practices and considerations for mentoring ID and ed tech professionals (pp. 135–146). Routledge.
Spencer-Oatey, H. (2008). Culturally speaking: Culture, communication, and politeness theory (2nd ed.). Bloomsbury Publishing.
Spradley, J. P. (1980). Participant observation. Holt, Rinehart, and Winston.
Stefaniak, J., Shah, S., Mills, E., & Luo, T. (2020). Keeping the learner at the focal point: The use of needs assessment and persona construction to develop an instructional resource center for instructional designers. International Journal of Designs for Learning, 11(2), 142–155.
Keeping the learner at the focal point: The use of needs assessment and persona construction to develop an instructional resource center for instructional designers. International Journal of Designs for Learning, 11(2), 142–155. Stefaniak, J. E., & Baaki, J. (2013). A layered approach to understanding your audience. Performance Improvement, 52(6), 5–10. https://doi.org/10.1002/pfi.21352 Subramony, D. P. (2019). Revisiting instructional technologists’ inattention to issues of cultural diversity among stakeholders. In A. D. Benson, R. Joseph, & J. L. Moore (Eds.), Culture, learning, and technology: Research and Practice (pp. 28–43). Routledge. U.S. Department of Health and Human Services (2020). Personas. https://www. usability.gov/how-to-and-tools/methods/personas.html Williams van Rooij, S. W. (2012). Research-based personas: Teaching empathy in professional education. Journal of Effective Teaching, 12(3), 77–86.
69
5
DEVELOPING A LOCALIZATION OF CONTEXT TO SUPPORT TRANSFER OF LEARNING
Chapter Overview

Recently, research has begun to introduce the idea of a localization of context. Localizing context supports instructional designers' abilities to manage their design space and scale their work so that projects can be completed. Strategies for how to scale design work and work within a bounded rationality will be discussed. The strategies discussed in this chapter will also be related back to content in Chapter 2 that introduces strategies to support managing design space through conjecturing, design judgments, and dynamic decision-making.
Guiding Questions

1. What is context?
2. What is a contextual analysis?
3. How can an instructional designer localize their context?
4. What is the relationship between different contexts?

DOI: 10.4324/9781003287049-5

What is Context?

"Context" is a term that is used frequently in conversations among instructional designers. We often hear people say:

• "We need to take into consideration the context."
• "What contextual factors will impact the training?"
• "The instruction needs to reflect context."

While there are several variations of the definition of context, I think there are two definitions in particular that support the goals of instructional design. The Oxford Dictionary (2020) defines context as "the circumstances that form the setting for an event, statement, or idea, and in terms of which it can be fully understood and assessed." I particularly like that this definition emphasizes circumstances that contribute to a situation or setting. When
LOCALIZATION OF CONTEXT TO SUPPORT TRANSFER OF LEARNING
we design instruction, it is important that we consider the various conditions that may support or hinder the instructional activity (or event) and our learners' abilities to engage meaningfully with the content. The Oxford Dictionary's definition prominently states that context is the culmination of circumstances that lead to something that can be fully understood and assessed. Context plays a very important role throughout all phases of the instructional design process. Chapter 3 discussed how needs assessment should be used to gather as much information as possible to inform the instructional designer of the context. The information that we are able to gather about the situation and our learners prior to designing instruction enables us to customize instruction that is relevant to our learners. Without accounting for context, this task is impossible!

A second definition, offered by Merriam-Webster (2022), describes context as "the interrelated conditions in which something exists or occurs." This definition alludes to the systemic considerations that must be accounted for when designing instruction. As human beings, we exist and interact within a number of different systems. We are members of different systems at work (e.g., teams, departments, and organizations). Our family is a system. All organizations we are affiliated with (e.g., schools, workplaces, community groups, and churches) are systems. Each system is composed of several subsystems. Our bodies are composed of systems such as our respiratory, circulatory, digestive, skeletal, and nervous systems, to name a few. Systems play an important part in our everyday lives. Many of the systems that we are part of interact with other systems. I particularly like the term interrelated conditions in Merriam-Webster's (2022) definition because it acknowledges the connections that exist within a context.
So many aspects of instructional design are derived from systems thinking, taking into account the various moving parts that influence the environments that we design for. We need to think about the organizations and classrooms, our learning communities, technological affordances, and learner affordances that influence how we design and deliver instruction. Contextual analysis enables us to design instruction that considers the interrelationships between different individuals and components within the system. I have described contextual analysis as it relates to instructional design as "the process of identifying factors that inhibit or promote the ability to design instruction that is relevant to what occurs in real-world settings" (Stefaniak, 2020, p. 59). Contextual analysis provides a comprehensive view of a situation. It offers a richness that we, as designers, are not always privy to if we take information provided to us by management and our clients at face value. We differentiated between needs assessment and needs analysis in Chapter 3. As a quick recap, needs assessment is the process of determining whether a gap in performance or expectations exists. Needs analysis
entails determining what is contributing to the existence of those gaps. Contextual analysis is similar to needs analysis in that it provides insight into factors causing things to occur in the environment. It is imperative that, as instructional designers, we have a thorough understanding of the context we are designing for and within, so that we can design solutions that are sustainable. If we only know pieces of information but do not have an understanding of the implications, we may not be addressing the needs of the situation. Contextual analysis enables us to view the environment through multiple layers to gain an understanding of how different factors may interact with or influence one another. As we gather information during the contextual analysis, we are able to refine our design space (as discussed in Chapter 2) and establish boundaries around our design in order to complete the project within the given constraints, thus making it more manageable.
Systemic Implications for Contextual Analysis in Instructional Design

Seminal research on contextual analysis in our field was explored in depth in the 1990s, when Rita Richey and Martin Tessmer published several pieces advocating for the exploration of context. Tessmer's (1990) earlier work advocated for instructional designers to conduct environmental analyses to record contextual factors that may interfere with or facilitate the design and delivery of instruction. Environmental analysis is "the analysis of the context in which the instructional product will be employed, of the physical and use factors of the instructional environment and its support environment" (Tessmer, 1991, p. 9). Tessmer viewed environmental analysis as an extension of needs assessment or front-end analysis. During design work, environmental analyses provide instructional designers with the ability to hone in on specific factors (i.e., economic, legal, political, social, and technological) that may influence the outcomes of the overall environment and projects occurring within the system (Stefaniak, 2020). Information yielded from environmental analyses can influence instructional strategies, delivery platforms, use of educational technologies, assessments, and long-term evaluative strategies that may be employed for a project. Richey and Tessmer (1995) posit that every design project should consider

1. Learning environment factors, such as
   a. Facilities factors (heat, light, space, and seating)
   b. Sociocultural factors (perceived learner/teacher roles)
   c. Materials factors (adaptability, usability, and life span)
2. Support environment factors, such as
   a. Instructional support (production, storage, and distribution services)
   b. Physical conditions (site distribution, seasons, and schedules)
   c. Organizational factors (culture, human resource development practices, and change commitment) (p. 192)

Around this time, Tessmer and Richey began exploring the systemic implications prevalent in instructional design, paying particular attention to the implications that adult learners and organizational clients can impose on a training environment. Richey's (1992) systemic training design model was considered an elaboration of the instructional design models of that era because it integrated specific feedback checkpoints into the instructional design process. The emphasis on feedback demonstrates the iterative nature of design. Richey's (1992) model highlights that designers actively engage in feedback throughout the duration of a project.

Richey and Tessmer's independent and collective research on contextual analysis (e.g., Tessmer & Richey, 1997; Tessmer & Wedman, 1995) suggests that instructional designers should address three levels of context in their design work: orienting, instructional, and transfer. The orienting context addresses factors that concern learners as they relate to their participation in the learning experience. The instructional context considers the environment where instruction will be delivered. The transfer context considers how and where the information acquired in the instructional context will be applied upon completion of training. Table 5.1 provides an overview of contextual factors that fall within the orienting, instructional, and transfer contexts as depicted by Tessmer and Richey (1997).

TABLE 5.1 Contextual Factors Within Levels of the Orienting, Instructional, and Transfer Contexts

Learner Factors
  Orienting Context: Learner profile; Goal setting; Perceived utility; Perceived accountability
  Instructional Context: Learner role perception; Learner task perception
  Transfer Context: Utility perceptions; Perceived resources; Transfer coping strategy; Experiential background

Immediate Environment Factors
  Orienting Context: Social support
  Instructional Context: Sensory conditions; Seating; Instructor role perception; Learning schedules
  Transfer Context: Transfer opportunities; Social support; Situational cues

Organizational Factors
  Orienting Context: Incentives; Learning culture
  Instructional Context: Content culture; Rewards and values; Learning supports; Teaching supports
  Transfer Context: Transfer culture; Incentives

Source: Tessmer, M., & Richey, R. C. (1997). Table 2. Contextual factors within levels of the orienting, instructional, and transfer contexts (p. 92). Used with permission.

Factors within each context are further differentiated according to level: learner, immediate environment, and organizational. The learner factors take into consideration learner characteristics that are relevant to the situation. Immediate environment factors pertain to the workplace or setting that is relevant to the situation. Lastly, organizational factors consist of the broad contextual factors that may influence the situation and depict a higher level within the organizational system. Figure 5.1 provides an overview of the relationship among the learner, immediate environment, and organizational factors with the orienting, instructional, and transfer contexts.

FIGURE 5.1 Relationship among the Learner, Immediate Environment, and Organizational Factors with the Orienting, Instructional, and Transfer Contexts.

These three levels are a depiction of the many systems and sub-systems that may exist within a systems thinking view of an organization (Banathy, 1991). Regardless of the subsystems that may be included in a system and overarching environment, each layer consists of inputs that contribute to or hinder the productivity of the system (i.e., people and resources). There is also an expectation that outputs emerge from each subsystem or system within an environment. These outputs will vary depending on the goals and directives. Within each organizational subsystem and system are people, processes, and products (Richey et al., 2011).

Orienting Context. The orienting context accounts for contextual factors that precede instruction. This context considers learners' predispositions and motivation prior to the start of instruction. The learner's perception of how useful the instruction is and of how accountable they will be for completing and then applying it contributes to how motivated they will be in the instructional context.
The extent to which the environment (e.g., place of employment, school, or hospital) supports training also impacts how well the instruction will be received.
Instructional Context. The instructional context encompasses where the instruction will take place. Learner characteristics such as their perceptions regarding their role during instruction are considered, as well as their perception of the tasks to be included in the instruction. Information pertaining to learner factors should initially be collected during a learner analysis, as discussed in Chapter 4. Immediate environment factors are also considered within the instructional context. Delivery platforms (e.g., face-to-face, online, blended, or hybrid) are considered along with learning schedules. Seating for face-to-face courses and training sessions, along with group assignments for online courses, are also considered. Additionally, the instructor's role perception plays an important role in the instructional context. The amount of coaching instructors may provide to their learners during training should be addressed. It is also important to consider the degree of autonomy the instructor intends to impart to the learners. This will vary depending on the complexity of the task and the learners' degree of familiarity with it. When we look at the instructional context from a systems view, it is important to note that in order for the instructional activity to be successful, there must be a balance and mutual understanding between the learner role and the instructor role. Both parties should understand what is expected of the other. The following items should be considered when addressing learner role and instructor role perception during instruction:

• Has the instructor communicated their expectations of the learners?
• How much autonomy will the learners have?
• What is the instructor's role during instruction?
• Are the students aware of the instructor's role?
Transfer Context. The transfer context encompasses where the knowledge that is acquired during the instructional context will be applied. A prominent factor that influences the extent to which instruction is applied in the transfer context is perception of utility. Learners and other individuals in the organization must see the value of what is being taught in the instructional context. If the learners see the value and perceive that the information can be transferred, they are more apt to advocate for change in the transfer context. The extent to which individuals are successful at applying information from the instructional context to the transfer context depends on the strategies that are in place to support such a transition.

While orienting, instructional, and transfer contexts should be addressed in every design project, instructional designers often struggle to obtain sufficient information about all three contexts to do their jobs correctly. In Chapter 1, we discussed three questions instructional designers should ask themselves when designing instruction:
1. Am I contributing to effectiveness?
2. Am I contributing to efficiency?
3. Am I contributing to ease of learning?

These three questions are extremely difficult to address if an instructional designer does not have the information they need. How can they address efficiency, effectiveness, and ease of learning if they have no knowledge of their learning audience's predispositions, prior experiences, or aspirational goals? How can they select appropriate instructional strategies to deliver instruction if they do not know how their learners will be expected to apply what they learn in training?
Relationship Between Contexts in Instructional Design

The orienting, instructional, and transfer contexts inform each other, as demonstrated in Figure 5.1. It is important for us, as instructional designers, to recognize the amount of control and power of influence we possess for each context. We typically have the most control over the instructional context. We have the ability to use our power of influence over the various design decisions that are made regarding how content will be delivered, what instructional strategies will be used, learning schedules, and distinguishing between the learners' and instructor's roles during instruction. We typically have the least amount of control over the transfer context. Depending on our role within the project and the organization, we may never have the opportunity to see how the learners take what they have learned and apply it in the transfer context. Other times, we may be in the transfer context with them and have a front-row seat to see the extent to which they are able to enact change in their environment. Since we are usually not going to be in full control of these three contexts in the ways that we would like as designers, it is helpful to dissect the relationship between them.

Figure 5.2 demonstrates the relationship between the orienting and instructional contexts. It is difficult to exert control over the orienting context in our role as instructional designers because these are the traits and characteristics that the learners are bringing to the situation. We cannot control an individual's goals, perceptions of utility and accountability, or other predispositions.

FIGURE 5.2 Relationship between Orienting and Instructional Contexts. Instructional designers seek alignment between: learner profiles and the learning community; learners' predispositions and instructional goals; learners' role and instructor perceptions; social and teaching supports; incentives and rewards.

We can, however, attempt to understand why they may have these predispositions. By learning more about our learners, we can leverage that information and customize instructional strategies and examples that support their goals and align with their perceptions of utility and accountability. This also enables us to build a learning community where learners see themselves as valued members. Using information gained during learner analysis can inform the inclusion of diverse examples that are representative of the audience.

We should seek alignment between learners' and instructor's roles. It is important that each group have clear expectations of what is required of them in the instructional context. We should not assume that this will be intuitive for the learners and the instructor. Rather, this should be clearly communicated prior to instruction and reiterated during instruction. Additionally, these roles can be coupled with appropriate social and teaching supports. A good learner analysis should be indicative of learners' prior experiences and expectations pertaining to social supports for learning. It is important to understand what support learners have had in the past to support their learning, along with their attitudes toward learning. Teaching supports may vary depending on the role that the instructional designer is serving in the instructional context. There may be times when we are responsible for designing instruction with the intent that someone else will be responsible for delivering the content. Other times, we may be responsible for designing and delivering the instruction. Regardless of our role, it is important that we ensure that whoever is responsible for delivering the instruction has sufficient resources. Lastly, there should be alignment between incentives and rewards.
Within the orienting context, individuals, including the learners, will be accustomed to having particular incentives provided for participating in instruction. It is important to gain an understanding of intrinsic and extrinsic motivators that will influence the learners’ participation in the instructional context. If we understand the organizational culture and learners’ expectations regarding incentives, appropriate rewards and values can be integrated into the instructional context to recognize learners for their performance. If we do nothing to ensure alignment between incentives and rewards, the rewards we integrate into our instructional context may mean nothing to our learners. If a reward for completing a training is not valued or perceived as attainable by the learner, they may not be inclined or motivated to participate and apply the content in the transfer context. For example, I worked for an office supply store on a part-time basis during college. The company had changed their guidelines for technology warranties and required all staff to complete training. During one of the busiest holiday seasons, employees were told that they would be entered into a warranty competition. Whoever sold the most warranties during a two-week period would receive a paid vacation day. While that sounds
like an enticing incentive, the company expected part-time employees to compete with full-time employees. Employees like me, who were working 10–12 hours a week, would have an extremely difficult time competing against an individual working 40–50 hours a week. The full-time employees would have more exposure to customers because of the difference in hours. The store's management could not figure out why the part-time employees were not motivated to attempt to sell extra warranties to customers. When designing opportunities for rewards associated with learning, instructional designers and managers should consider what will be valued by learners and employees and what is reasonable given other incentives and expectations within the organization.

When an individual who has participated in instruction fails to apply what they have learned in the transfer setting, the two biggest reasons are often that (1) there was a disconnect between the instruction and the real world where the instruction was to be applied, and (2) there was a lack of support from organizational leadership for change in the transfer context. If we were brought onto a project to conduct a thorough organizational needs assessment prior to any decisions being made about instruction, we may have some influence over organizational support for training. We could use the data that we collected to inform recommendations made to organizational leaders. This could initiate conversations about the types of infrastructure, resources, and support needed to support individuals in the transfer context. Table 5.2 provides a tool to support your abilities during a needs assessment to explore a situation through different contextual lenses.

TABLE 5.2 Organization of Information Related to Contextual Factors

For each set of contextual factors (orienting, instructional, and transfer), the tool prompts:
• What is already known about these factors?
• What information is needed about these contextual factors?
• What data sources are needed to address questions?
• How do these contextual factors impact the needs of the project?
• Do these contextual factors present other needs that should be included in the needs analysis?

Source: Stefaniak, J. (2020). Table 5.2. Organization of information related to context factors (p. 69). Used with permission.
Our influence over the organizational support is greatly minimized if the decision has already been made that training is needed when we are brought on a project. It is further decreased if a lack of organizational support exists. Where we do have influence is in our ability to strengthen our instructional design activities to reflect the realities that learners will experience in the transfer context. One of the most frustrating training experiences a learner can go through is to attend training only to go back to work and realize that they will never be able to use what they just learned. They may come to the realization that their organization does not have the current technology that was used in training, the infrastructure might not be there to support the application of their new knowledge, or other environmental factors might hinder productive change from occurring. Knowing that we typically hold the most power of influence over what occurs in the instructional context, we can intentionally design our instructional content to align with utility perceptions in the transfer context (Figure 5.3). As designers, we can ask our clients questions to understand what the realities are within the setting where our learners would be expected to apply their newly acquired knowledge and skills. We can attempt to align content with perceptions of utility by including images of the transfer setting that learners will recognize, using the same technology that they will be expected to use in the transfer context, and explaining the rationale for why change is needed so that learners recognize the utility of changes. Within the instructional context, we have the opportunity to gather data on our learners’ perceptions toward the task. By understanding their perceptions regarding the level of difficulty it may take to learn the new tasks and apply them in the transfer setting, we can weave in transfer coping strategies in the instructional context. 
This presents the learner with resources and strategies they may need to employ when they begin enacting change. It also foreshadows some of the early growing pains they may experience if they are responsible for enacting substantial changes to regular operations and expectations within their organization.

FIGURE 5.3 Relationship between Instructional and Transfer Contexts. Instructional designers seek alignment between: instructional content and perceived utility; learner task perception and transfer coping strategies; participation and incentives for application.

While rewards and incentives were discussed when exploring the relationship between orienting and instructional contexts, they are also
prevalent when examining the relationship between the instructional and transfer contexts. Incentives provided for participating in training should also be aligned with reasonable and attainable incentives in the transfer context. Learners should understand what the incentives are for changing their behavior and approach to tasks in the transfer setting. Similar to aligning learner task perception with transfer coping strategies, incentives provided in the transfer setting (intrinsic, extrinsic, or both) can be discussed in the instructional context as a means to help learners see the relevance of the instruction. This enables them to begin to make their own connections between the instructional and transfer contexts.

In addition to balancing the orienting and transfer contexts with the instructional context, alignment is needed between the orienting and transfer contexts themselves. If we isolate these two contexts and explore what is needed to support effective and efficient transfer of new knowledge, there should be balance between learners' perceptions of utility and opportunities for transfer. Not only should learners perceive that what they are learning will be useful and contribute to efficiency and effectiveness in the application setting, there should also be opportunities that support transfer. If one of these elements is absent, successful application and change most likely will not occur. If learners perceive utility in what they have just learned but there are no opportunities to engage in change or apply their knowledge, minimal transfer will occur because of the lack of infrastructure. If opportunities are present in the transfer context to warrant and support application and change but learners fail to see the usefulness of what they have just learned, transfer will not occur because learners will put minimal effort into change.
Any time changes or adjustments are to occur within an organization, there needs to be a mutual understanding between learner accountability and transfer culture. Learners need to be able to view themselves as accountable for learning new tasks and applying those tasks in the transfer context. The organization needs to promote an environment that supports the application and implementation of those changes. That support needs to extend beyond communicating to members of the organization that change is supported; there also needs to be an understanding that change is incremental and takes time to yield efficient and effective results. Lastly, in an effort to optimize learner engagement throughout the process of learning new tasks and applying them in the transfer context, it is imperative that the incentives for transfer align with the learners' goals. In the orienting context, learners should bring with them their individual goals for learning new tasks as well as goals for contributing to the organization. In a successful environment, the learners will see the potential to meet these goals by applying new tasks in the transfer context. One particular caveat that is important to note is that we have minimal control over managing the isolated relationship between the orienting and
LOCALIZATION OF CONTEXT TO SUPPORT TRANSFER OF LEARNING Instructional Designers Seek Alignment Between: • Learners’ perceived utility and opportunities for transfer • Learners’ perceived accountability and transfer culture • Learners’ goals and incentives for transfer
Orienting Context
Transfer Context
FIGURE 5.4 Relationship between Orienting and Transfer Contexts.
transfer contexts. If we have a role early in the planning process, particularly with conducting a needs assessment to map out an organization’s needs, we can initiate conversations to determine the extent that balance may exist between the factors included in Figure. If we are limited to solely focusing our time on developing instruction, we can attempt to mitigate any discrepancies or imbalance that may exist between these factors by streamlining instructional rewards that support learners’ goals and reflect the realities of what they can expect in the transfer context (Figure 5.4).
Where Does Context Fit in Our Design Process?
Tessmer and Richey (1997) outlined several assumptions pertaining to the role that context plays in instructional design:
• We are condemned to context.
• Context is a medley of factors that inhibit or facilitate to varying degrees.
• There may be multiple contexts for a given learning or performance.
• Instructional designers are responsible for the successful application as well as acquisition of learning and therefore must respond to orienting and performance contexts as well as instructional contexts.
• Instructional designs can accommodate context but cannot control it.
• The impact of context varies with the nature of the learner, the content, and the intensity of the contextual elements.
• Successful instructional designs must be, to some extent, situation-specific.
• Systems orientations to instructional design are, on the whole, more effective than systematic orientations. (p. 88)

If our motto for every design project we engage in were "the learner is at the center of everything we do," we would structure every design question we have during different phases of design around our learners' needs. We would use all the information we gathered from exercises outlined
in Chapter 4 to inform our design questions. Learner analysis tools like personas would give us a glimpse into how our learners may approach and use the instruction we are planning to provide to them. Keeping the learner at the center of everything we do, we must ask ourselves, where does context fit? If the learner is the focal point of our design work, then context should be in our peripheral vision at all times. Ideally, we should be re-evaluating context as we maneuver through each phase of design. This iterative scanning can provide frequent and ongoing feedback to inform our design decisions, as recommended by Richey (1992) in her systemic design model. In doing so, we can identify new information that may be relevant to our projects and potentially warrant adjustments to our designs. Figure 5.5 provides sample questions you may choose to ask during an initial project kick-off meeting to gain an overview of the orienting, instructional, and transfer contexts relevant to the project (Stefaniak, 2021).
A Localization of Context
Contextual analysis is similar to needs assessment in that it does not occur only once during a project; rather, it is an iterative process that requires the instructional designer to continually scan the environment for new information that may hinder or support instruction. A challenge with contextual analysis is gathering sufficient information about different contexts in a timely manner. Just like needs assessment, contextual analysis should be viewed as scalable. In an ideal situation, time is allocated at the beginning of a project for a needs assessment, and contextual analysis can be integrated into many of the needs assessment activities prior to starting the design of instruction. When time is not earmarked for needs assessment, we can still get creative by incorporating contextual analysis into our regular design practices. We can leverage discussions with our clients during project kick-off meetings, review sessions with subject matter experts, and possible beta-tests with learners as opportunities to ask questions, observe behaviors, and make adjustments to instructional content. Another challenge that we need to be mindful of is knowing what information is relevant to our project. Conducting a thorough needs assessment and contextual analysis takes time and requires us to collect and analyze data from multiple sources. Depending on how much information is made available to us, this can quickly become a daunting task. Sometimes, it is difficult to distinguish between what would be nice to know and what is needed. One way we can mitigate some of the logistical challenges and constraints associated with contextual analysis is through the localization of context. The localization of context was introduced by Meloncon (2017) to emphasize the personal experience in health care. Emphasis is placed
INSTRUCTIONAL DESIGN PROJECT INTAKE FORM

Date:
Client:
Instructional Designer:
Project Name:

PROJECT OVERVIEW
1. What is the purpose of the project (instructional need)?
2. What is the scope of the project?
   a. Learning platform (face-to-face, blended, online)
   b. Overarching course goal
   c. Learning objectives
3. What level of importance is the training? (i.e., severe, moderate, mild)

LEARNING AUDIENCE
1. Who is the intended learning audience?
2. What are the learners' experiences with the project topic?
3. What challenges do learners typically experience with this topic?
4. What are the learners' overall attitudes toward training?
5. What information will the instructional designer have access to regarding the learning audience? (i.e., job observations, meetings with learners, work products, interviews, etc.)

INSTRUCTIONAL ENVIRONMENT
1. How will instruction be delivered?
2. How will learners access the material?
3. What is the length of the course?
4. What are the learners' roles during instruction?
5. What is the instructor's role during instruction?
6. What types of assessment need to be included in the instruction?

TRANSFER (APPLICATION CONTEXT)
1. How soon after the training will learners apply their newly acquired skills?
2. What are the anticipated challenges with applying these new skills in a real-world environment?
3. What resources are available to support learners during this transfer phase (i.e., job aids)?
4. Who is responsible for monitoring learners with transference?

EVALUATION
1. How and when will the instructional training be evaluated for effectiveness?
2. Who will be responsible for conducting an evaluation?
3. What methods of evaluation will be used to determine the efficiency and effectiveness of the instruction?

OTHER COMMENTS:
FIGURE 5.5 An Example of an Instructional Design Intake Form.
on identifying contextual factors that will directly impact an individual as they go through whatever intervention is being provided. This helps the designer reflect more on the user experience as well as understand what relationships exist between different contextual factors. Through examining the role of user experience design in patient education, Meloncon (2017) proposed a new, more patient-centered approach that she described as patient experience design. This approach incorporated a more participatory approach to contextual inquiry to understand the relationships between information, the patient (user), technical communication, and activities that occur within health care. Baaki and Tracey (2019) introduced the idea of using a localization of context to guide contextual inquiries in instructional design. Their interpretation of contextual analysis builds upon Tessmer and Richey's (1997) view that the learner should be at the center of each context (e.g., orienting, instructional, and transfer). Their approach also integrates Meloncon's (2017) approach to supporting patient experience design by providing more autonomy to the patient in the user experience. Baaki and Tracey's (2019) approach attempts to promote a more autonomous role for the learner(s) in contextual analysis activities. Learners are able to integrate perspectives on meaning-making that also promote a learner-centered view (Bruner, 1990). Through this approach, Baaki and Tracey (2019, 2022) suggest that instructional designers and learners should weave together a dynamic context to achieve a localized context. This integrated dynamic context proposes five premises:
1. Learner and designer contexts are dynamic.
2. Learner and designer contexts are about interpretation.
3. Learner and designer contexts are about filling spaces.
4. Learner and designer contexts are about meaning-making.
5. Learner and designer contexts are about creating meaning and moving forward.
(Baaki & Tracey, 2022, p. 63)

When we think about the dynamicity of learning and design contexts, the first premise relates to earlier discussions in Chapter 2. Context is ever changing; it is never stagnant. Instructional designers must approach the learning and design context with flexibility and a willingness to adapt and modify instruction and design solutions to meet the current and future needs of the learners. Being flexible means making adjustments and pivoting during instruction when questions arise, and accommodating changes occurring within the organization that will have a significant impact on the learners' abilities to perform in the transfer context.
Recognizing that learner and designer contexts are about interpretation reminds us that perceptions are real. We must develop an awareness of our own perceptions and interpretations of the learning and design contexts. Our perceptions are developed over time on the basis of the previous experiences and knowledge that form our designer identity. We must also remind ourselves that our learners will have their own perceptions and identities that will influence how they interpret these contexts. I particularly like the phrase that learner and designer contexts are about filling spaces. We want what we design to be relevant to our learners' world. We want to contribute to the development of our learners and the ways in which they may apply new tasks and skills in situated environments. I tend to conceptualize filling space as maximizing learner and design potential. This helps me think about what boundaries need to be imposed to establish the design space. By doing this in tandem, I am able to emphasize particular aspects that need to be the central focus of the project and my design decisions and actions. The premise that meaning-making plays an important role in the learner and designer contexts emphasizes the negotiations that occur between the learners, the instructional designer, and ultimately the instructional contexts. By using a variety of data collection methods during the learner analysis phase of a project, we are better able to empathize with our learners and design experiences that they interpret as relevant to and representative of them. The last premise builds upon meaning-making and underscores the need for moving forward. It is important to understand the role that the learners and designer serve in moving forward and contributing to change in learning and performance.
When we think about the three important goals of instructional design (contributing to efficiency, effectiveness, and ease of learning), our ability to see how we can build upon what we currently know and enact change is important. The same is true for our learners. We want our learners to build upon the predispositions that they bring to the situation in the orienting context, design to those predispositions to help them connect with the content and engage in meaning-making within the instructional context, and see utility in how what they have learned can be enacted in the transfer context. A growing criticism of instructional design models and processes is an indistinct connection to learners' actual needs. Moore (2021) suggests that while phases of the instructional design process are recognized as pertinent in our field's existing models, they deviate from a learner-centered focus in that the strategies are very paternalistic and prescriptive in nature. Information that has been gathered during needs analyses and contextual analyses has been reviewed by designers who have determined what information is important and worth addressing.
Baaki and Tracey's (2019) campaign for a localization of context incorporates the learner into the contextual analysis process to build upon empathetic design practices, user experience design, and needs assessment. Aligning the learner with the designer establishes a dynamic context that supports ongoing negotiation and interpretation throughout the process to ensure a sustainable design product. This extends beyond the use of learner personas by providing the learner with more autonomy throughout the process. This renewed interest in re-examining the critical role that context plays in instructional design is providing tangible empathetic design strategies that can be overlaid throughout the instructional design process. We have explored the relationship between time, quality, and money that is prevalent in every design project. The localization of context not only places the learner at the forefront of design but also helps us, as designers, manage the relationship between quality, time, and money as we establish parameters around our design space. Orienting, instructional, and transfer factors can interfere with the progress of a project if not managed appropriately when addressing contextual analysis in instructional design (Stefaniak & Xu, 2020). To expedite the process (time and money), "the instructional designer may focus on the contextual factors that are directly linked to the learning experience and exclude factors that may impact the situation at the periphery" (Stefaniak & Xu, 2020, p. 5). "A localized context of use is scaled back to what is needed in a situation or moment" (Baaki & Tracey, 2019, p. 2). This ultimately provides a more learner-centered interpretation of the situation (quality).
Is Transfer Beyond Our Control?
To date, a small collection of papers have looked specifically at context as it relates to instructional design. Of those, emphasis has been placed on learning contexts (Gilbert, 2006; Herrington & Oliver, 2000), design contexts (Gibbons, 2011), and cultural contexts (McLoughlin & Oliver, 2000). Few studies have examined the transfer context because it is difficult for instructional designers and researchers to gather sufficient information; oftentimes, we are not privy to that information. Earlier in the chapter, I discussed the challenges of designing for transfer; it is usually the context over which we have the least influence. If we build upon Baaki and Tracey's (2022) premises for creating a dynamic context between the learner and the designer, perhaps we can re-examine the types of questions we may seek to answer during our design conversations with clients, team members, and potential learners. By engaging in a localization of context, we can approach the role of context in instructional design through a more empathetic lens that can support our learners and foster critical consciousness.
Designing within a Bounded Rationality to Manage Design Space
Chapter 2 introduced the concept of managing our design space, a topic that has been revisited in every chapter of this book. When we think about establishing parameters around our design space, we need to be realistic. Our ability to make decisions regarding boundaries around our design space depends on our expertise as designers, our knowledge, and the resources available to us. The theory of bounded rationality, introduced by Herbert Simon (1955), has been used to conceptualize mathematical and economic modeling of decision-making. Simon argues that human beings are unable to approach decision-making with perfect rationality; they must operate within a bounded rationality, making decisions based on the resources available at the time a decision is warranted. I argue that every decision we make is made within a bounded rationality. This is important to note, especially when examining the role that context plays, because it supports the idea that context is dynamic. Every context, whether orienting, instructional, or transfer, has a degree of dynamicity. Table 5.3 provides a list of questions you may consider when exploring what you know about context while taking into account that the overall environment (and system) changes continually.

TABLE 5.3 Questions to Guide the Instructional Designer's Evaluation of their Instructional Design

Context: Orienting — Checkpoints to Address Dynamic Systems in Instructional Design:
• What are my perceptions, as the instructional designer, of the instructional and transfer environments?
• What external representations can I retrieve from my repertoire that can be applied to the current project?
• What contextual factors have been identified from the transfer (situated) context that will guide my instructional design?
• How do the learners' perceptions of accountability and utility in the transfer environment provide boundaries with regard to their interaction with the instructional environment?

Context: Instructional
• Are the instructional activities situated in an environment similar to the transfer environment?
• What instructional activities have I designed that will promote interactivity between learners and their environment?
• What instructional strategies will help learners construct their own internal representations of the learning environment and their roles within that environment?
• What external representations have I provided my learners to assist them with offloading cognitive demand onto the environment?

Context: Transfer
• Have I presented external representations to assist learners with transferring new knowledge to the environment?
• Have I presented mechanisms to support learners' abilities to offload cognitive demand as they apply their newly acquired skills in the environment?
• Have I accounted for cultural factors in the learners' environment that will potentially facilitate or hinder transfer?

Source: Stefaniak, J., Tawfik, A., & Sentz, J. (2022). Table 2. Questions to Guide the Instructional Designer's Evaluation of their Instructional Design (p. 9). Used with permission.
Summary This chapter provided an overview of the role that context has in instructional design. Relationships between different factors and contexts were explored to identify opportunities for designers to structure decisions and practices to create learning experiences that are meaningful and perceived as relevant and useful by individuals within the environment. The first five chapters of this book have focused on the planning and analysis aspects associated with instructional design. We will continue to review the role that needs, learner, and contextual analyses serve in the instructional design process in the remaining chapters of this book that are focused on designing instructional experiences.
Connecting Process to Practice Activities
1. Think about the last few instructional design projects you have been involved with in some capacity. What did you know about the orienting, instructional, and transfer contexts at the beginning of the project? What information did you wish you had? What impact did the missing information have on your project?
2. You have recently been asked to meet with a client who is interested in hiring you to design a new training program. Their organization wants to update its safety protocols and establish a training program that all employees will be required to complete. How would you address the need to align incentives across multiple contexts? How might you explain how this impacts instructional design?
3. Brainstorm some ideas for ways you could gather information about the transfer context that would inform how you design instruction.
4. How might you localize context and design instruction that promotes meaning-making and moving forward in an organization? What would that look like in your instructional materials?
Bridging Research and Practice
Arias, S., & Clark, K. A. (2004). Instructional technologies in developing countries: A contextual analysis approach. TechTrends, 48(4), 52–55. https://doi.org/10.1007/BF02763445
Baaki, J., & Tracey, M. W. (2022). Empathy for action in instructional design. In J. E. Stefaniak & R. M. Reese (Eds.), The instructional design trainer's guide: Authentic practices and considerations for mentoring ID and ed tech professionals (pp. 58–66). Routledge.
Tessmer, M., & Wedman, J. (1995). Context-sensitive instructional design models: A response to design research, studies, and criticism. Performance Improvement Quarterly, 8(3), 38–54. https://doi.org/10.1111/j.1937-8327.1995.tb00685.x
Tracey, M. W., & Baaki, J. (2022). Empathy and empathic design for meaningful deliverables. Educational Technology Research and Development, 70(6), 2091–2116. https://doi.org/10.1007/s11423-022-10146-4
References
Baaki, J., & Tracey, M. W. (2019). Weaving a localized context of use: What it means in instructional design. Journal of Applied Instructional Design, 8(1), 2–13. https://doi.org/10.28990/jaid2011.00100
Baaki, J., & Tracey, M. W. (2022). Empathy for action in instructional design. In J. E. Stefaniak & R. M. Reese (Eds.), The instructional design trainer's guide: Authentic practices and considerations for mentoring ID and ed tech professionals (pp. 58–66). Routledge.
Banathy, B. (1991). Systems design of education. Educational Technology Publications.
Bruner, J. (1990). Acts of meaning. Harvard University Press.
Gibbons, A. S. (2011). Contexts of instructional design. The Journal of Applied Instructional Design, 1(1), 5–12.
Gilbert, J. K. (2006). On the nature of "context" in chemical education. International Journal of Science Education, 28(9), 957–976. https://doi.org/10.1080/09500690600702470
Herrington, J., & Oliver, R. (2000). An instructional design framework for authentic learning environments. Educational Technology Research and Development, 48(3), 23–48. https://doi.org/10.1007/BF02319856
McLoughlin, C., & Oliver, R. (2000). Designing learning environments for cultural inclusivity: A case study of indigenous online learning at tertiary level. Australasian Journal of Educational Technology, 16(1). https://doi.org/10.14742/ajet.1822
Meloncon, L. K. (2017). Patient experience design: Expanding usability methodologies for healthcare. Communication Design Quarterly Review, 5(2), 19–28. https://doi.org/10.1145/3131201.3131203
Merriam-Webster Dictionary (2022). https://www.merriam-webster.com/dictionary/context
Moore, S. (2021). The design models we have are not the design models we need. Journal of Applied Instructional Design, 10(4). https://dx.doi.org/10.51869/104/smo
Oxford Dictionary (2020). https://www.oxfordlearnersdictionaries.com/us/definition/english/context
Richey, R. C. (1992). Designing instruction for the adult learner. Kogan Page/Taylor and Francis.
Richey, R. C., & Tessmer, M. (1995). Enhancing instructional systems design through contextual analysis. In B. Seels (Ed.), Instructional design fundamentals: A reconsideration (pp. 189–199). Educational Technology Publications.
Richey, R. C., Klein, J. D., & Tracey, M. W. (2011). The instructional design knowledge base: Theory, research, and practice. Routledge.
Simon, H. A. (1955). A behavioral model of rational choice. Quarterly Journal of Economics, 69(1), 99–118.
Stefaniak, J. E. (2020). Needs assessment for learning and performance: Theory, process, and praxis. Routledge.
Stefaniak, J. E. (2021). Determining environmental and contextual needs. In J. K. McDonald & R. E. West (Eds.), Design for learning: Principles, processes, and praxis. EdTech Books. https://edtechbooks.org/id/needs_analysis
Stefaniak, J., & Xu, M. (2020). Leveraging dynamic decision-making and environmental analysis to support authentic learning experiences in digital environments. Revista de Educación a Distancia (RED), 20(64).
Stefaniak, J., Tawfik, A., & Sentz, J. (2022). Supporting dynamic instructional design decisions within a bounded rationality. TechTrends, 1–14. https://doi.org/10.1007/s11528-022-00792-z
Tessmer, M. (1990). Environment analysis: A neglected stage of instructional design. Educational Technology Research and Development, 38(1), 55–64. https://doi.org/10.1007/BF02298248
Tessmer, M. (1991). Back to the future: The environment analysis stage of front-end analysis. Performance and Instruction, 30(1), 9–12.
Tessmer, M., & Richey, R. C. (1997). The role of context in learning and instructional design. Educational Technology Research and Development, 45(2), 85–115. https://doi.org/10.1007/BF02299526
Tessmer, M., & Wedman, J. (1995). Context-sensitive instructional design models: A response to design research, studies, and criticism. Performance Improvement Quarterly, 8(3), 38–54. https://doi.org/10.1111/j.1937-8327.1995.tb00685.x
6
FOSTERING KNOWLEDGE ACQUISITION THROUGH AUTHENTIC LEARNING
Chapter Overview This chapter discusses how instructional designers can facilitate knowledge acquisition through authentic learning experiences. Strategies for supporting learners' acquisition of different knowledge types (e.g., conditional, conceptual, and procedural) are discussed, along with theoretical constructs such as situated cognition and social learning theory. Instructional strategies that promote learning in authentic environments, such as case-based learning, problem-based learning, and cognitive apprenticeships, are also presented.
Guiding Questions 1. What is authentic learning? 2. What are examples of authentic assessments? 3. How can instructional designers support learners’ zone of proximal development?
4. What are the differences between case-based and problem-based learning? 5. What is a cognitive apprenticeship? 6. How can context be localized to support authentic learning experiences?
A common phrase used by many instructional designers when asked a question about design is “it all depends.” Strategies that are used for one project will not always transfer successfully to the next project. Instructional design calls for us to develop an awareness of the situation, align our design strategies with localized contexts, and design instruction
DOI: 10.4324/9781003287049-6
that is attuned to the unique needs of the learners, individuals, and organization. Instructional design never results in a one-size-fits-all approach. Richey and colleagues (2011) outlined three premises:
• There are different types of learning outcomes, and each type of learning calls for different types of instruction.
• Instructional sequencing relies upon relationships among the various learning outcomes.
• Instructional strategies should facilitate the internal processes of learning. (p. 105)

If we look at learning in its broadest sense, it can be defined as "the relatively permanent change in a person's knowledge or behavior due to experience" (Mayer, 1982, p. 1040). If we dissect this definition, we should pay close attention to the phrases "relatively permanent change" and "experience." The more experience a learner has learning new content and practicing ways to apply it, the more likely they are to experience a permanent change in behavior. Change does not occur within a learner after being exposed to something once. Owing to the dynamic nature of context, instructional designers often find themselves navigating the orienting, instructional, and transfer contexts in their design solutions. Driscoll's (1994) definition expands upon Mayer's, recognizing learning as "a persisting change in human performance brought about as a result of the learner's interaction with the environment" (p. 8). This definition acknowledges that learning is a result of a learner's interaction with the environment. In Chapter 5, we discussed how a localization of context can support a learner's ability to interpret the world around them, construct meaning, and move forward as they participate in instruction (Baaki & Tracey, 2022). A localization of context brings the learner and designer to the forefront in more autonomous roles.
The extent to which a learner can construct meaning within instruction will significantly impact their ability to transfer and apply that knowledge in multiple real-world settings. This chapter expands upon our discussion in Chapter 5 about supporting learners' abilities to effectively apply new knowledge to the transfer setting through the use of authentic learning. Chapter 5 discussed the need for instructional designers to seek alignment between the following factors when engaging in design:
• instructional content and learners' perceived utility
• learner task perception and transfer coping strategies
• learners' perceived accountability and transfer
• participation and incentives for application
One way to maximize a learner's perception of utility during instruction is to create learning experiences that are indicative of the real-world environment where they will use their knowledge. Authentic learning provides opportunities for learners to acquire and practice new skills in situ.
What is authentic learning?
Authentic learning is a pedagogical approach that promotes learners' abilities to engage in problem-solving by having them complete tasks in real-world environments (Herrington et al., 2014). Depending on the content, learners focus on solving real-world problems in the actual environment in order to experience the realities of the situation. Learners can also experience authentic learning by participating in simulated situations that reflect real-world problems. Chapter 5 discussed challenges that instructional designers encounter in providing adequate alignment between the instructional and transfer contexts. A goal of instruction is to provide learners with content that is useful and can easily be translated and relied upon in an application setting. This chapter explores the utility of designing authentic learning experiences and offers guidance on how to integrate contextual factors from the orienting and transfer contexts. Authentic learning experiences are task-based and require learners to apply their previous knowledge to solve a problem. These problems are often ill-defined, warranting more than one solution (Jonassen, 1997). Authentic learning elicits higher-level thinking skills as students analyze the situation, synthesize information, and begin to design solutions. These learning experiences provide the learner with greater autonomy over their learning experience (Mims, 2003). Social learning theory and situated cognition are two theoretical constructs that support the use of authentic learning as a pedagogical approach.
Social Learning Theory

Bandura (1971) theorized that learners learn best when they are provided opportunities to learn from experts and observe how they perform tasks. His social learning theory purported that learners benefit greatly from social interactions between experts and novices. Through exposure to experts, learners can recreate ideal behaviors when presented with a stimulus. It was Bandura’s early research on social learning theory that placed emphasis on the importance of modeling in instruction. Bandura and Schunk (1981) argued that learners would be able to create a mental model if they were able to witness an expert demonstrating how to complete a task or execute a particular behavior. From there, the mental model could be internalized and stored in long-term memory for future application.
Building upon the body of research that explored social learning theory, Lev Vygotsky (1978) introduced the concept of the zone of proximal development (ZPD). Vygotsky defined the ZPD as

the distance between the actual development level as determined by independent problem-solving and the level of potential development as determined through problem-solving under adult guidance, or in collaboration with more capable peers. (Vygotsky, 1978, p. 86)

Vygotsky’s (1986) research on the ZPD supported Bandura’s (1971) earlier work on social learning theory. He believed that learners’ performances would improve by observing expert performance and through routine practice. The concept of the ZPD is applied when pairing a less experienced learner with a learner who has more experience with a topic. This is something that teachers often consider when intentionally arranging students for small group assignments.

Figure 6.1 provides a visual of the ZPD. The ZPD consists of the skills that a learner can master if they are paired with an expert or someone more knowledgeable than them on a topic; it represents the potential for how much the learner can achieve during the learning process. From a practical standpoint, there is always a degree of ZPD that we should consider when designing activities. The extent of the ZPD will depend on the complexity of the topic being delivered and the learners’ prior experience. As instructional designers, we can help support learners’ navigation through their ZPD by understanding their knowledge of and familiarity with the topic at the beginning of instruction. From there, we can determine the level of difficulty we may impose to help them move through the ZPD and build upon their skills. This can be accomplished, in part, by conducting a learner analysis.
As we consider what questions to ask when gathering information about our prospective learners, we can also consider what information we would need to best support them in the ZPD.
FIGURE 6.1 Zone of Proximal Development. (The figure depicts the zone of proximal development spanning the space between what the learner already knows about the topic and what the learner does not know about the topic.)
When thinking specifically about authentic learning, knowledge of our learners’ ZPDs will be helpful in determining the level of complexity we expose them to in the situated experience. If learners are novices with a topic, the instructor may impose additional scaffolding to guide them through the situated activity. They may maintain a stronger presence throughout the assignment and provide several opportunities for feedback and debriefing to assist learners with unknowns. If learners are more familiar with the topic, the instructor may choose to expose them to additional complexities so that learners experience how they may have to adapt familiar strategies to contend with the different contextual factors that are present.
Situated Cognition

Situated cognition is a theory that posits that learners learn by doing. Brown and colleagues (1989) argued that knowing how to complete a task is inseparable from actually doing the task. Their research on situated cognition suggests that, in order to reach mastery of a topic, learners must practice new skills in the actual environment where they will be expected to perform them. Through these situated learning experiences, learners have an opportunity to interact with other individuals in the environment, learn from experts, and practice adapting their performance to the different contextual factors that may be prevalent in the situation.

Researchers in the 1990s (Brown et al., 1989; Lave & Wenger, 1991) began arguing for situated learning experiences, noting that some concepts were difficult to teach in traditional classroom settings. They purported that learners could develop more meaningful mental models and schemas if they were given opportunities to practice performing tasks in the actual environment where they would be expected to perform. One of the challenges noted in Chapter 5 was that discrepancies can exist for learners as they maneuver from the instructional context to the transfer context. Situated learning through authentic experience can help mitigate those discrepancies by incorporating the transfer context simultaneously with the instructional activities.
Types of Authentic Learning

There are a variety of different types of authentic learning activities that an instructional designer may consider when designing a course or learning experience. Most authentic learning activities can be categorized as either case-based learning or problem-based learning. Table 6.1 differentiates between the roles of instructors and students in case-based and problem-based learning activities.
TABLE 6.1 Differentiating Between Case-based Learning and Problem-based Learning

Purpose
  Case-based learning: Focuses on creative problem-solving of a real-world project with some preparation.
  Problem-based learning: Focuses on the process of discovery by learners to simulate problem-solving, independent learning, and teamwork.

Role of Instructor(s)
  Case-based learning: Instructor uses guiding questions to make connections between the activity and the main learning objectives.
  Problem-based learning: Instructor plays a minimal role and does not guide the discussion.

Role of Learner(s)
  Case-based learning: Learners prepare in advance for the session and may ask questions during the debriefing session.
  Problem-based learning: Learners are responsible for finding their own ways to define the problem, explore related issues, and identify possible solutions.
Problem-based Learning

Problem-based learning creates an opportunity for learners to explore a problem occurring in the real world. “The primary goal of problem-based learning is to enhance learning by requiring learners to solve problems” (Hung et al., 2008, p. 488). The problems presented in a problem-based learning activity are ill structured in nature, giving learners the opportunity to explore a variety of potential solutions. Savery (2018) notes that “a critical skill developed through problem-based learning is the ability to identify a problem and set parameters on the development of a solution.” Hung and colleagues (2008) note that problem-based learning activities are characterized by the following:

• “It is problem-focused, such that learners begin learning by addressing simulations of an authentic, ill-structured problem.
• It is student-centered because a faculty cannot dictate learning.
• It is self-directed, such that students individually and collaboratively assume responsibility for generating learning issues and processes through self-assessment and peer assessment and access their own learning materials.
• It is self-reflective, such that learners monitor their understanding and learn to adjust strategies for learning.
• Tutors are facilitators (not knowledge disseminators) who support and model reasoning processes, facilitate group processes and interpersonal dynamics, probe students’ knowledge deeply, and never interject or provide direct answers to questions.” (p. 488)
When designing a problem-based activity, it is up to the instructor to determine how much autonomy is provided to the learners regarding exploration of a problem. Students who have advanced knowledge of a subject may be given a lot of flexibility to explore different approaches, whereas an instructor may impose several design constraints on an assignment for novice learners to help direct their focus to the specific goals intended for the activity.

There are many benefits to incorporating problem-based learning activities into instruction. Nilson (2010) notes that students develop the following skills by participating in authentic problem-solving activities:

• Working in teams
• Managing projects and holding leadership roles
• Oral and written communication
• Self-awareness and evaluation of group processes
• Working independently
• Critical thinking and analysis
• Explaining concepts
• Self-directed learning
• Applying course content to real-world examples
• Researching and information literacy
• Problem-solving across disciplines
During problem-based learning activities, more autonomy is placed on learners to explore the issues and work independently and collaboratively to solve the problem. The instructor’s role is minimal; they intervene only when necessary. The instructor typically avoids providing a direct response or answer to questions posed by their learners. Instead, they may redirect learners to explore additional resources to support their critical thinking and analysis of the problem.

If it is not possible for instructors to place their learners in real-world situations at the time of the instructional period, simulations can be used to recreate real-world environments. “Simulations are environments where components of a problem are manipulated by learners who receive feedback about the effects of their manipulations” (Jonassen, 2011, p. 11). Simulations can range in fidelity: low-fidelity simulations provide an overview of the situation or concept to be explored, while high-fidelity simulations provide more detailed nuances that are indicative of the real-world environment.

Simulations are often used to train health-care professionals in clinical procedures. Task trainers may be used to teach nursing students how to insert IVs, whereas higher-fidelity simulators have been developed to teach a variety of labor and delivery procedures and other complex surgical procedures. These simulators provide learners with an opportunity to receive immediate feedback in a safe environment so that they are not practicing on a patient for the very first time.
Simulators have also been used to prepare police officers to enter unfamiliar territory and apprehend someone committing a crime, and to help prepare new police recruits to judge when to apply different levels of force. These in situ simulators enhance officers’ situational awareness by placing them in a simulated environment where they can experience the consequences of their actions in a safe space. Their preceptors can intervene when necessary or wait to debrief with the officers as a group to discuss the rationale for the decisions made during the simulated activity.
Case-based Learning

Case-based learning is a pedagogical approach in which learners apply their knowledge to real-world scenarios that have already occurred. Case-based learning activities differ from problem-based learning activities in that the learners are not expected to solve the problems themselves; rather, they are expected to analyze and evaluate other individuals’ decisions (Jonassen, 2011). Discussion plays a significant role in case-based learning activities: students analyze the actions and steps others took to solve a problem and discuss the effectiveness of those solutions as a group. Case-based learning requires some advance preparation by the learners, as they read a case study prior to discussion with their peers and class (Srinivasan et al., 2007). The instructor can direct the discussions by providing learners with questions to help guide their analysis.

Case-based learning provides a great opportunity to introduce learners to authentic experiences when access to real-world contexts may not be readily available during the time of instruction. It still provides learners with the ability to explore different contextual factors that may influence or hinder the implementation of different solutions. Case studies also expose learners to expert performances by showing what steps individuals took in a situated context to solve a problem.

While case studies can be used with any learning audience, they can be particularly beneficial to novice learners. Case studies can introduce novice learners to authentic contexts when the subject matter is still relatively new. Discussions surrounding the case studies can be tailored by the instructor to meet learners where they currently are in their understanding of the instructional content. This enables learners to engage in higher-order thinking skills that are suitable for and meet their learning needs.
When designing case studies for instructional purposes, we can use events that have occurred in the real world for learners to see what has happened in the situated context. This helps to provide learners with a preview of what they can expect in the transfer context when they are expected to apply concepts they are currently learning. Regardless of whether we create our own case studies or find case studies that have
been previously written, it is important to have a rationale for including that case as an instructional activity. We should consider what problem-solving strategies we want our learners to observe and what particular difficulties we want them to witness experts experiencing while engaging in their problem-solving. Multiple roles and perspectives can also be explored through case studies to help learners understand the resolutions that may have been sought.
Characteristics of Authentic Learning

Instructional experiences that promote authentic learning differ from instructional encounters where an instructor is imparting knowledge to their learners. Herrington and Oliver (2000) offer nine characteristics to describe authentic learning experiences:

1. Provide authentic contexts that reflect the way the knowledge will be used in real life
2. Provide authentic activities
3. Provide access to expert performances and the modeling of processes
4. Provide multiple roles and perspectives
5. Support collaborative construction of knowledge
6. Promote reflection to enable abstractions to be formed
7. Promote articulation to enable tacit knowledge to be made explicit
8. Provide coaching and scaffolding by the teacher at critical times
9. Provide for authentic assessment of learning within the tasks

Authentic learning provides learners with opportunities to experience first-hand how the knowledge they are acquiring can be used in real-life situations. This enables them to see how different contextual factors may influence their decisions and problem-solving strategies. A benefit of authentic learning is that it supports students’ conceptualization of what they can expect if they were to work in different settings. For example, a learner would experience significant differences if they participated in a training session in a conference room where they were taught how to use a cash register to complete a sale at a store. That experience would be heightened if they engaged in training on the actual store floor, where they would be required to interact with customers, respond to different inquiries, and experience other stressors that may occur if the store was busy. The more authentic experience would require the learner to deviate from scripted protocols, when necessary, to answer customer questions requiring an immediate response. Authentic learning provides learners with authentic activities.
There is a degree of immediacy that occurs with bridging the instructional and transfer contexts. Learners are no longer tasked with completing instructional
activities eliciting recall of facts and concepts. Instead, they are tasked with applying their conceptual, procedural, conditional, and strategic knowledge to an actual problem occurring in real time. These types of authentic activities provide learners with opportunities to trial different strategies to solve problems. They allow learners to experience the flexibility and willingness to adapt needed when a particular contextual factor is present. Authentic learning provides learners with an immersive experience by placing them in an actual situation where they can make decisions.

Herrington and Oliver (2000) note that a good authentic experience exposes learners to expert performance and the modeling of processes. For example, if learners are assigned to complete a task in real time over an extended period, such as an internship, they may be exposed to experts regularly. During this time, the learners will learn through modeling: experts will routinely demonstrate how to complete a task. Modeling may look a little different if a learner is participating in a simulated environment, such as completing a series of case studies. Learners will still be exposed to experts demonstrating appropriate performance and means for completing tasks; however, the learner will not interact with the expert directly. During these simulated experiences, the instructor will play a pivotal role in supporting learners’ understanding of problem-solving in real-world settings. It is important that task expectations align with what learners know regarding the topic. The instructor must select activities that are suited to the learners’ current conceptual, procedural, and conditional knowledge.

An instructor’s role during authentic learning experiences differs from their role in traditional instruction. During an authentic experience, the instructor will provide coaching at critical times.
There may be instances where the instructor will remain quiet and avoid redirecting students if a goal of the learning experience is for the learners to learn from their mistakes. Other times, the instructor may intervene if they foresee a problem occurring that will significantly disrupt their students’ learning experience. The role of instructor-as-coach is to provide ongoing feedback to learners to correct errors and improve their overall performance.

In addition to being exposed to expert performances in authentic learning, learners are provided access to multiple roles and perspectives. If a student is working in a situated environment, they will more than likely be exposed to different individuals within the organization. This will provide them with opportunities to understand how different roles interact within an organization and different perspectives individuals may bring to a situation. Exposure to multiple perspectives aids in strengthening learner characteristics typically presented within the orienting context.

As instructional designers, we can build upon the information we have acquired about our learning audience through the orienting context and be intentional with specific activities and individuals we expose our learners to during an authentic experience. Our role in the authentic experience is variable depending on whether or not we are directly involved in
the delivery and facilitation of instruction. If we have a more prominent role in facilitating the instruction and serving in an instructor role, we can identify experiences that will help enhance our learners’ orienting characteristics. For learners who have had limited exposure to diversity, we can emphasize these experiences through authentic activities. If learners have more experience working with a particular age group or applying course content to a particular industry or setting, we can introduce other industries or situations to them.

Authentic learning is rooted in Bandura’s (1971) social learning theory. Authentic learning provides opportunities for learners to engage in collaborative construction of knowledge by working with other students to solve a real problem. The act of collaborative problem-solving also exposes them to different roles and perspectives. These experiences teach learners how to adapt to different aspects of their environment. Regardless of your learning audience’s levels of expertise, authentic learning can be customized to support their individual growth.

By providing learners with exposure to a variety of contextual factors influencing a situation, we enable them to engage in reflective activities to make sense of what they are witnessing, experiencing, and contributing. Authentic learning promotes reflection to enable abstractions to be formed. This is particularly beneficial to learners as they build their expertise and begin to make connections between different aspects of their knowledge domains. Some reflection exercises may occur organically with little to no prompting; learners will engage in reflection intuitively as they bear witness to some solutions working and others warranting additional support. Other times, the instructor can provide learners with guided reflections containing prompts to help them make these connections.
It is important that we, as instructional designers, have a clear understanding of the outcomes associated with the authentic learning experience. Examples of questions we should ask during the design process include but are not limited to the following:

• What is the purpose of the experience?
• What are the desired outcomes for the authentic activity?
• How much autonomy should learners have during the experience?
• To what extent will they be able to make connections between various knowledge domains?
• What is the role of the instructor during the authentic activity?
• How will the learners be assessed during and after their participation in the authentic activity?

Through authentic learning, learners are also expected to engage in activities that promote articulation to enable tacit knowledge to be made explicit (Herrington & Oliver, 2000). Learners are able to demonstrate their
abilities to articulate their understanding of the situation in a variety of ways, such as communicating and collaborating with others in the situation to identify and implement a solution. Learners can also be prompted to articulate their understanding of a situation during debriefing sessions with their fellow learners and the instructor. During debriefing sessions, the instructor can ask specific questions that prompt learners to provide a rationale for the different decisions they made during the authentic activity. During this time, learners can state out loud why they might have selected one strategy over another and what contextual factors influenced their decisions. Debriefing sessions also provide valuable insights to the instructor and the instructional designer by indicating the extent to which learners understand the assignment and their current level of performance in applying knowledge to real-world settings.

A unique characteristic of authentic learning experiences is that they provide for authentic assessment within the actual tasks being completed. Authentic learning does not rely on tests and examinations upon completion of the activity. Instead, assessment is embedded continually throughout the activity. Products developed by learners in the situated environment can serve as artifacts to assess their learning. Their ability to communicate and collaborate with others can be used as a metric for assessment. Reflection activities requiring learners to articulate the reasoning behind their decision-making are also good indicators of how they are making connections across the instructional content.
Design Challenges for Consideration

Instructional Design Challenges

While there are a number of benefits to providing learners with authentic learning experiences, there are different design challenges an instructional designer must consider when determining whether an authentic experience is most appropriate. When we design instruction, our role as designers may involve designing and delivering the instruction to the intended audience, or it may focus solely on the design of instruction with the intent that someone else will be responsible for delivering the instruction and interacting with the learning audience. The following are design considerations for authentic learning experiences:

• Learner preparedness for situated environments
• Degree of supervision required
• Involvement of others
• Consistency of authentic experiences
• Differentiations in assessment
Learners, regardless of their familiarity with a topic, can be placed in a situated environment to support their learning; however, the instructional designer and instructor must be attuned to their predispositions and prior knowledge of the subject matter. If a goal in learning is to help foster a permanent change in behavior over a prolonged period of time, authentic experiences can be leveraged to help learners navigate their individual ZPD. Taking into account that every learner will have a different ZPD, we can use information gathered during a learner and contextual analysis to identify what learner characteristics we can cultivate during the instructional experience and what specific contextual factors the learner should be prepared to address in the transfer context. Depending on the complexity of the tasks and the learners’ familiarity with the subject and the context, we can scaffold activities appropriately by establishing parameters around the activity to direct their focus to specific steps of a task and contextual factors. These parameters can be expanded over time as learners gain experience.

When designing an authentic learning experience, we also need to be mindful of the role of the instructor. The degree of supervision required will be dependent on the skill level of the learners. If exposure to expert performances is of particular importance for a situated activity, we need to consider what degree of involvement from others will be needed to facilitate the activity. Learning experiences such as internships typically require a lot of time between the expert and novice, whereas simulations and case studies may require less. When planning for different activities, the instructional designer should work with their management and clients to determine the feasibility of gaining access to individuals in the transfer context.
This is particularly important when designing on-the-job training because it supports the learners’ transitions from the instructional context to the transfer context.

Establishing consistency among authentic learning experiences is a challenge often experienced by instructors in K-12 and higher education settings. The more autonomy learners are given in the authentic activity, the greater the chance for inconsistencies among projects and learner performance. This lack of structure can pose challenges for instructors who want to incorporate consistent evaluative practices in their instructional materials. When this occurs, instructors can clarify the goals of the assignment, set expectations for the outcomes they want their learners to meet in applying content to a situated environment, and evaluate whether those outcomes were met regardless of the situation or project. This is less of a challenge if an instructor or instructional design team is incorporating on-the-job training as part of an onboarding process, as they have far more flexibility in standardizing assessments for their learners.
Learner Challenges

Brush and Saye (2000) identified several issues that may arise during the implementation of authentic learning experiences that instructors and instructional designers must be cognizant of and take into consideration while planning. When integrating authentic learning experiences into a course curriculum, instructors make assumptions that their learners have the necessary foundational knowledge to carry out the tasks at hand in a situated context. Learners may experience difficulty dealing with the lack of structure and may be overwhelmed by the amount of information they are required to manage while working on a project. Brush and Saye suggest that having students monitor their progress frequently and providing them with regular feedback may help alleviate some of the stresses imposed during the project.

Depending on the learners’ levels of familiarity with the instructional content and contexts, they may be overwhelmed by the amount of information. Authentic learning balances knowledge acquisition with doing. It is important that those involved with designing authentic activities clearly communicate expectations regarding the management of information and the extent to which students are expected to apply it. A common challenge among students who struggle with situated learning activities is a lack of feedback. It is important that the instructional designer embed regular checkpoints to assess learners’ performance throughout the entire project.
Facilitating Exploration through Cognitive Apprenticeships

Cognitive apprenticeships have been viewed as a viable option for providing long-term mentoring and guidance to learners in a situated environment. During a cognitive apprenticeship, emphasis is placed on guided experiences where a novice learns from observing and working with an expert (Brown et al., 1989; Collins, 2006). Cognitive apprenticeships stem from traditional apprenticeships, which date back to the Middle Ages, when blacksmiths and medical professionals learned their professions. A cognitive apprenticeship differs from a traditional apprenticeship in that

• Emphasis is placed on the learner gaining experience applying knowledge across multiple contexts in order to make decisions in the future.
• Experts explicitly model performance and discuss the cognitive and metacognitive processes they employed during their problem-solving.
• Scaffolds are used to provide learners with cognitive supports as they acquire new skills in different contexts. These scaffolds are eventually removed as the learner gains experience and can perform tasks independently.
• Experts apply faded coaching, removing cognitive supports as the learner demonstrates more independence. (Ertmer & Cennamo, 1995)

A goal of cognitive apprenticeships is to train the learner (apprentice) not only to complete a task but also to be able to explore the process they used to make decisions and complete tasks to solve a problem. Collins and colleagues (1991) recommend that every cognitive apprenticeship adhere to the following three processes:

1. Identify the processes of the task and make them visible to students.
2. Situate abstract tasks in authentic contexts so that the students understand the relevance of the work.
3. Vary the diversity of situations and articulate the common aspects so that students can transfer what they learn. (Collins et al., 1991, p. 8)

The cognitive apprenticeship methodology consists of six phases (Collins et al., 1991):

1. Modeling
2. Coaching
3. Scaffolding
4. Articulation
5. Reflection
6. Exploration
It is important to note that activities within a cognitive apprenticeship may not occur in a linear progression. There may be times when various phases of the framework are revisited as complex tasks are introduced and as the learner masters others (Dennen & Burner, 2008). The following sections describe the six phases. Table 6.2 provides an example of how Ertmer and Cennamo (1995) used cognitive apprenticeship strategies to support students in an instructional design course.
Modeling

The majority of modeling typically occurs at the beginning of a cognitive apprenticeship. During this time, the expert demonstrates performance (how to complete a task) to the learner, giving the learner the opportunity to witness correct task completion from start to finish. During the modeling phase, the expert also demonstrates their thinking process by talking out loud while completing a task. This helps the learner make connections between knowledge pertaining to the task and any adaptations the expert may make to address contextual factors present in a given setting.

Most introductory instructional design courses and textbooks discuss task analysis, during which a subject matter expert provides a detailed account of how to complete a task. Information derived from a task analysis is used to guide instruction on how to complete procedural tasks. During a cognitive apprenticeship, the expert follows a similar process, demonstrating how to complete a task with additional annotations explaining the rationale behind their decisions.

TABLE 6.2 Example of Cognitive Apprenticeship to Teach Instructional Design (Ertmer & Cennamo, 1995)

| Instructional Environment/Pedagogical Feature | Cognitive Apprenticeship | Instructional Design (ID) Course |
|---|---|---|
| Situated learning | Learning in multiple contexts that reflect the way knowledge is used to solve real problems and complete tasks | Students design instructional modules, both in teams and individually, during which they observe, implement, and evaluate their own design behaviors. |
| Modeling | Explicit modeling of the cognitive and metacognitive processes used by experts | Instructors engage in "think-alouds" while analyzing unfamiliar content. |
| Coaching | Scaffolding of cognitive supports until students can perform intellectual tasks independently | Instructors meet with students, individually and in groups, to question, probe, and seek clarification of design decisions. |
| Reflection | Comparison of self with expert (thinking processes and problem-solving skills) | Students compare their ideas, beliefs, and design processes with other students, instructors, the textbook, and other ID examples. |
| Articulation | Verbalize what one thinks; articulation of principles underlying knowledge use | Through written reflections and group discussions, students verbalize their thought processes in carrying out specific design activities. |
| Exploration | Cognitive support fades; students set their own subgoals and frame problems independently | Students broaden their perspective of ID through participation in case studies and role-plays based on real design problems and activities. |

Source: Ertmer, P. A., & Cennamo, K. S. (1995). Table 2. Examples of pedagogical features of a cognitive apprenticeship model within an instructional design course (p. 46). Used with permission.
Coaching

The expert also serves in a coaching role, providing necessary assistance and support to the learner throughout their relationship in the apprenticeship. Mentorship plays a significant role in the cognitive apprenticeship because the learner (apprentice) spends a substantial amount of time working with an expert. In a study that used the cognitive apprenticeship framework to train librarians to develop youth storytime programs, Scott-Brown and Stefaniak (2016) proposed a framework that aligned mentorship and cognitive apprenticeship methodologies (Figure 6.2). This framework draws from research exploring phases of mentorship (Kram, 1983; Newby & Heide, 1992) to demonstrate that participants in a cognitive apprenticeship engage in four phases of mentorship: setting goals, initiation, cultivation, and separation.

The initial phase of mentorship, setting goals, can begin before the learner is paired with an expert for the apprenticeship. At the beginning of this phase, the learner is responsible for identifying performance goals they would like to achieve during the cognitive apprenticeship. This requires them to reflect on what they currently know and identify areas for improvement. The more honest the learner can be during this phase, the greater the opportunity they have for expanding their zone of proximal development (ZPD).

FIGURE 6.2 Alignment Between Mentorship and Cognitive Apprenticeship Methodologies. Source: Scott-Brown, J. A., & Stefaniak, J. (2016). The design of a cognitive apprenticeship to facilitate storytime programming for librarians. Contemporary Educational Technology, 7(4), 331–351. Figure 1 – Alignment between mentorship and cognitive apprenticeship methodologies (p. 335). Used with permission.
Goal setting continues into the second phase, initiation. At the beginning of a cognitive apprenticeship, the learner works with the expert to set goals for their time together. This involves not only the explicit identification of cognitive tasks the learner is expected to master during the apprenticeship but also the establishment of guidelines regarding expectations of the learner and of the expert serving as mentor. The first two phases of mentoring are particularly important because they ensure that the learner and the expert/mentor have a shared understanding of expectations. If there are discrepancies in expectations, the learner and expert may need to decide whether they should continue their working relationship.

Cultivation is the third phase of a mentoring relationship. During this time, the learner and the expert develop their relationship. Because the learner learns by working directly with the expert, the closer their relationship, the better equipped the expert will be to determine what cognitive supports are necessary and when to engage in faded coaching. By emphasizing the mentorship that occurs within a cognitive apprenticeship, experts who are asked to serve as mentors can establish trust with their learners and enhance learners' confidence as they are presented with increasingly complex tasks.

The final phase of mentorship, separation, occurs toward the end of the cognitive apprenticeship. This phase entails the learner reaching full independence from the expert. By the time a cognitive apprenticeship is complete, the learner should be able to perform a variety of complex tasks in different contexts without assistance. The separation phase provides closure to the learner and the expert as their mentoring relationship reaches completion.
Scaffolding

Scaffolding occurs throughout the apprenticeship to varying degrees. Scaffolding is defined as an expert providing just-in-time support to a learner or novice (Wood et al., 1976). During a cognitive apprenticeship, the expert provides increased support through scaffolding at the beginning of the apprenticeship. This support fades over time as the learner gains experience and demonstrates the ability to complete tasks independently. Wood (2003) identifies five levels of contingent support:

• Level 1: General verbal intervention
• Level 2: Specific verbal intervention
• Level 3: Specific verbal intervention plus nonverbal indicators
• Level 4: Prepares for next action
• Level 5: Demonstrates action (p. 12)
Wood (2003) has described these levels according to three contingencies of support:

1. Instructional contingency: how to support activity
2. Domain contingency: what to focus on next
3. Temporal contingency: if and when to intervene

The instructional contingency requires the expert to identify appropriate means of providing support and coaching to the learner. The expert in a cognitive apprenticeship will draw upon their subject matter expertise in addition to their relationship with their mentee. The domain contingency concerns what the learner should focus on next: the expert determines which tasks warrant further attention as they identify an appropriate plan to cultivate their mentee's conceptual, procedural, and conditional knowledge acquisition. The temporal contingency accounts for determining whether and when an expert should intervene to support their mentee's performance. This depends on the goals of the activity. In some instances, part of the learning experience is for the mentee to experience failure; at other times, the expert may need to intervene immediately to prevent the mentee from experiencing harm or significant setbacks. As in coaching, the amount of scaffolding provided to the learner fades over time as they demonstrate competency.
Articulation

Cognitive apprenticeships focus on supporting a learner's cognitive understanding of the process required to complete a task, in addition to the task itself. During a cognitive apprenticeship, it is important to routinely prompt the learner to articulate their understanding. This requires the learner to verbalize the steps they are taking to complete a task. During this think-aloud, the learner communicates the rationale behind the decisions they have made, explaining whether they were influenced by previous experience or by particular contextual factors that they assumed would either promote or hinder their progress in completing the task.
Reflection

Because cognitive apprenticeships focus heavily on developing a learner's cognitive and metacognitive skills, reflection is a useful strategy to support the learner's ability to analyze the situation and engage in self-assessment. Periodic checkpoints should be used throughout the cognitive apprenticeship for the learner to assess their performance, identifying their perceived strengths and weaknesses. These reflective checkpoints can be used during coaching sessions with the expert to determine the next course of action to be taken during the cognitive apprenticeship.
Exploration

Exploration is embedded throughout the duration of a cognitive apprenticeship; however, a long-term goal of the apprenticeship should be that the learner achieves a level of competency at which they can pose and explore their own problems (Collins et al., 1991). Throughout the cognitive apprenticeship, the learner is encouraged to explore completing tasks in different settings to enhance their cognitive abilities and problem-solving skills. This also prepares them to apply higher-order thinking skills when presented with new situations.
Revisiting Context

Context has been referenced on several occasions throughout this chapter. Authentic learning provides a means to establish a deeper connection between the instructional and transfer contexts, supporting a reciprocal relationship between an instructional context and the situated context for a problem-based activity. In a systematic review exploring strategies used to support the transfer of learning in online problem-based projects that used service-learning (Stefaniak, 2020), I proposed a framework for understanding this relationship by examining how an instructional setting (course) and the situated activity (e-service-learning) interact with each other as subsystems (Figure 6.3). Service-learning is a pedagogical approach used as a "credit-bearing educational experience in which students gain further understanding of course content, a broader appreciation of the discipline, and an enhanced sense of civic responsibility" (Bringle & Hatcher, 1996, p. 222). E-service-learning refers to instances in which "the instructional component, the service component, or both are conducted online" (Waldner et al., 2012, p. 125).

The illustration in Figure 6.3 can be used to understand the interactions that occur within the instructional and transfer contexts associated with a problem-based activity (Stefaniak, 2020). If we think of the instructional context, where instruction is delivered to learners, as a subsystem within a larger learning environment, and the situated environment, where the authentic learning activity occurs, as a second subsystem, we can see how each of these systems is both dependent on and independent of the other. Within each subsystem are individuals, objects, and processes with which learners are expected to engage.
The content explored in the instructional context (course subsystem) influences the types of activities that learners are expected to engage in within the transfer context (situated learning subsystem). The greater the alignment between these two subsystems, the greater learners' perceived utility. The situated environment provides the learner with in situ experiences in which they can see how the instructional content transfers to a real-world setting during the learning period rather than only upon completion of instruction.

FIGURE 6.3 Utilization of a systems view to design e-service-learning experiences. Source: Stefaniak, J. (2020). A systems view of supporting the transfer of learning through e-service-learning experiences in real-world contexts. TechTrends, 64(4), 561–569. Figure 1 – Utilization of a systems view to design e-service-learning experiences (p. 567). Used with permission.

Throughout the learning experience, instructors assume a coaching role by providing learners with the necessary coping strategies to mitigate challenges they may encounter applying their knowledge to a real-world situation. Throughout the authentic experience, learners encounter social supports within both the instructional and situated subsystems. This enhances their acquisition of knowledge and the higher-order thinking skills required of them during problem-solving in situated contexts.

In Chapter 5, we discussed the localization of context to support learners' contextual inquiries. The situatedness of authentic learning experiences coincides with the five premises proposed by Baaki and Tracey (2022) to support a localization of context. The learner and design contexts are dynamic in the authentic environment in that everything is situated. Learners are required to adapt to different contextual factors as they arise. Throughout the learning experience, learners gain experience and build on their prior knowledge to interpret different situations that support their problem-solving capabilities. Extended authentic learning experiences such as cognitive apprenticeships enable learners to engage in meaning-making as they work toward strengthening their metacognitive skills through applying their skills in different settings.
Summary

This chapter provided an overview of how instructional designers can facilitate learners' acquisition of knowledge through authentic learning experiences. Instructional design considerations were provided for case-based learning, problem-based learning, and cognitive apprenticeships. In addition, the relationship between instructional and situated contexts was explored to better understand the localization of context in authentic learning environments. Chapter 7 continues this exploration with additional instructional strategies that can support the learning process.
Connecting Process to Practice Activities

1. Savery (2018) posits, "the problem simulations used in problem-based learning must be ill-structured and allow for free inquiry. Problems in the real world are ill-structured (or they would not be problems). A critical skill developed through PBL is the ability to identify the problem and set parameters on the development of a solution." When learners are engaged in ill-structured problem-based learning experiences, how can teachers facilitate and scaffold students' learning experiences in digital learning environments? How can instructional designers make design decisions to ensure a better learning environment for learners?
2. Construct a cognitive apprenticeship as if you were responsible for training an aspiring instructional designer as part of a 3-month onboarding process where you work. How would you address modeling, coaching, scaffolding, articulation, reflection, and exploration? How much time would you estimate allocating for different activities?
3. Make a list of challenges that learners may experience when engaging in exploration within a situated environment. What strategies can instructors use to mitigate these challenges?
4. Think about a time when you were responsible for providing on-the-job support. Using Wood's (2003) three contingencies of support, how would you provide the necessary scaffolding to your trainee? Provide examples of how you would address the instructional, domain, and temporal contingencies.
Bridging Research and Practice

Dabbagh, N. H., Jonassen, D. H., Yueh, H. P., & Samouilova, M. (2000). Assessing a problem-based learning approach to an introductory instructional design course: A case study. Performance Improvement Quarterly, 13(3), 60–83. https://doi.org/10.1111/j.1937-8327.2000.tb00176.x
Ertmer, P. A., Quinn, J. A., & Glazewski, K. D. (2019). The ID casebook: Case studies in instructional design (5th ed.). Routledge.
Schultz, M., Young, K., Gunning, T. K., & Harvey, M. L. (2022). Defining and measuring authentic assessment: A case study in the context of tertiary science. Assessment & Evaluation in Higher Education, 47(1), 77–94. https://doi.org/10.1080/02602938.2021.1887811
Stefaniak, J., Maddrell, J., Earnshaw, Y., & Hale, P. (2018). The evolution of designing e-service-learning projects: A look at the development of instructional designers. International Journal of Designs for Learning, 9(1), 122–134.
Tawfik, A. A., & Kolodner, J. L. (2016). Systematizing scaffolding for problem-based learning: A view from case-based reasoning. Interdisciplinary Journal of Problem-Based Learning, 10(1). https://doi.org/10.7771/1541-5015.1608
References

Baaki, J., & Tracey, M. W. (2022). Empathy for action in instructional design. In J. E. Stefaniak & R. M. Reese (Eds.), The instructional design trainer's guide: Authentic practices and considerations for mentoring ID and ed tech professionals (pp. 58–66). Routledge.
Bandura, A. (1971). Social learning theory. General Learning Press.
Bandura, A., & Schunk, D. H. (1981). Cultivating competence, self-efficacy, and intrinsic interest through proximal self-motivation. Journal of Personality and Social Psychology, 41(3), 586–598. https://doi.org/10.1037/0022-3514.41.3.586
Bringle, R. G., & Hatcher, J. A. (1996). Implementing service learning in higher education. The Journal of Higher Education, 67(2), 221–239. https://doi.org/10.1080/00221546.1996.11780257
Brown, J. S., Collins, A., & Duguid, P. (1989). Situated cognition and the culture of learning. Educational Researcher, 18(1), 32–42.
Brush, T., & Saye, J. (2000). Implementation and evaluation of a student-centered learning unit: A case study. Educational Technology Research and Development, 48(3), 79–100. https://doi.org/10.1007/BF02319859
Collins, A. (2006). Cognitive apprenticeship. In R. K. Sawyer (Ed.), Cambridge handbook of the learning sciences (pp. 47–60). Cambridge University Press.
Collins, A., Brown, J. S., & Holum, A. (1991). Cognitive apprenticeship: Making things visible. American Educator, 15(3), 1–18.
Dennen, V. P., & Burner, K. J. (2008). The cognitive apprenticeship model in educational practice. In J. M. Spector, M. D. Merrill, J. van Merrienboer, & M. P. Driscoll (Eds.), Handbook of research on educational communications and technology (3rd ed., pp. 425–439). Routledge.
Driscoll, M. P. (1994). Psychology of learning for instruction. Allyn & Bacon.
Ertmer, P. A., & Cennamo, K. S. (1995). Teaching instructional design: An apprenticeship model. Performance Improvement Quarterly, 8(4), 43–58. https://doi.org/10.1111/j.1937-8327.1995.tb00699.x
Herrington, J., & Oliver, R. (2000). An instructional design framework for authentic learning environments. Educational Technology Research and Development, 48(3), 23–48. https://doi.org/10.1007/BF02319856
Herrington, J., Reeves, T. C., & Oliver, R. (2014). Authentic learning environments. In J. M. Spector, M. D. Merrill, J. Elen, & M. J. Bishop (Eds.), Handbook of research on educational communications and technology (4th ed., pp. 401–412). Springer.
Hung, W., Jonassen, D. H., & Liu, R. (2008). Problem-based learning. In J. M. Spector, M. D. Merrill, J. van Merrienboer, & M. P. Driscoll (Eds.), Handbook of research on educational communications and technology (3rd ed., pp. 485–506). Routledge.
Jonassen, D. (2011). Supporting problem-solving in PBL. Interdisciplinary Journal of Problem-Based Learning, 5(2), 95–119. https://doi.org/10.7771/1541-5015.1256
Jonassen, D. H. (1997). Instructional design models for well-structured and ill-structured problem-solving learning outcomes. Educational Technology Research and Development, 45(1), 65–94. https://doi.org/10.1007/BF02299613
Kram, K. E. (1983). Phases of the mentor relationship. The Academy of Management Journal, 26(4), 608–625. https://doi.org/10.5465/255910
Lave, J., & Wenger, E. (1991). Situated learning: Legitimate peripheral participation. Cambridge University Press.
Mayer, R. E. (1982). Learning. In H. Mitzel (Ed.), Encyclopedia of educational research (pp. 1040–1058). The Free Press.
Mims, C. (2003). Authentic learning: A practical introduction & guide for implementation. Meridian: A Middle School Computer Technologies Journal, 6(1), 1–3.
Newby, T. J., & Heide, A. (1992). The value of mentoring. Performance Improvement Quarterly, 5(4), 2–15. https://doi.org/10.1111/j.1937-8327.1992.tb00562.x
Nilson, L. B. (2010). Teaching at its best: A research-based resource for college instructors (2nd ed.). Jossey-Bass.
Richey, R. C., Klein, J. D., & Tracey, M. W. (2011). The instructional design knowledge base: Theory, research, and practice. Routledge.
Savery, J. R. (2018). Overview of problem-based learning: Definitions and distinctions. In R. E. West (Ed.), Foundations of learning and instructional design technology: The past, present, and future of learning and instructional design technology. EdTech Books. https://edtechbooks.org/lidtfoundations/overview_of_problem-based_learning
Scott-Brown, J. A., & Stefaniak, J. E. (2016). The design of a cognitive apprenticeship to facilitate storytime programming for librarians. Contemporary Educational Technology, 7(4), 331–351.
Srinivasan, M., Wilkes, M., Stevenson, F., Nguyen, T., & Slavin, S. (2007). Comparing problem-based learning with case-based learning: Effects of a major curricular shift at two institutions. Academic Medicine, 82(1), 74–82. https://doi.org/10.1097/01.ACM.0000249963.93776.aa
Stefaniak, J. (2020). A systems view of supporting the transfer of learning through e-service-learning experiences in real-world contexts. TechTrends, 64(4), 561–569. https://doi.org/10.1007/s11528-020-00487-3
Vygotsky, L. (1978). Mind in society: The development of higher psychological processes. Harvard University Press.
Vygotsky, L. (1986). Thought and language. MIT Press.
Waldner, L. S., Widener, M. C., & McGorry, S. Y. (2012). E-service learning: The evolution of service-learning to engage a growing online student population. Journal of Higher Education Outreach and Engagement, 16(2), 123–150.
Wood, D. (2003). The why? What? When? And how of tutoring: The development of helping and tutoring skills in children. Literacy Teaching and Learning, 7(1&2), 1–30.
Wood, D., Bruner, J. S., & Ross, G. (1976). The role of tutoring in problem-solving. Journal of Child Psychology and Psychiatry, 17, 89–100. https://doi.org/10.1111/j.1469-7610.1976.tb00381.x
7
INSTRUCTIONAL STRATEGIES THAT PROMOTE GENERATIVE LEARNING
Chapter Overview Three goals for all instructional design projects are to contribute to effectiveness, efficiency, and ease of learning. This chapter introduces instructional strategies that can be used to promote generative learning (Brod, 2020; Fiorella & Mayer, 2016; Lee et al., 2008). Examples of strategies introduced include complex learning and failure-based learning (e.g., Rong & Choi, 2019; Stefaniak, 2021; Tawfik et al., 2012, 2015). Cognitive load theorists have generated specific instructional strategies to manage cognitive overload that may burden learners during the learning process (e.g., Sentz & Stefaniak, 2019; Sentz et al., 2019; Sweller, 1994). This chapter differentiates between the various types of cognitive load (intrinsic, extrinsic, and germane), discusses how cognitive load has been addressed in learning design research, and offers strategies that instructional designers can implement to support their learners.
Guiding Questions 1. What is generative learning? 2. What are the differences between generative learning strategies and supplantive learning strategies? 3. How can I design instruction that promotes productive failure? 4. How can I manage the learners’ cognitive load?
What Is Generative Learning?

Instructional designers have many options to choose from when selecting instructional strategies to support learning and performance for a given project. Previous chapters in this book have emphasized the important role that context plays in learning. Chapter 6 explored instructional considerations for implementing authentic learning experiences through the use of problem-based learning and case-based learning.

DOI: 10.4324/9781003287049-7
The critical role of mentorship in a cognitive apprenticeship was also discussed. This chapter builds on that exploration of instructional strategies, with emphasis on how particular instructional activities can support the localization of context within a learning experience. Chapter 5 discussed the challenges instructional designers face in aligning orienting and instructional contexts with the transfer context. Incorporating authentic learning strategies is a viable approach to minimizing significant gaps between instruction and the transfer setting. Through situated learning experiences, learners take a more active role in the learning process, engaging in meaning-making and exploration.

Generative learning "involves actively making sense of to-be-learned information by mentally reorganizing and integrating it with one's prior knowledge, thereby enabling learners to apply what they have learned to new situations" (Fiorella & Mayer, 2016). The theory of generative learning was proposed by Wittrock in 1974. Wittrock (1989) posited that learners are not passive consumers of information; rather, their mind "actively constructs its own interpretations of information and draws inference on them" (p. 348). Generative strategies differ from supplantive strategies in that the learner is given more autonomy in generating their understanding of the topic. Supplantive strategies "supplant, facilitate, or scaffold most of the information processing for the learner by providing elaborations that supply all or part of the educational goal, organization, elaboration, sequencing and emphasis of content, monitoring of understanding, and suggestions for transfer to other contexts" (Smith & Ragan, 2005, p. 142). Table 7.1 provides a guide for when you might choose a supplantive versus a generative instructional strategy (Smith & Ragan, 2005).
Generative strategies are more often used to support learning when learners require lower levels of scaffolding. Generative strategies work well when students are entering the instructional setting with a high level of prior knowledge. They are more apt to be flexible during the learning process and explore different strategies when they have low anxiety associated with the instructional experience. Students who have less experience with the content and are unaware of cognitive strategies to support their learning may be better suited for supplantive strategies that can promote higher levels of scaffolding. Increased scaffolding can also support learners who may not be as motivated to complete tasks on their own or may have higher levels of anxiety associated with the tasks. When tasks are complex and ill structured, posing multiple solutions, generative strategies provide learner with the opportunity to explore
117
118
INSTRUCTIONAL STRATEGIES THAT PROMOTE GENERATIVE LEARNING TABLE 7.1 When to Consider Supplantive Versus Generative Strategies
Learner
Task
Context
Generative Strategies (Low Level of Scaffolding)
Supplantive Strategies (High Level of Scaffolding)
High levels of prior knowledge High aptitude Aware of many cognitive strategies Flexible High motivation Low anxiety Intrinsically motivated Complex Ill structured Not hazardous Performance level is non-critical Ample time Goal priority is more important than cognitive strategies Fewer external demands for competency
Low levels of prior knowledge Low aptitude Aware of few cognitive strategies Low motivation High anxiety Extrinsically motivated Simple Well defined Hazardous High performance level required Limited time Cognitive strategies are more important than goal priority External demands for competency
Adapted from Smith and Ragan (2005) Source: Smith, P. L., & Ragan, T. J. (2005). Figure 7.2. The balance of generative and supplantive strategies (p. 143). Used with permission.
different approaches to problem-solving. This freedom to explore pairs well with tasks that are non-critical and pose no harm to the learners or others. Supplantive strategies can support simple well-defined tasks where there are limited options for solving a problem. These types of strategies are favored when a learner is expected to perform a task that could pose harm to others if performed incorrectly. Supplantive strategies help to ensure consistency with how problems should be performed, particularly when high performance levels are expected in the transfer settings. The context of the learning environment and where learners are expected to apply their knowledge will greatly influence the types of strategies an instructor may select. When minimal time constraints are imposed on the instructor and the learners, generative strategies can allow for the learners to explore and try a variety of different strategies. This enables them opportunities to evaluate which strategies are more successful than others at solving a situated problem. More freedom typically exists within the learning environment when emphasis is placed on learnings setting individualized goals rather than being required to master specific tasks upon completion of the instructional experience. Supplantive strategies are more suited when there are significant time constraints imposed for learners to demonstrate mastery and when cognitive strategies and task mastery are prioritized over individual goal-setting. For example, supplantive strategies may be more appropriate for teaching a pre-service teacher
INSTRUCTIONAL STRATEGIES THAT PROMOTE GENERATIVE LEARNING

FIGURE 7.1 Emphasis Areas of Generative Learning. (Adapted from Wittrock, 1989.) The figure depicts four emphasis areas with associated elements: Learning (attention); Motivation (attribution, interests); Knowledge Creation (preconceptions, concepts, beliefs); and Generation (analogies, metaphors, summaries).
how to upload materials into a school’s learning management system or training a nurse on how to insert an IV into a patient. Generative strategies may be more appropriate when teaching learners how to design lesson plans that promote creativity in the classroom. Wittrock’s (1989) generative model of learning argues that learning consists of four processes: learning, motivation, knowledge creation, and generation (Figure 7.1). His model suggests that learners engage in active and dynamic generations of learning “through reorganizations, reconceptualizations and to elaborations and relations that increase understanding” (Wittrock, 1992, p. 532). Wittrock’s (1989) generative learning model examines learning through attention-seeking activities. Wittrock’s model suggests that there are four categories of generative learning: recall, integration, organization, and elaboration. Recall requires learners to engage in repetitive activities in order to store information in their long-term memory. Examples of instructional strategies that elicit learners’ ability to recall facts include recitation, rehearsal, and mnemonics (Morrison et al., 2013). Examples of recall activities could include students using flashcards to learn addition and subtraction math facts, a golfer going to the driving range to practice their swing, or a learner practicing a new computer skill until they no longer need assistance regarding the sequence of steps. Integration strategies involve a learner adding information to what they currently know about a topic. Activities such as paraphrasing or analogies enable learners to make sense of the new information by associating it with things they already know. The ability to paraphrase and put information into their own words helps learners to transfer that information to their long-term memory.
Organizational strategies typically involve learners generating lists or grouping items of information. This allows learners to build upon their pre-existing knowledge and incorporate new information, beginning to organize information in a meaningful way based on similar characteristics. Elaboration strategies require learners to contribute to the information they have acquired and begin to think about how they can make connections between other concepts. Having learners create their own concept maps or conceptual frameworks to explain different concepts allows them to interpret information and contribute their own strategy for organizing and developing an understanding of the information. Fiorella and Mayer (2016) have expanded upon Wittrock's model of generative learning and suggest that learners engage in three phases of learning: 1. Learners must select the most relevant incoming sensory information for further conscious processing in working memory. 2. Learners must organize the selected information into a coherent mental representation in working memory by building relevant connections based on the material's underlying structure. 3. Learners must integrate the new representation constructed in working memory with the relevant knowledge structure stored in long-term memory. (p. 719) Within their Select-Organize-Integrate (SOI) model, Fiorella and Mayer (2016) propose eight different generative learning strategies. Table 7.2 provides an overview of these strategies along with boundary conditions that suggest when they are most applicable.

TABLE 7.2 Types of Generative Learning Strategies with Boundary Conditions

Summarizing
Description: Learners summarize information by putting it into their own words.
Boundary conditions: This strategy is most appropriate for tasks that have low levels of complexity and are not spatial.

Mapping
Description: Learners take the spoken word and develop a concept map or graphical organizer to organize content and the relationships between different components.
Boundary conditions: This strategy is most useful to learners after they have been taught how to design spatial representations to organize information.

Drawing
Description: Learners draw a picture to represent the content of a lesson.
Boundary conditions: It is important for the instructor to be mindful that an unintentional extraneous load may be imposed on learners if they are not familiar or comfortable with drawing.

Imagining
Description: Learners develop a mental image that depicts the content of a lesson.
Boundary conditions: This strategy is most successful for learners with significant prior knowledge, as they can build upon concepts they already understand.

Self-testing
Description: Learners engage in practice questions throughout the learning experience.
Boundary conditions: This strategy is most effective when corrective feedback is provided and when there is a close connection between the practice questions and the format of the final exam or test.

Self-explaining
Description: Learners take the content they have just been presented with and explain it to themselves.
Boundary conditions: This strategy works best if learners are able to generate quality explanations on their own. Instructors can provide prompts to guide learners on what to emphasize during the self-explanation activity.

Teaching
Description: Learners take information that has been presented to them and teach the content or concept to a group of individuals.
Boundary conditions: This strategy supports learners' abilities to engage in explanation and build upon their prior schemas.

Enacting
Description: Learners engage in movement to manipulate objects that are relevant to the task.
Boundary conditions: This strategy works best with learners who are highly skilled and can see the connections between their movements and the content.

(Fiorella & Mayer, 2016)
Supporting Complex Learning

Generative learning strategies provide a mechanism to help learners draw from their prior knowledge and organize and conceptualize new information to form mental schemas. While these strategies can help learners enhance their understanding of new concepts and store information in their long-term memory, their ability to transfer that knowledge to real-life settings is significantly impacted when the complexity of the task is high. van Merriënboer and Kirschner (2013) describe complex learning as "the integration of knowledge, skills, and attitudes; the coordination of qualitatively different constituent skills, and often the transfer of what is learned in the school or training setting to daily life and work settings" (p. 2). It is important to remember that the decisions we make as instructors about sequencing are critical to learners' success in mastering the material. The sequencing of learning tasks is the backbone of every instructional program. A challenge that many instructors face when teaching complex tasks is that learners are expected to draw on prior knowledge from various subject areas in order to approach problem-solving and complex learning critically. This is especially true as learners progress through their training. By the time they reach more advanced levels, instructors expect higher-order application of complex material, the ability to solve complex problems, and the ability to rely on evidence-based strategies (i.e., research) to do so. van Merriënboer and Kirschner's (2013) Four-Component Instructional Design (4C/ID) model helps instructional designers develop experiences that support complex learning and allows them to scaffold instruction to activate learners' prior knowledge (Table 7.3). The 4C/ID model is most appropriate when designing instruction that is to be delivered over an extended period of time. This framework could be used to support apprenticeships or long training periods such as medical training. The framework begins with learning tasks. Within this component, information is organized and presented to learners in a simple-to-complex sequence. This provides learners with appropriate scaffolding, especially if the complexity of the content is high. As examples increase in complexity, learners are presented with tasks ranging in
TABLE 7.3 Four Blueprint Components of Four-Component Instructional Design (4C/ID) and the Ten Steps

Learning tasks
  1. Design learning tasks
  2. Develop assessment instruments
  3. Sequence learning tasks

Supportive information
  4. Design supportive information
  5. Analyze cognitive strategies
  6. Analyze mental models

Procedural information
  7. Design procedural information
  8. Analyze cognitive rules
  9. Analyze prerequisite knowledge

Part-task practice
  10. Design part-task practice

(van Merriënboer & Kirschner, 2013) Source: van Merriënboer, J. J. G., & Kirschner, P. A. (2013). Table 1.1. Four Blueprint Components of 4C/ID and the Ten Steps (p. 9). Used with permission.
variability. Modeling and worked examples can be used to support learners and demonstrate how to perform a task. As learners gain familiarity with the content, supportive information can be provided to them on an as-needed basis. Job aids can provide learners with just-in-time support. Other generative learning strategies mentioned previously can also be used to support learners' abilities to integrate new knowledge with their prior knowledge and build upon their pre-existing mental models and schemas. When instructional activities are developed for complex content, the 4C/ID model recommends that supportive information be made available to the learners. Procedural information requires learners to gain experience by performing tasks and receiving immediate corrective feedback. Additional performance demonstrations may be needed depending on the variability of tasks; however, the goal of the instruction is to fade coaching over time so that the learner is able to perform tasks independently. Part-task practice allows the learner to focus on specific aspects of a task in order to reach the necessary level of automaticity. To become adaptable to the contextual factors that may influence a situation, learners should practice tasks under varying conditions. Part-task practice works best when integrated intermittently with learning tasks (van Merriënboer & Kirschner, 2013).
Managing Cognitive Load

Kerr (1983) suggests that instructional design practice involves three activities: identifying strategies that potentially solve a problem, using a set of criteria to determine which strategies are employed, and making decisions about implementation based on the strategies that have been selected. When we design instructional activities, we should prioritize the following:

• Supporting our learners' abilities to apply what they learn to real-world contexts
• Designing instructional activities that provide an appropriate level of scaffolding to support learning
• Sequencing instruction so that complexity increases at an appropriate pace

The generative learning strategies introduced earlier in this chapter provide different approaches to supporting learners' acquisition of knowledge, customized to their prerequisite knowledge, the degree of complexity associated with the task, and the degree of accuracy required in performing tasks a particular way for safety reasons. Failure-based learning can provide learners with a supportive environment in which to learn by trialing different possible solutions and learning from the mistakes incurred
along the way. Regardless of the strategy used to support instruction, instructional designers should be mindful of the amount of cognitive load imposed on their learners throughout any given task. There are three types of load that impact a learner: intrinsic, extraneous, and germane (Sweller, 2008). Intrinsic load is imposed by the learning task itself. A more complex task will have a higher degree of intrinsic load compared with an easy task that does not require significant problem-solving. Of the three types of load, intrinsic load is the one instructional designers probably have the least control over. Clark et al. (2006) note that although you cannot directly alter the inherent intrinsic load of your instructional content, you can manage the intrinsic load of any given lesson by decomposing complex tasks into a series of prerequisite tasks and supporting knowledge distributed over a series of topics or lessons. (p. 10) Extraneous load is imposed by the way information is presented to learners. If materials are presented in an illogical sequence or if activities have nothing to do with the information, extraneous load will be higher compared with materials that are clearly organized and communicated to learners. Instructional designers have the most control over the degree of extraneous load imposed on their learners. We can reduce extraneous load by being cognizant of the amount of content we introduce to learners at one time, how activities are sequenced over time to build upon one another, and the types of pictures, graphs, sounds, and text we integrate into our materials at one time. Presenting instructional materials that are chaotic and provide too many distractions detracts from the learner's ability to master the content.
Germane load is the amount of effort required for the learner to process information from the instructional materials, organize it, and construct schemas that support their ability to apply what they are learning across a variety of tasks and settings. Incorporating diverse examples and instructional activities for learners to practice applying new skills supports their transfer of learning. The diversity of examples supports far transfer, where learners can take a concept they have learned in class and apply it to different scenarios. Instructional designers can manage germane load by incorporating diverse examples and sequencing them appropriately so as not to overwhelm the learners. Cognitive load theorists recommend managing cognitive load by managing intrinsic load, minimizing extraneous load, maximizing germane load, and adjusting strategies as learners gain experience (Jung & Suzuki, 2015; Oksa et al., 2010; Sweller, 2008). A majority of the seminal research exploring cognitive load was conducted in math, science, and computer science (Paas & van Merriënboer, 1994; Sweller & Cooper, 1985).
FIGURE 7.2 Cognitive load overlay model with corresponding instructional design steps. The model overlays cognitive load strategies on the phases of the instructional design process: Analyze (identify problem/opportunity, task analysis, learner analysis) — detect interacting elements, rapid tests of expertise, consider germane load; Design (write objectives, write assessments, sequence content, select strategies) — manage intrinsic load, maximize germane load; Develop (create messages, select media/technology, create materials) — minimize extraneous load, adjust for expertise reversal effect; Implement/Evaluate (deliver instruction, formative evaluation, revise materials) — measure cognitive load effects, adjust cognitive load strategies. Source: Sentz, J., Stefaniak, J., Baaki, J., & Eckhoff, A. (2019). How do instructional designers manage learners' cognitive load? An examination of awareness and application of strategies. Educational Technology Research and Development, 67(1), 199–245. Figure 2 – Cognitive load overlay model with corresponding instructional design steps (p. 219). Used with permission.
Over the past decade, there has been an increase in cognitive load studies exploring complex subjects such as engineering and medicine (Kyun et al., 2013; Stark et al., 2011). Research has identified a need for instructional designers to close the gap between prescriptive theories and instructional design practice (Kirschner et al., 2002; Sentz et al., 2019; Winer & Vázquez‐Abad, 1995). Figure 7.2 depicts the cognitive load overlay model developed by Sentz et al. (2019) to manage learners' cognitive load. The goal of the model is to identify areas throughout the instructional design process where a designer can integrate strategies to manage their learners' cognitive load. During the analysis phase, where the instructional designer is identifying the needs associated with the project and conducting task and learner analyses, they can detail the interacting elements associated with the tasks. This enables the instructional designer to identify the complexity of tasks on the basis of the number of elements required to perform them. Depending on the number of elements, instructional designers can think about ways to manage their learners'
germane load through different opportunities for task variability (Blayney et al., 2015). During the design phase of instruction, intrinsic load can be managed through content sequencing. Objectives and assessments should be aligned with instructional activities that support learners' abilities to construct mental models that aid their understanding and long-term storage of information. Instructional designers can maximize germane load by incorporating generative learning strategies that promote learners' co-construction of knowledge and building upon their prior knowledge. Extraneous load can be minimized during the development phase of the instructional design process. While instructional designers are developing the messaging that will be used to deliver content and identifying technological platforms to deliver instruction, they can minimize extraneous load by carefully selecting text and images that support learners' ability to retain information. "Existing materials can be sought to shorten the time needed to develop goal-free tasks, worked examples, and completion problems in a particular subject domain" (Sentz et al., 2019, p. 220). During the implementation and evaluation phases, instructional designers can measure their learners' cognitive load and make the necessary adjustments in situ while delivering instruction and engaging in formative evaluation. Feedback provided to the instructional designer can be leveraged to modify training materials to manage intrinsic load, maximize germane load, and minimize extraneous load.
When Is Failure Good?

This chapter has presented a variety of teaching strategies and frameworks to support learners' knowledge acquisition and generative knowledge through appropriate sequencing. Most research on instructional design strategies has emphasized minimizing the negative impacts associated with cognitive overload (Kalyuga, 2011; Paas et al., 2003). Research has identified best practices to minimize the extraneous load present in instructional materials. Additionally, studies have explored ways in which instructional activities can be designed to motivate learners and help them develop appropriate schemas for organizing information so they can be most successful. Another instructional approach that has garnered attention in the field of instructional design in the past decade is learning through productive failure. Productive failure is an instructional approach suggesting that instructors should not intervene when learners make mistakes while engaged in ill-structured problem-solving (Kapur, 2008). Reflecting on what went wrong and why it went wrong is a large component of the learning experience. It is important to note that failure-based learning disrupts how instructional designers and learners have approached instruction (Darabi et al.,
2018). Instead of strategically designing activities to mitigate challenges for learners, instructional designers allow pitfalls to remain in the learning experience so that learners have an opportunity to make mistakes and learn from that failure. Productive failure stems from Piaget's (1977) theory of cognitive disequilibrium, which suggests that we encounter a state of cognitive imbalance when faced with information and situations that challenge our existing schemas and interpretations of a situation. To overcome the imbalance, individuals engage in assimilation and accommodation, developing and modifying schemas that integrate the new information. Building upon Piaget's (1977) theory and Kapur's (2008) recommendations for facilitating productive failure, Tawfik et al. (2015) suggest that the following process occurs through productive failure:

1. Learner experiences failure.
2. Learner's existing mental model is challenged as a result of the failed experience.
3. Learner generates solutions through exploration and consolidation of possible solutions to address situational constraints.
4. Learner updates their mental model to accommodate the new knowledge they have acquired as a result of the failed experience.

Instructional activities that promote productive failure tend to emphasize the development of conceptual knowledge over procedural knowledge (Arrington & Tawfik, 2022). To support learners' development of conceptual knowledge and their ability to make connections between the different components they are learning, directed reflective activities can be used as a debriefing strategy at the end of the failed experience. As previously mentioned, considerable research has identified strategies that minimize the extraneous load imposed by instructional materials and the delivery of instruction. Integrating failure-based learning will be a foreign concept to many learners, and many feel anxious when failure is an intentional component of the experience.
The expectation in most, if not all, learning environments is for learners to succeed. Success is often tied to the learning outcome. Learners often find it challenging to become comfortable with failure, particularly in educational settings where points are awarded for assignments. Oftentimes, learners hold expectations of themselves to do well in a course and earn a good grade. Learners need time to adjust to the idea that failure is okay. Law and Finnigan (2021) recommend that learners can overcome failure by renegotiating expectations. This can be accomplished through "experiencing and managing the wave of emotion, modifying expectations of self, and others, redefining success and moving forward, and building flexibility of expectations" (p. 360). When learners encounter failure during a learning experience, it is important to allow them the opportunity to
feel and express their frustrations. This supports their ability to acknowledge and regulate their emotions. Instructors interested in integrating productive failure into their courses should allow time to pre-brief learners prior to a learning activity and to debrief afterwards. If failure-based learning is a new strategy for a group of learners, the pre-briefing can introduce learners to the activity. Learners can be assured that while they may encounter some challenges and failures, the struggle will be productive. To promote productive failure, scaffolding is delayed to provide learners the opportunity to explore more solutions, thus enhancing creativity (Kapur, 2016). Productive failure treats the problems encountered by learners as impasses they must overcome or reconcile (Darabi et al., 2018, p. 1104). Tawfik et al. (2015) offer four guidelines to support the design of activities that promote failure:

1. Allow learners to identify failure.
2. Design learning environments to intentionally encounter failure.
3. Support inquiry into failure for analogical transfer.
4. Support solution generation to resolve failures.

I would recommend adding a fifth guideline: debrief with the learners upon completion of solution generation. This provides an opportunity for the instructor to discuss the implications of the different solutions that may have been explored during initial problem-solving.
Summary

This chapter provided an overview of different strategies that can be used to support learning. Recommendations for which types of learning outcomes may warrant particular instructional strategies were discussed. Chapter 8 expands on instructional strategies, placing emphasis on ways in which instructional designers can scaffold instruction to support self-regulated learning.
Connecting Process to Practice Activities

1. What types of generative learning strategies have you used in the past to support instruction? You can answer this based on your experience as a learner or as an instructional designer. What recommendations or additional information would you want to impart to learners participating in instruction using generative learning strategies?
2. You have been asked to design a 4-month training program to train a new instructional designer on best practices at your organization. You have been informed that this new instructional designer has
taken one introductory course but does not have any job-related experience applying instructional design. How might you use the 4C/ID framework to set up an onboarding program to train your new instructional designer?
3. Reflect upon the most recent instructional design project that you completed. Using the cognitive load overlay model, reflect on ways in which you managed (or could have better managed) your learners' cognitive load.
4. Think about a topic you are familiar with. If you were to design an activity that promotes productive failure, what elements would you want to emphasize throughout the activity? What challenges might you anticipate your learners experiencing? What would you want to address or emphasize during a debriefing session upon conclusion of your learners' exploration?
Bridging Research and Practice

Brod, G. (2021). Generative learning: Which strategies for what age? Educational Psychology Review, 33(4), 1295–1318. Caskurlu, S., Richardson, J. C., Alamri, H. A., Chartier, K., Farmer, T., Janakiraman, S., Strait, M., & Yang, M. (2021). Cognitive load and online course quality: Insights from instructional designers in a higher education context. British Journal of Educational Technology, 52(2), 584–605. Costa, J. M., Miranda, G. L., & Melo, M. (2022). Four-component instructional design (4C/ID) model: A meta-analysis on use and effect. Learning Environments Research, 25(2), 445–463. Sinha, T., & Kapur, M. (2021). When problem solving followed by instruction works: Evidence for productive failure. Review of Educational Research, 91(5), 761–798. Van Gog, T., & Sweller, J. (2015). Not new, but nearly forgotten: The testing effect decreases or even disappears as the complexity of learning materials increases. Educational Psychology Review, 27, 247–264.
References

Arrington, T. L., & Tawfik, A. (2022). Designed failure in instructional design and technology. In J. E. Stefaniak & R. M. Reese (Eds.), The instructional designer trainer's guide: Authentic practices and considerations for mentoring ID and ed tech professionals (pp. 67–76). Routledge. Blayney, P., Kalyuga, S., & Sweller, J. (2015). Using cognitive load theory to tailor instruction to levels of accounting students' expertise. Journal of Educational Technology & Society, 18(4), 199–210. Brod, G. (2020). Generative learning: Which strategies for what age? Educational Psychology Review, 1–24. https://doi.org/10.1007/s10648-020-09571-9 Clark, R., Nguyen, F., & Sweller, J. (2006). Efficiency in learning: Evidence-based guidelines to manage cognitive load. Pfeiffer. Darabi, A., Arrington, T. L., & Sayilir, E. (2018). Learning from failure: A meta-analysis of the empirical studies. Educational Technology Research and Development, 66, 1101–1118. https://doi.org/10.1007/s11423-018-9579-9
Fiorella, L., & Mayer, R. E. (2016). Eight ways to promote generative learning. Educational Psychology Review, 28(4), 717–741. https://doi.org/10.1007/s10648-015-9348-9 Jung, I., & Suzuki, Y. (2015). Scaffolding strategies for wiki-based collaboration: Action research in a multicultural Japanese language program. British Journal of Educational Technology, 46(4), 829–838. https://doi.org/10.1111/bjet.12175 Kalyuga, S. (2011). Cognitive load theory: How many types of load does it really need? Educational Psychology Review, 23, 1–19. https://doi.org/10.1007/s10648-010-9150-7 Kapur, M. (2008). Productive failure. Cognition and Instruction, 26(3), 379–424. https://doi.org/10.1080/07370000802212669 Kapur, M. (2016). Examining productive failure, productive success, unproductive failure, and unproductive success in learning. Educational Psychologist, 51(2), 289–299. Kerr, S. T. (1983). Inside the black box: Making design decisions for instruction. British Journal of Educational Technology, 14(1), 45–58. https://doi.org/10.1111/j.1467-8535.1983.tb00448.x Kirschner, P., Carr, C., Van Merriënboer, J., & Sloep, P. (2002). How expert designers design. Performance Improvement Quarterly, 15(4), 86–104. Kyun, S., Kalyuga, S., & Sweller, J. (2013). The effect of worked examples when learning to write essays in English literature. The Journal of Experimental Education, 81(3), 385–408. Law, M. P., & Finnigan, J. K. (2021). Letting your students fail: Overcoming failure experiences in undergraduate work-integrated learning. International Journal of Work-Integrated Learning, 22(3), 357–368. Lee, H. W., Lim, K. Y., & Grabowski, B. L. (2008). Generative learning: Principles and implications for making meaning. In J. M. Spector, M. D. Merrill, J. van Merriënboer, & M. P. Driscoll (Eds.), Handbook of research on educational communications and technology (3rd ed., pp. 111–124). Routledge. Morrison, G. R., Ross, S. M., Kalman, H.
K., & Kemp, J. E. (2013). Designing effective instruction (7th ed.). Wiley. Oksa, A., Kalyuga, S., & Chandler, P. (2010). Expertise reversal effect in using explanatory notes for readers of Shakespearean text. Instructional Science, 38, 217–236. https://doi.org/10.1007/s11251-009-9109-6 Paas, F. G., & van Merriënboer, J. J. (1994). Variability of worked examples and transfer of geometrical problem-solving skills: A cognitive-load approach. Journal of Educational Psychology, 86(1), 122. https://doi.org/10.1037/0022-0663.86.1.122 Paas, F., Renkl, A., & Sweller, J. (2003). Cognitive load theory and instructional design: Recent developments. Educational Psychologist, 38(1), 1–4. https://doi.org/10.1207/S15326985EP3801_1 Piaget, J. (1977). The development of thought: Equilibrium of cognitive structures. Viking. Rong, H., & Choi, I. (2019). Integrating failure in case-based learning: A conceptual framework for failure classification and its instructional implications. Educational Technology Research and Development, 67(3), 617–637. https://doi.org/10.1007/s11423-018-9629-3 Sentz, J., & Stefaniak, J. (2019). Instructional heuristics for the use of worked examples to manage instructional designers' cognitive load while problem-solving. TechTrends, 63(2), 209–225. https://doi.org/10.1007/s11528-018-0348-8 Sentz, J., Stefaniak, J., Baaki, J., & Eckhoff, A. (2019). How do instructional designers manage learners' cognitive load? An examination of awareness and application of strategies. Educational Technology Research and Development, 67(1), 199–245. https://doi.org/10.1007/s11423-018-09640-5
INSTRUCTIONAL STRATEGIES THAT PROMOTE GENERATIVE LEARNING Smith, P. L., & Ragan, T. J. (2005). Instructional design (3rd ed.). Jossey-Bass. Stark, R., Kopp, V., & Fischer, M. R. (2011). Case-based learning with worked examples in complex domains: Two experimental studies in undergraduate medical education. Learning and instruction, 21(1), 22–33. https://doi.org/10.1016/ j.learninstruc.2009.10.001 Stefaniak, J. (2021). Leveraging failure-based learning to support decision-making and creative risk in instructional design pedagogy. TechTrends, 65(5), 646–652. https://doi.org/10.1007/s11528-021-00608-6 Sweller, J. (1994). Cognitive load theory, learning difficulty, and instructional design. Learning and Instruction, 4(4), 295–312. Sweller, J. (2008). Human cognitive architecture. In J. M. Spector, M. D. Merrill, J. Elen, & M. J. Bishop (Eds.), Handbook of research on educational communications and technology (3rd ed., pp. 369–381). Springer. Sweller, J., & Cooper, G. A. (1985). The use of worked examples as a substitute for problem solving in learning algebra. Cognition and Instruction, 2(1), 59–89. https://doi.org/10.1207/s1532690xci0201_3 Tawfik, A., Jonassen, D., & Keene, C. (2012). Why do we fall? Using experiences of failure to design case libraries. International Journal of Designs for Learning, 3(1). Tawfik, A. A., Rong, H., & Choi, I. (2015). Failing to learn: Towards a unified design approach for failure-based learning. Educational Technology Research and Development, 63(6), 975–994. https://doi.org/10.1007/s11423-015-9399-0 van Merriënboer, J. J. G., & Kirschner, P. A. (2013). Ten steps to complex learning: A systematic approach to four-component instructional design (2nd ed.). Routledge. Winer, L. R., & Vázquez‐Abad, J. (1995). The present and future of ID practice. Performance Improvement Quarterly, 8(3), 55–67. Wittrock, M. C. (1974). Learning as a generative process. Educational Psychologist, 11(2), 87–95. 
https://doi.org/10.1080/00461527409529129 Wittrock, M. C. (1989). Generative processes of comprehension. Educational Psychologist, 24(4), 345–376. https://doi.org/10.1207/s15326985ep2404_2 Wittrock, M. C. (1992). Generative learning processes of the brain. Educational Psychologist, 27(4), 531–541.
131
8
SCAFFOLDING INSTRUCTION TO SUPPORT SELF-REGULATION OF LEARNERS
Chapter Overview While several chapters in this book will address the importance of gradually building upon complexity when designing instruction, this chapter will place emphasis on how instructional designers can integrate strategies to support their learners’ abilities to engage in help-seeking and self-regulated learning. Self-regulated learning typically occurs during three phases: forethought, performance control, and self-reflection. Common strategies include, but are not limited to, goal-setting, environmental structuring, task strategies, time management, help-seeking, and self-evaluation. Specific strategies will be given to show how self-regulated learning can be promoted in face-to-face and online learning environments. This chapter will expand upon the decision-making chapter (Chapter 2) to discuss the types of design judgments that an instructional designer may make when scaffolding instruction to support self-regulated learning.
Guiding Questions
1. What is self-regulated learning?
2. What are different types of help-seeking sources?
3. What are examples of different instructional scaffolds?
4. How can instructional scaffolds be integrated into a lesson to promote self-regulated learning?
Earlier chapters in this book have explored different instructional strategies that can be used to support learning within different contexts. Chapter 7 explored the use of generative learning strategies and provided recommendations for managing learners’ cognitive load. This chapter builds upon that work by looking at ways in which we can scaffold instructional activities to support learners’ abilities to engage in self-regulated learning. Additionally, strategies related to self-regulated
DOI: 10.4324/9781003287049-8
learning and help-seeking will be explored in terms of how they can support a localization of context in instructional design and support design judgments made by instructional designers.
Self-Regulated Learning Self-regulated learning refers to the actions that a learner undertakes to plan, enact, and monitor progress made toward the attainment of their learning goals (Zimmerman, 1998). Emphasis is placed on goal-setting prior to learning activities to support students’ abilities to prioritize different learning tasks to meet those goals. Six commonly used strategies to promote self-regulated learning are goal-setting, environmental structuring, task strategies, time management, help-seeking, and self-evaluation (Effeney et al., 2013; Zimmerman & Martinez-Pons, 1990). To date, researchers have proposed categorizing self-regulated learning activities into different phases (Pintrich, 1999; Winne & Hadwin, 2008; Zimmerman, 2002). One of the most widely recognized frameworks, proposed by Zimmerman (2002), holds that learners engage in three phases of self-regulated learning: (1) forethought, (2) performance control, and (3) self-reflection. Table 8.1 provides an overview of self-regulated learning strategies as they relate to Zimmerman’s (2002) phases of self-regulated learning. Several researchers have noted that self-regulated learning is an iterative and recursive process (Winne & Hadwin, 2008; Zimmerman, 2002). Other scholars have specifically labeled self-regulation as a dynamic process (Beishuizen, 2008; Ben-Eliyahu & Bernacki, 2015; Iran-Nejad & Chissom, 1992).
The Forethought Phase Forethought is the first phase of Zimmerman’s (2002) self-regulated learning framework and is composed of the activities that a learner engages in to prepare for the learning activity. Typically, these activities involve the goal-setting and strategic planning required to complete the learning activity (Barnard-Brak et al., 2010; Khaled et al., 2016; Zimmerman, 1998). During this phase, learners establish goals to help them break down the task into meaningful compartments to assist them with sequencing and time management. They also engage in environmental structuring, assessing the physical and technical environment needed to contribute to their ease of learning. Winne and Hadwin (2008) proposed that learners engage in four phases of self-regulated learning, adding an initial phase, before the forethought phase, that addresses learner task perception. In Zimmerman’s (2002) framework, the act of assessing the difficulty of the task is embedded in the forethought phase.
TABLE 8.1 Self-Regulated Learning (SRL) Strategies and Associated Phases

Forethought Phase
• Goal-setting: Learner efforts to establish goals and subgoals to help plan the sequencing, timing, and completion of academic tasks.
• Environmental structuring: Learner efforts to select and arrange the physical or technical setting to make learning easier.

Performance Control Phase
• Task strategies: Learner efforts to actively utilize specific strategies to achieve desired goals.
• Time management: Learner efforts to consider what must be done and devote an appropriate amount of time to each task.
• Help-seeking: Learner efforts to secure additional task information from a variety of sources, such as an instructor, classmate, or outside resource.

Self-Reflection Phase
• Self-evaluation: Learner efforts to gauge the progress and quality of their work toward desired goals.

Source: Bruso, J., Stefaniak, J., & Bol, L. (2020). Table 1. SRL strategies and associated phases (p. 2662). Used with permission.
It is important to note that task perception should be considered during the forethought phase, particularly as the learner begins to engage in goal-setting. Depending on the learner’s perception of how difficult the task may be, goals should be designed so that they are attainable. The environmental structuring activities will depend on the learner’s task perception and the goals they have established for completing the task. This also coincides with the learner factors to be considered in the instructional context presented by Tessmer and Richey (1997).
The Performance Control Phase The second phase of self-regulated learning is performance control. This phase is rooted in the instructional context at the time learning is occurring. It is composed of the learner applying task strategies to complete the task, managing time, and seeking help. During the performance control phase, learners will identify strategies to complete tasks related to their goals (Abrami et al., 2011; Hattie, 2009). Learners employ time management techniques to ensure that adequate time is devoted to task completion. Learners’ abilities to identify the time requirements for completing a task should be an extension of their planning in the forethought phase. Taking time to reflect on their task perception helps them to recognize areas where they may require additional assistance. As the learner begins to work on the task(s), they may engage in help-seeking if they require assistance. Help-seeking occurs when an individual recognizes a gap in their comprehension and seeks assistance to improve their performance (Karabenick & Knapp, 1991). Help-seeking is a manifestation of self-regulated behavior in which the learner reflects upon their progress toward meeting their goals (Lynch & Dembo, 2004; Ryan & Pintrich, 1997). Help-seeking processes are often categorized as executive help-seeking or adaptive help-seeking (Nelson-Le Gall, 1985). Executive help-seeking consists of a learner seeking help with the sole focus of obtaining the correct answer; little to no concern is given to mastering the content. Adaptive help-seeking consists of a learner seeking help with the goal of mastering the content. As instructional designers, we should aim to build into our learning environments the instructional supports that promote adaptive help-seeking. This supports the development of learners’ adaptive expertise as they learn to apply content from the instructional context to the transfer context.
Recognizing that real-world settings are dynamic, learners need to be prepared to use the resources provided to them while engaging in problem-solving that is relevant to the situation. The help-seeking process can be broken down into the following steps offered by Karabenick (2011):
1. Determine that a problem exists.
2. Determine that help is needed.
3. Decide to seek help.
4. Establish the purpose or goal of seeking help.
5. Decide whom to ask.
6. Solicit help.
7. Obtain the requested help.
When a learner engages in help-seeking, it is important that they identify and explore help-seeking strategies that align with their learning goals and environmental structuring. Adaptive help-seeking is influenced by the
environment that the learner is in while engaged in self-regulated learning (Giblin & Stefaniak, 2017; Karabenick & Dembo, 2011). While help-seeking activities typically occur during the performance control phase, successful strategies that promote adaptive help-seeking are intertwined with the activities occurring in the forethought phase. This supports the dynamic and iterative nature inherent in self-regulated learning. When identifying appropriate help-seeking sources, learners often evaluate the source according to four dimensions: role, relationship, channel, and adaptability (Makara & Karabenick, 2013). Formal roles typically consist of course materials posted on a course website or learning management system, syllabi, textbooks, or instructors and tutors. Informal help-seeking sources include discussion boards, peer networks, family and friends, and social networking sites. Sources can also be viewed in terms of their relationship to the learner as either personal or impersonal. Friends in a class, an instructor the learner has had before and is familiar with, or family and friends are typically considered personal sources. Impersonal sources may include a syllabus, textbooks, web search engines, and emails sent to an instructor. Channels are categorized as mediated versus face-to-face sources. Examples of mediated sources include chat rooms, discussion forums, emails to instructors and peers, syllabi, textbooks, or social networking sites. Face-to-face sources include meeting in person with peers, tutors, family, friends, and the instructor. The adaptability of a source can be examined in terms of the degree to which it is dynamic versus static. Dynamic sources include chat rooms, discussion forums, and reaching out to the instructor (in person or through email). These help sources are dynamic in that the resource is typically customized to address the learner’s specific needs and situation.
Static sources do not change and are not modified to suit the learner’s specific need for help. Examples of static sources include syllabi, textbooks, and web searches. The results offered through these sources are constant; it is up to the learner to review them and determine whether the information is pertinent to the needs they are trying to address. The degree to which a learner will engage in help-seeking varies depending upon their familiarity with the content and the complexity of the task. From an instructional design perspective, it is important to note that not all learners will be eager to engage in help-seeking. Some may choose to avoid it altogether despite not being able to complete tasks. Reasons why learners may choose to avoid seeking help include, but are not limited to, a fear of having to take an autonomous role in their learning, perceptions that seeking help is a threat to their self-worth, wanting to maintain a positive image among their peers or co-workers, and being afraid of being judged as incompetent by others (Butler & Neuman, 1995; Karabenick, 2004; Karabenick & Knapp, 1991; Ryan et al., 2001; Tanaka et al., 2002). The third and final phase of Zimmerman’s (2002) self-regulated learning framework is self-reflection. During this phase, the learner engages in self-evaluation
of their efforts to gauge the progress and quality of their work toward achieving their desired goals. Through a self-evaluation strategy, the learner should reflect on the effectiveness of the strategies they attempted to use during the performance control phase. They can assess the extent to which the strategies they used supported their ability to attain their goals. The self-evaluation strategy can also be used proactively to help the learner refocus and begin planning for additional goals they would like to work on. When integrated into instructional design activities, the learner can be prompted to reflect on how they self-monitored their progress, managed their time, and handled any other factors that may have impacted their performance.
The Instructional Designer’s Role in Self-Regulated Learning Activities While self-regulated learning should be driven predominantly by the learner, the instructional designer can embed prompts within instruction to support learners during a learning activity. Table 8.2 provides an overview of the instructional designer’s role during self-regulated learning activities.

TABLE 8.2 Instructional Designer’s Role During Self-Regulated Learning

Goal-setting
• Prompt learners to acknowledge their perception of the task.
• Encourage learners to set appropriate goals for completing the tasks.
• Emphasize sequencing and timing of tasks.
• Provide opportunities for learners to share their goals with peers, if applicable.

Environmental structuring
• Ensure that learners’ goals and tasks are achievable within the instructional setting.
• Inform learners of resources that may be available to them to achieve their goals within the instructional environment.

Task strategies
• Allocate sufficient time for learners to explore task strategies.
• Provide some flexibility with completing tasks to allow for troubleshooting.

Time management
• Incorporate check-in points for learners to provide progress updates.
• Provide guidance if modifications need to be made to plans in order to achieve goals.

Help-seeking
• Inform learners of help-seeking sources available to them.

Self-evaluation
• Debrief with learners after the learning event/activity.
• Provide guidance to learners as they engage in planning for upcoming goals.

Leveraging information we have gathered during the contextual analysis enables us to help students engage in meaningful self-regulation strategies that build upon the orienting context. Like a mentor during a cognitive apprenticeship, the instructional designer or instructor (or both) should assume more of a coaching role. The more opportunities learners have to engage in self-regulated learning strategies, the more comfortable they will become with taking a more autonomous role in their learning. Instructional activities can be designed in a way that prompts learners to set goals, take inventory of the environment and resources available to them, and engage in tasks to meet their goals. Strategies such as time management, help-seeking, and self-evaluation can be integrated throughout activities through opportunities for learners to debrief with the instructor or their peers (or both) and reflect upon their progress. Self-regulated learning activities can strengthen learners’ abilities to engage in help-seeking strategies by providing them with intentional opportunities to engage in goal-setting, identification of resources, and self-evaluations of their progress. Most instructional designers would argue that they would want their learners to engage in adaptive help-seeking, working toward content mastery as opposed to executive help-seeking strategies that are focused solely on obtaining the correct answer to a specific problem. While learners need to be intrinsically motivated to seek help to support the attainment of their goals, instructional designers can embed opportunities for learners to identify different help sources available to them during a learning activity.
While it is not the responsibility of the instructional designer or the instructor to ensure that a learner follows through with seeking help, they can make learners aware of resources to support their learning. This does not have to be a tedious task for the instructional designer. Rather, it could be something that is introduced to learners during an orientation session to a class or a kick-off meeting at an on-the-job training event. Self-regulated learning should be introduced to learners so they are aware of what it is and how it can be used in their development. Instructors can routinely integrate opportunities for discussions around goal-setting and environmental structuring to support the identification of the tasks needed to achieve those goals. By incorporating conversations and different activities that promote self-regulated learning, learners will become more comfortable with the concept and will take on a more autonomous role in their own training.
Scaffolding Chapter 6 introduced scaffolding as a significant part of a cognitive apprenticeship. During a cognitive apprenticeship, the expert intentionally
scaffolds learning experiences, providing just-in-time supports to guide the learner as needed. Like cognitive apprenticeships, scaffolding plays a significant role in the self-regulated learning process. During self-regulated learning, the learner works to break down tasks into manageable steps. If the tasks associated with the learner’s goals are complex and will benefit from being broken down into smaller components, scaffolding can complement the self-regulated learning process. Instructional designers involved in designing such experiences can also scaffold learning to gradually increase learners’ autonomy in self-regulation as it relates to the instructional content. As mentioned in Chapter 6, Wood (2003) has described scaffolding according to three contingencies of support:
1. Instructional contingency: how to support the activity
2. Domain contingency: what to focus on next
3. Temporal contingency: if and when to intervene
When designing a learning experience that integrates self-regulated learning strategies, the instructional designer can support learners by scaffolding activities that promote their abilities to frame the situation, set goals, and engage in environmental structuring and help-seeking. Table 8.3 provides prompts for things to consider when designing activities with the intent of promoting self-regulated learning.
TABLE 8.3 Prompts to Design a Self-Regulated Learning Experience

Instructional contingency
• To what extent do you want your learners reflecting on the content?
• What do you want your learners to focus on during the activity?
• What is the complexity of the task?
• How much support will be needed from the instructor?
• What are ways the instructor can provide feedback to the learners on the goals they have set?
• How much time will be allocated to accomplish tasks identified during goal-setting?
• Are learners aware of help sources?

Domain contingency
• How will goal-setting and task identification lead to subsequent topics?

Temporal contingency
• How much trial-and-error is the instructor willing to impose on their learners?
• Is it important for learners to learn from the consequences of their actions?
• What are appropriate times for the instructor to intervene?
As mentioned in Chapter 6, scaffolding is the process of an expert providing just-in-time support to a learner or novice (Wood et al., 1976). As a learner is introduced to new content and tasks, the instructional designer should intentionally scaffold activities that meet the learner where they are and support their development over an extended period of time. Belland (2017) notes that scaffolding should be intentionally integrated into an instructional activity as a means for learners to engage in meaningful participation. Examples of scaffolding mechanisms noted by Belland (2014) include the following:
• Enlisting student interest
• Controlling frustration
• Providing feedback
• Indicating important task/problem elements to consider
• Modeling expert processes
• Questioning
There are three primary modalities for instructional scaffolding: one-to-one, peer, and computer-based. One-to-one scaffolding involves the instructor providing direct feedback to the learner. Most often, these one-to-one scaffolding experiences involve the expert modeling processes and providing feedback to the learner on their performance (van de Pol et al., 2011). Peer scaffolding occurs when learners work together in pairs or small groups and provide feedback to one another as they complete tasks. Research has shown that when learners are expected to engage in peer-to-peer scaffolding, it may be necessary for the instructor to provide them with prompts, as they may not know how to critically evaluate one another’s performance if they have the same level of knowledge and experience (Belland, 2014; Mercer et al., 2004; Oh & Jonassen, 2007). Computer-based scaffolding can provide learners with support in addition to one-to-one scaffolding from an instructor. Computers can be used to scaffold instruction, guiding learners through various tasks and content gradually with the intention that the instructor will also intervene (Belland et al., 2011; Sandoval & Reiser, 2004). Prompts can be incorporated through computer-based scaffolds to support learners as they reflect on different lessons and topics and begin to set goals to support their mastery. Depending on the instructional designer’s intentions for the learning activity, instructors can provide deliberate prompts to guide learners through goal-setting and environmental structuring, depending on which scaffolding mechanisms are prioritized for the activity. Belland (2014) recommends that “when developing scaffolding interventions, it is important to remember the key scaffolding modalities—one-to-one, peer, and computer-based—and to consider how such modalities can be combined in an overall distributed scaffolding strategy” (p. 515).
Self-Regulated Learning to Support Instructional Design Judgments While this chapter has centered on the learner, it is important to note that self-regulated learning strategies can also be used to support our instructional design judgments and reflective practices. Chapter 2 provided an overview of design judgments commonly invoked by designers (Nelson & Stolterman, 2012) that can be applied to instructional design contexts. Zimmerman’s (2002) self-regulated learning framework can be used to support our design decision-making by helping us map out goals and set tasks that support developing and enhancing our learners’ experiences. As we determine which aspects of our designs we want to emphasize, we can begin to plan how we will set goals with our learners to help them achieve various milestones associated with different aspects of our design. As research on design decision-making continues to attract interest, we can also look to self-regulated learning to support our own professional development as instructional designers. If we think about the various design judgments commonly invoked by designers (Nelson & Stolterman, 2012), we can begin to focus on particular judgments we may need to improve. As we think about where our strengths lie as designers, we can consider our goals for improving our abilities to enact particular judgments (e.g., appearance judgments, navigational judgments). From there, we can outline what tasks are necessary for making particular judgments and translating those judgments through our designs. Lastly, we can engage in self-evaluation to determine how effectively we met our goals.
Summary This chapter provided an overview of different strategies used to promote self-regulated learning. It is important to recognize the critical role that contextual awareness plays in supporting learners’ abilities to engage in environmental structuring when setting goals to accomplish tasks. The more information the instructional designer has regarding their learning audience, the better they are able to tailor instructional activities to meet their needs. While self-regulated learning is driven predominantly by the learner, the instructional designer should intentionally design activities that scaffold self-regulation to increase learners’ autonomy over time. Chapter 9 extends the discussion of promoting learner autonomy during instruction by examining the role of motivational strategies in instructional design.
Connecting Process to Practice Activities 1. You have recently been hired to develop a series of e-learning modules for first-year college students to promote good study habits. How might you integrate Zimmerman’s (2002) self-regulated
learning model throughout the modules? What types of prompts would you use to support learners as they engage in forethought, performance control, and self-reflection?
2. What types of help-seeking sources would you recommend a manufacturing company share with employees recently hired to work on an assembly line? How might you make employees aware of the help-seeking sources available to them?
3. Instructional designers and instructors are more apt to be successful at customizing appropriate prompts to promote self-regulated learning if they have a strong knowledge of their learning audience. Based on your experiences with contextual analysis, how else could information obtained about orienting, instructional, and transfer contexts be used to support self-regulated learning?
4. What types of scaffolding modalities (i.e., one-to-one, peer, and computer-based) have you experienced as a learner? What are the advantages and disadvantages of using each modality?
Bridging Research and Practice
Fan, Y., Matcha, W., Uzir, N. A. A., Wang, Q., & Gašević, D. (2021). Learning analytics to reveal links between learning design and self-regulated learning. International Journal of Artificial Intelligence in Education, 31(4), 980–1021. https://doi.org/10.1007/s40593-021-00249-z
Glazewski, K. D., & Hmelo-Silver, C. E. (2019). Scaffolding and supporting use of information for ambitious learning practices. Information and Learning Sciences, 120(1/2), 39–58.
Huang, L., & Lajoie, S. P. (2021). Process analysis of teachers’ self-regulated learning patterns in technological pedagogical content knowledge development. Computers & Education, 166, 104169. https://doi.org/10.1016/j.compedu.2021.104169
Lee, D., Watson, S. L., & Watson, W. R. (2020). The influence of successful MOOC learners’ self-regulated learning strategies, self-efficacy, and task value on their perceived effectiveness of a massive open online course. International Review of Research in Open and Distributed Learning, 21(3), 81–98. https://doi.org/10.19173/irrodl.v21i3.4642
Taranto, D., & Buchanan, M. T. (2020). Sustaining lifelong learning: A self-regulated learning (SRL) approach. Discourse and Communication for Sustainable Education, 11(1), 5–15. https://doi.org/10.2478/dcse-2020-0002
References
Abrami, P. C., Bernard, R. M., Bures, E. M., Borokhovski, E., & Tamim, R. M. (2011). Interaction in distance education and online learning: Using evidence and theory to improve practice. Journal of Computing in Higher Education, 23(2–3), 82–103. https://doi.org/10.1007/s12528-011-9043-x
Barnard-Brak, L., Paton, V. O., & Lan, W. Y. (2010). Profiles in self-regulated learning in the online learning environment. International Review of Research in Open and Distributed Learning, 11(1), 61–80. https://doi.org/10.19173/irrodl.v11i1.769
Beishuizen, J. (2008). Does a community of learners foster self-regulated learning? Technology, Pedagogy and Education, 17(3), 183–193. https://doi.org/10.1080/14759390802383769
Belland, B. (2014). Scaffolding: Definition, current debates, and future directions. In J. M. Spector, M. D. Merrill, J. Elen, & M. J. Bishop (Eds.), Handbook of research on educational communications and technology (4th ed., pp. 505–518). Springer.
Belland, B. (2017). Instructional scaffolding in STEM education: Strategies and efficacy evidence. Springer.
Belland, B. R., Glazewski, K. D., & Richardson, J. C. (2011). Problem-based learning and argumentation: Testing a scaffolding framework to support middle school students’ creation of evidence-based arguments. Instructional Science, 39, 667–694. https://doi.org/10.1007/s11251-010-9148-z
Ben-Eliyahu, A., & Bernacki, M. L. (2015). Addressing complexities in self-regulated learning: A focus on contextual factors, contingencies, and dynamic relations. Metacognition and Learning, 10, 1–13. https://doi.org/10.1007/s11409-015-9134-6
Bruso, J., Stefaniak, J., & Bol, L. (2020). An examination of personality traits as a predictor of the use of self-regulated learning strategies and considerations for online instruction. Educational Technology Research and Development, 68, 2659–2683. https://doi.org/10.1007/s11423-020-09797-y
Butler, R., & Neuman, O. (1995). Effects of task and ego achievement goals on help-seeking behaviors and attitudes. Journal of Educational Psychology, 87(2), 261.
Effeney, G., Carroll, A., & Bahr, N. (2013). Self-regulated learning: Key strategies and their sources in a sample of adolescent males. Australian Journal of Educational & Developmental Psychology, 13, 58–74.
Giblin, J., & Stefaniak, J. (2017). Achievement goal structure and type of assistance sought in an undergraduate classroom. Journal of Applied Instructional Design, 6(1), 33–42.
Hattie, J.
(2009). Visible learning. Routledge. Iran-Nejad, A., & Chissom, B. S. (1992). Contributions of active and dynamic self- regulation to learning. Innovative Higher Education, 17, 125–136. https://doi. org/10.1007/BF00917134 Karabenick, S. A. (2004). Perceived achievement goal structure and college student help-seeking. Journal of Educational Psychology, 96(3), 569–581. https://doi. org/10.1037/0022-0663.96.3.569 Karabenick, S. A. (2011). Classroom and technology-supported help seeking: The need for converging research paradigms. Learning and Instruction, 21(2), 290–296. Karabenick, S. A., & Dembo, M. H. (2011). Understanding and facilitating self- regulated help-seeking. New Directions for Teaching and Learning, 2011(126), 33–43. https://doi.org/10.1002/tl.442 Karabenick, S. A., & Knapp, J. R. (1991). Relationship of academic help-seeking to the use of learning strategies and other instrumental achievement behavior in college students. Journal of Educational Psychology, 83(2), 221–230. https:// doi.org/10.1037/0022-0663.83.2.221 Khaled, A., Gulikers, J., Biemans, H., & Mulder, M. (2016). Occurrences and quality of teacher and student strategies for self-regulated learning in hands-on simulations. Studies in Continuing Education, 38(1), 101–121. https://doi.org/10.108 0/0158037X.2015.1040751 Lynch, R., & Dembo, M. (2004). The relationship between self- regulation and online learning in a blended learning context. International Review of Research in Open and Distributed Learning, 5(2), 1–16. https://doi.org/10.19173/irrodl. v5i2.189
Makara, K., & Karabenick, S. (2013). Characterizing sources of academic help in the age of expanding educational technology: A new conceptual framework. In S. Karabenick & M. Puustinen (Eds.), Advances in help-seeking research and applications: The role of emerging technologies (pp. 37–72). Information Age Publishing.
Mercer, N., Dawes, L., Wegerif, R., & Sams, C. (2004). Reasoning as a scientist: Ways of helping children to use language to learn science. British Educational Research Journal, 30(3), 359–377. https://doi.org/10.1080/01411920410001689689
Nelson, H. G., & Stolterman, E. (2012). The design way: Intentional change in an unpredictable world (2nd ed.). The MIT Press.
Nelson-Le Gall, S. (1985). Help-seeking behavior in learning. Review of Research in Education, 12(1985), 55–90. Retrieved from http://www.jstor.org/stable/1167146
Oh, S., & Jonassen, D. H. (2007). Scaffolding online argumentation during problem solving. Journal of Computer Assisted Learning, 23(2), 95–110. https://doi.org/10.1111/j.1365-2729.2006.00206.x
Pintrich, P. R. (1999). The role of motivation in promoting and sustaining self-regulated learning. International Journal of Educational Research, 31(6), 459–470. https://doi.org/10.1016/S0883-0355(99)00015-4
Ryan, A. M., & Pintrich, P. R. (1997). "Should I ask for help?" The role of motivation and attitudes in adolescents' help-seeking in math class. Journal of Educational Psychology, 89(2), 329–341. https://doi.org/10.1037/0022-0663.89.2.329
Ryan, A. M., Pintrich, P. R., & Midgley, C. (2001). Avoiding seeking help in the classroom: Who and why? Educational Psychology Review, 93–114.
Sandoval, W. A., & Reiser, B. J. (2004). Explanation-driven inquiry: Integrating conceptual and epistemic scaffolds for scientific inquiry. Science Education, 88(3), 345–372. https://doi.org/10.1002/sce.10130
Tanaka, A., Murakami, Y., Okuno, T., & Yamauchi, H. (2002). Achievement goals, attitudes toward help-seeking, and help-seeking behavior in the classroom. Learning and Individual Differences, 13(1), 23–35. https://doi.org/10.1016/S1041-6080(02)00043-2
Tessmer, M., & Richey, R. C. (1997). The role of context in learning and instructional design. Educational Technology Research and Development, 45(2), 85–115. https://doi.org/10.1007/BF02299526
Van de Pol, J., Volman, M., & Beishuizen, J. (2011). Patterns of contingent teaching in teacher–student interaction. Learning and Instruction, 21(1), 46–57. https://doi.org/10.1016/j.learninstruc.2009.10.004
Winne, P. H., & Hadwin, A. F. (2008). The weave of motivation and self-regulated learning. In D. H. Schunk & B. J. Zimmerman (Eds.), Motivation and self-regulated learning: Theory, research, and applications (pp. 297–314). Lawrence Erlbaum Associates Publishers.
Wood, D. (2003). The why? What? When? And how of tutoring: The development of helping and tutoring skills in children. Literacy Teaching and Learning, 7(1&2), 1–30.
Wood, D., Bruner, J. S., & Ross, G. (1976). The role of tutoring in problem solving. Journal of Child Psychology and Psychiatry, 17, 89–100. https://doi.org/10.1111/j.1469-7610.1976.tb00381.x
Zimmerman, B. J. (1998). Academic studying and the development of personal skill: A self-regulatory perspective. Educational Psychologist, 33(2–3), 73–86.
Zimmerman, B. J. (2002). Becoming a self-regulated learner: An overview. Theory into Practice, 41(2), 64–70.
Zimmerman, B. J., & Martinez-Pons, M. (1990). Student differences in self-regulated learning: Relating grade, sex, and giftedness to self-efficacy and strategy use. Journal of Educational Psychology, 82(1), 51–59. https://doi.org/10.1037/0022-0663.82.1.51
9
MOTIVATIONAL DESIGN PRACTICES TO SUPPORT KNOWLEDGE ACQUISITION
Chapter Overview
Building upon other instructional strategies that have been discussed in this book to support knowledge acquisition, this chapter will emphasize motivational design strategies to support learners. Motivation and volition strategies will be discussed and aligned with earlier topics pertaining to needs assessment, learner analysis, and localization of context. Strategies for how motivational design models can serve as complementary overlay models to existing instructional design practices will be presented, and tools will be provided to help instructional designers intentionally design motivational activities.
Guiding Questions
1. What is motivational design?
2. What strategies can I incorporate in my instruction to motivate my learners?
3. How can contextual analysis be leveraged to support learner motivation?
4. What role does the instructional designer serve in motivating learners?
Considering Motivation in Instructional Design
A theme inherent in every chapter of this book is that the work we do as instructional designers should contribute to effectiveness, efficiency, and ease of learning. If we focus on this last goal, ease of learning, we should recognize that it is not just a matter of designing content that is organized and anchored to specific learning outcomes; learners' motivation to participate in the instruction is also a major contributor to their success. Many instructional design courses compartmentalize different parts of the instructional design process to manage the amount of content presented to learners. This is particularly important in introductory
DOI: 10.4324/9781003287049-9
instructional design courses. The literature describing the instructional design process consistently characterizes it as iterative. While several chapters in this book have placed emphasis on different aspects of instructional design, the discussions underscore that design is indeed iterative: everything we do as designers is part of an integrated process. As we gather information to truly understand our intended learning audience and attempt to pair instructional strategies with the goals of a project, we should also be thinking about the intersection between our design decisions, our activities, and motivation. While most instructional designers will say that they want to engage their learners, I am not sure how many of us actually stop and reflect at each phase or major milestone of our projects to ask ourselves, "To what extent am I motivating my learners?" or "How will this instructional activity motivate my learners to master the content?" Motivation is what gives individuals the energy or drive to complete a task (Seifert & Sutton, 2018). It is what drives a student to persevere and continue to practice despite not yet mastering the task. It is what prompts a learner to continue to seek help and resources because they are determined to understand how to complete a task independently. From an instructional lens, I have witnessed a spectrum of motivation among learners. There are learners who will read every resource provided to them, proactively seek feedback on their performance, and put in whatever time and effort it takes to master the task. They do not give up until they exceed expectations; anything less is unacceptable. I also have learners who will work hard, seek help from multiple channels, and persevere until they meet expectations. They are okay with "good enough." At the other end of the spectrum are learners who are not motivated or engaged.
They do not participate in instructional activities and perform poorly or not at all. These learners are somewhat perplexing because they leave the instructor wondering why they signed up for a course or training in the first place. It becomes even more perplexing when we find out that they signed up voluntarily and are still unmotivated. It is difficult to understand why some individuals are more motivated to learn than others. We know from learner analysis that a lot can depend on learners' perception of utility: do they believe that the instruction will be useful to them? We also know that motivation can wane when an employee is forced to attend training on something they will most likely never be required to use in their job. Children may lack motivation at the beginning of a new activity if they have encountered challenges with it in the past. Our chapters on the different types of analyses used in instructional design (i.e., needs, learner, and context) have emphasized the need to align individual, team, and organizational needs with instructional content.
Several of the contextual factors introduced by Tessmer and Richey (1997) in their breakdown of orienting, instructional, and transfer contexts refer to motivation in different aspects of instructional design. One of the interesting things about motivation is that while there are researchers in our field who focus specifically on studying motivational constructs such as achievement, self-efficacy, and self-regulation, motivation is not displayed prominently in our instructional design models and prescriptions. The purpose of this chapter is to explore motivational constructs that are relevant to instructional design. Emphasis will be placed on identifying different strategies we can employ in our design work to intentionally consider how motivation is being addressed. To date, only a couple of motivational design models have been introduced to the learning, design, and technology field. This chapter will provide an overview of those models and give recommendations for how a motivational lens can be imposed on existing instructional design frameworks. As mentioned in Chapter 5, of the different contexts that touch on instructional design, we, as designers, have the most control over the instructional context. By integrating instructional strategies that promote motivation, instructional designers can help strengthen the relationship between the orienting and transfer contexts. Likewise, by conducting learner analyses and developing personas that are representative of our learning audiences, we can attend to instructional activities that support a combination of intrinsic and extrinsic motivation. Intrinsic motivation is when an individual is driven to accomplish a task because of internal rewards instead of external pressures or influences (Deci & Ryan, 2000). In instructional activities, intrinsic motivation is at work when a learner completes a task for the satisfaction of knowing they have completed it.
Learners who are intrinsically motivated are typically interested in mastering content and view their abilities to accomplish tasks as being under their own control. Extrinsic motivation is when an individual is driven to achieve or complete a task as a result of external pressures or rewards. Examples of such rewards could include money, a high grade, or recognition among a group of people. Punishment or other negative consequences can also contribute to extrinsic motivation: a learner may be extrinsically motivated to master a task if punishment awaits anyone who fails the activity. In these situations, learners can be motivated by anticipated positive or negative consequences.
Motivational Theories in Instructional Design
Many motivational theories have been identified to describe and better understand human behavior. It is important to note that the motivational theories presented in this section are not an exhaustive list of motivational constructs; however, they are the most representative of instructional
design research and practice. This section provides an overview of five motivational theories: self-determination, expectancy value, self-efficacy, goal-setting, and attribution. Ways that instructional designers can integrate activities that promote these motivational theories will be discussed.
Self-Determination Theory
Self-determination theory was developed by Deci and Ryan in 1985 to better explain how motivation can be influenced by orientation. Self-determination theory recognizes that an individual's level of motivation related to completing a task will be influenced by the degree of their intrinsic and extrinsic motivation. "Intrinsic motivation refers to a disposition to engage in a task for one's inner pleasure" (Park, 2018). Examples of intrinsic motivation include a learner reading additional books about a topic that was introduced in class, merely out of pure interest. They are willing to continue exploring the topic regardless of any external rewards that may or may not be associated with completing the task. Extrinsic motivation "refers to the performance of an activity in order to attain some separable outcome and, thus, contrasts with intrinsic motivation, which refers to doing an activity for the inherent satisfaction of the activity itself" (Ryan & Deci, 2000, p. 71). An example of extrinsic motivation is a student studying hard because they want to earn a good grade on an exam. This is of particular importance when designing instruction. We cannot assume that sole responsibility for the motivation to learn falls on the learner or the instructor. Rather, a combination of factors contributes to the degree to which a learner is motivated intrinsically and extrinsically. Developing a learner profile that represents learners' prior knowledge, skills, and predispositions is important in that it establishes a baseline for the instructor to gauge their intrinsic willingness to participate and take an active role in their own learning. By developing an understanding of the learning audience, the instructor can integrate incentives and rewards that help learners see the utility of the instruction.
Self-determination theory specifies that a combination of intrinsic and extrinsic motivation contributes to satisfying three needs: autonomy, competence, and relatedness (Ryan & Deci, 2000). Autonomy accounts for an individual's desire to be in control of their life. In the context of instructional design and learner motivation, learners are more apt to be motivated if they feel they have autonomy over the decisions they are making regarding their education, training, and careers. Competence accounts for one's ability to be effective when interacting with the environment (White, 1959). Relatedness is one's willingness to interact with others and feel a sense of belonging (Baumeister & Leary, 1995). Another way for instructional designers to consider the extent to which they are connecting with their learners and supporting their abilities to
engage in meaning making is to examine the relationship between inclusive design practices and the needs for autonomy, competence, and relatedness proposed by Ryan and Deci (2000). As we gather information about our learning audience and begin to design and develop instructional materials, we should ask ourselves the following questions:
• How much control will my learners have in the experience?
• Will my learners have the ability to customize instruction based on their specific needs?
• Do my learners have the necessary training to be able to complete tasks successfully in a real-world environment?
• Do my learners see themselves represented in the examples, activities, and other instructional content?
• To what extent do my learners feel that they are part of a learning community?
• To what extent does the learning community extend to the other aspects of their world?
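A review against questions like these can be made systematic. The sketch below, a hypothetical design-review checklist (all criterion names and the equal weighting are illustrative assumptions, not taken from the chapter), scores a lesson plan against the three needs of self-determination theory:

```python
# Hypothetical design-review checklist based on self-determination
# theory (Ryan & Deci, 2000). Criteria and weighting are illustrative.
SDT_CHECKLIST = {
    "autonomy": [
        "Learners control pacing or sequence",
        "Learners can customize instruction to their needs",
    ],
    "competence": [
        "Tasks mirror real-world performance conditions",
        "Learners have the prerequisite training to succeed",
    ],
    "relatedness": [
        "Learners see themselves represented in examples",
        "Learners belong to a learning community",
    ],
}

def review_design(answers):
    """Given {criterion: True/False}, return the proportion of
    criteria met for each need in self-determination theory."""
    scores = {}
    for need, criteria in SDT_CHECKLIST.items():
        met = sum(1 for c in criteria if answers.get(c, False))
        scores[need] = met / len(criteria)
    return scores

# Example: a design that supports autonomy but neglects relatedness.
answers = {
    "Learners control pacing or sequence": True,
    "Learners can customize instruction to their needs": True,
    "Tasks mirror real-world performance conditions": True,
}
print(review_design(answers))
# {'autonomy': 1.0, 'competence': 0.5, 'relatedness': 0.0}
```

A low score on any one need flags a place where the design could better connect with learners; the point is the structured reflection, not the numbers themselves.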
Expectancy Theory
Expectancy theory, proposed by Victor Vroom (1964), suggests that individuals are motivated to complete tasks or perform a behavior in a particular way on the basis of their expectation that the outcomes will be favorable. Vroom's (1964) expectancy theory has three components related to performance: expectancy, instrumentality, and valence. Expectancy concerns the relationship between effort and performance. Individuals are more apt to be motivated to complete a task if they believe that the effort they put toward practicing will contribute to improved performance and attainment of goals. Learners are more likely to engage with the instructional content and participate in activities if they have the expectation that they will be successful. Instrumentality is the belief that the individual will receive a reward or be satisfied with the outcome if a desired state of performance is met. Instrumentality in instruction suggests that learners see alignment between the activities they are engaged in and a goal. If we think back to Zimmerman's (2002) self-regulated learning framework in Chapter 8, goals that are set by the learner during the forethought phase and aligned appropriately through environmental structuring will be perceived as more attainable compared with activities that are misaligned. Valence is the degree to which we place value on a task. Wigfield and Eccles (1992) suggest that individuals complete tasks based on the following types of value they attribute to the task: intrinsic value, attainment value, utility value, and cost. Intrinsic value is the enjoyment an individual feels while completing a task. Attainment value refers to the importance an individual associates with successfully completing a task. Utility value
TABLE 9.1 Alignment of Expectancy Values with Contextual Factors
Intrinsic value
  Orienting: learner task perception; learner role perception
  Instructional: learner task perception; learning supports; teaching supports
  Transfer: experiential background; transfer opportunities
Utility value
  Orienting: learner profile; goal-setting; perceived accountability
  Instructional: social support
  Transfer: perceived utility
Cost
  Orienting: incentives
  Instructional: rewards and values
Attainment value
  Orienting: utility perceptions
  Instructional: perceived resources
  Transfer: transfer coping strategy; incentives; transfer opportunities
is the perception an individual has that the task will be useful to them in the future. Examples of this are prevalent when considering the degree to which instruction is aligned with the transfer context. Learners are more motivated when their perception of utility is high and less motivated when it is low. A learner may not want to participate in or complete instructional activities if they do not see the relevance or usefulness of the content. Lastly, if an individual perceives that the costs associated with completing a task outweigh its benefits, they will be less motivated than when they judge the benefits to be greater than the costs. Table 9.1 provides an overview of the four expectancy values proposed by Wigfield and Eccles (1992) alongside Tessmer and Richey's (1997) contextual factors prevalent in orienting, instructional, and transfer contexts.
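Vroom's three components are commonly expressed multiplicatively, as motivational force = expectancy x instrumentality x valence; the chapter describes them qualitatively, so the sketch below is a supplementary illustration of that standard formulation (the numeric ranges chosen are an assumption):

```python
def motivational_force(expectancy, instrumentality, valence):
    """Vroom's (1964) expectancy theory in its usual multiplicative
    form: MF = E * I * V. Here expectancy and instrumentality are
    treated as beliefs in the range 0..1, and valence as a value
    rating that can be negative for undesired outcomes. Because the
    factors multiply, a zero anywhere drives motivation to zero:
    all three beliefs must be present at once."""
    return expectancy * instrumentality * valence

# A learner who believes effort pays off (E = 0.8), that success
# yields the outcome (I = 0.9), and who values the outcome (V = 1.0)
# has a motivational force of roughly 0.72.
print(motivational_force(0.8, 0.9, 1.0))

# The same learner who sees no value in the outcome is unmotivated,
# no matter how strong the other two beliefs are.
print(motivational_force(0.8, 0.9, 0.0))  # 0.0
```

This multiplicative structure is why alignment with the transfer context matters so much: strong expectancy and instrumentality cannot compensate for a task the learner perceives as useless.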
Self-Efficacy
Self-efficacy is our belief in our own abilities to complete a specific task successfully (Bandura, 1977). Several studies have found that students with high levels of self-efficacy perform better academically compared with students with low levels of self-efficacy (Chang et al., 2005; DeWitz et al., 2009). Gnoleba and colleagues (2018) argue that "students with high levels of self-efficacy are likely to have more pronounced self-regulatory skills and challenge themselves at advanced levels" (p. 9). Learners' self-efficacy influences learner role and learner task perceptions when participating in instruction. Learners who are confident in their capabilities related to task mastery are more likely to engage in instruction because they have positive perceptions associated with the tasks. Learners are apt to be less motivated to participate when they doubt their capabilities and
perceive the task to be too difficult. Researchers have also reported that students with higher levels of self-efficacy tend to take on more challenging tasks compared with those with lower levels (Bandura, 1997; Park, 2018; Park & Huynh, 2015; Pintrich & De Groot, 1990). Wigfield and Eccles (2000) differentiate between self-efficacy and expectancy in that self-efficacy is task-specific and expectancy is domain-specific. The following examples illustrate the difference between task and domain:
• A dancer considered to be good at ballet (domain-specific) versus an individual who is good at battements (task-specific).
• A student who is considered to be strong in computer science (domain-specific) versus a student who is strong at programming in C++ (task-specific).
When we consider how to support learners' motivation through different instructional activities, it is particularly important to strengthen learners' task-related self-efficacy so as to support their perceptions of utility in the transfer context.
Goal-Setting Theory
Individuals' ability to set and attain goals impacts their motivation to complete a task. Similar to expectancy theory, learners are more motivated to complete a task if they believe it will help them attain a goal (Locke & Latham, 1984). Locke and Latham (2002) suggest five principles related to goal-setting: clarity, challenge, commitment, feedback, and task complexity. The goal must be clear to the individual; in an instructional context, the learner must have a clear understanding of the goals they are attempting to achieve. The degree of difficulty also impacts goal-setting. If the learner perceives a task as too challenging, this will impact their ability to engage in appropriate goal-setting. It is also important to note the degree of task difficulty when setting goals, particularly when approximating the amount of time and practice it may take to reach task mastery. Learner commitment plays a significant role in goal-setting: learners need to be committed to the goal, and learner buy-in influences their perceptions of task utility. If learners perceive the utility of the task, they are more apt to engage positively in goal-setting and to put forth the necessary effort to attain their goals. To succeed in the goal-setting process, learners require regular feedback on their performance. Instructional designers should design checkpoints for learners to assess their performance and determine whether they are on track to attain their goals. Instructional designers and instructors can provide opportunities to give learners feedback to remediate performance as necessary and adjust
timelines that were initially assigned to goals accordingly. The complexity of the task will influence learners’ motivation to complete the task. More complex tasks should be broken down into manageable sections. Learners’ motivation will increase as they complete smaller milestones along the way and can see gradual progress being made toward goal attainment.
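The advice to break complex tasks into manageable sections with feedback checkpoints can be sketched as a simple planning helper. Everything below (the function name, the even time split, the checkpoint wording) is a hypothetical illustration, not a prescription from the chapter:

```python
# Hypothetical sketch: decompose a complex learning goal into
# milestones, following Locke and Latham's (2002) principles of
# clarity, challenge, and feedback. Each milestone ends in a
# checkpoint so learners can gauge progress toward the goal.
def plan_milestones(goal, subtasks, total_hours):
    """Split a goal into evenly budgeted milestones, each with a
    feedback checkpoint. Returns one plan entry per subtask."""
    per_task = total_hours / len(subtasks)
    return [
        {
            "goal": goal,
            "milestone": i + 1,
            "task": task,
            "hours": per_task,
            "checkpoint": f"Review progress after '{task}'",
        }
        for i, task in enumerate(subtasks)
    ]

plan = plan_milestones(
    "Master descriptive statistics",
    ["Summarize data", "Visualize distributions", "Interpret results"],
    total_hours=12,
)
print(len(plan), plan[0]["hours"])  # 3 4.0
```

In practice the time budget per milestone would be adjusted as feedback comes in, which is exactly the timeline revision the chapter describes.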
Attribution Theory
Attribution theory accounts for the causal reasoning behind why an individual succeeded or failed at a task (Weiner, 1986). Whether or not individuals' perceptions are accurate, it is important to remember that perceptions drive motivation. Attributions are based on three dimensions: locus, stability, and controllability (Weiner, 1986, 2010). The causes an individual attributes to a success or failure can be classified according to the location of the cause in relation to the individual. Stability considers the degree to which a cause is constant or intermittent. Controllability is the individual's perception of how much control they may have over a cause. Park (2018) notes that while learners can control the amount of effort they exert to complete a task, they cannot control the level of difficulty or complexity of the task. When designing instructional activities, instructional designers can integrate activities that promote self-regulation. During reflective exercises where learners set goals, they can identify the realities versus the needs for environmental structuring associated with the tasks and attempt to approximate the amount of time required for task mastery. As they engage in reflective activities during the forethought and self-evaluation phases, they can begin to attribute the causes contributing to their success or struggles during the learning experience.
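Weiner's three dimensions classify the causes learners commonly name. The sketch below encodes the textbook placements of four classic causes (effort, ability, task difficulty, luck) along locus, stability, and controllability; the `is_adaptive` helper and its threshold are illustrative assumptions:

```python
# Classification of common attributions along Weiner's (1986)
# three dimensions: locus, stability, and controllability.
COMMON_ATTRIBUTIONS = {
    "effort":          {"locus": "internal", "stability": "unstable", "controllable": True},
    "ability":         {"locus": "internal", "stability": "stable",   "controllable": False},
    "task difficulty": {"locus": "external", "stability": "stable",   "controllable": False},
    "luck":            {"locus": "external", "stability": "unstable", "controllable": False},
}

def is_adaptive(cause):
    """Hypothetical heuristic: attributions to controllable,
    unstable causes (like effort) leave room for improvement and
    tend to sustain motivation after a failure."""
    dims = COMMON_ATTRIBUTIONS[cause]
    return dims["controllable"] and dims["stability"] == "unstable"

print(is_adaptive("effort"))           # True
print(is_adaptive("task difficulty"))  # False
```

A reflective exercise could ask learners to name the cause of a recent struggle and then discuss where that cause falls along the three dimensions, nudging them toward effort-based attributions they can act on.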
What Is Motivational Design?
The term motivational design has evolved from John Keller's research exploring the relationship between instructional design and motivation. Keller's (1987) early work proposed that motivational strategies could be integrated systematically into an instructional design. Keller (2010) suggested that a motivational design process includes the following steps:
1. Obtain course information.
2. Obtain audience information.
3. Analyze audience.
4. Analyze existing materials.
5. List objectives and assessments.
6. List potential tactics.
7. Select and design tactics.
8. Integrate with instruction.
9. Select and develop materials.
10. Evaluate and revise. (p. 57)
At an initial glance at these ten steps, one finds a lot of similarities with the instructional design process. Several of the earlier steps place an emphasis on learner analysis and gradually move toward selecting appropriate tactics or strategies. Keller's (2008, 2010) work examining motivational design suggests that motivation can be a lens overlaid on the instructional design process. Keller's (2010) ARCS model suggests that a motivational design model exploring attention, relevance, confidence, and satisfaction can be used as an overlay model to enhance any instructional design model being referenced to guide a product. Keller suggested that instructional experiences could be enhanced by examining different instructional design tasks through a motivational design lens. Park (2018) adapted Keller's (2010) steps of motivational design as they relate to the following domains: analysis, design, development, and evaluation (Table 9.2).

TABLE 9.2 Elaborated Process of Motivational Design (Park, 2018)
Analyze
  1. Acquisition of course information
  2. Acquisition of audience information
  3. Acquisition of audience motivation
  4. Analysis of motivational tactics in existing materials
Design
  5. Description of motivational goals and assessment methods
  6. Identification of potential tactics
  7. Design of tactics
  8. Integration of motivational tactics with instructional plans
Develop
  9. Development of materials
Evaluate
  10. Evaluation of student reactions

Through these ten steps to guide motivational design, Keller (1987) proposed that a variety of motivational constructs could be sorted into four categories: attention, relevance, confidence, and satisfaction (ARCS). The ARCS model was developed and proposed to the field as a motivational design model. Keller's model proposed that the typical steps and tasks carried out through the instructional design process could be approached through a motivational lens that centered design activities on the four motivational categories. Keller's (2010) ARCS categories are described accordingly:
• Attention: the ability to capture the interest of the learners and gain their attention
• Relevance: the process of aligning instruction with meeting the personal needs of the learners
• Confidence: designing instructional experiences to help learners believe that they will be successful and attain their goals
• Satisfaction: reinforcing learners' sense of accomplishment through intrinsic and extrinsic rewards
Keller's (2010) motivational design model suggests considering attention, relevance, confidence, and satisfaction at each phase of the instructional design process. To motivate learners and promote their autonomy in the learning process, Keller recommends writing learning objectives that address these four categories. Writing motivational objectives provides a more learner-centered approach to instruction. The outcomes used to measure the attainment of those objectives are good indicators, to both the learner and the instructional design team, of the extent to which the learner is motivated, competent, and perceives the instruction to be useful. Keller's (2010) ARCS model can be considered an overlay model in that it can be paired with other instructional design models. An instructional designer may still refer to other instructional design models to guide their design and development of different instructional activities. The motivational design framework enables them to reflect upon their design work through a motivational lens to determine the extent to which they may connect with their learners. It is also important to note that, to date, the ARCS model is the only motivational design model in instructional design. Other motivational models exist to support instructional activities in various capacities, but Keller's (2010) model is the only one to integrate motivational constructs into individual components of the instructional design process. While I am not going to argue that we need an abundance of additional motivational design models, I will champion more research that examines the extent to which a motivational design approach can support the design of instruction that is inclusive, equitable, and accessible.
An extension of this research could examine the relationship between instructional designers' judgments and the motivational design strategies used in practice, develop a deeper understanding of how contextual factors relate to instructional designers' abilities to engage in motivational design, and consider what additional steps may be needed to support a motivational design approach to instruction. Research has shown that adopting a systematic approach to supporting learners' motivation is beneficial (Bekele, 2010; Keller & Thomas, 2018; Mayer, 2014). Despite the research that has demonstrated positive correlations between motivational strategies and learner achievement, Sung and Huang (2022) note that motivation is often overlooked in many instructional design models and processes. In a systematic review examining how motivational design was used in instructional design research, Sung and Huang (2022) reported a total of 29 studies that accounted for aspects of motivational design, such as learners' motivation and the relationship
MOTIVATIONAL DESIGN PRACTICES TO SUPPORT KNOWLEDGE ACQUISITION
between the learners’ motivation with engagement and learning outcomes. As a result of their systematic review, they recommend the following to continue the exploration of motivational design in instructional design practices: • Continued research and case studies that explore a systematic approach to motivational design. • Expanding the learning audiences in motivational design research. To date, most studies that have examined motivational design have been in higher education contexts. • Additional studies are needed to explore the utility of a systematic motivational design framework in diverse contexts. Sung and Huang (2022) note that “there is a noticeable absence of studies investigating influences of social experiences, cultural affiliation, economic status, and prior educational struggles of learners in a time when online learning is being increasing diverse.” • Longitudinal studies that can provide insights into learners’ motivation over an extended period of time. These studies could also support a deeper understanding of the role that motivation plays as learners transfer knowledge they have acquired to real-world contexts.
Summary This chapter provided an overview of how different motivational constructs can influence our work as designers. Motivational theories such as self-determination theory, self-efficacy, expectancy theory, goal-setting, and attribution theory were discussed. Chapter 10 continues the exploration of supporting our learners’ needs through attending to infrastructure.
Connecting Process to Practice Activities

1. In what ways have you been successful at supporting learners' motivation in instructional materials you have developed? What challenges have you experienced addressing motivation?
2. You have recently been hired as an instructional designer for a training and development company that produces off-the-shelf e-learning training courses for workplace safety. Without having any direct connection to or knowledge of your learners, how might you incorporate activities to motivate them throughout the course?
3. Patricia is a middle school teacher who teaches remedial math. Her students have been struggling and complain that they will never be able to use concepts like geometry in real life. With the ARCS model, what suggestions might you offer Patricia to help promote attention, relevance, confidence, and satisfaction in her math lessons?
4. Think about a task that you successfully completed and a task that you did not complete successfully. If you were to examine those situations in terms of Victor Vroom's (1964) expectancy theory, how would you describe the expectancy, instrumentality, and valence in each of those situations?
5. What role do you think goal-setting could play in supporting learners who have attributed their failure to complete a task to a lack of time and insufficient feedback provided by their supervisor?
Bridging Research and Practice

Cheng, Y. C., & Yeh, H. T. (2009). From concepts of motivation to its application in instructional design: Reconsidering motivation from an instructional design perspective. British Journal of Educational Technology, 40(4), 597–605. https://doi.org/10.1111/j.1467-8535.2008.00857.x
Efklides, A. (2011). Interactions of metacognition with motivation and affect in self-regulated learning: The MASRL model. Educational Psychologist, 46(1), 6–25. https://doi.org/10.1080/00461520.2011.538645
Hardré, P. L. (2003). Beyond two decades of motivation: A review of the research and practice in instructional design and human performance technology. Human Resource Development Review, 2(1), 54–81.
Vansteenkiste, M., Simons, J., Lens, W., Sheldon, K. M., & Deci, E. L. (2004). Motivating learning, performance, and persistence: The synergistic effects of intrinsic goal contents and autonomy-supportive contexts. Journal of Personality and Social Psychology, 87(2), 246. https://doi.org/10.1037/0022-3514.87.2.246
Zhao, B. (2011). Learning from errors: The role of context, emotion, and personality. Journal of Organizational Behavior, 32(3), 435–463. https://doi.org/10.1002/job.696
References

Bandura, A. (1977). Self-efficacy: Toward a unifying theory of behavioral change. Psychological Review, 84(2), 191. https://doi.org/10.1037/0033-295X.84.2.191
Bandura, A. (1997). Self-efficacy: The exercise of control. Freeman.
Baumeister, R. F., & Leary, M. R. (1995). The need to belong: Desire for interpersonal attachments as a fundamental human motivation. Psychological Bulletin, 117(3), 497–529. https://doi.org/10.1037/0033-2909.117.3.497
Bekele, T. A. (2010). Motivation and satisfaction in internet-supported learning environments: A review. Journal of Educational Technology & Society, 13(2), 116–127.
Chang, M. J., Chang, J. C., & Ledesma, M. C. (2005). Beyond magical thinking: Doing the real work of diversifying our institutions. About Campus, 10(2), 9–16. https://doi.org/10.1002/abc.124
Deci, E. L., & Ryan, R. M. (1985). Intrinsic motivation and self-determination in human behavior. Plenum.
Deci, E. L., & Ryan, R. M. (2000). The "what" and "why" of goal pursuits: Human needs and the self-determination of behavior. Psychological Inquiry, 11(4), 227–268. https://doi.org/10.1207/S15327965PLI1104_01
DeWitz, S. J., Woolsey, M. L., & Walsh, W. B. (2009). College student retention: An exploration of the relationship between self-efficacy beliefs and purpose in life among college students. Journal of College Student Development, 50(1), 19–34.
Gnoleba, M. A., Kitsantas, A., & Hiller, S. E. (2018). Exploring faculty-student interactions, academic self-efficacy, perceived responsibility, and academic achievement of college students. In J. E. Stefaniak (Ed.), Self-regulated learners: Strategies, performance, and individual differences. Nova Science Publishers.
Keller, J. M. (1987). Development and use of the ARCS model of instructional design. Journal of Instructional Development, 10(3), 2–10. https://doi.org/10.1007/BF02905780
Keller, J. M. (2008). An integrative theory of motivation, volition, and performance. Technology, Instruction, Cognition, and Learning, 6(2), 79–104.
Keller, J. M. (2010). Motivational design for learning and performance: The ARCS model approach. Springer.
Keller, J. M., & Thomas, J. W. (2018). An application of the ARCS model of motivational design. In C. M. Reigeluth (Ed.), Instructional theories in action (pp. 289–320). Routledge.
Locke, E. A., & Latham, G. P. (1984). Goal-setting: A motivational technique that works! Prentice-Hall.
Locke, E. A., & Latham, G. P. (2002). Building a practically useful theory of goal-setting and task motivation: A 35-year odyssey. American Psychologist, 57(9), 705–717. https://doi.org/10.1037/0003-066X.57.9.705
Mayer, R. E. (2014). Incorporating motivation into multimedia learning. Learning and Instruction, 29, 171–173. https://doi.org/10.1016/j.learninstruc.2013.04.003
Park, S. (2018). Motivation theories and instructional design. In R. E. West (Ed.), Foundations of learning and instructional design technology. EdTech Books. https://edtechbooks.org/lidtfoundations/motivation_theories_and_instructional_design
Park, S. W., & Huynh, N. T. (2015). How are non-geography majors motivated in a large introductory world geography course? Journal of Geography in Higher Education, 39(3), 386–406. https://doi.org/10.1080/03098265.2015.1048507
Pintrich, P. R., & De Groot, E. V. (1990). Motivational and self-regulated learning components of classroom academic performance. Journal of Educational Psychology, 82(1), 33–40. https://doi.org/10.1037/0022-0663.82.1.33
Ryan, R. M., & Deci, E. L. (2000). Intrinsic and extrinsic motivations: Classic definitions and new directions. Contemporary Educational Psychology, 25(1), 54–67. https://doi.org/10.1006/ceps.1999.1020
Seifert, K., & Sutton, R. (2018). Motivation theories on learning. In R. E. West (Ed.), Foundations of learning and instructional design technology: The past, present, and future of learning and instructional design technology. EdTech Books. https://edtechbooks.org/lidtfoundations/motivation_theories_on_learning
Sung, J. S., & Huang, W. D. (2022). Motivational design for inclusive digital learning innovation: A systematic literature review. The Journal of Applied Instructional Design, 11(2). https://doi.org/10.51869/112/jsswdh
Tessmer, M., & Richey, R. C. (1997). The role of context in learning and instructional design. Educational Technology Research and Development, 45(2), 85–115. https://doi.org/10.1007/BF02299526
Vroom, V. H. (1964). Work and motivation. Wiley & Sons.
Weiner, B. (1986). An attributional theory of motivation and emotion. Springer.
Weiner, B. (2010). The development of an attribution-based theory of motivation: A history of ideas. Educational Psychologist, 45(1), 28–36. https://doi.org/10.1080/00461520903433596
White, R. W. (1959). Motivation reconsidered: The concept of competence. Psychological Review, 66(5), 297–333. https://doi.org/10.1037/h0040934
Wigfield, A., & Eccles, J. S. (1992). The development of achievement task values: A theoretical analysis. Developmental Review, 12(3), 265–310. https://doi.org/10.1016/0273-2297(92)90011-P
Wigfield, A., & Eccles, J. S. (2000). Expectancy–value theory of achievement motivation. Contemporary Educational Psychology, 25(1), 68–81. https://doi.org/10.1006/ceps.1999.1015
Zimmerman, B. J. (2002). Becoming a self-regulated learner: An overview. Theory into Practice, 41(2), 64–70.
10 ATTENDING TO INFRASTRUCTURE FOR IMPLEMENTATION
Chapter Overview This chapter will expand upon concepts introduced in Chapter 3 (needs assessment) and Chapter 5 (localization of context) to discuss the importance of advocating for appropriate infrastructure when designing instruction. It is the responsibility of instructional designers to be knowledgeable about various non-instructional interventions that may be needed to support the implementation of instruction. Examples of non-instructional interventions include, but are not limited to, organizational design, job analysis, task analysis, communication and feedback systems, and knowledge management. This chapter will emphasize the relationship between learning design and human performance technology and provide tools for instructional designers to use when planning to implement their projects within an organization.
Guiding Questions

1. What is human performance technology?
2. What synergies exist between human performance technology and instructional design?
3. Why is it important to adopt a "systems view" with my projects?
4. How can I advocate for appropriate infrastructure to support my projects?
One particular theme that is emphasized throughout this book is the importance of taking a "big picture" view of the design situations we find ourselves in. With instructional design reaching so many different industries and contexts, the roles of instructional designers throughout the typical instructional design (Analyze, Design, Develop, Implement, and Evaluate, or ADDIE) process are variable. Some instructional designers are involved from the conception of a project to its completion. Other times, instructional designers may be responsible only for the initial design aspects
DOI: 10.4324/9781003287049-10
of a project while other instructional designers will take care of multimedia and development. While every instructional designer will attest to the importance of having a sound infrastructure to support training, their varying roles in the instructional design process can make it difficult to advocate and ensure that appropriate infrastructure is in place.
What if we did not consider infrastructure? I once worked for a health-care organization that had decided to transition their annual mandatory training modules from face-to-face sessions held in a classroom to a series of e-learning modules. Employees received notice that they were responsible for logging into the organization’s intranet and completing the modules by a specific deadline. Employees were not able to log in from a computer outside of the organization. What the human resources department forgot to do was figure out how employees who had jobs that did not require them to use computers were going to access the modules. Employees such as janitorial staff, housekeeping, and cafeteria attendants had no means to access the modules. This is an example of a failed infrastructure.
The examples provided in this chapter of things that go wrong when we do not attend to infrastructure are only a few of many. Infrastructure is something that we take for granted. We do not usually realize the infrastructure is flawed until it halts our progress on instructional design projects. A goal of this chapter is to explore ways in which instructional designers can advocate for sound infrastructure when collaborating with others on projects. We could design the best piece of instruction, but if we have not given consideration to what infrastructure is needed, it will be dead on arrival. Infrastructure can mean many different things. Chapter 3 discussed the pivotal role that needs assessment can play in assessing any potential gaps or challenges that may impact instructional programming. This chapter continues those topics regarding analysis but implores instructional designers to take a systemic view of the organization or setting that they are designing within.
What about organizational development? I have been involved with several projects and have had instructional design friends share similar stories where issues abound when it was discovered that policies and procedures that were being used as reference materials for
trainings were outdated. Job descriptions, outlining roles and responsibilities, had not been updated in several years and did not account for changes to technologies and other key competencies. This ultimately required the instructional design team to have to pause their work and collaborate with others in the organization to make the necessary updates to the procedures that employees were to be taught.
To prevent these catastrophes, instructional designers can advocate for the necessary resources and tools that contribute to a solid infrastructure to support their instructional projects. While meeting with clients, managers, and fellow collaborators, instructional designers should ask the following questions:
• Who will be involved in the launch of the instructional course/program?
• What information technology support do I need if I plan to host e-learning modules in our learning management system?
• Have the changes that are being made to policies and procedures been communicated to employees and other stakeholders?
• Who, outside of instructional design, needs to be involved in implementation?
• Will the course/program impact other initiatives that may be occurring within the organization?
While there are certain questions that most instructional designers would agree should be asked at any initial project meeting, every situation is going to be unique, bringing forth its own nuances and challenges. To be prepared to ask the questions that ensure infrastructure is factored into decisions, an instructional designer first has to know which questions to ask. By taking a broader and more systemic view of the situation, the instructional designer can frame questions that are specific to the needs of the situation. There is an expression in carpentry that it is better to measure twice and cut once. A similar mindset can be applied to our work as instructional designers. By putting in some extra time at the beginning to observe and reflect upon the organization that we are working with and how our instruction is going to contribute to the organization's goals in terms of facilitating learning and improving performance, we can plan ahead for challenges that may be inherent to implementation.
Human performance technology requires instructional designers to step back from their design activities and responsibilities and really look at the systemic implications pertaining to their work.
Examining Infrastructure through Human Performance Technology

Human performance technology is "the study and ethical practice of improving productivity in organizations by designing and developing effective interventions that are results-oriented, comprehensive, and systemic" (Pershing, 2006, p. 6). In recent years, some scholars in our field have started using the term human performance improvement instead of human performance technology because of the misconception that technology must mean the use of a computer or other highly technological device. I still gravitate toward the term human performance technology because the word technology emphasizes tools to be used to solve problems. When looking at projects supporting human performance, technology implies a systematic and systemic approach to solving problems (Pershing, 2006). Human performance technology can support the instructional design process by taking a broader systems view of the organization, the system, and all its components. Foshay and colleagues (2014) describe a systems view as the following:
1. It is holistic.
2. It focuses primarily on the interactions among the elements rather than the elements themselves.
3. It views systems as "nested," with larger systems made up of smaller ones. (Foshay et al., 2014, p. 42)
If we were to think about the organizations, schools, and other environments that we design instruction for as systems, we could quickly begin to see that these environments are quite complex with many moving parts. Every system, regardless of type, has a purpose. Every individual in the organization would represent a component in the system. Every tool that we use to deliver instruction could represent a component in the system. Every teacher…every instructional designer…every learner could represent a component in the system. It is not just a matter of having all of these individuals and objects within a system.
We also must be mindful of the degree of interactivity that occurs between these people, objects, and processes. Some interactions may contribute to the productivity of a system, while others may be detrimental to the system’s existence. If we were to take a step back and view the larger picture of the organization and the interactions (or lack thereof) between the components in the system, we could begin to see to what extent the system is achieving its purpose. The study of human performance technology proposes a systematic process for assessing situations. Figure 10.1 illustrates how this process supports the goals of instructional design projects. Performance analysis
FIGURE 10.1 The Human Performance Technology Process in Relation to Instructional Design Project Goals. [The figure depicts the stages of the human performance technology process (performance analysis with needs assessment and analysis, intervention selection, intervention design and development, intervention implementation, and intervention maintenance) supporting the instructional design project goals of facilitating learning and improving performance.]
is a critical part of human performance technology. In order for anyone to make decisions regarding performance improvement, they must collect sufficient data that informs their understanding of the situation and the problems they are trying to solve (Van Tiem et al., 2012). Given that the definition of instructional design offered by Richey and colleagues (2011) refers to instructional design as being a process that facilitates learning and improves performance, undertaking performance analysis should be a priority for instructional designers. Needs analysis is an umbrella term that encompasses any analyses that will explore organizational goals, needs, and processes, employee or learner performance (or both), and causal analysis. Causal analysis is often used synonymously with needs analysis to understand what is contributing to (causing) a discrepancy in performance at an organization, group, or individual level. Chapter 3 discussed the challenges we face advocating for needs assessments in instructional design work as most often clients have already identified the need before they have hired us. It also provided guidelines for how we can scale needs assessment activities when we may not be provided the opportunity to conduct a thorough needs assessment with multiple data points. Arrington and colleagues (2022) suggest that “data should be collected that will inform three facets of the problem: (1) the environment, (2) the performance gap, and (3) potential causes” (p. 163). Chapter 3 provides several strategies for engaging in data analysis to inform instructional designers on how to identify gaps and discrepancies in performance and the contributing factors of those gaps. The emphasis of this chapter is to think about the role that infrastructure plays in the implementation of our instructional design activities and materials. Given the inevitable realities of our line of work, there are ways we
TABLE 10.1 Questions that Inform Understanding of the Problem

The Environment
• What is the culture of the organization?
• What perceptions do organizational members (i.e., employees, students, etc.) have of training?
• Are there political issues within the organization that may impact the implementation of training and instruction?

The Performance Gap
• What are the gaps in performance?
• How do these gaps impact the success of the organization?
• Who, in the organization, is impacted by these gaps?
• What are the expectations for the instructional design activities related to these identified gaps?
• How long have these performance gaps existed?
• Has training previously been developed to address the issues?

Potential Causes
• Have causes contributing to the performance gaps been identified?
• What contextual factors are contributing to these gaps?
• What is the time line for addressing these gaps through training?
• Have the causes been communicated to all individuals within the organization?
can gather additional information that will contribute to our abilities to view our situation through a systems lens. We can build upon the suggestions of Arrington and colleagues (2022) and structure preliminary questions around those three facets as outlined in Table 10.1. The implementation phase of the instructional design process is one that does not always receive the attention that is warranted. Quite a bit of attention goes into the development of materials as these are tangible products of instructional design. We often think of implementation as the process of imparting or handing these deliverables off to our learning audience. We do not always think about what components need to be aligned in the system to support that hand-off. Most organizations will not realize the attention that is needed until they experience a failed hand-off. Even if we are unable to go through a process that allows us to collect data to address these three facets, any information we can obtain at the start of the project is helpful. What is most important is that we can start asking questions that will prompt our clients or management (or both) to begin thinking about additional resources that may be needed to support implementation. The information that is gleaned from assessing the needs and interactions between components in the system directly informs and
impacts the types of interventions that may be selected to eradicate those identified areas needing improvement. The application of human performance technology promotes exploring a variety of interventions, both instructional and non-instructional, when seeking appropriate solutions to solve problems. Particular emphasis is placed on reviewing the alignment and systemic relationship between interventions within a system. This is particularly important when planning ahead for the implementation of programming and prioritizing which interventions should be implemented first. Van Tiem and colleagues (2012) offer the following classifications for interventions:
• Learning
• Performance support
• Job analysis/work design
• Personal development
• Human resource development
• Organizational communication
• Organizational design and development
• Financial systems
Table 10.2 provides an overview of common non-instructional interventions that can be used to support instructional design efforts (Stefaniak, 2018).

TABLE 10.2 Non-instructional Strategies Used to Support Instructional Design

Job analysis: Up-to-date job descriptions with complete task analyses will provide a detailed account of how to perform tasks conveyed in training.
Organizational design: A plan that outlines the organizational infrastructure of a company. Details are provided to demonstrate how different units interact and function with one another in the organization.
Communication planning: Plans that detail how new initiatives or information are communicated to employees. Examples may include listservs, company newsletters, training announcements, performance reviews, and employee feedback.
Feedback systems: Detailed plans to provide employees with feedback on their work performance. This information may be used to identify individual training needs and opportunities for promotion.
Knowledge management: Installation of learning management systems to track learning initiatives throughout the organization. Electronic performance support systems are used to provide just-in-time resources to employees.

Regardless of the intervention being implemented, the most successful implementation maximizes organizational and human benefit, minimizes organizational and human distress, and is reasonable in terms of the amount of time and budgetary support needed (Dormant, 1992). Any time we make a decision resulting in a change in behavior and performance, we consider the advantages and disadvantages. When preparing to implement the deliverables for an instructional design project, we might think of the value they will bring to the organization (or system). We can do this by thinking about them in terms of relative advantage, simplicity, compatibility, modifiability, and the social impact they may have on members of the system as they enact change (Dormant, 1992). Most interventions that are not successful fail because they lack relative advantage, ease of learning, ease of understanding, ease of use, compatibility, and modifiability. Additionally, interventions are rejected or fail if they impose a sizable social impact on the system. A good intervention is one that members of the organization are committed to supporting and that is feasible, sustainable, and cost-effective. According to Spitzer (1992), every intervention should
• Be based on a comprehensive understanding of the situation
• Be carefully targeted
• Have a sponsor
• Be designed with a team approach
• Be cost-effective.
When evaluating what infrastructure is needed or already in place to support an intervention, you can use Table 10.3 to determine the feasibility and appropriate ordering of such interventions. Spitzer (1992) identified 11 criteria for determining whether an intervention is successful:
TABLE 10.3 Identifying Support Needed for Implementing Interventions
[A blank planning worksheet with columns for: Intervention; Sponsor; Commitment Required; Resources Needed; Time Line for Implementation; Additional Interventions Needed; Prioritization for Implementation (Low, Medium, High)]
1. Design should be based on a comprehensive understanding of the situation. This is where previous performance and cause analyses come together. The more information we have available to us about the situation, the better able we are to customize our instruction. By approaching our design projects with a systems thinking view, we can try to anticipate what aspects of an organization our instruction will reach. By understanding the causes that are contributing to gaps in performance, we can specifically target those areas in our instruction and establish a plan to evaluate upon implementation.
2. Interventions should be carefully targeted. Target the right people, in the right setting, and at the right time. Time is an extremely important factor to consider when preparing to implement a course or program. In many cases, it does not get the full attention it deserves. I think teams do a good job discussing how much time it will take to complete development and upload materials, but they do not always consider how the timing of training relates to other initiatives in the organization. Questions the design team may consider when preparing a schedule for implementation include the following:
• Does the implementation of this initiative conflict with other initiatives that are in progress or are about to be launched?
• Do members of the organization have time to commit to training at the time of the scheduled implementation?
• How does the timing of this program line up with the organization's annual performance reviews and goals?
For instance, if you were preparing to implement a professional development training series for teachers in a local school district in the early spring, there may be a lack of participation. The school may already be occupied with implementing state-mandated initiatives to adhere to curriculum guidelines. Teachers will be in intense preparation sessions with their students, preparing them for quarterly examinations.
The idea of participating in one more initiative might be overwhelming for some. Taking the time to see whether there are any negative impacts on other existing initiatives in the system can help you to ensure that your audience is not overburdened with other responsibilities and priorities. It also helps to improve the likelihood that they will be more present and engaged with your initiative. 3. An intervention should have a sponsor. A sponsor is someone who will champion the activity. A program will not be successful if it does not have a dedicated sponsor. A sponsor is someone who can communicate to others within the organization, obtain buy-in from the necessary people, and help to lead change by example. Even if we, as the instructional designers,
are dedicated to the project, long-term sustainability is unlikely without a sponsor championing it. It is important to have the right people on board with the project. The right people are individuals within the organization who know about the issues, care about the issues, and can make the change happen (Cavanagh & Chadwick, 2005). It is not just a matter of finding an individual who knows and understands that there are discrepancies in performance; they need to care enough about the issues to have a vested interest in committing to supporting the implementation. Committing to the project should not just involve allowing the project to take place; rather, a true project champion will be meeting with individuals within the organization and preparing them in advance for the launch so they can anticipate it. Ideally, the champion is in a position of power from which they can help persuade others in the organization to participate both by leading through example and by enforcing new policies and procedures that may emerge.
4. Interventions should be designed with a team approach. The ability to draw upon expertise from all areas of the organization is vital to successful intervention selection. While the actual design of the intervention may fall to a select few individuals, it is important to obtain buy-in from the necessary stakeholders who will be significantly impacted by changes that result from the intervention. Even if you are not invited to planning meetings with senior leadership, you can still advocate for formative feedback on the interventions prior to moving into development. This will help ensure that some additional people have reviewed the materials you are designing and can ask questions about implementation. A needs analysis can help identify which stakeholders should be consulted for feedback throughout the design of the intervention.
In the event that a needs analysis is not conducted, you can still inquire during the initial kick-off meeting as to who should be asked to provide feedback on your designs. 5. Intervention design should be cost-sensitive. Any time an organization decides to move forward with implementing an intervention, either instructional or non-instructional, it is important that they consider the return on investment. Will the cost of designing and implementing the intervention save the organization more in the long term? Sometimes, organizations must assess the extent to which they may be losing money by not offering training to their employees. Other times, an organization may find that while an intervention may solve some annoyances or flaws in the system, it is not worth the cost it would take to design and implement. In these cases, the organization will leave everything as is.
6. Interventions should be designed on the basis of comprehensive, prioritized requirements, based on what is most important to both the individual and the organization. This sixth principle is particularly tied to the overarching mission and goals of the organization. As mentioned earlier in this chapter, every system, regardless of what it is, has a purpose. Any intervention that is designed for a system should be designed to support the system’s purpose. In the case of systems involving groups of people such as companies, schools, classrooms, and teams, organizational analyses can help inform the design team of the system’s purpose and priorities. Knowing this information helps us align our interventions with the goals of the organization, group, or individual. A good indicator that an intervention will not be successful is a lack of alignment. If you, as the designer, do not have a clear understanding of what those goals may entail, that should be an alert to you that more information is needed from the client. 7. A variety of intervention options should be investigated because the creation of a new intervention can be costly. While this seventh principle is important, it can become quite challenging for instructional designers to adhere to depending on their role in the organization. If you are in a position where you have a strong voice at planning meetings to discuss instructional programming and needs, it is easier to advocate for or be involved in the exploration and prioritization of different interventions. Other times, someone else has identified that the intervention is needed and has forwarded the design needs to you to complete the work. When this is the case, it does not hurt to ask the client whether other interventions were explored. By doing this, we can gather some additional information to understand the rationale for why instruction was decided upon as the optimal solution.
Chapter 2 of this book discusses the role that decision-making plays in our instructional design process. We differentiate between rational and dynamic decision-making. In the event that you are tasked with reviewing different possibilities for interventions, it is helpful to impose some parameters and guidelines for how much time will be allocated to this task. When we think about rational decision-making, more time tends to be allotted for reviewing different options, weighing those options, considering alternative solutions, and then moving forward with a recommendation. Typically, rational decision-making takes more time than dynamic decision-making, but it is important to ensure that rational decision-making does not drag on so long that options become obsolete. If and when you are tasked with reviewing different possible interventions, it is important to know and impose some constraints
on the search. You can think of these constraints as boundaries to manage the “design space” for your project. How much money do you have to contribute to an intervention? How much time? How much customization is needed to meet the needs of the organization? It is important to think about the three constructs that should be considered for any solution to a project: time, quality, and money. 8. Interventions should be sufficiently powerful. Consider long-term versus short-term effectiveness. Use multiple strategies to effect change. One of the things that I particularly like about human performance technology is that it promotes individuals coming together to have these important discussions surrounding implementation of interventions and the needs of the organization. Sometimes, we can feel like we are alone on an instructional design island within an organization. This is especially true if we are the sole instructional designer. The activities that are used to examine situations through a human performance technology lens require individuals to conduct the necessary analyses before making decisions. These prescriptions require multiple perspectives and views. I find that integrating a human performance technology mindset into your design work can be very beneficial because it can help inform the different questions you may ask regarding your role as the instructional designer. This principle promoting consideration of long-term versus short-term effectiveness emphasizes the need for planning, aligning interventions, and evaluative strategies. If an organization is planning to implement an intervention that they know will take a considerable amount of time before sufficient change occurs, it is helpful to acknowledge these timelines within the instructional design materials and the implementation of the interventions.
For instance, an organization may choose to adopt a phased training initiative that is spread out over an extended period of time to help individuals build new skills and adopt change gradually. Evaluative instruments could be integrated to measure summative performance immediately following completion of training, along with instruments to measure the success of the change after some time has passed. Conversations around evaluation not only inform the types of evaluative assessments we may need to consider for the instructional materials we are developing but also begin the important conversations about how we can measure the effectiveness and sustainability of these interventions over time. 9. Interventions should be sustainable. Thought must be given to institutionalizing the intervention over time. To really be successful, the intervention must become
ingrained in the organization’s culture. This coincides with a previous principle advocating for a project champion. As instructional designers, we can ask questions and make recommendations regarding what types of support may be needed for the learners and for maintaining or updating instructional content after the materials have been developed and implemented. 10. Interventions should be designed with viability of development and implementation in mind. An intervention needs human resources and organizational support. When instructional modules are designed as part of routine employee professional development, the human resources or employment development department should be consulted to identify what processes are necessary to document that individuals within the organization have completed training. While the instructional designer will not need to know the specifics of every process, knowing how materials will be stored or launched can help them determine how materials, particularly e-learning modules, can be packaged and uploaded to a learning management system. Individuals with a supervisory role in an organization should know where instructional materials can be accessed and how participation will be documented. If training is being conducted as a job requirement, appropriate documentation is needed for human resources and to inform management during regularly scheduled performance reviews and evaluations. 11. Interventions should be designed using an iterative approach. This occurs during the formative evaluation stage, when multiple revisions will generate interventions that fit the organization. An iterative approach to designing non-instructional interventions is very similar to the iterative approach we employ in the instructional design process. Opportunities are given to seek feedback and make adjustments.
Adopting human performance technology principles to support your instructional design practices can help you refine your approach to iterative design by thinking about additional aspects of the system beyond the immediate design needs. Again, it positions you to view the whole picture.
Summary This chapter presented an overview of human performance technology and discussed how it can be used to support instructional design practices. Examples of non-instructional interventions were discussed. Chapter 11 builds upon the principles of human performance technology by examining how we can assess the sustainability of instruction in organizations.
Connecting Process to Practice Activities 1. Think about the most recent instructional design project you completed. What non-instructional interventions were needed to support the implementation of the project? Who was involved? What would have happened if those non-instructional interventions were not in place? 2. Differentiate between human performance technology and instructional design. How do these two fields of practice differ? How do they complement each other? 3. Take a moment to reflect on your own instructional design process. To what extent do you incorporate human performance technology principles? Are there any areas of improvement you would like to consider as you hone your craft as an instructional designer?
Bridging Research and Practice Arrington, T. L., Moore, A. L., Steele, K., & Klein, J. D. (2022). The value of human performance improvement in instructional design and technology. In J. Stefaniak & R. M. Reese (Eds.), The instructional design trainer’s guide: Authentic practices and considerations for mentoring ID and ed tech professionals (pp. 161–169). Routledge. Asino, T. I., Giacumo, L. A., & Chen, V. (2017). Culture as a design “next”: Theoretical frameworks to guide new design, development, and research of learning environments. The Design Journal, 20(sup1), S875–S885. https://doi.org/10.1080/14606925.2017.1353033 Fox, E. J., & Klein, J. D. (2003). What should instructional designers & technologists know about human performance technology? Performance Improvement Quarterly, 16(3), 87–98. https://doi.org/10.1111/j.1937-8327.2003.tb00289.x Giacumo, L. A., & Asino, T. I. (2022). Preparing instructional designers to apply human performance technology in global context. In J. Stefaniak & R. M. Reese (Eds.), The instructional design trainer’s guide: Authentic practices and considerations for mentoring ID and ed tech professionals (pp. 170–179). Routledge. Giacumo, L. A., & Breman, J. (2021). Trends and implications of models, frameworks, and approaches used by instructional designers in workplace learning and performance improvement. Performance Improvement Quarterly, 34(2), 131–170. https://doi.org/10.1002/piq.21349 Peters, D. J. T., & Giacumo, L. A. (2019). A systematic evaluation process: Soliciting client participation and working in a cross-cultural context. Performance Improvement, 58(3), 6–19. https://doi.org/10.1002/pfi.21845
References Arrington, T. L., Moore, A. L., Steele, K., & Klein, J. D. (2022). The value of human performance improvement in instructional design and technology. In J. Stefaniak & R. M. Reese (Eds.), The instructional design trainer’s guide: Authentic practices and considerations for mentoring ID and ed tech professionals (pp. 161–169). Routledge. Cavanagh, S., & Chadwick, K. (2005). Health needs assessment: A practical guide. Health Development Agency.
Dormant, D. (1992). Implementing human performance technology in organizations. In H. D. Stolovitch & E. J. Keeps (Eds.), Handbook of human performance technology: A comprehensive guide for analyzing and solving performance problems in organizations (pp. 167–187). Jossey-Bass. Foshay, W. R., Villachica, S. W., & Stepich, D. A. (2014). Cousins but not twins: Instructional design and human performance technology in the workplace. In J. M. Spector, M. D. Merrill, J. Elen, & M. J. Bishop (Eds.), Handbook of research on educational communications and technology (4th ed., pp. 39–49). Springer. Pershing, J. A. (2006). Human performance technology fundamentals. In J. A. Pershing (Ed.), Handbook of human performance technology (3rd ed., pp. 5–26). Pfeiffer. Richey, R. C., Klein, J. D., & Tracey, M. W. (2011). The instructional design knowledge base: Theory, research, and practice. Routledge. Spitzer, D. R. (1992). The design and development of effective interventions. In H. D. Stolovitch & E. J. Keeps (Eds.), Handbook of human performance technology (pp. 114–129). Pfeiffer. Stefaniak, J. (2018). Performance technology. In R. E. West (Ed.), Foundations of learning and instructional design technology: The past, present, and future of learning and instructional design technology. EdTech Books. https://edtechbooks.org/lidtfoundations/performance_technology Van Tiem, D., Moseley, J. L., & Dessinger, J. C. (2012). Fundamentals of performance improvement: A guide to improving people, process, and performance (3rd ed.). Pfeiffer.
11 ASSESSING THE SUSTAINABILITY OF INSTRUCTION
Chapter Overview This chapter will discuss evaluative techniques to measure the success and sustainability of instruction. Many instructional design textbooks discuss formative evaluation during the design process and summative evaluation upon conclusion of a learning experience. This chapter will look at measuring effective change after a period of time has passed following training. Emphasis will be placed on evaluating learning transfer to real-world environments. An overview of program evaluation and associated data collection will be provided to help instructional designers determine what types of data they may need after training is complete.
Guiding Questions 1. What is evaluation? 2. What is the difference between evaluation and assessment? 3. What are the differences between formative, summative, and confirmative evaluation? 4. How can I promote sustainability in my conversations with my clients?
Avoiding a Band-Aid Solution When instructional designers engage in design, they want to provide solutions that will serve the needs of their audience and last. The better we are able to align our solutions with the needs of the audience and organization in support of efficiency, effectiveness, and ease of learning, the better our chances that the solutions will last. When we fail to adequately align our instructional solutions in a manner that supports the ongoing operations of an individual, group, or organization, we are merely providing a temporary fix. In the long term, this is detrimental to our clients and stakeholders, and potentially to our own viability as gainfully employed instructional designers.
While this chapter is focused on assessing the sustainability of our instruction and instructional design practices, this certainly is not the first time that this book has emphasized the importance of designing long-standing solutions. Striving for sustainability is inherent in many of the design activities we engage in as practitioners, ranging from needs assessment and needs analysis to learner analysis, localization of context, design and development, and implementation. Keeping our learner as the focal point of our work helps us to ensure that all activities are conducted in a meaningful way that facilitates learning and improves performance.
Evaluating Impact If the goals of instructional design are to contribute to effectiveness, efficiency, and ease of learning, then we should be able to evaluate the extent to which we are accomplishing these goals. The purpose of evaluation is to engage in a systematic investigation to determine the worth or merit of a solution (National Science Foundation, 2010; Scriven, 1967). Guerra-Lopez (2007) emphasizes that the process of determining the merit and worth of objects enables us to make data-driven decisions by comparing results with expectations, finding drivers and barriers to expected performance, and producing action plans for improving the solutions being evaluated. Evaluating the impact of what we are designing and sharing with our learners helps us to build capacity and strive for sustainability. One of the most important ways to measure impact is by being able to inform our clients and stakeholders of how much performance has improved. Guerra-Lopez (2008) has proposed a seven-step process for evaluating impact:
1. Identifying stakeholders and expectations
2. Determining key decisions and objectives
3. Deriving measurable indicators
4. Identifying data sources
5. Selecting data collection methods
6. Selecting data analysis tools
7. Communicating results and recommendations
With the Impact Evaluation Model, Guerra-Lopez (2008) emphasizes a need to gather data from stakeholders external to the project to provide a wider and more robust representation of those who may be impacted by the project being evaluated. While these seven steps can be approached systematically, Guerra-Lopez (2008) notes that they should not be viewed solely as a linear process. Rather, they should be approached iteratively, with emphasis placed on ensuring alignment between each step depending on the data that becomes available during the evaluation.
A majority of evaluation strategies that are used in instructional design focus specifically on the extent that learners have met the intended learning
objectives. The Impact Evaluation Model does not rely solely on evaluative data related to specific learning or performance objectives that have been assigned to members of a group or organization. Instead, the model seeks to collect data to determine the long-term impacts of the interventions being evaluated. This model can provide instructional designers with the means to take a broader view of the extent to which instructional interventions are successful by evaluating how the intervention fits within the larger system. This enables us to predict the longevity of our interventions and determine whether any adjustments need to be made in conjunction with other interventions to serve the needs of the system. Table 11.1 provides an overview of some questions an instructional designer may consider when evaluating the impact of an instructional intervention, adapted from Guerra-Lopez (2008).

TABLE 11.1 Questions to Consider when Evaluating the Impact of an Instructional Intervention

Identifying stakeholders and expectations
• How are stakeholders directly and indirectly impacted by the instructional interventions?
• Do stakeholders have a clear understanding of the expectations associated with the instructional intervention?
• What recommendations do the stakeholders have?

Determining key decisions and objectives
• What are the evaluative objectives to be explored to measure impact of performance?
• Have stakeholders contributed to the development of the evaluative objectives?
• What data sources will we have access to during our evaluation?
• How many evaluative checkpoints will be assessed?
• Who will be involved in the evaluation?
• Have stakeholders been informed that an evaluation is taking place to measure the impact of instruction?
• How will success be determined by the stakeholders?

Deriving measurable indicators
• What measurable indicators will be gathered during instruction?
• What measurable indicators will be gathered after instruction has been delivered?

Identifying data sources
• What data sources are available to the team?
• What data sources are needed to support the measurable indicators?
• Is there alignment between data sources available and the objectives of the evaluation?
• Are data sources representative of different stakeholders impacted by the interventions?

Selecting data collection methods
• What data collection methods will be best to gather data from the previously identified data sources regarding the success of the instructional intervention?

Selecting data analysis tools
• How will data sources be analyzed?
• How will data sources be triangulated?
• How much time is required in between data collections to address the objectives of the evaluation?

Communicating results and recommendations
• Do my recommendations reflect the ethical implications associated with the project?
• Have I addressed the concerns and needs of all stakeholders?

Adapted from Guerra-Lopez (2008).
Differentiating Between Needs Assessment and Evaluation Needs assessment and evaluation are two distinct activities, but, if done correctly, each can be used to inform the other. When we think about needs assessment and evaluation in terms of instructional design, they each play a critical role in the process; they are the bookends. Needs assessment requires us to assess the situation, determine the needs, and make recommendations for what solutions are needed to improve performance. While evaluation is integrated throughout the design process, the evaluative activities that occur at the end are employed to determine whether the solutions we have implemented have met their intended purpose. Table 11.2 differentiates between needs assessment and evaluation activities (Watkins & Guerra, 2002).

TABLE 11.2 Differentiating between needs assessment and evaluation activities

Needs Assessment Activities
• Identify performance gaps prior to exploring alternative solutions.
• Estimate the potential utility of an intervention.
• Make recommendations for interventions that will support an organization’s strategic plan.
• Gather input from different stakeholders regarding existing programs and processes.
• Identify possible consequences of alternative solutions.

Evaluation Activities
• Determine the effectiveness of an intervention.
• Make recommendations about how to improve an existing intervention.
• Determine the efficiency of an intervention.
• Determine the suitability of an intervention.
• Identify which interventions should continue to be used by the organization.
Needs assessment activities typically involve gathering information to explore alternative solutions and making recommendations for which ones best align with other goals and activities of the organization. Evaluative activities are set up to determine the extent to which the recommendations that came out of the needs assessment were effective. Data gathered on the state of performance during a needs assessment can be used as baseline data that can be reviewed after interventions have been implemented.
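To make the baseline idea concrete, here is a small hypothetical calculation. The metric names and figures are invented for illustration and are not drawn from the book; the point is simply that needs-assessment data gives the "before" against which post-implementation data can be compared.

```python
# Hypothetical example: comparing baseline performance data gathered during a
# needs assessment against the same measures collected after implementation.
# All metric names and figures are invented.

baseline = {"avg_task_time_min": 42.0, "error_rate": 0.18}
post_implementation = {"avg_task_time_min": 31.5, "error_rate": 0.09}

def percent_change(before: float, after: float) -> float:
    """Relative change from baseline, as a percentage (negative = reduction)."""
    return (after - before) / before * 100

changes = {
    metric: round(percent_change(baseline[metric], post_implementation[metric]), 1)
    for metric in baseline
}
print(changes)  # {'avg_task_time_min': -25.0, 'error_rate': -50.0}
```

Without the baseline measurement, neither figure could be computed, which is one practical reason the needs assessment and the evaluation are worth treating as bookends of the same process.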
Determining Levels of Design in Relation to Sustainability Most instructional designers will agree that an ultimate goal for every project is for the intervention to be sustainable and long-lasting: we want our learners and clients to see the value and utility in what we design, and we approach design with the intention of mitigating challenges and improving performance. While effectiveness, efficiency, and a desire to contribute to our learner’s ease of learning are certainly at the forefront of many of our design plans, the reality is that there are many constraints that may get in the way of us meeting these goals. Time, money, and resources often impact the extent to which we can achieve effectiveness and efficiency. While Chapter 12 will provide an overview of ethical considerations for instructional designers, this book has emphasized in multiple chapters that instructional designers have a responsibility to design solutions that are in their clients’ and learners’ best interests. Spector (2012) proposes five design levels to acknowledge components and criteria that promote good learning and instruction (Figure 11.1). What I like in particular about Spector’s (2012) design levels is that they prompt us to address the realities of our design projects and plan within what is feasible for the project. This is another useful tool to ensure that we are on the same page with our clients as we establish our design space. If we know what is considered to be the desirable result or what will be considered “good enough,” we can structure our evaluative activities accordingly so we can measure the impact of the interventions. Alignment is a term that is often used in needs assessment and evaluation discussions. It is important that data collection strategies be aligned with the needs of the system. The same rules apply to conducting evaluations.
There needs to be alignment between the tools we are using to gather data and the content, learning, and performance that are being evaluated. Addressing what design category an overall project goal may fall within can help us to frame evaluative questions that are appropriate.
Types of Evaluation Commonly Used in Instructional Design Three types of evaluation are commonly used in instructional design. These are formative, summative, and confirmative evaluations. This section reviews these different types of evaluations in terms of how they can be used to assess the sustainability of instruction.
Design Levels
5. No harm is done.
4. The program is sustainable.
3. The learning experience is appealing and attractive.
2. The learning environment is useful and reliable with meaningful activities aligned with goals.
1. Learning goals and objectives are met (a basic requirement of all instructional programs; implies formative and summative assessments).
FIGURE 11.1 Components and criteria of good learning and instruction. Source: Spector, J.M. (2012). Foundations of educational technology: Integrative approaches and interdisciplinary perspectives. Routledge. Figure 2.1 Components and criteria of good learning and instruction (p. 16). Used with permission.
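Treating the figure's levels as cumulative (an assumption made here for illustration, since each level presumes the ones below it), they could be encoded as a small lookup. The names and the helper function are invented, not from Spector (2012).

```python
# Hypothetical encoding of the five design levels shown in Figure 11.1.
# Treating the levels as cumulative is an illustrative assumption.

DESIGN_LEVELS = {
    1: "Learning goals and objectives are met",
    2: "Learning environment is useful and reliable, activities aligned with goals",
    3: "Learning experience is appealing and attractive",
    4: "The program is sustainable",
    5: "No harm is done",
}

def highest_level_reached(criteria_met: set[int]) -> int:
    """Count upward from level 1 until a criterion is not met."""
    level = 0
    while (level + 1) in criteria_met:
        level += 1
    return level

# A project meeting the first three criteria but not yet sustainable:
print(highest_level_reached({1, 2, 3}))  # 3
```

Framing a project this way can make the "good enough" conversation with a client concrete: agreeing up front on the target level tells you which evaluative activities are in scope.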
Formative Evaluation Formative evaluation can be described as being “a judgment of the strengths and weaknesses of instruction in its developing stages, for purposes of revising the instruction to improve its effectiveness and appeal” (Tessmer, 1993, p. 11). Formative evaluation often occurs in phases throughout the entire design process to inform design decisions and provide feedback prior to implementation. Formative evaluation is considered particularly necessary when:
• The designer is relatively new to the practice of instructional design
• The content is new to the designer or team
• The technology is new to the designer or team
• The learner population is new to the design team
• Unfamiliar or experimental instructional strategies are part of the instruction
• Accurate task performance from the instruction is critically important
• Materials will be disseminated in large quantity
• The chances for revisions or newer versions of the instruction after its release are slim. (Tessmer, 1994, p. 20)
During a formative evaluation, an instructional designer may engage in multiple methods to gather data to identify any necessary revisions to
be made to the content. Examples of different methods for gathering data include expert review, one-to-one evaluation, small group, and field test. The means by which an instructional design team gathers data will depend on the availability of resources, the time allocated for gathering formative feedback, and the costs associated with design and development. Expert review consists of a subject-matter expert reviewing the instructional materials to identify areas of the content that need to be corrected. During expert review, the design team may seek feedback from a subject-matter expert, who will provide feedback to remove errors pertaining to the subject matter, or an instructional design expert, who will provide feedback on the design and usability of the structure of the instructional intervention itself. One-to-one evaluations consist of the design team seeking feedback from individuals who will go through the training materials and provide feedback in terms of usability and accessibility. Most often, learners during a one-to-one evaluation will participate in think-alouds, where they talk through their thoughts as they navigate the materials. They will notify the design team if any activities are confusing or if certain parts of the instruction do not work as intended. A small group consists of five or six individuals who have been tasked with reviewing the instructional materials. During the small group session, the design team may observe the instruction being carried out, but they will not interrupt or intervene. The goal of a small group session is to observe how the instruction is delivered. This provides the design team with the opportunity to see whether activities can be implemented as planned. The design team will take notes on whether certain activities take more or less time than expected, what types of questions the learners ask when seeking clarification, and any points at which learners were unable to complete the instruction as planned.
A small group evaluation is similar to having a focus group provide detailed feedback on the instruction. This format allows individuals to communicate with one another and with the design team. As a group, they can discuss different aspects of the instructional materials that they think are helpful as well as parts that are somewhat confusing and may require modifications. A field trial consists of doing a pilot study for a limited amount of time where the instructional materials are implemented with a group of individuals who have learner characteristics similar to those of the intended learning audience. Calhoun and colleagues (2020) recommend that formative evaluation in a field trial address the following questions:
• What needs to be done?
• How should it be done?
• Is it being done?
• Did it succeed?
According to Tessmer (1993), formative evaluation helps to make instruction effective, efficient, motivating, usable, and acceptable. When instructional designers engage in a project, they must balance designing for efficiency, effectiveness, and ease of learning with time, money, and expectations of quality. Formative feedback can prove valuable to designers in making modifications to the instructional materials before they are finalized and implemented across a larger audience. Formative feedback from small group and field trial sessions is most beneficial for instructional designers who are responsible for designing off-the-shelf training courses. These types of feedback sessions can provide insights into how the materials will be received by learners to whom the design team does not have direct access. Formative feedback also can provide information to help instructional designers develop materials that can be sustained over an extended period of time. If an instructional designer is internal to an organization, it may be easier for them to follow up and make adjustments and modifications during and after implementation. If the instructional designer is external to the organization or they are in a time-limited contractual relationship with a unit within their organization, it is important that they plan for the maintenance and sustainability of their instructional materials. Figure 11.2 provides a checklist developed by Tessmer and Harris (1992, p. 67) to guide instructional designers’ conversations with their clients.
Summative Evaluation

Summative evaluation is conducted at the end of instruction or shortly afterwards. Its purpose is to determine the extent to which learners have mastered the content. Learner achievement relative to the intended goals and learning objectives of a course can be evaluated through a variety of means, such as pre-tests and post-tests, performance demonstrations, reactionary feedback surveys, or tests. Morrison and colleagues (2013) warn that instructional designers should be careful about pre-testing learners at the beginning of instruction since it “may cue the learner to concentrate on certain things while neglecting others” (p. 272). Summative evaluations conducted at the end of instruction are informative in terms of what content the learners have retained; however, they are not a good indicator of learners’ ability to apply that knowledge in other settings. Summative evaluations carried out a few days or weeks after instruction allow learners the opportunity to attempt applying what they have learned in the intended context of application.
Confirmative Evaluation

The purpose of confirmative evaluation is to measure long-term performance beyond the scope of formative and summative evaluation (Giberson
Checklist: Materials factors questions of the learning environment

When will the instruction be outdated or abandoned?
• For how long are materials intended to be used?
• When will the content become outdated?
• When will the visuals or language become outdated?
• Will changes in equipment outdate the materials?

Do environmental constraints and user needs require adaptable materials?
• Will some users only use parts of the materials?
• Will some users alter the sequence of the instruction in the materials?
• Can materials be designed for adaptable content and sequence?

Will materials be easy to use by users in their learning environment(s)?
• What skills or knowledge are required to use the materials?
• Will users be capable of easily using the materials without training?
• Can users easily work the equipment needed to display materials?

Can materials be easily duplicated on-site or off-site?
• Will producers want materials to be copied by users?
• What is the probability that users will lose or damage the materials?
• Do intended media formats allow for on-site reproduction?
• What duplication equipment do users have in the learning environment(s)?
• Do users have the skills to reproduce the materials?

Can materials be replaced?
• What is the probability that users will lose or damage parts of the materials?
• Can the remaining materials be effectively used if some parts are missing?
• Can lost materials be replaced on-site?
• Are off-site replacements easily obtainable?

FIGURE 11.2 Checklist: Materials factors questions of the learning environment. Source: Tessmer, M., & Harris, D. (1992). Analyzing the instructional setting: Environmental analysis. Routledge. Checklist: Materials factors questions of the learning environment (p. 67). Used with permission.
et al., 2006). Confirmative evaluations are typically conducted at least six months after instruction to allow learners time to integrate what they have learned into the transfer context. Confirmative evaluation is considered a continuous improvement strategy because of its integration with organizational goals, operations, and culture (Moseley & Solomon, 1997). In a study exploring the challenges that instructional designers encounter while conducting evaluations, the majority of participants shared
that they were unable to conduct confirmative evaluations (DeVaughn & Stefaniak, 2021). The participants attributed this to clients not wanting to invest the money in conducting a confirmative evaluation, to clients being satisfied with the summative data they received immediately upon completion of instruction, and to a lack of access for following up with clients and their stakeholders. The challenges instructional designers reported were directly related to attempting to conduct confirmative evaluations; they pointed to a disconnect between and among organizational departments and an inability to convince stakeholders of the evaluation’s relevance. Instructional designers reported that misalignment between business processes across organizational departments posed challenges for attributing performance success to instructional activities. One way these types of challenges could be mitigated would be to engage in needs assessment and analyses prior to designing and implementing instructional interventions. This allows appropriate data sources to be identified to validate the client’s perceived performance needs, which can later be used as a baseline to evaluate the impact of the interventions, both instructional and non-instructional, that have been developed as a result of the recommendations derived from the needs analysis. The other challenge was that clients did not necessarily understand the value and relevance of conducting a confirmative evaluation. This is similar to the challenges mentioned in Chapter 3 with instructional designers attempting to convince their clients of the importance of needs assessment.
Assessing Program Effectiveness, Efficiency, and Ease of Learning

Each chapter in this book has revisited the goals of instructional design: to contribute to effectiveness, efficiency, and ease of learning. While you have been encouraged to revisit these goals throughout the design process as you engage in design decision-making, they should most definitely be revisited after implementation to assess the sustainability of instruction. For each of these three goals, it is important that data be gathered from multiple stakeholders and more than one learner to gain a fuller picture of the impact that the instruction may be having on individuals, groups, and organizations.
Effectiveness

An instructional intervention is often considered successful if it has met its intended outcomes. The effectiveness of instruction can be looked at in two ways: (1) Were learners able to meet the learning objectives set forth during instruction? (2) Have the results of the instruction impacted the organization’s goals and needs? Instructional effectiveness is often determined by reporting on learner achievement during the instruction. While these metrics will provide insight into the instructional context, they
do not always provide sufficient information regarding the extent to which learners are able to apply what they have learned to their jobs or other real-world contexts. An instructional course or program would certainly be considered effective if learners’ assessments demonstrated that they had mastered the learning objectives presented during instruction, were successful at applying what they had learned to real-world contexts, and had helped address the organizational needs initially identified during a needs assessment. An instructional intervention is considered effective if the benefits and rewards outweigh the costs. If learners are making satisfactory progress upon completion of instruction, leaders within the organization will view the associated costs as an investment to support learner/employee performance. If an organization was not seeing any improvements in performance and errors were continuing to occur, it might reconsider the overall effectiveness of the program.
Efficiency

An instructional program will be considered efficient on the basis of several factors. The amount of time it takes to deliver instruction and see an increase in learner performance in real-world contexts contributes to efficiency. The amount of time required to train members of an organization in relation to the costs associated with offering training will also determine the degree of efficiency. The costs associated with the resources needed to design and deliver interventions may also contribute to or hinder efficiency. Earlier, this chapter discussed the importance of engaging in conversations with clients regarding the maintenance of instructional materials. Tessmer and Harris (1992) recommended that instructional materials be evaluated in terms of their adaptability, usability, and replicability. The extent to which instructional materials can be modified to meet the unique needs of different audiences at a low cost will also be perceived as contributing to efficiency.
Ease of Learning

Feedback obtained from our learners serves as a great indicator of the ease of learning. Learners’ performance on different assessments can indicate the extent to which they understand the content provided during instruction. Reactionary surveys asking learners what they liked or did not like about training can also help the design team look for ways to improve the content and delivery. The amount of time it takes for learners to integrate new processes and practices that they learned during training into their regular routines is also an indicator of the degree to which the design team has contributed to ease of learning.
Summary

The purpose of this chapter was to look at how evaluative techniques can be used to assess the sustainability of instruction. The chapter looked at measuring change after a period of time has passed since training. It is important to recognize that evaluation does not begin at the completion of instruction. A successful evaluation is one that is planned well in advance of the interventions being developed and implemented. Instructional designers should engage in conversations with their design teams and clients about the ways in which their instructional programming will be evaluated in terms of efficiency, effectiveness, and ease of learning.
Connecting Process to Practice Activities

1. Think about the different types of design projects you have worked on. What has been your role in evaluation? What frustrations have you encountered while evaluating different aspects of your projects? Are there things you wish you could have evaluated but did not have access to? How would this information inform your design work?
2. Enid works for an e-learning company, Health Media, that produces e-learning modules used for continuing education for nursing and allied health professionals. Health Media sells its courses to hospitals and health-care facilities throughout North America. A challenge that Enid must consider is the extent to which there may be differences depending on what country, state, or province a health-care facility is located in. Use Tessmer and Harris’ (1992) checklist provided in Figure 11.2 and consider what questions Enid should ask during the initial kick-off meeting for launching the next set of e-learning courses.
3. Formative evaluation can be used as a planning tool to help the instructional designer identify what is needed to ensure that clients can adapt instructional materials for sustained use after your role on the project has ended. What recommendations might you pose to your clients to ensure that the instructional materials can continue to be used?
4. This chapter discussed the importance of evaluating for effectiveness, efficiency, and ease of learning. Think about a project you are currently working on or a topic that interests you. If you were to design an instructional program, how might you evaluate effectiveness, efficiency, and ease of learning through formative, summative, and confirmative evaluations? To what extent would it be helpful to gather data at three different points? Based on your familiarity with the organization or the topic (or both), what challenges would you foresee in conducting these evaluations?
Bridging Research and Practice

Chyung, S. Y. (2015). Foundational concepts for conducting program evaluations. Performance Improvement Quarterly, 27(4), 77–96. https://doi.org/10.1002/piq.21181
Guerra-López, I., & Toker, S. (2012). An application of the impact evaluation process for designing a performance measurement and evaluation framework in K-12 environments. Evaluation and Program Planning, 35(2), 222–235. https://doi.org/10.1016/j.evalprogplan.2011.10.001
Moller, L., & Mallin, P. (1996). Evaluation practices of instructional designers and organizational supports and barriers. Performance Improvement Quarterly, 9(4), 82–92. https://doi.org/10.1111/j.1937-8327.1996.tb00740.x
Smith, C. L., & Freeman, R. L. (2002). Using continuous system level assessment to build school capacity. American Journal of Evaluation, 23(3), 307–319.
Williams, D. D., South, J. B., Yanchar, S. C., Wilson, B. G., & Allen, S. (2011). How do instructional designers evaluate? A qualitative study of evaluation in practice. Educational Technology Research and Development, 59, 885–907. https://doi.org/10.1007/s11423-011-9211-8
References

Calhoun, C., Sahay, S., & Wilson, M. (2020). Instructional design evaluation. In J. K. McDonald & R. E. West (Eds.), Design for learning: Principles, processes, and praxis. EdTech Books. https://edtechbooks.org/id/instructional_design_evaluation
DeVaughn, P., & Stefaniak, J. (2021). An exploration of the challenges instructional designers encounter while conducting evaluations. Performance Improvement Quarterly, 33(4), 443–470. https://doi.org/10.1002/piq.21332
Giberson, T. R., Tracey, M. W., & Harris, M. T. (2006). Confirmative evaluation of training outcomes. Performance Improvement Quarterly, 19(4), 43–61. https://doi.org/10.1111/j.1937-8327.2006.tb00384.x
Guerra-Lopez, I. J. (2007). Evaluating impact: Building a case for demonstrating the worth of performance improvement interventions. Performance Improvement Journal, 46(7), 33–38.
Guerra-Lopez, I. J. (2008). Performance evaluation: Proven approaches for improving program and organizational performance. Jossey-Bass.
Morrison, G. R., Ross, S. M., Kalman, H. K., & Kemp, J. E. (2013). Designing effective instruction (7th ed.). Wiley.
Moseley, J. L., & Solomon, D. L. (1997). Confirmative evaluation: A new paradigm for continuous improvement. Performance Improvement, 36(5), 12–16.
National Science Foundation. (2010). The 2010 user-friendly handbook for project evaluation. National Science Foundation.
Scriven, M. (1967). The methodology of evaluation. In R. Tyler, R. Gagne, & M. Scriven (Eds.), Perspectives on curriculum evaluation (pp. 39–83). McGraw-Hill.
Spector, J. M. (2012). Foundations of educational technology: Integrative approaches and interdisciplinary perspectives. Routledge.
Tessmer, M. (1993). Planning and conducting formative evaluations: Improving the quality of education and training. Routledge.
Tessmer, M. (1994). Formative evaluation alternatives. Performance Improvement Quarterly, 7(1), 3–18. https://doi.org/10.1111/j.1937-8327.1994.tb00613.x
Tessmer, M., & Harris, D. (1992). Analyzing the instructional setting: Environmental analysis. Routledge.
Watkins, R., & Guerra, I. (2002). How do you determine whether assessment or evaluation is required. ASTD T&D Sourcebook, 131–139.
12 ETHICAL CONSIDERATIONS IN INSTRUCTIONAL DESIGN
Chapter Overview

Ethical considerations as they relate to learning design have not garnered much attention in learning design research and scholarship; in recent years, however, more researchers have been advocating for additional focus on ethics and design practices. This chapter will provide an overview of how ethics has been discussed in the field as well as scenarios where instructional designers may be challenged to uphold ethics. Additional emphasis will be placed on instructional designers’ responsibilities to promote inclusive design. Emphasis will be placed on aligning ethical considerations with designing within systems (Chapters 3, 5, and 10) and meeting learners’ needs (Chapter 4).
Guiding Questions

1. What is ethics?
2. What role does ethics have in instructional design?
3. What is my responsibility to uphold ethics in my profession?
4. How can I integrate ethics into my everyday design practices?
What Are Ethics?

It is important to note that this chapter is one of the most difficult to write. This is due to the subjectivity of what constitutes ethical behavior. When we engage in conversations with other individuals, most people will say they strive to uphold ethical behavior. If we think about the different groups we belong to, such as work groups, school groups, friends, and family, we have certain expectations that members of these groups will adhere to rules. According to Dean (1993), it is important to differentiate between ethics, morals, and values:
• Ethics: the rules or standards that govern the conduct of the members of a group
• Morals: personal judgments, standards, and rules of conduct based on fundamental duties of promoting human welfare (beneficence), acknowledging human equality (justice), and honoring individual freedom (respect for persons)
• Values: the core beliefs or desires that guide or motivate the individual’s attitudes and actions, such as honesty, integrity, and fairness (p. 7)

Each of us has our own set of morals and values that influence the decisions we make and how we approach other individuals and situations. Our morals and values are informed by our interactions with others, the ethical expectations we have been exposed to throughout our lifetime, our education and training, and other life experiences. Think about a time you witnessed an interaction between two individuals and thought that what they were doing was wrong. Think about how you respond to stories in the news reporting on crime, political issues, and other societal issues. Our morals and values are what shape our interpretations of the world around us. Not only do our morals and values impact how we engage with other individuals on a daily basis, but they also greatly impact our work as instructional designers. Chapter 2 provided an overview of various types of design judgments commonly invoked by instructional designers (Nelson & Stolterman, 2012). As mentioned in Chapter 2, Lachheb and Boling (2021) noted that framing, instrumental, and core judgments were the most prevalent design judgments in instructional design. It is important to note that core design judgments draw from designers’ beliefs and values that contribute to their approach (Nelson & Stolterman, 2012).
Examples of our core judgments in instructional design include but are not limited to the following:

• The amount of data we determine to be sufficient to enable us to make accurate recommendations to our clients during a needs assessment or program evaluation
• Our willingness to properly attribute and give credit to instructional materials developed by others
• The amount of concern we put forth to ensure that the materials we design and develop will contribute to improved learner outcomes
• Our ability to speak up and advocate for how we can promote inclusive and accessible materials

Our personal ethics are what we bring to our design projects when interacting with others in educational and workplace settings. Glassdoor (2021) suggests that personal ethics are important for guiding behavior in the workplace because they
• Allow leaders to more effectively lead their teams
• Instill a sense of trust and support in leaders
• Give individuals a solid basis on which to determine the most appropriate action in any given situation
• Improve the decision-making process
• Set a standard of behavior
• Support motivation

A combination of both personal and professional ethics guides our practices. A challenge in writing this chapter is that everyone’s personal ethics are variable, and while we may agree upon a set of personal and professional standards, our individual interpretations of what constitutes those standards and how they should be translated into instructional design practice are subjective. In the hopes of prompting personal reflection on how we recognize and address ethical implications related to our work, this chapter will explore different ethical standards that have been identified by professional organizations in our field.
The Instructional Designer’s Responsibility to Uphold Ethics

Sometimes it is difficult to know when an ethical dilemma is present because we feel as though such dilemmas sneak up on us. Perhaps the dilemma is not a drastic deviation from agreed-upon ethical behavior. Hill (2006) offers the following examples of ethical dilemmas that instructional designers may find themselves in:

• Simply agreeing with the client versus recommending what they need
• Accepting a job without having the expertise that the client is requesting
• Recommending a colleague who does not have the necessary expertise
• Skewing data to make an intervention look more attractive
• Promising results you know you cannot deliver
• Deliberately delaying the delivery of a task (p. 1050)
Sometimes we may find ourselves in situations where the client has hired us for a project and the solution they want is not the solution they need. We may be able to identify at the onset of the project that they will not get the desirable results they are seeking. This becomes an ethical dilemma because it puts us in a situation that may impact our participation in the project. If we warn our client that the path they want to take is not viable, they may decide to remove us from the project altogether and find an instructional designer who is willing to follow their approach. While some instructional designers may say that they are okay with that outcome because they do not want to jeopardize their integrity and responsibility to the profession, other instructional designers may argue that they need the income that will result from the project.
On the other hand, our client may appreciate that we have notified them of the risks involved in taking a particular approach. They may view our willingness to speak up as a testament to our expertise and credibility as an instructional designer. It can be perceived as a demonstration of our willingness to do what is in the best interests of our clients. Another expectation of instructional designers is that we should not oversell our capabilities when a client is requesting our services. If we do not have the necessary experience or training, we should not lead our client to think that we do. That could significantly impact the integrity of a project and would not be in the client’s best interests. The same goes for referring colleagues for positions. If we recommend that a client hire an individual we know, we should do our due diligence to ensure that we are recommending someone who can support the needs and goals of the project. Instructional designers have an ethical responsibility to be honest when analyzing and presenting data. We should never skew data to make an intervention look more attractive. If survey results or test results show that the intervention was not successful, we must present all data to our clients. We cannot pick and choose which information to include or exclude. Skewing the data could lead a client to implement interventions that will not work and may cause more harm than good. It is also important to provide clients with realistic estimates of the amount of time it may take to complete tasks. We should not accept a contract when we know from the very beginning that it will be impossible to meet the deadlines. While some designers will agree to the terms of a project knowing that they will not be successful, it is better to approach our clients and request an amendment to the proposed timeline.
If we find that clients are not amenable to change when planning for project deliverables at the beginning of the project, this is a good indicator that there are likely to be more difficulties throughout the remainder of the project. A reasonable client will appreciate your being forthright about how much time is needed to complete a task properly.
The Role of Ethics in a Profession

Some of the earliest conversations about the role that ethics serves in instructional design were presented by James D. Finn (1953) in his paper, “Professionalizing the Audio-Visual Field,” in which he identified six characteristics that distinguish a profession:

1. “An intellectual technique
2. An application of that technique to the practical affairs of man
3. A period of long training necessary before entering into the profession
4. An association of the members of the profession in a closely knit group with a high quality of communication between members
5. A series of standards and a statement of ethics which is enforced
6. An organized body of intellectual theory constantly expanding by research.” (p. 7)

Finn (1953) noted:

although there is much criticism of many professions at this point and some evidence that many codes are window dressing to protect the profession from public interference and are not enforced except to the advantage of the profession as against the public, the fact remains that the idea of an ethic with the power of enforcement places a personal responsibility on each member of a profession not associated with other types of occupations. (p. 8)

The International Board of Standards for Training, Performance, and Instruction (IBSTPI) has listed the ability of an instructional designer to identify and respond to the ethical, political, and legal implications of their design practices as an essential competency. Within this competency, instructional designers are responsible for recognizing and planning for how to address ethical implications impacting their work. Examples of common ethical implications that instructional designers encounter regularly include knowing the regulatory guidelines associated with the industry for which the designer may be developing content. It is also important for instructional designers to plan ahead for how to address issues surrounding the intellectual property rights of other individuals. Instructional designers need to consider what copyright permissions are attributed to written content and images that a design team may want to include in their training materials. The IBSTPI also recognizes that another way designers can demonstrate ethical awareness is by adhering to the codes of ethics associated with the organizations in which they work and with their profession.
The IBSTPI’s 2012 Code of Ethics explicitly states that instructional designers should uphold the following standards in their professional work:

1. Responsibility to others
The instructional designer’s responsibilities to others include, but are not limited to, providing cost-effective and reasonable solutions to our clients, designing solutions for our clients that meet the goals of the project and the intended needs, and using systematic processes to improve learning and performance. It has been mentioned before in this book that the learner should be at the center of all that we
do (Chapter 1). It is our responsibility to our learners, clients, and employers that we adhere to that in the most ethically sound way. We can demonstrate that by following the proper protocols to ensure that the design decisions we make are in the best interests of our learners. The use of systematic processes to improve learning and performance suggests that we are following processes that are grounded in theory and best practices in our field. These processes allow us to gather data during needs assessment and program evaluation to ensure that we have gained a solid representation and understanding of the situation. By employing systematic processes for gathering data to inform various aspects of our design work, we are better positioned to design toward sustainability. Another important responsibility of the instructional designer that lies within this ethical standard is the ability to engage in risk management. It is our responsibility to inform clients of the risks and consequences associated with different design decisions to help them inform their own decisions pertaining to the project. While this does not mean that we have to do a risk assessment on every decision that we make, it is important that we engage in conversations when we anticipate that there may be ethical, political, and legal implications associated with the actions taken on a project.

2. Social mandates
Instructional designers’ ethical responsibilities to adhere to social mandates include “making professional decisions based upon moral and ethical positions regarding societal issues” (Koszalka et al., 2013, p. 149). This entails supporting goals and initiatives to engage and include members of an organization. This has become more prevalent in diversity, equity, and inclusion efforts that have been integrated into organizational planning. We have also begun to see an increased awareness of the need in our field to promote inclusive design.
Scholars in our field are beginning to explore ways in which we can employ empathetic approaches in our design practices to engage our learners in meaningful learning experiences. We are also continuing to see scholars imploring our field to re-evaluate some of the instructional design models and prescriptions we have adhered to for decades to determine the extent to which they are guiding designers to make inclusive and ethical decisions (Moore, 2021). Adhering to social mandates is not just a matter of striving for efficiency; it is a matter of ensuring we are supporting our learners through ethical means. I anticipate we will continue to see increased attention to the ways in which instructional designers can support social mandates through their design work and advocate for inclusivity and equity. As a result, we may see new guidelines and overlay models developed to accompany our instructional design processes.
3. Respecting the rights of others
Within this standard, instructional designers are responsible for protecting the rights and privacy of our clients. It is important that we not share personal information about our learners with other individuals. If a learner or client tells us something in confidence, it is important that we maintain their privacy throughout the project. If something is relayed to us that could pose significant harm to a member of the organization, it is important that we tactfully work through what information needs to be shared with which individuals. We would not want our instructors to share with the class the fact that we earned a poor grade on an assignment or in a course. Likewise, we should be careful not to include personal identifiers when presenting data on learning outcomes associated with our instruction. Another important responsibility that falls within this ethical standard is to protect intellectual property rights and attribute credit to others when appropriate. It is our responsibility to ensure that we are attributing credit appropriately when referencing other individuals’ work. Instructional designers must be aware of the copyright licenses associated with using images that have been developed by other individuals. It is important that instructional designers not plagiarize content that has been written by other people. If an instructional designer is working for multiple clients, it is important that they not leverage resources or content they have acquired on one job for use on another. This is especially important if an instructional designer finds themselves working for clients who may be competing in the same industry. Another example of ways in which we can respect the rights of others is to not make false claims about individuals or discriminate unfairly in actions related to hiring.
If an instructional designer is on the hiring team, it is important that every applicant be evaluated consistently and fairly on the basis of their qualifications for the job they have applied for.

4. Professional practice

This last guiding standard set forth by the IBSTPI implores instructional designers to uphold the standards of the profession. Instructional designers are expected to demonstrate honesty in their work, acknowledge the contributions of others to a project, and support colleagues. Additionally, instructional designers are expected to engage in continual training to keep current with trends in the field and build upon their knowledge, skills, and attitudes. This coincides with Finn's (1953) characteristics for what makes a profession and his arguments for why our field should be considered a profession. Another practice that falls within this guiding standard is to withdraw from a project when an instructional designer observes a
client behaving unethically or when a conflict of interest may impact the integrity of the project. Examples include removing oneself from a project after being asked to relay inaccurate information to a group of learners or to design instructional content containing significant errors that could lead to individuals being hurt.
Incorporating an Ethical Lens into Instructional Design Practices

While the IBSTPI's code of ethics provides a statement of the behavior expected of individuals within our field, a challenge is that ethics is subject to interpretation. We each live by our own interpretation of what constitutes ethical behavior. What I might find appalling and inappropriate, others may not see as a concern. Dean (1993) warns us:

Codes do not necessarily eliminate behaviors such as theft of materials or blocking the agenda of a group for personal gain. Those who are unethical may remain so at heart. They may not realize an action is unethical and may continue doing it out of habit, or they may know it is unethical and simply not care…Isn't establishing clear performance expectations one of the first things we recommend for achieving exemplary performance? (p. 5)

In her paper, "The Design Models We Have Are Not the Design Models We Need," Moore (2021) implores scholars in our field to re-evaluate the systematic instructional design frameworks that have guided so much of what we do over the past several decades. Moore (2021) notes that "the absence of ethical considerations in design models presents a significant limitation in both the models and mindset as a field resulting in disservices and injustices to learners and educational, organizational, and social systems" (para. 3). If we are to truly embrace an ethical lens in our instructional design work, instructional designers must look beyond achieving learning objectives and ensure that design practices address accessibility, inclusion, and racial and economic inequalities (Boling & Gray, 2021; Moore, 2021; Rieber & Estes, 2017).
What I particularly like about Moore's (2021) paper is that she ends it with a call to the field to continue exploring ethical issues that bear on our instructional design practices so that we can create intentional and deliberate responses to ethical implications in the contexts within which we design. Moore and Griffin (2022) suggest that one way our field can better prepare future instructional designers is by anchoring ethics in our design and instructional practices. They suggest this can be accomplished through three core activities: problem-framing, reflection-in-action, and ethics as design (Figure 12.1).
FIGURE 12.1 Integration and application for ethics as design.

Problem Framing
- Articulate both learning and other desired outcomes (Kaufman)
- How is the problem framed to include ethical considerations: equity, privacy, accessibility, efficacy, sustainability, health, and safety (Svihla)

Reflection-in-Action
- Ethical analysis by way of questioning and reflection integrated into the design and decision-making process (Schön; Beck & Warren)
- Generate questions to prompt ethical considerations throughout the process

Ethics as Design
- Emphasis on devising solutions to ethical problems rather than selecting a "right" or "wrong" answer
- Articulate how ethical considerations influence technical specifications and other design requirements

Source: Moore, S. L., & Griffin, G. (2022). Integrating ethics into the curriculum: A design-based approach for preparing professionals to address complex problem spaces. In J. E. Stefaniak & R. M. Reese (Eds.), The instructional design trainer's guide: Authentic practices and considerations for mentoring ID and ed tech professionals (pp. 121–134). Routledge. Figure 13.2 (p. 127). Used with permission.
Their recommendations for how ethics can be integrated into these fundamental activities of the design process provide a different lens through which to approach design practice. During the problem-framing phase of a design project, we can set boundaries to manage our design space by asking questions that acknowledge the space and resources needed to support accessibility, inclusion, and equity. We can build upon reflection-in-action by analyzing our actions in terms of how well they support the ethical and social issues related to the project. Lastly, the information we gather during problem-framing and through reflection-in-action should inform how we engage in the actual design of materials. Questions about the quality of the design, answered through formative feedback, should be used to determine what additional work needs to be done to support accessibility and inclusion.

Instructional designers will benefit from more activities that require them to review their own ethical positions related to design. Instructional design coursework should incorporate material that prompts students to reflect on the ethical implications of their design actions. While codes of ethics can be useful for professional organizations and professions, they are ineffective if they are written only for the sake of being written. Instructional designers should routinely review the codes of ethics guiding their practice to identify areas for improvement. In doing so, they can identify specific performance areas to target through professional development opportunities.

It should be noted that ethical practice is not something achieved through completion of a workshop or a single course. Just as we hone our craftsmanship as instructional designers, our ethical positionalities are influenced by each design experience we encounter.
The development of our abilities to critically examine situations and anticipate ethical implications associated with our work is an ongoing process that instructional designers should not lose sight of.
Summary

The purpose of this chapter was to provide an overview of the role that ethics plays in our professional practice. Examples of scenarios in which instructional designers may find themselves were introduced. This chapter is not intended to provide an exhaustive list of recommendations for how to uphold ethical standards in our profession. It is important to note that this work is just beginning; more researchers are advocating for additional focus to be placed on ethics and design practices (e.g., Gray & Boling, 2016; Lin, 2007; Moore, 2014, 2021; Osguthorpe et al., 2003).
Connecting Process to Practice Activities

1. Review a code of ethics that you think is most relevant to your work as an instructional designer. Upon review, identify what would be
your top five principles. Why did you prioritize these principles over others? What recommendations do you have for upholding these principles?
2. What are ways in which you already demonstrate ethical practices in your design projects? What are the ethical implications associated with what you have described?
3. Pick an instructional design model that you are familiar with or modify one to suit your needs as a designer. How might you weave ethics into the model? What prescriptions could you propose at different aspects of the design process to promote ethical considerations?
4. Moore and Griffin (2022) suggest that ethical considerations can be integrated into problem-framing, reflection-in-action, and ethics as design. Think about a project you have recently completed. Reflect upon the extent to which ethical considerations were integrated into these stages of the project. If you could revise your approach to the project, what changes would you make to these stages?
Bridging Research and Practice

Campbell, K., Schwier, R. A., & Kenny, R. F. (2005). Agency of the instructional designer: Moral coherence and transformative social practice. Australasian Journal of Educational Technology, 21(2). https://doi.org/10.14742/ajet.1337
Garcia, A., & Lee, C. H. (2020). Equity-centered approaches to educational technology. In M. J. Bishop, E. Boling, J. Elen, & V. Svihla (Eds.), Handbook of research in educational communications and technology: Learning design (4th ed., pp. 247–261). Springer.
Ifenthaler, D., & Tracey, M. W. (2016). Exploring the relationship of ethics and privacy in learning analytics and design: Implications for the field of educational technology. Educational Technology Research and Development, 64, 877–880. https://doi.org/10.1007/s11423-016-9480-3
Steele, P., Burleigh, C., Kroposki, M., Magabo, M., & Bailey, L. (2020). Ethical considerations in designing virtual and augmented reality products—Virtual and augmented reality design with students in mind: Designers' perceptions. Journal of Educational Technology Systems, 49(2), 219–238.
References

Boling, E., & Gray, C. (2021). Instructional design and user experience design: Values and perspectives examined through artifact analysis. In B. Hokanson, M. Exter, A. Grincewicz, M. Schmidt, & A. Tawfik (Eds.), Intersections across disciplines: Interdisciplinarity and learning (pp. 93–108). Springer.
Dean, P. J. (1993). A selected review of the underpinnings of ethics for human performance technology professionals—Part one: Key ethical theories and research. Performance Improvement Quarterly, 6(4), 3–32. https://doi.org/10.1111/j.1937-8327.1993.tb00603.x
Finn, J. D. (1953). Professionalizing the audio-visual field. Audio Visual Communication Review, 1(1), 6–17.
Glassdoor Team. (2021). Personal ethics: What they are and why they are important. https://www.glassdoor.com/blog/guide/personal-ethics/
Gray, C. M., & Boling, E. (2016). Inscribing ethics and values in designs for learning: A problematic. Educational Technology Research and Development, 64(5), 969–1001. https://doi.org/10.1007/s11423-016-9478-x
Hill, J. (2006). Professional ethics. In J. A. Pershing (Ed.), Handbook of human performance technology (3rd ed., pp. 1047–1066). Pfeiffer.
IBSTPI. (2012). Code of ethics. In T. A. Koszalka, D. F. Russ-Eft, & R. Reiser (Eds.), Instructional designer competencies: The standards (4th ed., pp. 149–150). Information Age Publishing.
Koszalka, T. A., Russ-Eft, D. F., & Reiser, R. (2013). Instructional designer competencies: The standards (4th ed.). Information Age Publishing.
Lachheb, A., & Boling, E. (2021). The role of design judgment and reflection in instructional design. In J. K. McDonald & R. West (Eds.), Design for learning: Principles, processes, and praxis. EdTech Books. https://edtechbooks.org/id/design_judgment
Lin, H. (2007). The ethics of instructional technology: Issues and coping strategies experienced by professional technologists in design and training situations in higher education. Educational Technology Research and Development, 55(5), 411–437. https://doi.org/10.1007/s11423-006-9029-y
Moore, S. L. (2014). Ethics and design: Rethinking professional ethics as part of the design domain. In B. Hokanson & A. Gibbons (Eds.), Design in educational technology (pp. 185–204). Springer.
Moore, S. L. (2021). The design models we have are not the design models we need. The Journal of Applied Instructional Design, 10(4). https://doi.org/10.51869/104/smo
Moore, S. L., & Griffin, G. (2022). Integrating ethics into the curriculum: A design-based approach for preparing professionals to address complex problem spaces. In J. E. Stefaniak & R. M. Reese (Eds.), The instructional design trainer's guide: Authentic practices and considerations for mentoring ID and ed tech professionals (pp. 121–134). Routledge.
Nelson, H. G., & Stolterman, E. (2012). The design way: Intentional change in an unpredictable world (2nd ed.). The MIT Press.
Osguthorpe, R. T., Osguthorpe, R. D., Jacob, W. J., & Davies, R. (2003). The moral dimensions of instructional design. Educational Technology, 43(2), 19–23.
Rieber, L., & Estes, M. (2017). Accessibility and instructional technology: Reframing the discussion. Journal of Applied Instructional Design, 6(1), 9–19.
INDEX
Pages in italics refer to figures and pages in bold refer to tables.

Affordances 9, 26, 42–44, 71
Attainment value 149–150
Authentic learning 92–96, 99–101, 116–117
Autonomy 75, 84, 93, 97, 103, 139, 148, 154
Case-based learning 95–96, 98, 116
Cognitive apprenticeship 104–110, 117, 138–139
Competence 8–9, 148–149
Conceptual knowledge 6–10, 127
Conditional knowledge 6–9, 24, 100, 109
Confirmative evaluation 178, 181–183
Conjecture 25–27, 58, 60
Context 18, 20–22, 70–72
Contextual analysis 71–76
Culture 64–66, 171
Decision 14–17
Decision-making 17–24, 60, 87
Design judgment 18, 26, 141, 188
Dynamic decision-making 23–26, 169
Empathetic design 61, 66, 86
Ethics 65–66, 187–189
Evaluation 6, 19, 83, 87, 126, 170–171, 175–177
Expertise 4–6
Extraneous load 121, 124, 126–127
Extrinsic motivation 147–148
Feedback systems 165
Fluency 9–10
Formative evaluation 171, 179–181
Generative learning strategies 120, 123–126
Germane load 124–126
Goal setting 73, 108
Help seeking 135–138
Human performance technology 162–163, 170
Inclusive design 192
Instructional context 73–77, 85, 95, 103, 110, 117, 134–135, 147, 151, 183
Instructional design 2
Intrinsic load 124, 126
Intrinsic motivation 147–148
Intrinsic value 149–150
Job analysis 165
Knowledge management 159
Learner analysis 6, 41, 48–51
Learner characteristics 50, 55, 58, 64, 74–75, 100, 103, 180
Learning 92–93
Learning space 26
Localization of context 82, 86–87, 92, 112
Modeling 100, 105–106
Morals 187–188
Need 36–37
Needs analysis 32–38
Needs assessment 32–37
Non-instructional intervention 42, 45, 165, 171
Organizational design 165
Orienting context 74, 76–77, 80, 85, 100, 138
Personas 61–64, 86, 147
Primary needs 36–37
Prior knowledge 23, 25–26, 50–51, 117–118, 122
Problem-based learning 95–98
Problem finding 5, 7
Problem framing 5–7, 27
Problem solving 5
Procedural knowledge 6
Productive failure 126–128
Rational decision-making 22, 169
Relatedness 148–149
Scaffolding 95, 99, 106–108, 117–118, 122–123, 139–140
Secondary needs 36–37
Self-evaluation 133–137, 141, 152
Self-regulated learning 6, 133–137
Situated cognition 93, 95
Strategic knowledge 7–9
Summative evaluation 6, 181
Supplantive learning strategies 117–118
Systematic process 2, 33, 154, 191–192
Systemic process 2–4
Task strategies 133–135, 137
Tertiary needs 36–37
Time management 133, 135, 138
Transfer context 73–76, 92–93, 99, 110, 147, 150, 182
Utility value 149–150
Valence 149
Values 18, 20–21, 62, 64, 73, 77, 150, 187–188
Zone of proximal development 94–95, 103