Assessing Service-Learning and Civic Engagement: Principles and Techniques [2nd ed.], ISBN 9781945459122

This book offers a broad overview of many issues related to assessment in higher education, with specific application for service-learning and civic engagement.


English | 170 pages | 2018


Assessing Service-Learning and Civic Engagement


Campus Compact is a national coalition of colleges and universities committed to the public purposes of higher education. Campus Compact publications focus on practical strategies for campuses to put civic education and community engagement into action. Please visit http://compact.org for more information.


Assessing Service-Learning and Civic Engagement
Principles and Techniques
Second Edition

SHERRIL B. GELMON, BARBARA A. HOLLAND, AND AMY SPRING WITH SEANNA M. KERRIGAN AND AMY DRISCOLL

BOSTON, MASSACHUSETTS
Distributed by Stylus Publishing, LLC.


COPYRIGHT © 2018 BY CAMPUS COMPACT

Published by Campus Compact
45 Temple Place
Boston, MA 02111

All rights reserved. No part of this book may be reprinted or reproduced in any form or by any electronic, mechanical, or other means, now known or hereafter invented, including photocopying, recording, and information storage and retrieval, without permission in writing from the publisher.

Library of Congress Cataloging-in-Publication Data
Names: Gelmon, Sherril B., 1955- editor.
Title: Assessing service-learning and civic engagement : principles and techniques / Sherril B. Gelmon, Barbara A. Holland, and Amy Spring, with Seanna Kerrigan and Amy Driscoll.
Description: Second Edition. | Boston, Massachusetts : Campus Compact, [2018] | "Distributed by Stylus Publishing." | Includes bibliographical references and index.
Identifiers: LCCN 2017050429 (print) | LCCN 2018000183 (ebook) | ISBN 9781945459122 (uPDF) | ISBN 9781945459115 (ePub, mobi) | ISBN 9781945459092 (cloth : acid-free paper) | ISBN 9781945459108 (paperback : acid-free paper) | ISBN 9781945459122 (library networkable e-edition) | ISBN 9781945459115 (consumer e-edition)
Subjects: LCSH: Service learning--United States. | Community and college--United States. | Political participation--United States.
Classification: LCC LC220.5 (ebook) | LCC LC220.5 .A85 2018 (print) | DDC 361.3/70973--dc23
LC record available at https://lccn.loc.gov/2017050429

13-digit ISBN: 978-1-945459-09-2 (cloth)
13-digit ISBN: 978-1-945459-10-8 (paperback)
13-digit ISBN: 978-1-945459-12-2 (library networkable e-edition)
13-digit ISBN: 978-1-945459-11-5 (consumer e-edition)

Printed in the United States of America

All first editions printed on acid-free paper that meets the American National Standards Institute Z39-48 Standard.

Bulk Purchases
Quantity discounts are available for use in workshops and for staff development. Call 1-800-232-0223

First Edition, 2018


CONTENTS

PREFACE  vii
ACKNOWLEDGMENTS  xi
1. REFLECTIONS ON THE PROGRESS OF MEASUREMENT OF COMMUNITY ENGAGEMENT, 2001–2018  1
2. ASSESSMENT PRINCIPLES AND STRATEGIES: AN OVERVIEW  31
3. STUDENT IMPACT  45
4. FACULTY IMPACT  65
5. COMMUNITY IMPACT  95
6. INSTITUTIONAL IMPACT  113
7. METHODS AND ANALYSIS  129
REFERENCES  137
ABOUT THE AUTHORS  145
INDEX  147


PREFACE

Service-learning is an educational methodology that combines community-based experiences with explicit academic learning objectives and deliberate reflection. These learning experiences require a partnership between the community (nonprofits, schools, governments, businesses, human services organizations, or other entities) and the institution or academic unit/program, and they are characterized by a focus on mutual benefit. Improvement and sustainability of the experiences and the partnerships are enhanced through formal assessment activities that involve community, faculty, student, and institutional voices and perspectives. This handbook presents a set of well-tested strategies and techniques for gathering data about the impact of service-learning and similar programs.

The Evolution of This Monograph

This handbook was first published by Campus Compact in 2001 and remains one of Campus Compact's best-sellers. The third reprint of the original book was published in 2009 and has long been sold out. The ongoing demand for copies of this handbook signals that the core material of the original publication is still relevant. Now, 17 years after it first appeared, and on the occasion of the 30th anniversary of Campus Compact, the original author team has developed an updated chapter 1 that brings the handbook into this new and dynamic time of innovation, progress, and growth in the work of community and civic engagement in higher education. We considered rewriting the entire work to create a new edition, but as we talked with others who have found the book useful and reviewed the content in detail, we concluded that the original text continues to have utility for practitioners in the field. It is a clear and straightforward description of the basics of how to design and implement a comprehensive data collection plan, including the development of instruments and tools for capturing data from the diverse constituencies involved in community engagement, service-learning, and related activities. The handbook offers guidance on how to clarify measurement goals and objectives; select and develop the instruments and tools most relevant to specific goals for assessment and evaluation; craft clear questions that help respondents give accurate and relevant responses; and focus on the diverse perspectives of each constituency involved in engaged activities: students, faculty, community partners, and institutional voices.

The creation of the core content that became this handbook began with an internal research project at Portland State University (PSU) in 1994. The 1990s were a decade of major institutional transformation at PSU. Campus leadership and a large and diverse group of faculty, staff, students, and community partners were involved in participatory planning and visioning that led to a new identity as an engaged urban research university with the focus that continues today: "Let Knowledge Serve the City." Completely new models were developed for general education and the undergraduate experience, teaching improvement infrastructure and other faculty development support, an innovative faculty promotion and tenure policy that incorporated Ernest Boyer's vision of multiple forms of scholarship, and a research agenda that incorporated an emphasis on community-relevant research issues, among other changes. Service-learning (often referred to at PSU and other institutions as community-based learning) was a prominent strategy, emphasized throughout undergraduate learning and culminating in a final-year community-based capstone for all undergraduate students. Engaged learning was also integrated into graduate and professional education. With such a sweeping commitment to engagement strategies as a core of the institution's plan and identity, leaders and scholars worked together to develop a comprehensive agenda of evaluation and research on the effects of community-based learning and engagement on the participating constituencies. This monograph emerged from various projects, first at PSU and then through national and regional research initiatives and at other institutions. Our goal in 1994 when we began this work was to develop an assessment model that responded to the complexity of
service-learning and other strategies for community engagement in higher education. When we began, a literature review revealed there was little available in terms of relevant models, approaches, and instruments. We used a case study method to test multiple assessment strategies at PSU that would give value to both our service-learning and other community activities. Over four years (1994–1998), the research team (composed of the authors with other occasional participants and graduate assistants) conducted an extensive series of studies using multiple methods to develop questions, instruments, and an understanding of which approaches generated the most accurate and useful data for each constituent group. From the beginning, there was an intentional focus on understanding how to hear the voices of students, community, faculty, and campus leaders; build knowledge of the kind of infrastructure, staffing, and funding needed both for start-up and for ongoing sustainability; and investigate how to measure impacts and benefits and then apply that information for program improvement and support.

Using Multiple Methods With Multiple Constituents

Early on in our development and testing of our conceptual framework and approach, we learned that an intentional examination and testing of diverse methods was necessary, and it ultimately led to the structure that became this handbook (Driscoll, Holland, Gelmon, & Kerrigan, 1996). We observed that approaches to data collection had to be specifically tailored for students, faculty, community partners, and institutions. By testing a range of methods, we concluded that diverse methods offered different value and purposes in the context of the unique constituents and individual purposes. We also found that multiple methods of data collection were important for each constituent group to gain a full perspective on activities and outcomes. We relied on existing literature of the field to draft instruments that would be relevant to any institution or community setting. We observed that practitioners in the field needed advice on planning and sustaining strategies for data collection, selection of methods, design of tools, methods of data analysis, and reporting, so we also included guidance for methods and implementation in the original handbook. We began presenting our conceptual model in various venues in 1995. We immediately received
many requests for copies of our assessment methods and instruments and decided to prepare a handbook that could be widely available. We published a first edition of this handbook in June 1997 through the Center for Academic Excellence at Portland State University (Driscoll, Gelmon, et al., 1997). That edition was based on development of a large number of assessment instruments, and our experiences pilot-testing those instruments in 10 service-learning courses at PSU. We quickly learned from our analysis of those results that we could make refinements in the conceptual matrices that guided our assessment activities, and that many of the instruments could benefit from further refinement. These refinements were made, and a second edition was published, again by PSU, in April 1998 (Driscoll, Gelmon, et al., 1998). Over 2,000 copies were printed and distributed nationally and internationally through PSU. Campus Compact then invited us to expand the handbook and provided us with support, made possible by the Corporation for National Service, to prepare and publish the 2001 edition. This edition, while grounded in the earlier PSU versions, offered a much broader perspective on assessment strategies as the authors had worked with this conceptual material in various projects at multiple higher education institutions across the country and with multiple community organizations. This provided broader insights into background, supporting literature, advantages and limitations, and practical guidance on use of the various instruments. Well over 2,000 copies have been sold through Campus Compact.

This handbook's popularity reflects awareness of the need for quality tools to gather even basic information about service-learning and engagement activities. In PSU's planning environment, which emphasized evidence-based decisions and attention to the corollary impacts of those decisions, and in the context of rapid implementation, we had a unique opportunity to focus on measurement at the same time that engagement programs were being launched. We realized that the core focus was on partnerships, and the reality that each participating constituent group comes to a partnership with its own goals (and some shared ones as well) meant that any plan to describe, measure, or analyze activity design and outcomes would have to be organized by the constituent group to reflect its unique goals, aims, and roles. This understanding continues to shape approaches to data collection for this field today (e.g., the Carnegie Community Engagement Classification application, described in Chapter 1).


Much like the original publication, this new edition relies heavily on examples and practices employed at PSU. The original work, associated instruments, and practices were developed, tested, and revised in PSU classrooms, with faculty and institutional systems. Over the years, PSU has continued to use many of the assessment techniques that were born out of this work. Other institutions and organizations have also built on our work and contributed to our learning, understanding, and examples.

Focus on Service-Learning

The primary focus of our assessment efforts has been on curricular-based service-learning (which today is often called community-based learning, community-engaged learning, and other terms). In each chapter, we also provide illustrations of other applications of this material, such as other kinds of experiential education, cocurricular activities, institutional change processes, partnerships, or faculty development initiatives. While we recognize that there are many potential applications of this material and encourage such use, the primary focus, and therefore most of the illustrations, relates to service-learning. A reader can take these illustrations and apply them as a helpful resource in other contexts as well. This handbook is not intended to be the ultimate guide to service-learning. For resource materials on service-learning, the reader is referred to the many resources available through Campus Compact's website.

Organization of the Handbook and Framing of the 2018 Chapter 1

When this volume was originally produced, the unit of analysis was primarily the individual, and thus the focus was on the impact of an individual course, or the involvement of a faculty member or community partner organization in a specific activity, with some attention to how these findings could be aggregated to understand broader impact. We articulated four constituencies (students, faculty, institution, and community) and believe that the four-constituency approach still works and that much of the initial content remains relevant. The original framework of the handbook focused on fundamental questions such as: What do I want to know? What would I look for? How will I measure or
observe this? From whom or where can I collect this evidence? This approach has reportedly helped many practitioners and leaders advance and inform their approach to measurement and evaluation. The sample tools for each constituent group were designed to address the basic and commonly understood characteristics of good practices as well as the common goals widely shared as desirable outcomes of effective service-learning and engagement. These basic practices and goals are still at the core of the field's aims and values, though intervening years have produced new insights regarding purposes, objectives, outcomes, and policies. We have not undertaken a systematic literature review in preparing this edition, and this update should not be read as a scholarly summary of the state of the field. Rather, we draw on our collective experiences at PSU (where three of us continue to work) and with multiple other institutions and organizations and reflect on those experiences. We cite some examples from other institutions that we know and that have materials readily accessible on websites, and we recognize that there are many other institutions doing excellent work (that may be accessed via their websites). We begin with a new chapter, "Reflections on the Progress of Measurement of Community Engagement, 2001–2018." This chapter starts with a review of issues related to measurement of engagement. Readers have asked for this foundational content to help understand the general context for using this handbook. We then present updates on progress in measurement for each of the four constituencies framed in the original text. We have ordered the discussions of the four constituencies to start with the institutional perspective—this is where the major work is occurring today, and the primary questions often asked about the impact of engagement-related activities are, "Engagement—to what end?" and "How can we develop a more focused agenda of engagement informed by integrated measures?" As a result, we begin with a focus on understanding institutional mission and purpose; then we discuss the community perspective and progress on gathering the evidence of that work; then we address the development and recognition of engaged faculty; and then focus on developments and trends in measuring the impact of engagement on students. This 2018 reflection is followed by the original handbook content. The content of the original handbook is retained and is presented in sequential chapters. The first is an overview of assessment
philosophy and methods. While many resources exist on assessment, we have included an overview of assessment strategies in this edition of the handbook as a resource for framing our approach to assessment. In addition, this preface will ensure that users of this handbook have ready access to basic information about assessment. The following chapters present each of the four assessment constituencies in a separate chapter (students, faculty, community, and institution). Each of these chapters includes

• a brief review of the literature,
• discussion of issues in assessing impact on that constituency,
• the assessment matrix,
• strategies for assessment of that particular constituency (including advantages and limitations of particular instruments), and
• examples of assessment instruments we have used in various settings.

Each instrument is introduced by a discussion of purpose, preparation, administration, and analysis, specific to that instrument. The final chapter focuses on use of the methods and analysis of data. Again, we offer best practices and suggestions for use based on our collective experiences. This chapter incorporates discussion on strategies for making assessment work.

Conclusions

We have learned that systematic attention to measurement and assessment is important for improving outcomes as well as communicating the value of service-learning to many audiences. Developing a "culture of evidence" to describe and document the impact of service-learning supports its institutionalization, facilitates the translation of course-based learning into scholarship, and fosters trust and communication among the various involved constituencies. Engagement in measurement and assessment is a valued element of the service-learning experience for each constituent group as it articulates its unique perspective and learns from and appreciates the perspectives of the other constituent groups. Ongoing attention to measurement and assessment warrants our investment of time, our expenditure of resources, and our commitment.

There is tremendous diversity across the field regarding expertise and capacity to design, implement, analyze, and interpret data on community engagement. Whether one is a faculty member with advanced research skills in a specific discipline, a staff person working in a program or unit supporting engagement, a coordinator responsible for an AmeriCorps program or a volunteer center, or an academic administrator overseeing institutional assessment, the design of research, evaluation, and assessment of service-learning and community engagement is a challenge. Why? The language is contested, activity and program designs are diverse within and across institutions, the intended outcomes and goals are often fuzzy, and the impacts on and expectations of different participants (students, faculty, institutional leaders, community partners) are inevitably different. For these and many other reasons, Campus Compact recognized the need for an updated handbook that would continue to encourage attention to rigor, good practice, and consistency in the methods and approaches used to gather evidence of impact.

Over the past decade it has become clear that the field was better at creating service-learning and community engagement programs than it was at assessing, evaluating, or even describing them in any systematic way. This leads us to where we are today—a dynamic and exciting era where the focus of this work is now on gathering quality descriptive and analytical evidence about activities. Increasingly this information is used to inform improvement strategies and redesign curricula to enhance community-based learning opportunities. Higher education institutions are developing more focused agendas of work with articulated outcomes, recognizing and rewarding rigorous community-engaged scholarship, and developing systematic data collection methods that provide actionable information.

Sherril Gelmon, Barbara Holland, Amy Spring, Seanna Kerrigan, and Amy Driscoll
Portland, Oregon
November 2017


ACKNOWLEDGMENTS

This work has been supported over the years by a number of institutions, funders, and partners. We are grateful to the dozens of institutions, and individuals at those institutions, who have participated in various projects to test and improve this assessment model; they are too many to name! The following list recognizes the major supporters; any omissions from this list are unintentional.

Partners

• Campus Compact
• Center for Academic Excellence, Portland State University
• Center for the Health Professions, University of California San Francisco
• CES4Health
• Community-Campus Partnerships for Health
• Community Care Network Demonstration Program, Hospital Research and Educational Trust, American Hospital Association
• Council of Independent Colleges
• Health Professions Schools in Service to the Nation
• Healthy Communities of the Columbia-Willamette, Inc.
• Institute for Healthcare Improvement, Interdisciplinary Professional Education Collaborative
• Institute for Nonprofit Management, Portland State University
• Many participating students, community partners, faculty, and institutional representatives

Institutions

• Portland State University
• California State University, Monterey Bay
• Northern Kentucky University

Funders

• The Corporation for National Service
• The Pew Charitable Trusts
• Bureau of Health Professions, Health Resources and Services Administration, U.S. Public Health Service
• Fund for the Improvement of Post-Secondary Education, U.S. Department of Education
• W.K. Kellogg Foundation
• National Fund for Medical Education
• Northwest Health Foundation
• David and Lucile Packard Foundation
• Robert Wood Johnson Foundation


Chapter One

REFLECTIONS ON THE PROGRESS OF MEASUREMENT OF COMMUNITY ENGAGEMENT, 2001–2018

Context for the 2018 Handbook Revision

This updated chapter presents our observations and reflections on the evolution and current state of the language, purposes, and perspectives on the field of community engagement. Over 15 years, key developments in policy and culture have increased the strategic value of community-engaged activities as well as the motivation to research, evaluate, and measure the work. Today, almost every educational institution considers it an urgent priority to describe and measure the purpose, quality, process, and impact of its full array of community engagement practices and activities. Throughout this book, we offer updates and examples that illustrate how attention to assessment, measurement, and monitoring strategies helps answer the enduring questions: How do we know that this work makes a difference and meets the intended goals? What constitutes best or promising practices? How do we capture impacts and perspectives of those who are involved in the work?

Increasing Interest in Collecting Community Engagement Data

Today, there is substantial attention paid to the need to systematically and continuously gather descriptive data about engagement activities of all types as well as their impacts and outcomes. This agenda looks at all forms of community engagement—engaged learning, engaged research, engaged public service, and outreach. The introduction of the President's Higher Education Community Service Honor Roll after Hurricane Katrina in 2005 (Corporation for National and Community Service, 2016) and the Carnegie Foundation's Elective Community Engagement Classification in 2006 (NERCHE, 2016) were two major opportunities for institutional recognition that required every applicant to generate specific, defensibly accurate data about their engagement activities. In the 1990s and early 2000s, federal funding for engagement activities increasingly required grant reports and specific evaluations, both of which created greater need for ongoing and accurate data collection. Descriptive and analytic data have also been increasingly essential as institutions seek to increase alumni and donor involvement, including gifts and endowments for engagement scholarships, equipment, transportation programs, staff positions, academic chairs or fellows, and buildings meant to support the campus engagement agenda.

The Honor Roll and the Carnegie application revealed that the lack of attention to sustained measurement and data collection was a substantial gap in institutional planning and practice. Institutional leaders and engagement practitioners were frustrated by the lack of systems for data collection that would report the outcomes of this work. The focus of engagement management was often on
growth, participation, resources, and political support but was rarely systematic and often quite random. Individual faculty made choices about how the idea of engagement might connect to their teaching and research work, or whether they saw it as public service. If they took the plunge into engagement, they might attend a workshop on curricular design and find their own partner or seek help from a campus unit meant to support community engagement activities. Leaders of engagement became aware that random work is very difficult to measure, and, if measured, any results are difficult to interpret or replicate. Early impetus to reflect and measure came from participation in various federal grant programs. The largest and most influential programs were the Housing and Urban Development (HUD) Community Outreach Partnership Centers program and Learn and Serve America (a program of the Corporation for National and Community Service). These programs integrated required reporting, reflection, and data collection elements that demonstrated the risk of putting one's energy into organizing and running programs without integrating attention to descriptive and analytical data collection and evaluation. Practitioners and leaders of the field began to realize that enthusiasm for launching new program activities could overwhelm the immediate need to articulate specific goals and implement integrated strategies for capturing inputs, outputs, and outcomes throughout the funded program.

New Pathways for Dissemination of Engagement Knowledge

Since this handbook was published in 2001, the level and quality of scholarly research on engagement have expanded. There are more refereed journals and conferences for dissemination. Established journals such as the Michigan Journal of Community Service Learning, Metropolitan Universities, and the Journal of Higher Education Outreach and Community Engagement continue to be prominent venues for dissemination of scholarship and practice reports related to community engagement. In addition, new journals and conferences have increased the choice of venues for publication of research and other evaluative studies (see the discussion in the Faculty section of this chapter), such as the International Association for Research on Service-Learning and Community Engagement and its journal, the International Journal of Research on Service-Learning and Community Engagement. Journals and conferences emphasizing engagement scholarship, complemented by many disciplinary journals
and conferences, are creating a much broader knowledge base from which others can learn. In addition, individual faculty have taken on this scholarship and spread it “locally” in their disciplines, as the overall engagement field has grown. There is a need to establish a more consistent understanding of “what works” in terms of engagement practices and for more consistent and shared opportunities for review and recognition of impacts and outcomes. Ultimately, all institutions should have the capacity to measure their work for ongoing improvement as well as benchmark progress against role model institutions.

Gathering Evidence About Community Engagement

Community engagement and related pedagogies are gaining more strategic support and interest on many campuses. The work requires us to develop effective, routine, and ongoing data collection systems that will create the conditions to capture evidence of effective actions and strategies and connect individual actions to larger community and institutional efforts. The growing interest in focusing our engagement efforts on a few broad public issues is not meant to restrict independent faculty engagement but recognizes the complexity of persistent challenges facing local and global communities. Working internally and externally as partners, looking at complementary and diverse aspects of a public issue, we will be more able to measure impact and outcomes that lead to replication and progress toward change. We encourage users of this handbook to focus on the intentionality of alignment between institutional strengths and objectives, and community issues (or needs) and opportunities (or assets). This strategic focus defines an agenda of engagement that will facilitate the tracking of outcomes and results for both the institution and community. The original manuscript of this book was mostly about focusing on the specific benefits to each separate constituent group; now we understand the need to focus more on alignment of goals that are truly mutually beneficial for all but may have different, yet complementary, goals and outcomes. As engagement has grown and deepened, institutional leaders are not asking for evidence that community engagement is worthy of time and expense; they are asking questions about how to identify models of partnership and collaboration that are likely to produce positive results. Integrated into their expectations is the need for a campus measurement
strategy that uses an ongoing tracking and measuring scheme that captures evidence for improvement and informs other corollary projects. The growing level of interest in tracking and measurement is evidenced by the explosion of diverse online systems now appearing on the Internet, each claiming to be an effective repository for collecting information about some combination of volunteering, service-learning, other forms of engagement, and/or community partner information. Some of these focus on collecting descriptive data, and some offer capacity to seek or gather feedback or input from various participant constituencies. This is a dynamic space, and it is beyond the scope of this update to review or describe them in any way at this stage. We merely acknowledge these systems are numerous, diverse, sometimes free and sometimes costly (see the Institution section of this chapter for further discussion). Over time, the collective experience of the field in using such online tools will distinguish those that prove useful and relevant from those that do not. The relevant point here is that the desire and motivation to capture both descriptive and analytical data about this work is so energetic and urgent that it has attracted the interest of many online entrepreneurs.

Since 2001, new energy and expertise to support greater attention to research, evaluation, and assessment activities also came from the emergence of new organizations, networks, and dissemination venues. To summarize some of the key activities:

• In 2001, Andrew Furco and Shelley Billig convened the first Service-Learning Research Conference at the University of California, Berkeley. By 2006, the conference attendees organized into a 501(c)3 organization called the International Association for Research on Service-Learning and Community Engagement (IARSLCE) (Gelmon, 2010). Today the annual conference attracts 300 to 400 scholars, students, and practitioners from multiple countries each year, and IARSLCE has become a respected venue for refereed scholarship of all forms of community engagement from engagement leaders and academics around the world, as well as an important venue for new scholars (graduate students, junior faculty, and faculty just beginning work in this area) to present their research and receive feedback from senior scholars (IARSLCE, 2016).
• The major initiatives of the Engagement Scholarship Consortium (ESC, 2017) include the annual Emerging Engagement Scholars
workshop, the Outreach and Engagement Practitioners Network, and the Academy of Community Engagement Scholarship. ESC's annual conference grew out of an initial partnership among three land-grant universities that launched the National Outreach Scholarship Conference in 2001 to share knowledge about their community-based programs (Bruns, 2010).
• Campus Compact (2016a) has contributed significantly to quality practices in service-learning and community engagement through a variety of initiatives as well as regional/national conferences that have inspired faculty, students, administrators, and senior executives to value these teaching and partnership methods as key contributors to student learning and development, to productive connection of campus mission to local voices and local goals, and to the advancement of faculty skills as practitioners (McGovern & Curley, 2010). It has been an influential voice leading the effort to create both commitment to and practical models for reviewing and rewarding faculty involvement in community-engaged scholarship and for intentional planning of engagement work.
• Community-Campus Partnerships for Health (CCPH) was established in 1996 (www.ccph.info), building on the national demonstration project on service-learning in the health professions called Health Professions Schools in Service to the Nation (HPSISN, 1995–1998). CCPH is one of the few community engagement–related organizations that emphasizes the role of community and the active involvement of community partners and organizations, as well as academics, in all its efforts. It holds its major conference every two years, alternating with the Canadian-based CUExpo conference. CCPH has been a major player in advancing our understanding and application of community-engaged scholarship, including the development of CES4Health (2016a).
• Imagining America: Artists and Scholars in Public Life (IA) was launched in 1999 as a consortium of colleges and universities that fostered a national network of campus-community collaborators in humanities, arts, and design. The organization developed an analytical framework to identify and critically consider the range of emerging artistic and scholarly endeavors and promoted public
scholarship as an important and legitimate enterprise in higher education (IA, 2016). In 2008, IA released its report Scholarship in Public: Knowledge Creation and Tenure Policy in the Engaged University (Ellison & Eatman, 2008). This publication has been widely used by many IA institutions to shape their own internal policies and practices.
• The Talloires Network emerged from a 2005 meeting of 29 presidents, rectors, and vice-chancellors from 23 nations convened by Tufts University and other partners. This led to the creation of the Talloires Declaration on the Civic Roles and Social Responsibilities of Higher Education (Talloires Network, 2005). The Talloires Network now includes 10 national and regional networks, has 350 members in 75 countries, and holds periodic conferences in international locations (Talloires Network, 2016).

The focus on more systematic and sustained systems for capturing engagement descriptions and data is having a powerful effect on clarifying language and goals within institutions and will continue to contribute to greater consistency of the work overall. For example, student learning through engagement with community is increasingly integrated into the spectrum of campus modes of hands-on or experiential learning; one example is SUNY Applied Learning (www.suny.edu/applied-learning/). Yet there continues to be debate about the language used in this field—community engagement, civic engagement, service-learning, community-based learning, community-based research, field placements, cooperative placements, and other strategies that explicitly connect academic work to public work with an aim for mutual benefit.

Planning a Measurement Strategy

The level of systematic effort or even of sustained intent to track and measure engagement activities varies widely today. Before focusing on specific constituencies, it is useful to focus on the key components of building a strong and useful measurement strategy.

Articulating Clear Purposes for Data Collection

The fundamental challenge to any data collection scheme is clarifying your ultimate purpose; in other
words, "What do we want to know?" and "How will the information be used?" A clear purpose guides the selection of the type or method of data collection as well as the specific questions and data fields that are essential, optional, and not relevant. When asking others to respond and provide data, we must describe both the value and utility of the information and how the data will be analyzed, disseminated, and used. Clarity of purpose informs the design of instruments and methods, and helps improve the quality and quantity of responses.

A purpose common to many measurement plans, especially if it is the first attempt to gather information on community engagement, is to clarify language and understanding of the work. What do we mean by service-learning and community engagement at our institution (or whatever language you want to establish as the standard)? One of the great barriers to instrument design and interpretation of results is the persistent confusion of terms and language used in this field. A goal of this handbook is to inform instrument design that can reinforce consistent understanding of terms and definitions. For example, faculty can be asked if they use service-learning in their teaching, but if the definition of service-learning is not widely understood in a consistent manner, consistency of responses cannot be ensured. Asking faculty if they involve students in activities outside the classroom through interaction with people or issues relevant to the local jurisdiction in order to meet a specific learning goal will generate responses that will have fidelity to this conception of community engagement. Probing further to understand consistent use of service-learning practices, one can explain the difference in information gathered from field trips, guest speakers, and so on, and thus improve campus-wide consensus on the concept. This is how data collection can help build a campus-wide understanding of both language and good practice components.

For all purposes, think about how measurement will connect community engagement to one or more important and specific institutional strategic goals. Aim to measure in order to test the potential of the activity to contribute to institutional objectives and mission, not just to grow for its own sake. In this way, we begin to move community engagement strategies from the margin to the core by showing their relevance to institutional aims. For those we seek to influence or inform, instrument designers must consider many specific factors so that your instruments will actually collect
data relevant to your purpose and audience. Before crafting a data collection method, you need to consider the following:

• Who is/are the person(s) we seek to influence or inform?
• What action do we want them to take or what attitude do we want them to adopt as a result?
• What evidence would be relevant to that action or attitude?
• What type of evidence would be convincing to the person(s)?

Other purposes that may be important for an institution include, but are not limited to, the following:

• Describing scope and scale of activity to external groups by topic, issue, location, or population
• Attracting internal or external funding
• Promoting quality practices
• Identifying areas for improvement
• Monitoring and growing participation/access for students
• Promoting collaboration across projects
• Identifying areas of similar work and convening like-minded individuals
• Identifying resource or training needs
• Highlighting links between engagement and teaching and learning
• Linking results to awards, recognition, internal funding
• Identifying barriers such as policies, transportation, and so on
• Preparing for accreditation, external recognition, other institutional reports

In planning an approach to tracking and measuring, be clear about the purposes, including thinking about the key audiences with whom data will be shared, and consider what kind of action or understanding they might gain from the information. This will help to identify important data elements and approaches to presenting the data.

Identifying Relevant Available Descriptive and Analytic Data

Given the historic focus on growing activity more than capturing information about the activity itself or the results of the effort, it is important to understand the distinction between tracking or monitoring engagement activity and measuring, assessing, or
evaluating results. Tracking or monitoring primarily captures descriptive information, which can be very useful and important. Measuring, assessing, or evaluating involves more detailed methods of objective inquiry and analysis to look for data relating to specific questions, all as a way to determine the effect or impact. Tracking is about "What's happening?" and measuring is about "What was the result?"

Tracking or monitoring refers to the process of capturing a comprehensive portrait of the level and variety of community-related activities across an institution. The types or forms captured depend on the overall goals and purposes, but it is probably wise, in the long run, to be as comprehensive as possible. Whatever the range of activities, the first aim is to capture descriptive information about these activities from those who are organizing them. The best approach is to set this up in a database where those who create records can update them on their own, thus minimizing effort and ensuring accuracy. A common approach is to create a template of fields that form the record and reflect your overall goals for measurement. In gathering descriptive information you want to think about the basics of description: What is going on? Think about what, when, where, why, how, and who is involved. The aim is to capture the project title; partner(s) identity; location; purpose/goal; timing; method/strategy of the activity; issue(s) of focus; population(s) involved; intended outcomes; links (if any) to teaching, learning, or research; and achieved outputs or outcomes, including asking for any formal evaluations.

A good tracking system that contains a robust array of community-connected activities can surface many useful data points, such as the following:

• Scope, scale, and distribution of activity across campus and community
• Campus understanding of different modes/methods of work in community
• Number of students, faculty, and staff involved in community-based activity
• Number of partnerships that can be sorted by location, issue/topic, population, and so on
• Partner contact information for feedback
• Evidence of outputs/outcomes for academy and community

As shown, a good database of activities can include asking the record holder to share any data they have generated through evaluations, feedback strategies, or other research or analysis strategies
they have used in their community-based activity. Many who lead service-learning and other forms of community-based partnerships routinely collect feedback from or formally evaluate the impacts or outcomes of the activity as articulated by those involved (students, community, others). Some may have formal evaluation reports to share; others may have conducted formal research and even published findings. All of this means data that are useful and typically of good quality. If you create a strong tracking system, you will be amazed at how much data already exists.

For large service-learning programs, convene leaders to discuss the potential for using common assessment or data tools that will help capture consistent and similar data from student participants. As an example, California State University Monterey Bay (CSUMB) has piloted the use of a signature assignment that provides authentic evidence of student learning with respect to the learning outcomes across a variety of courses that all use service-learning. CSUMB hopes to expand this application to all of its service-learning courses, each of which has specific learning outcomes and uses a rubric to assess student achievement. For projects that involve complex issues or comparative interventions, formal research or evaluation methods may need to be implemented, in collaboration between the academic leaders and partners.

The understanding of community perspective on project design, implementation, and views of the cost/benefit of their effort and outcomes (if any) is historically the weakest aspect of attempts at data collection. A good strategy may be to convene engaged faculty with relevant research skills to develop integrated evaluation protocols for planning, operating, and reviewing each campus-community partnership. This approach brings rigor and consistency to the process of hearing community voice and input on process and results and will go far to advance the quality and impact of the work overall. Then, convene relevant faculty and staff along with their community partners so they can jointly discuss their interests and expectations in data collection and evaluation efforts. What data do they collect now, or wish they could collect going forward? What methods work best for their culture and needs? What timing is best for them? Higher education institutions must learn to design measurement schemes that incorporate community capacity, values, and interests if the results are to be valuable in both realms.
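For programs that keep such records in a spreadsheet or database, the template of fields described earlier in this section can be expressed directly as a record structure. The sketch below (in Python) is purely illustrative: the field names and the sample values are assumptions made for the example, not a prescribed schema, and any institution would adapt them to its own goals and systems.

    from dataclasses import dataclass, field
    from typing import List, Optional

    @dataclass
    class EngagementActivityRecord:
        """One tracking record for a community-connected activity.

        Field names mirror the descriptive elements suggested in this
        section; they are illustrative, not a prescribed schema.
        """
        project_title: str
        partners: List[str]                 # partner organization(s) identity
        location: str
        purpose_goal: str
        timing: str                         # term, dates, or duration
        method_strategy: str                # e.g., service-learning course, community-based research
        issues_of_focus: List[str]
        populations_involved: List[str]
        intended_outcomes: List[str]
        links_to_teaching_learning_research: Optional[str] = None
        achieved_outputs_outcomes: List[str] = field(default_factory=list)
        formal_evaluations: List[str] = field(default_factory=list)  # links or citations to any reports

    # A record created and later updated by the activity's organizer
    # (all values below are invented for illustration only).
    example = EngagementActivityRecord(
        project_title="Neighborhood Food Access Capstone",
        partners=["Example Community Food Bank"],
        location="Southeast Portland",
        purpose_goal="Map gaps in food access with residents and the partner agency",
        timing="Spring term",
        method_strategy="Community-based learning capstone course",
        issues_of_focus=["food security"],
        populations_involved=["undergraduate students", "neighborhood residents"],
        intended_outcomes=["student learning outcomes met", "usable food access map for the partner"],
    )

Because those who organize the activity maintain their own records, a structure like this keeps effort low while ensuring that the same descriptive elements are captured for every activity.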


Higher education institutions should be cautious in making broad claims about their effect on communities (e.g., reducing poverty, improving school outcomes, creating solutions for homelessness, etc.). A good approach is to recognize that an integrated and ongoing tracking and measuring scheme will enhance the ability of the campus to accurately state the results learned from specific projects and to estimate their replicability in similar settings. Even in carefully designed interventions, there are many other factors influencing participants in other aspects of their lives. We cannot control all factors, so our claims should be evidence-based. Most of the community issues our campuses focus on are complex problems with many factors affecting any attempt to create improved outcomes. In general, we should only describe the apparent results of the actual work we do with the people who were actively involved, acknowledge there are other unknown factors involved, and avoid generalization beyond the specific project. There is no substitute for talking with communities about their perspective on the processes, outcomes, and impacts of partnership work. Their voice and their data should always be evident in our strategies and publications or other reports of results.

Recruiting Allies to Help Design and Launch the Data Collection System

Colleges and universities are continually collecting data for all kinds of reasons. Before creating yet another survey or instrument that people will be asked to complete, consider who might be helpful to you and where there might already be some useful data. A good way to start is to meet with the institutional research office staff. These people know what data are already collected and how to access these data. There are already many databases, reporting platforms, routine and ongoing surveys, and other institutional data being collected. From these, you may be able to glean useful data pertinent to your goals, and/or you may be able to negotiate adding one or a few fields to existing tools and databases so that your information will be collected through that mechanism. Other offices also keep valuable data that may relate to your engagement measurement goals. For example, student affairs may keep extensive information on volunteering, or the Office of Research may be able to search funded grant projects for key words relating to community-based research or other community partnerships.


Each institution is different, so take time early in your process to look for existing data in systems such as annual faculty activity reports, annual institutional reports, recent self-studies for campus or programmatic accreditation, program review reports, course evaluations, climate surveys, employee surveys, alumni surveys, other regular external reports, or national surveys of students. There are nationally validated research instruments that are available (some charge fees) and include questions about student responses to service-learning experiences. Drawing data from existing internal or external tools or adding a few questions to existing processes may help uncover good data, reduce the need for unique analysis of data, and keep any unique data collection strategy shorter and free of any duplicate questions.

Building Instruments

The original narrative of this handbook offered guidance on how to decide which key data points will be important for tracking and measuring engagement, as well as sample data collection tools with instructions on how to implement, analyze, and share them. More tools have been developed in recent years; many of these are referenced throughout the text, and others may be identified through a search of both published literature and websites. Whether or not you are an expert researcher or evaluator, it may be helpful to recruit some experts from faculty, staff, and advanced students to work with you.

The conceptual matrix framework described in the original narrative is a guide to identifying specific concepts (what we want to know) and key indicators (what we look for to see if the concept was present), as well as suggesting data collection methods and sources. Matrices for each constituent group are also included in the original text. The articulation of concepts and indicators specific to your needs informs your choice of methods and identification of respondents. The discipline of this process makes analysis of returned data more accurate and efficient, because the fields and questions in the instrument are linked back to the core concepts, which ultimately reflect the purposes for gathering data in the first place.

Consider the type of method(s) and instrument(s) that will best get at the core concepts and overall purposes you have in mind. See the discussion at the end of chapter 2 for tips on making decisions about which instrument would be most effective for your questions. Surveys are too often selected because they seem easy to do. The reality
is that surveys are good at getting at some types of responses and not as good for others, in terms of time, depth of information, and accuracy in a context of diverse respondents. The examples will help you see the differences. Whatever method or instrument you choose, get input from a colleague who has real expertise in that particular type of method or instrument. Even if you are research-trained and have experience creating instruments, it is wise to let another expert review your instrument with a fresh perspective; this will greatly strengthen the response rate and the quality of the data.
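To make the conceptual matrix concrete, a single row of such a matrix can be written down as a small data structure that keeps a concept, its indicators, the candidate methods, and the sources together. The sketch below (in Python) is a hypothetical illustration only; the concept, indicators, methods, and sources shown are assumptions for the example and are not drawn from the handbook's actual matrices.

    from dataclasses import dataclass
    from typing import List

    @dataclass
    class MatrixRow:
        """One row of a conceptual assessment matrix: what we want to know,
        what we would look for, and how and from whom we would collect it."""
        concept: str            # What do I want to know?
        indicators: List[str]   # What would I look for?
        methods: List[str]      # How will I measure or observe this?
        sources: List[str]      # From whom or where can I collect this evidence?

    # A hypothetical row for the student constituency; the handbook's own
    # matrices define the actual concepts and indicators for each group.
    student_row = MatrixRow(
        concept="Awareness of community issues",
        indicators=[
            "references to community issues in reflective writing",
            "self-reported change in understanding of the issue",
        ],
        methods=["analysis of reflection journals", "pre/post survey", "focus group"],
        sources=["student journals", "students enrolled in the course"],
    )

Keeping each instrument question linked back to a row like this makes it easier to trace returned data to the original purpose for collecting it.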

Planning for a Sustained Data Collection and Analysis Process

As you progress in developing your plan, think about a network of support and involvement to help you meet your purposes and to keep the data collection process moving forward. Identify a working team to advise your efforts and help with communication of the purposes across the institution. You may also wish to work with a senior administrator who is supportive of the engagement agenda and can offer positional leadership when needed to clear the way and encourage robust participation. While every institution is different, you should think about whether data collection is best housed in the community engagement infrastructure or in another unit of the university that has the capacity to support and sustain the process in a way that brings greater credibility and political buy-in, so that the campus sees the value of the information and there is motivation to contribute and participate.

In the following sections, we address specific challenges, opportunities, and strategies for data collection from each of the primary audiences involved in community engagement endeavors.

Institutional Impact

During the last 15 years, higher education in the United States has experienced declining financial support and growing criticism regarding cost and effectiveness, especially in relation to student success. Curriculum, funding and budgeting models, and academic culture are all changing to adapt to twenty-first-century expectations and opportunities. In this context, many academic institutions have begun to explore dramatic changes meant to enhance the student experience, reduce the expense of tuition, and improve learning and graduation rates,
while also demonstrating institutional citizenship through greater involvement in regional and global issues, challenges, and opportunities. This has greatly increased adoption of engaged methodologies in the context of teaching, learning, and research, and thus has necessitated significant changes in scholarly practices, values, and faculty culture. To succeed in leveraging community engagement strategies as a pathway toward these goals, every institution needs to consider an intentional plan and ongoing strategy for monitoring engagement activities and plans and for capturing relevant descriptive and analytical data to inform both program improvement and outcomes reporting.

Growth in institutional appreciation of community engagement as an important strategy for achieving both campus and community goals has further accelerated the need for this handbook. After years of wrestling with the language, concepts, application, and principles of engagement, community engagement has now emerged as a key institutional commitment that no campus can afford to overlook. Both anecdotal cases and concrete evidence reveal engagement's potential for addressing key higher education challenges such as recruitment and retention of students, faculty, and staff; improved student completion rates; increased research activity, including more interdisciplinary forms of discovery; renewed connections with governments; effective collaborations with community nonprofit agencies; access to new sources of revenue, including connections to donors who seek to use their resources to improve community life and outcomes; and expectations of higher education institutions as "anchors" in their communities.

Strategic Commitment to Engagement
This recognition of the strategic value of engagement to institutional improvement and progress has been accompanied by a growing national commitment across higher education to addressing what many call “the grand challenges” facing local and global communities (Bill & Melinda Gates Foundation, 2016). Higher education played a substantial role in the mid-twentieth century in advancing national objectives such as economic growth, the race to space, the development of vaccines and medicines to increase health and longevity, and exploration of multiple kinds of technology and their applications. Many higher education leaders believe that higher education must step forward today and focus on the complex,

multidimensional aspects of economic growth, water, food, climate change, immigration, conflict, and other complex, large-scale public issues. These challenges require internal collaboration among multiple disciplines and external partnerships with government, business, industry, health, schools, and the nonprofit sector. Solutions for and progress on complex public issues require new approaches that combine academic knowledge and skills with the lived experiences and ideas of other sectors. The growing evidence regarding the impact of engaged strategies on teaching, learning, and research effectiveness is increasing the incentive for every postsecondary institution to develop an intentional agenda of engagement activity to help it succeed. Though the literature still lacks megascale research studies, existing work reveals a strong association with important strategic goals common to most campuses. These goals intertwine with the growing external pressure on academic institutions to increase student retention and completion rates, to increase research activity on questions of local or global urgency, to increase effectiveness and efficiency, and to be an engine for development (economic, cultural, social, human). Thus, the focus of institutional attention to community engagement has now turned to new, more specific questions: To what end does higher education engage with communities as a form of teaching, learning, research, and service? What is the alignment between our areas of intellectual expertise and public goals and questions? How do we identify, develop, support, and sustain productive community–academic partnerships? What structures need to be developed to govern partnerships, both internally and externally? These questions require an intentional and systematic approach to collection of reliable and relevant data to inform progress and replicate results in the context of established goals and aims. As engagement transitions from work that was mostly individualized to intentional campus/community agendas guided by specific goals and structures, institutional leaders now realize that they need ongoing data collection systems to provide regular information to monitor and measure their engagement agenda and its impacts. Leadership of engagement itself has become a more common expectation of presidents and provosts. A review of recent conference or strategic agendas of higher education associations such as Campus Compact, American Association of State Colleges and Universities, Association of Public and

Land-grant Universities, National Association of Independent Colleges and Universities, Council of Independent Colleges, and others shows evidence of attention to community engagement as a strategy for achieving internal and external goals relating to teaching, learning, and research that are enriched by and contribute to public progress. Recent examples demonstrate that some colleges and universities intentionally seek to select campus leaders who will sustain and/or grow the institution’s commitment to community engagement and partnerships. There are also academic institutions that have made little progress on community engagement. There is tremendous diversity in higher education’s involvement in engagement with communities.

Carnegie Community Engagement Classification
Over the past 10 years the Carnegie Community Engagement Classification has become the key framework that guides increasingly sophisticated and intentional approaches to engagement actions and requires that campuses describe, measure, and evaluate engagement at all levels across the campus. The expectation is that if a campus is an “engaged campus,” the work will be supported, informed, governed, and monitored regarding both process and results (NERCHE, 2016). For that reason, the Carnegie application intentionally requires campus-wide and community partner participation in completing the submitted materials. The application has become a guide to structuring and planning engagement that often creates a framework for deciding what a system of data collection, measurement, and evaluation should track and how such information will inform future work and adaptations. Since its initiation in 2006, the Carnegie Community Engagement Classification has been awarded to 431 colleges and universities. The intent of the classification is to encourage documentation of institutional engagement with community in an inquiry process that yields practical and useful data (Driscoll, 2008). A critical component of the classification framework focuses on tracking and assessing community engagement especially in terms of its impact on students, faculty, community, and institution. With each round of institutional applications (in 2006, 2008, 2010, 2015), the data and narratives have revealed an ongoing need for improvement and expansion in institutional capacity and practice

related to data collection and evaluation (Driscoll, 2014). In the most recent round of Carnegie applications in 2015, even those that were successfully classified indicated little progress in systematic approaches to data collection, but it is worth noting some areas of improvement. For example, in 2006 approximately 12% of successful applicant institutions reported specific institutional learning outcomes associated with service-learning. In 2008 and in the application rounds that followed, more than 80% of institutions provided examples of institutional learning outcomes (Driscoll, 2014). However, those institutions that provided examples of learning outcomes fell short of expectations in terms of defining specific civic learning outcomes. As for other assessment practices, the earliest applications for the Carnegie Community Engagement Classification had few examples of tracking mechanisms to record data such as number of students, number of hours of service, number and kinds of community sites, kinds of engagement, and number of faculty. In the most recent applications, tracking practices have expanded and become more sophisticated, especially with the use of technology (Saltmarsh, Sandmann, & Hartley, personal communication, 2015). The assessment questions that continue to be most challenging for applicants are those that probe the actual impact of community engagement. Institutions are asked to provide descriptions of assessment practices and at least one example of a finding related to impact on students, community, faculty, and institution. Often, impact findings are not authentic examples of impact, but rather statistics of growth in the number of students or faculty. Assessment of impact of community engagement on students is the one area that has shown some modest progress and is most prominent. In the 2015 applications, institutions described surveys, focus groups, and interviews as mechanisms to assess impact on students and were able to describe some specific impact (Saltmarsh, Sandmann, & Hartley, personal communication, 2015). These examples often relied on indirect measures of impact on students; few direct measures were found in the applications. A current challenge for many institutions is to make an informed choice regarding what data systems are needed to help them track and measure and assess community engagement in all its dimensions. The classification process has increased the demand for ongoing and systematic data collection, and many tools have been launched for such purposes;

these are of widely varying focus, philosophy, and method. There is no doubt that ongoing systems are ideal, and whatever is used should ensure both continuous maintenance of descriptive data and the ability to analyze data for multiple planning and reporting purposes. As institutions collect data, they are able to publicly share results, such as on a dashboard (Virginia Commonwealth University, 2017). The key, again, is intentionality and alignment. Academic institutions should work with internal and external partners to carefully select a system that reflects their overall goals and agenda, but can also be customized in the context of local purposes. In summary, systematic approaches to the measurement and tracking of community engagement and its impact on students, faculty, community, and institutions are growing, but remain a challenge, especially for institutions that lack a strategic plan that articulates community engagement’s internal and external aims. The next phase of engagement’s development will require colleges and universities to intentionally integrate actions and goals with ongoing data collection and evaluation, adopting sustainable monitoring and measurement practices that yield quality data and inform improvement in institutional engagement with community.

What Do We Want to Know Today About Institutional Impact?
There are many questions that may guide the exploration and analysis of the institution’s role in engagement. Key questions and considerations that one might ask include the following:
• What are the institution’s motives for engagement?
• What current and future strategic goals of the institution are served by engagement strategies and actions?
• What outcomes and benefits do we seek?
• What is the alignment of mission and expertise with challenges and opportunities presented in the chosen region of focus (local or beyond)?
• Who are our key partners and how might those partnerships evolve over time?
• How is the work governed both internally and externally?
• Where does engagement sit in the organization chart?

• What are the measurable outcomes and results we seek for our institutions and what are our partners seeking to achieve?
• What works and what does not work to create progress on a specific issue?
• What resources and infrastructure are required?
As the body of evidence grows for engagement’s positive contribution to contemporary challenges facing society and higher education, we must work on collecting better institutional data to create the basis for larger multi-institutional research studies. Higher education associations have contributed richly to the wider appreciation of community engagement strategies as important tools for implementing change and improvement across higher education. This further validates the importance of developing more intentional institutional plans for engagement that incorporate clear goals and expectations for measurable results for campus and community. In many ways, taking responsibility for consistent and ongoing monitoring, measurement, and assessment of engagement is one of the most important tasks before senior leaders across higher education. The questions posed in this section on institutional engagement measurement issues may offer a useful guide for campus leadership dialogues in planning both for engagement action and for ongoing measurement of engagement’s outcomes and impacts for improvement and replication.

Examples of Resources for Assessing Institution-Level Engagement Activity
Many instruments and systems are being or have been created for monitoring and measuring engagement. Before plunging into selection of data collection tools, it is important to reflect on and plan for your institution’s specific data goals and needs. It is also important to develop an accurate portrait of activity in order to make decisions about methods of measurement and assessment or evaluation. We describe here a few research-based tools that can help explore the “state of” engagement at your institution. A useful guide for institutional self-assessment is the CCPH metric to describe capacity for community engagement (Gelmon, Seifer, Kauper-Brown, & Mikkelsen, 2005). This model builds on the earlier work of Holland (1997). The self-assessment guides institutional representatives to reflect on

the institution’s capacity for engagement along six dimensions, ranking each item of each dimension on a four-point scale:
• Definition and vision of community engagement
• Faculty support for and involvement in community engagement
• Student support for and involvement in community engagement
• Community support for and involvement in community engagement
• Institutional leadership and support for community engagement
• Community-engaged scholarship
This metric can be used for institutional purposes and also speaks specifically to the faculty role in engagement. It has been used at several institutions and in several multi-institutional projects, and it has been shown to be helpful in identifying strengths and opportunities for growth. The “Indicators of Engagement” developed by Campus Compact may also prove to be a useful framework for institutions in developing their plan for measurement of engagement (Hollander, Saltmarsh, & Zlotkowski, 2001). The indicators summarize successful practices in engagement and categorize them as the following:
• Mission and purpose
• Academic and administrative leadership
• Disciplines, departments, and interdisciplinary work
• Teaching and learning
• Faculty development
• Faculty roles and rewards
• Support structures and resources
• Internal budget and resource allocations
• Community voice
• External resource allocations
• Coordination of community-based activities
• Forums for fostering public dialogue
• Student voice
More detail about the operationalization of the indicators may be found at Campus Compact (2001). Another resource that may be of value is the work of the National Coordinating Centre for Public Engagement in the United Kingdom (Hart, Northmore, & Gerhardt, n.d.). This work articulates

dimensions of public engagement in a framework that includes the following dimensions:
• Public access to facilities
• Public access to knowledge
• Student engagement
• Faculty engagement
• Widening participation
• Encouraging economic regeneration and enterprise in social engagement
• Institutional relationship and partnership building
The framework is illustrated through application in a case study at the University of Brighton. Readers should note that “audit” as used in this document equates to “program review” in U.S. higher education.
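To make the mechanics of such self-assessments concrete, the following is a minimal sketch, in Python, of how ratings from a four-point capacity self-assessment organized around the six CCPH dimensions described earlier in this section might be tabulated. The item counts, scores, and reporting format are invented for illustration and are not part of the published CCPH instrument.

```python
# Illustrative tabulation of a capacity self-assessment patterned on the six
# CCPH dimensions discussed above. Item ratings and their number per
# dimension are hypothetical; the actual instrument defines its own items.

from statistics import mean

# Each dimension holds example item ratings on a 1-4 scale
# (1 = little or no evidence of capacity, 4 = strong, sustained capacity).
self_assessment = {
    "Definition and vision of community engagement": [3, 2, 3],
    "Faculty support for and involvement in community engagement": [2, 2, 3, 2],
    "Student support for and involvement in community engagement": [3, 3, 4],
    "Community support for and involvement in community engagement": [2, 3, 2],
    "Institutional leadership and support for community engagement": [3, 4, 3, 3],
    "Community-engaged scholarship": [2, 2, 2],
}

def summarize(ratings_by_dimension):
    """Return each dimension's mean rating, sorted from lowest to highest,
    so opportunities for growth appear first and strengths appear last."""
    means = {dim: mean(items) for dim, items in ratings_by_dimension.items()}
    return sorted(means.items(), key=lambda pair: pair[1])

for dimension, score in summarize(self_assessment):
    print(f"{score:.2f}  {dimension}")
```

A summary of this kind is only a starting point; the value of the self-assessment lies in the dialogue among institutional representatives about why each item was rated as it was.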

Community Impact and Partnerships
Since our original publication in 2001, research on community impact and outcomes, and on community partnerships, has increased. Public and private grants for engagement have increasingly required attention to data on community impact. The Carnegie Community Engagement Classification requires evidence of regular collection of activity data and feedback, including communication of results to community partners and community leaders. The growing interest in creating a more focused agenda of work on community impact also puts a much greater focus on setting goals and measuring progress with community input and involvement. The historic emphasis of research and evaluation efforts has largely been on hearing the community voice and obtaining community input on impact as a way of ensuring that we are true to the spirit of community partner expectations regarding their roles and intended outcomes. The focus of most data collection regarding community partnerships was on the characteristics and forms of partnerships framed by an emphasis on benefits for the community and respect for communities as cocreators of knowledge and coeducators of our students. Ironically, partners themselves have said that this intense focus on partner perspectives may, at times, have minimized attention to the benefits of community engagement for the academic partner. Community-based work may have been well structured to provide benefits to the community, but not always explicitly linked to specific student learning goals or other scholarly

outcomes. One of the largest studies of partner perspectives (Sandy & Holland, 2006) revealed that experienced community partners often reported they had little or no idea what the specific learning goals were for service-learning students spending time with their organization. This suggested that more needs to be done to ensure that faculty are engaging community partners as coeducators and collaborators and that there are clear outcomes and benefits for students, partners, and academics. This is the basis of accurate and useful data collection that can lead to identification of effective actions producing desired impacts. Today the issues around community impact and partnerships need to focus both on how a more intentional agenda of engagement with community is aligned with community goals and on the reasons academic institutions create partnerships with external entities. The campus must be clear and intentional about identifying what it wants to know about those partnerships and their results for both campus and partners and what it intends to do with the evaluation findings. Campus data goals may include demonstration of community impact for public relations or funding purposes or documentation for grants, institutional plans, or accreditation, among other goals. Together both campus and community want to discover effective practices and innovative strategies that lead to real progress in communities, in keeping with community goals and vision. A focus on measuring impacts and outcomes of campus-community partnerships may lead to the ultimate benefit: the discovery of effective strategies that lead to community progress and that may be replicated elsewhere. The findings of monitoring and measuring of engaged partnerships should be of use to both campus and community.

Definitions of Community
Developing an understanding of what community means to students, faculty, community partners, and the institution is essential. Despite many years of higher education working with communities, many faculty and students continue to have a wide range of perceptions of “who” the community is (Gelmon, Holland, Seifer, Shinnamon, & Connors, 1998). As greater attention is given to measurement of impact on community, it has become clear that partnerships and their goals must be defined and evaluated in order to develop a logical and effective process of working together across organizations and sectors.

Contemporary understandings of partnerships are grounded in foundational work that is 20 years old and continues to be relevant. Community–university partnerships are “organic, complex, and interdependent systems” (Sigmon, 1996). Partnerships are rarely static, and they evolve and change as a result of changes in personnel, focus, resource availability, organizational infrastructure, political forces, and/or environmental circumstances. Partnerships are best viewed from a systems perspective, recognizing that a change affecting any partner is likely to have an impact on multiple aspects of the partnership (Gelmon, 1997a). CCPH is one of the international organizations that emphasizes the role and practice of partnerships. According to CCPH, partnerships work best with quality processes: relationship focused; open, honest, respectful, and ethical; trust building; acknowledging of history; and committed to mutual learning and sharing credit. In order to assess such partnerships, one needs to develop mechanisms that will honestly and accurately measure or observe these key elements. Further, partnerships should be designed and managed to ensure meaningful outcomes that are tangible and relevant to communities and defined by the communities, such as creating affordable housing, ensuring a clinic offers convenient hours for access to health services, developing relevant afterschool programs, and revitalizing rural economies (CCPH, 2017). Finally, CCPH recommends viewing partnerships through a lens of transformative experiences, including
• personal transformation of students, including self-reflection and heightened political consciousness;
• institutional transformation, including changing policies and systems;
• community transformation, including community capacity building;
• transformation of science and knowledge, including how knowledge is generated, used, and valued and what constitutes “evidence” and “ethical practice”; and
• political transformation, including social justice.

Principles of Partnerships
Over the last 15 years we have learned more about what needs to be examined, and our view of evaluating community partnerships has changed.

Institutions still struggle with how to get the relevant information about partnerships, but the field now understands more about the factors that contribute to effective partnerships and the dynamics of operationalizing a partnership that is mutual, respectful, and beneficial to all participants. Core principles of partnerships have been articulated by multiple authors; one of the most robust and widely accepted sets of principles is the Principles of Partnerships (CCPH Board of Directors, 2013). The original 1998 principles were very new at the time we published the original handbook and, as a result, were only minimally addressed in our early assessment methods. As the Principles of Partnerships have been applied and interpreted by multiple organizations, CCPH has periodically reviewed and updated the principles to ensure these guiding statements help individuals and organizations to understand what makes partnerships work, sustain authenticity, and achieve the change they want to see in their community (CCPH Board of Directors, 2013). The following 2013 principles are not intended to be prescriptive but rather to serve as the basis for discussion about creating and implementing partnerships—and in fact offer a template for creating institution-specific monitoring and evaluation methods:
1. The partnership forms to serve a specific purpose and may take on new goals over time.
2. The partnership agrees upon mission, values, goals, measurable outcomes, and processes for accountability.
3. The relationship between partners in the partnership is characterized by mutual trust, respect, genuineness, and commitment.
4. The partnership builds upon identified strengths and assets, but also works to address needs and increase capacity of all partners.
5. The partnership balances power among partners and enables resources among partners to be shared.
6. Partners make clear and open communication an ongoing priority in the partnership by striving to understand each other’s needs and self-interests and developing a common language.
7. Principles and processes for the partnership are established with the input and agreement of all partners, especially for decision-making and conflict resolution.

8. There is feedback among all stakeholders in the partnership, with the goal of continuously improving the partnership and its outcomes.
9. Partners share the benefits of the partnership’s accomplishments.
10. Partnerships can dissolve, and when they do, partners need to plan a process for closure.
11. Partnerships consider the nature of the environment within which they exist as a principle of their design, evaluation, and sustainability.
12. The partnership values multiple kinds of knowledge and life experiences. (CCPH Board of Directors, 2013)
Each principle offers a basis for studying partnerships, necessitating consideration of multiple perspectives and potentially leading to multiple methods of evaluation to develop a 360-degree perspective. This framework can be used as the basis for an institution to design its own evaluation strategy; the principles have been used within methods such as the CCPH institutional self-assessment of capacity for community engagement (Gelmon et al., 2005). The partnership principles have also been adapted by organizations as a guiding framework for selection of their own strategic partnerships; an example can be found at Campus Compact (2016b).
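As one illustration of how the principles might be operationalized for monitoring, the sketch below (in Python) imagines campus and community respondents rating each principle on a simple agreement scale and flags the principles where perceptions diverge most, as prompts for discussion. The abbreviated principle labels, the 1-5 scale, the ratings, and the gap threshold are all invented for illustration; they are not part of the CCPH principles themselves.

```python
# Illustrative sketch: using the CCPH Principles of Partnerships as a
# discussion rubric rated independently by campus and community respondents.
# Labels, scale, ratings, and threshold are hypothetical.

PRINCIPLES = [
    "Shared purpose and evolving goals",
    "Agreed mission, outcomes, and accountability",
    "Mutual trust, respect, genuineness, commitment",
    "Builds on strengths and addresses needs",
    "Balanced power and shared resources",
    "Clear and open communication",
    "Jointly established processes for decisions and conflict",
    "Feedback for continuous improvement",
    "Shared benefits",
    "Planned closure when partnerships dissolve",
    "Attention to the surrounding environment",
    "Valuing multiple kinds of knowledge",
]

# Mean ratings on a 1-5 agreement scale from each stakeholder group
# (invented numbers for illustration only).
campus_ratings = [4.2, 3.8, 4.5, 4.0, 3.1, 4.4, 3.6, 3.2, 4.1, 2.9, 3.7, 4.0]
community_ratings = [4.0, 3.2, 4.4, 3.9, 2.3, 3.8, 3.0, 2.8, 3.5, 2.7, 3.6, 4.2]

GAP_THRESHOLD = 0.5  # flag principles where group perceptions differ notably

for principle, campus, community in zip(PRINCIPLES, campus_ratings, community_ratings):
    gap = abs(campus - community)
    flag = "  <-- discuss" if gap >= GAP_THRESHOLD else ""
    print(f"{principle}: campus {campus:.1f}, community {community:.1f}{flag}")
```

Surfacing such gaps is one way to honor the principle of feedback among all stakeholders without reducing the partnership to a single score.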

Perspectives on Partnerships
Understanding partnerships and their outcomes requires consideration of multiple perspectives. Faculty, students, community partners, and other institutional participants all come to partnerships with different concerns and expectations, thus demanding an intentional evaluation strategy (Holland, 2001a). From the higher education perspective, considerations include the viewpoints of students, faculty, and institutional leaders, which must be placed in the context of curricular offerings, research and scholarly projects, institutional development, student affairs, alumni relations, faculty development, international affairs, and donor relations, among others. Community partners that collaborate with higher education institutions will have different assets and needs depending on their organizational focus—and the community context and areas of focus may determine which higher education institution and what areas of campus expertise are central to the partnership. As described, for example, in Barbara Jacoby’s

book on service-learning partnerships, advancing service-learning (and other forms of community engagement activities) requires “creation and sustainability of a wide range of authentic, democratic, reciprocal partnerships” (Jacoby et al., 2003, p. xviii). The institution has a responsibility to work with multiple organizations, and not limit its sharing of expertise and cocreation of knowledge. The lack of a cohesive agenda around partnerships and the randomness of an individual faculty approach to a community mean that the body of community-engaged work for a single institution is difficult to document, much less to measure for impact. Partnerships are often chaotic for an institution—there are many and they may be productive, but they are not always easily identified, in particular because there may be different views about what is or is not community engagement or engaged partnerships. Many institutions strive to develop an accurate portrait of all the partnerships between the campus and external entities. While institutions do not want to stifle faculty engagement with communities by instituting too many rules and policies, this lack of coordination makes it nearly impossible to develop an accurate portrait of overall activity that could be the basis for creating a systematic and sustainable approach to measuring the institution’s collective impact on (and with) its communities. The evaluation of partnerships requires careful attention to processes of working together: coordination, collaboration, and cooperation. Multiple perspectives need to be invited into the evaluation in order to truly understand the collaborative work (Mattessich, Murray-Close, & Monsey, 2001; Gajda, 2004). Since partnerships may take so many different forms and may arise out of a diversity of intentions, it has become increasingly important for each institution to create a categorization of the kinds of partnerships that are common between community organizations and their college and university partners. The characteristics of these types of partnerships begin to frame some of the anticipated outcomes. Consideration of community impact and outcomes of partnerships needs to be linked to institutional mission and motivations, strategic goals for teaching and research, and the role of engagement with external sectors through intentional partnerships. While principles of partnerships provide a common framework to guide campus-community relationships and interactions, the agenda of engaged partnerships will vary according to the alignment of academic strengths and strategic goals of the

institution with the opportunities and strategic goals of its surrounding communities. Thus, the agenda of community engagement will vary across institutions, creating significantly different kinds of partnerships and focusing on different goals. However, adherence to the core principles of community-campus partnerships means it is possible to have common, nationwide or even international-scale systems for data collection regarding community-engaged partnerships and their impacts and outcomes. Most work on partnerships has articulated a two-way partnership: The partnership enables the university to realize its goals with respect to community-based teaching and learning and enables the community organization to access university resources and expertise in support of its activities and mission attainment (Gelmon, 2003). A three-way perspective on partnerships was operationalized in the Community–Higher Education–Service Partnerships (CHESP) in South Africa, an initiative that was established to actively engage higher education institutions in the South African democratic transition beginning in the late 1990s (Lazarus, 2007). The partnership was viewed as the unit of transformation and consisted of a three-way interaction among historically disadvantaged communities, higher education institutions, and service providers including nonprofit organizations and government agencies. The ways in which the three partners collectively accomplished their goals were the focus for the evaluation of partnerships in CHESP (Gelmon, 2003). In contrast, the SOFAR model identifies five key constituencies of stakeholders: students, organizations, faculty, administrators, and residents, resulting in 10 dyadic relationships, each with two vectors of communication and influence (Bringle, Clayton, & Price, 2009). A scan of higher education institutional websites shows increased use of the language of engagement, often as part of an institutional plan (see the discussion in the Institutional sections in this chapter and in chapter 6). Thus an understanding of community and partner impact must take into account the goals, governance, operating principles, and actual activities of these partnerships. Partnerships may be launched by the university, by a community organization addressing a specific issue, or by other institutions such as government or philanthropy. These latter groups may have more capacity and influence to convene groups to address issues that no one else is working on and create synergy across multiple sectors of a community to foster

communication and problem-solving. The evaluation of a partnership program such as Partners Investing in Nursing’s Future (PIN) built on the PSU model of partnership assessment, creating a robust evaluation and monitoring system that helped to document community development partnerships across multiple sectors in multiple communities across the United States (Norman, Gelmon, & Ryan, 2014; Ladden, Hassmiller, Maher, & Reinhardt, 2014). Evaluating multiple partnerships across different communities and contexts can pose measurement challenges. The creation of a conceptual framework as described in the original handbook can help to bridge these differences. Core concepts can relate to areas such as partnership development and management, creation of sustainable solutions to the initial issue identified, value of program participation, and perceptions of the evolving roles and contributions of the various partners. The CCPH partnership principles, previously cited, can inform design of such an evaluation, helping to focus on issues important to the partners. The findings from the PIN evaluation relevant to partnership development, management, and sustainability identified key insights related to use of an effective collaboration model; involvement of a neutral convener; attention to the phases of development of the partnerships; creation of a defined structure and operating principles; focus on priority issues; attention to sustainability; and credibility and visibility (Norman et al., 2014). Strategies for partnership evaluation need to focus on the specific context of the partnership and its goals, from which one can explore how the partnership contributes to the outcomes and impact.

An Institutional Example of Partnership Structures
Some institutions are trying to systematically track their partnerships to document an understanding of the range of activities and relationships, to identify

patterns of interaction, and to provide support of various kinds. An example is the Engagement and Partnership Spectrum created at PSU in 2012 (Flynn, 2015). The spectrum (Figure 1.1) has allowed the university to categorize and define its partnership activities and to track activities according to partnership types and diverse sectors and audiences in the community. This spectrum is a relevant tool for what PSU was trying to accomplish (Portland State University, 2017b); it may or may not be adaptable in other institutional contexts. PSU is an urban-serving university that celebrates its motto “Let Knowledge Serve the City” by animating student and faculty experiences with engagement opportunities that bring the campus community into applied teaching, learning, and research settings in partnership with community organizations. Engagement work ranges from small-scale projects with short-term objectives to larger projects carried out over several years and involving students and faculty from multiple courses, schools, and colleges. The diversity of community-university partnerships reinforces the need to develop an understanding of the nuances associated with developing authentic relationships between communities and institutions of higher learning. At PSU, partnerships often focus on student learning and are supported by an individual faculty member. Evaluation of community impact is determined by the community partner and faculty member involved in the partnership. Assessment measures often focus on number of attendees, number of hours served, and accomplishments at the partner site on a specific activity. Student employment and professional application illustrate the professional competency of students developed through the partnership, possibly through practicum requirements for professional programs or internship placements. Assessment of these individual student experiences often requires students to demonstrate learning through a portfolio, presentation, or reflective paper.

Figure 1.1.  PSU Engagement and Partnership Spectrum. The spectrum spans five categories, each with a characteristic activity type and basis:
• Community Engagement: Community-Based Teaching & Learning (course-based)
• Volunteer Experience: Cocurricular Student Engagement (service-based)
• Professional Development: Student Employment & Professional Application (work-based)
• Research & Sponsored Projects: State & Local Research and Service (contractual)
• Strategic Partnerships: Regional Business & Civic Partners (institution-wide)

Community-level measurement typically is done by tracking the number of professional placements a partner hosts from year to year. State and local research and service partnerships include funded and unfunded research and professional training agreements that respond to a local or regional concern. Memoranda of understanding and/or funded contracts with defined deliverables often guide these partnerships and facilitate tracking of them. Finally, regional business and civic partners reflect strategic institutional partnerships with a defined governing structure that includes representatives from the university and the community partner. PSU created a Partnership Council to fulfill PSU’s goal of “Civic Leadership Through Partnerships.” The council is made up of faculty and administrative staff from schools/colleges, staff from student services, and representatives of the alumni association. The council is a means to organize and mobilize internal campus practices and systems to strengthen and sustain the community engagement activities. Often, community engagement advisory groups are created in the perhaps misguided belief that they will be the bridge between the institutions and the multiple community partners, but it is rare that they can articulate the wider community perspective (Feld, 2002). PSU has opted to engage community partner voice and power-sharing systems at the project level, rather than creating an institution-level governing body with a collective voice for the advancement of partnerships and engagement at PSU.

Partnerships Using Innovative Approaches
An important opportunity to improve capacity to gather quality data about partnerships and community impacts and outcomes is now emerging across higher education in three forms: collective impact, geographically defined partnerships, and shared spaces. As noted in the Institutional section of this chapter and in chapter 6, many institutions are developing a focused agenda of engagement. While individual partnerships and projects certainly continue, some institutions are identifying one or more broad themes where the intent is to work with community partners to develop specific, complementary projects. Such an approach considers the multiple aspects of a community issue, the current conditions and opportunities, and draws on multidisciplinary expertise across the institution and community. By developing a focused plan, benchmarks or goals can be designed that will frame evaluation of strategies, impacts and

outcomes. For example, the Strive Together initiative, now operating in many U.S. cities, brings higher education, schools, community organizations, and others together to support school readiness for all children (Strive Together, 2017). This example uses the principles of collective impact theory (Kania & Kramer, 2011), which are of growing interest as a way to organize partnerships intent on producing successful results. This model is predicated on the theory that large-scale social change efforts demand a cross-sectoral response. A more intentional focus on an agenda of engagement creates the opportunity to concentrate efforts in ways that may lead to or accelerate desired results. PSU's Sustainable Neighborhood Initiative (SNI) uses a structured approach to neighborhood partnerships in geographically defined neighborhoods throughout the city. PSU staff work with a coalition of community partners to identify a set of independent and interconnected community projects that are linked to students and faculty. The evaluation of SNI partnerships is focused on community partner project outputs and outcomes (Beaudoin & Sherman, 2016). An evaluation approach that is focused on outputs that are of primary interest to community partners may focus on outputs that have limited value within the academy and its traditional reward structures. This works against the mutually beneficial goals inherent in community-engaged scholarship. Shaping measurement concepts to guide what both the community partner and the educational institution want to know about the impact of the partnership becomes essential. Another strategy for advancing partnership relationships and their intended results is the idea of shared spaces. In this model, the institution and community work together to create a vision for a physical facility that will be a hub of collaboration and interaction. The HUD Community Outreach Partnership Centers (COPC) grant program in the 1990s encouraged higher education institutions to be present in their partner neighborhoods. By being present and sharing the same space every day, partnerships could deepen and become stronger and more effective. The COPC grant program required campus and community collaboration in planning activities of the grant, and associated measures that would be tracked. Case studies can be found at the HUD website (US HUD Community Outreach Partnership Centers, 2017). The University of Nebraska Omaha (UNO) opened the Barbara Weitz Community Engagement Center in 2014. It is a purpose-designed and

-built facility for community engagement that offers a public meeting space, dedicated parking for the community, a wide variety of community meeting rooms, and an abundance of community-generated art. It houses several UNO units relating to community engagement and nearly 30 community partner organizations. Guided by a cocreated set of values and operating principles, this facility is succeeding in its goal to make the entire campus more accessible and welcoming to the community and to deepen partnerships through daily interactions (University of Nebraska Omaha, 2017). Another model of shared space is the concept of a “community storefront.” For example, the University of Technology Sydney Shopfront (UTS Shopfront, in Australia) was founded in 1996 as a small, cross-disciplinary initiative to strengthen university and community collaboration. Since then it has delivered more than 900 community projects and supports many community-engaged research partnerships (University of Technology Sydney, 2017). One of the measures of success of UTS Shopfront is the number of partnerships; the website states that this has been achieved “by cooperation with all stakeholders, partnerships with community organisations, efficient resource management, and working for joint outcomes.” Other institutions have created similar facilities that foster community and campus dialogue and partnerships. The University Research and Outreach Center (UROC) at the University of Minnesota was founded in 2006 as a convening place for the University Northside Partnership (University of Minnesota, 2016). The UROC building provides university faculty and staff with office space in north Minneapolis for collaborative outreach and research programs reflecting community-identified priorities in the areas of education and training, family and community health, and economic development.

Using Data Systems to Monitor and Track Partnership Activity
Many other campuses around the country have developed electronic tracking systems that capture the nuances and impacts of their community partnerships. Unique data tracking systems that are developed by existing or uniquely local software design firms often do not effectively interface with established campus data systems. These boutique systems have typically failed because of the lack of technical support and the inability of campuses to update and increase usability as newer technological

systems become available. In recent years, multiple commercial and nonprofit software firms have begun to develop partnership and engagement tracking systems for use by campuses. These electronic systems can be an important method of tracking, documenting, and evaluating community partner activities. Locally designed systems require institutions to bear the full costs of software development, refinement, and maintenance and may not support larger efforts to benchmark or conduct research across institutions. The University of North Carolina at Greensboro (UNCG) created its own tracking system through its Institute for Community and Economic Engagement (ICEE) (University of North Carolina Greensboro, 2017). ICEE launched a systematic analysis of community engagement activities within each school, college, and unit, and decided to create its own database for partnership tracking (E. Janke, personal communication, 2016). Staff developed a custom, online, publicly viewable, relational database that allowed each faculty and staff member to enter descriptive records of all their engagement activities including information about partnerships, activities, goals, and outcomes. The database connected to the university’s student and employee data tracking and user authentication systems. After less than a year using the online system, UNCG licensed the system to a private software vendor to address ongoing internal costs related to updates, modifications, and security, as well as the desire to expand collection and reporting functionality and facilitate usage by other institutions for benchmarking and research. Indiana University–Purdue University Indianapolis (IUPUI) has a history of assessment that has evolved over time and resulted in greater attention to tracking of partnerships as part of understanding a larger narrative about community engagement. IUPUI started by counting service-learning courses, including the number of students, hours, faculty, and partners. Counting courses ultimately led to questions about faculty work and the process of developing and sustaining partnerships; ultimately, a “home-grown” system was created to capture information about courses to further explore process. Recently, IUPUI reexamined how faculty, students, and staff engaged with the community (e.g., courses, projects, research, economic development, technical training/assistance), and was faced with the challenge of information in multiple places and the inability to tell a comprehensive story of engagement. It has now focused on gathering information that helps to demonstrate how community engagement is a strategy

through which it achieves institutional mission and goals (K. Norris, personal communication, 2016). Inspired by the work done by IUPUI and UNCG, PSU identified the need for improved tracking and documentation of partnership activities. It has opted to document and evaluate a cross-section of partnerships across the campus in each of the partnership categories defined in the Partnership Spectrum (Portland State University, 2017b; see also Figure 1.1). It decided to adopt an electronic system for partnership documentation and evaluation. Public-facing data will become available in the future. Whatever the perspective or motivation for gathering data, each institution’s primary challenge is to gather comprehensive information of a consistent nature and to ensure it is always current. These data management systems help institutions to track important engagement activities that include service-learning courses, cocurricular service projects, community research activities, and other community-engaged events that are done in partnership with a community organization. Including all engagement activities within a single system allows the community and the institution to better understand the variety of engagement occurring and creates a means to expand partnerships between faculty and community partners involved in similar activities. Maintenance and sustainability of such a system likely requires providing dedicated infrastructure (staff time) to distribute instruments, gather/analyze results, and create meaningful use of the data findings. An Internet search will reveal many different data collection systems and products that propose to help collect data on one or more aspects of volunteering, service-learning, events, partnerships, and other related activities that may be relevant to community engagement. Some are free and relatively simple, and others are complex enough to create detailed and sophisticated sets of data. Some are specialized, and others are quite broad. There is a wide variety of tools with very different features. Selection of one of these online systems or tools for monitoring and/or measuring objectives requires time to evaluate any system of interest. Suggested steps for evaluation of these systems include the following:
• Talk with leaders in your institution and partner networks to explore their goals and interests in data collection.
• Consider what questions they have, in what ways the data would be used, what goals would be informed, what budget is available, who would manage the system and the communications with audiences, who will enter data, who will analyze data, and so on.
• Use this analysis of needs, goals, and assets to assess different tools or systems to determine the best match for local context and budget.
• If a particular system seems of interest, ask for references at institutions currently using the product and talk with them about how it works and how they use it.
It is beyond the scope of this handbook to make any recommendation regarding any one software approach or method. No one system is going to work for everyone; each institution needs to do its own research and find the software or system that will work best for its needs.
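Whatever system is chosen, it helps to be explicit about the descriptive record each engagement activity will generate. The following is a minimal sketch, in Python, of the kind of activity record and simple roll-ups a campus tracking system might support; the field names and example values are hypothetical and do not reproduce any commercial product or the UNCG, IUPUI, or PSU systems described above.

```python
# Hypothetical sketch of an engagement-activity record and simple roll-ups
# of the kind a public dashboard or annual report might draw on.

from dataclasses import dataclass, field
from typing import List

@dataclass
class EngagementActivity:
    title: str
    activity_type: str            # e.g., service-learning course, cocurricular project, research
    partner_organizations: List[str]
    lead_faculty_or_staff: str
    academic_term: str
    student_count: int = 0
    service_hours: int = 0
    goals: List[str] = field(default_factory=list)
    outcomes: List[str] = field(default_factory=list)

activities = [
    EngagementActivity(
        title="Neighborhood housing conditions survey (capstone)",
        activity_type="service-learning course",
        partner_organizations=["Example Housing Coalition"],
        lead_faculty_or_staff="J. Doe",
        academic_term="Fall 2018",
        student_count=24,
        service_hours=720,
        goals=["Document housing conditions identified by the coalition"],
        outcomes=["Report delivered to the coalition and the city council"],
    ),
]

# Simple descriptive roll-ups; impact questions still require the kinds of
# qualitative evidence discussed throughout this chapter.
total_students = sum(a.student_count for a in activities)
total_hours = sum(a.service_hours for a in activities)
partners = {p for a in activities for p in a.partner_organizations}
print(f"{len(activities)} activities, {total_students} students, "
      f"{total_hours} service hours, {len(partners)} partner organizations")
```

Keeping such records consistent and current is the hard part; the roll-ups themselves are trivial once the underlying data exist.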

What Do We Want to Know About Community Impact and Partnerships?
In order to understand the impact of community partnerships, consider those factors or conditions that are to be transformed, strengthened, and accomplished as a result of the partnership. Key questions and considerations that one might ask include the following:
• What is the focus of the partnership? How does that frame the type of measurement that is important to understanding impact or outcomes?
• What difference does the partnership make? To whom?
• What are the key strategies or interventions?
• Who/what is the community and what baseline knowledge exists?
• What is relevant to study in terms of leadership, communication, and resources?
• What expertise may be needed to move forward?
• What data are being collected already? What additional data are needed? How will the data inform continuous improvement of the partnership?
• Are the assessment interests different for community partners than they are for the educational institution? Are strategies being employed to make sure both perspectives are addressed?
• Does a university presence in engaged work in a specific neighborhood result in greater employment and involvement in that community?

Resources
Various models of understanding community impact and partnerships have been articulated, and the reader is encouraged to select one—or develop new methods—by carefully considering local context, goals, and vision, and then articulating the evaluation plan in this context. Existing tools may be used for understanding what works about the collaboration (i.e., the process) or for understanding the impact of the collaboration (outputs and outcomes). Readers may wish to consider the work of Taylor-Powell, Rossing, and Geran (1998), which emphasizes the phases of a collaboration: the first phase, “form and focus,” emphasizes getting started; the second phase, “organize and act,” is about the process; and the third phase, “achieve and transform,” is about results. Woodland and Hutton (2012) describe a strategic alliance formative assessment rubric, which articulates four levels of integration of the alliance or partnership, and for each outlines purpose, strategies and tasks, leadership and decision-making, and interpersonal communication. Himmelman (2004) presents an alternative approach for understanding and evaluating coalition strategies for collaboration, articulating characteristics of level of integration, networking, coordinating, cooperating, and collaborating. Each of these may offer a useful starting point for considering how to approach partnership evaluation. Some specific examples of surveys of partnerships are presented in a guide on partnerships authored by the Commonwealth Corporation (2013). A number of resources are available that include organizational frameworks and evaluation tools that might be useful in developing a partnership assessment. Among these are the following:
• The Partnership Toolbox. World Wildlife Fund WWF-UK (http://www.wwf.org.uk/wwf_articles.cfm?unewsid=3211)
• The Partnerships Analysis Tool for Partners in Health Promotion. Victorian Health Promotion Foundation, Victoria, Australia (https://www.vichealth.vic.gov.au/media-and-resources/publications/the-partnerships-analysis-tool)
• Collaborative Partnerships Evaluation Tool. South Australian Community Health Research Unit (http://som.flinders.edu.au/FUSA/SACHRU/Toolkit/PDF/3.pdf)
• Measuring Partnerships for Impact (http://measurepartnershipsforimpact.strikingly.com/)

• Workforce Partnership Guidance Tool. National Fund for Workforce Solutions (http://nfwsolutions.org/sites/nfwsolutions.org/files/publications/NFWS_workforce_guidance_tool_111110.pdf)
• Collaboration Factors Inventory. Wilder Research, Amherst H. Wilder Foundation (http://www.wilder.org/Wilder-Research/Publications/Studies/Collaboration%20Factors%20Inventory/Collaboration%20Factors%20Inventory.pdf).

Faculty
The most substantial change in faculty-focused resources regarding community engagement over the past 15 years has been in developing, demonstrating, and assessing the depth and rigor of engaged scholarship. In the past, community engagement as a form of scholarship was not always considered in promotion and tenure processes and was often seen as service. Individual faculty adopted engagement methods if they were relevant to their academic work and interests; few institutions had coherent agendas of partnerships or a detailed strategic plan or vision for engagement’s purposes (though many aspired to do so). Today, engagement is more widely accepted as a scholarly methodology and a strategy for teaching and research (Holland, 2016; Furco & Holland, 2013). As institutions have embraced engagement through explicit criteria in personnel review processes, faculty now are reviewed for the quality of their community engagement activities—in teaching and learning, and in research and scholarship.

Changing Faculty Demographics and Culture
Some of the increased attention to community-engaged scholarship is being fueled by the large-scale transformation of the faculty workforce. For the first time in several decades, many colleges and universities are experiencing growing numbers of faculty retirements and/or enrollment growth that are creating the opportunity to hire significant numbers of new faculty who largely represent Generations X and Y. Many of these faculty may have experienced service-learning in school or college or, reflecting the values of their generation, they often demonstrate a strong motivation to apply their education toward amelioration of challenges and threats to local and

global communities (O’Meara, Eatman, & Peterson, 2015; Trower, 2010). These new-generation faculty are highly collaborative, see teaching and research as connected activities, and value a transparent review process and interdisciplinary work, among other new and more diverse academic values (Trower, 2006, 2012). Faculty culture is changing to be more accepting of engaged scholarship across a wide array of disciplines. This acceptance is a positive force in terms of renewal or expansion of research interests, increased attention to interdisciplinary collaboration, and a broadened view of the scope and methods of faculty work that are embodied and rewarded in tenure and promotion policies. Today, there is great diversity across higher education regarding recognition and valuation of community engagement integrated into scholarly work. However, in some institutions faculty are still often rewarded more for publishing a paper in a traditional academic journal or receiving grant funding than for contributing to innovations that lead to meaningful societal change (Gelmon, Jordan, & Seifer, 2013a). Academic culture does not change quickly. In 2001, the dominant feature of academic culture across all types of colleges and universities was a focus on individual performance based on a review of individual faculty work. This review would be conducted in a context of specific, separate activities relating to the individual’s contributions and achievements relating to teaching, research, and service. This culture of individualistic work, segregated into three “buckets” of activity, was formed in the mid-twentieth century when higher education as a sector was growing rapidly, research was expanding dramatically as a primary activity, and thousands of new faculty were entering the workforce. In an effort to create a system that would hopefully be both equitable and efficient, institutions organized academic policies related to career progress and review of performance around a process controlled mostly at the department and college level with a focus on individual performance within each of the three separate domains of academic work: teaching, research, and service (Orton & Weick, 1990). This culture of individualism affected all types of institutions and persists today. The dominance of this approach was extensive and was sustained by a competitive, peer-reviewed approach to academic journals and federal grant funding. Solo work and solo papers were revered. Even some institutions with teaching-intensive missions tended to reward faculty
primarily on the basis of peer-reviewed research publications and/or research grants. This singular model persisted, in part, because membership in the faculty workforce has been remarkably stable for nearly 40 years. As economic strain has deepened in higher education, the percentage of tenure-track faculty has declined, yet obtaining a tenure-track position and achieving tenure remains a major career objective for many faculty (Furco & Holland, 2013). However, the culture of individualism is now fading in the face of new priorities and strategies. In large part, these changes serve several broad purposes, including development of a new financial model for higher education, a balanced relationship between teaching and research, and a renewed view of the relevance of higher education to public problem-solving. Change is being fueled by a number of forces. For example, major federal funders (the National Science Foundation, the National Institutes of Health, etc.) now favor research grants that involve teams with multidisciplinary and multisector perspectives to examine multidimensional and complex questions and issues. The need to improve student learning, progress, and graduation rates requires transformation of curriculum and greater diversity of learning methods and pathways, including large increases in opportunities for experiential learning such as multiple forms of community-based learning.

Community-Engaged Scholarship

Community-engaged scholarship (CES) is thriving in this environment because it is a method that naturally involves multisectoral and multidisciplinary lenses through internal and external partnerships and contributes to effective teaching, learning, and research. Some institutions now identify the value they place on CES in position announcements. Quality engagement practices are increasingly less random and more strategic in their contribution to campus strategic goals as well as community progress. As previously mentioned, as a result of this more strategic view of engagement, an approach of growing interest today is a focus on a small set of "grand challenges" (Bill & Melinda Gates Foundation, 2016). A broad focus on a few local and global challenges helps faculty learn to work together, break down academic silos, and recognize the many different disciplinary lenses that must be involved to discover solutions and strategies for complex issues. This also helps to avoid sequencing activities around
the dominant influence of the academic calendar or schedules for grants and publications and gives greater attention to the actual flow of work and discovery in partnership with community. This type of focused agenda inherently requires attention to the inclusion of engaged methodologies and often leads to increased funding opportunities and research productivity, while also generating measurable community benefits. This emerging practice holds promise for articulating a clear framework for each institution to identify specific internal and external outcomes and benchmarks (with community input). Such an approach would greatly expand the ability of any campus to accurately measure and analyze its impacts and outcomes on participants from each constituency involved.

Creating greater clarity in the definition of community engagement has helped facilitate an understanding of it as a method of teaching, learning, and research—a form of scholarly work (Holland, 2008). The concept of CES combines the principles of community engagement with accepted standards of scholarship. CES is a method that involves community members and academic scholars working together in a collaborative and mutually beneficial partnership to explore questions of mutual interest (Commission on Community-Engaged Scholarship in the Health Professions, 2005). Mutually beneficial means that the outputs of CES will be varied and useful both to the community and to the advancement of knowledge and education (Gelmon, Holland, Seifer, Shinnamon, & Connors, 1998). Thus, high-quality CES work will result in both traditional and nontraditional representations of the work when a faculty member goes through performance review. The challenge is that many faculty serving on promotion and tenure review panels are not practitioners of community-engaged teaching and research methods. Therefore, institutions seeking to recognize CES as a scholarly method have begun to change both their policy frameworks and their approach to professional development for promotion and tenure committee members so as to ensure their ability to review CES. In CES the typical concerns of peer review, focused on rigorous methods, participant risks and benefits, and the significance of findings for the field, are complemented by equivalent concerns for the quality of the engagement process, community-level ethical considerations, and benefit to the community (Jordan, Wong, & Jungnickel, 2009). Community engagement in and of itself is not necessarily
scholarship; the work must use a scholarly approach, be grounded in work that has come before, and be documented through products that can be disseminated and subjected to critique (i.e., peer review, but potentially by a range of peers from a variety of contexts) (Jordan, Seifer, Sandmann, & Gelmon, 2009).

National Initiatives Shaping Faculty Engagement and Culture

The legitimacy of engaged teaching and scholarship has gained additional recognition and credibility through national initiatives that began to define excellence in faculty engagement. These research projects and national professional networks have provided faculty with external benchmarks for their engagement activities. Three examples of large initiatives that have focused attention on engagement specifically are CCPH activities in support of community-engaged scholarship, the work of IA, and the Carnegie Community Engagement Classification.

CCPH convened the W. K. Kellogg Foundation-funded Commission on Community-Engaged Scholarship in the Health Professions in 2003 to provide national leadership for change. The Commission issued a landmark report, Linking Scholarship and Communities, that called upon health professional schools and their national associations to align their faculty review, promotion, and tenure systems with CES and offered practical strategies for change (Commission on Community-Engaged Scholarship in the Health Professions, 2005). CCPH subsequently launched the Community-Engaged Scholarship for Health Collaborative in 2006, supported by the Fund for the Improvement of Postsecondary Education (FIPSE). The collaborative had an explicit focus on aligning the schools' review, promotion, and tenure policies and practices with the recognition and reward of CES (Seifer, Wong, Gelmon, & Lederer, 2009). Another FIPSE-funded project, Faculty for the Engaged Campus, enabled CCPH to develop, test, and launch CES4Health in 2009, an online portal for the peer review and dissemination of nontraditional products of community-engaged scholarship (Jordan et al., 2009). Today, CES4Health has peer-reviewed and published over 80 nontraditional products of scholarship, providing a venue for faculty to "get credit" for their CES and to disseminate valuable products of scholarship that may have more relevance for the communities of interest than traditional academic journals (CES4Health, 2016a).


IA has developed scholarly communities called Collaboratories to leverage IA's intellectual and creative capital by drawing on the expertise of a diverse group of investigators from the consortium. The primary goal of the Collaboratories has been to incubate and nurture scholarly work through exploration of shared interests. One of IA's areas of study has been the Publicly Engaged Scholars Study, which explores the graduate school experiences and career aspirations and decisions of students and early career faculty and staff (Imagining America, n.d.).

The recognition of engagement as a form of scholarship has also been enhanced by the Carnegie Community Engagement Classification, discussed in detail in the Institution section of this handbook. The desire to obtain the classification has created an incentive for institutions to recognize the strategic value of engaged teaching, learning, and research in improving institutional performance, as well as the need to revise policies and practices around faculty involvement in engagement to ensure effective review and rewards for quality work. The Classification application requires institutions to describe their policies and practices regarding rewards for engaged faculty. Such changes also acknowledge the growing diversity of the new generation of faculty, who are introducing new methodologies of teaching and research, including, but not limited to, engaged scholarship (O'Meara, Eatman, & Peterson, 2015).

Venues for Peer Review and Dissemination

Faculty value the opportunity to publish and disseminate their scholarly work. Venues for dissemination of community-engaged scholarship through refereed journals have increased greatly since 2001. A search for journals focused specifically on civic and/or community engagement identified 38 different journals (L. Sandmann, personal communication, 2016). The search also identified another 63 disciplinary- or topic-specific scholarly journals and 12 higher education-oriented journals and news outlets that publish papers and stories about engagement. These publications include U.S. and international journals.

New scholarly conferences have also developed as venues for peer review and dissemination. IARSLCE, described in the introductory section, has become not only an annual venue for presenting
community engagement scholarship but also a prominent opportunity for graduate students to meet and network with senior faculty as well as other developing scholars and jointly develop knowledge. Similarly, the Engagement Scholarship Consortium, the Coalition for Urban and Metropolitan Universities, various Campus Compact affiliates, and disciplinary associations have created opportunities for dissemination of CES.

Nontraditional venues for the peer review and dissemination of scholarly products have also grown in number, as illustrated by three examples. The evaluation process for peer review and publication in CES4Health (2016b) uses a modification of the Glassick, Huber, and Maeroff (1997) criteria for scholarship. The peer review process mimics the traditional journal review process but is more rigorous, including reviewer training, detailed quantitative and qualitative analyses, short review time, and editor-generated letters of recognition to enhance the value of the resource. CES4Health asks authors to demonstrate and document authentic collaborations, sharing credit with community partners; the use of a scholarly approach to ground community-engaged work in scholarship that has come before; the utility of the product to others; quality and rigor in the context of the work conducted; and the real or potential impact of the work.

Multimedia Educational Resource for Learning and Online Teaching (MERLOT) is a program of the California State University System partnering with education institutions, professional societies, and industry (MERLOT, 2016). It is a curated collection of free and open online teaching, learning, and faculty development services contributed and used by an international education community. As MERLOT has evolved since 1998, its leaders have developed evaluation standards and peer review processes for online teaching-learning materials. Materials are submitted and evaluated, and results are reported; MERLOT members can also comment on materials, leading to crowd-sourced reviews.

MedEd Portal (2016) is another example of a nontraditional peer review mechanism: a program of the Association of American Medical Colleges in partnership with the American Dental Education Association and the American Psychological Association. It was established to promote educational scholarship and collaboration by facilitating the open exchange of peer-reviewed health education teaching
and assessment resources. It is a free, open-access publication service disseminating scholarly works that are considered stand-alone or complete teaching or learning modules that have been classroom-tested and are ready for implementation by users at their own institutions. Each submission is peer reviewed using a standardized review instrument grounded in the tenets of educational scholarship.

Each of these examples provides an illustration of how peer review mechanisms are evolving, thus providing more vehicles for the review and dissemination of products of community-engaged scholarship. As emphasized in the Carnegie Classification application and in these examples, professional development for faculty reviewers, department chairs, and deans is necessary in order to equip them with skills and rubrics to evaluate and value the CES submissions of their faculty colleagues (Gelmon, Jordan, & Seifer, 2013b).

What Do We Want to Know Today About Faculty?

There are a number of questions that may be asked about faculty and community engagement. Some new questions, beyond those we asked in the original handbook, include the following:

• How are junior faculty seeking to change academic culture and strategies for dissemination?
• Do engaged partnerships involve faculty from multiple disciplines; why or why not?
• In what ways have departments or institutions sought input and feedback from community partners in faculty review processes? How useful and effective are these methods?
• Do faculty seek to publish their engaged work in different venues than the other scholarly work they do; why or why not?
• Has teaching in ways that involve both faculty and students working in collaboration with community partners changed teaching styles?
• Do engaged faculty become involved in other types of interdisciplinary scholarship in addition to engaged work?
• Is involvement in engagement affecting faculty career paths (time to promotion, overall productivity, longevity in one place, or mobility across academic institutions or other sectors, etc.)?


Resources

For many faculty, the key to documenting and disseminating the evidence of their community-engaged teaching and scholarship will be finding the "best fit" strategy for their personal situation to demonstrate the value of their work. In some cases, this will occur in the manner by which individuals describe and format their work in a portfolio for professional personnel review: using new technologies and software, creating digital portfolios via a website, and structuring information to respond to criteria that may be campus-based in tenure and promotion policies or integrated into external peer review mechanisms. Faculty may be unaware of resources that exist on their own campus, so it is imperative that these resources (for teaching and learning, faculty scholarship, coordination of partnerships, or other support) be visible and available to faculty. Some of the resources that may be of use to faculty include the following:

• CES toolkit on Community-Engaged Scholarship: http://communityengagedscholarship.info
• Online Database of Faculty Mentors & Portfolio Reviewers: http://facultydatabase.info
• Research University Community Engagement Network: http://compact.org/initiatives/trucen/research-university-engaged-scholarship-toolkit/
• UNCG, two volumes on Excellence in Community Engagement and Community-Engaged Scholarship: https://communityengagement.uncg.edu/publications-reports/

Students

When the first edition of this handbook was published in 2001, research highlighting the need for educational and curricular reforms in higher education was growing and would soon lead to important new visions and frameworks for student learning. A focus on learning outcomes and curricular innovations increased the attention given to service-learning as a pedagogy. The Association of American Colleges & Universities (AAC&U) was on the verge of publishing Greater Expectations: A New Vision
of Learning as a Nation Goes to College (Ramaley & Leakes, 2002). This publication was born out of a national discussion about the need to transform education so that graduates were better prepared to live and work productively in the twenty-first century. Many of the recommendations focused on the need for educational programs that are engaging, practical, integrative, and analytical. By 2002, some institutions, such as PSU, had already transformed many of their educational programs to reflect the recommendations outlined in Greater Expectations, but with its publication more colleges and universities began looking for ways to bring student engagement, applied learning, and integrative learning into their programs (Schneider, 2002).

After the 2008 economic recession, the higher education sector came under greater scrutiny regarding the costs, value, and efficiency of undergraduate education. Online learning grew rapidly, and external forces pushed for faster and shorter degree programs. Part of the higher education response to these and related pressures was to increase learning effectiveness by improving student access to diverse forms of experiential learning as a central element of the educational experience. This increased institutional commitment to community-based learning models and to the assessment of student learning outcomes. As a result, there has been a growing tendency to use the term community-based learning as opposed to service-learning, because in some institutions service-learning is seen as specifically focused on civic and social responsibility (Welch & Saltmarsh, 2013). Regardless of the name, these forms of learning are also increasingly blended with undergraduate research, internships, practica, clinical studies, and global learning experiences, as reflected in their inclusion in High-Impact Practices (Kuh, Kinzie, Schuh, & Whitt, 2011).

Service-Learning as a High-Impact Practice

In 2008, an annotated bibliography of the large body of research on the impact of service-learning on student retention (Simonet, 2008) further cemented national understanding of this teaching and learning strategy as a high-impact practice (HIP) (Kuh et al., 2011). By 2010, AAC&U was strongly advocating the use of 10 HIPs (Kuh & O'Donnell, 2013), including first-year seminars, common intellectual experiences, learning communities, writing-intensive courses, collaborative assignments, service-learning, undergraduate research, global learning, internships,
and capstone courses. The inclusion of engaging and active pedagogies clearly signaled service-learning as a HIP, further elevated the importance of the pedagogy, and increased adoption of its usage within domestic and international higher education institutions. AAC&U documented the outcomes of service-learning, citing academic achievement, civic engagement, and personal growth as three organizing categories of impact.

An example of the application of HIPs is found at Virginia Commonwealth University (VCU). VCU developed a systematic institutional assessment model to investigate the impact of high-impact educational practices on undergraduate student success (Pelco & Baab, 2016). This HIP Assessment Model aligns with VCU's strategic and quality enhancement plans as well as with theory and best practice in higher education assessment (Astin, 1993). The model uses institutional data (derived from the Banner system) and program data (i.e., surveys and direct assessments) to answer assessment questions in three categories that are linked to institutional objectives for student learning. Category 1 investigates the degree to which diverse and underrepresented students participate in HIPs (VCU's Inclusive Excellence Objective). Category 2 examines whether participating in VCU HIPs increases students' retention and graduation rates (VCU's Degree Completion Objective). Category 3 explores the relationship between VCU HIPs participation and student learning and development (VCU's Quality of Learning Objective).

In 2015, AAC&U celebrated its 100th anniversary with the publication of a bold vision for higher education called The LEAP Challenge: Education for a World of Unscripted Problems (AAC&U, 2015). AAC&U made the case for a post-secondary education that "prepares students to understand and manage complexity, diversity, and change" (p. 1). The most important aspect of this educational proposition was that "students apply knowledge in a real world setting" (p. 1). This vision includes explicit language that all students would encounter this application of knowledge, not simply those in selected programs or those who chose to seek out experiential opportunities in college. AAC&U challenged institutions of higher education to create "Signature Work" opportunities for all students. These are culminating learning experiences that enable students to pursue a problem that they consider relevant in the world. AAC&U makes a compelling case for the importance of service-learning to build students' capacity to "find solutions to intractable problems"
(AAC&U, 2015, p. 2). AAC&U is committed to promoting “Signature Work in Action” by disseminating multiple models to assist colleges and universities in achieving this form of education and by furthering the implementation of service-learning in higher education.
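The Category 2 question in a model like VCU's (whether HIP participation is associated with higher retention and graduation rates) ultimately comes down to joining student records with HIP participation data and comparing outcome rates across groups. The following minimal Python sketch illustrates that kind of descriptive comparison using invented records and field names; it is not drawn from VCU's Banner extract or from the Pelco and Baab (2016) model.

from collections import defaultdict

# Illustrative only: hypothetical student records, not an institutional data extract.
# "hip" records whether the student completed at least one HIP course (for example,
# a service-learning or community-based capstone course).
students = [
    {"id": 1, "hip": True,  "retained_yr2": True,  "graduated_6yr": True},
    {"id": 2, "hip": False, "retained_yr2": True,  "graduated_6yr": False},
    {"id": 3, "hip": True,  "retained_yr2": True,  "graduated_6yr": True},
    {"id": 4, "hip": False, "retained_yr2": False, "graduated_6yr": False},
    # ...in practice, thousands of rows merged from institutional and program data
]

def rate(records, outcome):
    """Share of records for which the named outcome is True."""
    return sum(r[outcome] for r in records) / len(records) if records else float("nan")

# Group students by HIP participation and report outcome rates for each group.
groups = defaultdict(list)
for s in students:
    groups["HIP participants" if s["hip"] else "Non-participants"].append(s)

for label, recs in sorted(groups.items()):
    print(f"{label:18s} n={len(recs):5d}  "
          f"retention={rate(recs, 'retained_yr2'):.0%}  "
          f"graduation={rate(recs, 'graduated_6yr'):.0%}")

Because students self-select into many HIPs, a real analysis of this question would also control for entering characteristics (for example, prior achievement or first-generation status) before attributing differences in retention or graduation to HIP participation.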

Increased Understanding of Impact on Students

Service-learning and other forms of experiential education began to flourish within institutions as an increasing number of educational practitioners and researchers began to understand the power of grounding academic learning in experience. The field began to explore how this form of experiential learning had the power to develop civic learning, personal development, and intercultural competence among students. Growth in understanding of how the pedagogy can facilitate a diverse set of learning outcomes inspired increased scholarly exploration. In a review of the literature from 2008 to 2017, dozens of published studies were identified that examined the impact of service-learning on student learning outcomes and the elements that need to be in place to facilitate student learning. These studies begin to capture the power of the pedagogy and the many ways it can affect student learning. They helped to inform how service-learning experiences affected students' perceptions of themselves regarding competencies and outcomes such as citizenship, diversity, academic and professional development, career orientation, and degree completion (Bamber & Hankin, 2011; Hahn & Hatcher, 2014; Hatcher, Bringle, & Hahn, 2016; Keen & Hall, 2009; Levesque-Bristol, Knapp, & Fisher, 2011; McKay & Estrella, 2008; Yorio & Ye, 2012).

Laursen, Thiry, and Liston (2012) explored the impact of service-learning on the professional development of students. They report that service-learning participants experienced considerable gains in teaching skills, which the participants view as valuable both for educators and for other professions requiring scientific communication (Laursen et al., 2012). The students' personal and emotional gains—confidence as science teachers, pride and pleasure in their work—reflect a growing sense of identity as teaching professionals. Together, these gains addressed both cognitive and affective elements of socialization through mechanisms including formal training, experiential learning, and observation of other professionals (Laursen et al.,
2012). The service-learning project and the associated learning outcomes are relevant to those who have the teaching and scientific skills to practice in this community-based setting.

Service-learning has increasingly been used as a pedagogy to develop civic engagement learning outcomes. At IUPUI, there has been considerable attention paid to developing an understanding of the civic-minded graduate (Hatcher, 2008; Steinberg, Hatcher, & Bringle, 2008). This concept refers to a student who has "completed a course of study and has the capacity and desire to work with others to achieve the common good" (Center for Service and Learning, 2017). The Civic-Minded Graduate (CMG) model integrates the dimensions of identity, educational experiences, and civic experiences, and draws upon the domains of knowledge, skills, dispositions, and behavioral intentions. Methods for measuring the CMG construct include a quantitative self-report measure, a qualitative measure, and a protocol and rubric for face-to-face interviews (Center for Service and Learning, 2017).

PSU's community-based senior capstones are framed by a focus on interdisciplinary service-learning and assessment as a continuous improvement strategy. Many of these courses identify specific content-related learning outcomes as well as a focus on personal development, intercultural competence, and civic learning outcomes. Fullerton, Reitenauer, and Kerrigan (2015) capture the power of these courses to facilitate these learning outcomes. The research team interviewed 20 randomly selected alumni who took a community-based senior capstone course in which they worked in an intensive way with people with disabilities. Eighteen of the 20 interviewees identified the capstone as among their most significant learning experiences in their college education. Furthermore, 50% reported enhanced interpersonal and communication skills, 70% described a deeper appreciation for diversity related to disability, 60% described gaining a new perspective that human variation is typical rather than atypical, and 35% described a newfound maturity and gratitude leading to a desire to serve. Many respondents found the course to be a profound engagement with their own fear and discomfort around difference, and described the ways that engagement opened them up as human beings and allowed them to develop capacities they did not know they had.

Research on Service-Learning: Conceptual Frameworks and Assessment (Clayton, Bringle, & Hatcher,
2012) made a substantial contribution to the field by providing a review of past research in the field as well as a set of assessment methods and instruments. This publication offers assessment methods that reflect the varied learning outcomes and purposes of the service-learning experience. Historically, the field had focused on articulating the different ways student learning would be assessed depending on whether the course was seeking to develop civic learning, diversity competencies, and/or professional skills. Assessment methods were divided primarily between those focused on the application of course concepts and those focused more intently on developing students' civic learning skills and competencies. As a field, we now understand that service-learning has the power to affect student learning on a number of fronts, and thus our assessment methods need to be shaped in ways that are responsive to the variety of learning goals.

Expansion of Online Service-Learning Courses

In the past five years there has been growth in online course and program offerings. The expansion of online course and degree options is a response to market demands, allowing college courses to be more accessible to nontraditional students. Online community-based learning is an emerging educational practice and a growing area of interest. Student learning, and the conditions that facilitate the achievement of intended learning outcomes commonly associated with service-learning courses, are necessarily different for online courses. The pedagogical practices employed in online courses differ from those used in face-to-face courses, yet the learning outcomes are often similar. Learning outcomes for these courses include improved ability to solve problems in the community, intercultural communication, navigating difference and power dynamics, and understanding the political and social dynamics that ground the community work.

The course evaluation is used to evaluate the course as well as serving as one of the methods to assess student learning. Course evaluations used for online service-learning courses include the same questions included in the face-to-face service-learning courses. Student course evaluation data for online courses and face-to-face courses often show similar results across these modes of delivery. However, it is much more difficult for faculty teaching online courses to get high return rates for end-of-course evaluations from students. Berk (2012) documents that course evaluation response rates
are consistently 50% or lower, whereas face-to-face administration of evaluations garners 90% response rates. This is a challenge for online courses in general but is of particular concern in online service-learning courses, given the newness of this educational delivery platform in community-based courses. Dedicating staff time to make personal contact through a phone interview can be an effective way to increase student response rates. Assessing the educational practices used by faculty and learning experiences of students in these courses helps inform the continuous improvement of these and newly created online community-based courses.
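Because the follow-up effort described above is targeted at low-response sections, a program can flag those sections with a simple calculation once evaluation returns are tallied. The short Python sketch below is illustrative only; the section names, counts, and the 60% follow-up threshold are invented for the example rather than taken from PSU practice.

# Hypothetical end-of-term evaluation returns; names and numbers are invented.
sections = [
    {"section": "Capstone A (online)",       "enrolled": 25, "returned": 11},
    {"section": "Capstone B (face-to-face)", "enrolled": 30, "returned": 27},
    {"section": "Capstone C (online)",       "enrolled": 22, "returned": 15},
]

FOLLOW_UP_THRESHOLD = 0.60  # invented cutoff for triggering phone follow-up

for s in sections:
    response_rate = s["returned"] / s["enrolled"]
    action = "schedule phone follow-up" if response_rate < FOLLOW_UP_THRESHOLD else "no follow-up needed"
    print(f'{s["section"]:28s} response rate {response_rate:.0%}  {action}')

Tracking these rates term by term also documents whether the phone follow-up actually raises online response rates over time.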

Increased Emphasis on Assessment of Student Learning

In the past 15 years there has been a growing emphasis on accountability and, consequently, growth in student learning assessment. The primary purpose of the 2001 publication was to share what we had learned about the assessment methods we had developed and tested in the early days of large-scale service-learning implementation at PSU. As pointed out by Driscoll and Wood (2007), outcomes-based assessment across higher education is nearly 30 years old. Use of student work samples and rubrics to assess learning has grown rapidly. AAC&U has stated: "In the current climate it is not enough for an institution to assess its students . . . Colleges and universities also must provide useful knowledge to the public . . . about the quality of student learning" (Council for Higher Education Accreditation [CHEA], 2006).

AAC&U took on a leadership role in highlighting the need for revised accountability measures for liberal education (Schneider, 2004). In 2007, it took a substantial step in contributing to the body of tools that helped campuses assess liberal education student learning. The Valid Assessment of Learning in Undergraduate Education (VALUE) initiative resulted in the development of a set of rubrics meant to assess learning achievement by examining student work samples (Rhodes, 2008). The 16 VALUE rubrics serve as samples for campus development of individualized student assessment rubrics. A number of additional national associations, including the Business Higher Education Forum, the National Center for Public Policy and Higher Education, the Council for Aid to Education, and the Council for Higher Education Accreditation, have also highlighted the need to increase higher education accountability practices (McPherson &
Shulenburger, 2006). In their own ways, these associations have contributed to developing tools and practices that are helping document institutional effectiveness and student success. For example, the Business Higher Education Forum has urged that there be a new "accountability consensus" for student learning in higher education. This organization states, "We are most concerned with the national capacity to measure and publicly account for general knowledge and skill levels" (Hoagland, 2006). The leadership within these associations engaged in the hard work of developing accountability language. All of this high-level policy work makes itself visible in accreditation practices and, increasingly, in how campuses document student learning and institutional performance.

The important work of these various associations has led to substantial expansion, increasing sophistication, and greater consistency of methods used to assess student learning achievement. The national call to increase accountability and assessment generally resulted in the growth of assessment scholarship, which in turn informed the growth of new and effective approaches to assessing service-learning impacts on students. In higher education's current environment, with the expansion of service-learning and other forms of experiential education and the increased testing and understanding of student learning assessment, institutions are better prepared to consider the impact of service-learning on students and have identified some new methods of assessment that go beyond the methods explored in the first edition of this publication.

New Strategies for Assessing Impact on Students: Examples From PSU

Newer strategies for assessing student learning in the context of service-learning include a focus on course design as a method to frame assessment; small group instructional diagnosis; work sample assessment; critical incident methodology; assessment of online community-based courses; and summative assessments. The examples provided here have been effective at PSU and may be applicable in other settings.

Course Design as a Method to Frame Assessment

Since 1995 PSU has rigorously assessed its community-based capstone course experience. As a required course within the general education program, the capstone engages over 4,300 students per year in the community. It was clearly incumbent
upon the institution to thoroughly assess the outcomes achieved through these courses and engage in continuous improvement. A best practice that has emerged at PSU is the use of a holistic assessment approach that is extensively tested and applied to over 230 community-based learning courses per year. These methods are most consistently utilized in capstone courses but are applicable to any service-learning or community-based course at any level of the curriculum. The approach has the following six distinctive elements:

1. Faculty develop clear course learning outcomes framed in the course proposal process before the course begins. The articulated learning outcomes as identified by the instructor become the foundation of further assessment efforts.
2. Faculty submit clear learning outcomes in the course proposal and provide examples of how the learning outcomes as well as the institution-wide general education outcomes will be addressed in readings, assignments, in-class discussions, and reflection activities throughout the course.
3. The instructor agrees to have the students in the course participate in a small-group qualitative assessment that takes place during the third or fourth week of the term. This formative assessment practice occurs in all courses being offered for the first time and in 20% of courses offered on a regular basis. More information about this method of assessment can be found in the following description of the "Group Instructional Feedback Technique" (GIFT) (p. 28).
4. End-of-term course surveys/course evaluations are administered, collected, and analyzed, and the aggregate student data for the course are delivered back to the faculty for review.
5. Student work samples are collected from 20% of all courses each year (a simple sketch of this annual sampling step follows this list) to assess how students are engaging with, and fulfilling, the general education learning outcomes. Those work samples are analyzed with rubrics developed by the faculty.
6. Multiple opportunities each year are provided for faculty to talk about their teaching and assessment in various settings, including one-on-one faculty consultations, informal faculty brown-bag discussions, and formal workshops/retreats.
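Elements 3 and 5 of the approach listed above both depend on drawing an annual sample of roughly 20% of courses. A minimal Python sketch of that sampling step follows; the course names, the 20% fraction applied to a 60-course list, and the fixed random seed are illustrative assumptions rather than PSU's actual procedure.

import random

# Hypothetical course list; in practice this would come from the course schedule.
courses = [f"Capstone {i:02d}" for i in range(1, 61)]  # e.g., 60 distinct capstone courses

SAMPLE_FRACTION = 0.20      # roughly the share of courses assessed each year
random.seed(2018)           # fixed seed so the annual sample is reproducible and auditable

sample_size = round(len(courses) * SAMPLE_FRACTION)
selected = sorted(random.sample(courses, sample_size))

print(f"Selected {sample_size} of {len(courses)} courses for SGID and work sample collection:")
for course in selected:
    print(" ", course)

Recording the selected list (or the seed used to draw it) keeps the annual sample auditable; a real program would also exclude or rotate courses sampled in recent years.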


Each of these assessment elements will be described in further detail.

Tools to Develop Goals, Objectives, and Course Proposals

PSU works extensively with faculty to collect assessment data and to use those data for course improvement and program assessment. This process begins with clarifying goals, purposes, teaching and assessment methods, and partnering processes. The use of the course proposal is an initial element in gaining clarity on these fronts. Although PSU-specific, the Capstone Proposal can be a useful reference for those using a course proposal as a tool for assessment. Other institutions that wish to consider this approach may wish to modify the Proposal to reflect their institutional identifiers (see Portland State University, n.d.).

GIFT

GIFT, also referred to as Small Group Instructional Diagnosis (SGID), is a formative midquarter qualitative feedback process introduced by Angelo and Cross (1993). This feedback process, which asks students to evaluate course design, has become an educational best practice: courses are evaluated at the midpoint of the term in order to provide data to faculty for midcourse modifications that enhance student learning and outcomes. PSU has utilized this midquarter evaluation protocol for more than 15 years with significant success by having a skilled facilitator come into community-based learning courses and specifically ask the students (typically in groups) the following:

• What aspects of this course are helping you to better understand the course content? To better prepare you for your community work?
• What could be changed to improve this course?
• What specific suggestions do you have to bring about these changes?
• In what ways does this course enhance your understanding of our general education goals (Communication, Social Responsibility, Critical Thinking, and Diversity of Human Experience)?

Most of the open-ended questions used to facilitate these conversations are ones that have transferability to any community-based course. The fourth question in this protocol was targeted specifically to
address the general education learning outcomes at PSU and should be modified by other institutions to address relevant learning outcomes. After the feedback session the facilitator documents the students' feedback, provides it to the faculty member, and then meets with the faculty member to discuss tangible actions for improving the course. The primary focus of the evaluation is on improvement of individual courses. Programs can also aggregate the data by looking at the feedback for a collection of courses. PSU annually aggregates the data from about 25 capstone SGIDs to identify program strengths and challenges and to inform faculty development. Kerrigan and Jhaj (2007) detail how this methodology was used at PSU and provide specific examples of the types of student feedback collected. The advantage of this evaluation approach is that it provides faculty with accurate data regarding students' experiences in the classroom and gives faculty tangible suggestions for improving the course. This has become a best practice of formative evaluation, since faculty and students discuss how to improve the course well before the end-of-term course survey is issued and students' final grades are assigned.

End-of-Term Course Surveys/Course Evaluations

As at many campuses, the end-of-term student surveys used by PSU are similar to the student surveys provided in the 2001 publication of this book. To view data collected by PSU capstone surveys, see Portland State University (2017a).

Work Sample Assessment

Each year the PSU Capstone Program uses a work sample assessment approach to assess how individual students report growth along one of the General Education goals (communication, critical thinking, social responsibility, or appreciation for human diversity). This assessment method can be used at any institution that wishes to do program-level assessment by accessing student work samples. Identifying the programmatic goals and creating a relevant rubric are the essential ingredients to making this method transferable to other institutional settings. At PSU, 18 to 20 of the 60 distinct capstone courses are involved in this process annually. The faculty teaching the selected courses attend a premeeting with the other faculty whose courses were selected to be part of the Work Sample Assessment process for that year. The group discusses the learning goal being assessed that year and
asks them to identify the assignment they anticipate submitting at the end of the term. At the end of the term, faculty submit written work from five students in their course, the associated assignment instructions, the syllabus for the course, and any contextual course information. Each student submission is a written response to an assignment related to one General Education goal. There is an existing rubric for each of the four General Education goals. At the end of each academic year the 18 to 20 faculty who collected work samples meet and are trained to use the rubric relevant to the goal being assessed in that particular year. This process is used to determine whether the course met the expectations of the program, was exemplary, or was inadequate. Faculty meet and discuss the results and deepen their own understanding of the various ways that each learning goal can be enhanced in their course. The assessment is helpful in supplementing the data provided through course surveys and course evaluations. For example, it is common for over 80% of capstone students to report that they enhanced their communication skills and critical thinking skills in this community-based course, but the quantitative data do not explain how they enhanced these skills or in what specific ways they furthered them. Student-written essays and reflections do indeed detail for researchers how students enhanced these skills and provide specific examples of how these broad skills were fostered.

Critical Incident Methodology

In the 2001 edition of this publication, the Critical Incident Methodology was one of several methods used to understand how service-learning and civic engagement affect institutional practices. It has since been demonstrated that this method has utility in helping to assess student learning, and it has become a practice used at PSU for program assessment purposes. In addition to the comprehensive assessment approach employed in community-based capstones, PSU also has employed a modified "critical incident approach to outcomes assessment" (Bycio & Allen, 2004) in specific studies conducted to investigate the impact of community-based experiences on students after completion of the course. Bycio and Allen (2004) claimed that student questionnaires provide a large amount of student feedback to institutions regarding individual questions related to a course or program, but they fail to accurately assess the relevance, importance, or significance of the
educational event from the students' perspective. This critique resonated with researchers at PSU, who could see from the data that students were consistently reporting that they furthered their learning on the General Education goals in their community-based capstone, but those data gave no information to the institution regarding students' perception of the value of the growth in these areas. Researchers realized that students had experienced critical educational incidents, which are lived experiences "that reflect especially good or bad performance" (Bycio & Allen, 2004, p. 86) by an educator or an educational event in their college years. Researchers conducted an alumni phone survey that asked graduates to name their three most significant learning experiences in college and to describe factors that made these events significant. The goal was to ask these open-ended questions with no reference to community-based learning or capstones and simply let the graduates define what life experiences in college were most significant to them. The results showed that the community-based capstone course was the most frequently reported significant learning experience among PSU graduates (Fullerton et al., 2015). This gave researchers a new piece of information: the community-based capstones were achieving individual learning outcomes on survey questions, and, from a larger perspective, graduates found these courses to be the most significant learning experiences in their college education. This assessment process also informed researchers about the factors students detailed as contributing to the perceived significance: the highly relational aspect of the courses (high peer-to-peer interaction and high faculty-to-student interaction), engaging with persons from diverse backgrounds, working in teams, and applying learning in the field/community. This approach can be used at any institution and can simply entail the addition of a limited number of truly open-ended questions that allow community-based learning participants the opportunity to name which educational events in higher education had the most significant impact on their lives.
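Once open-ended responses like these have been coded into themes by the research team, producing the kinds of percentages reported above is a simple tallying exercise. The Python sketch below is a hypothetical illustration of that step; the theme codes and responses are invented and do not reproduce the coding scheme used in the Fullerton et al. (2015) study.

from collections import Counter

# Hypothetical coded responses: one list of theme codes per interviewee, assigned by
# human raters; a theme appears at most once per interviewee.
coded_responses = [
    ["capstone_significant", "interpersonal_skills", "appreciation_of_diversity"],
    ["capstone_significant", "new_perspective_on_human_variation"],
    ["capstone_significant", "interpersonal_skills", "desire_to_serve"],
    ["study_abroad_significant"],
    # ...one entry per interviewee
]

n = len(coded_responses)
theme_counts = Counter(theme for themes in coded_responses for theme in themes)

print(f"Interviewees coded: {n}")
for theme, count in theme_counts.most_common():
    print(f"  {theme:36s} {count:3d}  ({count / n:.0%} of interviewees)")

Reporting counts alongside percentages matters with samples this small (20 interviewees in the study above), since a single response shifts a percentage by five points.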

What Do We Want to Know Today About Impact on Students?

There are many questions that may be asked about students and community engagement. Some new questions, beyond those we asked in the original handbook, include the following:


• What is the impact of engagement strategies on student learning, retention, progress, and completion?
• What are the best approaches to the placement of students in communities similar to those of their own background and life experiences? What are the challenges of these strategies?
• Does engagement of students from marginalized populations help to raise awareness of key issues experienced by these populations among all students?
• How do students from different backgrounds and life experiences conceive of community? How does this affect their learning experiences?
• Where community engagement activities are voluntary, what are the characteristics of students who volunteer for these activities (as compared to when the experiences are required)?
• What kinds of experiences are most effective for freshmen as compared to seniors as compared to graduate students?
• What do we need to know about students' cultural backgrounds that influences their understanding of community-engaged experiences?
• Does a university presence in engaged work in a specific neighborhood or with underrepresented populations result in greater enrollment from that community?
• Do community engagement experiences have a differential effect on students who are first generation or from underrepresented populations?

Student learning assessment has enjoyed robust growth in the past dozen years. There has been a
nuanced understanding of the variety of reasons why a faculty member opts to use service-learning pedagogy in their courses. Faculty may seek to offer students an opportunity to apply and test course concepts. Sometimes service-learning is used as a way to help students develop and practice civic engagement identities and skills. There are times when service-learning is used as a means to explore diversity and social justice goals. This variety of purposes shapes the assessment methods. PSU has continued to use many of the assessment tools from the 2001 edition of this handbook, yet the expanded understanding of service-learning and the national and international growth of assessment methods over the past 15 years have inspired the use of additional methods. Readers are encouraged to explore resources from a variety of campuses and organizations in order to find and adapt those best suited to their purposes.

Concluding Comments

As stated at the beginning of this narrative, we have prepared an update and reflection from our collective experiences over the past 15 years on the evolution of measuring, assessing, evaluating, and monitoring the impact of community engagement strategies. This work is dynamic, and thus any handbook such as this is out of date as soon as it is published, because new methods, strategies, and research results are routinely being generated. We encourage users to continue to be creative, to develop new methods, and to share these widely through conferences, publications, and nontraditional venues for dissemination in order to continue to advance our work and expand our knowledge and application in practice.


Chapter Two

ASSESSMENT PRINCIPLES AND STRATEGIES
An Overview

Institutions committed to civic engagement and service-learning must be able to demonstrate the impact of these initiatives to ensure quality for student and community participants, to justify resource investments, and to inform the improvement and expansion of such programs. Understanding and articulating “impact” requires knowledge and expertise in the design and application of assessment methods. This handbook provides basic information on practical methods and tools for assessment planning, design, and implementation. The material can be used for a campus-wide service-learning initiative, for individual courses, or for other activities related to civic engagement and community involvement. Since, throughout the text, the primary focus is on developing a comprehensive assessment strategy on the institutional or programmatic level, the word program occurs often. Nevertheless, as has just been noted, these materials can also be adapted for individual faculty use in freestanding courses.

Context for Assessment

Increasingly, higher education is experiencing a shift away from a traditional emphasis on teaching to a new emphasis on learning. Barr and Tagg (1995) have described this as a movement from seeing colleges as institutions whose function is to provide instruction to seeing them as institutions designed to produce learning. If one thinks of the core concepts that attend teaching and learning—for example, knowledge, focus, the curriculum, instruction, design strategies, student roles, and organizational change—a framework such as Figure 2.1 can be used to illustrate the transition from the "old" to the "new." Note that service-learning and other forms of community-based education all demonstrate characteristics of the new—emphasizing the application of knowledge, a team and community focus for learning, collective instruction and collective curriculum definition, an integrated sequencing of courses, and active student learning. These concerns are all important to keep in mind as one begins to consider how one can best assess the impact of such programs.

The language of "civic engagement" and community participation is increasingly evident in discussions of trends in higher education. Ehrlich (2000) describes civic engagement as "working to make a difference in the civic life of our communities" (p. vi). Such an enterprise changes the function of the institution, creates new challenges for faculty roles, offers opportunities for new collaborations with community partners, and affects not only what students learn but also what should be taught. These changes inevitably lead to questions such as "What impact is engagement having on the institution and its component parts?" To answer questions like this one, carefully constructed assessments must be created—assessments that ask clear questions, collect appropriate data, and analyze and report results in meaningful ways.

Note. The text for this chapter is adapted from Sherril B. Gelmon, "How Do We Know That Our Work Makes a Difference?" Metropolitan Universities, 11, 2 (Fall 2000), pp. 28–39. Permission granted by the editor.


Figure 2.1. Moving from teaching to learning.

"Old Way"            Issue                    "New Way"
Acquisition          Knowledge                Application
Individual           Focus                    Team/community
By faculty           Curriculum definition    By faculty/community, students
Banking              Instruction              Collective
Prescribed courses   Design                   Integrated sequence
Passive              Student learning         Active
Sporadic reform      Change                   Continuous improvement

Why Do Assessments?

Why do we do assessments? Many say that universities have to prepare better-educated students; therefore, the primary reason for assessment is to improve student learning. Others believe that a key reason to conduct assessments is to provide immediate feedback to enable program leaders to make incremental changes, responding to identified needs and concerns. Over the long term, assessment data can provide the basis for program planning and for redesign and substantive improvement.

Why is assessment currently so important? Across the United States, there is increasing interest in assessment due to regulatory requirements, public demands for educational accountability, and administrative concerns about resource utilization—not to mention the growing interest in assessment for program improvement (Gelmon, 1997b). With ever-increasing calls for accountability from funding agencies and accreditors, particularly regarding resource accountability, there exists a regular demand for clear assessment data. At the same time, many campuses find it difficult to articulate how a service-learning program should be assessed, let alone how the results of such an assessment can contribute to an understanding of civic engagement. But as we learn more about the institutional impact of a commitment to engagement, the main reason for documenting that impact reveals itself as essential to ensuring sustained, high-quality relationships among all participants. Faculty, students, community partners, and institutional participants all come to a partnership with different concerns and expectations; these demand a complex and intentional assessment strategy (Holland, 2001a).

ASCV.indb 32

Team/community

Integrated sequence Active Continuous improvement

Furthermore, although interest in service-learning and engagement continues to grow, some faculty remain cool to these new endeavors and remain skeptical, demanding strong evidence for the value of this work. Assessment can produce the evidence of impact many faculty require and, thus, can lead to broader participation.

Service-Learning and the Engaged Campus As institutions explore and discuss the concept of an engaged campus, some faculty demand to know what exactly engagement is in relation to academic work and whether such a focus represents a true descriptor or key trait of their institution’s missions and actions. What does an engaged campus look like? How does it result in different faculty work and expectations? What are the characteristics of engaged students and the nature of their learning experiences? What can be observed about community-campus partnerships? All of these questions can begin to frame a campus-based assessment of engagement. Since studies show that an institutional commitment to engagement is strongly linked to the inclusion of community-based learning experiences in an institution’s curricula (Holland, 2001b), service-learning can be viewed as an instructional strategy appropriate for increasing civic engagement (Hollander & Hartley, 2000; Gelmon, 2000a). This handbook is intended to help fill a void in the published literature about assessing the impact of service-learning across a broad range of constituencies.


Engagement and partnerships represent a new dimension of academic work. The multidimensional nature of partnerships and engagement is reflected in several recent publications. For example, the 1999 report of the Kellogg Commission on the Future of State and Land-Grant Universities, Returning to Our Roots: The Engaged Institution, offers a set of characteristics that form a seven-part "test" of engagement (Kellogg Commission, 1999):

• Responsiveness: Are we really listening to the communities we serve?
• Respect for partners: Do we genuinely respect the skills and capacities of our partners in collaborative projects?
• Academic neutrality: Do partnerships maintain the university in the role of neutral facilitator despite the existence of potentially contentious issues?
• Accessibility: Is our expertise equally accessible to all the constituencies of concern within our communities?
• Integration: Does the institutional climate offer new opportunities for integrating institutional scholarship with the service and teaching missions of the university?
• Coordination: Are academic units, institutional support offices, and groups of faculty, staff, and students dealing with others productively and sharing/translating their knowledge into something the public can appreciate?
• Resource partnerships: Are resources committed to the task sufficient?

Each of these characteristics offers a potential focus for planning and assessment. Similarly, the 1999 Presidents' Fourth of July Declaration on the Civic Responsibility of Higher Education (Campus Compact, 1999) includes the "Campus Assessment of Civic Responsibility," which provides a framework for institutions to conduct a baseline self-assessment of their readiness for civic engagement by involving administrators, trustees, faculty, staff, students, alumni, and community partners in a deliberative process of describing institutional commitment and conditions. The questions included in the Compact's assessment framework address topics such as leadership, curriculum, involvement in community public policy development, campus and faculty culture,

ASCV.indb 33

diversity, community-campus partnerships, communication, and community improvement. Analogously, Holland (1997) has created a matrix of key organizational factors that can guide campus explorations of commitment to engagement based on the degree to which engagement is seen as an element of campus mission. For each level of commitment, the matrix provides measures and features associated with successful and sustained engagement programs. Furco (2000) has developed still another guide for assessing the institutionalization of service-­learning. He proposes five dimensions for such an assessment: (a) philosophy and mission of service-learning, (b) faculty support/involvement in ­ service-learning, (c) student support/involvement in service-learning, (d) community participation and partnerships, and (e) institutional support. Multiple categories are then articulated for each dimension. A team involved in self-assessment would consider each criterion and determine its school’s place according to the following three levels of institutionalization: 1. Critical mass building 2. Quality building 3. Sustained institutionalization The engaged campus is also being discussed in many higher education forums, and one of the key areas that emerges here as vital for any assessment effort is an analysis of community-university partnerships. Such partnerships are an essential component of engagement and form the basis for service-learning activities. A review of lessons learned through projects related to service-learning and other forms of community-based learning (Holland & Gelmon, 1998) suggests that some of the key areas in need of assessment include the nature of relevant partnerships; understanding the needs and assets of both the educational institution and the community; defining community; leadership roles; curricular placement and emphasis; the nature of learning; and partnership sustainability.

Beginning the Assessment Process

Assessment serves a useful purpose as a mechanism to tell the story of what one has learned from one's work—articulating that learning for oneself as well as for others. In beginning any assessment process, one should ask a series of key questions. The answers to the following questions will frame the design of the assessment process:

• What is the aim of the assessment?
• Who wants or needs the assessment information?
• What resources are available to support the assessment?
• Who will conduct the assessment?
• How can one ensure the results are used?

These questions are important for several reasons. Every assessment process should have an aim and stated purpose. Without them, there may seem to be little reason to carry forward the required work. Furthermore, the person or agency wanting or needing the assessment may dictate the nature of the actual work to be carried out: Is it mandated by a funder, is it part of an accreditation or other regulatory review, or is it part of an individual's personal performance review? To ensure implementation and follow-through, an assessment plan must identify the resources that will support the assessment and the person or persons who will do the work. Too often, assessments are designed without a clear understanding of their resource implications, with frustration resulting from the fact that the plans drawn up do not correspond to the realities of available resources and expertise. Finally, it is important to ensure that the results of an assessment process will be attended to and used. Designing and conducting a comprehensive program assessment, only to see the results of that process ignored, can leave those involved with a deep sense of futility.

As suggested previously, assessment is important, first, to articulate what has been learned for oneself. Many professionals in higher education today have little time to stop, reflect, and consider the impact of their work. Articulating the implications of that work can help to delineate issues, describe strategies, and highlight areas where further work is needed. Assessment can also provide an opportunity to stop and celebrate successes that have been achieved—something rarely done. Finally, assessment can help us to focus our thinking in ways that result in new insights and identify opportunities for improvement.

Second, assessment helps us to articulate our learning for others. It can facilitate our sharing lessons we have learned and can transmit knowledge essential to others' learning. In particular, a strategic assessment plan can identify factors of import to others conducting similar or parallel work.


Finally, we should note that assessment can vary in its scope depending upon its constituencies and purpose. In the context of an institutional review for regional accreditation, an assessment plan might be university-wide. However, a department or program might undertake assessment for internal review purposes, for professional review (by a state governmental entity or a specialized/professional accreditor), or as one aspect of a departmental/program plan. Campus-wide general education programs are often the focus of assessment efforts aimed at gaining greater understanding of how such programs affect multiple student populations. New service-learning initiatives are often the focus of intensive assessment efforts as institutional leaders seek to determine the value and expense of service-learning as compared with other pedagogical approaches. Since assessment of service-learning and civic engagement demands that one understand multiple areas of impact, this handbook presents detailed discussions of how one can understand the impact of these activities on students, faculty, community partners, and the institution as a whole.

Who Should Be Involved in Assessment?

Successful assessment requires bringing together players central to the activity being assessed and helping them to step outside their normal roles and to create a new culture—one that facilitates pooling their collective interests to focus on the program, service, department, or other activity being assessed. For this reason, assessment can have a transformational impact on the unit or activity in question (Magruder, McManis, & Young, 1997).

In the context of service-learning assessment, multiple players are essential to provide a broad perspective on program impact. Students, faculty, community partners, and institutional leaders all have a distinctive role to play as key informants. In addition, faculty may be involved in the design and administration of assessment activities related directly to their classes. Community partners may wish to play a similar role in the design and administration of assessment procedures related to their responsibilities. Graduate students may also represent a valuable resource in helping with assessment design, administration, and analysis, for such work may have bearing on their own programs of study. Finally, institutional centers for teaching and learning, community outreach, institutional research and assessment, and/or service-learning can play important roles in various aspects of the assessment process.

Considerable debate exists regarding the merits of centralized versus decentralized responsibility for assessment. In some institutions, a central office has been developed to provide a focus within the institution's administrative structure and to serve as a campus-wide resource (e.g., Palomba, 1997). While establishment of such an office is often viewed as evidence of an institution's commitment to assessment, one liability of such an approach is that faculty and/or departments may come to view assessment as solely the responsibility of that office and as something about which they need not be concerned. Furthermore, results may be viewed with suspicion, since they derive not from the faculty but from a central administrative structure. One way to avoid problems like these could involve using a central office merely as the resource that supports, encourages, and facilitates faculty and/or departmental activities by encouraging, for example, buy-in to institution-wide assessment activities and by disseminating assessment results. Such buy-in is invariably enhanced by investing in faculty and staff development of assessment skills and by involving as many people as is feasible in the assessment process.

Common Themes and Concerns in Beginning Assessment

Several concerns frequently arise at the beginning of the assessment process. One has to do with identifying appropriate and affordable expertise. At academic institutions, despite the presence of a number of disciplines where one might expect to find assessment expertise, it is often difficult to find more than a few individuals, if any, who have the particular expertise needed to design, lead, and manage this work. At some institutions such expertise may indeed be found, but the individuals who possess it may already be overcommitted to other projects and/or scholarly activities. If obtaining the necessary expertise represents a financial investment, it will be especially important to determine accurately what resources are available to support that investment.

A second concern relates to conceptualizing the focus of the assessment process; in other words, what precisely is to be assessed? When? For whom and for what purposes? The key questions identified in an earlier section can help to answer these questions and to frame the project, but considerable discussion may be needed to reach agreement on how best to frame the assessment plan.

Once the focus of the assessment has been conceptualized, another concern has to do with implementation: Who is responsible? What resources do they have? What leverage exists to encourage people to participate in assessment activities and to cooperate in meeting data needs in a timely manner?

Still another concern relates to the selection of assessment methods. Those selected must take into account the purpose of the assessment and anticipated uses of the information collected as well as the potential burden the assessment activities will impose. When plans and needs are clearly set out and agreed upon, agreement upon methods may exist. However, it is not unusual to find participants who feel they themselves are the experts who should dictate specific assessment methods, particularly for use in their own classrooms. In this regard, one especially common source of contention involves the ongoing debate between qualitative and quantitative data—a debate that includes questions of appropriateness, validity of results, generalizability, and other challenges. These questions, in turn, lead to discussions of rigor, specification of methodological needs, and ultimately to design issues that may go well beyond the resources available to support the assessment process.

A final concern revolves around the uses of assessment findings. Once again, problems can sometimes be avoided if there is discussion and agreement from early on as to what will be done with the data. Perceptions of a "closed" process or one intended to justify program closure or termination of faculty/staff positions will compromise the assessment process. Measures designed to guarantee an open process with agreed-upon uses of the data obtained will assist greatly in facilitating assessment activities. However, even a well-designed plan and an emphatically open process may still encounter resistance—whether the threat the findings represent is real or perceived. If outside experts are brought in (either to provide an alternative to or to supplement internal experts), these outsiders may be intrinsically intimidating (the fear of airing "dirty laundry" in public). Skeptics will then question the rigor of the assessment plan and its methods and may not be willing to accept that compromises in scientific method are sometimes necessary to meet deadlines.

Other issues that may be raised include training to develop the internal capacity to conduct and manage the various components of the assessment, as well as questions related to supervision, data collection, confidentiality, and data management. Finally, resistance is almost a certainty when the environment is already politically charged—for example, where there exists fierce competition for resources, assessment data may help decide allocations.

While each assessment situation is unique, there nevertheless exist several responses that may help to overcome any resistance. Agreement upon the purposes of the assessment, public sharing of these purposes, and careful adherence to the assessment's purposes and scope will help to establish the authenticity and legitimacy of the effort. Energy should also be deliberately invested in building buy-in for its value. Confidentiality of the respondents must be assured. Roles and tasks must be defined early in the process, and mechanisms for regular reporting, sharing of findings, updates, and airing of concerns must be clearly established.

Assessment as an Improvement Strategy

Assessment can be viewed as a strategy for improvement—an integrated set of activities designed to identify strengths and areas for improvement and capable of providing evidence to support future program planning. Assessment can be a useful mechanism to tell a program's story but becomes most useful only when it is viewed as a value-added, routine undertaking—not as a burdensome add-on or species of busywork. Assessment gives program managers, administrators, and other leaders a mechanism to identify what they have learned that is useful—both to articulate it internally and to share it with others.

Such an approach to assessment builds upon the "Model for Improvement" (Langley, Nolan, Nolan, Norman, & Provost, 1996) that has been used widely throughout many sectors, including higher education and health care. The core assumption here is that work/situations can usually be improved upon—or, conversely, that one needs to develop evidence to know that change is not needed. This model consists of the following three basic elements:

• Statement of the aim: "What are we trying to accomplish?" This clarifies the purpose of the assessment and makes it explicit to all those participating.
• Clarification of current knowledge: "How will we know that a change is an improvement, or if no change is needed?" This sets out what is known and what the new knowledge will make possible once the assessment has been completed.
• Testing of various improvements: "What changes can be tried that will result in improvement?" Based on what has been learned, this will help to define what might be implemented as initial improvement strategies.

When one applies this model to higher education, the following questions help frame the assessment process (Gelmon, Holland, Shinnamon, & Morris, 1998; Gelmon, White, Carlson, & Norman, 2000):

• How is learning conducted (e.g., service-learning or learning grounded in community-university partnerships)?
• How does this pedagogical method become part of the curriculum (how is it introduced, how is it developed, how is it integrated)?
• How can this educational method be improved?
• How do individuals using this method know that a change is an improvement (i.e., what comparisons can be made using pre- and post-data)?

In thinking of assessment as an improvement effort, one can focus on delineating issues that otherwise might not be obvious; describing strategies for future replication; highlighting areas for further work; and/or focusing thinking. Sharing the results of this effort through internal communications and/or through broader external dissemination via presentations at professional meetings, publications in professional literature, and postings on websites will also facilitate others' learning.

Using such an approach to assessment may well result in the identification of many opportunities for improvement. For example, curricular strengths may be clarified, validating existing knowledge and providing data to support the continuation of current activities. Or, conversely, deficiencies may be uncovered, thus providing the evidence and justification needed for making changes. Assessment may also serve to identify areas where faculty resources should be reallocated and where faculty should be recognized for excellence or assisted in correcting deficiencies.

In an institutional context, such activities are vital in order to consider broader issues of resource allocation (human, fiscal, physical, information, technological, and other resources), to inform public relations and marketing strategies, and to consider possible changes or realignments in organizational relationships and strategies.

A Multiconstituency Approach to Assessment

The authors have developed a multiconstituency approach to assessment—initially for use in assessing service-learning, and now for use in a wider range of community-based learning situations. This approach began at PSU as part of an explicit effort to assess the impact of service-learning on students, faculty, the institution, and the community (Driscoll, Holland, Gelmon, & Kerrigan, 1996). At that time, considerable effort was being invested in developing undergraduate and graduate service-learning courses, as well as in community-university partnerships, and the institution needed to understand the impact of these efforts. At this time, the university was also implementing a new general education program and was attempting to create opportunities for service-learning experiences throughout that program, particularly in community-based senior capstone courses.

This approach was then greatly expanded and revised to assess the impact of service-learning in health professions education for the Health Professions Schools in Service to the Nation (HPSISN) program, a national service-learning demonstration program (Gelmon, Holland, Seifer, Shinnamon, & Connors, 1998; Gelmon, Holland, & Shinnamon, 1998; Gelmon, Holland, Shinnamon, & Morris, 1998). This expansion added community partnerships as a fifth area of assessment focus. In both the PSU and HPSISN cases, the goal of the assessment effort was to explore implementation of service-learning and its differential impact on various constituencies and to identify lessons learned for future service-learning programming. The HPSISN project was one of the first efforts to study the disciplinary implications of service-learning.

Subsequently, the model has been applied in other attempts to assess the impact of learning in the community. In two of these projects, students, faculty, and community partners worked together on community health improvement as part of their academic course-based work. The two projects are described in the following:

• An assessment of the Community-Based Quality Improvement in Education for the Health Professions (CBQIE-HP) program, where interdisciplinary teams of health professions students worked on specific community health improvement projects using the Model for Improvement methodology (Gelmon & Barnett, 1998; Gelmon, White, Carlson, & Norman, 2000), and
• An evaluation of the Portland Tri-County Healthy Communities initiative, a cross-sectoral community development approach to building community collaborations to address specific community health problems (Gelmon, McBride, Hill, Chester, & Guernsey, 1998).

Another example of the adaptability of this approach to assessment design was demonstrated in 1998, when the Goal-Concept-Indicator-Method approach was used to create a matrix and methods for assessing the impact of a new master's in public administration degree with a concentration in Tribal administration, a program delivered to students using distance learning technology (administered from PSU's Mark O. Hatfield School of Government). This unique project, funded by a U.S. Department of Education FIPSE grant, called for an assessment model that tracked the effectiveness of program strategies and campus-tribal partnerships in meeting specific program goals and in creating satisfactory learning experiences for students in multiple locations (Holland, 2000a).

The design was adapted again in 1999 to create a unique assessment plan for a national project involving many types of civic engagement activities. The Council of Independent Colleges, funded by the W. K. Kellogg Foundation, awarded grants to eight private, urban liberal arts colleges to facilitate their exploration and implementation of an urban mission defined primarily in terms of civic engagement and partnerships. This effort resulted in interesting lessons about the unique challenges faced by smaller private colleges when they seek to enhance their civic engagement programs (Holland, 1999b; Holland, 2001b).

All these projects have confirmed the utility of this multiconstituency approach for assessment of a broad array of partnership activities, including but also transcending service-learning.


The Assessment Matrix

The approach presented here is based on the development of a conceptual matrix, derived from project goals, that frames the assessment plan, guides the development of assessment instruments, and structures the data analysis and reporting. At PSU, it initially served as a framework to predict outcomes anticipated as a result of early anecdotal data available at that time. The approach was then refined and began to be referred to as the Goal-Concept-Indicator-Method approach (Gelmon, Holland, & Shinnamon, 1998; Shinnamon, Gelmon, & Holland, 1999). It involves the following four primary questions:

• What do we want to know? This helps the evaluator to articulate the aim of the assessment, based upon the project goals.
• What will we look for? This leads the evaluator to identify core concepts that are derived from the project goals and the aim of the assessment.
• What will we measure? For each core concept, relevant measurable/observable indicators are specified which will enable the evaluator to measure or observe change or status.
• How will we gather the evidence needed to demonstrate what we want to know? At this stage, the evaluator identifies or develops appropriate methods and tools by which he/she can collect the information for each indicator and identifies sources of the data.

Such an assessment framework thus provides a structure to guide evaluation; enables program administrators and evaluators to articulate clearly the framework for evaluation; and facilitates data collection, analysis, and reporting in a way that is true to the aims and goals of the assessment. In addition, this framework overcomes many of the sources of resistance to assessment previously described. It is illustrated in each of the following four chapters in the context of a conceptual matrix for assessing impact on students, faculty, community, and institution (see Tables 3.1, 4.1, 5.1, and 6.2). It strongly links assessment to program goals and in this way increases the effectiveness of the assessment effort and the relevance of its findings.

The assessment framework is a tool that helps guide the thinking process in the design phase, serves as an important framework for implementation, and aids in defining and focusing the analysis. In its skeletal form, it appears as the matrix presented in Table 2.1. It has four main components: core concepts, key indicators, methods, and sources of information (Gelmon & Connell, 2000).

Core concepts are broad topic areas. To identify these, ask: "What are the possible areas of impact that can be observed from courses, programs, or activities?" The definition of a concept is written in neutral language to provide a basic foundation for continued discussion and elaboration as to how the program may affect such a concept. Stating a concept in language such as "increase in _____" or "change in _____" introduces a bias into the assessment and compromises objective data collection. Refer to Tables 3.1, 4.1, 5.1, and 6.2 for examples of core concepts for each of the constituencies.

Key indicators are key evidence of impact and are usually stated as the specific measurable or observable factors related to each core concept. To develop these, ask: "What might we look for to show that the concepts are being addressed? What measures can we study to gain evidence of how the core concepts are being affected?" As with the concepts, these should be stated in neutral rather than directional terms to avoid bias. There should be multiple key indicators for each core concept. Wherever possible, avoid defining indicators such as number of, increase in, improved, and so on, as this may limit the range of available data collection methods. For example, by stating "number of ____," one is directed to quantitative methods, whereas by avoiding this terminology one can use quantitative or qualitative methods. (Again, refer to Tables 3.1, 4.1, 5.1, and 6.2 for examples of suitable key indicators.)

TABLE 2.1
The Matrix Framework

Core Concepts | Key Indicators | Methods | Sources

Methods indicate the actual instrumentation or strategy to be used for gathering evidence through measurement or observation. In selecting appropriate methods, ask: "How will we look for it?" This refers to the instrument(s) one selects and, if applicable, to the way one will use it (them). The most commonly used instruments include the following:

• Surveys (self-administered or administered by another person)
• Interviews (in person or telephone)
• Focus groups
• Document reviews
• Observations
• Journals
• Critical incident reports

Each matrix in Tables 3.1, 4.1, 5.1, and 6.2 lists a variety of methods. A more detailed discussion of methods for each constituency is presented as those methods are introduced.

Sources of information may be a specific person, a group of people, a database, or a report. A source may be internal or external to an organization and may be people who have had some personal contact or experience with the activity being assessed or documentation containing relevant information. In reality, different methods may be used for each indicator, and each source may provide data for many methods, but not all sources will be involved in each method, and not all methods will address each indicator. While there is a direct linear relationship between a concept and its related indicators, there is no such relationship between methods and sources.

Using the Assessment Matrix

The completed matrix should be reviewed to ensure that the concepts are clear and distinct. Indicators should be verified to ensure that they are measurable or provide opportunities to collect evidence. If it is not possible to determine how an indicator will be measured or observed, it should be restated to enhance specificity. Program goals should be reviewed to ensure that the concepts and indicators reflect those goals, that all information included in the matrix is necessary to assess their accomplishment, and that no goal or major activity has been overlooked. Finally, it is important to ascertain that what has been set out for assessment is practical and feasible within the context of a specific organization, that is, with reference to the resources available and the population(s) being served.

The matrix will be very useful in focusing the analysis of data. Key indicators of a program's success as listed on the matrix provide a critical point of reference that, although flexible, holds the evaluators accountable to the program's goals and objectives. Since the matrix will have been used in determining what information should be gathered and in developing the appropriate evaluation instruments, the data gathered should relate directly back to the key indicators and core concepts. In analyzing the data, one should focus on how the key indicators are reflected and to what extent they have been achieved.
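
Teams that choose to keep their matrix in a simple electronic form can make these review steps routine. The following sketch is purely illustrative: the data structure and function names are assumptions for demonstration, and the sample entries are drawn from Table 3.1. It shows how the completeness check and the neutral-wording advice described above could be applied automatically to a matrix stored as data.

# Illustrative only: a matrix row holds one concept with its indicators,
# methods, and sources; the review flags missing cells and directional wording.
DIRECTIONAL_WORDS = ("increase in", "number of", "improved", "change in")

matrix = [
    {
        "concept": "Awareness of community",
        "indicators": ["Knowledge of community issues",
                       "Ability to identify community assets and needs"],
        "methods": ["Interviews", "Focus groups", "Classroom observations"],
        "sources": ["Students", "Faculty", "Community partners"],
    },
    # ...additional concepts would follow the same pattern
]

def review_matrix(rows):
    # Return a list of problems: empty cells or indicators stated directionally.
    problems = []
    for row in rows:
        for field in ("indicators", "methods", "sources"):
            if not row.get(field):
                problems.append(row["concept"] + ": no " + field + " specified")
        for indicator in row.get("indicators", []):
            if any(word in indicator.lower() for word in DIRECTIONAL_WORDS):
                problems.append(row["concept"] + ": indicator '" + indicator
                                + "' is stated directionally; restate it neutrally")
    return problems

for issue in review_matrix(matrix):
    print(issue)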

Issues in Instrument Selection¹

The key issue in selecting appropriate instruments is determining what will provide the best information to help accomplish the goals of the assessment. Selection of relevant assessment instruments involves evaluating their relative merits and determining which are best suited to specific needs (Gelmon & Connell, 2000). Primary considerations in selecting specific instruments include the following:

• Design issues: time, expertise, resources available
• Data collection: ease, time, expertise needed
• Data analysis: skills needed, time, level of detail
• Response content: limited versus expansive
• Flexibility and accuracy of instrument
• Bias introduced by method
• Nature of questions: open-ended, closed-ended
• Side benefits or disadvantages

Each of these key considerations is illustrated in Table 2.2 for a variety of assessment methods.

Each assessment instrument also raises issues with respect to the trade-off between resources required to administer and analyze an instrument and the value of the information that will be collected. Resources include money, equipment, expertise, and time for design, set-up, administration, analysis, and reporting. Key issues to consider in measuring the trade-off include the following:

• Set-up time
• Administration time
• Analysis time
• Other issues requiring resources (that may outweigh the potential value of the data)
• Nature of output

TABLE 2.2
Comparison of Assessment Methods

Survey
• Design issues: Relatively complex and labor intensive; requires expertise in survey design
• Data collection: Effort to ensure participation; sample or whole population; selection of subjects important to ensure representation; use existing lists to recruit; resources include printing, mailing, responses
• Data analysis: Hand analysis or by computer; requires knowledge of statistics
• Response content: Limited; less detail
• Flexibility/accuracy: Accurate if questions valid; generalizable if validated; little flexibility once designed
• Bias: Little (controlled by question design)
• Nature of questions: Closed (primarily)
• Side benefits or disadvantages: Easy to report to different audiences; results may be used for public relations and/or promotion

In-Person Interview
• Design issues: Relatively easy for expert to design
• Data collection: Sampling for subjects; interviewer training; time intensive (one on one)
• Data analysis: Lengthy; need qualitative skills
• Response content: Own words; range of opinions; detailed; reflective
• Flexibility/accuracy: High flexibility within protocol; use of probes
• Bias: Potential for high interviewer-introduced bias; nonverbal issues
• Nature of questions: Open ended; conversational
• Side benefits or disadvantages: Rich input; comparability requires large number of interviewees; labor intensive and time-consuming

Focus Group
• Design issues: Relatively easy for expert to design
• Data collection: Selection of participants; ensure range of representatives; expertise to facilitate; interviewer training; permission necessary; time intensive
• Data analysis: Lengthy; need qualitative skills
• Response content: Highly detailed; dynamic
• Flexibility/accuracy: High flexibility within protocol
• Bias: High potential to derail among participants; nonverbal issues
• Nature of questions: Open; conversational
• Side benefits or disadvantages: Participants can build upon each other and interact and therefore generate more ideas than an individual alone

Telephone Interview
• Design issues: Relatively easy for expert to design
• Data collection: Sampling for subjects; interviewer training; agreement necessary; less time than in person; must use simple words; potential for increased number of rejections
• Data analysis: Lengthy; need qualitative and quantitative skills
• Response content: Own words; range of opinions; detailed
• Flexibility/accuracy: High flexibility within protocol
• Bias: Eliminates nonverbal issues
• Nature of questions: Open; conversational or survey-type
• Side benefits or disadvantages: Blend of strengths of surveys and interviews; high potential for early termination; potential for fabrication of answers

Observation
• Design issues: Relatively easy for expert to design
• Data collection: Training of observers; time-intensive to observe; permission necessary
• Data analysis: Lengthy; need qualitative skills
• Response content: Varied (fixed versus open); words of observer and quotes from participants
• Flexibility/accuracy: High flexibility
• Bias: High because of observer; presence of observer may bias behavior
• Nature of questions: Open or closed
• Side benefits or disadvantages: Can view "real" interactions; bias of observer's presence; opportunity for additional problem-solving or consultation; augments primary data; generates lists of uncertain value

Documentation
• Design issues: Relatively easy for expert to design
• Data collection: Can be very time-consuming to locate and review; initial access may take time
• Data analysis: Variable (depends on kinds of data collected); analysis may already be available
• Response content: Limited or extensive
• Flexibility/accuracy: Completeness, comparability, and accuracy of records may be variable; depends upon protocol and report style and format
• Bias: Could be high from collection; bias of what is recorded
• Nature of questions: Open or closed
• Side benefits or disadvantages: Augments primary data; could inspire improved record keeping; could raise issues not previously considered; generates lots of information but of uncertain value

Journals or Critical Incident Reports
• Design issues: Relatively easy for expert to design
• Data collection: Highly dependent on willingness of participant to give the time
• Data analysis: Lengthy if lots of content
• Response content: Varies but should be detailed and in own words; highly personal
• Flexibility/accuracy: High flexibility
• Bias: Respondent chooses to include or not include
• Nature of questions: Open within general guidelines
• Side benefits or disadvantages: Augments primary data; reveal information not otherwise provided; may generate lots of information with little context for evaluation


TABLE 2.3
General Guidelines on Time Versus Value

Survey
• Setup time: One to four days
• Administration time: Variable with survey length (five minutes to one hour per survey)
• Analysis time: Variable depending on question design and automated analysis
• Other issues: Need database and/or statistical expertise
• Outputs: Lots of data; little measurable variation; numerical reports; generalizable in most cases

Interview
• Setup time: Half day
• Administration time: One and a half hours per interview
• Analysis time: Three hours per interview plus synthesis
• Other issues: Need qualitative data experience
• Outputs: Reams of paper/tapes; individual stories; personal words and anecdotes; cannot quantify; draw generalizations only after multiple interviews

Focus Group
• Setup time: Half day
• Administration time: One and a half hours per focus group
• Analysis time: Three hours per focus group plus synthesis
• Other issues: Need qualitative data experience
• Outputs: Reams of paper/tapes; individual stories and words; dynamic interactions within group; peer dialogue; cannot quantify; can highlight new questions; generalizable with sufficient replication

Observation
• Setup time: Half day
• Administration time: As long as it takes to observe
• Analysis time: Lengthy
• Other issues: Time to observe; access
• Outputs: Thin data; useful to back up other sources and provide additional insights

Documentation
• Setup time: Time to get access
• Administration time: Can be very lengthy or very brief
• Analysis time: Lengthy
• Other issues: Access
• Outputs: Richness depends on quality; complements narrative or numerical data; capitalize on existing information

Critical Incident or Journal
• Setup time: One to two hours
• Administration time: Lots of individual time (not evaluator time)
• Analysis time: Lengthy
• Other issues: Willingness of participants to give time and to respect the method or format
• Outputs: Rich stories, variable focus; not generalizable; promotes reflection; backs up other methods/insights


An approximation of the costs associated with each of these issues for each method is illustrated in Table 2.3. Again, in selecting instruments, one must determine what can be afforded that will provide the best information. Frequently, trade-offs of costs against potential data are necessary, but such trade-offs need not compromise the overall quality of the assessment.
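
To make the trade-off concrete, the rough per-unit times in Table 2.3 can be turned into a quick estimate of total evaluator effort for a proposed mix of instruments. The sketch below is illustrative only: the instrument counts and the translation of phrases such as "half day" into hours are assumptions for the sake of a rough estimate, not figures from the table or from any actual project.

# Illustrative planning arithmetic only; counts and hourly conversions are assumed.
plan = {
    # instrument: (setup_hours, admin_hours_per_unit, analysis_hours_per_unit, units)
    "survey":      (8.0, 0.1, 0.1, 120),   # short survey; largely automated analysis
    "interview":   (4.0, 1.5, 3.0, 10),    # ~1.5 h per interview, ~3 h analysis each
    "focus group": (4.0, 1.5, 3.0, 3),
}

total = 0.0
for name, (setup, admin, analysis, units) in plan.items():
    hours = setup + units * (admin + analysis)
    total += hours
    print(f"{name:12s} ~{hours:6.1f} evaluator hours")
print(f"{'total':12s} ~{total:6.1f} evaluator hours")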

Completing the Assessment Cycle

Once the necessary data have been collected, assessment leaders must be prepared to engage in extensive data analysis, synthesis, discussion, and report writing. As mentioned previously, there are often lengthy debates on campuses about the relative merits of quantitative versus qualitative methods. Our own experiences show that a mixed methodology is most useful. Methods should be selected based on the kind of data that will be gathered as well as issues such as ease of data collection, ease of data analysis, and the time and costs involved in both collection and analysis. However, consideration must also be given to the richness of the data that can be derived from various methods. Interviews, focus groups, observations, and reflective journals provide extensive and detailed information that necessitates a major time commitment to transcribe and analyze. In contrast, surveys provide less individual detail but are relatively easy, inexpensive, and time efficient to administer and analyze. Assessment leaders who do not have familiarity and expertise with specific assessment methods should ensure they engage an expert to advise them during instrument development as well as data analysis. Each of the following chapters provides information on the assessment methods proposed, as well as the advantages and limitations of each.

A final step in the assessment process is reporting the results. A fairly typical method involves writing an assessment report that describes (a) project goals, (b) what was done (programmatically), (c) what was measured, (d) the results, and (e) implications and/or recommendations. Reporting results should be guided explicitly by the matrix (using the concepts as major headings and the indicators as subheadings); this will facilitate a synthesis of the findings and their presentation in a report. Assessment results can also form the basis for scholarly presentations and publications. However, care should be given to ensure that no confidential information is disclosed and that participants have given permission for the assessment findings to be released in a public forum.

Consideration should also be given to alternative forms of reporting to ensure wider and more rapid dissemination. For example, summaries of key findings can be presented in poster format and displayed in a campus cafeteria, at the library, student union, or other central location. Selected results and participant stories can be integrated into a university website. Other forms of reporting through annual reports, community updates, or focused brochures can also be used.
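
Because the report structure mirrors the matrix, a matrix kept in electronic form can also generate the report skeleton, with concepts as major headings and indicators as subheadings, as suggested above. The short sketch below is again illustrative: the two rows shown are example entries from Table 3.1, and the function is an assumption for demonstration rather than a prescribed reporting tool.

# Illustrative only: turn matrix rows into a numbered report outline.
matrix = [
    {"concept": "Awareness of community",
     "indicators": ["Knowledge of community issues",
                    "Ability to identify community assets and needs"]},
    {"concept": "Commitment to service",
     "indicators": ["Attitude toward current service experience(s)",
                    "Plans for, and barriers to, future service"]},
]

def report_outline(rows):
    # Concepts become major headings; indicators become subheadings.
    lines = []
    for i, row in enumerate(rows, start=1):
        lines.append(f"{i}. {row['concept']}")
        for j, indicator in enumerate(row["indicators"], start=1):
            lines.append(f"   {i}.{j} {indicator}")
    return "\n".join(lines)

print(report_outline(matrix))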

Conclusion

Assessment provides a valuable mechanism for communicating the value of one's work. In particular, when seeking to document the effect of an approach such as service-learning or other activities demonstrating civic engagement, one must be able to provide evidence that the approach is making a difference—and be able to show the differential impact on various constituent groups. Good assessment requires collaboration and a commitment to invest time and energy. The very nature of assessment necessitates a long-term perspective, as the assessment effort is never complete. Nonetheless, continuous investment in the process provides the information needed to respond adequately to the assets and needs of those involved in a critical aspect of higher education reform and to seek continued improvement of the programs and services higher education provides.

The following chapters identify the various constituencies relevant to the assessment of service-learning and related community engagement activities. Each chapter includes an overview, instruments, and guides to the use of appropriate instruments.

Note
1. This section is adapted from Sherril B. Gelmon and Amy Connell, Program Evaluation Principles and Practices: A Handbook for Northwest Health Foundation (Portland, OR: Northwest Health Foundation, 2000) with permission from the Northwest Health Foundation.


Chapter Three

STUDENT IMPACT

Why Assess the Impact of Service-Learning on Students?

The impetus for assessing the impact on students emerges from several constituencies. First, institutions of higher education are advocating for all courses to be rigorously assessed as institutions are continuously being held accountable for their graduates' level of preparedness upon entering the workforce. New programs and pedagogies such as service-learning must endure institutional examination to prove their value to the institution and their contribution toward student learning.

Second, in addition to institutional demand for assessment, faculty members advocate for assessment in order to understand the impact of courses and pedagogies on student learning. Faculty who have a data-driven understanding of the impact of service-learning often utilize the data to continuously improve their teaching and the use of this pedagogy. Assessment also aids faculty who must demonstrate the rigor of this method of teaching among their colleagues.

Assessment provides a means for educators to respond to students' questions of why they need to engage in this form of learning. Similar to the other constituents of service-learning, students seek to understand the effectiveness of service-learning as a component of their learning experience. As with the other constituents, the answer to these questions emerges from assessment data. Community members seek assessment data about students' learning and experiences in order to help them better understand how they contribute to the student learning activities. They may play an important role in evaluating the work of the students and in understanding student development in terms of knowledge of community factors and social responsibility.

Finally, service-learning practitioners advocate for assessment of impact in order to document the advantages and challenges of this work. Practitioners also utilize assessment data to document the innovative techniques faculty and students of various disciplinary backgrounds employ to achieve their student learning outcomes. Dissemination of these assessment results helps practitioners to improve the quality of student learning and contributes to the growth of the field of service-learning.

The need for and importance of increased assessment of service-learning was evidenced in 1991 when 40 educational leaders were convened by the (then-named) National Society for Internships and Experiential Education at a Wingspread conference. Those involved in these initial discussions noted the growing number of students engaged in service each year (Giles, Honnet, & Migliori, 1991). They identified a lack of documentation of the students' experiences in service-learning courses. They found a paucity of data on the impact that this educational pedagogy had on students and suggested that this was a national problem. These experts in the field felt that it was imperative to study specific areas of potential impact on students, including the following:

• What knowledge do students gain from participating in service-learning?
• Does service-learning affect students' perception of self and others?
• Does service-learning promote pro-social attitudes and behaviors?
• Does service-learning influence the development of effective citizens?
• How do factors of age, socioeconomic status, developmental stage, and background of students affect the outcomes of service-learning?


Conference participants encouraged colleagues across the nation to join with practitioners and students to conduct research in this field and to disseminate the findings. Since the Wingspread conference, the urgency and importance of student assessment have increased as institutions of higher education place an increasing number of students in the community, sometimes without fully understanding the impact that this pedagogy has on student learning. To ensure quality of service and the best possible outcomes for learning, student service-learning experiences must be assessed systematically.

Understanding the Impact of Service-Learning on Students

Since the 1991 Wingspread conference, researchers have worked to document the impact service-learning has on students (Alt & Medrich, 1994; Anderson, 1998; Astin & Sax, 1998; Eyler & Giles, 1999; Eyler, Giles, & Gray, 1999; Gray, Ondaatje, et al., 1999). In Assessment for Excellence (1993), Astin presented a useful taxonomy that reflects the different types of college student outcomes assessed and the different forms of data collected. Student outcomes are described as either cognitive or affective. Examples of cognitive outcomes include theoretical knowledge and critical thinking, problem-solving, and decision-making skills. Affective outcomes include changes in attitudes toward community issues, populations served, community service, and personal values. The type of data collected is described as either psychological or behavioral. Psychological data refers to the internal state of the student, and behavioral data refers to the "student's observable activities" (Astin, 1993, p. 44). Finally, the time dimension of the study can focus on short-term (during college) or long-term (beyond college) time frames.

Most of the service-learning research to date has collected psychological data about student outcomes. Many of the studies measure a change either in students' attitudes toward others (Battistoni, 1997; Giles & Eyler, 1994; Myers-Lipton, 1996) or toward service itself (Astin & Sax, 1998; Astin, Vogelgesang, Ikeda, & Lee, 2000; Buchanan, 1997; Gilbert, Holdt, & Christophersen, 1998). It is rare to find studies collecting data based on observable student actions.


Battistoni's (1997) study included observations of student behaviors in the classroom, where principles of democracy were modeled. In addition, some studies ask students to report on their own behaviors. Astin and Sax (1998), Sax and Astin (1996), and Astin and colleagues (2000) include students' responses regarding their behavior while volunteering. While many of the studies are considered affective in nature (asking students to report on their attitudes), there is a growing body of literature focused on cognitive outcomes, including critical thinking and decision-making skills (Batchelder & Root, 1999; Battistoni, 1997; Berson & Youkin, 1998; Gilbert et al., 1998; Wechsler & Fogel, 1995).

Another methodological issue raised by Astin (1993) was the time dimension involved in each study. Most of the studies could be categorized as assessing short-term outcomes, or outcomes measured while the student is still in college. The study reported by Sax and Astin (1996) is the only one to collect data over time from students after graduation. Even within the short-term studies, researchers assessed students over different lengths of time. Jordan (1994) collected data from students engaged in a six-week service-learning experience, and the findings suggested that longer experiences may prove more productive. Substantial findings from Myers-Lipton (1996) and Astin and colleagues (2000) provide a rationale for studying the longer-term effects of service-learning on students.

These researchers have responded to the National Society for Experiential Education's call for research on service-learning (Giles et al., 1991). The Wingspread conference framed primary questions revolving around the issue of the impact this form of teaching and learning has on students. Since that time, researchers, practitioners, and doctoral students have attempted to bridge the gaps in the literature. The authors of this handbook have defined potential variables to be studied and suggested a mix of methods (Driscoll, Gelmon, et al., 1998). Others have used purely qualitative methods to study the students in their own service-learning courses (Battistoni, 1997; Gilbert et al., 1998). Universities such as the University of Utah (Buchanan, 1997) have gathered institution-wide data on the impact of service, while additional researchers have used larger national samples to gather quantitative data (Astin & Sax, 1998; Sax & Astin, 1996) on the effects of service.

In the special issue of the Michigan Journal of Community Service Learning on strategic directions for service-learning research, Eyler (2000) notes that most of the research of the prior decade gave us adequate evidence of the impact service-learning has on students' personal and social development, yet there is little evidence of the cognitive impact this pedagogy has on student learning. The PSU approach has provided the field with a set of research concepts and measurement strategies to address this gap in the research (Driscoll, Holland, et al., 1996). The comprehensive set of techniques set forth in this handbook can be useful in assisting researchers with understanding and documenting students' cognitive development, as well as their personal and social development.

Our assessment project began as a means to learn about and document the impact service-learning was having on students. To do this, we chose to employ a variety of different data-gathering methods with the intent of determining the nature and variety of information each of the tools would uncover about the impact on students. As a result of testing various measurement methods, we now understand which strategies are most likely to aid us in documenting intellectual outcomes, problem-solving skills, and level of commitment among those students involved in service-learning.

Assessment Matrix for Students

In the initial phase of our work at PSU, we chose not to limit our investigation of possible areas of impact by having only one broad hypothesis. This was an exploratory study and was comprehensive in its approach. Therefore, one of our original hypotheses was that service-learning would have an impact on students. We looked to the literature and early institutional observations to guide us in defining key concepts (variables) that would help us to understand the impact on students. The concepts are listed in Table 3.1. This list of concepts serves as the framework for defining focus and specific measurement for the assessment of the impact on students involved in service-learning. Once this list of concepts was established, we identified measurable indicators for each concept that would demonstrate the presence (or absence) of the concept. Multiple indicators are needed for each concept in order to tease out the various ways in which students may experience a particular concept.

One of the challenges in developing a process to assess the impact of service-learning on students is the lack of proven effective methods available to evaluate this form of learning. Eyler and Giles (1994) attribute the lack of assessment instruments to the fact that the purpose of service-learning is not always delineated, which results in ambiguous student variables, indicators, and outcomes. Universities define their service-learning programs with very different goals. Some are curriculum based while others are strictly cocurricular. Some are concerned with social justice and citizenship development, while others are focused on using service-learning as a pedagogy directed to disciplinary course content. This diversity of programmatic goals makes the uniform assessment of service-learning outcomes very difficult if not impossible. As a result, educators have produced very few instruments that measure the impact that service has on students.

The authors of this handbook were faced with the challenge of assessing the impact service-learning was having on faculty, students, community partners, and the institutional culture. More specifically, we wanted to measure the impact of curriculum-based service-learning that is focused on enhancing classroom learning. Therefore, we developed a set of variables that described the impact on students and relevant indicators to measure this impact on students involved in service-learning projects connected to course content.

Awareness of community, involvement in community, commitment to service, and sensitivity to diversity are all concepts that measure impact in terms of students' psychological change. These psychological changes are indicated by students' increased knowledge of community issues and strengths, increased understanding of the role they play in addressing community concerns, and an increased sensitivity to working with communities that they have not previously been a part of. Although extremely important impact variables, these are all what Astin (1993) would call affective student outcomes.

• Awareness of community seeks to determine if students had or developed a heightened awareness and understanding of community issues, needs, strengths, problems, and resources.
• Involvement with community describes the quality and quantity of interactions with the community, students' positive or negative attitudes about working with the community partner, the importance they place on getting feedback from their community partner, and/or their recognition of the benefits that they and the community partner gain through their relationship.
• Commitment to service is measured by looking at students' attitudes toward their current service and their plans for, or barriers to, future service.
• Sensitivity to diversity is measured by students' expressed attitudes about working with communities with which they were not familiar, an increased comfort and confidence working within these communities, and a recognition that they gained knowledge of a new community.


TABLE 3.1
Matrix for Student Assessment

What do we want to know? (concepts) | How will we know it? (indicators) | How will we measure it? (methods) | Who/what will provide the data? (sources)

Awareness of community
• Indicators: Knowledge of community issues; ability to identify community assets and needs; understanding of community strengths, problems, resources
• Methods: Interviews; focus groups; classroom observations
• Sources: Students; faculty; community partners

Involvement with community
• Indicators: Quantity/quality of interactions; attitude toward involvement; interdependence among community partners and students; feedback from community
• Methods: Interviews; focus groups; classroom observations
• Sources: Students; faculty; community partners

Commitment to service
• Indicators: Attitude toward current service experience(s); plans for, and barriers to, future service; reaction to demands/challenges of the service
• Methods: Interviews; focus groups; surveys
• Sources: Students; faculty; community partners

Career development
• Indicators: Career decisions/opportunities; development of professional skills related to career; opportunities for career preparation related to service experience
• Methods: Surveys; interviews; focus groups
• Sources: Students; faculty; community partners

Self-awareness
• Indicators: Awareness of personal strengths, limits, goals, and fears; changes in preconceived understandings/ability to articulate beliefs
• Methods: Interviews; surveys; classroom observations
• Sources: Students; faculty; community partners

Understanding of course content
• Indicators: Role of community experience in understanding and applying content; perceived relevance of community experience to course content
• Methods: Interviews; surveys; classroom observations
• Sources: Students; faculty; community partners

Sensitivity to diversity
• Indicators: Attitudes about and understanding of diversity; knowledge of new communities; self-confidence and comfort in community settings
• Methods: Interviews; surveys; community observations
• Sources: Students; faculty; community partners

Sense of ownership
• Indicators: Autonomy and independence from faculty; sense of role as learner and provider in partnership; responsibility for community project
• Methods: Focus groups; classroom observations; interviews
• Sources: Students; faculty; community partners

Communication
• Indicators: Perceived skill development; recognition of importance of communication; demonstrated abilities (verbal and written)
• Methods: Interviews; classroom observations; community observations
• Sources: Students; faculty; community partners

Valuing of pedagogy of multiple teachers
• Indicators: Role of student peers in learning; perception and role of community partners in learning; role of faculty in learning
• Methods: Focus groups; classroom observations; community observations
• Sources: Students; faculty; community partners


Concepts such as career development, understanding of course content, and communication serve as measures of impact on students' cognitive development. As Eyler (2000) points out, it is extremely important to measure the impact on students' cognitive development because this is at the heart of most college and university missions. These concepts are indicated by students' ability to utilize the service-learning experience to influence their career decisions or give them the opportunity to develop skills that relate to those they will need in their intended career. Because the PSU service-learning program is one that is primarily curriculum based, a particularly important concept to the PSU research team was understanding of course content. This concept is indicated by students' ability to apply what they are learning in the class to their community setting and their ability to understand how the service experience is relevant to the topic and learning goals of the class. One of the indicators for the communication variable is students' increased skill development with the multiple communication demands of working within community settings. When students were able to demonstrate impact on any of these three concepts, clear cognitive change and growth could be documented.

• Career development is measured in terms of the development of professional skills and increased student awareness of the skills needed by a person working in the field in which they were doing their service project. This variable is also measured by students' increased knowledge about their career of interest (both positive and negative), as well as their understanding of the professional directions they might pursue.
• Understanding of course content is measured by students' ability to make clear connections between the course goals and the community-based project.
• Communication is measured by students' recognition that they may have gained new communication skills, as well as of the importance communication plays in the complex relationships present in these community-based learning experiences.

Self-awareness, sense of ownership, and valuing of multiple teachers are concepts that measure students' understanding of themselves as part of a learning community and the skills and perspectives they themselves and their colleagues contribute to the community project and the class. The self-awareness concept identifies when students recognize their own contributions, strengths, and limitations regarding the community project in which they are engaged, as well as when they acknowledge that they have rethought and possibly modified previously held beliefs. Sense of ownership measures students' recognition of themselves as contributors to a community of learners. Students may come to recognize that the community partner is a valuable source of knowledge and that the partner looks to the students to contribute a high-quality, valuable product at the conclusion of the course. Valuing of multiple teachers addresses the fact that service-learning courses offer a different teaching modality than traditional classrooms, and students may recognize that student colleagues, community partners, and their faculty play different and important roles in their learning in these experiences.

• Self-awareness is measured by students' recognition and awareness of their own personal strengths and weaknesses as they relate to the completion of the course and their engagement in the community. This variable is also measured by the indication that a student changes his or her previously held beliefs due to his or her engagement in the community.
• Sense of ownership is measured by students' expressed autonomy and independence from the faculty member. The student's ability to see his or her community partner as a source of knowledge and that student's increased investment in the class by taking responsibility to provide the community partner with high-quality outcomes are all indicators of this variable.
• Valuing of multiple teachers is measured by students' descriptions of the changing roles among faculty, students, and community partners, as well as students' recognition that student peers and community partners may at times shift into teaching roles, while the faculty may occupy the role of learner.

Strategies for Assessing Impact on Students

We articulated and designed measurement strategies most likely to gather useful and relevant data. Both qualitative and quantitative methods were necessary to learn about the different types of learning that were taking place. Creswell (1994) has advocated using an approach that incorporates multiple methods. Our approach to assessing service-learning has shown that combining methods is a useful strategy for enhancing the validity of findings. We were able to triangulate the data by confirming findings from the student interview transcripts with observations in the classroom and the student survey. In addition, we were able to complement the survey data regarding students' perceptions about the impact of service-learning on future participation in the community with interviews from students. The interviews allowed students to describe various aspects of the data gathered in the survey in more depth. This was especially helpful when the information was personal, such as students becoming more aware of their own biases and prejudices. The interview quotes from the students made the statistical data richer and more descriptive of how these experiences affect students. In this process, researchers used the qualitative and quantitative approaches to expand the study to include new concepts that students may have discovered through their service-learning experiences.

In addition to these stated purposes, the use of multiple data-gathering methods allowed the PSU research team to understand which of the methods would most likely provide useful and relevant data on the various concepts. For the less quantifiable psychological concepts, interviews and focus groups were methods by which students could explore perceptions of personal growth as it related to the service-learning course. Classroom observations, community observations, and surveys have the potential to capture the impact the service-learning experience had on students' cognitive skill development. As is the case with all of the data collected, in-person observations and individual and group interviews provide the researchers with specific indicators of impact on students' cognitive growth that a survey may not capture. Given the chance to discuss and reflect on the skill development resulting from their involvement in the community, students provide a number of varied and specific examples.

Surveys

Advantages of Surveys

One of the greatest advantages of using surveys as an assessment tool is the ease of analysis. Most institutions of higher education have an office of institutional research or qualified faculty and staff trained to enter and analyze statistical data. This allows for the analysis of a wide range of courses and also detailed analysis of data for specific research questions. Surveys developed from this research are now used to collect data from more than 3,000 students per year at PSU. We are able to make substantive observations about the quality of service-learning courses, students' attitudes toward this pedagogy, students' perceptions of their learning, and their growth in each of the student concepts. We are also able to compare the impact that these courses have on students at the various levels of their education. For example, does participation in service-learning courses during the first year influence students in a different way than these types of courses in their senior year? Surveys also give us the ability to compare various forms of service-learning. Are students who engage in courses that involve direct contact with the community partner affected differently from those who do not come in direct contact with the community partner (perhaps by doing some form of indirect research project)? Finally, the surveys allow institutions to compare various populations, addressing such questions as: Are nontraditional students affected in similar or different ways from traditional college-age students? Do men and women respond differently to these courses? Surveys provide institutions the opportunity to analyze data from a large population and understand the impact of service-learning on the different types of students engaged in service.


Limitations of Surveys

There are a few limitations to this method of data collection. First, using the survey method alone limits the amount of student voice in the assessment. Students are not given the opportunity to share their successes or their own learning. Second, surveys frequently do not capture the personal struggles and challenges students faced in their community setting. Students often respond affirmatively to many of the questions but, when interviewed, more fully describe an experience that often includes both satisfactory and unsatisfactory encounters. Third, surveys do not allow students to describe the multiple factors that contribute to their evaluation of their learning experience. Fourth, surveys given too close to the end of a short-term service-learning experience may not fully capture the various impacts or changes in students' self-awareness. Therefore, we suggest using a blend of interviews and focus groups to add in-depth understanding of the student experience.

Interviews

Advantages of Interviews

One advantage of the interview method is that it provides data regarding the context in which the student experience is taking place. For example, students discuss how community participation takes place in the context of their busy lives, sometimes involving juggling multiple jobs and family responsibilities. In addition, interviews promote reflection—the hallmark of service-learning—and provide in-depth accounts of transformational events in the students' learning. Frequently these events take place as students encounter issues of race, class, gender, and difference in their communities. Interviews allow researchers to gain insights into the interactions that take place in these courses and into the process of engaging with the community, through the voices of students. Finally, the interview method provides solid and specific recommendations for improving the quality of service-learning courses. The student interview protocol included in this monograph specifically introduces the notion of students' perceptions regarding their level of preparedness for the service experience they encountered. This information assisted administrators and faculty in planning ways to better prepare students for this type of work in the community. For example, as a result of data gathered through this method, administrators were able to identify the need for increased diversity training for students as they enter increasingly ethnically and economically diverse communities.

Limitations of Interviews

The primary limitation associated with the interview method is the labor-intensive task of analysis. Researchers can become bogged down in hundreds of pages of interview transcriptions and be faced with insufficient time, resources, and qualified staff to analyze the data. Institutions committed to learning about the student experience in service-learning courses need to allocate the proper level of resources to transcribe, analyze, and write up the findings from the data. Designing interview protocols to focus on specific research variables and indicators provides a natural plan for analysis and helps overcome this potential limitation.

Focus Groups

Advantages of Focus Groups

One major advantage of using focus groups is that they provide an opportunity for groups of students to reflect upon lessons learned in their experience and share this learning with others. Students gain new insights from hearing their peers share significant insights and struggles, and they build connections between various ideas expressed by other students. Focus groups also allow researchers to capture student voices. As with student interview data, focus groups capture students' individualized experiences in the community. The researcher is able to ask probing questions that capture how the service-learning experience affected the student's learning. In addition, focus groups serve as a powerful way to reaffirm the learning that has taken place and to document the progress that has been accomplished in the course. They also serve as a means by which students can directly offer suggestions for improving the quality of the service-learning courses. Here, students are given the opportunity to demonstrate critical thinking skills as they offer constructive criticism and conceptualize programmatic improvements.

Limitations of Focus Groups

There are a few disadvantages to the use of focus groups for data collection. First, focus groups require that students convene in one location, at a specified time, for a defined period. These logistical requirements can be difficult to organize, since it is likely students will only be able to convene during the scheduled meeting time. Because of limited class time, giving up a class session to facilitate a focus group is not always possible. Second, audio recording best captures focus group data. Having effective audio equipment that will uniformly capture the many voices of those participating in the focus group is an important, but often difficult, technical challenge. Third, like student interview data, analysis of focus group data can be a labor-intensive task that can easily burden researchers with many pages of transcriptions. Fourth, in classes where students do not have adequate time to reflect on their experience in the community and do not have ample opportunity to discuss their community experience as it relates to the course content, focus groups tend to become an outlet for discussing the logistical challenges of working in community-based settings. In a group setting like a focus group, once students begin discussing the logistical barriers to working in community settings, turning the discussion to deeper analysis of learning can be very difficult.

Concluding Thoughts

We have learned that collecting data through all of these methods allowed us to truly understand the dynamics of these complex courses. It gave us the means to triangulate data and to develop a solid sense of the impact on students (as well as on faculty, community partners, and the institution, as will be discussed in the subsequent sections). We are now able to utilize end-of-course surveys as a single proxy measure to understand impact when resources do not permit using other more labor-intensive methods (such as interviews and focus groups), and we have developed the language to describe and effectively analyze the data from surveys.


Strategies and Methods: Students
• Student Survey
• Student Interviews
• Student Focus Groups

Student Survey

Purpose

The student survey is intended to describe students' perspectives and attitudes on issues related to their experience in a service-learning course. The survey is based on a five-point Likert scale on which students report their level of agreement with statements about their service-learning experience. The scale range includes strongly disagree, disagree, neutral, agree, and strongly agree. Topics assessed through the survey include students' views or attitudes about service; the impact of the service on their choice of major/career and their understanding of the course material; their views on diversity issues; their perception of self-awareness; and the role other student colleagues and their community partner have in their learning. In addition, the survey provides demographic data, which profile students' racial background, age, gender, class level, and employment.

The survey instrument is useful in describing students' perspectives and demographics within one course and across courses in a university. With this standardized survey format, institutions are able to administer the survey in all service-learning courses. The data collected can be used by an individual faculty member to understand students' perceptions about the impact that service-learning has on them. The data can also be collated to provide information that describes and measures the impact of service-learning institution-wide. Institutionally, surveys are one of the most efficient ways to collect information from a large number of students. Therefore, they are the one means that allows institutions to make claims about broad representation from the student body. This is useful for universities with diverse populations, especially those with a nontraditional student body. By utilizing this survey, faculty members and administrators will gain a profile of the student body they are serving.

Note that two survey instruments are included. One is a longer survey that was used initially in our work. Following the pilot testing, the survey was streamlined for administration and analysis through our institutional research office, and this scan-ready version is the second survey presented.

Preparation

In preparation for using this instrument, the following steps are recommended.
1. Faculty should determine if they are going to use this survey to assess change in students' perceptions and attitudes before and after the course, or if they are going to use the survey only to assess the general attitudes of the students after they have taken the course. The survey is meant to complement other data-gathering strategies to develop a clearer picture of the students' perspectives and attitudes toward this form of learning.
2. Faculty should determine the most appropriate time to administer the survey. Pretests should be given in the first week of the academic term and post-tests in the last week.
3. Informed consent procedures should be initiated and completed prior to survey administration.

Administration

1. Faculty should stress the importance of the instrument so the students take the time to respond to the survey with honesty and integrity.
2. Faculty and the researchers should also assure students that their responses will not negatively or positively affect the faculty or community partner or the students' grades.
3. Student confidentiality should be assured and maintained throughout the collection of data.
4. The survey should be handed out to students during a scheduled class time.
5. Students should be given 15 to 20 minutes to complete the form.
6. Forms should be collected before students leave the classroom.
7. Results should immediately be shared with the participating faculty member.


Analysis

Data analysis can be conducted using the Statistical Package for the Social Sciences (SPSS) software. In the case of assessing and comparing pre- and post-service-learning experiences, the analysis could include frequencies and other descriptive statistics, chi-square tests, ANOVA, and factor analysis. Descriptive statistics provide the mean, mode, and standard deviation for each item. Chi-square tests examine associations between demographic characteristics and responses across student groups. Factor analysis reduces items into categories that are closely related. Analyses of variance (ANOVAs) are useful to explore variation within and between groups on either single items or groups of items that may arise from the factor analysis. The descriptive data, which provide a rich profile of the sample both in terms of demographics and responses to individual items, are particularly useful.

The survey may be used in a pre/post format to measure change in the individual student. However, there are a few difficulties with the use of pre/post surveys. First, it is difficult to measure significant change within a 10-week (quarter system) or 15-week (semester system) course. Few students will demonstrate dramatic changes in the concepts being assessed in one quarter or semester. Changes in attitudes about diversity, students' role as learners, and successful community development skills frequently take a full academic year to show significant movement. In addition to the short time frame, surveys have limited success in capturing individualized and personal student learning. Classroom observations, focus groups, and/or interviews with individual students may reveal significant change in student perceptions about their understanding of course content, awareness of their own personal development and strengths, and their choice of major/career; the pre/post survey does not reflect this change. As noted earlier, the survey data are most useful in collecting descriptive data from students.
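The same kinds of summaries can also be produced outside of SPSS. The following is a minimal sketch in Python (using the pandas and scipy libraries); it assumes the survey responses have been exported with one row per student, and the file name and column names (q9 through q13 for the Likert items, class_level, gender) are hypothetical placeholders rather than part of the instrument itself.

    # Illustrative analysis of exported survey data (hypothetical file and
    # column names). Likert items are assumed to be coded 1 (strongly
    # disagree) through 5 (strongly agree).
    import pandas as pd
    from scipy import stats

    responses = pd.read_csv("survey_responses.csv")
    items = ["q9", "q10", "q11", "q12", "q13"]

    # Frequencies and descriptive statistics for each Likert item.
    for item in items:
        counts = responses[item].value_counts().sort_index()
        print(item, dict(counts))
        print(item, "mean:", round(responses[item].mean(), 2),
              "sd:", round(responses[item].std(), 2))

    # One-way ANOVA: does mean agreement on an item vary by class level?
    valid = responses.dropna(subset=["q9", "class_level"])
    groups = [group["q9"] for _, group in valid.groupby("class_level")]
    f_stat, p_value = stats.f_oneway(*groups)
    print("ANOVA for q9 by class level: F =", round(f_stat, 2),
          "p =", round(p_value, 3))

    # Chi-square test of association between gender and agreement on an
    # item, with responses collapsed into agree (4-5) versus not agree (1-3).
    valid = responses.dropna(subset=["q9", "gender"])
    table = pd.crosstab(valid["gender"], valid["q9"] >= 4)
    chi2, p, dof, expected = stats.chi2_contingency(table)
    print("Chi-square for gender by agreement on q9: p =", round(p, 3))

As with the SPSS analyses described above, these are descriptive and exploratory summaries; their interpretation still depends on the concepts and indicators laid out in the assessment matrix.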


Community-Based Learning—Student Survey (longer version)

We would like to better understand the impact that community-based learning has on students. We particularly want to know how this experience has influenced your perspective on learning, your view of service, your choice of major/career, and your perspective of working in a diverse community.

I. First, we would like to know some information about you.

1. What is your racial background?
☐ Caucasian/White  ☐ African American  ☐ Asian/Asian American  ☐ Native American  ☐ Hispanic  ☐ Other

2. What is your age group?
☐ Under 25  ☐ 25–34  ☐ 35–44  ☐ 45–54  ☐ Over 55

3. What is your gender?
☐ Male  ☐ Female  ☐ Nonbinary/third gender  ☐ Prefer to self-describe  ☐ Prefer not to answer

4. What is your class level?
☐ Freshman  ☐ Sophomore  ☐ Junior  ☐ Senior  ☐ Graduate Student  ☐ Other

5. I have a job that requires me to work . . .
☐ 1–10 hrs/wk  ☐ 11–20 hrs/wk  ☐ 21–30 hrs/wk  ☐ 31–40 hrs/wk  ☐ 41+ hrs/wk  ☐ I do not have a job

6. Name of the community-based learning course you enrolled in: _____________________

7. The course number: _____________________

8. Name of community partner/agency you worked with: _____________________

II. Next, we would like to gain your perspective about this community-based learning course. Please indicate your level of agreement with each statement (strongly disagree, disagree, neutral, agree, strongly agree).

9. The community participation aspect of this course helped me to see how the subject matter I learned can be used in everyday life.
10. The community work I did through this course helped me to better understand the lectures and readings in this course.
11. I feel I would have learned more from this course if more time was spent in the classroom instead of doing community work.
12. The idea of combining work in the community with university coursework should be practiced in more classes at this university.
13. I was responsible for the quantity and the quality of knowledge that I obtained from this course.

III. The next set of questions relates to your attitude toward community involvement. Please indicate your level of agreement with each statement (strongly disagree, disagree, neutral, agree, strongly agree).

14. I was already volunteering in my community before taking this course.
15. The community participation aspect of this course showed me how I can become more involved in my community.
16. I feel that the community work I did through this course benefited the community.
17. I probably won't volunteer or participate in the community after this course.
18. The community work involved in this course helped me to become more aware of the needs in my community.
19. I have a responsibility to serve my community.

IV. Next, we would like to know the influence of your service on your choice of major and profession. Please indicate your level of agreement with each statement (strongly disagree, disagree, neutral, agree, strongly agree).

20. Doing work in the community helped me to define my personal strengths and weaknesses.
21. Performing work in the community helped me clarify which major I will pursue.
22. The community work in this course assisted me in defining which profession I want to enter.
23. The work I accomplished in this course has made me more marketable in my chosen profession when I graduate.

V. Finally, we would like some of your personal reflections on this experience. Please indicate your level of agreement with each statement (strongly disagree, disagree, neutral, agree, strongly agree).

24. Most people can make a difference in their community.
25. I developed a good relationship with the instructor of this course because of the community work we performed.
26. I was comfortable working with cultures other than my own.
27. The community work involved in this course made me aware of some of my own biases and prejudices.
28. The work I performed in this course helped me learn how to plan and complete a project.
29. Participating in the community helped me enhance my leadership skills.
30. The work I performed in the community enhanced my ability to communicate my ideas in a real world context.
31. I can make a difference in my community.

Finally, please add any other comments you have about courses where learning takes place in a community setting. (Please use the back of this piece of paper or attach an additional sheet of paper.) Thank you for your insights regarding community-based learning!


Community-Based Learning—Student Survey (streamlined version)

We would like to better understand the impact that community-based learning has on students. We particularly want to know how this experience has influenced your perspective on learning, your view of service, your choice of career, and your perspectives on working with diverse communities. Please take 5 to 10 minutes to complete this survey and return it before you leave class today.

I. First, we would like some information about you.

1. What is your ethnic background?
☐ Caucasian/White  ☐ African American  ☐ Asian/Asian American  ☐ Hispanic  ☐ Native American  ☐ Other

2. What is your age group?
☐ Under 25  ☐ 25–34  ☐ 35–44  ☐ 45–54  ☐ Over 55

3. What is your gender?
☐ Male  ☐ Female  ☐ Nonbinary/third gender  ☐ Prefer to self-describe  ☐ Prefer not to answer

4. What is your class level?
☐ Freshman  ☐ Sophomore  ☐ Junior  ☐ Senior  ☐ Graduate Student  ☐ Other

5. I have a job that requires me to work . . .
☐ 1–10 hrs/wk  ☐ 11–20 hrs/wk  ☐ 21–30 hrs/wk  ☐ 31–40 hrs/wk  ☐ 41+ hrs/wk  ☐ I do not have a job

6. Name of the agency/community organization with which you worked during this class: ___________

II. Next, we would like to gain your perspective about this course. Please mark your level of agreement with each statement (strongly disagree, disagree, neutral, agree, strongly agree).

7. The community participation aspect of this course helped me to see how the subject matter I learned can be used in everyday life.
8. The community work I did helped me to better understand the lectures and readings in this course.
9. The idea of combining work in the community with university coursework should be practiced in more courses at this university.

III. The next set of questions relates to your attitude toward community involvement. Please indicate your level of agreement with each of the following statements (strongly disagree, disagree, neutral, agree, strongly agree).

10. I was already volunteering in the community before taking this course.
11. I feel that the community work I did through this course benefited the community.
12. I was able to work directly with a community partner through this course.
13. I felt a personal responsibility to meet the needs of the community partner of this course.
14. I probably won't volunteer or participate in the community after this course.
15. My interactions with the community partner enhanced my learning in this course.

IV. Next we would like to know about the influence of your service on your choice of major and profession. Please indicate your level of agreement with each of these statements (strongly disagree, disagree, neutral, agree, strongly agree).

16. Doing work in the community helped me to become aware of my personal strengths and weaknesses.
17. The community work in this course assisted me in clarifying my career plans.
18. The community work I performed in this class enhanced my relationship with the faculty member.
19. The community work involved in this course made me more aware of my own biases and prejudices.
20. The work I performed in the community enhanced my ability to communicate in a “real world” setting.
21. The community aspect of this course helped me to develop my problem-solving skills.

V. Finally, we would like some of your personal reflections on this experience. Please indicate your level of agreement with each statement (strongly disagree, disagree, neutral, agree, strongly agree).

22. The syllabus provided for this course outlined the objectives of the community work in relation to the course objectives.
23. The other students in this class played an important role in my learning.
24. I had the opportunity in this course to periodically discuss my community work and its relationship to the course content.

Thank you for your comments. Please return the completed form to [personalize information].


Student Interviews

Purpose

Student interviews are intended to foster a one-on-one conversation with students to explore their experiences of working with the community in connection with an academic course. Interviews capture students' voices as they describe their experience of the service-learning course. This assessment method provides a deeper understanding of the nature of students' daily experiences in service-learning courses. Students state their own perception of the situations that they encountered and the meaning of these events in their lives. The interview approach can be used to assess a wide range of effects of service-learning. Specifically, the interview protocol provided in this handbook is designed to gather data from students about the nature of their service-learning involvement, the students' roles in these courses, their understanding about linkages between course content and the community, and the challenges of engaging in service-learning courses. Furthermore, it probes students' fears and concerns related to participation in the community, and assesses the self-awareness that emerges from the experience.

Preparation

Faculty should be asked to provide a class roster so the researcher may select a set of students to interview. If the students are doing different projects, faculty should be asked to help to identify students who are engaged in a range of varying projects. Informed consent procedures should be initiated and completed prior to beginning to arrange interviews. Schedule interviews with students who represent the diversity of community experience and background that make up the class. Begin contacting students and schedule a one-hour interview with each in a location and at a time that is convenient to the student. In advance, describe the purpose of the interview so that the student may be prepared.

Administration

The administration of interviews should be consistent across all interview subjects. The following guidelines will be useful for conducting interviews:
• Begin on time, introduce yourself and your role in the project, and explain the purposes of the interview.
• Assure confidentiality and stress the importance of candor.
• Assure the student that his or her faculty member or community partner will not be affected (negatively or positively) by student remarks.
• Take notes or ask permission to record (assure the student that names will not appear on transcriptions).
• Follow the interview protocol carefully and keep probes neutral. If in the course of the interview you sense there are difficulties and challenges, encourage the student to discuss them by moving question 9 to the beginning of the interview.

Analysis

Data analysis begins with transcription of interview recordings immediately after the interview. These transcriptions should be double-spaced for reading ease. An initial reading of the transcripts by several people will allow the analysis team to come up with a list of key words and themes that appear in the text, each of which may be coded with its own color or symbol. The key words and themes are compared and matched to the research variables. The research team should then read each of the transcribed interviews again to code each interview according to the agreed-upon list of key words and themes. It is helpful to have multiple people independently reading, analyzing, and coding the interview data so that these separate sources can be compared for consistency and accuracy. Once the data are coded, the individual interviews can be combined into research variable categories. This provides the researcher with a group of student quotes for each of the research variables. These data can be used to assess the level to which students have been affected according to each of the research variables. Analysis may be supported by a qualitative data analysis package such as Ethnograph or ATLAS.ti, or completed by hand-coding the interview data and categorizing transcribed text according to the set of research variables.
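Where a qualitative software package is not available, the mechanical part of this coding step can also be scripted. The sketch below, in Python, assumes transcripts have been saved as plain-text files with one short segment (for example, one speaker turn) per line; the research variables and the key words attached to them are purely illustrative and would come from the analysis team's initial reading, not from this example.

    # Illustrative keyword coding of interview transcript segments.
    # The variable names, keyword lists, and file name are examples only.
    from collections import defaultdict

    codebook = {
        "understanding of course content": ["course", "reading", "lecture", "theory"],
        "sensitivity to diversity": ["culture", "diverse", "background"],
        "communication": ["communicate", "listen", "explain"],
        "self-awareness": ["i realized", "my own bias", "about myself"],
    }

    def code_transcript(path):
        """Return a mapping of research variable -> list of candidate quotes."""
        coded = defaultdict(list)
        with open(path, encoding="utf-8") as transcript:
            for line in transcript:
                segment = line.strip()
                if not segment:
                    continue
                lowered = segment.lower()
                for variable, keywords in codebook.items():
                    if any(keyword in lowered for keyword in keywords):
                        coded[variable].append(segment)
        return coded

    coded_segments = code_transcript("interview_transcript_01.txt")
    for variable, quotes in coded_segments.items():
        print(variable, "-", len(quotes), "candidate segments")

A script such as this only flags candidate segments for each variable; researchers still need to read each flagged quote and confirm or correct the code, as described above.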


Student Interview Protocol

(Provide introduction to set context.)

1. Describe the work you did in the community for this community-based learning course.
2. Describe your relationship with the community partner and the project.
3. What did you learn about the community through this experience? What did you learn in the community that connected to the content of this course? How was that connection made?
4. Did you have any fears or concerns about working in the community as part of this class? What were those concerns?
5. Do you think you will do anything differently as a result of your experience in this course? (Probe: volunteering, career choices, activism/advocacy, etc.) Has this created any new opportunities for you?
6. What did you learn about yourself as a result of your experiences in this community-based course? (Probe: Did you become aware of biases or fears? What did this teach you about your interaction with people different than yourself?)
7. In your community-based learning course did you learn from anyone other than your faculty instructor? (Probe: community partner, peers, other examples)
8. Did you feel prepared to perform the work required of you? If not, what would have made you feel more prepared?
9. What did you find most challenging in your community service experience?
10. What did you find the most rewarding in your community service experience?
11. What would you change about this community-based course?

Thank participants.


Student Focus Groups

Purpose

“Focus groups are fundamentally a way of listening to people and learning from them” (Morgan, 1998, p. 9). Focus groups can be used to stimulate an interactive discussion about service-learning experiences among students in a particular course. Data can be gathered regarding several student variables by discovering patterns in students' experiences. Students are queried through the focus group to learn how they define a successful service-learning experience and to gain insights about student interactions and relationships with community partners. The focus group should be structured with the intent of listening to suggestions for improvement in organizing and supporting students throughout their service-learning courses. Students in focus groups are able to provide feedback on various concepts by informing the researcher whether certain variables were influenced through this experience.

Data gathered from focus groups can improve planning for a service-learning program: Students can inform faculty and administrators about pitfalls in logistics, scheduling, and transportation necessary for effective planning in service programs; give feedback about the implementation of the course; and compare how this course was taught in relation to previous courses. They are able to discuss the different types of learning that took place. Morgan (1998) states that focus groups are effective for the purpose of final assessment, where data are used for quality improvement and students provide insights about “how and why” certain outcomes were achieved in the course. Students are able to utilize the focus group time to share with one another their final assessment of their service experience and to reflect on their varying community experiences and related learning outcomes with the researchers and their student colleagues.

Preparation

Arrange with the course professor early in the quarter/semester for a one-hour session near the end of the term to conduct the focus group. The focus group should be scheduled during the usual class time and should take place after students have completed any final projects, papers, or tests to minimize distraction or frustration. Focus groups should be facilitated by independent researchers or experts. The faculty member should not be present.

Arrange for a quality recording device, preferably one with multiple omnidirectional microphones. One facilitator will be needed for every 8 to 12 students. If the course is large, it may be necessary to conduct two simultaneous focus groups by dividing the group and obtaining a second room and a second facilitator. Each focus group should have a note-taker to accompany the facilitator. The note-taker should take responsibility for ensuring that the recording equipment is working continuously, turning the tape over when necessary, and taking detailed notes about the conversation and nonverbal communication. In the event that the recording fails, the note-taker will be able to provide backup documentation of the focus group conversation.

Administration

The introductory message on page 64 should be read to the students prior to beginning the focus group questions. Some specific guidelines include the following:
• Remind students of the introductory guidelines as needed.
• Begin and end on time.
• Introduce facilitators and note-takers.
• Arrange the group in a circular form.
• Ensure the facilitator guides the discussion and is not a participant.
• Make sure all probing questions are neutral.
• Ensure that all students participate, no one dominates, and no one holds back.
• Remind students of the guidelines throughout the focus group discussion as necessary.

For more information on the administration of focus groups, see the work of David Morgan (1993, 1997, 1998) in the reference list.

Analysis

Data analysis begins with transcription of the focus group recordings immediately after the session. These transcriptions should be double-spaced for reading ease. An initial reading of the transcripts by several people will allow the analysis team to come up with a list of key words and themes that appear in the text, each of which may be coded with its own color or symbol. The key words and themes are compared and matched to the research variables. The research team should then read each of the transcribed focus group discussions again to code each according to the agreed-upon list of key words and themes. It is helpful to have multiple people independently reading, analyzing, and coding the data, so that these separate sources can be compared for consistency and accuracy. Once the data are coded, the individually coded focus group discussions can be combined into research variable categories. This provides the researcher with a group of student quotes for each of the key concepts. These data can be used to assess the extent to which students have been affected according to each of the concepts. As with the interview data, analysis may be supported by a qualitative data analysis package such as Ethnograph or NUDIST (Miles & Huberman, 1994), or completed by hand-coding the focus group data and categorizing transcribed text according to the set of research variables.
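When several people code the same transcripts independently, as recommended above, it can be useful to quantify how consistently they applied the codes before combining their work. The following sketch, in Python, computes simple percent agreement and Cohen's kappa for two coders; the segment codes shown are invented examples, and in practice each list would hold one code per transcript segment, in the same order for both coders.

    # Illustrative check of consistency between two independent coders.
    # Each list holds the research-variable code assigned to the same
    # ordered set of transcript segments; the labels are examples only.
    from collections import Counter

    coder_a = ["diversity", "ownership", "communication", "ownership", "diversity"]
    coder_b = ["diversity", "communication", "communication", "ownership", "diversity"]

    n = len(coder_a)

    # Percent agreement: the share of segments that received the same code.
    observed = sum(a == b for a, b in zip(coder_a, coder_b)) / n

    # Expected chance agreement, based on each coder's code frequencies.
    freq_a = Counter(coder_a)
    freq_b = Counter(coder_b)
    expected = sum((freq_a[code] / n) * (freq_b[code] / n)
                   for code in set(coder_a) | set(coder_b))

    # Cohen's kappa corrects observed agreement for chance agreement.
    kappa = (observed - expected) / (1 - expected)

    print("Percent agreement:", round(observed, 2))
    print("Cohen's kappa:", round(kappa, 2))

Low agreement on a particular variable signals that its key words and themes need to be clarified by the team before the coded quotes are combined.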


Student Focus Group Protocol

Introduction

Our goal for this focus group is to have an open and interactive discussion. Focus groups are a guided conversation in which everyone participates. We want to learn more about how you felt about your community-based learning experience and will ask you a few questions that will focus on aspects of the experience and its effect on you as a learner. As facilitator, I will be asking questions to guide the discussion, but will not be participating or offering my own comments or reactions.

The purpose of the focus group is to hear everyone's ideas and impressions. Generally, in a focus group, hearing what others say may stimulate your own thinking and reflection on your experience. You do not need to repeat what others have said, but rather offer your own unique view or expand, clarify, or elaborate on what others have said. If you hear comments or ideas with which you disagree, do not hesitate to describe your perspective or contradictory view. A focus group, however, is not meant to resolve those differences or to press for consensus. The idea is to hear everyone's thoughts, not to reach agreement. There are no right or wrong answers. The purpose is to capture a wide array of comments, opinions, ideas, and suggestions.

This discussion will be recorded. Your faculty instructor will not hear the recording. Only the person transcribing the recording will hear it. The summary reports or transcripts will not identify speakers, so what you say will be kept confidential. To ensure a quality transcription, it will be helpful if you speak one person at a time, and try to speak clearly and with more volume than usual so your comments are captured on the recording.

Questions

1. What were your personal learning goals for this community-based learning experience? What were the learning goals of the class? (10 minutes)
2. How would you assess your experience? Was it a success? Why? What factors contributed to the success? What obstacles did you encounter and how did you overcome them? (5 minutes)
3. Describe your interactions with the community partner. What role did your community partner have in your learning? (10 minutes)
4. What did you learn about the community or society in general from this experience? (10 minutes)
5. Did this community experience leave you with new questions or concerns? (5 minutes)
6. What connections can you describe between the community service work and the classroom discussions, required readings, and assignments? Was there a good balance of course time and community activity? (10 minutes)
7. What role did your instructor play in your community service work? (5 minutes)
8. What recommendations do you have for future community-based learning courses? (5 minutes)
9. Do you have any other comments you would like to share?

Thank participants.


Chapter Four

FACULTY IMPACT

Why Assess the Impact on Faculty?

Throughout the service-learning literature, there is repeated acknowledgment of the critical role and influence of faculty. As Bringle and Hatcher (1998) note, service-learning in its most common form is a course-driven feature of the curriculum, an area of the university that is controlled by faculty. The prominent features of quality service-learning or community-based learning depend for the most part on the faculty, including meaningful and adaptive placements; connections between subject matter and community issues and experiences; critical reflection; and preparation for diversity and conflict (Eyler & Giles, 1999). There is also growing indication of resulting changes in the nature of faculty work influenced by the service-learning movement. As service-learning becomes increasingly integrated into the broad spectrum of faculty roles and responsibilities and visibly institutionalized in higher education (Zlotkowski, 1999a), there are signs of its influence in the professional life of faculty. Thus, faculty are both influential with, and influenced by, service- or community-based learning.

As early as 1990, Stanton criticized the minimal attention given to the faculty role in service-learning literature (Stanton, 1990). Most of the current literature has focused on the preparation of faculty for service-learning (Bringle & Hatcher, 1995, 1998; Stanton, 1994) and on institutional reward systems that support faculty work in service-learning (Driscoll & Lynton, 1999; Holland, 1997; Lynton, 1995). Yet we continue to know little about the relationship between faculty and service-learning. Further study of faculty roles will help to better understand faculty perceptions, predict which faculty will embrace service-learning, and determine what resources are necessary to help prepare faculty for roles as community-based teachers. The faculty need to have a good sense of how service-learning relates to their discipline and specifically to course content, and an understanding of their role with respect to working with community partners. They also often need new skill development to create partnerships, design effective student assessment strategies for community-based learning, and integrate new methodologies such as reflection into their courses. Thus assessment of faculty can also help to understand the demands that this pedagogy places on faculty.

Holland (1999a) used multi-institutional data to outline the key factors that motivate faculty involvement in service and to identify strategies for increasing motivation. With increased attention to the pedagogy of service or community-based learning, universities and colleges have begun to attend to motivating and supporting faculty as the “cornerstone for implementing service-learning” (Rice & Stacey, 1997, p. 64). The comprehensive case studies conducted at PSU (Driscoll, Holland, et al., 1996) addressed the gap in the service-learning literature by developing a range of strategies and methods for studying faculty. The case studies yielded significant core concepts and aided subsequent revision of instrumentation for faculty, including a survey, interview protocol, classroom observation protocol, syllabus analysis guide, and a guide for analysis of curriculum vitae.

Current State of Research on Faculty and Service-Learning

For reasons well known in higher education, major research and evaluation efforts have focused on student outcomes of service-learning (Astin & Sax, 1998; Berson & Younkin, 1998; Eyler & Giles, 1999). Evidence that service-learning makes a difference in students' educational experiences has significant implications for funding, resource allocation, program development, and institutional change. For some of the same reasons, there is intense interest in assessing the impact of service-learning on the community and the institution. In contrast, there has been a distinct lack of research focused on faculty and service-learning.

One notable exception to the gap in the service-learning literature is a study by Hammond (1994) of faculty motivation, satisfaction, and the intersection of the two. Commissioned by the Curriculum Development Committee of the Michigan Campus Compact, Hammond contacted 250 faculty in 23 Michigan institutions of higher learning to gather baseline data about the characteristics of faculty and the service-learning courses they were teaching. A survey was developed to document those characteristics and to query faculty about their service-learning work. Interestingly, the 163 survey respondents affirmed the importance of three conditions previously found to be related to general faculty satisfaction in academic culture (Astin & Gamson, 1983; Bess, 1982; Deci & Ryan, 1982; McKeachie, 1982): (a) sufficient freedom, autonomy, and control; (b) belief that their work has meaning and purpose; and (c) feedback that their efforts are successful. Hammond's study can inspire a significant but unrealized agenda focused on faculty roles and satisfaction in service-learning, pedagogical issues, and the faculty need for support.

The theme of faculty development has been studied with increasing attention as campuses became aware of the importance of the faculty role. Development efforts and research studies have focused on different dimensions of faculty in service-learning. Bringle and Hatcher (1995) addressed faculty cognitive needs with the assumption that a critical knowledge base is essential to the implementation of service-learning. In contrast, Ward (1996) proposed and supported the value of peer support. She stated, “The most effective person to encourage faculty to use service-learning as a pedagogical tool is a fellow member of the faculty who understands the cultural nuances of the campus (e.g., workload issues, relative weight of teaching to research)” (Ward, 1996, p. 33). Many campuses advocate the value of long-term efforts in faculty development to secure commitment and assure faculty that service-learning is not just another educational fad (Johnson, 1996). Finally, programs such as the Office of Academic Service-Learning Faculty Fellows Program at Eastern Michigan University developed and studied a small-group dynamics model within a long-term faculty development approach to both prepare and educate faculty as well as to promote commitment to the pedagogy (Rice & Stacey, 1997). Their findings were positive about the faculty development approach while highlighting the need for ongoing development efforts and activities to sustain interest and involvement in service-learning and to recruit new faculty.

Faculty development was one of eight strategies for increasing faculty involvement in service and service-learning identified by Holland (1999a). Drawing upon interviews and focus groups with faculty members from 32 diverse institutions, Holland identified three primary types of motivational forces that affect faculty involvement: (a) personal values regarding social responsibilities; (b) relevance to their discipline; and (c) evidence of potential reward or other positive impact on the individual or institution. With these motivational forces in mind, Holland articulated key obstacles to increasing faculty commitment to service and reported on the most effective strategies for overcoming these barriers. Among the strategies for enhancing faculty motivation is the curriculum itself. Service-learning in the curriculum is often the first service-related activity that faculty members will try, and it has proven to be a good approach to building faculty confidence and interest in public service as academic work (Zlotkowski, 1999b).

Recent efforts to study and assess faculty roles in service-learning encourage and inspire the development of a future agenda that is holistic and attends to both the influence of faculty on service-learning and the impact of service-learning on faculty. The PSU case studies attended to the latter with the identification of core concepts of impact as well as the development of associated measurement methods. The core concepts also extended to current interest areas of professional development, motivation and attraction of faculty, satisfactions, and barriers and facilitators, to begin to provide a wide lens with which to study the faculty role in service-learning.

Assessment Matrix for Faculty Impact

The assessment matrix for faculty impact is presented in Table 4.1 and represents those core concepts that emerged from the PSU case studies as well as other research and descriptions of best practice. Each of the concepts will be described with a rationale for its importance and a summary of what is known about the concept within the larger framework of service-learning and other community initiatives for faculty.


TABLE 4.1
Matrix for Faculty Assessment
(Columns: What do we want to know? [concepts or variables]; How will we know it? [indicators]; How will we measure it? [methods]; Who/what will provide the data? [sources])

Concept: Motivation and attraction of faculty to service-learning
Indicators: Level and nature of community participation; activity related to level of learner in courses/discipline; linkage to other scholarly activities; identification of motivating factors (value, rewards, etc.); awareness of socioeconomic, environmental, cultural factors
Methods: Interviews; focus groups; critical incident review; curriculum vitae analysis
Sources: Faculty; community partner; students; department chair; faculty peers

Concept: Professional development (support needed/sought)
Indicators: Attendance at related conferences/seminars; participation in campus-based activities; leadership/mentoring role with others in promoting service-learning; role in advocating service-learning in academic societies
Methods: Interview; focus groups; curriculum vitae analysis
Sources: Faculty; community partner; students

Concept: Impact or influence on teaching
Indicators: Knowledge of community assets and needs; nature of class format, organization, activities; evolution of teaching and learning methods; articulation of philosophy of teaching; nature of faculty/student/community partner interactions
Methods: Interview; focus groups; critical incident review; curriculum vitae analysis
Sources: Faculty; community partner; students; institutional resources

Concept: Impact or influence on scholarship
Indicators: Changes in research emphases; changes in publication/presentation content and venues; changes in focus of research proposals, grants, and projects; scholarly collaborations around community-based learning
Methods: Interview; focus groups; critical incident review; curriculum vitae analysis
Sources: Faculty; community partner; institutional resources

Concept: Other personal or professional impact
Indicators: Creation of partnerships with community organizations; new roles with community organizations; campus-based leadership role around CBL; mentoring of students; commitment to community-based teaching and learning; role in department/program advocating service-learning
Methods: Interview; focus groups; critical incident review
Sources: Faculty; community partner; students; department chair

Concept: Identification of barriers and facilitators
Indicators: Strategies to capitalize on facilitators; methods and activities to overcome barriers; illustrations of creative problem-solving; ability to build upon barriers and create facilitators
Methods: Interview; focus group; critical incident review
Sources: Faculty; community partner; students

Concept: Satisfaction with experience
Indicators: Strengths and lessons learned; opportunities for future improvement
Methods: Interview; focus groups
Sources: Faculty; students


Motivation and Attraction of Faculty to Service-Learning

With the initial work of Hammond (1994), Holland (1999a), and the PSU case studies (Driscoll, Holland, et al., 1996), we have begun to have an understanding of what brings faculty to service-learning and what sustains them. At the top of the list of motivators is the satisfaction faculty experience with service-learning as a pedagogy—the experience of observing students transformed by community work. That satisfaction affirms findings on faculty satisfaction with their work in general. With service-learning, there are new indicators of increased awareness about community context and unique insights for both faculty and students that accompany faculty involvement in community-based learning experiences. As with most faculty efforts, the reward system and institutional support are critical to motivating and attracting faculty. Information about the motivation and attraction of faculty to service-learning is critical to the institutionalization of service or community-based learning on campuses. The information will inform faculty recruitment and faculty development planning and programs.

Professional Development (Support Needed/Sought)

Closely aligned with the attraction and motivation variable is a description of the professional development needs of faculty involved in service-learning. Again, the institutionalization of service-learning depends on both attracting faculty and supporting them to sustain their involvement. While there is growing literature about “what works” in faculty development related to service-learning, there has been minimal attention to studying professional development from the perspective of faculty needs—asking faculty and other constituencies about the kind of support needed by faculty. There is a strong indication that the satisfaction faculty gain from their service-learning experiences, as well as the level of impact they will have on students and community through facilitation of service-learning, may depend greatly on the professional development support provided to them. This concept is of great significance to the success of most programs.


Impact or Influence on Scholarship

For many faculty, especially junior or untenured faculty, this area of impact is most significant. Participating in service- or community-based learning takes a toll on faculty time, realistically adding responsibilities to those typically associated with teaching. These time demands can add stress to a situation in which many faculty already feel vulnerable; thus, there is the potential for negative impact on faculty scholarship. However, participation in service- or community-based learning can also open new venues for faculty scholarship, with the potential for positive impact. New areas of research, new publication foci, and scholarly collaborations with colleagues and community partners all offer potential for faculty rewards. Those faculty who do succeed in building scholarship around community-based learning provide models to colleagues and, more importantly, promote institutional change around faculty roles and rewards. This concept has the potential to transform campus culture as well as national initiatives. Currently, many professional disciplinary associations support new forms of scholarship and provide credible venues for faculty work (Zlotkowski, 2000). As the service-learning movement grows and becomes nationally institutionalized, the impact on faculty scholarship will gain in importance and contribute much to the movement.

Other Personal or Professional Impact

This category of impact possibilities emerged as we identified unanticipated kinds of impact on faculty. Within this category are the possibilities for enhanced faculty volunteerism because of their experience with service-learning, as well as new roles for faculty within community organizations. For many institutions of higher education, these possibilities will make strong contributions to their missions of civic engagement. Within the pedagogical experience, there is potential for new or different mentoring roles with students. Connecting academic content with community-based projects often changes the roles of faculty and students, reveals different needs of students, and/or enhances the relationships between faculty and students. Thus, faculty may begin to interact with students in nontraditional ways or through enhanced mentoring. Because service-learning is still novel and considered an innovation on many campuses, faculty involvement often leads to leadership roles for those participating faculty. They may become the unofficial champions or advocates for the campus with the potential to influence their peers. Ultimately, most faculty develop a deep commitment to service-learning after several experiences with this pedagogy.

Barriers and Facilitators

Much like our findings about motivation and attraction, information about barriers and facilitators to faculty participation in service-learning is critical to campus programs of recruitment, faculty development, and support. The increased time demands and responsibilities that come with teaching a service-learning course mean there is a need to identify barriers and provide needed support to ease or remove those obstacles to faculty success. Similarly, identifying facilitators and ways to support or ease the workload of faculty in service-learning courses is essential to both individual success and campus success. As faculty gain experience with service-learning courses, their awareness of barriers and facilitators can be captured through assessment activities and used to guide institutional decisions and actions.


This concept will no doubt be expanded as we develop extended experience with service-learning and study its use in different settings of higher education. It will hold individual possibilities as well as institutional possibilities in terms of specific findings and will further inform infrastructure and resource decisions in the future.

Satisfaction With the Experience

This too is a concept that offers present and future potential for study. Our early findings reveal much satisfaction with the pedagogical aspects of service-learning—the joys of student learning experiences and new insights, the extended outcomes for student learning, and the commitment of students to community work following a service-learning course. All of these provide enormous satisfaction for faculty. We have observed the collegiality built into campus programs for faculty participants in service-learning and the satisfaction of being involved with such programs. The insights of individual faculty are critically important to the satisfaction concept. At the same time, developing profiles of faculty satisfaction informs faculty recruitment, support for faculty, faculty development, and reward systems to ultimately enhance institutional engagement with the community.

In sum, all of the faculty concepts have enormous potential to shape and support the institutionalization of service-learning on individual campuses and nationally. The information provided by studying each of the variables will enhance the work of faculty in service-learning and ultimately make service-learning an integral component of every student’s higher education experience.

Strategies for Assessing Impact of Service-Learning on Faculty

When developing and implementing strategies to study faculty in service-learning contexts or to assess the impact of service-learning on faculty, considerations of time, sensitivity to faculty workload, convenience of scheduling, and confidentiality must be maintained. When designing strategies to assess impact on faculty, researchers experience difficulty in developing approaches that are nonintrusive and that view impact holistically. We have used multiple sources of evidence from a range of strategies in order to capture a big picture of impact and to reveal unexpected concepts. Once data are collected, the varied sources of evidence can be triangulated, with each set of data explained, clarified, or extended by the information provided by another source of data. For example, insights gained from faculty interviews helped explain what was observed in the faculty member’s classroom or reviewed in the curriculum vitae. Similarly, faculty insights about a course experience can be augmented by additional review of student and community observations of the same experience.

The accuracy of information gained from strategies to assess the impact of service-learning on faculty depends on the questions posed, the availability of faculty to participate in the assessment activities, and the resources available for assessment. We have found that significant change in faculty-expressed attitudes or observed behavior is not usually evident during a typical academic quarter or semester; thus, there is a need for extended studies and long-term assessment. Similarly, some assessment concepts, such as scholarship, will not be evident over the short term and will evolve over a number of years.

These strategies to assess impact on faculty are best used in combination. They provide substantive data when used to gather multiple forms of evidence, and offer a blend of opportunities to solicit the faculty voice (faculty interview and survey), to observe faculty work (classroom observation), and to review faculty documentation (syllabus analysis, curriculum vitae analysis). They provide both quantitative and qualitative data that yield a profile of an individual faculty member or a composite of a group of faculty.

Faculty Survey

Benefits of the Faculty Survey

The faculty survey is designed for use as a post-test after faculty have completed teaching a service-learning course. As a post-test, the survey provides a profile of faculty who are teaching service-learning courses. The survey can also be modified as a pretest instrument to use in a pre/post study for assessing change in faculty attitudes and perceptions.

Issues With Use of the Faculty Survey

Much of the survey is designed with “forced choice” items and may not always provide the most accurate data. It does not provide much opportunity for explanation of faculty responses, so there is a chance for misinterpretation. When used as a pre/post test in a traditional academic quarter or semester, the time period may be too brief for any significant change to be indicated. It may be best used before and after a faculty member has taught a service-learning course two or three times.

Faculty Interview

Benefits of the Faculty Interview

The faculty interview has multiple uses for both individual faculty members and for institutions and programs. It can be used as a reflective process to assist faculty in reviewing and assessing their experience and course. In its narrative form, it can be used to support individual scholarship of engagement. The faculty interview also provides an opportunity for faculty to provide input and insights to the campus program. Data can be used for planning faculty development, for making decisions about resource allocation, and for assessing campus-wide impact of service-learning. The faculty interview is best used as a complement to the faculty survey, as well as with observational data and the analysis of faculty vitae.

Issues With Faculty Interview

Care must be taken to protect confidentiality and the interviewee’s capacity to be candid by ensuring that the interviewer is someone who will not threaten or cause discomfort to the interviewee. Care must also be taken to ensure that all interviews are conducted with consistent procedures and questions, and use only neutral probes. The interviewer must not comment on the responses of the interviewee.

Classroom Observation

Benefits of Classroom Observation

Initially, the data gained from classroom observations capture the status of service or community integration in academic courses. As such, they can guide faculty development planning and campus programs. They can also yield a campus profile of teaching and learning when used with a significant number of faculty. For individual faculty, data from classroom observations provide information useful for self-assessment of teaching and appropriate for submission as scholarship of engagement or scholarship of teaching or both. Observations allow the researcher to report to the faculty member about the demonstrable learning of students.


Issues With Use of Classroom Observation

Classroom observation data collection is both time-consuming and costly, but it does yield data that describe multiple aspects of service-learning within a course context. Such observations capture the dynamics of the teaching and learning processes with complex and comprehensive data. Observer training is critical and must be conducted with attention to reliability. The analysis process is also time-consuming and requires skills to surface both the expected patterns and the unexpected themes. It is often suggested that two different individuals analyze the same classroom observation data to better probe all of the patterns. By listening to classroom discussions, the researcher is able to document student quotes regarding their reactions to a community issue being studied and their personal responses to engaging in the community, and to observe the teachable moments that take place as faculty and students come together to make meaning out of the experience. Classroom observations allow the researcher to assess the level to which the service experience is being integrated into the discussion of the class content. This first-hand observation also allows the researcher to track the impact that the service experience has on the way the faculty member teaches the class. These elements are only observable and trackable when the researcher has first-hand access to students and faculty interacting.

A few of the disadvantages of classroom observations are: (a) it can be difficult to gain access to classes because faculty may not feel comfortable allowing a researcher to observe their teaching; (b) the researcher who observes the class on a periodic basis can become intrusive to the class dynamics and culture, causing the faculty or students to behave differently; and (c) for classroom observation to be an effective source of data, the researcher must be available to attend class frequently to stay abreast of the dynamics of the class and the critical events occurring in the community. These frequent observations make this form of data collection time-consuming and labor intensive.

Teaching/Learning Continua

Benefits of the Teaching/Learning Continua

The uses of the teaching/learning continua are quite similar to those of classroom observations in general. They have uses at all levels: individual faculty, program, and institutional. The continua capture unique and specific qualities of teaching and learning, rather than general observational data. The continua should be used in conjunction with the observation protocol. In addition to being used by observers in describing individual class sessions, ongoing classes, and change or lack of change in teaching and learning, the continua can be used as a self-assessment or reflection tool by individual faculty. They can also be used by students to reflect on or provide feedback on individual class sessions or as an ongoing form of feedback. If they are used by students, it is imperative that students understand the concepts and terms used in the continua, which adds another kind of content to a course. The kinds of understandings implied in the continua could enhance students’ metacognitive learning.

Issues With Using the Teaching/Learning Continua

The accuracy of use of the teaching/learning continua increases with consistent understanding of the continua’s concepts. Without thorough discussion and development of definitions and understandings of the concepts, the data from observations may be misleading or inaccurate. The discussions and development processes are, however, very useful and valuable for those participating in a service-learning program. When the continua are used as a self-assessment or reflection tool by faculty or students, there is a tendency for users to place themselves on the lines where they think they “should” be. The concepts are not value-free, so there is the potential for values to influence the respondents’ choices. A positive implication of this issue, however, is that respondents, especially faculty, may be influenced behaviorally by their own responses and modify their own teaching.

Syllabus Analysis Guide

Benefits of Syllabus Analysis Guide

The syllabus analysis guide is useful for the initial development of a syllabus or the review and modification of a syllabus. Used with other assessment processes, syllabus analysis can provide a profile of an individual course or of a faculty member’s approach to service-learning. The syllabus analysis is also evidence of the scholarship of engagement or the scholarship of teaching or both, and is thus useful for the individual faculty member’s professional portfolio. The syllabus analysis guide is also a valuable faculty development tool, whether used by an individual faculty member or a group of faculty.

Issues in Using the Syllabus Analysis Guide

The previously mentioned sensitivities and time needed for both design and analysis of faculty syllabi will be important to attend to during the entire process. In analyzing a syllabus, care must be taken not to impose on the faculty member’s disciplinary expertise but rather to focus on how the syllabus documents the nature and effort of a community-based learning experience. Areas to be examined include presence of specific objectives for the community experience, description of the use of reflection, ways in which course and community content is integrated, and methods for assessment of student learning in the context of community effort.

Faculty Journals

Benefits of Faculty Journals

The faculty journal is best used in conjunction with other data-gathering strategies focused on faculty teaching and learning. It will contribute insights and perspectives for individual faculty profiles as well as information and direction for program planning. Journals help faculty to engage in their own reflection as it relates to service-learning and can give faculty the necessary experience to help students with their own journal writing. Individual faculty may use their journal narratives in the development of the typical narratives used in promotion and tenure portfolios. Journals will reveal faculty struggles in making connections between course content and community activities. They may also help faculty to identify the difficulties they might experience with classroom exchanges related to project content and challenges.

Issues With Using Faculty Journals

It is almost impossible to guarantee anonymity when analyzing faculty journals because the writing often reveals circumstances and details that identify individuals. Individual faculty should be alerted to this issue and assured that all possible steps to protect their anonymity will be taken in the reporting of findings. Faculty should be encouraged to reread their journal entries prior to submission for analysis and to disguise information that is too personal, revealing, or confidential to share. Faculty journals are also time-consuming, and it is often difficult for faculty to consistently write in them. Analysis of the journal data is equally time-consuming. Both processes have significant merit and ultimately warrant the time investment.

Curriculum Vitae Analysis

Benefits of Curriculum Vitae Analysis

The curriculum vitae analysis and guide may be used by individual faculty members in preparation of their materials for promotion and tenure review and documentation of their scholarly activities related to community service. The guide supports reflection by individual faculty on their professional and scholarly activities during the preparation and documentation process. The curriculum vitae analysis and guide may also be used to study faculty who are engaged in service- and community-based learning. It can provide an individual profile or a composite of a group of faculty’s scholarly activities related to community service. The information may be used to plan faculty development on an individual or group level, as well as to support faculty scholarship. Further analysis will provide evaluation information on the roles of faculty in the community.

Concluding Comments

The time and resources required for conscientious assessment of impact on faculty may look daunting, but the “future growth and sustainability of service-learning depends to a large extent on the faculty, and the success with which universities are able to support and reward their efforts” (Driscoll, 2000, p. 39). Careful and systematic assessment of the impact on faculty and their impact on service-learning offers an opportunity for “scholarly study of faculty work with the end result of more informed decision making, programmatic changes, and directions for resources and efforts” (p. 40). For faculty in general, the assessment will enhance higher education’s understanding of their roles, their work, and their needs for support. For individual faculty, the assessment process as well as the data from such assessments can foster more reflective practice.


Strategies and Methods: Faculty

• Faculty Interview
• Syllabus Analysis
• Faculty Journal
• Curriculum Vitae Analysis
• Classroom Observation
• Teaching/Learning Continua
• Faculty Survey

Faculty Interviews

Purpose

Faculty interviews are intended to foster one-on-one conversations with faculty members to explore their perspectives on the experience of connecting the academic content of a course to community service. This approach could be used to assess a wide variety of possible outcomes for faculty engaged in service- or community-based learning. This interview protocol is designed to probe community awareness, teaching and learning philosophy and practice, the logistics of service-learning, and influences of the community service experience on faculty work in general.

Preparation

The following steps are recommended in advance of the actual administration:
1. Identify and schedule a location with minimal distraction and maximum comfort.
2. Schedule the interview at a time and location convenient to the faculty member.
3. Advise the faculty member of the purpose of the interview so that they may reflect on impact and related issues in advance of the interview session. Gain consent to record the interview.
4. Review the interview protocol to ensure smooth administration of the questions.

Administration

Once preparation is complete, the following guidelines are recommended for the interview:
1. Begin on time.
2. Introduce yourself and your role in the assessment process.
3. Explain the purposes of the interview and respond to the faculty member’s questions.
4. Assure confidentiality.
5. Stress the importance of candor.
6. Record the interview (with permission), and also take back-up notes.
7. Follow the interview protocol carefully and keep probes neutral.

Analysis

Before analysis of the data, a careful transcription of recordings is necessary, as well as a review of notes taken during the interview. The transcribed notes should be read at least two times before any analysis occurs. After several readings, the reader identifies key words and themes. These key words and themes are then coded on the transcripts. The key words and themes are further organized into patterns and related to the research variables. There may be key words and themes that indicate new variables or unexpected patterns.
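For larger sets of interviews, the hand coding described above can be supplemented with simple software that tallies how often agreed-upon key words appear in the transcripts. The sketch below is illustrative only: the theme labels, key words, and file layout (one plain-text transcript per interview in a folder named transcripts) are assumptions that an assessment team would replace with its own coding scheme, and the counts are a starting point for, not a substitute for, the successive readings described above.

# Hypothetical tally of agreed-upon key words across interview transcripts (Python).
# Theme labels, key words, and the transcript folder are illustrative assumptions.
import re
from pathlib import Path
from collections import Counter

THEMES = {
    "community awareness": ["community", "neighborhood", "partner"],
    "teaching and learning": ["pedagogy", "reflection", "teaching"],
    "scholarship": ["research", "publication", "grant"],
}

def tally_themes(transcript_dir):
    counts = Counter()
    for path in Path(transcript_dir).glob("*.txt"):
        text = path.read_text(encoding="utf-8").lower()
        for theme, keywords in THEMES.items():
            counts[theme] += sum(
                len(re.findall(r"\b" + re.escape(word) + r"\b", text))
                for word in keywords
            )
    return counts

for theme, count in tally_themes("transcripts").most_common():
    print(f"{theme}: {count} key-word occurrences")

Such a tally can flag transcripts that deserve a closer reading, but the identification of new variables and unexpected patterns still depends on the reader.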


Faculty Interview Protocol

(Provide introduction to set context.)

1. Describe the conditions and needs of the community where the service-learning experience took place.
2. Describe any new information you have learned about your community in the process of offering your community-based learning course.
3. After teaching your community-based learning course, how would you describe your own learning experience?
4. As you taught your community-based learning course, what were your concerns? How did you address them?
5. Describe the preparation and coordination that this community-based learning course required.
6. Was this a successful teaching and learning experience? How did you know?
7. Were the student learning outcomes different in this course from those in courses without a community experience?
8. Do you think that your teaching changed as a result of having a community dimension in your course? Why or why not?
9. Based on this experience, when you teach another community-based learning course, how will you approach it?
10. Has your community-based learning experience influenced your other scholarly activities? Will it do so in the future?
11. Is there any other information you would like to share?

Thank the interviewee for his/her time and input.


Syllabus Analysis

Purpose

The purpose of the syllabus analysis and the accompanying guide is to provide a framework for the development and assessment of syllabi for service- or community-based learning courses. Within the analysis, aspects of a course can be highlighted: integration of community service, outcomes related to community service, assessment of community service outcomes, and major forms of pedagogy. The syllabus analysis facilitates assessment of one aspect of a course, providing a picture of the planning and thinking of a faculty member prior to teaching the course.

Preparation

The most critical aspect of preparation for use of the syllabus analysis guide is adequate time in advance for the faculty member to study the guide and prepare or revise a syllabus for review. It is useful to bring faculty together to discuss the guide, explore the rationale for the components of the guide, and raise questions of the meaning or value of the components. It may be useful to provide expert consultation during this time to assist and illustrate examples of syllabus design and adaptation for service-learning. It will be important during the preparation and administration to maintain sensitivity to individual faculty philosophies and approaches to course planning. Faculty have individual philosophies about the level of specificity of a syllabus, with some who prefer it loose and open to ongoing change and others who prefer a tightly constructed and organized document. If, however, the syllabus is to be used as a method of documentation for assessment, then it is necessary that there be some standardization of content to demonstrate the expression of service-learning components.

Administration

Once faculty have been briefed on the format of the guide and engaged in discussions, they should be given adequate time to develop or review a syllabus before submission. Faculty can be given the option to simply submit the syllabus in writing or electronically or to present the syllabus orally with the opportunity to discuss, describe, or respond to questions about the components.


Syllabus Analysis Guide

This guide is intended to identify the components of course design in general and the components of “best practices” in service- or community-based learning in particular. Using the guide for analysis purposes, the presence or absence of those components is highlighted, along with descriptions of how they are integrated in the course.

The main components expected in a syllabus for a service-learning course are:
1. Description of the service-learning experience.
2. Goals and objectives of the service-learning and anticipated outcomes of the experiences for both students and for the community partner organization.
3. Opportunities for structured and unstructured reflections by students on the connections between academic content and community service.
4. Integration of academic content and community service in both teaching and assessment.

To determine the presence of these main components, the analysis looks for the following:
1. Course description which includes description of the community-based learning experience and approach for the course.
2. Learning objectives or outcomes for students that are directly related to the community service component.
3. General service outcomes for community partners.
4. Nature of projects/assignments related to the community service experience.
5. Readings/discussions/presentations/speakers related to the community service experience.
6. Direct and deliberate connections between the academic content and the community service experience.
7. Opportunities for reflection, both structured and unstructured, in the form of assignments, journal writing, discussions, and other mechanisms explicitly described in the syllabus.
8. Assessment of the community service experience as an explicit component of determining course evaluation and grade.
9. Evidence of the community service experience as a teaching/learning approach that is integrated with other pedagogy.

Note: There is no explicit weighting of these components. However, the ideal syllabus would include all of the components.
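Because the guide assigns no weights, one practical way to keep reviews consistent across multiple reviewers is a simple presence/absence checklist that can be tallied later. The sketch below is only an illustration of such a record; the shortened component labels, the course name, and the scoring approach are assumptions, not part of the guide itself.

# Hypothetical presence/absence record for one syllabus review (Python).
# Component labels paraphrase the nine items in the guide; the course name is illustrative.
from dataclasses import dataclass, field

COMPONENTS = [
    "course description includes the community-based learning experience",
    "student learning objectives tied to the community service component",
    "general service outcomes for community partners",
    "projects/assignments related to the service experience",
    "readings/discussions/presentations/speakers related to the service experience",
    "direct connections between academic content and community service",
    "structured and unstructured reflection opportunities",
    "service experience counted in course evaluation and grade",
    "service experience integrated with other pedagogy",
]

@dataclass
class SyllabusReview:
    course: str
    present: dict = field(default_factory=dict)   # component label -> True/False
    comments: dict = field(default_factory=dict)  # component label -> reviewer note

    def summary(self):
        found = sum(1 for c in COMPONENTS if self.present.get(c))
        return f"{self.course}: {found} of {len(COMPONENTS)} components present"

review = SyllabusReview(course="Example Capstone Course")
review.present[COMPONENTS[6]] = True
review.comments[COMPONENTS[6]] = "Weekly reflective journal assignment described in the syllabus."
print(review.summary())

The reviewer’s comments remain the substantive record; the tally simply makes it easier to compare courses or to report how many syllabi document each component.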


Faculty Journal

Purpose

The purpose of the journal is to encourage faculty through a structured opportunity to reflect on their experiences with service- or community-based teaching and learning. Many faculty already keep journals, so this may not be a new experience for them. In this context, the journal may provide additional detail for developing profiles of faculty who teach service-learning courses. If faculty are willing to share their journals, the writing will provide data on issues, concerns, questions, successes, insights, and perspectives of the teaching and learning experience. Such content will be useful to the individual faculty member to review for self-assessment and summative reflection.

Preparation and Administration

It is useful to bring faculty together, hold discussions of experiences with journal writing, and review the specific protocol for this activity. Faculty who are new to journal writing may have concerns and questions and may need some additional coaching on the journal writing process and its merits. In most cases faculty are asked to write in their journals every week of their course. Specific instructions are provided in the protocol to guide the content of the journal, as well as to focus specific entries.


It is particularly important to emphasize to faculty that the intent of the journal is not to keep a diary of events and happenings, but to provide structured reflection on the events, on student learning, on personal learning, and on the ultimate achievement of course goals. It is useful to collect journals halfway through the course or period of study in order to provide feedback to faculty and if necessary to assist them in reflective writing rather than reporting. Copies of the final journals are then collected at the end of the course.

Analysis

The analysis of faculty journals requires the same rereading process described for analyzing the data from faculty interviews and classroom observations. Again, the reader is searching for key terms and themes in the journals, to be later categorized into patterns. The journal data may be used to explain data gained from classroom observations or to elaborate on data from the faculty interviews. There is the potential for some quantitative data in the form of “over 50% of the entries” or “the theme appeared in six of the ten entries” to accompany the descriptive data of themes or patterns. There is some value in simultaneous analysis of faculty journals with faculty interviews and classroom observations for the purpose of finding common themes and patterns or connecting themes between the different sources of data.
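The simple quantification mentioned above (“the theme appeared in six of the ten entries”) is easy to automate once entries have been coded. The sketch below assumes, purely for illustration, that each weekly entry is saved as a separate text file and that the analyst supplies the key words from the coding work; the theme labels echo the broad themes in the journal protocol, and the output is the kind of entry-level count described above.

# Hypothetical count of how many journal entries touch each theme (Python).
# File layout and key words are illustrative assumptions; theme labels follow the protocol.
from pathlib import Path

THEME_KEYWORDS = {
    "values": ["value", "values"],
    "role as teacher and learner": ["role", "mentor", "facilitator"],
    "service": ["service", "community"],
    "influence on scholarship": ["research", "writing", "presentation"],
    "motivation": ["motivation", "incentive"],
}

def theme_coverage(entry_dir):
    entries = [p.read_text(encoding="utf-8").lower()
               for p in sorted(Path(entry_dir).glob("*.txt"))]
    coverage = {theme: sum(1 for entry in entries if any(word in entry for word in words))
                for theme, words in THEME_KEYWORDS.items()}
    return coverage, len(entries)

coverage, total = theme_coverage("journal_entries")
for theme, hits in coverage.items():
    print(f"{theme}: appeared in {hits} of {total} entries")

As with the interview data, such counts only accompany the descriptive analysis; they do not replace the rereading of the journals themselves.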


Faculty Journal Protocol

The purpose of keeping a journal is to offer a structured opportunity to reflect on experiences with service- or community-based teaching and learning. For purposes of discussing the process and value of journals, we will meet at the beginning of the semester [or quarter]. We ask you to hand in your journal twice during the semester [or quarter]: halfway through your course, and at the end of the course. We will provide feedback on the first half to guide your future journal writing. This feedback will not be about specific content, but rather about the connections you are making between your service-learning course and the key concepts we are interested in understanding.

Each week of the course, we ask you to write one to four pages in a journal, and in your writing to reflect on the community-based learning component of your course and its influence on your course in general. We encourage you to notice any changes in your role or orientation toward teaching and learning. Your reflection should address the following broad themes:

• Values: your own and those of your students about the community and the service-learning process
• Your role as a teacher and learner: any changes in those roles as a result of the service component and the community emphasis of the course
• Service: your perspective on your personal commitment to service, your definition and awareness of your community, the service that you and your students are providing to the community, and the impact of the service on your course and teaching
• Influence on scholarship: the impact (if any) that the community experience is having on the focus of your scholarly activities such as writing, presentations, research, and professional involvements
• Motivation: personal motivation or incentive to create community-based or service experiences in this course

You may or may not address each theme in each entry. We urge you to explore other themes that emerge from your experiences.

First Entry

Begin your first journal entry with an overview of the course you have planned, emphasizing the community-based service experience component. Set out a series of goals or desired outcomes you wish to achieve with respect to incorporating community-based teaching and learning into your course. After addressing the broad themes, develop a brief summary of the entry.

Subsequent Entries

Each entry should be dated. Each week, review your summary from the end of the previous week’s entry and begin your new entry by commenting on the progress or changes from the previous entry and acknowledging any problems encountered. At the end of each weekly entry, reserve some space to discuss accomplishments of the week, anticipated challenges in the next week, and specific goals and actions to help you meet those goals.

Last Journal Entry

At the end of the course, reread the entire journal and write a summary entry addressing the themes previously described. Comment on the extent to which your goals and desired outcomes were achieved and the personal and professional impact of the experience. Finally, reflect on what you will do differently in the future as a result of this experience.

Final Journal Reflection

This reflection will be structured around the format of the application for the Thomas Ehrlich Faculty Award for Service-Learning (presented annually by Campus Compact) and will summarize your reflections on your experience. You are asked to write a two-page reflective synthesis which describes how you integrate service- or community-based learning into your teaching, curricula, and scholarship and how you are or might be able to integrate academic and personal service.

Finally, please review all of your writing prior to submission, and “blind” or disguise any names or events that you feel are too sensitive or of a confidential nature. We will not reveal your name in any of our analysis, but if you are concerned that any of your writing is too private to disclose, then you should make changes (or delete that material) so that there are no potential opportunities for violation of privacy.


Curriculum Vitae Analysis

Purpose

Faculty members engaged in service- or community-based learning may illustrate this activity and related scholarship in their curriculum vitae. The guide that accompanies this data-gathering strategy provides a list of possible scholarly activities or contributions that faculty may include in their vitae. The list suggests possibilities for faculty consideration but is not inclusive of the variety of scholarship possible when a faculty member is engaged in community activities. The curriculum vitae analysis contributes additional information to the faculty profile to which other data-gathering strategies previously described are directed. It is designed to indicate and describe the influence of community engagement in the professional activities of faculty members.

Preparation and Administration

Before the guide is used by individual faculty members, a review of the existing university guidelines is critical. Even with clear institutional guidelines for preparation of faculty curriculum vitae, the guide can suggest areas where a faculty member can document and highlight their community-based teaching and learning. The guide is best shared with university and departmental administration as well as individual faculty. It is important to determine whether there are any areas of the guide that require modification to conform to university standards. After presentation of the guide to faculty, group or private discussions may be useful to suggest how the individual faculty member’s activities are best represented in the vitae. Peer suggestions and review will be very valuable in the process. Adequate time must be given for faculty to receive feedback and to revise their vitae before the final analysis (or submission of a personal dossier for review).

Analysis

Once the vitae to be analyzed have been collected, the following procedures are recommended:
• Prepare a review sheet that includes the items in the guide.
• Note the presence and absence of the items with comments for each vitae analysis.
• Provide feedback to the individual faculty members so that revisions can be made.


Curriculum Vitae Analysis Guide

The impact of faculty engagement in community-based teaching and learning can be documented in multiple forms and in different categories of scholarship. The following suggested items of evidence are possible forms of scholarship related to service- or community-based learning:

Teaching
1. Evidence of integration of community service into courses.
2. Achievements and recognition (by university or community) related to community-based teaching and learning.
3. Curriculum development projects (on an individual course level or departmental program level) related to community-based teaching and learning.

Research and Publications
1. Grants with potential linkage to the faculty’s community-based teaching.
2. Professional presentations (on a local, state, or national level) describing community-based teaching or community issues.
3. Publications describing community-based teaching or community issues.
4. Community projects, papers or reports, presentations related to community-based teaching or community issues.

Service
1. Community projects, papers or reports, presentations related to community-based teaching or community issues.
2. University-related service through community-based teaching and learning that assists the university in addressing community needs.

While it is unlikely that all of these will be evident in any one faculty member’s vitae, a global assessment of these measures will assist in developing a profile of faculty roles in the community and the impact of service-learning on faculty scholarship.


Classroom Observation

Purpose

The purpose of classroom observation is to describe quantitatively and qualitatively the teaching methods, learning experiences, and interactions that take place in a community-based or service-learning course. Classroom observations can also provide indications of the integration of community focus within the academic content of a course and descriptions of how and to what extent it is integrated. When used as ongoing data collection in a course, classroom observations can display changes or lack of changes in faculty, student, and community roles in the course pedagogy and in the course content. Overall, classroom observations provide a powerful profile of pedagogy and curriculum in service- and community-based learning courses.

In this form of qualitative research, the researcher studies the subject in naturalistic settings, where everyday interactions take place. The natural environment of the subject is important to the qualitative researcher, who gains clues from the context in which the study takes place. Furthermore, this form of qualitative research is descriptive in nature, using words, stories, and metaphors to communicate findings. In observation studies, “human experiences are examined through detailed descriptions of the people being studied” (Creswell, 1994, p. 12).

Preparation

In preparation for classroom observations, the following sequence of steps is recommended:
1. Train observers in observation strategies (training includes practice observations in pairs to establish reliability).
2. Hold an orientation session for faculty to introduce the processes to faculty and students, introduce the observer and his/her role, and complete human subjects review or other permission forms if necessary.
3. The faculty member and observer come to agreement about which class sessions within a semester or quarter will be most representative of “typical” classes, interactions, and content. For example, observation during the showing of a film is not appropriate.
4. Observers review the narrative recording format for observation.

Administration

Observations should be conducted at regular intervals and be well coordinated with the participating faculty member. An example would be a set of five observations at two-week intervals in a 10-week course, during which an entire three-hour period is observed at each observation. The same observer should remain with the class for most of the observations, with the exception of an interrater reliability option that may be gained by using a different trained observer for one or two of the sessions.

Using the observation form, observers will collect the following three kinds of primary data:
1. Awareness and involvement of community: quantity and quality of interactions of community or community partner(s), direct quotations from students and faculty about community, and reference to community integrated in academic content
2. Teaching methods: influence of community-based learning on initial and evolving class format, organization, pedagogy, and faculty/student interactions
3. Philosophy of teaching and learning: initial and ongoing faculty/student/community roles, outcomes, pedagogy, curriculum, and interactions

The classroom observation form provides a format for gathering data from observations. Observers often keep their notations in separate journals or notebooks, but they should include all of the information deemed necessary to ensure consistency of data collection throughout observations. While the observer should note what is occurring in the classroom as far as content, the bigger picture includes the environment, the frequency of interactions between faculty, students, and/or community representatives, daily issues regarding service placement, and anecdotal information from all constituencies. Observers are encouraged to keep a journal as their means of reflecting on the process of data collection. While observations are meant to be as objective as possible, observer journals provide insights as to process and can also serve to clarify issues which may arise during the data analysis.


Analysis

Classroom observations yield rich and abundant data. The analysis should include the frequency of interactions between students and faculty over a period of time as well as data about changing roles of faculty, students, and community.


The process for analyzing the narrative data from classroom observation requires the same successive readings described in the analysis section for the faculty interview. From the readings, key phrases and themes emerge to be later categorized in patterns and in relation to the faculty concepts previously presented.
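Because the protocol calls for tracking interaction frequencies across a series of observations, it can help to transcribe the F/S codes from each observation form into a small table as the study proceeds. The sketch below is a hypothetical illustration only; the file name and column layout (one row per observed session, with counts of faculty-initiated and student-initiated exchanges) are assumptions about how a team might choose to transcribe its forms.

# Hypothetical summary of interaction counts across observed class sessions (Python).
# The CSV layout (date, faculty_exchanges, student_exchanges) is an assumed transcription
# of the F/S interaction codes recorded on the observation form.
import csv

def summarize_sessions(path):
    with open(path, newline="", encoding="utf-8") as f:
        rows = list(csv.DictReader(f))
    for row in rows:
        faculty = int(row["faculty_exchanges"])
        student = int(row["student_exchanges"])
        total = faculty + student
        student_share = student / total if total else 0.0
        print(f"{row['date']}: {total} exchanges, {student_share:.0%} student-initiated")

summarize_sessions("classroom_observations.csv")

Tabled or charted across the term, these counts give one simple indicator of whether student roles are becoming more active, to be read alongside the narrative data.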


Observation Form

Course _______________________________________
Date, Day _____________________________________
Time _________________________________________
Observer ______________________________________
# of students ___________________________________
Others ________________________________________
# of students who spoke __________________________

In the box, code interactions with an F for faculty and S for students; use \ to indicate the beginning or end of an exchange between one or more class participants.

Room Arrangement
_________________________________________________________________
_________________________________________________________________
_________________________________________________________________

Indicate with an X and record the time spent on the following class activities (format and organization).

Lecture ________________________
Individual Work _________________
Discussion ______________________
Presentations ____________________
Group Work _____________________
Reflection _______________________
Assessment ______________________
Question/Answer __________________

Narrative: Describe the relationships between faculty and students, among students themselves; use of teaching tools (handouts, audiovisual, etc.); mention of community (examples, anecdotes, questions, references, applications); and connections between community experiences and course content. [Use multiple pages for narrative.]


Teaching/Learning Continua

Purpose

The use of teaching/learning continua helps to describe the teaching/learning context, philosophy, and qualities of the teaching/learning approaches. The right side of each of the continua suggests a high level of interaction is occurring between the faculty and the students, while the left side of each is suggestive of a lower level of exchange between them. The continua offer another lens through which to observe service- or community-based learning courses. They are best used with other observational recording strategies, the observation form previously described, and narrative recordings of individual class sessions.

Preparation

For purposes of describing recommendations for administration, this section will focus on using the continua only for classroom observations. In preparation for using the continua, the following steps are recommended:
1. Observers are trained in the use of the continua concepts (training includes practice observations in pairs to establish reliability in defining and identifying concepts on the continua).
2. Faculty and observers collaboratively determine which class sessions within the term (semester or quarter) are most representative of the course content and class interactions.
3. Observers schedule the timing of the observations with ample time to complete the continua at the conclusion of each classroom observation.

Administration

The continua are intended for use at the end of a class session and are meant to capture the observer’s overall impression of the classroom interactions. They are not meant to be analyzed in great detail by the observer. Therefore, the time requirements for completing the continua are minimal but dependent on individual style of reflection (5–15 minutes). Before training and actual use of the continua, those studying service-learning courses need to engage in discussion of the concepts of the continua and to develop clear and accepted definitions.


Examples of classroom practices and situations will be helpful during the discussion. As a starting point, the following definitions and questions for probing the concept are provided.

Definitions of Teaching/Learning Contexts Used in the Continua

Commitment to others: Do the students and faculty seem to be empathetic and/or interested in other people’s needs or interests? Do they express a commitment to discovering community needs or interests or a similar commitment to their peers’ needs and interests?

Students’ role: Are the students actively involved in the teaching and learning processes? Do they make decisions about content, processes, and so on?

Faculty role: Is the faculty in a directive role of managing, ordering, instructing? Is there a sense of the faculty “in charge” with authority and control? Or is the faculty in a role of support, assistance, help, collaboration, and making resources available?

Learning orientation: Is the learning environment a collective one in which students and faculty work together, and are committed to helping the entire class learn? Or is the learning environment one in which there is more of an individual focus where each individual is directed to his or her own personal learning?

Pedagogy: The banking pedagogy refers to a teaching philosophy and process in which a faculty instructor deposits information in students who are expected to respond to occasional withdrawals (exams, etc.). The faculty is the source of information and understandings. The constructivist pedagogy refers to a teaching philosophy and process in which the faculty instructor facilitates experiences in which students construct their own meanings and learning.

Definitions of Teaching/Learning Qualities Used in the Continua

Theory—theory and experience: Does the course rely primarily on established theory, or is personal experience coupled and valued with theory as a foundation of course content?

Others’ knowledge—personal knowledge: Is published or expert material the primary and only validated source of information, or is personal experience and knowledge also validated as relevant?

Student as spectator—student as participant: Is the student on the sidelines as a passive listener who absorbs the information, or is the student playing an active role in the teaching/learning processes?

Faculty in control—shared control: Is the faculty in charge and in a position of control of course processes and decisions, or do students participate in those processes and decisions?

Student as learner—student as learner and teacher: Does the student remain in the traditional role of learner, or does he or she construct course content with the teacher by sharing experiences, raising issues for discussion, and providing information?

Faculty as teacher—faculty as teacher and learner: Does the faculty member remain in the traditional role of teacher, or does he or she share the construction of course content with the students and encourage student-directed teaching, consequently becoming a learner as well?

Individual learning—collective learning: Are the learning processes focused on each individual being directed to his/her own learning, or is the environment a collective one, with all committed to helping the class as a whole understand the course content?

Distinction clear between teacher and learner—distinction blurred between teacher and learner: Are the roles of the teacher and students distinctive and separate, or do teacher and students trade roles and move in and out of roles during the class sessions?

Answers—questions and answers: Is course material addressed in such a way that right answers are valued and content presented with certainty of information, or is material presented with issues and questions valued along with answers?

Certainty of outcomes—uncertainty of outcomes: Are there defined, inflexible outcomes for students, or are the outcomes flexible enough to be constructed and revised according to students’ needs and interests?

Common learning outcomes—individualized learning outcomes: Are the outcomes restricted to those set in advance or planned by the group as common for all students, or are the students afforded the opportunity to explore personal learning outcomes?

Ignorance avoided—ignorance a resource: Are questions and misunderstandings treated as a diversion from scheduled class content, or are they treated as an opportunity for new directions or potential for different understandings?

Focus-student needs—focus-student and community needs: Is the community service focused on student needs with respect to learning and interests, or are student needs and community needs considered and attended to equally?

Analysis

The best way to analyze the data from the teaching/learning continua is to use an empty continuum form as a tally sheet for each individual faculty member. For each concept, the dates of observations are placed in the appropriate place on each line to indicate the observed qualities or contextual descriptors. The finished tally sheet provides a visual of the course and of changes or lack of change. For a group of faculty, the individual placements on the lines can be collapsed to the mean or middle placement and then represented in a bar graph for each continuum’s concepts. The data can also be shown for the beginning of courses, midway points in courses, and end of courses to capture change or lack of change for a group of faculty.
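Collapsing placements to a mean for a group of faculty, as described above, is simple to script once the tally sheets have been transcribed. In the sketch below, the file name and column names are assumptions (one row per observation, recording the concept, the point in the term, and the 1-5 placement); the grouping by beginning, midway, and end of course mirrors the reporting suggested above.

# Hypothetical aggregation of observer placements (1-5) on each continuum concept (Python).
# The CSV columns (concept, point, placement) are an assumed transcription of tally sheets.
import csv
from collections import defaultdict
from statistics import mean

def group_means(path):
    placements = defaultdict(list)
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            key = (row["concept"], row["point"])  # e.g., ("Students' Role", "midway")
            placements[key].append(int(row["placement"]))
    return {key: mean(values) for key, values in placements.items()}

for (concept, point), m in sorted(group_means("continua_placements.csv").items()):
    print(f"{concept} ({point}): mean placement {m:.1f}")

The resulting means can then be charted, for example as the bar graphs mentioned above, with whatever graphing tool the campus already uses.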


Continuum of Teaching/Learning Contexts

Place an X on each continuum following the descriptions that best indicates how you would describe the teaching/learning context of the observed class.

Commitment to Others
LOW  1 _______ 2 _______ 3 _______ 4 _______ 5  HIGH

Students’ Role
PASSIVE  1 _______ 2 _______ 3 _______ 4 _______ 5  ACTIVE

Faculty Role
DIRECTIVE  1 _______ 2 _______ 3 _______ 4 _______ 5  FACILITATIVE

Learning Orientation
INDIVIDUAL  1 _______ 2 _______ 3 _______ 4 _______ 5  COLLECTIVE

Pedagogy
“BANKING”  1 _______ 2 _______ 3 _______ 4 _______ 5  CONSTRUCTIVIST

Adapted from the work of Jeffrey Howard, University of Michigan (1995)

Continuum of Teaching/Learning Qualities

Place an X on each continuum to indicate how you would describe the observed class.

THEORY  1 _______ 2 _______ 3 _______ 4 _______ 5  THEORY & EXPERIENCE

OTHERS’ KNOWLEDGE  1 _______ 2 _______ 3 _______ 4 _______ 5  PERSONAL KNOWLEDGE

STUDENT AS SPECTATOR  1 _______ 2 _______ 3 _______ 4 _______ 5  STUDENT AS PARTICIPANT

FACULTY IN CONTROL  1 _______ 2 _______ 3 _______ 4 _______ 5  SHARED CONTROL

STUDENT AS LEARNER  1 _______ 2 _______ 3 _______ 4 _______ 5  STUDENT AS LEARNER & TEACHER

FACULTY AS TEACHER  1 _______ 2 _______ 3 _______ 4 _______ 5  FACULTY AS TEACHER & LEARNER

INDIVIDUAL LEARNING  1 _______ 2 _______ 3 _______ 4 _______ 5  COLLECTIVE LEARNING

DISTINCTION CLEAR B/W TEACHER & LEARNER  1 _______ 2 _______ 3 _______ 4 _______ 5  DISTINCTION BLURRED B/W TEACHER & LEARNER

ANSWERS  1 _______ 2 _______ 3 _______ 4 _______ 5  QUESTIONS AND ANSWERS

CERTAINTY OF OUTCOMES  1 _______ 2 _______ 3 _______ 4 _______ 5  UNCERTAINTY OF OUTCOMES

COMMON LEARNING OUTCOMES  1 _______ 2 _______ 3 _______ 4 _______ 5  INDIVIDUALIZED LEARNING OUTCOMES

IGNORANCE AVOIDED  1 _______ 2 _______ 3 _______ 4 _______ 5  IGNORANCE A RESOURCE

FOCUS-STUDENT NEEDS  1 _______ 2 _______ 3 _______ 4 _______ 5  FOCUS-STUDENT/COMMUNITY NEEDS

Adapted from the work of Jeffrey Howard, University of Michigan (1995)


Faculty Survey

Purpose

The faculty survey is intended to describe faculty members’ perspectives, motivations, concerns, and attitudes on issues related to their experience teaching a service-learning course. The survey is based on a five-point Likert scale where faculty report their level of agreement regarding their service-learning course(s). The scale range includes strongly disagree, disagree, neutral, agree, and strongly agree. Topics assessed by the survey include faculty attitudes about service, community, and service-learning; the impact they perceive that service-learning has on their students and their scholarly work; and their motivation for incorporating service-learning into their courses. The faculty survey was developed through a process of literature review, survey of existing instruments, and discussions with faculty.

The information gained through the faculty survey is useful for purposes of planning faculty development programs and for attracting and recruiting faculty for service-learning or community-based learning courses. The instrument provides descriptions of various perspectives and experiences of faculty who incorporate service in their academic courses. These descriptions will yield understanding for planning and coordinating campus programs.

In addition to assessing faculty attitudes and perspectives, the faculty survey probes the impact of service-learning on faculty. The survey includes questions pertaining to the influence that service-learning has on a faculty member’s community involvement, teaching, and scholarship. These data are useful for assessing the impact of service-learning on both individual faculty and on the institution in general. As with the student section, two surveys for faculty are presented: a longer version, and a shorter version that can be scanned by institutional research resources for rapid reporting.

Preparation

Before administering the faculty survey, the following preparation steps are recommended.

1. Determine the purpose of instrument use. Decide whether the instrument is to be used in a pre/post assessment of change or in a post-test-only approach to describe the general attitudes and perceptions of faculty after they have taught a service-learning course.


2. Consider using other data-gathering strategies to complement use of the faculty survey to develop a more complete and useful profile of faculty perspectives and attitudes. This instrument is ideally used prior to conducting faculty interviews. Faculty syllabi and teaching materials will be useful additions to the data from the faculty survey. To gain a full picture of a course, the faculty survey will complement data yielded by the student surveys.
3. Determine appropriate scheduling of the instrument use. Schedule the administration of the survey in consideration of faculty time and convenience.
4. Solicit faculty consent and support for the instrument use. Well in advance of using the faculty survey, faculty should be informed of its purpose and their consent obtained, preferably in writing.

Administration

Once the preparation steps are complete, the following administration procedures are recommended for use of the faculty survey.

1. Faculty should be assured of anonymity, and anonymity should be maintained throughout the collection of data from the survey.
2. Faculty should be informed that the instrument will take 15 to 20 minutes to complete.
3. If the instrument is mailed to faculty, clear information should be included about returning the survey (timing and where to return the form).

Analysis

Data analysis can be conducted using statistical software such as SPSS. When assessing and comparing pre- and post-service-learning experiences, the analysis could include frequencies, descriptive statistics, chi-square tests, ANOVA, and factor analysis. Descriptive statistics and frequencies provide a baseline profile, including the mean, mode, and standard deviation for each item. Chi-square tests examine relationships between responses and faculty demographic characteristics. Factor analysis reduces items into closely related categories. ANOVAs are useful for exploring variation among faculty on single items or on groups of items that emerge from the factor analysis.
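The analyses above can also be run outside SPSS. The following is a minimal, illustrative sketch in Python of the descriptive, chi-square, and ANOVA steps; the file name (faculty_survey.csv) and column names (q4 through q7, first_cbl_course, years_teaching_band) are hypothetical placeholders rather than part of the instrument.

```python
# Illustrative sketch only: descriptive statistics, a chi-square test, and a
# one-way ANOVA for five-point Likert items (1 = Strongly Disagree ... 5 = Strongly Agree).
# File and column names are assumed placeholders for locally collected data.
import pandas as pd
from scipy import stats

df = pd.read_csv("faculty_survey.csv")          # one row per faculty respondent
likert_items = ["q4", "q5", "q6", "q7"]

# Descriptive statistics and frequency counts for each item
print(df[likert_items].describe())
for item in likert_items:
    print(item, df[item].value_counts().sort_index().to_dict())

# Chi-square test: is agreement on item q7 associated with first-time CBL teaching?
table = pd.crosstab(df["first_cbl_course"], df["q7"])
chi2, p, dof, expected = stats.chi2_contingency(table)
print(f"chi-square = {chi2:.2f}, p = {p:.3f}")

# One-way ANOVA: does agreement on q7 vary across bands of teaching experience?
groups = [g["q7"].dropna() for _, g in df.groupby("years_teaching_band")]
f_stat, p_val = stats.f_oneway(*groups)
print(f"F = {f_stat:.2f}, p = {p_val:.3f}")
```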


Community-Based Learning—Faculty Survey (longer version)

We would like to better understand the impact that community-based learning has on faculty. Please assist us by responding to the following questions.

I. First, we would like to know some information about you.

1. How long have you been teaching at a postsecondary level? ______________________
2. Was this your first community-based learning course? ______________________
3. The course number and title you taught: ______________________

II. We would like to gain your perspective about this community-based learning course. Please indicate your level of agreement with each statement.

4. The community participation aspect of this course helped students to see how the subject matter they learned can be used in everyday life.
5. The community work in this course helped students to better understand the lectures and readings in this class.
6. I feel that students would have learned more from this course if more time had been spent in the classroom instead of doing community work.
7. The idea of combining work in the community with university coursework should be practiced in more courses at this university.

(Response options for items 4–7: Strongly Disagree / Disagree / Neutral / Agree / Strongly Agree)









































III. The next set of questions relates to your attitude toward community involvement.

8. I was already volunteering in my community before this course.
9. The community participation aspect of this course showed me how I can become more involved in my community.
10. I feel that the community work being done through this class benefited the community.
11. I probably won't volunteer or participate in the community now that this class is finished.
12. The community work involved in this course helped me to become more aware of the needs in my community.
13. I have a responsibility to serve my community.


(Response options for items 8–13: Strongly Disagree / Disagree / Neutral / Agree / Strongly Agree)






























































IV. Next, we would like to know the influence of your service on your professional development.

14. Doing work in the community helped me to define my personal strengths and weaknesses.
15. Performing work in the community helped me clarify areas of focus for my scholarship.
16. Teaching a community-based learning course resulted in a change in my teaching orientation.
17. This community-based learning course is an important entry in my portfolio.

(Response options for items 14–17: Strongly Disagree / Disagree / Neutral / Agree / Strongly Agree)









































V. Next, we would like some of your personal reflections on this experience.

18. Most people can make a difference in their community.
19. I was able to develop a good relationship with the students in this course because of the community work we performed.
20. I was comfortable working with cultures other than my own.
21. The community work involved in this course made me aware of some of my own biases and prejudices.
22. Participating in the community helped me enhance my leadership skills.
23. The work we performed in the community enhanced my ability to communicate my ideas in a real world context.
24. I can make a difference in my community.

(Response options for items 18–24: Strongly Disagree / Disagree / Neutral / Agree / Strongly Agree)







































































VI. Finally, please answer some questions about the process of teaching a community-based course.

25. What was (were) your reason(s) for deciding to teach a community-based learning course? Please indicate all reasons that apply and rank them in order of importance (1 being most important).

   Need to try something new                     ______________
   Desire for increased relevance in courses     ______________
   Encouragement from colleagues                 ______________
   Resources ($) to support the course           ______________
   Faculty incentive money                       ______________
   Have taught these courses before              ______________
   For professional recognition                  ______________
   Curiosity                                     ______________
   Other: ____________                           ______________


26. How did you handle the logistics of your community-based learning course? Please check the most accurate response.

   ______  I made the arrangements and placements.
   ______  A graduate student who works with me made the arrangements and placements.
   ______  The graduate student and I worked together on the arrangements and placements.
   ______  Students handled their own placements.
   ______  The community representative handled the arrangements and placements.
   ______  Other: ______________

27. Now that this course is completed, my most serious concern about teaching a community-based learning course is: Please indicate all responses that apply and rank them in order of importance (1 being most important).

   Time constraints                                    ______________
   Coordination of placements                          ______________
   Supervision of students                             ______________
   Communication with community representative(s)     ______________
   Reduced time for classroom instruction              ______________
   Unpredictable nature of community work              ______________
   Assessment of students' learning and work           ______________
   Costs                                               ______________
   Other: ______________

28. Teaching a community-based learning course has had an impact on the following: Please indicate all responses that apply and rank them in order of importance (1 being most important).

   My research agenda                                              ______________
   My plans for publications and presentations (scholarly work)    ______________
   Other classes I teach                                           ______________
   My own personal service in the community                        ______________
   My relationships with faculty colleagues                        ______________
   My relationships with students                                  ______________
   My relationships with community partners                        ______________
   Other: ______________

Finally, please add any other comments you have about teaching courses where learning takes place in a community setting. (Please use the space provided or attach an additional sheet of paper.) Thank you for your insights regarding community-based learning!


Community-Based Learning—Faculty Survey (streamlined version)

We would like to better understand the impact that community-based learning has on faculty. Please assist us by taking 5 to 10 minutes to complete this survey, and return it to [directions personalized to institution].

I. First, we would like some information about you.

1. How long have you been teaching at the postsecondary level? _________ [number of years]

2. Approximately how many times have you taught community-based learning courses? ☐ Once    ☐ 2–5   ☐ 6–10   ☐ More than 10

3. Are there other faculty in your department/program teaching community-based learning courses?  ☐ Yes   ☐ No

4. What is the academic level of the students in this community-based learning course?
   ☐ Freshmen   ☐ Sophomore   ☐ Junior   ☐ Senior   ☐ Capstone   ☐ Graduate

II. Next, we would like to gain your perspective about this community-based learning course.



5. Mark the place on each of the following four scales to indicate how you would describe the student and faculty roles in the community-based classroom experience.

   Student as learner      ☐  ☐  ☐  ☐  ☐   Student as learner and teacher
   Student as spectator    ☐  ☐  ☐  ☐  ☐   Student as participant
   Faculty in control      ☐  ☐  ☐  ☐  ☐   Shared control
   Faculty as teacher      ☐  ☐  ☐  ☐  ☐   Faculty as teacher and learner

III. The next set of questions relates to your experience and concept of community involvement.

6. I had previous community volunteer experience prior to teaching my first community-based learning course.
7. I believe that the community work done through this class has benefited the community.
8. I will volunteer or participate in the community now that this class has finished.
9. The community work involved in this course has deepened my understanding of community needs.
10. I believe that as a faculty member I have a responsibility to serve my community.


(Response options for items 6–10: Strongly Disagree / Disagree / Neutral / Agree / Strongly Agree)




















































IV. Next, we would like to know the influence of your service on your personal and professional development. Please indicate your level of agreement with each of the following statements.

11. Performing work in the community has helped me to focus on specific areas for my scholarship.
12. Teaching a community-based learning course has resulted in a change in my teaching strategies.
13. I found that my relationship with the students was enhanced because of the community work we performed.
14. Participating in the community has helped me enhance my leadership skills.

(Response options for items 11–14: Strongly Disagree / Disagree / Neutral / Agree / Strongly Agree)









































Please add any other comments you may have.

V. Finally, we would like you to comment on future community-based learning courses.

15. Now that this course is finished, you may still have concerns about teaching community-based learning courses. Please mark any of the following that are concerns of yours.

   ☐ Time constraints
   ☐ Coordination of placements
   ☐ Supervision of students
   ☐ Assessment of students' learning
   ☐ Communication with community representative(s)
   ☐ Reduced time for classroom instruction
   ☐ Unpredictable nature of community work
   ☐ Costs
   ☐ Other (please specify) ___________________________

16. Reflecting on this community-based learning experience, what ideas do you have for your next community-based learning class to improve the overall experience for you, your students, and the community partners?

Thank you for your comments. Please return this by [insert date] to [insert relevant mailing address].


Chapter Five

COMMUNITY IMPACT

Why Assess Impact on the Community?

Service-learning is impossible without community involvement. Effective and sustainable service-learning depends on mutually beneficial partnerships between campus and community (Holland & Gelmon, 1998). Yet much assessment work, prior to our initial work at PSU, focused almost exclusively on assessment of impact on students. The goals of our assessment project included understanding multiple impacts and gathering information to improve service-learning. Thus, it was essential to assess community as a distinct constituency.

But how does one define community? There is no one "community." In fact, early on in our work we asked faculty to define their perceptions of "Who is the community?" and we received a wide variety of answers. Developing an understanding of perceptions of community and what that means to students, faculty, and the institution, let alone the community partner, is therefore essential. For our work we focused primarily on the community partner organization that participated in the service-learning experience for each individual course (recognizing that in some courses there were multiple community partners). Given the importance of the partner organization to the service-learning opportunity and experience, the focus of assessment was on impacts on their organization and their perception of the service-learning project. We looked to the partner organization to give us feedback regarding any impact on clients, not being so presumptuous as to try to make causal relationships between the work students did and changes in, for example, clients' health status, emotional well-being, job security, or housing stability. Clearly these indicators might be valuable information for the partner to have, but we did not feel they were relevant measures to focus on, given our goal of understanding the community-university collaboration and partnership that underlies service-learning programs. Service-learning is intensive and demanding work for community partners. Most important to our assessment was to understand partner perceptions of the impact of service-learning on their operations so we could identify needed improvements and ensure reciprocity.

Understanding Assessment of Impact on Community

Talking about "community" implies that the community is a single entity, a unitary concept, and a definable organization. In fact, issues addressed by students through a community-based learning experience are part of a much larger system that includes residents, government, law enforcement, business, housing, schools, health and social services, and economic development (Scholtes, 1997). Individuals who work in, interact with, or produce materials for these various sectors bring different perspectives and varying views of the sources of problems and potential solutions (Knapp, Bennett, Plumb, & Robinson, 2000). Efforts to take all of these perspectives into account in trying to assess impact on community may be formidable and might create overwhelming barriers to completion of assessment.

In a review of research on community as a key factor in service-learning, Cruz and Giles (2000) identify political, intellectual, and practical dimensions as obstacles to research on the community focus in the service-learning literature. The political


concern relates to questions about academic rigor in studying the community; the intellectual concern focuses on an inability to define community and therefore to define appropriate methodologies to study it; and the practical concern addresses the lack of resources and knowledge to pursue this line of inquiry. They reach the following conclusions about community and service-learning as supported by the literature:

• Service-learning contributes to community development.
• Service-learning bridges town-gown gaps.
• Service-learning offers benefits to community partners.

In addition to these conclusions, there is also the issue of community interests in student preparation. Partners see service-learning as a tool to attract students to civic service or to nonprofit careers—or at least to help students become "citizen professionals." Understanding this interest of the community partners is another important element of assessing impact from a community perspective.

One of the challenges faced by universities when working with communities is that there is often a chasm between the (unrealized) expectations and (mis)understandings of the community partners and the services/resources the university can provide (Wealthall, Graham, & Turner, 1998). A significant area of focus, therefore, for the university is to pay special attention to clarifying abilities and expectations and to ensure that students and faculty work closely with community liaisons to develop genuine understandings of each other's context and perspectives, and the ability to respond to assets and needs. Inevitably, community need is far greater than the capacity of the campus service-learning effort (Gelmon, Holland, Shinnamon, & Morris, 1998). The assessment challenge lies in clarifying what is reasonable to expect and accomplish within the service-learning activity, determining to what extent this has been accomplished, and gaining understanding of the barriers and facilitators of these accomplishments. Thus the unit of analysis is the partnership relationship itself, as well as the partner organization's perceptions of impact.

Assessment of community involvement in service-learning raises issues about methodologies that have not been answered to date (Gelmon, 2000a). Is there a difference in assessing the impact on the community as compared to the impact on the


community-university partnership? One must be able to define the community component relevant to the assessment and then describe the elements of the partnership.

One useful approach from which to build assessment of partnership relationships could be to rely upon the "Principles of Partnership" articulated by CCPH. These principles are one of the few examples in the public domain today, and they work well for partnerships across the higher education spectrum, even though they were initially articulated in the context of health professions education (Seifer & Maurana, 2000). These principles are:

1. Partners have agreed upon mission, values, goals, and measurable outcomes for the partnership.
2. The relationship between partners is characterized by mutual trust, respect, genuineness, and commitment.
3. The partnership builds upon identified strengths and assets, but also addresses areas that need improvement.
4. The partnership balances the power among partners and enables resources among partners to be shared.
5. There is clear, open, and accessible communication between partners, making it an ongoing priority to listen to each need, develop a common language, and validate/clarify the meaning of terms.
6. Roles, norms, and processes for the partnership are established with the input and agreement of all partners.
7. There is feedback to, among, and from all stakeholders in the partnership, with the goal of continuously improving the partnership and its outcomes.
8. Partners share the credit for the partnership's accomplishments.
9. Partnerships take time to develop and evolve over time.

A set of key factors for successful student/community partnership projects, identified through an educational collaborative addressing community health improvement (Knapp et al., 2000), offers another useful approach for thinking about assessment of community impact. The original factors, which focused specifically on health issues, have been edited to have broader relevance for a number of disciplines in higher education:


• Ensure relevant community data are available prior to student involvement so that all members of the partnership understand the community issues.
• Connect the institution and the community, so that faculty have knowledge of the community and the issues being addressed and can facilitate the connections between the students and the community and community representatives have knowledge of the academic institution and the issues being addressed in the service-learning activity.
• Jointly define target populations so that student projects focus on specific groups rather than the entire community.
• Members of the partnership understand the people to be served, and design and implement appropriate, client-sensitive approaches.
• Partners work together to identify appropriate, short-term projects that are doable in the time students have and contribute to the knowledge base of both the community organization and the students.
• Partnership members practice and model interdisciplinary teamwork, because community issues and actions are intrinsically interdisciplinary.

These factors clearly contribute to design of service-learning experiences, but could also then form the basis for articulating the focus of the assessment. While the emphasis here is on understanding community impact, the focus of each factor shows the interrelationship of student, faculty, and institutional perspectives with those of the community. These factors can in turn be linked to core concepts for understanding impact of service-learning on the community and impact of the community on service-learning.

Another approach that includes attention to learning is offered by Holland (2000b). She offers the following characteristics of sustainable partnerships that could serve as a framework for assessment:

• Joint exploration of separate and common goals and interests
• Creation of a mutually rewarding shared agenda
• Articulation of clear consequences for each partner
• Success measured in both institutional and community terms


• Shared control of partnership directions and/or resources
• Effective use and enhancement of community capacity
• Identification of opportunities for early success and regular celebration
• Focus on knowledge exchange, shared two-way learning and capacity building
• Attention to communication and open cultivation of trust
• Commitment to continuous assessment of the partnership itself, as well as to outcomes

Assessment of community impact can also develop from theories and concepts of community development and community-building. In addition to adopting an "assets" rather than "needs" approach (Kretzman & McKnight, 1993), community-building frameworks may offer insights into leadership, knowledge, creativity, and problem-solving capacities (Keith, 1998). There is little, if any, documented assessment literature using such frameworks, suggesting an opportunity for service-learning educators to team up with their colleagues in community development to more explicitly articulate methods for this area of assessment.

Some examples of models of assessment of community impact exist in the literature. The assessment of the impact of service-learning in health professions education, through a national demonstration program known as Health Professions Schools in Service to the Nation, incorporated one research question addressing impact on community-university partnerships, and another question addressing impact on the partners themselves (Gelmon, Holland & Shinnamon, 1998; Gelmon, Holland, Shinnamon & Morris, 1998). Some foundations have reported assessments aimed at understanding the impact of social intervention programs on community change (Annie E. Casey Foundation, 1999; Connell, Kubich, Schorr, & Weiss, 1995; Petersen, 1998). Early evidence about the use of a model being referred to as the 3-I Model (initiator, initiative, impact) suggests potential assessment applications for understanding community change (Clarke, 2000).

Future work on assessment of community impact may be aided by the approach recommended by Cruz and Giles (2000). They suggest (a) using the community-university partnership as the unit of analysis (Seifer & Maurana, 2000); (b) giving serious attention to the principles of good practice for service-learning (Sigmon, 1979; Honnet & Poulsen,


1989) regarding community input, reciprocity, and partnership; (c) using action research (Harkavy, Puckett, & Romer, 2000); and (d) focusing on an assets-based approach (Kretzman & McKnight, 1993). The assessment model presented here incorporates elements of all four strategies. It is unique in terms of the deference it gives to the community partners and the importance of their articulation and interpretation of any impact. Partnerships must be assessed as part of the overall assessment of the impact of service-learning.

Assessment Matrix for Community

An assessment matrix for understanding the community constituency is presented in Table 5.1. This matrix is based upon experiences in several programs and is presented as a synthesis of best practices based upon those evaluations (Driscoll et al., 1998; Shinnamon, Gelmon, & Holland, 1999; Gelmon, McBride, et al., 1998). The concepts (variables) are presented in two sections: those most applicable to the community partner organization itself and those related to the community-university partnership. In considering those concepts related to the community partner organization, individuals designing the assessment must be cautious to avoid identifying concepts that might be interpreted as part of a performance review of the organization. Such a review must not be the focus in assessing impact of service-learning, but there may be concepts that are appealing to the university participants but would be viewed as threatening or intrusive by the community partner. Thus, the three concepts presented focus explicitly on how the participation of the partner organization in the academic activity affects the partner. Based on our experiences, there are three main areas on which to focus.

Capacity to Fulfill Organizational Mission

The service-learning activity may affect the types of services offered, the number of clients served, and the variety of activities offered. The number of students who can be accommodated by the organization might also change and have a relationship to capacity. Organizations that are primarily volunteer-driven are able to increase their organizational capacity significantly through service-learning partnerships. Finally, through its interaction with university representatives, the organization may gain


insights into assets and needs (of itself, its clients, or the university) that may affect organizational capacity or program strategies.

Economic Benefits

Through the participation of faculty and students in the service-learning interaction, organizations may derive economic benefits or cost burdens in terms of resource utilization (human, fiscal, information, or physical resources). Sometimes organizations identify new staff (generally from among the student participants) and are spared the time and expense of a costly search process. The community-university collaboration may also facilitate identification of new funding opportunities for which the community organization may apply (with or without the participation of the university), again contributing to economic benefits. Another benefit is the completion of projects with the addition of new expertise that the organization might not normally have readily available (e.g., graphic design, diversity training, development of marketing materials). Such benefits are often one of the motivators for community organizations to partner with academic institutions.

Social Benefits

Through the collaboration with the university, the community organization may identify new connections or networks—sometimes with individuals, and sometimes with other community organizations (particularly if the university brings together multiple community partners in community advisory committees or through other cross-organization collaborations). Organizations also often report an increase in number of volunteers, when students continue their involvement after the academic project (and often bring their friends and families to the volunteer experience). There may also be an impact on community issues (e.g., neighborhood policing, improved lighting, lead abatement, accessible immunization clinics) as a result of the service-learning activity, again offering social benefit for the organization and its community. Assessment can help community partners to think about new and sometimes better ways to work with volunteers in general (not just students).

The second set of concepts relates to the community-university partnership itself. There are challenges to this aspect of assessment because of the somewhat intangible nature of partnerships and


TABLE 5.1
Matrix for Community Assessment

Columns: What do we want to know? (concepts) | How will we know it? (indicators) | How will we measure it? (methods) | Who/what will provide the data? (sources)

Variables about community partner organization

Capacity to fulfill organizational mission
  Indicators: Types of services provided; number of clients served; number of students involved; variety of activities offered; insights into assets and needs
  Methods: Survey; interview; focus groups; documentation review; critical incident review
  Sources: Community partner; students; faculty; advisory committees; governing board

Economic benefits
  Indicators: Identification of new staff; impact on resource utilization through services provided by faculty/students; identification of funding opportunities
  Methods: Interview; focus groups; documentation review
  Sources: Community partner; students; faculty; governing board

Social benefits
  Indicators: New connections or networks; number of volunteers; impact on community issues
  Methods: Interview; focus groups; documentation review
  Sources: Community partner; students; faculty; governing board

Variables about community-university partnership

Nature of community-university relationship (partnership)
  Indicators: Creation of partnerships; kinds of activities conducted; barriers/facilitators
  Methods: Interview; documentation review; critical incident review
  Sources: Community partner; faculty; governing board

Nature of community-university interaction
  Indicators: Involvement in each other's activities; communication patterns; community awareness of university programs and activities; university awareness of community programs and activities
  Methods: Interview; focus groups; documentation review
  Sources: Community partner; students; faculty; advisory committees

Satisfaction with partnership
  Indicators: Perception of mutuality and reciprocity; responsiveness to concerns; willingness to provide feedback
  Methods: Interview; focus group; survey
  Sources: Community partner; faculty; governing board

Sustainability of partnership
  Indicators: Duration; evolution
  Methods: Interview; survey; critical incident review
  Sources: Community partner; faculty; governing board


the difficulties in defining what can be assessed. As a result, some of these concepts reflect documentable indicators, and others relate to the processes that support and contribute to the partnership.

Nature of Community-University Relationship (Partnership)

The core of this concept is a description of the process by which partnerships are established. This is illustrated by gaining the partners' perspectives on the kinds of activities conducted and of the barriers and facilitators to both establishing the partnership and engaging in these activities. Investigation of the nature of partnerships can reveal important insights about mutual respect and common goals and can highlight many of the components cited in the various principles/characteristics of partnerships (Seifer & Maurana, 2000; Knapp et al., 2000; Holland, 2000b).

Nature of Community-University Interaction

A core element of partnerships is the nature and kind of interactions that take place—and ideally there are multiple interactions, rather than simply the act of students going to the partner organization and working on a specific activity. Interactions may be seen where community partners go to campus to participate (in classroom-based reflection sessions, or in program planning activities for example), and where campus representatives go to the community organization (to attend community advisory group meetings, to volunteer, or to participate in community activities). The partnership might also focus on very specific work, such as web design, multimedia presentation development, or brochure production. Communication is an essential element of the partnership, and attention should be given to the methods and patterns of communication. Finally, interactions can be understood through description of levels of awareness the partners have about each other's programs and activities.

Satisfaction With Partnership

Satisfaction is essential to the development, implementation, and maintenance of a partnership. The key elements here are the perceptions of mutuality of effort and reciprocity in activity (Gelmon, Holland, & Shinnamon, 1998). The assessment of this concept


requires creation of a safe environment in which participants in the partnership can offer praise as well as express concerns without fear of reprisal. Cultural norms regarding expression of satisfaction must be taken into account when attempting to collect data on this concept. Another element of satisfaction can be assessed through understanding of responsiveness to concerns—again, responsiveness by all participants in the partnership. Assessments must be designed to take into account these multiple perspectives (and avoid falling into the trap of only considering the university’s perspective). This concept offers considerable opportunity for unanticipated findings regarding sources of satisfaction.

Sustainability of Partnership

Significant effort is invested in creating partnerships and, if they are successful, there is usually a desire to sustain these efforts. Sustainability can be understood through gaining insights into the duration of partnerships and the evolution they go through. In particular, identification of key events throughout the partnership that created barriers to collaboration or accelerated collaborative efforts can provide useful insights into the strengths and benefits of the partnership. In assessing sustainability, it is essential to try to understand the partners' intention in sustaining a relationship and to investigate both the time invested to build the partnership and to maintain it over time. In addition, insights can be gathered into how the partners recognize if the partnership is not meeting their goals.

Strategies for Assessing Impact on Community

A key issue in engaging the community partner in assessment is to be respectful of their time, obligations, and resources. Students can be required to spend a substantial amount of time writing a reflective journal, but one cannot expect that commitment of community partners. Similarly, faculty can be expected to convene at the researcher's convenience for interviews or focus groups, but the researcher must go to the community partner for an interview (at the partner's convenience). There must also be sufficient benefits offered for community partners to come together at a central location during peak working hours for a focus group. Careful attention must be given to selecting methods for assessment of community impact


that create the least burden of evaluation and provide the most benefit for the university and the community partner. This may result in some compromises in terms of the kinds of data that can be collected, but this will be offset by the increased responsiveness and enhanced quality of data contributed. Community partners value the opportunity to provide feedback and often report that the invitations to participate in assessment activities help them to feel that their role in the university’s activities is a valued one. Partners are sometimes intimidated by participating in university discussions if they themselves do not have the academic credentials evident among faculty and institutional administrators. Such concerns need to be carefully addressed, so that partners do feel welcome and appreciated. A distinct challenge is to create the appropriate communication environment where community partners feel able to speak candidly about experiences, offering praise but also sharing critical and/ or reflective observations. Partners are usually eager to praise, in particular because they are often grateful for the service provided and the benefits they derive from the partnership. Some partners, however, are reluctant to be critical for fear of “retribution” and potentially losing the partnership or jeopardizing their relationship with the faculty member. There may also be cultural norms about not criticizing someone who is helping you. Attention must be given to encouraging candid feedback from the partners with the emphasis on improving the work done together and with adequate assurances that criticism will not lead to any negative actions. One of the particular benefits accrued from incorporating a community voice is to gain an additional perspective from outside the university. Community partners can offer incredibly valuable insights about, for example, student preparation for the service experience that may either validate or obviate what the faculty member has reported about how they have prepared the students. Similarly, students may report their perceptions about the value of their service, and these may again be similar or opposite to the perceptions the community partner has of the value of their contribution. This should not be interpreted as suggesting that the students and faculty are always positive and the community partner opinion is always contradictory; sometimes the community partner expresses much greater satisfaction than either students or faculty have observed! Observations of students in community settings


help to highlight the benefits and challenges present in these relationships and may also highlight areas where changes are needed in activity design, communications, or other areas of interaction. Specific comments regarding the various methods offered for assessment of community impact are similar to those found in preceding sections for the specific techniques. Thus the reader is referred to the previous discussions of strategies for assessment to gain insights regarding methods such as surveys, focus groups, and interviews. A few particular observations are noted in the following sections.

Community Partner Survey

Faculty need to be contacted early in the academic term in order for the researcher to obtain the contact information for each partner. Faculty must also be assured that their community partner information is confidential and will not be shared (unless they have no objections); some faculty are concerned about "losing" their partner. A copy of the survey should be shared with the faculty so they know what questions are being asked of the partner.

Community Observation

While this can provide rich data, it may be logistically difficult to organize. Sometimes observations are viewed as intrusive, and care should be taken to avoid this. Such difficulties are offset, however, by the opportunity to better understand the context in which students are working, in particular with respect to logistics, communication, and the relevance of the course project to agency activities. The contact made during an observation may also facilitate community partner responsiveness to a subsequent survey or a request for an interview or focus group participation.

Community Focus Group

Focus groups of community partners can be difficult to organize in terms of finding a time that is convenient. Convening the focus group over a meal and on campus—perhaps in conjunction with a campus tour if these are new partners—can help to entice partners to attend. One of the side benefits of a focus group of partners is the additional networking that may occur among the partners—those who already have

27-08-2018 15:59:55

102  Assessing Service-Learning and Civic Engagement relationships and those who are meeting other community members for the first time. Hosting a focus group on campus helps the university to convey its appreciation to the partners, particularly if the group meets in an environment that is welcoming.

Community Partner Interview Interviews of community partners may be threatening to faculty, so care should be taken to ensure that faculty understand the content of the interview protocol and the purpose of the interview itself. Interviews also take time, which is a commodity that many partners feel is lacking; however, some partners may prefer an interview at their site rather than having to travel to participate in a focus group. Thus the interview must be very focused in order to obtain maximum benefit for the institution, the faculty member, and the partner. The interview can provide valuable information to understand the complexity of partnerships and allows the community partner to express the extent to which they feel a part of the educational process. A benefit of the interview can be to convey to the partner that their relationship with the university has the potential to go beyond one course and one faculty member, if they wish and it is appropriate. They may identify other opportunities

ASCV.indb 102

for student and faculty involvement in their organization as a result of the interview.

Concluding Thoughts The community presents perhaps the most challenging aspect of assessment of the impact of servicelearning for two main reasons. First, as described previously, it is difficult to define what we mean by community and researchers must clearly embrace a definition that articulates what aspects of organization, clients, and larger social systems will be included in the assessment. Second, the community is its own agent, and not under the oversight of the university, so the ability to require participation in assessment and link it to any sort of rewards or punishments is negligible. Thus the researchers face the challenge of creating an assessment plan which will get community partner buy-in—in terms of methods, time commitment, and potential benefit of the results. Despite these two challenges, insights from the community partners and about community-university partnerships provide rich and essential information for understanding the overall impact of service-­learning. The following pages present examples of assessment methods for understanding community impact.

27-08-2018 15:59:55

Community Impact  103

Strategies and Methods: Community Community Observation Community Focus Group Community Partner Interview Community Survey

Community Observation Purpose

The community observation protocol offers a set of guiding questions and focus areas for observing faculty and students working in the community as part of a service- or community-based learning course. The purposes of such observations are: • To describe the character and content of interactions between students/faculty and the community partner • To capture the dynamics of the community service experience, that is, the roles of students, faculty, and community partners • To document student learning in the community • To gather data on services rendered to the community (number of students, hours spent, kinds of services, clients served, etc.) • To provide descriptive documentation of the partnership

Preparation In preparation for observing in community settings, the following sequence is recommended: 1. Train and orient observers with practice observations in pairs to establish reliability. 2. Schedule time for introduction of assessment to students and community partners, introduction of observers and their roles, and completion of human subjects review protocols. 3. Involve faculty in determining which community settings and which days and times are most appropriate for observation. Observations should be planned with the faculty and the community partner.

Administration Observations should be conducted infrequently (once or twice during an academic quarter/semester)

ASCV.indb 103

and be well coordinated with both the faculty and the community partner. Using the community observation protocol the observer will collect information about the following: • Setting • Roles of students, faculty, and community partners • Interactions, communication, and activities taking place during community service • Concerns expressed by students, faculty, and community partners • Accomplishments, tasks, or service activities • Climate (mood, affect) The community observation protocol provides questions to be answered by the observer and focus areas for the observer to notice. It does not provide a format for recording observations. The intent is for observers to record narrative data in a journal or notebook. The narrative data should include direct quotations, specific examples, and (as much as possible) a record of what happens during the observation period. During actual recordings, the observer should maintain a neutral stance and try to avoid interpretation and critique. After writing a description of what was observed, the observer may write a summary of the observation. In the summary, it is appropriate to offer interpretation, raise questions, relate the observation to other information, and link observations to key study concepts.

Analysis Community observations contain rich data. Analysis of observational data is a complex process that requires a series of readings. The first reading is done to gain an overview of the data. The second reading is intended to surface themes or patterns in the data. The third and fourth readings are intended to confirm the themes or patterns, identify additional ones, and begin to organize the data within the themes or patterns. The protocol questions and foci will emerge from the observational data as themes or patterns along with additional unanticipated ones. A format similar to that for reviewing and coding focus groups or interviews (see previous sections) can be used.

27-08-2018 15:59:55

104  Assessing Service-Learning and Civic Engagement

Community Observation Protocol 1. Describe the setting: date of observation, location, arrangement of space, environment, mood, pace, and other factors. 2. Describe who is present and their apparent roles. 3. What actions are students taking (observer, leader, participant)? What actions are faculty taking? What actions are the community partners taking? 4. Describe the communications/interactions and indicate the categories of individuals involved (e.g., ­students, partners, clients). 5. How does the community activity end? What sort of summation occurs (“next time, we will do some . . . ,” “good-bye,” or nothing)? 6. What accomplishment(s), task(s), or service did you observe? 7. Were concerns expressed by students? By faculty? By community partner? What were they (provide descriptions of situations)? 8. Please add any other relevant observations.

ASCV.indb 104

27-08-2018 15:59:55

Community Impact  105

Community Focus Group

Administration

Purpose

It is essential that the facilitator be trained in focus group methods. Try to have the person arranging the focus group present at it, either as the note-taker or facilitator. This helps provide a personal connection for the partners. Logistical matters such as the following can contribute to a successful focus group:

The purpose of a focus group with community partners is to use a facilitated group discussion to learn more about the experience of the partnership from the perspective of the community and to encourage reflection on ways to improve partnerships. Focus groups may also introduce community partners to each other and contribute to building social networks. A focus group usually lasts between one and one and a half hours and provides rich and specific information for analysis of research themes.

Preparation In preparation for conducting a focus group, the following sequence is recommended: • Identify a trained focus group facilitator and at least one observer/note-taker. This notetaker is responsible for ensuring equipment works throughout the focus group and for taking notes of nonverbal communication. • Recruit five to eight community partners to attend a focus group, working through faculty teaching service-learning classes. • Establish a time and place convenient for the partners. • Hold the meeting in a quiet room suitable for a circular seating layout. • Ensure the availability of good-quality recording equipment. • Arrange for parking for partners and provide maps of the campus. • Send a letter of instruction to partners and stress on-time arrival.

ASCV.indb 105

• Provide name tags for the partners (unless anonymity is critical). • Arrange participants in a circle or semi-circle. • Facilitator opens session, describes purpose, and gives ground rules using established script. • Describe the role of the note-taker, emphasizing that they are there to provide a back-up record should the recording not be audible. • Participants should introduce themselves. • The introductory message in the protocol should be read to the community partners prior to beginning the focus group questions.

Analysis Recordings and notes from focus groups must be transcribed as soon as possible after the session. Focus groups generate a large body of rich, textual data. Analysis consists of organizing the data into meaningful subsections, initially according to the questions posed. Interpretation involves coding the data (identifying meaningful phrases and quotes) and searching for patterns within and across subsections. For a detailed discussion of analysis of focus group data, see Morgan (1993, 1997, 1998).

27-08-2018 15:59:55

106  Assessing Service-Learning and Civic Engagement

Community Focus Group Protocol Introduction

The purposes of the focus group are twofold: to understand the impact of the partnership on the communitybased organization and to collect feedback, positive and negative, that will assist the university in improving partnership activities in the future. The discussion is recorded for the purpose of capturing detail, but all comments are confidential and never attributed to individual participants. As participants you can make the focus group successful by being both candid and as specific as possible when discussing different issues. A candid focus group will help the university document the effects of its efforts, recognize strengths and weaknesses of its outreach efforts, and identify areas where it can improve. As facilitator, I will offer no opinions; my role is to guide you through a conversation based on a set of relevant questions. I will try to make sure that everyone participates and that no one dominates the discussion. Please be sure to speak one at a time so the recording will be clear. During this discussion, please be brief and specific. Where there is disagreement, you should talk about your different perspective, but we will not spend time pressing for consensus or reaching agreement. The purpose is not to reach a common view, but to learn about all the possible views.

Questions 1. Please introduce yourself and briefly describe the nature of your partnership with the university. (10 minutes) 2. What went well? What factors contributed to successful outcomes? What was the most important factor in achieving success? (10 minutes) 3. How would you describe the benefits of the partnership from your perspective? Any economic benefits? What was the value of the outcome? Any new insights into operations? Was there any impact on capacity to serve clients? (10 minutes) 4. How would you describe the burdens (if any) of the partnership? [Probe: Demands on time or staff.] (10 minutes) 5. What obstacles or barriers affected the partnership? [Probe: How did you cope with these?] (5 minutes) 6. What would you do differently next time? What one thing would you change? (5 minutes) 7. What might the university do differently next time? What would you change about the university if you could? (10 minutes) 8. What do you know about the university that you didn’t know before? What do you wish you knew more about? (10 minutes) 9. How would you describe this experience to a colleague in another community organization or agency? What would you emphasize? (10 minutes) 10. The final thing we will do is encourage you to reflect again on your experience of working with the university. Reflect back over the project period and over this discussion. What’s the most important thing you’d like the university to hear from you? What have we not discussed? (10 minutes) 11. Are there any other comments you would like to share? Total time: 1 hour, 30 minutes Thank participants.

ASCV.indb 106

27-08-2018 15:59:55

Community Impact  107

Community Partner Interview

Administration

Purpose

The administration of interviews should be consistent across all interview subjects. The following are some guidelines:

Community partner interviews are intended to foster a one-on-one conversation with a community partner to explore their perspectives on the experience of working with the university. This instrument could be used to assess a wide variety of communityuniversity interactions.

Preparation Schedule one-hour interviews in locations and at times convenient to the community partner. In advance, describe the purpose of the interview so the partner may reflect on issues of impact prior to the interview session.

ASCV.indb 107

• • • • • • •

Start on time Introduce yourself and your role in the project Explain the purpose of the interview Assure confidentiality Stress the importance of candor Take notes or ask permission to record Follow protocol carefully and keep probes neutral

Analysis Transcribe notes and/or recording immediately. Code transcripts for key words and themes. Organize these into patterns and compare to research variables.

27-08-2018 15:59:55

108  Assessing Service-Learning and Civic Engagement

Community Partner Interview Protocol Let’s begin with some basic information: 1. Please provide a brief overview, from your own perspective, of the partnership project in which your organization participated. 2. Why did you get involved in this partnership? How did it come about? Let’s talk about the outcomes of the project: 3. What were your expectations? Did you have specific goals? Were your expectations met? 4. What would you say was the key to success? What went particularly well, and why? 5. What obstacles/barriers did you encounter and how did you deal with them? We’re interested in the impact of the project on your organization: 6. What were the benefits to your organization (social, economic, impacts on staff, insights about operations, capacity to serve clients)? 7. Knowing what you know now, what would you do differently that would make the partnership go better? Thinking about the university’s role in the partnership: 8. What should the university do differently next time? The final thing we will do is to encourage you to reflect again on your experience of working with the university. Reflect over the project period and over this discussion: 9. What is the most important thing you’d like the university to hear from you? 10. What relationship, if any, do you anticipate you will develop/maintain with the university in the future? Thank participants.

ASCV.indb 108

27-08-2018 15:59:55

Community Impact  109

Community Survey Purpose

The community survey is intended to describe community partners’ perspectives, motivations, concerns, and attitudes on issues related to their experience with students through a service-learning course. The survey is based on a five-point Likert scale where partners report their level of agreement regarding their experience with students and faculty. The scale range includes strongly disagree, disagree, neutral, agree, and strongly agree. Topics assessed by the survey include the community partner’s observations about interactions with the university, the challenges encountered, the effects of the interactions, the partner’s influence on the university, and the partner’s overall satisfaction with their connection to the university. The community survey was developed through a process of literature review, survey of existing instruments, and discussions with community partners and faculty. The information gained through the community partner survey is useful for purposes of planning programs to orient community partners, faculty, and students to working together on service-learning or community-based learning courses.

Preparation
Before administering the community partner survey, the following preparation steps are recommended.
1. Determine the purpose of instrument use. The decisions include determining if the instrument is to be used in a pre/post assessment of change or in a post-test only approach to describe the general attitudes and perceptions of community partners after they have participated in a service-learning course.
2. Consider using other data-gathering strategies to complement use of the community partner survey to develop a more complete and useful profile of partner perspectives. This instrument is ideally used prior to conducting community partner interviews or focus groups. To gain a full picture of a course, the community partner survey will complement data yielded by the student and faculty surveys.
3. Determine appropriate scheduling of the instrument use. Usually the best time is soon after the course has ended.
4. Solicit partner consent and support for the instrument use. Well in advance of sending out the community partner survey, partners should be informed of its purpose and their consent established, preferably in writing.

Administration
Once the preparation steps are complete, these administration procedures are recommended for use of the community partner survey.
1. Partner anonymity should be assured and maintained throughout the collection of data from the survey.
2. Partners should be informed that the instrument will take 15 to 20 minutes to complete.
3. Clear information should be included about returning the survey (timing and where to return the form).

Analysis
Data analysis can be conducted using SPSS or similar statistical software. In the case of assessing and comparing pre- and post-service-learning experiences, the analysis could include frequencies, descriptive statistics, chi-square tests, ANOVA, and factor analysis. Descriptive statistics and frequencies serve as the base of the analysis, providing the mean, mode, and standard deviation for each item. Chi-square tests examine associations between categorical partner characteristics and responses. Factor analysis reduces items into categories of closely related items. ANOVA is useful to explore variation among partners on either single items or groups of items that may arise from the factor analysis.
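To make these steps concrete, the following is a minimal illustrative sketch (not part of the original survey materials) showing how the same analyses could be run in Python with pandas and SciPy rather than SPSS; the column names and data are hypothetical.

    import pandas as pd
    from scipy import stats

    # Hypothetical coded responses: one row per partner, Likert items coded 1-5.
    df = pd.DataFrame({
        "partner_type": ["Nonprofit", "Public", "Private", "Nonprofit", "Public", "Private"],
        "satisfaction_pre": [3, 2, 4, 3, 2, 4],
        "satisfaction_post": [4, 3, 5, 4, 3, 4],
    })

    # Frequencies and descriptive statistics (mean, mode, standard deviation).
    print(df["satisfaction_post"].value_counts().sort_index())
    print(df[["satisfaction_pre", "satisfaction_post"]].describe())
    print(df["satisfaction_post"].mode())

    # Chi-square test: is reported improvement associated with partner type?
    df["improved"] = df["satisfaction_post"] > df["satisfaction_pre"]
    contingency = pd.crosstab(df["partner_type"], df["improved"])
    chi2, p, dof, expected = stats.chi2_contingency(contingency)
    print(f"chi-square = {chi2:.2f}, p = {p:.3f}")

    # One-way ANOVA: does post-course satisfaction vary across partner types?
    groups = [g["satisfaction_post"].to_numpy() for _, g in df.groupby("partner_type")]
    f_stat, p_anova = stats.f_oneway(*groups)
    print(f"F = {f_stat:.2f}, p = {p_anova:.3f}")

    # Factor analysis of a full item battery would require an additional package
    # (e.g., factor_analyzer) and far more respondents than shown here.

With only a handful of partners, as here, these tests are illustrative only; the same workflow scales to a full set of returned surveys.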


Community-Based Learning—Community Partner Survey

We would like to better understand the impact that community-based learning has on our community partners. Please assist us by taking 5 to 10 minutes to complete this survey, and return it to [directions personalized to institution].

I. First, we would like some information about you.

1. How long have you been working with our university?
☐ Less than 1 year  ☐ 1–3 years  ☐ More than 3 years

2. What is your organizational status?
☐ Public OR ☐ Private
☐ For-profit OR ☐ Nonprofit

3. What are the benchmark areas addressed by your organization? (Check all appropriate)
☐ Education  ☐ Environment  ☐ Housing  ☐ Public Services  ☐ Safety  ☐ Health

II. The next set of questions relates to your most recent experiences with our university.

4. How did your interactions with the university influence your capacity to fulfill the mission of your organization? Mark any that apply.
☐ New insights about the organization/its operation
☐ Increase in number of clients served
☐ Enhanced offerings of services
☐ Increased leverage of financial/other resources
☐ New connections/networks with other community groups
☐ Changes in organizational direction
☐ Increases in number of services offered
☐ No influence
☐ Other influences (specify) ______________________

5. What are some of the challenges you encountered? Mark any that apply.
☐ Demands upon staff time
☐ Project time period insufficient
☐ Students not well prepared
☐ Number of students inappropriate for size of organization
☐ Mismatch between course goals and organization
☐ Little contact/interaction with faculty
☐ Students did not perform as expected
☐ Other (please specify) ______________________

6. What were some of the economic effects of your work with the university? Mark any that apply.
☐ Increased value of services
☐ Increased organizational resources
☐ Completion of projects
☐ Access to university technology and expertise
☐ New products, services, materials generated
☐ Increased funding opportunities
☐ Identification of new staff
☐ Identification of additional volunteer expertise
☐ Other (please specify) ______________________

7. In what ways do you believe that you are able to influence the university as a result of your connection with one of our courses? Mark any that apply.
☐ Influence on course content
☐ Influence on university policies
☐ Influence on faculty awareness of community
☐ Influence on student learning experience
☐ Other (please specify) ______________________

8. As a result of your connection to this university course, how has your awareness of the university changed? Mark any that apply.
☐ I learned more about university programs and services
☐ I know whom to call on for information and assistance
☐ I am more involved with activities on campus
☐ I have an increased knowledge of university resources
☐ I have more interactions with faculty and administrators
☐ I have taken or plan to take classes at the university
☐ Other (please specify) ______________________



9. Do you plan to continue working with the university in this or another activity?
☐ Yes  ☐ No

III. Please rate your level of satisfaction with your connection to a university course in the following areas.
(Rate each item on the scale: Strongly Disagree / Disagree / Neutral / Agree / Strongly Agree.)
10. Overall communication with students and faculty
11. Level and quality of interaction with students/faculty
12. Quality of student work
13. Feedback and input into planning of experiences
14. Scope and timing of activity
15. Level of trust with faculty and students

16. How did you handle the logistics of your community-based learning course? Please mark the one most accurate response.
☐ I made the arrangements and placements.
☐ The faculty member made the arrangements and placements.
☐ A graduate student made the arrangements and placements.
☐ We handled the arrangements and placements collaboratively.
☐ Students handled their own placements.

17. What was the best aspect of this experience for you?

18. What aspects of the experience would you change?

19. Please add any other additional comments.

Thank you for your comments. Please return this by [insert date] to [insert relevant mailing address].


Chapter Six

INSTITUTIONAL IMPACT

Why Assess Institutional Factors?

As more and more institutions create service-learning opportunities for their students, the intensive influence of organizational context becomes clearer. From the earliest stages of campus discussion about the potential role of service-learning, institutional factors affect decision-making at every level and every stage of operations. In addition, some institutional explorations of service-learning begin with executive leadership (top-down) and some begin with faculty initiative (bottom-up). In those cases, a campus may struggle to foster communication and agreement across the organization on the level of commitment to service-learning. Assessment can be a tool for addressing internal obstacles to service-learning, fostering communications and shared understanding, and identifying areas where institutional change is needed.

Service-learning programs are always strongly influenced by their institutional environments. The impact of organizational context on service-learning and engagement endeavors means that systematic assessment of institutional factors can play an extremely important role in facilitating campus commitment by providing relevant and neutral data to inform decision-making and reduce obstacles. Consider the potential uses of institutional assessment findings by different institutional sectors. Assessment is useful to senior administrators seeking to expand faculty interest and involvement by providing evidence that service-learning enhances student learning, leverages financial resources, enhances institutional reputation and community relations, and improves faculty satisfaction with their careers. Assessment of the institutional context can also be useful to faculty governance leaders or departmental chairs by providing evidence that service-learning is worth doing, assisting in documenting activities for consideration in promotion and tenure decisions, and convincing administrative leaders to support and invest in service-learning programs. Individual faculty can use assessment to show an impact beyond their individual service-learning course or to make the case with other faculty or administrators for the value of service-learning. Assessment of institutional factors also helps ensure quality and consistency in the organization's design of service-learning experiences, so that the institution's overall relationship with the external community is enhanced.

Even when service-learning courses are self-initiated by individual faculty, the institutional context will have an impact on the experience and on community perception. For example, all faculty members and students carry the aura of the institution with them when they encounter members of the external community and engage in an exchange of knowledge such as occurs in service-learning activities. This is particularly true for those institutions with a strong public commitment to service-learning and civic engagement. The image and public perceptions of the institution will be translated into the seemingly more individual and personal encounter between campus and community that is inherent in service-learning. In addition, every faculty member, student, and course must exist within an academic and institutional culture that, while espousing academic freedom and autonomy, also creates value and belief structures that define what is possible and what is likely to receive support. Assessment findings can help highlight these internal and external relationship and culture issues and identify areas where improvement is needed.


When we began to create this model for assessing service-learning, we realized that many of the impacts on students, faculty, and community were strongly influenced by institutional factors that also warranted assessment. The emphasis on gathering information that promotes program improvement compelled us to include the study of institutional factors. Simply stated, most of the factors addressed in the assessment of impact on faculty, students, and community are linked to institutional and organizational issues. Any improvements in service-learning will be difficult without an understanding of institutional issues. In other words, assessment of institutional factors cuts across and complements all of the other constituent assessments.

Understanding Institutional Commitment to Service-Learning

Before designing an institutional assessment, there must be a clear articulation of institutional goals for, and interests in, service-learning. Service-learning programs are complex and do not develop in isolation from institutional contexts. An important resource that can guide an initial discussion of campus interest in service-learning is the President's Fourth of July Declaration on the Civic Responsibility of Higher Education (Campus Compact, 1999). This report was developed by a group of higher education presidents discussing their commitment to civic education and views service-learning as one of the relevant teaching and learning methods. The declaration begins with a useful discussion of the need for every institution to consider its civic mission. Of particular value to a campus seeking to develop an institutional assessment, the declaration includes a discussion guide for assessing campus activities and structures related to civic responsibility. This could guide the development of agreement on the vision and goals for service-learning programs. These goals become the foundation of an assessment plan that measures institutional impact.

A review of recent and relevant literature highlights how central the role of institutional context is to the scope, scale, and styles of service-learning. It also demonstrates why assessment of institutional impact and factors is essential to assessment for improvement of service-learning endeavors (Bringle & Hatcher, 2000; Bucco & Busch, 1996; Holland, 1997; Rubin, 1996; Ward, 1996). The design, implementation, and sustainability of service-learning programs are most often shaped by institutional interpretations of the following broad factors:
• Campus mission
• Academic culture/traditions
• Political/governance environment
• Financial condition
• Institutional history, self-image, sense of peers
• Public image/reputation
• Student traits/goals
• Community conditions/needs/assets

Some commonalities emerge when looking at more detailed aspects of academic organizations among the same authors. The following organizational factors, most commonly mentioned, also have a dramatic influence on the shape and impact of service-learning programs:
• Infrastructure for service
• Faculty development investments
• Community involvement
• Campus policy and reward systems
• Commitment to evaluation
• Curricular and cocurricular service activities (links exist between academic and student affairs)
• Existing, relevant initiatives
• Resource allocation choices
• Leadership across institutional levels
• Support for interdisciplinary work
• Communications strategies/dissemination

As any institution begins creating service-learning opportunities or initiates an assessment of the outcomes of existing service-learning courses and programs, the discussion is influenced by the campus interpretation, values, and beliefs around these factors. How they shape service-learning programs depends on campus goals and objectives for service-learning. We see this when looking across institutions that seem outwardly similar in mission and capacity, yet some adopt service-learning more readily than others. Why does this happen? Each campus's exploration of the impact of the factors listed reveals deeper underlying differences among institutions; these differences make service-learning more appropriate and readily accepted at some institutions than at others. Part of a campus's interpretation of these major factors is a consideration of institutional motivation.


Why service-learning? What is our reason for taking on this work? What are our goals and hopes? Institutional motivations can be summarized in the following dimensions:
• Self-interest
• Good citizenship/good works
• Enhanced academic performance (new directions in research and learning for faculty and students) that realizes a distinct sense of campus mission

Most institutions develop some balance across these three dimensions. For example, everything we do in academia should have some element of self-interest. We do not have the resources to be wasteful of time and effort. However, if we act purely out of self-interest, as when a campus may buy up several blocks of real estate near campus to "clean up" an area the campus perceives as blighted and call it a "service to the community," then there is an imbalance with the other dimensions that are vital to successful campus-community partnerships. Focusing purely on good works can also have some detriments, as it tends to leave the community in the role of supplicant and may give students a mixed message about issues of privilege, class, and social responsibility.

For most institutions, the adoption of service-learning strategies is meant to enhance academics by more clearly articulating the distinctive learning and working environment the campus offers to its faculty and students, in keeping with its sense of mission. This choice of strategies can reflect elements of self-interest (hoped-for improvements in student recruitment, faculty satisfaction, fund-raising opportunities, and/or public image and community relations, to name a few). Commitment to service-learning also requires the institution to define its goals for itself as a good citizen of the region. How will community partnerships, such as those created for service-learning courses, connect the academic assets of the campus to the issues, needs, and opportunities of the community or region? In this way, we can understand and explain differences in levels of institutional commitment to service-learning or to other forms of civic engagement and campus-community partnerships.

A framework that can be used to guide an institution's self-examination of its mission is shown in Table 6.1 (Holland, 2006). This table directs attention toward eight key organizational factors often associated with commitment to service or community engagement, and then describes the features that indicate different levels of commitment to the work. There is no judgment of goodness or success/failure across the four descriptive levels; they merely illustrate in measurable terms the differences that can occur across institutions regarding levels of commitment. The intent is that institutions can candidly interpret their vision and goals for service-learning and engagement, and then explore the alignment of organizational characteristics with that vision. In other words, each individual campus can seek to align the rhetoric and the reality of its commitment to engagement in all forms. The process also identifies areas where change must occur to promote alignment of commitment with program goals. Each of the eight factors is critical to the implementation and sustainability of engagement endeavors, such as service-learning.

Understanding the institution's goals and ambitions for adopting service-learning strategies allows a campus community, or a single department or school, to design programs with an understanding of the impact of the institutional context on the operation of those programs. With that understanding, a plan for regular assessment of these institutional factors, and others, can be used to document the impact of service-learning on the institution (including both unintended and intended impacts). This documentation can then be used to make improvements that will enhance and sustain the service-learning effort and the underlying community partnership.

The Assessment Matrix for Institutional Factors

Earlier in this work, we described the basic process for translating service-learning goals into variables or concepts from which measurable indicators could be developed. Table 6.2 offers an example of key concepts, indicators, and methods for measurement relevant to the assessment of the dynamic relationship between service-learning and the institutional context. Given the discussion on key organizational factors that are known to be essential in setting goals and expectations for service-learning within the context of a campus's mission and conditions, the selection of concepts and indicators should consider at least two broad purposes of the institutional assessment. Local circumstances or special issue projects may add additional purposes.


TABLE 6.1
Levels of Commitment to Community Engagement, Characterized by Key Organizational Factors Evidencing Relevance to Institutional Mission
(Levels: One = Low Relevance; Two = Medium Relevance; Three = High Relevance; Four = Full Integration)

Mission
• Level One: No mention of or undefined rhetorical reference
• Level Two: Engagement is part of what we do as educated citizens
• Level Three: Engagement is an aspect of our academic agenda
• Level Four: Engagement is a central and defining characteristic

Leadership (president, vice presidents, deans, chairs)
• Level One: Engagement not mentioned as a priority; general rhetorical references to community or society
• Level Two: Expressions that describe institution as asset to community through economic impact
• Level Three: Interest in and support for specific, short-term community projects; engagement discussed as a part of learning and research
• Level Four: Broad leadership commitment to a sustained engagement agenda with ongoing funding support and community input

Promotion, Tenure, Hiring
• Level One: Idea of engagement is confused with traditional view of service
• Level Two: Community engagement mentioned; volunteerism or consulting may be included in portfolio
• Level Three: Formal guidelines for defining, documenting, and rewarding engaged teaching/research
• Level Four: Community-based research and teaching are valid criteria for hiring and rewards

Organization Structure and Funding
• Level One: No units focus on engagement or volunteerism
• Level Two: Units may exist to foster volunteerism/community service
• Level Three: Various separate centers and institutes are organized to support engagement; soft funding
• Level Four: Infrastructure exists (with base funding) to support partnerships and widespread faculty/student participation

Student Involvement and Curriculum
• Level One: Part of extracurricular student life activities
• Level Two: Organized institutional support for volunteer activity and community leadership development
• Level Three: Opportunity for internships, practica, some service-learning courses
• Level Four: Service-learning and community-based learning integrated across curriculum; linked to learning goals

Faculty Involvement
• Level One: Traditional service defined as campus duties; committees; little support for interdisciplinary work
• Level Two: Pro bono consulting; community volunteerism acknowledged
• Level Three: Tenured/senior faculty may pursue community-based research; some teach service-learning courses
• Level Four: Community-based research and learning intentionally integrated across disciplines; interdisciplinary work is supported

Community Involvement
• Level One: Random, occasional, symbolic or limited individual or group involvement
• Level Two: Community representation on advisory boards for departments or schools
• Level Three: Community influences campus through active partnerships, participation in service-learning programs or specific grants
• Level Four: Community involved in defining, conducting, and evaluating community-based research and teaching; sustained partnerships

External Communications and Fund-raising
• Level One: Community engagement not an emphasis
• Level Two: Stories of students or alumni as good citizens; partnerships are grant dependent
• Level Three: Emphasis on economic impact of institution; public role of centers, institutes, extension
• Level Four: Engagement is integral to fund-raising goals; joint grants/gifts with community; base funding

Barbara A. Holland, 2006. "Assessing Community Engagement: Issues, Models, Challenges, Suggestions." Keynote at the National Assessment Institute, Indianapolis, IN. Adapted from Barbara A. Holland, "Analyzing Institutional Commitment to Service: A Model of Key Organizational Factors." Michigan Journal of Community Service Learning, Vol. 4, Fall 1997, pp. 30-41.

TABLE 6.2
Matrix for Institutional Assessment
Each concept below answers four questions: What do we want to know? (concepts); How will we know it? (indicators); How will we measure it? (methods); and Who/what will provide the data? (sources).

Engagement in community
• Indicators: Requests for assistance from community; number of service-learning courses and community-university partnerships; level of student club activity in community service; level of community use of campus facilities; attendance at partnership events
• Methods: Activity logs; schedule/catalog analysis; grants analysis/reports; facility/budget records; interviews
• Sources: Institutional sources; faculty; administrators; community partners

Orientation to teaching and learning
• Indicators: Number and variety of faculty adopting service-learning; total number of service-learning courses offered/approved; focus/content of faculty development programming; departmental agendas/budgets for service; number of faculty publications related to service
• Methods: Survey of faculty activity; schedule/catalog analysis; interviews of chairs; budget report analysis; curriculum vitae analysis
• Sources: Institutional sources; faculty; administrators

Resource acquisition
• Indicators: Number of grant proposals/funded projects with community components; inclusion of service-related requests in development and fund-raising; level of giving to service-related donor funds; recognition/grants from foundations/others related to service-learning
• Methods: Grants analysis; publications analysis; gift records; activity logs
• Sources: Institutional sources; faculty; administrators

Image/reputation
• Indicators: Media coverage (campus, local, regional, national); site visits by other campus teams or expert community partners; representation at conferences and in publications; quality and diversity of new faculty/administrator applicants; content of accreditation self-studies and reviews by site teams
• Methods: Clipping/video reports; activity logs; personnel records; publication analysis; interviews
• Sources: Institutional sources; faculty; administrators; community partners

Visibility of service and service-learning on campus
• Indicators: Content of campus publications, schedules, videos, web pages; awards of recognition for faculty, students, staff, partners; volunteer service by staff, administration, faculty, students; celebratory events related to service or including community
• Methods: Interviews; survey; publication analysis; observation
• Sources: Institutional records; faculty/staff; community partners; students

Infrastructure
• Indicators: Presence of organized support for service; dollars invested in infrastructure, faculty incentives, faculty development; policy context (content of faculty handbook, student handbook)
• Methods: Organization charts; budget reports/requests; document analysis
• Sources: Institutional sources

Leadership
• Indicators: Local, regional, national roles of campus leaders; content of budget narratives, speeches, or self-studies; community event participation; characteristics/qualifications of new hires
• Methods: Document analysis; clipping/video reports; interviews; curriculum vitae analysis
• Sources: Institutional sources; faculty; administrators

First, the concepts and indicators should be a tool for tracking progress toward aligning the institutional environment with the service-learning effort. In other words, the assessment effort should focus, at least in part, on areas where the organization needs to make changes to support service-learning. By looking for impact (or the absence of impact), measures should also help to identify new or additional changes/improvements in the institutional environment that may be needed as work goes forward. Second, the concepts and indicators should be designed with the intent of capturing changes in actions and relationships among students, faculty, community, and institution, according to the goals of the service-learning program.

Engagement in community assesses the overall institutional involvement in the community. Service-learning often benefits from an organizational context in which other kinds of partnership relationships exist between campus and community. The concept looks for that context of the exchange relationship.

Orientation to teaching and learning relates back to Table 2.1 in the chapter on Assessment Principles and Strategies in this handbook. Because the PSU service-learning program sought to change the teaching/learning environment for faculty and students, this was an important factor to track. It is measured by looking at both quantitative and qualitative levels of activity meant to influence faculty attitudes toward their teaching strategies and the role of service in their scholarly agenda.

Resource acquisition can be a strong test of institutional commitment and progress in that it examines the links between activities and revenue streams. Obvious and useful measures include looking at the degree to which issues of service-learning and community engagement are reflected in grant-making and fund-raising strategies. In another setting, it might also be important to look at the reallocation of existing resources as a measure of commitment.

Image/reputation recognizes that the campus has some degree of self-interest in expanding service-learning or other engagement endeavors. Faculty motivation and donor/gift support can be strongly influenced by evidence that service-learning is shaping the image and reputation of the institution in the eyes of the community and decision-makers. In addition, the presence of service-related issues in campus searches and reports such as those for accreditation can be revealing of the level of acceptance and centrality.

The visibility the campus gives to service in its own communications and publications is a good measure of the depth and breadth of commitment. We found measures of recognition and celebration through written word or leadership action to be a revealing reflection of the support for service-learning.

Supportive infrastructure is one of the most central factors related to the effectiveness and sustainability of service-learning. This recognizes the labor-intensive and time-consuming nature of service activities and measures institutional investment in services and policies that are essential to sustaining and promoting service-learning courses and the partnerships that support them.

Leadership at all levels of the organization is also essential to the sustainability and/or expansion of service-learning programs. We propose several potential measures that look at the nature of internal attention to decisions that support service and to external activities and relationships that signal the level of commitment and interest among campus leaders.

Thus, we see that institutional factors cut across student, faculty, and community partner issues and relationships. The key concepts proposed in Table 6.2 are fairly broad, but the indicators include specific measures intended to document changes or effects on the three constituent groups involved in service-learning. For example, some measures track contacts between community and the institution, changes in student involvement in the community, changes in faculty priorities, or visibility of service endeavors in campus messages. Each constituent group involved in a service-learning program affects the institution by their actions, and in turn is affected by the institution's actions and policies. This provides the rationale for multiple measures for each factor and a multidimensional approach to data collection.

Strategies for Assessing Institutional Impact

In Table 6.2, methods for collecting data to measure the indicators are offered. The potential sources for institutional data can be numerous in an environment where institutional data are openly shared. As may be the case in student and faculty dimensions of assessment, access to some data can be an assessment challenge. A critical factor can be the level of institutional attention to data collection. Some smaller institutions may not have the resources to conduct unique or regular studies that would create data relevant to a service-learning assessment. Interest in assessment can lead to creating new strategies and capacities for collecting data efficiently. For example, an office working with service-learning students may be willing to keep a simple log on questions from students or the community.

Whatever methods are used, attention to terminology and definitions is essential to ensure that relevant data are gathered. We encourage individuals to engage the constituent groups in instrument design and testing so as to generate some common understanding of terms such as service-learning, service, outcomes, and engagement. Many surveys of faculty seeking to learn about their service activities, for example, have low return rates because faculty are not clear on the meaning of some of the terms.

Though diverse methods are suggested in the matrix, the strategy for collecting assessment data on institutions is basically two-fold: Talk to a wide variety of people, and tap into information and documents that already exist. Creativity and persistence are essential qualities of the investigator assessing institutional factors. The search is for evidence that campus commitment to service is being recognized, supported, and acknowledged in a number of ways. Is the level of support and awareness increasing or decreasing over time? Following is just a short sample of potential resources that can help generate data relevant to the indicators in our assessment plan:
• Publications such as newsletters, alumni magazines, posters: Do they mention service-learning programs or outcomes?
• Annual reports: Do they highlight service-learning as part of the mission? Is the institution attracting gifts or grants relevant to service-learning efforts? Is it a fund-raising priority?
• Student application essays: Why are they coming to your campus? Is it because of the commitment to service?
• Activity logs: Are more or fewer community calls coming into key offices on campus? What is the purpose of the calls?
• Media reports: Is the media talking about your service-learning partnerships?
• Catalog/course schedules: Do these highlight or identify service-learning courses in any way?
• Existing surveys: Will your institutional research office add questions to standing surveys of students, faculty, or alumni?
• Budget narratives/requests/allocations: Is service-learning part of your institution's budget design? What is the investment in support infrastructure over time?
• Interviews: What do your admissions counselors tell inquiring applicants about service-learning opportunities and about the institution's commitment to service?
• Policies: Is there/has there been any change in faculty portfolios for promotion, merit, or tenure? Are they including information about service-learning? What are the outcomes of the reviews? Are there policies supporting faculty development related to service? Are there incentives for faculty to adopt service-learning in their courses? Are there awards of recognition for faculty and students engaged in service, as there are for good teaching and academic performance?

Concluding Thoughts

In some ways assessment of institutional impact and factors may be the most important element of assessment of service-learning. Because institutional contexts have such a strong influence on the perceptions and actions of faculty, students, and community, understanding and monitoring the institutional environment is critical to moving all other aspects of service-learning forward. Institutions that are making advanced commitments to service-learning have made the investment in assessing their level of commitment to civic engagement, explored the alignment of that commitment to organizational structure and policies, and identified and addressed the need for organizational changes. Other institutions begin service-learning only to discover competing internal and external views of the campus mission, priorities, motives, and relationships to the community. In such an environment, service-learning and other engagement activities can become marginalized and necessary organizational changes may encounter substantial resistance. Assessment of institutional environment may be one remedy to this situation.

The institution's cross-cutting effect on all other dimensions of service-learning makes the investment in interpreting institutional impact essential. The loosely coupled nature of academic organizations and the evidentiary orientation of academic culture mean that change and innovation in higher education are largely dependent on the ability of institutional leaders and provocateurs to generate evidence through research and assessment. A major challenge in implementing necessary assessment strategies is the historic underinvestment in institutional research and assessment capacities. An interesting, unintended effect arising from the growing implementation of civic engagement and service-learning programs is a greater level of institutional awareness of the need for assessment infrastructure. Thus, we may find that measuring the institution's capacity to conduct assessment may be a critical factor to be monitored in an assessment plan.


Strategies and Methods: Institution
• Institutional Interviews
• Critical Incident Report
• Institutional Observations
• Institutional Documentation

Institutional Interviews

Purpose
Interviews help to explore the perspectives of university staff and administrators on the role of community-based education activities, particularly the impact of community partnerships on the university's operations and goals. This method can also be used to discern the level of campus understanding regarding community-university partnerships and the ability to articulate the university's community activities.

Preparation
Representatives from institutional administration should be identified based on their ability to provide insights about the work being assessed. New faculty might be interviewed in order to better understand their perceptions of the institution, as well as to assess the impact of targeted recruitment strategies. Schedule one-hour interviews in locations and at convenient times for the interviewee. In advance, describe the purpose of the interview so the subject has time to reflect on issues of impact prior to the interview session.

Administration
The administration of interviews should be consistent across all interview subjects:
• Start on time
• Introduce yourself and your role in the project
• Explain purposes of interview
• Assure confidentiality; stress importance of candor
• Take notes or ask permission to record

The following offices are possible sources for interviews. Their representatives can be interviewed for perspectives on the role and impact of community-based education on the institution:
• Academic Affairs
• Alumni Office
• Admissions Office
• Advising Offices
• Faculty Development Centers
• Foundation Office
• Grants and Contracts Office
• Health Services
• Institutional Research Office
• Student Services (Other)
• Teaching and Learning Center
• Undergraduate Studies Administrators (e.g., General Education Office)

Analysis
Transcribe notes and/or recordings immediately. Code transcripts for key words and themes. Organize these into patterns and compare to research variables.


Institutional Interview Protocol: Representatives From University Offices

[Narrative introduction to set context.]
1. What is your understanding of the university mission and academic environment? How do you describe it to others (prospective students/staff/faculty)?
2. Are you involved in any community-university interactions? Describe what and why.
3. Is community service part of your professional or personal life? Describe your activities and reasons for involvement.
4. What are the distinguishing characteristics of the "student experience" at the university?
5. Are you aware of university courses that include a component of service- or community-based learning? If yes, what do you know about them and how did you learn of them?
6. Do you tell prospective students/staff/faculty that the university offers service-learning courses or engages in community partnerships? Why or why not?
7. What effect do you think community-university partnerships have on the university's institutional image and how do you know?
8. Should the campus offer more service-learning courses and make service-learning a core of the student learning experience?
9. Is there anything else you would like to tell me today?

Thank interviewee.


Institutional Interview Protocol: New Faculty

[Provide introduction to set context.]
1. In your hiring process, were the university's community partnerships described to you? How were they described?
2. Have you heard about the university's service- or community-based learning courses? How were they described? Are you interested in teaching such courses?
3. Have you seen the promotion and tenure guidelines for this university?
4. What attracted you to this university (or use institution name)?
5. What are the goals for your work at this institution (or use name)?
6. Is there anything else you would like to tell me today about this topic?

Thank interviewee.


Critical Incident Report

Purpose

Completion of a critical incident report provides an opportunity to identify key highlights and issues that have occurred during a program or designated period. The report is highly reflective and offers a retrospective review of major events (anticipated or unanticipated) that affected the program in positive or negative ways. Critical incident reports complement other methods of data collection and can provide an overview of how program development issues affect outcomes. They are also useful to document the processes involved in program administration from a broad perspective over time (rather than a daily log).

Administration
Identify key individuals who are closely connected to the program and have the experience and insights to identify critical incidents. For example, if an institution is adopting a major strategy to implement service-learning, then the key university administrators responsible (e.g., a vice-provost, a director of a teaching and learning center, the academic program director of a service-learning program) would be invited to complete a critical incident report.

Directions for completing a critical incident report should be very specific and should include documentation of key events that, in retrospect, significantly
• accelerated work toward accomplishment of goals, and/or
• created barriers to goal accomplishment, and/or
• enabled the organization to overcome these barriers.

Examples of critical incidents might include adoption of relevant institutional policy on service-learning, grants awarded (or not awarded), key staff member(s) hired or terminated, physical relocation of offices, new faculty promotion and tenure criteria adopted, accreditation report received, and so on.

Preparation
The most useful format for presentation of a critical incident is to set up a table with the following three columns (e.g., see Table 6.3):
1. Date (specific or approximation via month alone), with events listed in chronological order
2. Nature of event
3. Why it is/was critical

Analysis
It is usually most helpful to be able to collect critical incident reports from several individuals with their own perspectives on a program's progression. Their critical incident reports can then be merged into a single chronology to develop an overall view of the key informants' perspectives. If this is done, care needs to be taken to avoid identifying individuals—the point is not to gain consensus, but rather to get a full-view perspective on events. Where different opinions are offered on the same event, include each as a separate entry in the chronology.

Once the reports are integrated, develop a framework based on your indicators and key concepts in which you can record key findings from the critical incident reports (sometimes creating a table or a blank matrix is useful). This helps to guide thinking through the review of the various documents and will help maintain a focus on the key concepts and indicators. The framework can then be reviewed to search for patterns in the findings, and these patterns can be compared to other findings from other methods. Write a brief narrative to reflect your findings, and integrate this narrative into your overall report. Remember that this method is intended to complement other findings; it should not be used as a standalone method of data collection.

TABLE 6.3
Sample Critical Incident Report

Date: June 2000
Nature of event: New promotion and tenure criteria approved
Why critical: These criteria explicitly acknowledge the value of the scholarship of outreach as part of the institution's commitment to civic engagement. They will help to encourage faculty to participate, as a clear linkage between service-learning and promotion and tenure is now our policy.

Date: September 2000
Nature of event: $180,000 grant from the Civic Foundation received
Why critical: This grant will support faculty development programs, mini-grants to faculty wishing to develop service-learning courses, and travel funds to attend conferences.

Date: October 2000
Nature of event: Accreditation report received from regional accreditor
Why critical: Report very critical of our lack of assessment data, with specific attention to our lack of documentation of the impact of our service-learning activities. Provost extremely unhappy and mandated new assessment task force to act.


Institutional Observations

A component of institutionalization and sustainability involves the level of reciprocity and interaction between campus and community. Observation of campus events, public events, and campus or community advisory meetings can be revealing. Questions that might serve as the basis for designing observation include the following:
• Who attends public events on or off campus?
• Are community leaders invited to and present at campus events?
• Who serves on advisory boards, how often do they meet, and what are their duties?
• Do campus activities and space management offer opportunities for a community presence on campus?
• Is the institution included in or involved in major public community events?
• How is participation in major public community events supported?

Refer to the classroom observation or the community observation for more detail or approaches to observation, and then adapt these approaches in light of your own assessment goals and institutional contexts/dynamics.


Institutional Documentation

Sources
Many institutions have documentation available that can be analyzed to augment understanding of the impact of community-based education. Examples of those artifacts and documents include the following:
• Records of gift giving
• Media coverage of the institution (newspaper articles, television spots, etc.)
• Student awards
• Attendance logs from faculty development events (workshops, seminars)
• Internal newsletters describing faculty achievements, curriculum changes, and so on
• Student retention data
• Admissions data
• Alumni survey data
• Promotion and tenure guidelines
• Course schedules and catalog
• Program descriptions
• Web pages
• Recruitment publications
• Annual reports
• Strategic planning and budget documents
• President/provost speeches
• Publications for donors/alumni

Analysis
When analyzing artifacts and documents, decide ahead of time what indicators from your assessment matrix you will be looking for in these sources. In general, you are looking for evidence that service-learning and other forms of community-campus partnerships and civic engagement activities are being highlighted in publications and in internal and external communications. Stories of faculty, student, and community interactions suggest a level of centrality and commitment that are vital to sustainability. Text of program descriptions, course descriptions, and the content of catalogs and course schedules can also reveal the presence or absence of attention to the visibility and availability of service-learning courses.

A special note on the use of existing surveys: Institutional data reports or survey findings can also be useful sources for quantitative evidence of faculty, student, or alumni involvement in service. Most institutions conduct a regular panel of surveys regarding current students, alumni, and others. A good strategy is to negotiate the inclusion of a few questions relevant to service-learning and engagement issues. This will lead to regularly collected quantitative data which can be useful to measure longitudinal impact on individuals.


Chapter Seven

METHODS AND ANALYSIS

In the preceding chapters that offer examples of assessment instruments for faculty, students, institutions, and communities, each instrument was preceded by brief suggestions and guidelines for administering and analyzing those instruments. From our field research we have learned that many people who have assessment responsibilities may have limited or dated knowledge of quantitative and qualitative research methods and analysis techniques. (Remember how you hated that graduate statistics class [if you took one] and promptly erased it all from your mind?) Assessment strategies that seem difficult to implement or sustain, or that produce findings that confuse or challenge those who must interpret the reports, can often be traced to poor use of instruments or poor analysis. Many assessment leaders take on these roles because of interest and passion, and they do not always have deep technical expertise or access to those with more skills.

Therefore, this section on methods and analysis offers deeper and more detailed guidance on the design, use, and analysis of various instruments commonly used in assessment programs, particularly in higher education. For each major type of method, we offer points about definition, common uses, questions, structures, and formats, and follow these with suggestions for administration and analysis. Obviously, even this lengthier section does not cover all there is to know about these methods; whole textbooks have been written on each one. The authors hope these more detailed guidelines will serve most purposes for assessment leaders, and we refer readers to the References for further guidance and study on methods and analysis.

Survey

Examples of surveys are included in the preceding chapters for students, faculty, and community. Note that many of the suggestions apply whether surveys are administered online or on paper.

What Is It?
• Typically a self-administered questionnaire
• Multiple-choice or short answer
• Obtains mostly empirical or quantitative information
• Respondents are selected randomly (e.g., anyone who comes into a certain office on a given day) or it is given to an entire population (e.g., all students in a service-learning class)
• If administered to a sample of a larger group, respondents represent the whole population being studied

Why/When Is It Used?
• To assess impact of program, activity, or course
• To assess customer/client satisfaction (e.g., student or community partner satisfaction)
• To compare findings over time or across sites
• To generalize results to a larger population
• To reach a large number of respondents quickly and at low cost
• If general (as compared to individualized) responses are appropriate

Note. This chapter was originally published in Sherril B. Gelmon and Amy Connell (2000) Program Evaluation Principles and Practices: A Handbook for Northwest Health Foundation Grantees. Portland: Northwest Health Foundation. Permission of the Foundation to use this material is gratefully acknowledged.


Types of Questions
• Check lists: respondent checks answer(s) that apply to them
• Quality and intensity scales (five-point balanced scales, e.g., strongly satisfied, satisfied, neutral, dissatisfied, strongly dissatisfied): measure student satisfaction, extent of agreement with statements, quality of service, and so on
• Frequency scales: number of events, activities
• Story identification: offer fictional scenarios and respondent indicates which they relate to (works well with children)
• Ranking: rate preferences (most preferred = 1; next most preferred = 2; etc.)
• Demographics: age group, gender, race, level of education, income, and so on
• Last question: "Do you have any additional comments?"
• Make sure you avoid any "leading" questions that point the respondent toward a particular answer
• Ensure the questions are framed in the language/culture of respondents (e.g., appropriate literacy level, or level of sophistication of terminology)

Format
• Introduction: length of time it will take to complete, what the findings will be used for, why the respondent should complete it, why it is important
• Easy opening questions to engage respondent
• Smooth flow of items; questions in logical sequence
• Build up to sensitive questions
• Use transition sentences to clarify the focus of sections of survey (e.g., "These next questions ask about volunteer service you may have done in the past.")
• Skip patterns: make it clear when and how respondents should skip questions that may be irrelevant to them based on responses to previous questions (this should be transparent in online surveys)
• Conclusion: where to return survey and by what date (or a hotlink if an online survey); thank you

The cover letter or introductory e-mail offers information about the study and about the role of the respondent:


• Purpose, benefit to people
• Who is doing the study, who is paying for it, contact person
• Make the respondent feel important
• Assure confidentiality or anonymity
• Offer opportunity to see study results
• When and how to return the survey
• Thank you
• Who to contact with questions (and phone number or e-mail)
• Signed letter (if paper) with original signature; provide name and title of person

Conducting a Survey
• Pre-test the survey on at least 10 people before administering it with your population group to troubleshoot some of the following common problems: confusing wording or use of jargon; uniform meaning of language; appropriate answer choices offered in multiple-choice or ranking questions; elimination of double-barreled questions (e.g., "how satisfied are you with your educational program and the number of service-learning courses you have taken?"); ease of completion (whether online or on paper)
• If the evaluator is administering the survey verbally, he/she should read the questions and choice of answers exactly as written and offer little or no clarification or interpretation.
• Note that with some groups (e.g., children or the mentally ill), it is better to administer the survey by reading it, but responses are still completed in the same way as if the individual respondent was filling out the survey by themselves.

Getting the Best Responses
• On paper, use inviting, colored paper (pale blue, pale yellow if mailing; vibrant colors if at an event where you want the surveys noticed and easily identified); electronically, use subtle colors as background and include a relevant logo on the first page
• If using survey software, ensure that respondents may only answer once, and that reminders only go to nonrespondents
• For paper surveys, include a self-addressed, stamped return envelope (although you can save money by not stamping the envelope); for online, provide a hotlink for easy return


• For paper surveys, use a cover letter that is personal with an original signature (use blue pen to show it is not mass printed); for online surveys, personalize the introductory e-mail
• Short survey length
• Promise of confidentiality or anonymity
• Advance notification: let people know they will be receiving the survey and when
• For paper surveys, send by first-class mail (although you can save money sending it third class if your organization has nonprofit mailing status)
• Incentives (monetary or otherwise)

What to Do With the Data (Analysis)
• Ensure you have someone on staff who has expertise in statistical analysis, or contract with someone who has these skills.
• Each survey response should be given a unique identification number.
• Individual responses should be coded (using numbers) to facilitate analysis; the coding scheme needs to be identical across respondents.
• Quantitative data can be analyzed using a computer software package such as Microsoft Excel for simple calculations or SPSS for more detailed analysis (a brief illustration of several of these analyses appears after this list).
• Qualitative responses should be summarized and reviewed to identify any key themes.
• Prepare tables (for quantitative data) and narrative (for both quantitative and qualitative data) that report the findings according to the indicators and key concepts identified in your assessment matrix.
• Descriptive statistics such as frequencies, means, and modes are easily obtained. They are useful for describing characteristics of a group of students or faculty, of partner organizations, or of program utilization.
• Standard deviations describe how much responses vary and can help flag differences between items (e.g., responses to a different teaching style, changes in expressed attitudes, or changes in behavior due to interventions).
• Cross-tabulations (or correlations) enable you to look at differences in frequencies across groups or categories (e.g., satisfaction with educational programs across students by major/discipline).
• Chi-square is a useful tool for testing whether responses differ across demographic groups (e.g., by geographic location or by ethnicity/race).
• Factor analysis can reduce items from a long list into categories of closely related items that can be used for subsequent analysis. This could involve, for example, condensing a list of several dozen sociocultural belief statements into a small number of themes that summarize the long list.
• ANOVA can be useful for exploring variation within and between groups on either single items or on groups of items created through factor analysis. Where there is a large number of respondents, this is a more precise tool for learning the same things as standard deviations, cross-tabulations, or chi-square.
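For teams that prefer to script these steps rather than rely solely on Excel or SPSS, the following is a minimal sketch in Python (using the pandas and scipy libraries) of the frequency, cross-tabulation, chi-square, and ANOVA analyses described above. The column names, response values, and coding scheme are hypothetical and would be replaced with those defined in your own assessment matrix.

```python
import pandas as pd
from scipy import stats

# Hypothetical coded survey responses: each row is one respondent,
# identified by a unique ID, with numeric codes applied consistently.
responses = pd.DataFrame({
    "respondent_id": [1, 2, 3, 4, 5, 6, 7, 8],
    "major":         ["Nursing", "Business", "Nursing", "Education",
                      "Business", "Education", "Nursing", "Business"],
    "satisfaction":  [4, 3, 5, 4, 2, 5, 4, 3],   # 1 = low ... 5 = high
    "service_hours": [20, 5, 35, 18, 4, 40, 25, 8],
})

# Descriptive statistics: frequencies, means, and standard deviations.
print(responses["satisfaction"].value_counts().sort_index())
print(responses[["satisfaction", "service_hours"]].agg(["mean", "std"]))

# Cross-tabulation: satisfaction ratings by major.
crosstab = pd.crosstab(responses["major"], responses["satisfaction"])
print(crosstab)

# Chi-square test of whether satisfaction differs across majors.
chi2, p_value, dof, _ = stats.chi2_contingency(crosstab)
print(f"chi-square = {chi2:.2f}, p = {p_value:.3f}")

# One-way ANOVA: do service hours vary between majors?
groups = [grp["service_hours"] for _, grp in responses.groupby("major")]
f_stat, p_anova = stats.f_oneway(*groups)
print(f"F = {f_stat:.2f}, p = {p_anova:.3f}")
```

With a real data set, the same few lines scale to hundreds of respondents; the important preparatory work remains the consistent coding scheme and unique identifiers noted above.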

Interview

Format of Introduction
• Purpose of study
• Your role in the study
• Participation is considered to be informed consent
• Assure confidentiality
• Anticipated length of interview
• If recording, ask permission and explain that the recording is to assist in transcription only
• Clarify any potentially confusing wording, acronyms, or jargon
• Let interviewees know that they can refuse to answer any question without endangering their relationship with any entity related to the evaluation or program

Format of Questions
• Open-ended
• Probe for personal perspective (e.g., "in your own words, tell me . . ." or "in your opinion . . .")
• Interview questions and anticipated answers should pertain to personal experience
• Assign an approximate time to each question so all questions can be covered in the allotted time
• End with "Thank you" and indicate whether a transcript will be provided

What to Do With the Data (Analysis)
• Transcribe the notes and/or recordings as soon as possible after each interview
• Review the transcripts several times and code for key words and themes (a simple illustration of coding follows this list)
• Organize the key words and themes into patterns, by using colored highlighters to distinguish themes, by cutting and pasting an electronic version, or by whatever method works best to help you become familiar with the information
• Compare these patterns to your indicators and key concepts
• Write narrative to reflect your findings
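As a complement to hand-coding with highlighters, a first pass at coding can be scripted. The sketch below tags interview excerpts with themes based on keyword lists; the themes, keywords, and excerpts are hypothetical placeholders. In practice they would come from your indicators, key concepts, and actual transcripts, and automated tagging would only supplement, not replace, careful reading.

```python
# Hypothetical theme dictionary drawn from the assessment matrix.
themes = {
    "awareness of community": ["community", "neighborhood", "residents"],
    "career development": ["career", "job", "profession"],
    "commitment to service": ["volunteer", "keep serving", "give back"],
}

# Hypothetical interview excerpts (one string per transcribed answer).
excerpts = [
    "Working with residents changed how I see my neighborhood.",
    "This project confirmed that I want a career in public health.",
    "I plan to give back by volunteering with the agency after the course ends.",
]

# Tag each excerpt with every theme whose keywords appear in it.
coded = []
for excerpt in excerpts:
    text = excerpt.lower()
    matched = [theme for theme, keywords in themes.items()
               if any(keyword in text for keyword in keywords)]
    coded.append((excerpt, matched))

# Tally how often each theme appears, then review the tagged excerpts by hand.
for theme in themes:
    count = sum(theme in matched for _, matched in coded)
    print(f"{theme}: {count} excerpt(s)")
```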

Focus Group

Focus group protocols are included in the previous chapters for students and for community partners. Focus groups could also be conducted with faculty when sufficient faculty have participated in a similar activity; likewise, they could be conducted with institutional administrators if a group discussion offered the most benefit for the assessment data collection.

What Is It?
• Informal, small group discussion
• Obtains in-depth, qualitative information
• Led by a moderator/facilitator following a predetermined protocol
• Participants are chosen based on some commonality

Why/When Is It Used?
• To develop a deeper understanding of a program or activity
• To explore new ideas from the perspectives of a group of key informants
• To provide a forum for issues to arise that have not been considered
• To generate interactive discussion among participants

Characteristics of a Focus Group
• Each group is kept small to encourage interaction among participants (6–10 participants)
• Each session usually lasts one to one and one-half hours
• The conversation is restricted to no more than three to five related topics (e.g., experiences with service-learning, changes in orientation to community work, barriers to service-learning)
• The moderator has a script that outlines the major topics to keep the conversation focused and does not participate in the dialogue or express any opinions
• Best facilitated by one neutral person, with the conversation recorded; ideally a second person is present as a note-taker (as a backup in case the recording is not audible)
• Attention needs to be given to format and environment to ensure that the location where the focus group is conducted is conducive to conversation, is nonthreatening to the respondents, and establishes a level of comfort between the facilitator and the respondents (therefore be attentive to dress and body language)

Format: Introduction
• Goal(s) of the focus group: what you want to learn
• How the focus group will work: interactive, conversational, everyone participates, encourage getting all ideas stated, not necessary to reach agreement, no right or wrong answers
• Role of moderator (facilitating, not discussing)
• Let participants know that the session will be recorded and how the recording will be used; indicate that the transcript will have no names in it and will be seen only by evaluators
• Ensure confidentiality
• Request that participants speak loudly, clearly, and one at a time

Format: Questions
• Narrowly defined questions keep the conversation focused
• Questions are often very similar to those used in an interview, with the recognition that a group will be answering rather than one person
• Easy opening question to engage participants
• Questions should become increasingly specific as the discussion proceeds
• Include optional follow-up or probing questions in the protocol to help the facilitator elicit the desired information
• Assign an approximate time frame to each question so that all topics are covered
• Final question: "Are there any other comments you'd like to share?"
• End with "Thank you" and indicate whether a transcript will be provided

Focus Group Participants
• Determine whose perspective you want (student participants, faculty, community partner organization administrators, partner board members, university administrators, other stakeholders)
• Different target populations should not be invited to the same session, as they may inhibit or skew each other's comments
• Participants are often recruited from class rosters, faculty lists, partner lists, or other databases
• Use a screening questionnaire if you need to know more about potential participants before making selections

What to Do With the Data (Analysis)
• Transcribe the recordings and notes from a focus group as soon as possible after the session. Remember that focus groups generate a large body of rich, textual data.
• Analyze the notes by organizing the data into meaningful subsections, either around the questions posed or around the key concepts reflected by the questions (a brief sketch of this organizing step follows this list).
• Code the data by identifying meaningful phrases and quotes.
• Organize the key words and themes into patterns, by using colored highlighters to distinguish themes, by cutting and pasting an electronic version, or by whatever method works best to help you become familiar with the information.
• Search for patterns within and across subsections.
• Compare these patterns to your indicators and key concepts.
• Write narrative to reflect your findings.
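If the coded phrases have been typed into a spreadsheet or text file, the organizing and pattern-searching steps can be supported with a short script. The sketch below groups hypothetical coded comments by the question that prompted them and counts how often each code appears within and across those subsections; the questions, codes, and comments are placeholders rather than data from an actual focus group.

```python
import pandas as pd

# Hypothetical coded focus group data: one row per meaningful phrase,
# tagged with the question that prompted it and a code from the protocol.
comments = pd.DataFrame({
    "question": ["Q1 experiences", "Q1 experiences", "Q2 barriers",
                 "Q2 barriers", "Q3 orientation", "Q3 orientation"],
    "code":     ["reciprocity", "logistics", "logistics",
                 "time pressure", "reciprocity", "time pressure"],
    "comment":  ["Partners treated us as colleagues.",
                 "Scheduling site visits was hard.",
                 "Transportation was the biggest barrier.",
                 "Students ran out of hours near finals.",
                 "I now see the agency as a co-educator.",
                 "Staff had little time to supervise."],
})

# Patterns within subsections: how often each code appears per question.
within = comments.groupby(["question", "code"]).size().unstack(fill_value=0)
print(within)

# Patterns across subsections: codes that recur across multiple questions.
across = comments.groupby("code")["question"].nunique().sort_values(ascending=False)
print(across)
```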

Conducting a Focus Group
• Be flexible with the sequence of questions. If participants bring up an issue early that comes later in the list of questions, let the conversation happen naturally (with minimal guidance).
• Select a facilitator carefully so that he or she is someone whose demographics will not bias participants' responses.
• An in-house staff person (university administrator, graduate student, faculty member with assessment expertise) has more inside knowledge of programs, but may have less experience and may introduce a level of bias as the facilitator.
• A professional moderator may be expensive but has more experience and an emotional distance that allows for greater objectivity. In universities there are often many individuals with experience facilitating focus groups who will donate their time.
• Communicate very clearly to the facilitator (particularly if using an outside professional) a description of the program or ideas being explored and what your needs are. This way, she or he will know when to follow up and when to ignore unexpected comments.
• Schedule the focus group at a time that is generally convenient for your participants.


Observation

Several methods are provided in the faculty chapter for classroom observation. Observation protocols are also included for community and institutional observations.

What Is It?
• Systematic technique that uses one's eyes, ears, and other senses
• Uses a standardized grading or ranking to produce quantitative and qualitative information
• Uses "trained observers"

Why/When Is It Used?
• To assess aspects of programs or activities that require looking at or listening to the activity in process. Some examples one might observe are: provision of service by students at a community partner organization; dynamics of interaction between faculty and students in a classroom setting; interactions of community partners, agency clients, and students; and content of, and interactions at, community advisory board meetings at the university
• To gain additional insights about a program (or whatever is being evaluated) by direct observation of activities, interactions, events, and so on

Characteristics of Observation
• Uses trained observers to assure accuracy across observers and over time (a brief check of agreement between observers is sketched after this list)
• Precise rating scales are used, with specific attributes defined for each score/grade
• If using rating scales, scales should have no fewer than three and no more than seven levels
• Potentially difficult distinctions should be noted
• Use an observer protocol form to guide recording of the observation
• Those being observed do not know what the observer is measuring (they are unaware of the content of the protocol)
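When two or more trained observers rate the same sessions, it is worth checking how consistently they apply the scale before pooling their data. The following is a minimal sketch using hypothetical ratings on a five-level scale; it reports simple percent agreement and Cohen's kappa (via the scikit-learn library), which corrects for agreement expected by chance.

```python
from sklearn.metrics import cohen_kappa_score

# Hypothetical ratings by two trained observers of the same ten class
# sessions, on a five-level scale defined in the observation protocol.
observer_a = [3, 4, 2, 5, 4, 3, 3, 2, 4, 5]
observer_b = [3, 4, 3, 5, 4, 3, 2, 2, 4, 4]

# Simple percent agreement: share of sessions rated identically.
agreement = sum(a == b for a, b in zip(observer_a, observer_b)) / len(observer_a)
print(f"Percent agreement: {agreement:.0%}")

# Cohen's kappa adjusts for the agreement expected by chance alone.
kappa = cohen_kappa_score(observer_a, observer_b)
print(f"Cohen's kappa: {kappa:.2f}")
```

Low agreement would suggest retraining observers or sharpening the attribute descriptions for each score before continuing data collection.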

What to Do With the Data (Analysis)
• Review the observation protocol and notes as soon as possible after the observation.
• Analyze the notes by organizing the data into meaningful subsections, either around the questions posed or around the key concepts reflected by the questions.
• Organize the key words and themes into patterns, by using colored highlighters to distinguish themes, by cutting and pasting an electronic version, or by whatever method works best to help you become familiar with the information.
• Search for patterns within and across subsections.
• Compare these patterns to other findings for the indicators and key concepts.
• Write a brief narrative to reflect your findings, and integrate this narrative into your overall report.

Documentation

Several examples of documentation review for faculty are provided, including a syllabus analysis and a curriculum vitae analysis. A description of documentation review for institutional assessment is also included.

What Is It?
• Use of various kinds of existing narrative or other data
• Information is not collected first-hand but is available for review and analysis ("secondary" data)
• Narrative data may include program or partner organization records, policies, procedures, minutes, program descriptions, syllabi, curricula vitae, faculty journals, and so on
• Use of existing reports such as budgetary information, grant history, service provision or utilization reports, faculty or partner profiles, and so on

Why/When Is It Used?
• To gather historical information
• To assess the processes involved in delivering or supporting the service-learning course
• To augment interpretation of primary data through records of other activities relevant to the assessment

Types of Information Frequently Looked for in University Records
• Information on student, faculty, partner, or course characteristics
• Number and nature of service-learning courses
• Success of work (e.g., number of grants funded related to service-learning; number of faculty scholarly presentations or publications related to their experiences with service-learning)
• Administrative/organizational information that may set context for interpretation of other data

Potential Problems and Ways to Alleviate Them

Missing or incomplete data
• Go back to the data and related sources (such as by interviewing faculty) to fill in as many gaps as possible (do not redo documents, but do augment the assessment data collection)
• Determine whether part or all of the assessment needs to be modified because of a lack of key information
• Exclude missing data or provide a "best estimate" of the missing values

Data available only in simplified, overly aggregated form (e.g., number of students involved, but not by course, discipline or major, or demographic descriptors)
• Where feasible, go back into the records to reconstruct the needed data
• Conduct new, original data collection
• Drop the unavailable disaggregated data from the evaluation

Unknown, different, or changing definitions of data elements (e.g., measuring the academic performance of students when requirements for entering GPA changed from 2.75 to 3.00)
• Make feasible adjustments to make the data more comparable
• Focus on percentage changes rather than absolute values
• Drop analysis of such data elements when the problem is insurmountable

Data that are linked across time and courses/programs (e.g., Program A in your university tracks students by year of admission to the university; Program B tracks by declared major)
• Be sure that the outcome data apply to the particular individuals or work elements addressed by the evaluation
• Track individuals/work elements between courses/programs using such identifiers as social security numbers
• Look for variations in spellings, nicknames, aliases, and so on (many computer programs can now do this for you; a brief record-linking sketch follows this list)
• Determine whether such individual data are really necessary, or whether aggregated data (e.g., by course) are sufficient

Confidentiality and privacy considerations
• Secure needed permissions from persons about whom individual data are needed
• Avoid recording individual names; instead use code identifiers
• Secure any lists that link code identifiers to individual names, and destroy these after the evaluation requirements are met
• Obtain data without identifiers from source organizations
• Be sure to go through human subjects review (or equivalent) if appropriate
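The following is a minimal sketch of the record-linking and missing-data steps, assuming two hypothetical program rosters that identify the same students slightly differently. The file layout, column names, and matching rule are illustrative only, and any real linkage of student records must respect the confidentiality safeguards noted above.

```python
import pandas as pd

# Hypothetical rosters from two programs that describe the same students
# with inconsistent name formatting and incomplete service-hour data.
program_a = pd.DataFrame({
    "name": ["Rosa Lopez", "J. Smith ", "dana kim"],
    "admit_year": [2015, 2016, 2016],
    "service_hours": [30, None, 22],          # missing value to handle
})
program_b = pd.DataFrame({
    "name": ["rosa lopez", "j. smith", "Dana Kim"],
    "declared_major": ["Biology", "History", "Social Work"],
})

# Normalize names so that spelling and case variations still match.
for roster in (program_a, program_b):
    roster["match_key"] = roster["name"].str.strip().str.lower()

# Link the records across programs on the normalized key.
linked = program_a.merge(program_b, on="match_key", suffixes=("_a", "_b"))

# Handle missing data: either drop incomplete rows or supply a best estimate.
complete_only = linked.dropna(subset=["service_hours"])
estimated = linked.fillna({"service_hours": linked["service_hours"].mean()})

print(linked[["match_key", "admit_year", "declared_major", "service_hours"]])
print(len(complete_only), "complete records;", len(estimated), "with estimates")
```

Whichever approach is taken with missing values, report it explicitly so readers know whether findings rest on complete records or on estimates.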


What to Do With the Data (Analysis)
• Develop a framework based on your indicators and key concepts in which you can record key findings from the documentation (sometimes creating a table or a blank matrix is useful; a simple version of such a matrix is sketched after this list). This helps to guide your thinking as you review documents and will keep you focused on your key indicators and concepts.
• Search for patterns in those findings that reflect the indicators and key concepts.
• Compare these patterns to other findings for the indicators and key concepts.
• Write a brief narrative to reflect your findings, and integrate this narrative into your overall report.
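One lightweight way to build that blank matrix is as a small table with one row per indicator and one column per document source, filled in as the review proceeds. The indicators, sources, and findings below are hypothetical placeholders; the same framework can be reused for the critical incident reports and journals described later in this chapter.

```python
import pandas as pd

# Hypothetical indicators (rows) and document sources (columns).
indicators = ["faculty motivation", "institutional support", "partner reciprocity"]
sources = ["syllabi", "meeting minutes", "annual reports"]

# Start with a blank matrix and record key findings as documents are reviewed.
matrix = pd.DataFrame("", index=indicators, columns=sources)
matrix.loc["institutional support", "annual reports"] = "new engagement office funded"
matrix.loc["partner reciprocity", "meeting minutes"] = "partners co-set agenda at 3 of 5 meetings"

print(matrix)
matrix.to_csv("documentation_findings_matrix.csv")  # share with the assessment team
```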

Critical Incident Report

An example of a critical incident report is included in the institutional chapter, with a focus on program administrators. Critical incident reports could also be used by students (as a form of reflection) or faculty.

What Is It?
• A reflective document requested of individuals involved in the program for purposes of evaluation
• A look back at major events (anticipated or unanticipated) that affected the program in positive or negative ways
• Documentation of key events that, in retrospect, significantly accelerated work toward accomplishment of goals; OR created barriers to goal accomplishment; OR enabled the organization to overcome barriers

Why/When Is It Used?
• To provide an overview of how program development issues affect outcomes
• To document the processes involved in program administration from a broad perspective over time (rather than a daily log)

Characteristics of Critical Incident Reports
• List of critical incidents in chronological order, with dates provided and a description of why each event is viewed as "critical" (a simple way to structure such a list is sketched below)
• Examples of critical incidents include: relevant institutional policy on service-learning adopted, grant awarded (or not awarded), key staff member hired or terminated, physical relocation of offices, new faculty promotion and tenure criteria adopted, accreditation report received, and so on
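If reports are collected from several administrators, keeping each incident as a small dated record makes it easy to assemble a single chronological list and to see at a glance which events helped and which hindered the program. The incidents below are hypothetical examples patterned on the list above.

```python
from datetime import date

# Hypothetical critical incidents gathered from program administrators.
incidents = [
    {"date": date(2017, 9, 15), "event": "Service-learning policy adopted by faculty senate",
     "effect": "accelerated", "why_critical": "Legitimized course development campus-wide"},
    {"date": date(2016, 11, 2), "event": "Federal grant proposal not funded",
     "effect": "barrier", "why_critical": "Delayed hiring of a partnership coordinator"},
    {"date": date(2018, 1, 20), "event": "New promotion and tenure criteria approved",
     "effect": "accelerated", "why_critical": "Recognized community-engaged scholarship"},
]

# Assemble one chronological list across all reports.
for item in sorted(incidents, key=lambda i: i["date"]):
    print(f'{item["date"]}  [{item["effect"]}]  {item["event"]}: {item["why_critical"]}')
```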

What to Do With the Data (Analysis)
• Develop a framework based on your indicators and key concepts in which you can record key findings from the critical incident reports (sometimes creating a table or a blank matrix is useful). This helps to guide your thinking as you review the various documents and will keep you focused on your key indicators and concepts.
• Search for patterns in those findings that reflect the indicators and key concepts.
• Compare these patterns to other findings for the indicators and key concepts.
• Write a brief narrative to reflect your findings, and integrate this narrative into your overall report.

Journal

A journal protocol is provided in the faculty chapter. Note that student journals are not included here, as they are usually intended primarily as a learning strategy in service-learning courses; the faculty journals presented here are intended as a reflective tool for gaining insights about the process and activities of offering service-learning courses.

What Is It?
• Personal reflections and observations by individuals, recorded on a regular basis
• Provides information related to the program being assessed from the personal perspective of key individuals involved in the program

Why/When Is It Used?
• To assess subtle changes in the program, or in the individual reporting, over time
• To encourage key individuals to reflect upon events and assess both their personal reactions and the organization's responses

Characteristics of a Journal
• Personal perspective
• Highly reflective
• Daily/weekly observations about program occurrences, student or community activities, and so on, and responses to them
• Free-form or in response to general guided questions

What to Do With the Data (Analysis)
• Develop a framework based on your indicators and key concepts in which you can record key findings from the journals (sometimes creating a table or a blank matrix is useful). This helps to guide your thinking as you review the various documents and will keep you focused on your key indicators and concepts.
• Collect the journals periodically (if kept over a long period) or once at the end of a prescribed period of time. For faculty keeping a journal during a course, you may wish to collect the journals approximately halfway through the course session to get a sense of the observations, and then again immediately following the end of the course.
• Read each journal and analyze the content using the framework you have developed.
• Search for patterns in those findings that reflect the indicators and key concepts. Record or track these by using colored highlighters to distinguish themes, by cutting and pasting an electronic version of the journals, or by whatever method works best to help you become familiar with the information.
• Compare these patterns to other findings for the indicators and key concepts.
• Write a brief narrative to reflect your findings, and integrate this narrative into your overall report.


REFERENCES

Alt, M.A. & Medrich, E.A. (1994). “Student Outcomes from Participation in Community Service.” Paper prepared for the U.S. Department of Education Office of Research. Anderson, S.M. (1998). “Service-learning: A National Strategy for Youth Development.” Position paper for Education Policy Task force, Institute for Communication Policy Studies, George Washington University. Angelo, T.A. & Cross, K.P. (1993). Classroom Assessment Techniques: A Handbook for College Teachers. San Francisco, CA: Jossey-Bass. Annie E. Casey Foundation. (1999). Research and Evaluation at the Annie E. Casey Foundation. Accessed at: http://www.aecf.org Association of American Colleges & Universities (AAC&U). (2015). The LEAP Challenge: Education for a World of Unscripted Problems. Washington, DC: Association of American Colleges & Universities. Astin, A. (1993). Assessment for Excellence: The Philosophy and Practice of Assessment and Evaluation in Higher Education. Phoenix, AZ: Oryx Press. Astin, A. & Gamson, Z. (1983). “Academic Workplace: New Demands, Heightened Tensions.” ASHE—ERIC Higher Education Research Report No. 10. Washington, DC: Association for the Study of Higher Education. Astin, A.W. & Sax, L. (1998). “How Undergraduates Are Affected by Service Participation.” Journal of College Student Development 39 (3): 251–263. Astin, A., Vogelgesang, L., Ikeda, E. & Yee, J. (2000). “How Service-Learning Affects Students.” Los Angeles, CA: University of California, Los Angeles, Higher Education Research Institute. Bamber, P. & Hankin, L. (2011). “Transformative Learning Through Service-Learning: No Passport Required.” Education + Training 53 (2–3): 190–206. Barr, R.B. & Tagg, J. (1995). “From Teaching to Learning: A New Paradigm for Undergraduate Education.” Change 27 (November/December): 13–25. Batchelder, T. & Root, S. (1999). “Effects of an Undergraduate Program to Integrate Academic Learning and Service: Cognitive, Prosocial Cognitive, and Identity Outcomes.” In M.C. Sullivan, R.A. Myers, C.D. Bradfield, & D.L. Street (Eds.). Service-Learning: Educating Students for Life, pp. 341–355. Harrisonburg, VA: James Madison University.

Battistoni, R.M. (1997). “Service-Learning as Civic Learning: Lessons We Can Learn from Our Students.” In G. Reeher & J. Cammarano (Eds.). Education for Citizenship, pp. 31–49. Lanham, MD: Rowman and Littlefield. Beaudoin, F. & Sherman, J. (2016). “Higher Education as a Driver for Urban Sustainability Outcomes: The Role of Portland State University Institute for Sustainable Solutions.” In B.D. Wortham-Galvin, J.A. Allen, & J. Sherman (Eds.). Let Knowledge Serve the City, pp. 152–171. Sheffield, UK: Greenleaf. Berk, R.A. (2012). “Top 20 Strategies to Increase the Online Response Rates of Student Rating Scales.” International Journal of Technology in Teaching and Learning, 8 (2): 98–107. Berson, J.S. & Youkin, W.F. (1998). “Doing Well by Doing Good: A Study of the Effects of a Service-Learning Experience on Student Success.” Paper presented at the annual meeting of the American Society of Higher Education, Miami, FL. Bess, J. (1982). New Directions for Teaching and Learning: Motivating Professors to Teach Effectively. San Francisco, CA: Jossey-Bass. Bill & Melinda Gates Foundation. (2016). Grand Challenges. Retrieved from www.gcgh.grandchallenges.org Bringle, R.G., Clayton, P.H. & Price, M.F. (2009). “Partnerships in Service Learning and Civic Engagement.” Partnerships: A Journal of Service-Learning and Civic Engagement, 1 (1): 1–20. Bringle, R. & Hatcher, J. (1995). “A Service-Learning Curriculum for Faculty.” Michigan Journal of Community Service Learning, 2: 114–122. Bringle, R. & Hatcher, J. (1998). “Implementing Service Learning in Higher Education.” Journal of Research and Development in Education, 29 (4): 31–41. Bringle, R.G. & Hatcher, J.A. (2000). “Institutionalization of Service Learning in Higher Education.” The Journal of Higher Education, 71 (3): 273–290. Brookfield, S.D. (1995). Becoming a Critically Reflective Teacher. San Francisco, CA: Jossey-Bass. Bruns, K. (2010). “Small Partnership Leads into National Outreach Scholarship Conference.” In H.E. Fitzgerald, C. Burack & S.D. Seifer (Eds.). Handbook of Engaged Scholarship: Contemporary Landscapes, Future Directions, Vol. 2: Community-Campus Partnerships, pp. 349– 359. East Lansing: Michigan State University Press.


138  References Bucco, D.A. & Busch, J.A. (1996). “Starting a ServiceLearning Program.” In B. Jacoby & Associates (Eds.). Service Learning in Higher Education, pp. 231–245. San Francisco, CA: Jossey-Bass. Buchanan, R. (1997). “Service-Learning Survey Results.” Unpublished manuscript, University of Utah, Bennion Community Service Center, Salt Lake City. Bycio, P.B. & Allen, J.S. (2004) “A Critical Incident Approach to Outcome Based Assessment.” Journal of Education for Business, 80 (2): 86–92. Campus Compact. (1999). Presidents’ Fourth of July Declaration on the Civic Responsibility of Higher Education. Providence, RI: Campus Compact. Available at http:// www.compact.org Campus Compact. (2001). Indicators of Engagement: A Strategy for Deepening Civic and Community Engagement in Higher Education. Retrieved from http:// compact.org/indicators-of-engagement-projectcategories-page/ Campus Compact. (2016a). Campus Compact Overview. Accessed at https://compact.org/who-we-are/ https:// compact.org/www.compact.org Campus Compact. (2016b). Partnerships. Accessed at http://compact.org/who-we-are/our-coalition/partnersfunders/partnerships CCPH Board of Directors. (2013). Position Statement on Authentic Partnerships. Community-Campus Partnerships for Health. Retrieved from https://ccph .memberclicks.net/principles-of-partnership Center for Service and Learning. (2017). Teaching, Research, and Assessment. Retrieved from http://csl .iupui.edu/teaching-research/opportunities/civiclearning/graduate.shtml CES4Health. (2016a). Community Engaged Scholarship for Health. Retrieved from www.ces4health.info CES4Health. (2016b). Peer Review Process. Retrieved from http://www.ces4health.info/reviewer/peer-reviewprocess.aspx Clarke, M. (2000). “Evaluating the Community Impact of Service-learning: The 3-I Model.” Unpublished doctoral dissertation. Nashville: Peabody College of Vanderbilt University. Clayton, P.H., Bringle, R.G. & Hatcher, J.A.  (2012). Research on Service Learning: Conceptual Frameworks and Assessment, Vol. 1. Sterling, VA: Stylus. Commission on Community-Engaged Scholarship in the Health Professions. (2005). Linking Scholarship and Communities: The Report of the Commission on Community-Engaged Scholarship in the Health Professions. Community-Campus Partnerships for Health. Seattle, WA, Community-Campus Partnerships for Health. Retrieved from https://ccph.memberclicks.net/ assets/Documents/FocusAreas/linkingscholarship.pdf Commonwealth Corporation. (2013). Partnerships: A Workforce Development Practitioner’s Guide. ­Boston:


Commonwealth Corporation. Retrieved from www.commcorp.org Community-Campus Partnerships for Health (CCPH). (2017). CCPH Overview. Retrieved from www.ccph.info Connell, J., Kubisch, A., Schorr, L. & Weiss, C. (eds.). (1995). New Approaches to Community Initiatives: Concepts, Methods, and Contexts. Washington, DC: The Aspen Institute. Corporation for National and Community Service. 2016. President’s Higher Education Community Service Honor Roll. Retrieved from http://www.nationalservice.gov/ special-initiatives/honor-roll Council for Higher Education Accreditation (CHEA). (2006). “Footsteps and Footprints: Emerging National Accountability for Higher Education Performance.” Inside Accreditation with the President of CHEA, 2 (1, January 5). Retrieved from h ­ttp://www.chea.org/ 4 D C G I / c m s / r e v i e w. h t m l ? A c t i o n = C M S _ Document&DocID=245&MenuKey=main Creswell, J.W. (1994). Qualitative and Quantitative Approaches. Thousand Oaks, CA: Sage. Cruz, N.I. & Giles, D.E. (2000). “Where’s the Community in Service-Learning Research?” Michigan Journal of Community Service Learning, Special Issue on Strategic Directions for Service Learning Research, (Fall): 28–34. Deci, E. & Ryan, R. (1982). “Intrinsic Motivation to Teach: Possibilities and Obstacles in Our Colleges and Universities.” In J. Bess (Ed.). New Directions for Teaching and Learning: Motivating Professors to Teach Effectively, pp. 27–35. San Francisco, CA: Jossey-Bass. Driscoll, A. (2000). “Studying Faculty and Servicelearning: Directions for Inquiry and Development.” Michigan Journal of Community Service Learning, Special Issue on Strategic Directions for Service Learning Research, (Fall): 35–41. Driscoll, A. (2008). Carnegie’s Community Classification: Intentions and Insights. Change, The Magazine of Higher Learning 40 (1), 38–41. Driscoll, A. (2014). “Analysis of the Carnegie Classification of Community Engagement: Patterns and Impact on Institutions.” In D.G. Terkla & L.S. O’Leary (Eds.). Assessing Civic Engagement, pp. 3–15. San Francisco, CA: Jossey-Bass. Driscoll, A., Gelmon, S.B., Holland, B.A., Kerrigan, S., Longley, M.J. & Spring, A. (1997). Assessing the Impact of Service Learning: A Workbook of Strategies and Methods. 1st ed. Portland, OR: Center for Academic Excellence, Portland State University. Driscoll, A., Gelmon, S.B., Holland, B.A., Kerrigan, S., Spring, A., Grosvold, K. & Longley, M.J. (1998). Assessing the Impact of Service Learning: A Workbook of Strategies and Methods. 2nd ed. Portland, OR: Portland State University, Center for Academic Excellence.


  References  139 Driscoll, A., Holland, B.A., Gelmon, S.B. & Kerrigan, S. (1996). “An Assessment Model for Service Learning: Comprehensive Case Studies of Impact on Faculty, Students, Community and Institution.” Michigan Journal of Community Service Learning, 3 (Fall): 66–71. Driscoll, A. & Lynton, E. (1999). Making Outreach Visible: A Guide to Documenting Professional Service and Outreach. Washington, DC: American Association for Higher Education. Driscoll, A., Strouse, J. & Longley, M.J. (1997). “Changing Roles for Community, Students, and Faculty in Community-Based Learning Courses.” Journal of Higher Education and Lifelong Learning, 3 (5): 17–24. Driscoll, A. & Wood, S. (2007). Developing OutcomesBased Assessment for Learner-Centered Education: A Faculty Introduction. Sterling, VA: Stylus. Ehrlich, T. (2000). Civic Responsibility and Higher Education. Phoenix, AZ: American Council on Education and Oryx Press. Ellison, J. & Eatman, T.K. (2008). Scholarship in Public: Knowledge Creation and Tenure Policy in the Engaged University. Syracuse, NY: Imagining America. Engagement Scholarship Consortium. (2017). Committed to Excellence in Scholarship and Practice of Engagement Locally and Globally. Retrieved from www.engagementscholarship.org Eyler, J. (2000). “What Do We Most Need to Know About the Impact of Service-Learning on Student Learning?” Michigan Journal of Community Service-Learning, Special Issue on Strategic Directions for Service Learning Research, (Fall): 11–17. Eyler, J. & Giles, D.E. (1994). “The Impact of a College Community Service Laboratory on Students’ Personal, Social, and Cognitive Outcomes.” Journal of Adolescence 17, 327–339. Eyler, J. & Giles, D.E. (1999). Where’s the Learning in Service-Learning? San Francisco, CA: Jossey-Bass. Eyler, J., Giles, D.E. & Gray, C. (1999). At a Glance: Summary and Annotated Bibliography of Recent ServiceLearning Research in Higher Education. Minneapolis, MN: Learn & Serve America National Service-Learning Clearinghouse. Feld, M.M. (2002). “Advisory Boards and Higher Education’s Urban Mission: Four Models.” Metropolitan Universities, 13 (1): 22–32. Flynn, E. (2015). “From Capstones to Strategic Partnerships: The Evolution of Portland State University’s Community Engagement and Partnership Agenda.” Metropolitan Universities, 26 (3): 159–170. Fullerton, A., Reitenauer, V.L. & Kerrigan, S.M. (2015). “A Grateful Recollecting: A Qualitative Study of the Long-Term Impact of Service-Learning on Graduates.” Journal of Higher Education Outreach and Engagement, 19 (2): 65–92.


Furco, A. (2000). Self-Assessment Rubric for the Institutionalization of Service-Learning in Higher Education. Berkeley: University of California at Berkeley. Furco, A. & Holland, B. (2013). “Using Organizational Change Theory to Assess Institutionalization of Service-Learning and Community Engagement.” In P. Clayton, R. Bringle & J. Hatcher (Eds.). Research on Service Learning: Conceptual Frameworks and Assessment, pp. 441–469. Sterling, VA: Stylus. Gajda, R. (2004). “Utilizing Collaboration Theory to Evaluate Strategic Alliances.” American Journal of Evaluation, 25: 65–77. Gelmon, S.B. (1997a). Facilitating Academic-Community Partnerships Through Educational Accreditation: Overcoming a Tradition of Barriers and Obstacles. Rockville, MD: Bureau of Health Professions, Health Resources, and Services Administration, US Public Health Service. Gelmon, S.B. (1997b). “Intentional Improvement: The Deliberate Linkage of Assessment and Accreditation.” In Assessing Impact: Evidence and Action, pp. 51–65. Washington, DC: American Association of Higher Education. Gelmon, S.B. (2000a). “Challenges in Assessing Servicelearning.” Michigan Journal of Community Service Learning, Special Issue on Strategic Directions for Service Learning Research (Fall), 84–90. Gelmon, S.B. (2000b). “How Do We Know That Our Work Makes a Difference? Assessment Strategies for Service-Learning and Civic Engagement.” Metropolitan Universities, 11 (Fall): 28–39. Gelmon, S.B. (2003). “Assessment as a Means of Building Service-Learning Partnerships.” In Barbara Jacoby and Associates (Eds.). Building Partnerships for ServiceLearning, pp. 42–64. San Francisco, CA: John Wiley & Sons. Gelmon, S.B. (2010). “A Catalyst for Research: The International Association for Research on Service-Learning and Community Engagement.” In H.E. Fitzgerald, C. Burack & S.D. Seifer (Eds.). Handbook of Engaged Scholarship: Contemporary Landscapes, Future Directions, Vol. 2: Community-Campus Partnerships, pp. 393–406. East Lansing, MI: Michigan State University Press. Gelmon, S.B. & Barnett, L., with the CBQIE-HP Technical Assistant Team. (1998). Community-Based Quality Improvement in Education for the Health Professions: Evaluation Report, 1997–1998. Portland: Portland State University. Gelmon, S.B. & Connell, A. (2000). Program Evaluation Principles and Practices: A Handbook for Northwest Health Foundation Grantees. Portland: Northwest Health Foundation. Gelmon, S.B., Holland, B.A., Seifer, S.D., Shinnamon, A., & Connors, K. (1998). “Community-University Partnerships for Mutual Learning.” Michigan Journal of Community Service Learning, 5: 97–107.


140  References Gelmon, S.B., Holland, B.A. & Shinnamon, A.F. (1998). Health Professions Schools in Service to the Nation: 1996–1998 Final Evaluation Report. San Francisco, CA: Community-Campus Partnerships for Health, UCSF Center for the Health Professions. Retrieved from http://futurehealth.ucsf.edu/ccph.htm Gelmon, S.B., Holland, B.A., Shinnamon, A.F. & Morris, B.A. (1998). “Community-Based Education and Service: The HPSISN Experience.” Journal of Interprofessional Care, 12 (3): 257–272. Gelmon, S.B., Holland, B.A., Seifer, S.D., Shinnamon, A.F. & Connors, K. (1998). “Community-University Partnerships for Mutual Learning.” Michigan Journal of Community Service Learning, 5 (Fall): 97–107. Gelmon, S.B., Jordan, C.M. & Seifer, S.D. (2013a). “Community-Engaged Scholarship in the Academy: An Action Agenda.” Change Magazine, (July/August): 58–66. Gelmon, S.B., Jordan, C.M. & Seifer, S.D. (2013b). “Rethinking Peer Review: Expanding the Boundaries for Community-Engaged Scholarship.” International Journal of Research on Service-Learning and Community Engagement, 1 (1): 1–10. Gelmon, S.B., McBride, L.G., Hill, S., Chester, L. & Guernsey, J. (1998). Evaluation of the Portland Health Communities Initiative 1996–1998. Portland: Healthy Communities and Portland State University. Gelmon, S.B., Seifer, S.D., Kauper-Brown, J. & Mikkelsen, M. (2005) Building Capacity for Community Engagement: Institutional Self-Assessment. Seattle, WA: CommunityCampus Partnerships for Health. Retrieved from www .ccph.info Gelmon, S.B., White, A.W., Carlson, L. & Norman, L. (2000). “Making Organizational Change to Achieve Improvement and Interprofessional Learning: Perspectives from Health Professions Educators.” Journal of Interprofessional Care, 14 (2): 131–146. Gilbert, M.K., Holdt, C. & Christophersen, K. (1998). “Letting Feminist Knowledge Serve the City.” In M. Mayberry & E. Rose (Eds.). Meeting the Challenge: Innovative Feminist Pedagogies in Action, pp. 319–339. Newbury Park, CA: Sage. Giles, D.E., & Eyler, J. (1994). “The Theoretical Roots of Service-Learning in John Dewey: Toward a Theory of Service-Learning.” Michigan Journal of Community Service-Learning, 1: 77–85. Giles, D.E., Honnet, E.P. & Migliore, S. (Eds.). (1991). Research Agenda for Combining Service and Learning in the 1990s. Raleigh, NC: National Society of Internships and Experiential Education. Glassick, C.E., Huber, M.T., & Maeroff, G. (1997). Scholarship Assessed: Evaluation of the Professoriate. San Francisco, CA: Jossey-Bass. Gray, M.J., Ondaatje, E.H., Geschwind, S., Fricker, R., Goldman, C., Kaganoff, T., Robyn, A., Sundt, M., Vogelgesang, L. & Klein, S. (1999). Combining Service


and Learning in Higher Education. Santa Monica, CA: Rand Corporation. Hahn, T.W. & Hatcher, J.A. (2014). “The Relationship Between Service Learning and Deep Learning.” Paper presented at the Association for Institutional Research Forum, Orlando. Retrieved from https://scholarworks .iupui.edu/handle/1805/4530 Hammond, C. (1994). “Faculty Motivation and Satisfaction in Michigan Higher Education.” Michigan Journal of Community Service Learning, 1 (Fall): 42–49. Harkavy, I., Puckett, J. & Romer, D. (2000). “Action Research: Bridging Service and Research.” Michigan Journal of Community Service Learning, Special Issue on Strategic Directions for Service Learning Research, (Fall): 113–118. Hart, A., Northmore, S. & Gerhardt, C. (n.d.). Briefing Paper: Auditing, Benchmarking and Evaluating Public Engagement. Bristol, UK: National Co-ordinating Centre for Public Engagement. Retrieved from https:// www.publicengagement.ac.uk/sites/default/files/­ EvaluatingPublicEngagement.pdf Hatcher, J. (2008). “The Public Role of Professionals: Developing and Evaluating the Civic-Minded Professional Scale.” Doctoral dissertation. Retrieved from Pro Quest Dissertation and Theses, AAT 3331248. Hatcher, J.A., Bringle, R.G. & Hahn, T.W. (Eds.). (2016). Research on Student Civic Outcomes in Service Learning: Conceptual Frameworks and Methods. Sterling, VA: Stylus. Himmelman, A. (2004). Collaboration Defined: A Developmental Continuum of Change Strategies. Minneapolis, MN: Himmelman Consulting. Hoagland, S.R. (2006). “Exploring Correlates of Postsecondary Graduation Rates: An Updated Case for Consumer Education.” Accessed at: http://files.eric.ed.gov/ fulltext/ED519563.pdf Holland, B.A. (1997). “Analyzing Institutional Commitment to Service: A Model of Key Organizational Factors.” Michigan Journal of Community Service Learning, 4: 30–41. Holland, B.A. (1999a). “Factors and Strategies That Influence Faculty Involvement in Service.” Journal of Public Service and Outreach, 4 (1): 37–44. Holland, B.A. (1999b). Implementing Urban Missions Project: 1998–99 Evaluation Report. Washington, DC: Council of Independent Colleges. Holland, B.A. (2000a). “Evaluation Plan for the PSU Masters in Tribal Administration Program.” Unpublished report, Portland State University. Holland, B.A. (2000b). “The Engaged Institution and Sustainable Partnerships: Key Characteristics and Effective Change Strategies.” Presented at HUD Regional Conference, San Diego, CA. Holland, B.A. (2001a). “A Comprehensive Model for Assessing Service-Learning and CommunityUniversity Partnerships.” In M. Canada & B. Speck (Eds.). Service Learning: Practical Advice and Models,


  References  141 New Directions in Higher Education Series, Number 114: 51–60. San Francisco, CA: Jossey-Bass. Holland, B.A. (2001b). “Implementing Urban Missions Project: An Overview of Lessons Learned.” Metropolitan Universities, 12 (3): 20–29. Holland, B.A. (2006). “Assessing Community Engagement: Issues, Models, Challenges, Suggestions.” Keynote PowerPoint presentation at the National Assessment Institute, Indianapolis, IN. Holland, B.A. (2008). “Community Engagement: Understanding Concepts and Strategies.” Keynote address, Annual Conference of Higher Education Research and Development Society of Australasia (HERDSA), Rotorua, New Zealand. Holland, B.A. (2016). “Factors Influencing Faculty Engagement—Then, Now and Future.” Journal of Higher Education and Outreach, 20 (1): 73–81. Holland, B.A. & Gelmon, S.B. (1998). “The State of the ‘Engaged Campus’: What Have We Learned About Building and Sustaining University-Community Partnerships?” AAHE Bulletin, 51 (October): 3–6. Hollander, E. & Hartley, M. (2000). “Civic Renewal in Higher Education: The State of the Movement and the Need for a National Network.” In Thomas Ehrlich (Ed.). Civic Responsibility and Higher Education, pp. 345–366. Phoenix, AZ: American Council on Education and Oryx Press. Hollander, E., Saltmarsh, J., & Zlotkowski, E. (2001). “Indicators of Engagement.” In L.A. Simon, M. Kenny, K. Brabeck & R. M. Lerner (Eds.). Learning to Serve: Promoting Civil Society Through Service-Learning, pp. 31–50. Norwell, MA: Kluwer Academic. Honnet, E.P. & Poulsen, S. (1989). “Principles of Good Practice in Combining Service and Learning.” Wingspread Special Report. Racine, WI: Johnson Foundation. Howard, J.P. (1995). Unpublished materials. Ann Arbor: University of Michigan. Imagining America. (2016). History: Imagining America. Retrieved from http://imaginingamerica.org/about/ Imagining America. (n.d.) The Publicly Engaged Scholars Study. Retrieved from http://imaginingamerica.org/ initiatives/engaged-scholars-study/ International Association for Research on Service-­Learning and Community Engagement. (2016). IARSLCE: About Us. Retrieved from www.researchslce.org Jacoby, B. & Associates. (2003). Building Partnerships for Service-Learning. San Francisco, CA: Wiley. Johnson, D.B. (1996). “Implementing a College-Wide Service Learning Initiative: The Faculty Coordinator’s Role.” Expanding Boundaries: Serving and Learning, pp. 15–16. Washington, DC: Corporation for National Service. Jordan, K.L. (1994). “The Relationship of Service-Learning and College Student Development.” Unpublished doctoral dissertation. Blacksburg, VA: Virginia Polytechnic Institute and State University.


Jordan, C., Seifer, S., Sandmann, L. & Gelmon, S. (2009). “CES4Health.info: Development of a Peer Reviewed Mechanism for Dissemination of Innovative Products of Health-Related Community-Engaged Scholarship.” International Journal of Prevention Practice and Research, 1 (1): 21–28. Jordan, C.M., Wong, K.A., & Jungnickel, P.W. (2009). “The Community-Engaged Scholarship Review, Promotion, and Tenure Package: A Guide for Faculty and Committee Members.” Metropolitan Universities Journal, 20 (2): 66–86. Kania, J. & Kramer, M. (2011). “Collective Impact.” Stanford Social Innovation Review, (Winter): 36–41. Keen, C. & Hall, K. (2009). “Engaging With Difference Matters: Longitudinal Student Outcomes of Cocurricular Service-Learning Programs.” Journal of Higher Education, 80 (1): 59–79. Keith, N.Z. (1998). “Community Service for CommunityBuilding: The School-Based Service Corps as Border Crossers.” Michigan Journal of Community Service Learning, 5 (Fall): 86–98. Kellogg Commission on the Future of State and LandGrant Institutions. (1999). Returning to Our Roots: The Engaged Institution. Washington, DC: National Association of State Universities and Land-Grant Colleges. Kerrigan, S. & Jhaj, S. (2007). “Assessing General Education Capstone Courses: An In-depth Look at a Nationally Recognized Capstone Assessment Model.”  Peer Review, 9 (2): 13–16. Knapp, M.L., Bennett, N.M., Plumb, J.D. & Robinson, J.L. (2000). “Community-Based Quality Improvement Education for the Health Professions: Balancing Benefits for Communities and Students.” Journal of Interprofessional Care, 14 (2): 119–130. Kretzman, J. & McKnight, J. (1993). Building Communities from the Inside Out: A Path Toward Finding and Mobilizing a Community’s Assets. Chicago, IL: ACTA Publications. Kuh, G.D., Kinzie, J., Schuh, J.H. & Whitt, E.J.  (2011). Student Success in College: Creating Conditions That Matter. San Francisco, CA: Wiley. Kuh, G.D. & O’Donnell, K. (2013). Ensuring Quality and Taking High-Impact Practices to Scale. Washington, DC: Association of American Colleges & Universities. Ladden, M., Hassmiller, S., Maher, N. & Reinhardt, R.J. (2014). Leaving a Legacy: Life, Liberty, Health and the Pursuit of Happiness. A Compendium of Partners Investing in Nursing’s Future. Princeton, NJ: Robert Wood Johnson Foundation and Northwest Health Foundation. Retrieved from http://www.partnersinnursing.org/wp-content/uploads/2014/12/PIN-LegacyCompendium.pdf Langley, G.J., Nolan, K.M., Nolan, T.W., Norman, C.L. & Provost, L.P. (1996). The Improvement Guide. San Francisco: Jossey-Bass.


142  References Laursen, S.L., Thiry, H. & Liston, C.S. (2012). “The Impact of a University-Based School Science Outreach Program on Graduate Student Participants’ Career Paths and Professional Socialization.” Journal of Higher Education Outreach and Engagement, 16 (2): 47–78. Lazarus, J. (2007). “Embedding Service Learning in South African Higher Education: The Catalytic Role of the CHESP Initiative.” Education as Change, 11 (December): 91–108. Levesque-Bristol, C., Knapp, T.D. & Fisher, B.J. (2011). “The Effectiveness of Service-Learning: It’s Not Always What You Think.” Journal of Experiential Education, 33 (3): 208–224.Lynton, E. (1995). Making the Case for Professional Service. Washington, DC: American Association for Higher Education. Magruder, J., McManis, M.A., & Young, C.C. (1997). “The Right Idea at the Right Time: Development of a Transformational Assessment Culture.” In P.J. Gray & T.W. Banta (Eds.). The Campus-Level Impact of Assessment: Progress, Problems, and Possibilities. New Directions for Higher Education, 100 (Winter): 17–29. Mattessich, P.W., Murray-Close, M. & Monsey, B.M. (2001). Collaboration: What Makes It Work. Minneapolis, MN: Amherst Wilder Foundation. McGovern, D.P. & Curley, M.F. (2010). “Campus Compact—Engaged Scholarship for the Public Good.” In H.E. Fitzgerald, C. Burack & S.D. Seifer (Eds.). Handbook of Engaged Scholarship: Contemporary Landscapes, Future Directions. Vol. 2: Community-Campus Partnerships, pp. 339–347. East Lansing, MI: Michigan State University Press. McKay, V.C. & Estrella, J. (2008). “First-Generation Student Success: The Role of Faculty Interaction in Service Learning Courses.”  Communication Education,  57 (3): 356–372. McKeachie, W. (1982). “The Rewards of Teaching.” In J. Bess (Ed.). New Directions for Teaching and Learning: Motivating Professors to Teach Effectively, pp. 7–13. San Francisco, CA: Jossey-Bass. McPherson, P. & Shulenburger, D. (2006). Improving Student Learning in Higher Education Through Better Accountability and Assessment. National Association of State Universities and Land-Grant Colleges. Retrieved from https://chfasoa.uni.edu/Nasulgcapril2006.pdf MedEd Portal. (2016). MedEd Portal: The Journal of Teaching and Learning Resources. Retrieved from www.mededportal.org/ MERLOT. (2016). MERLOT II: Multimedia Education Resource for Learning and Online Teaching. Retrieved from www.merlot.org/merlot/index.htm Miles, M.B. & Huberman, A.M. (1994). Qualitative Data Analysis. Thousand Oaks, CA: Sage. Morgan, D.L. (1993). Successful Focus Groups. Newbury Park, CA: Sage. Morgan, D.L. (1997). Focus Groups as Qualitative Research. Newbury Park, CA: Sage.


Morgan, D.L. (1998). The Focus Group Guidebook. Thousand Oaks, CA: Sage. Myers-Lipton, S.J. (1996). “Effect of a Comprehensive Service-Learning Program on College Students Level of Modern Racism.” Michigan Journal of Community Service-Learning, 3 (1): 44–54. NERCHE. (2016). Carnegie Community Engagement Classification. Retrieved from http://nerche.org/index .php?option=com_content&view=article&id=341&Ite mid=618 Norman, L.A., Gelmon, S.B. & Ryan, K. (2014). Partners Investing in Nursing’s Future, Final Evaluation Report: PIN 1-5 Cohorts. Portland, OR: Northwest Health Foundation and Robert Wood Johnson Foundation. Retrieved from www.partnersinnursing.org O’Meara, K.A., Eatman, T. & Peterson, S. (2015). “Advancing Engaged Scholarship in Promotion and Tenure: A Roadmap and Call for Reform.” Liberal Education, 101 (3). Retrieved from http://www.aacu.org/ liberaleducation/2015/summer/o’meara Orton, J.D. & Weick, K. (1990). “Loosely Coupled Systems: A Reconceptualization.” Academy of Management Review, 15 (2): 203–223. Palomba, C.A. (1997). “Assessment at Ball State University.” In P.J. Gray & T.W. Banta (Eds.). The CampusLevel Impact of Assessment: Progress, Problems, and Possibilities. New Directions for Higher Education, 100 (Winter): 31–45. Pelco, L.E. & Babb, J. (2016). High Impact Practices (HIPS) Assessment Model: 2015–2016 Pilot Mid-Year Report. Retrieved from http://scholarscompass.vcu.edu/cgi/ viewcontent.cgi?article=1051&context=community_ resources Petersen, A. (1998). W.K. Kellogg Foundation Evaluation Handbook. Battle Creek, MI: W.K. Kellogg Foundation. Portland State University. (2017a). Capstone Assessment and Research. Retrieved from http://www.pdx.edu/ unst/capstone-assessment-and-research Portland State University. (2017b). Strategic Partnerships: Partnership Spectrum. Retrieved from http://www.pdx .edu/partnerships/partnership-spectrum Portland State University. (n.d.). Capstone Proposal. Retrieved from http://capstone.unst.pdx.edu/sites/ default/files/CapstoneProposal_0.pdf Ramaley, J., & Leakes, A. (2002). Greater Expectations: A New Vision for Learning as a Nation Goes to College. Washington, DC: Association of American Colleges and Universities. Retrieved from https://www.aacu .org/sites/default/files/files/publications/Greater Expectations.pdf Rhodes, T. (2008). “VALUE: Valid Assessment of Learning in Undergraduate Education.” New Directions for Institutional Research, S1: 59–70. Rice, D., & Stacey, K. (1997). “Small Group Dynamics as a Catalyst for Change: A Faculty Development Model for


  References  143 Academic Service-Learning.” Michigan Journal of Community Service Learning, 4 (Fall): 57–64. Rubin, S. (1996). “Institutionalizing Service-Learning.” In B. Jacoby & Associates (Eds.). Service Learning in Higher Education, pp. 297–316. San Francisco, CA: Jossey-Bass. Sandy, M. & Holland, B.A. (2006). “Different Worlds and Common Ground: Community Partner Perspectives on Campus-Community Relationships.” Michigan Journal of Community Service Learning 13 (1, Fall): 30–43. Sax, L. & Astin, A. (1996). “The Impact of College on Post College Involvement in Volunteerism and Community Service.” Paper presented at the annual meeting of the Association for Institutional Research, Albuquerque, NM. Schneider, C.G. (2002). “Greater Expectations and Civic Engagement.” Liberal Education, 88 (4, Fall): 2–5. Schneider, C.G. (2004). “Our Students’ Best Work: A Framework for Accountability Worthy of Our Mission.” Peer Review, 7: 24–28. Scholtes, P.R. (1997). “Communities as Systems.” Quality Progress, 30: 49–53. Seifer, S., Wong, K., Gelmon, S. & Lederer. M. (2009). “The Community-Engaged Scholarship for Health Collaborative: A National Change Initiative Focused on Faculty Roles and Rewards.” Metropolitan Universities 20 (2): 5–21. Seifer, S.D. & Maurana, C.A. (2000). “Developing and Sustaining Community-Campus Partnerships: Putting Principles into Practice.” Partnership Perspectives, 1 (Summer): 7–11. Sigmon, R. (1979). “Service-Learning: Three Principles.” Synergist, 8: 9–11 Sigmon, R. (1996). “Anatomy of a Community-University Partnership.” Paper presented at Community Partnerships in Health Professions Education: A National Conference on Service Learning, Boston. Shinnamon, A.F., Gelmon, S.B., & Holland, B.A. (1999). Methods and Strategies for Assessing Service Learning in the Health Professions. San Francisco: CommunityCampus Partnerships for Health, UCSF Center for the Health Professions. Retrieved from http://futurehealth .ucsf.edu/ccph.htm Simonet, D. (2008). “Service-Learning and Academic Success: The Links to Retention Research.” Minnesota Campus Compact,  1: 13. Retrieved from https:// wmich.edu/sites/default/files/attachments/u5/2013/ Service-Learning%20and%20Academic%20Success .pdf Stanton, T. (1990). Integrating Public Service with Academic Study: The Faculty Role. Providence, RI: Campus Compact. Stanton, T. (1994). “The Experience of Faculty Participants in Instructional Development Seminar on ServiceLearning.” Michigan Journal of Community Service Learning, 1: 7–20.


Steinberg, K.S., Hatcher, J.A., & Bringle, R.G. (2008) “Civic-Minded Graduate: A North Star.” Michigan Journal of Community Service Learning, 18: 19–33. Strive Together. (2017). The Network. Retrieved from http://www.strivetogether.org/the-network Talloires Network. (2005). Talloires Declaration on the Civic Roles and Social Responsibilities of Higher Education. Retrieved from http://talloiresnetwork.tufts .edu/who-we-ar/talloires-declaration/ Talloires Network. (2016). Who We Are. Retrieved from http://talloiresnetwork.tufts.edu/who-we-ar/ Taylor-Powell, E., Rossing, B. & Geran, J. (1998). Evaluating Collaboratives: Reaching the Potential. University of Wisconsin System Board of Regents and University of Wisconsin-Extension, Cooperative Extension. Trower, C. (2006). “Gen X Meets Theory X: What New Scholars Want.” Address to the 33rd National Conference “Future Thinking: Academic Collective Bargaining in a World of Rapid Change,” National Center for the Study of Collective Bargaining in Higher Education and the Professions, New York, NY. Trower, C. (2010). “A New Generation of Faculty; Similar Core Values in a Different World.” AACU Peer Review, 12 (3, Summer): 27–30. Trower, C. (2012). “Gen X Meets Theory X: What New Scholars Want.” Journal of Collective Bargaining in the Academy, Article 11. Retrieved from http:// thekeep.eiu .edu/jcba/vol0/iss1/11 US HUD Community Outreach Partnership Centers (COPC). (2017). Case Studies: Anchor Institutions. Retrieved from https://www.huduser.gov/portal/casestudies/AnchorInstitutions.html University of Minnesota. (2016). University Research and Outreach-Engagement Center (UROC). Retrieved from http://uroc.umn.edu/ University of Nebraska Omaha. (2017). Barbara Weitz Community Engagement Center. Retrieved from http:// www.unomaha.edu/community-engagement-center/ University of North Carolina Greensboro. (2017). Institute for Community and Economic Engagement (ICEE): Community and Friends. Retrieved from https:// communityengagement.uncg.edu/icee/ University of Technology Sydney. (2017). UTS Shopfront: Community University Engagement. Retrieved from https://www.uts.edu.au/partners-and-community/­ initiatives/uts-shopfront-community-program/about-us Virginia Commonwealth University. (2017). Community Engagement: Service-Learning. Retrieved from www .community.vcu.edu/community-indicators--data-/ c om mu n it y - e ng age me nt - d a shb o ard / s e r v i c e learning-/ Ward, K. (1996). “Service-Learning and Student Volunteerism: Reflections on Institutional Commitment.” Michigan Journal of Community Service Learning, 3: 55–65.


144  References Wealthall, S., Graham, J. & Turner, C. (1998). “Building, Maintaining and Repairing the Community-Campus Bridge: Five Years’ Experience of Community Groups Educating Medical Students.” Journal of Interprofessional Care, 12 (August): 289–302. Wechsler, A. & Fogel, J. (1995). “The Outcomes of a Service-Learning Program.” National Society for Experiential Education Quarterly, 21 (4): 6–7, 25–26. Welch, M. & Saltmarsh, J. (2013). “Current Practice and Infrastructures for Campus Centers of Community Engagement.” Journal of Higher Education Outreach and Engagement, 17 (4): 25–56. Woodland, R.H. & Hutton, M.S. (2012). “Evaluating Organizational Collaborations: Suggested Entry Points and Strategies.” American Journal of Evaluation, 33 (3): 366–383.


Yorio, P.L. & Ye, F. (2012). “A Meta-Analysis on the Effects of Service-Learning on the Social, Personal, and Cognitive Outcomes of Learning.”  Academy of Management Learning and Education, 11 (1): 9–27. Zlotkowski, E. (1999a). “Pedagogy and Engagement.” In R. Bringle, R. Games & E.A. Malloy (Eds.). Colleges and Universities as Citizens, pp. 66–120. Boston: Allyn & Bacon. Zlotkowski, E. (1999b). “A Service-Learning Approach to Faculty Development.” In J.P. Howard & R. Rhodes (Eds.). Service Learning Pedagogy and Research, pp. 81–89. San Francisco, CA: Jossey-Bass. Zlotkowski, E. (2000). “Service-Learning Research in the Disciplines.” Michigan Journal of Community Service Learning, Special Issue on Strategic Directions for Service Learning Research, (Fall): 61–67.


ABOUT THE AUTHORS

Sherril B. Gelmon, DrPH, is professor in the Oregon Health and Science University (OHSU) and Portland State University School of Public Health and director of the PhD program in health systems and policy. She teaches health systems management and policy in the MPH and PhD programs and leads improvement science curricula for several OHSU residency and faculty development programs. One of her areas of research focuses on community engagement, including institutional strategies, models of faculty recognition and rewards for community-engaged scholarship, and faculty development. She was the founding chair of the International Association for Research on Service-Learning and Community Engagement (IARSLCE) and is a recipient of the Thomas Ehrlich Civically Engaged Faculty award.

Barbara A. Holland, PhD, is an internationally recognized scholar and consultant on organizational change in higher education, with a focus on community engagement strategies. Holland held senior academic leadership roles at three universities in the United States and two in Australia, and she is an active consultant to higher education institutions across the United States and other nations. Current academic affiliations include Indiana University–Purdue University Indianapolis, University of North Carolina Greensboro, and University of Nebraska Omaha. She was a founding board member of IARSLCE and is a recipient of IARSLCE's Distinguished Service-Learning Research award.

Amy Spring, MPA, is the community research and partnership director, strategic partnerships, at Portland State University (PSU). She works with PSU students, faculty, staff, and community partners to facilitate and support the growth of community partnerships. She has been responsible for supporting the university's partnership council; the development and support of strategic partnerships with Portland Public Schools, Intel, and PGE; and providing leadership for the campus community engagement agenda. She has spent a substantial part of her career facilitating faculty and student development workshops focused on community engagement and coordinating recruitment of students and faculty to participate in applied community-based teaching.

Seanna M. Kerrigan, EdD, is the capstone program director at Portland State University (PSU), where she oversees the nation's largest community-based learning capstone program. In this role, she works collaboratively with scores of faculty, students, and community-based organizations to create reciprocal community-university partnerships for over 200 service-learning capstone courses annually, in which students meet a curricular graduation requirement while performing real-world projects in collaboration with PSU's urban community. Kerrigan promotes the concept of community-based learning nationally while publishing and presenting widely on issues related to this pedagogy, best practices, and assessment.

Amy Driscoll, PhD, was formerly director of teaching, learning, and assessment at California State University, Monterey Bay, where she initiated and coordinated an institutional approach to assessment focused on student learning outcomes. She also served as director of community/university partnerships at Portland State University. Driscoll led the design and implementation of the elective Carnegie Classification for institutions engaged with community (2005–2010). She then codirected/cofacilitated the classification through 2015.


INDEX

AAC&U. See American Association of Colleges and Universities
acquisition, 117, 118
activity, 5
administration: community observation and, 103; of critical incident report, 124; of focus group, 62; of interviews, 60; of journals, 77; of observations, 83; preparation and, 80; of surveys, 53, 89, 109; of syllabus analysis, 75; teaching-learning continua and, 85
age, 55, 58
allies, 6
alumni, 29
American Association of Colleges and Universities (AAC&U), 23; HIP and, 24
Analyses of Variance (ANOVA), 109, 131
ANOVA. See Analyses of Variance
assessment: constituencies for, x; context for, 31; cycle of, 39–43; emphasis on, 26–27; of environment, 120; of factors, 113–14; of impact, 27–29, 50, 65, 95, 97, 119; as improvement, 36; instruments for, 129; involvement with, 34–35, 96; matrix for, 38–39, 47–50, 66–69; multiconstituency approach to, 36–37; practices for, 9; principles of, 31–43; process of, 33–34; resources for, 10–11; service-learning for, 45–46
audio recording, 52
benefit: as economic, 98; as mutual, 21; as social, 94; of syllabus analysis, 71–72
bias, 40–41
Billig, Shelley, 3
California State University Monterey Bay (CSUMB), 6
Campus Compact, viii
capacity: of organization, 99; underinvestment in, 120
career development, 48, 49
Carnegie Community Engagement Classification, 9–10
CCPH. See Community-Campus Partnerships for Health
CES. See community-engaged scholarship
challenges: grand challenges as, 20–21; for institutions, 18
characteristics, 132; of critical incident reports, 135–36; of journals, 136; of observation, 123
CHESP program (Community-Higher Education-Service Partnerships), 14
Civic-Minded Graduate model (CMG), 25
classroom discussions, 71
classroom observation, 70–71; preparation for, 80; purpose of, 82–83
CMG. See Civic-Minded Graduate model
commitment: to engagement, 8–9; to service, 48, 49; understanding of, 114–16
communication, 13; facilitator and, 123; measure of, 49
community, vii, 47; assessment matrix for, 95; awareness of, 48; definition of, 12; focus group in, 101, 105; impact and, 95–111; involvement experiences in, 93; involvement with, 56, 58–59; partnership between university and, 100; partnerships and impact on, 11–19; perspective of, 6; strategies and methods for, 103–107
community-based learning: faculty surveys for, 90–94; impact of, 92; logistics and, 92; service-learning and, 24; student surveys for, 55–59
Community-Based Learning Community Partner Survey, 110–11
Community-Campus Partnerships for Health (CCPH), 3, 21; as guide, 10–11; partnerships and, 12; Principles of Partnership for, 96
community-engaged scholarship (CES), 20–21
community engagement: commitment to, 116; evidence and, 2–4; for faculty, 19–23; measurement and, 1–30
community observation, 101; administration and, 103; protocol and, 104; purpose of, 103
community partners, 102, 107–8; satisfaction of, 101
concepts: as core, 38; impact measure by, 47; variables or, 67
concerns: for assessment, 35; about teaching, 92
confidentiality, 135
connection, 111
considerations, 39; privacy and confidentiality as, 135; for reporting, 43
consistency, 18
constituents: assessment for, x; goals for, viii; identification of, 14; methods with, viii–ix; multiconstituency approach to, 36–37; purpose and, 34
context: for assessment, 31; definition of, 85; handbook in, 1; teaching-learning continua of, 87
Continuum of Teaching-Learning Contexts, 87
Continuum of Teaching-Learning Qualities, 87–88
contributions, 10
Corporation for National Service, viii
courses: design of, 27; evaluation of, 28; online, 26; satisfaction from, 111
cover letter, 130
Creswell, J. W., 50
Critical Incident Methodology, 29
critical incident report, 124–25; characteristics of, 135–36; sample of, 125; use of, 135
Cruz, N. I., 97
CSUMB. See California State University Monterey Bay
culture: demographics and, 19–20; of individualism, 20
curriculum vitae analysis, 72; purpose of, 80
data: analysis of, 54, 60, 131; collection of, vii, 7, 40–41, 52, 119; interest in, 1–2; from observations, 82; problems with, 134–35; purpose for, 4–5; sources of, 67, 99; from teaching-learning continua, 86
deficiencies, 36
demographics, 19–20
design: of courses, 27; of instruments, 129; issues with, 40–41; launch and, 6
development: career and, 48, 49; for professionals, 68; tools for, 27–28
dissemination, 2; peer review and, 22–23
distinction, 86
diversity, x; sensitivity to, 48, 49
documentation, 134–35; as institutional, 127
Eastern Michigan University, 66
education, vii
engaged campus, 32–33
engagement: commitment to, 8–9; leadership and, 8–9; measurement of, ix; PSU spectrum of, 15; research on, 2; as scholarship, 22; test of, 32–33
Engagement Scholarship Consortium (ESC), 3
environment, 120
ESC. See Engagement Scholarship Consortium
evaluation, 28, 38
evidence: community engagement and, 2; of contributions, 10; of impact, 8
experiences: alumni and, 29; of community involvement, 93; reflection on, 56–57, 59, 91; satisfaction with, 69; university and, 110
factors: assessment matrix for, 115–19; assessment of, 113–14; organization and, 116; for success, 96–97
faculty: assessment matrix for, 72; community engagement for, 19–23; impact on, 65–94; initiatives for, 21–22; interview protocol for, 122–23; resources for, 23; strategies and methods for, 73–88; surveys and, 69, 89
faculty journals: issues with, 71; protocol for, 78–79; purpose of, 77; use of, 136
feedback, 13
focus group, 51, 132; administration of, 62; community as, 101, 105; conducting of, 123; goals of, 62; preparation and, 61, 105; protocol for, 62, 106; purpose of, 61; questions for, 62–63
format, 130; focus group and, 132; for interviews, 131
Furco, Andrew, 3
gender, 55, 58
GIFT. See Group Instructional Feedback Technique
Giles, D. E., 97–98
Goal-Concept-Indicator-Method approach, 37
goals, 8, 12; constituents and, viii; of focus group, 62; language and, 4; measurement and, vii; practices and, ix; of social justice, 30; university and, 14
grants, 37
Greater Expectations: A New Vision of Learning as a Nation Goes to College (American Association of Colleges and Universities), 23–24
Group Instructional Feedback Technique (GIFT), 28
Hammond, C., 66
Health Professions Schools in Service to the Nation (HPSISN), 37
High-Impact Practices (HIP), 24
Holland, B. A., 66
HPSISN. See Health Professions Schools in Service to the Nation
IA. See Imagining America
ICEE. See Institute for Community and Economic Engagement
identification: of constituents, 14; of data, 5–6
ignorance, 86
image/reputation, 117, 118
Imagining America (IA), 3
impact: assessment of, 27–29, 50, 65, 95, 97, 119; community and, 95–111; of community-based learning, 92; faculty and, 65–94; indicators of, 38; institutions and, 7–11, 113–14; partnerships and, 11–19; as personal or professional, 68; of service learning, 46–47; students and, 45–63; understanding of, 25–26, 93, 95–98, 110
implementation, 120
improvement, 36
Indiana University-Purdue University Indianapolis (IUPUI), 17
individualism, 20
influence: on scholarship, 78; service and, 56, 59, 91, 94
information, 68; accuracy of, 70; about barriers, 69; consistency with, 18; disclosure of, 43; frequency and, 134; about study, 130; surveys and, 55, 58, 90, 110; tools for, viii
infrastructure, 117
initiatives, 21–22
innovation, 16–17
Institute for Community and Economic Engagement (ICEE), 17
institutional documentation, 127
institutional observations, 117
institutions: challenges for, 18; impact of, 7–11, 113–14; interviews and, 121; methods and strategies for, 121–27; mission of, 116; motivation and, 115; service-learning and, 113; websites for, 14
instruments: for assessment, 129; building of, 7; issues and selection of, 39; types of, 40–42; use of, 129
integration, 116
interactions, 100
interviews, 51; advantage of, 51; community partners for, 102, 107–8; faculty and, 70; format for, 131; institutions and, 121; protocol for, 60–61, 74, 108, 122; purpose of, 60, 73, 107
investment: of resources, 35; underinvestment and, 120
involvement: with assessment, 34–35, 96; attitude toward, 85; with community, 56, 58–59
issues: classroom observation and, 71; with design, 40–41; instrument selection and, 39; with interviews, 70; journals and, 72; of methodology, 96; time versus value as, 42
IUPUI. See Indiana University-Purdue University Indianapolis
language, 52; of civic engagement, 31; goals and, 4
leadership, 117; engagement and, 8–9
levels, 118
The LEAP Challenge: Education for a World of Unscripted Problems (AAC&U, 2015), 24
limitations: of focus group, 51–52; of surveys, 51
literature, x, 65
logistics, 92
matrix: for assessment, 37–39, 47–50, 66–68; for community assessment, 98–100; for faculty assessment, 67; for institutional factors, 115–19
measurement: assessment and, x; communication and, 49; community engagement and, 1–30; of engagement, ix; goals and, vii; methods of, 67, 99; planning strategy for, 4–7
methodology: of education, vii; issues of, 96
methods, 38, 119; analysis and, 129–36; community strategies and, 103–109; comparison of, 40–41; faculty strategies and, 73–88; institutions strategies and, 121–27; of measurement, 67, 99; student strategies and, 53–54
mission, 116
Model for Improvement, 36
models, 36; CMG as, 25; resources and, 18–19; SOFAR as, 14
moderators, 132
monitoring, 5
monograph, vii–viii
motivation, 78; institutions and, 115; satisfaction as, 67
multiconstituency approach, 36–37
narrative, 84
National Society for Internships and Experiential Education, 45
observation, 123–24. See also specific observations
Observation Form, 84
online, 26
organization: capacity of, 99; factors and, 116; handbook and, ix–x
outcomes, 86
output, 42
ownership, 48, 49
participants, 132–33
partnership, 96; agenda for, 13–14; CCPH and, 12; community impact and, 11–19; of community-university, 100; innovation approach to, 16–17; perspective on, 13–15; satisfaction and, 101; structures for, 15–16; sustainability of, 100
pedagogy, 25, 45. See also teaching
peer review, 22–23
perspective, 55, 58; of communities, 6; on community-based learning, 90, 93; on partnerships, 13–15; on service, 78
planning: for data collection, 7; measurement strategy and, 4–7
Portland State University (PSU), vii–ix; examples from, 27–29; Partnership Spectrum creation at, 15; SNI by, 16
practices: for assessment, 9; goals and, ix; HIP as, 24
preparation: administration and, 80; for classroom observation, 82; community observation and, 103; focus group and, 61, 105; interviews and, 60; journals and, 75; surveys and, 53, 89; of syllabus analysis, 75; teaching-learning continua and, 85
President's Fourth of July Declaration on the Civic Responsibility of Higher Education (Campus Compact, 1999), 114
pre-test, 130
principles: of assessment, 31–43; of partnership, 12–13
privacy, 135
problems, 134–35
process: of assessment, 33–34; for teaching, 91
professional development, 68
progress, 1–30
protocol: community observation and, 105; for focus group, 62, 106; for interviews, 60–61, 74, 108, 122; for journals, 78–79
PSU. See Portland State University
publications, 81
purpose: of classroom observation, 82; of community observation, 103; constituents and, 34; critical incident review and, 124; of curriculum vitae analysis, 80; for data, 4–5; focus group and, 61; of interviews, 60, 73, 107; journals and, 77; of survey, 53, 89, 109; of syllabus analysis, 75; of teaching-learning continua, 85
reflection: on experiences, 56–57, 59, 91; journals for, 78–79; on progress, 1–30
relevance, 116
reporting, 43
research: on engagement, 2–4; publications and, 81; on service-learning, 46; state of, 65–66; underinvestment in, 120
Research on Service-Learning: Conceptual Frameworks and Assessment (Clayton, Bringle, and Hatcher, 2012), 25–26
resources: acquisition of, 117, 118; for assessment, 10–11; for faculty, 23; investment of, 35; models and, 18–19
satisfaction: of community partners, 101; as motivation, 68; from service-learning, 69; from university courses, 111
scholarship: engagement as, 22; impact or influence on, 68–69; influence on, 78
self-awareness, 48, 49
service: curriculum vitae analysis and, 81; influence and, 56, 59, 91, 94; perspective on, 76
service-learning: community-based learning and, 24; courses online for, 26; engaged campus and, 32–33; focus on, ix; impact of, 46–47; institutions and, 113; as pedagogy, 25, 45; research on, 46; satisfaction from, 69; for students, 23–30; understanding commitment to, 114–16; visibility of, 117, 118
Small Group Instructional Diagnosis (SGID), 28
SNI. See Sustainable Neighborhood Initiative
social justice, 30
SOFAR model, 14
software, 18
source: of data, 67, 99; institutional documentation and, 127; matrix framework and, 38
South Africa, 14
spectrum, 15
Statistical Package for Social Sciences (SPSS), 54
statistics, 131
structures: evaluation for, 38; for partnerships, 15–16
students: impact and, 45–63; service-learning for, 23–30; strategies and methods for, 53–54
success, 96–97
surveys, 28, 129; administration of, 53, 89, 109; advantage of, 50; community-based learning and, 55–59, 90–94; for community partners, 101; conducting of, 130; data from, 52; difficulty with, 54; faculty and, 70, 89; limitations of, 51; purpose of, 53, 89, 109; responses to, 130–31
sustainability: of partnership, 100; SNI and, 16
Sustainable Neighborhood Initiative (SNI), 16
syllabus analysis: benefits of, 71–72; purpose of, 75
teaching: concerns about, 92; impact or influence on, 68; learning from, 32; process for, 91
teaching-learning continua, 71; administration of, 85; context definition of, 85; data from, 86; preparation for, 85; purpose of, 85
test: of engagement, 32–33; pre-test as, 130
themes and concerns, 35
time, 42
tools: for development, 27–28; information and, viii
tracking, 5
transformation, 12
UNCG. See University of North Carolina Greensboro
understanding: of commitment, 114–16; of impact, 25–26, 93, 95–98, 110
university: experiences and, 110; goals for, 14. See also specific universities
University of California, Berkeley, 3
University of North Carolina Greensboro (UNCG), 17
values, 78
variables, 67
visibility, 117, 118
websites, 14
work samples, 28–29



Also available from Campus Compact

The Community Engagement Professional in Higher Education: A Competency Model for an Emerging Field
Edited by Lina D. Dostilio
Foreword by Andrew J. Seligsohn

“This book illuminates important work of thinly acknowledged citizens of academe—community engaged professionals. It advances the movement for publicly engaged scholarship giving voice to their person, place and purpose in academe with myriad inflections beyond campus borders.”—Timothy Eatman, Associate Professor, Syracuse University, Director for Research for Imagining America (IA)

This book, offered by practitioner-scholars, is an exploration and identification of the knowledge, skills, and dispositions that are central to supporting effective community engagement practices between higher education and communities. The discussion and review of these core competencies are framed within a broader context of the changing landscape of institutional community engagement and the emergence of the community engagement professional as a facilitator of engaged teaching, research, and institutional partnerships distinct from other academic professionals.

The Elective Carnegie Community Engagement Classification: Constructing a Successful Application for First-Time and Re-Classification Applicants
Edited by John Saltmarsh and Mathew B. Johnson

“In The Elective Carnegie Community Engagement Classification, Saltmarsh and Johnson have brought together scholars and practitioners from a diverse array of institutions that provide thoughtful, practical advice and insights about community engagement efforts in higher education. These experts offer candid reflections on how the process of applying for (or renewing) the classification can benefit an institution’s culture, commitment, self-assessment, strategic planning, and outreach. Institutions interested in pursuing this voluntary classification, as well as in enhancing their community engagement initiatives more broadly, will find this volume to be an extremely valuable resource.”—Jonathan Alger, President, James Madison University

“Now approaching its fifth cycle, the Carnegie Classification for Community Engagement is contributing significantly to the improvement of community engagement strategies and outcomes. This book offers clear and valuable perspectives [as well as] models and tips from a diverse array of institutional settings. Whether your campus plans to apply for the classification or you want an excellent guide for internal planning and assessment, this book is an excellent resource to help inform your path toward a strong and effective agenda of engagement.”—Barbara A. Holland, Distinguished Professor of Community Engagement, University of Nebraska, Omaha

The Carnegie Engagement Classification is designed to be a form of evidence-based documentation that a campus meets the criteria to be recognized as a community engaged institution. Editors John Saltmarsh and Mathew B. Johnson use their extensive experience working with the Carnegie Engagement Classification to offer a collection of resources for institutions that are interested in making a first-time or reclassification application for this recognition. Contributors offer insight on approaches to collecting the materials needed for an application and strategies for creating a complete and successful application. Chapters include detailed descriptions of what happened on campuses that succeeded in their application attempts, and even reflections from a campus that failed on its first application. Readers can make use of worksheets at the end of each chapter to organize their own classification efforts.

45 Temple Place
Boston, MA 02111

Subscribe to our newsletter: https://compact.org
