
RESEARCHING IN THE AGE OF COVID-19
Volume 1: Response and Reassessment
EDITED BY
HELEN KARA
SU-MING KHOO


RAPID RESPONSE

Researching in the Age of COVID-19
Volume 1: Response and Reassessment

Edited by Helen Kara and Su-ming Khoo


First published in Great Britain in 2020 by
Policy Press, an imprint of Bristol University Press
University of Bristol
1–9 Old Park Hill
Bristol BS2 8BB, UK
t: +44 (0)117 954 5940
e: bup-[email protected]

Details of international sales and distribution partners are available at policy.bristoluniversitypress.co.uk

Editorial selection and matter and conclusion © Helen Kara and Su-ming Khoo. Introduction © Su-ming Khoo and Helen Kara. Individual chapters © their respective authors, 2020.

British Library Cataloguing in Publication Data
A catalogue record for this book is available from the British Library
ISBN 978-1-4473-6038-4 ePub
ISBN 978-1-4473-6039-1 ePdf

The right of Helen Kara and Su-ming Khoo to be identified as editors of this work has been asserted by them in accordance with the Copyright, Designs and Patents Act 1988.

All rights reserved: no part of this publication may be reproduced, stored in a retrieval system or transmitted in any form or by any means, electronic, mechanical, photocopying, recording or otherwise, without the prior permission of Bristol University Press.

Every reasonable effort has been made to obtain permission to reproduce copyrighted material. If, however, anyone knows of an oversight, please contact the publisher.

The statements and opinions contained within this publication are solely those of the editors and contributors and not of the University of Bristol or Bristol University Press. The University of Bristol and Bristol University Press disclaim responsibility for any injury to persons or property resulting from any material published in this publication.

Bristol University Press and Policy Press work to counter discrimination on grounds of gender, race, disability, age and sexuality.


Table of contents



List of figures and tables  v
Notes on contributors  vii
Introduction  1
Su-ming Khoo and Helen Kara

Part I: Going digital
1 Evaluating strategies to improve the effectiveness and efficiency of CATI-based data collection during a global pandemic  9
Mridulya Narasimhan, Jagannath R and Fabrizio Valenti
2 Going virtual: finding new ways to engage higher education students in a participatory project about science  20
Helena Vicente, Ana Delicado, Jussara Rowland, João Estevens, André Weiß, Roberto Falanga, Annette Leßmöllmann and Mónica Truninger
3 Disorientation and new directions: developing the reader response toolkit  30
Louise Couceiro
4 Digital divide in the use of Skype for qualitative data collection: implications for academic research  40
Almighty Nchafack and Deborah Ikhile
5 Qualitative data collection under the 'new normal' in Zimbabwe  9
Emmanuel Ndhlovu

Part II: Going with methods that are in hand
6 Social surveys during the COVID-19 pandemic  63
Roxanne Connelly and Vernon Gayle
7 Structured literature review of psychological and social research projects on the COVID-19 pandemic in Peru  72
Sascha Reinstein and Eli Malvaceda
8 Switching over instead of switching off: a digital field research conducted by small-scale farmers in southern Africa and Indonesia  82
Judith Henze, Nicole Paganini and Silke Stöber

Part III: Needs and capabilities
9 Research methods to understand the 'youth capabilities and conversions': the pros and cons of using secondary data analysis in a pandemic situation  95
Paramjeet Chawla
10 Conducting the emergency response evaluation in the COVID-19 era: reflections on complexity and positionality  105
Arun Verma and Nikolaos Bizas
11 Challenges of a systematization of experiences study: learning from a displaced victim assistance programme during the COVID-19 emergency in ethnic territories in Colombia  115
Natalia Reinoso Chávez, Santiago Castro-Reyes and Luisa Fernanda Echeverry

Conclusion  126
Helen Kara and Su-ming Khoo

List of figures and tables

Figures
1.1 Survey status and response rate  14
8.1 Farmer voices were captured weekly from all food regions and published through social media channels  88

Tables
1.1 Summary of host studies  12
1.2 Summary of calls status  13
1.3 Completion rate and field operations time investment  14
1.4 Regression: follow-up survey  15
1.5 Impact of audio audits on survey refusal rates  16


Notes on contributors

Nikolaos Bizas is a Senior Evidence and Learning Advisor for Save the Children UK. He has designed, conducted and analysed research for academia (University of Edinburgh, University of Warwick), the United Nations (IMO) and the Scottish third sector (Scottish Social Services Council, Voluntary Health Scotland).

Santiago Castro-Reyes is a Psychology and Philosophy undergraduate student at the University of La Sabana, Colombia. He is interested in community psychology, interculturality and peace building.

Paramjeet Chawla is a PhD scholar at the School of Development Studies at Tata Institute of Social Sciences (TISS), Mumbai, India. Her key areas of research are human development and capabilities, horizontal inequality, social inequality, gender, social exclusion, youth and Delhi.

Roxanne Connelly is a Senior Lecturer of Sociology and Quantitative Methods at the University of Edinburgh, UK. Her work is focused in the areas of social stratification and the sociology of education. Her methodological interests include the analysis of large and complex social survey data, longitudinal data analysis and research transparency.

Louise Couceiro is an ESRC-funded PhD student at the School of Education at the University of Glasgow, UK. Prior to beginning her doctorate, Louise worked in private education and has taught in the UK, China and Australia.

Ana Delicado is Coordinator of the Portuguese Persist_EU project evaluating the knowledge, beliefs and perceptions of European university students on scientific issues. She is also Research Fellow at the Institute of Social Sciences, University of Lisbon, Portugal, specializing in the social studies of science.

João Estevens is a social scientist, currently finishing his PhD in Global Studies. He is a Researcher at the Institute of Social Sciences, University of Lisbon, Portugal, and the Portuguese Institute of International Relations. His research fields are security studies, migration, democracy and social studies of science.

Roberto Falanga is a Postdoctoral Research Fellow at the Institute of Social Sciences, University of Lisbon, Portugal. He is a research member of Persist_EU, an EU-funded project on students' scientific knowledge.

Luisa Fernanda Echeverry is a psychologist specializing in health psychology, working to provide mental health and well-being to victims of armed conflict. Luisa coordinates the psychosocial component of the Pan-American Development Foundation's Humanitarian Attention for Victims of the Armed Conflict and Refugees programme.

Vernon Gayle is Professor of Sociology and Social Statistics at the University of Edinburgh, UK. His main research interests include social stratification, the sociology of youth and the sociology of education.

Judith Henze is a postdoctoral consultant for sustainable innovations in agriculture, focusing on information and communications technologies. She currently explores how artificial intelligence can be employed to support farmers and food systems and to minimize human–wildlife conflicts.

Deborah Ikhile is a researcher with an interest in health promotion and women's health. She is currently a Research Assistant in the Centre for Public and Psychosocial Health at

Nottingham Trent University, UK, where she provides support for international research activities in Uganda, Ghana, Zimbabwe, South Africa, Burkina Faso and the UK.

Helen Kara FAcSS has been an independent researcher since 1999 and an independent scholar since 2011. She is the author of Creative Research Methods: A Practical Guide (Policy Press, 2nd edn 2020) and Research Ethics in the Real World: Euro-Western and Indigenous Perspectives (Policy Press, 2018).

Su-ming Khoo is a Lecturer in Political Science and Sociology, and leads the Environment, Development and Sustainability (Ryan Institute) and Socio-Economic Impact (Whitaker Institute) Research Clusters at the National University of Ireland, Galway. Her research is on human rights, human development, public goods, development alternatives, decoloniality, global activism and higher education.

Annette Leßmöllmann is Professor of Science Communication and Linguistics at the Karlsruhe Institute of Technology, Germany, and Coordinator of the German Persist-EU team.

Eli Malvaceda is Professor of Psychology at the Saint Ignatius of Loyola University, Lima, Peru. His research interests include qualitative methodology and social and political violence. He has experience in the development and implementation of social projects.

Mridulya Narasimhan is a Senior Research Manager at LEAD (Leveraging Evidence for Access and Development), located at the Institute for Financial Management and Research of Krea University, Chennai, India, managing the enterprises research portfolio in partnership with various governments in India. Mridulya holds an MSc in Public Management and Governance.

Almighty Nchafack is a communication professional and a PhD researcher whose current work focuses on knowledge translation and health promotion within the

Scaling-up Packages of Interventions for Cardiovascular Disease Prevention in Selected Sites in Europe and Sub-Saharan Africa (SPICES) project. Other research interests include mobile health and maternal and infant health.

Emmanuel Ndhlovu is a PhD student at the University of South Africa, Pretoria. He has researched and published on land reform, small-scale farmers, peasant livelihoods, gender issues, food security and migration in Africa.

Nicole Paganini is a Research Associate at the Centre for Rural Development (SLE) at Humboldt University of Berlin, Germany, and currently coordinates an interdisciplinary study on food security in the Western Cape. Her research focuses on urban food systems and relies on participatory co-research methods.

Jagannath R is currently a Research Manager with the Good Business Lab. He was previously a Senior Research Associate with LEAD, managing their gender-based violence portfolio. Jagannath has a background in Economics and Public Policy.

Natalia Reinoso Chávez is a Psychologist with a master's in Education. At the Centro de Estudios Médicos Interculturales, based in Cota, Colombia, she promotes cultural preservation, nature conservation and well-being. Natalia is an intercultural and qualitative research consultant, and a Lecturer at the University of La Sabana, Chía, Colombia.

Sascha Reinstein is a member of the Community Advisory Board for the Impacta clinical research site, Peru, and of two local organizations: Qode Perú (humanizing technology) and SexPsi Perú (diverse sex education advocacy). Her research interests include sexuality, public health, community education and intersectionality.

Jussara Rowland is a sociologist and a researcher at the Institute of Social Sciences, University of Lisbon, Portugal. She explores children's and young people's roles, science communication and participatory methodologies.

Silke Stöber is a postdoctoral researcher on participatory co-research methods, rural advisory services, communication and teamwork at the SLE at Humboldt University of Berlin, Germany. She has worked for more than 25 years in international development in Asia and sub-Saharan Africa.

Mónica Truninger is a Research Fellow at the Institute of Social Sciences, University of Lisbon, Portugal, and part of the Portuguese Persist_EU team, evaluating the knowledge, beliefs and perceptions of European university students on scientific issues.

André Weiß is a Research Assistant at the Department of Science Communication at the Karlsruhe Institute of Technology, Germany. He is part of the German Persist_EU team, evaluating the knowledge, beliefs and perceptions of students on scientific issues.

Fabrizio Valenti is the Head of the Financial Inclusion team at LEAD, Sri City, Andhra Pradesh, India. His work focuses on digital finance and social protection. Fabrizio holds a PhD in Economics from the University of Roma Tor Vergata, Italy.

Arun Verma is a Senior Evidence and Learning Advisor for Save the Children UK. He has a PhD in Intersectionality and Healthcare Education (University of Dundee), influences equality policies and practices in higher education, and leads complex national and international evaluations in government and charities, transforming well-being and early learning systems to meet users' needs.

Helena Vicente is a Researcher at the Institute of Social Sciences, University of Lisbon, Portugal. Her research interests include race, colonialism, racism, gender disparity and the media.

Introduction
Su-ming Khoo and Helen Kara

As the COVID-19 pandemic hit the world in early 2020, researchers had to react. Discussions of research methods and planning for ongoing and near-future research swiftly turned to adapting research methods for a locked-down world. As the pandemic response and measures to control its spread continued into the medium and longer term, it became apparent that many research methods, especially the 'big three' most commonly used methods of questionnaires, interviews and focus groups, could hardly be conducted in the same ways as they had been before the pandemic, and therefore had to be adapted and rethought. The pandemic presented researchers with many challenges – and some opportunities. These included opportunities to reassess the utility of more conventional methods in unusual circumstances, and to try out less familiar methods that could meet both existing and new research needs.

The COVID-19 pandemic in 2020 is only one of a number of possible global emergencies that may occur due to the outbreak of an infectious pathogen like the novel SARS-CoV-2 coronavirus. Indeed, the global pandemic preparedness body warned in September 2019 that there was 'a very real threat of a rapidly moving, highly lethal pandemic of a respiratory pathogen killing 50 to 80 million people and wiping out nearly 5% of the world's economy' (Global Preparedness Monitoring Board, 2019, 6). 'Global' emergencies may also arise due to natural disasters such as earthquakes, tsunamis or volcanic eruptions, or human-caused disasters such as major industrial accidents, conflicts and mass displacements of people, with effects that are severe and extensive and carry transboundary implications. Global emergencies are a perennial threat. Research is needed in such


urgent and challenging circumstances, and non-emergency-related research may have important justifications to continue. However, the conduct of research during global emergencies raises complex practical and ethical challenges (Nuffield Council on Bioethics, 2020). Researchers around the world have responded to the new challenges in diverse, thoughtful and creative ways: adapting data collection methods, rethinking researcher–researched relationships and giving new consideration to critical needs. These include the need to foster care and resilience for research participants, among researchers and in researcher–researched and research–community relationships. A magnifying glass has been placed on research ethics, and it is vital to revisit key questions of need, burden, benefit, protection, care and transformation.

This book is the first in a series of three Rapid Response volumes showcasing methods and emerging approaches to researching in a pandemic-overshadowed age. Volume 2 focuses on care and resilience and Volume 3 focuses on creativity and ethics. Together, these books aim to help academic, applied and practitioner-researchers worldwide reflect on and adapt to the new challenges of getting research done, ensuring its quality and appropriateness and making sure that people and places involved are cared for and treated ethically. This first volume focuses on the immediate dimensions of response and reassessment. Its 11 contributions fall into three main sections: going digital; working with methods in hand; and reassessing needs and capabilities.

The first section examines the need to 'go digital' by pivoting in-person, specific time- and place-based research to interactions using digital tools. The COVID-19 pandemic has highlighted particular challenges for acquiring reliable primary data in low- and middle-income countries, even as it has forced researchers to rapidly shift to computer-assisted methods of interviewing. In Chapter 1, Mridulya Narasimhan, Jagannath R and Fabrizio Valenti evaluate a large, multi-site and multi-topic sample of computer-aided telephone interviews in India and Bangladesh, and offer an analysis focused on questions of optimizing response rates while preserving reliability and



validity. In Chapter 2, Helena Vicente and seven colleagues study different methods of engaging students in science camps across six European countries. They consider the impact of moving these engagements online, while conducting research to assess possible changes in participants' scientific perceptions and beliefs. In Chapter 3, Louise Couceiro discusses the development of electronic 'reader response toolkits' inspired by design research, to elicit UK children's responses to reading contemporary texts about women's lives. In Chapter 4, Almighty Nchafack and Deborah Ikhile, two African women researchers using Skype interviews to conduct research in the United Kingdom and Uganda, explore the theme of the 'digital divide' and its implications for doing research using online tools. Chapter 5, by Emmanuel Ndhlovu, examines the impacts of the COVID-19 pandemic in rural Zimbabwe, using text messaging and voice notes for data gathering, and discusses the advantages and challenges of using these new digital methods for research.

The second section of this volume addresses the theme of going with methods that are in hand. In Chapter 6, Roxanne Connelly and Vernon Gayle reconsider the importance of social surveys based on nationally representative probability samples. They discuss the use of online non-probability samples, consider whether the rules of survey sampling should be relaxed under crisis conditions and assess the utility of non-probability samples and the usefulness of compensatory weighting. In Chapter 7, Sascha Reinstein and Eli Malvaceda conduct a structured meta-review of psychological and social research projects undertaken on the COVID-19 pandemic in Peru. Their qualitative systematic literature review allows them to identify and assess the strengths, weaknesses and gaps in the research that has been conducted. In Chapter 8, Judith Henze, Nicole Paganini and Silke Stöber offer an account of farmer-led research to examine the challenges, developing coping strategies and innovations of local food producers and city dwellers during the COVID-19 pandemic in South Africa, Mozambique, Zimbabwe and Indonesia, using smartphones for data collection. Their initial findings show that the pandemic exacerbated socio-economic injustices across these



very different research locations, while also pointing to key sources of research resilience, cooperation and trust.

The third section of this volume reassesses different needs and capabilities. In Chapter 9, Paramjeet Chawla assesses the ethical and practical considerations when turning to secondary analysis instead of using face-to-face methods of assessing youth capabilities in heterogeneous urban settings in India. In Chapter 10, Arun Verma and Nikolaos Bizas reconsider the role of evidence and evaluation in Save the Children UK's services for children and families living in poverty. The pandemic has forced this organization to swiftly adapt and reorient its evaluation, research and emergency responses. The authors draw on their organization's global work to reassess ethics, quality and rigour, embedding an ethical perspective and safeguarding responsibilities in adapting their evaluation approach. Chapter 11, by Natalia Reinoso Chávez, Santiago Castro-Reyes and Luisa Fernanda Echeverry, presents a 'systematization of experience' approach to researching the intercultural dimension of psychosocial assistance programmes for displaced people (primarily Afro-Colombians or Indigenous peoples) in Colombia during the COVID-19 pandemic. The 'systematization of experience' offers a participatory approach to educational and community-based knowledge production originating in Latin America, as a culturally responsive alternative to conventional evaluation methods.

Taken together, these chapters offer a window into the working lives of researchers around the world at a particularly challenging time for everyone. They show researchers responding to a global pandemic and reassessing their approaches, methods and ethics with thoughtfulness, adaptiveness, criticality and care. We hope that you will find useful ideas and inspiration here for your own research projects.



References
Global Preparedness Monitoring Board (2019) A World at Risk: Annual Report on Global Preparedness for Health Emergencies, Geneva: World Health Organization.
Nuffield Council on Bioethics (2020) Research in Global Health Emergencies: Ethical Issues – Short Report, London: Nuffield Council on Bioethics.



Part I

Going digital

1
Evaluating strategies to improve the effectiveness and efficiency of CATI-based data collection during a global pandemic
Mridulya Narasimhan, Jagannath R and Fabrizio Valenti

Introduction

The COVID-19 pandemic has exposed the weaknesses in the research community's ability to quickly acquire reliable primary data, especially in low- and middle-income countries (LMICs). At the same time, it has caused enormous disruptions in research activities, forcing researchers to rapidly shift from in-person data collection to the computer-assisted telephone interview (CATI) method, or phone surveys. Although this need is widely recognized, there is limited evidence on best practices for designing and implementing these surveys, especially in LMICs. Phone surveys generally suffer from low response rates (Keeter et al, 2017) and non-response bias as a result of limited access to phones (Whitehead et al, 1993; Kempf and Remington, 2007), and are characterized by high attrition rates (O'Toole et al, 2008), questionable response quality (Asher, 1974; Blattman et al, 2016) and social desirability bias (Kreuter et al, 2008). These challenges, though not unique to CATI surveys, are likely to be exacerbated during these exceptional times. Existing evidence suggests that response rates can be improved by sending advance letters and scheduling (Smith et al, 1995; Hansen, 2006) and with higher incentives (Bonke


and Fallesen, 2010). Although the use of CATI surveys in LMICs has traditionally been considered ineffective, the recent increase in mobile penetration has allowed researchers to successfully build nationally representative samples (Leo et al, 2015). Some evidence on how to successfully implement CATI surveys in these contexts exists, but there is an urgent need to build an extensive body of evidence to identify effective strategies able to increase response rates in times of crisis, such as the one caused by the COVID-19 pandemic.

For this chapter, the authors collate results from a group of projects in India and Bangladesh to test various operational strategies to conduct CATI surveys. Specifically, preliminary results are presented from six studies for which data were collected between April and July 2020, while 'stay at home' orders were in place in both countries. The authors compare the following strategies of conducting phone surveys with the objective of improving survey response rates: incentives (dynamic incentive versus fixed incentive) and scheduling (SMS scheduling versus non-scheduling). The authors also observe the effects of introducing call recordings on refusal rates, and measure the effectiveness of follow-up phone calls for quality control (back-checks).

Data source

Data for this chapter were drawn from the pooled metadata of six research studies (hereafter 'host studies'), all of which used CATI data collection methods. These studies are being conducted by different teams from Leveraging Evidence for Access and Development (LEAD) at Krea University – an action-oriented development research organization headquartered in Chennai, India. The host studies were selected because they were at the data collection stage and the designed experiments were least intrusive to their implementation. As the survey content has a limited effect on the findings of this study,1 call results (for example, survey completed, refused, and so on) were aggregated for analysis.



A call-reporting template was provided to all study teams, to be filled in at the time of the phone surveys, to ensure uniformity and data quality control. The database did not include any personally identifying information and collected metadata on instances of call pick-up, survey participation, survey completion and call attempts for each respondent. In addition to this, intervention-specific data were collected for each host study respondent. The pooled metadata comprises 8,356 observations at an individual and enterprise level, spanning ten states in India and Bangladesh. Details on respondents and study context are provided in Table 1.1.
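To make the structure of such pooled metadata concrete, the sketch below shows one way a call-reporting log could be represented and collapsed into a single final status per respondent. This is an illustrative reconstruction, not the authors' actual template: the field names, outcome codes and precedence rule are assumptions introduced here.

    from collections import defaultdict

    # One row per call attempt, as an enumerator might record it (hypothetical data).
    call_log = [
        {"study": "CDR", "respondent_id": "r001", "attempt": 1, "outcome": "no_answer"},
        {"study": "CDR", "respondent_id": "r001", "attempt": 2, "outcome": "completed"},
        {"study": "GBV", "respondent_id": "r002", "attempt": 1, "outcome": "wrong_number"},
        {"study": "HBB", "respondent_id": "r003", "attempt": 1, "outcome": "refused"},
    ]

    def final_status(outcomes):
        """Collapse all attempts for one respondent into a single final status."""
        for status in ("completed", "incomplete", "refused", "wrong_number"):
            if status in outcomes:
                return status
        return "no_response"  # only unanswered or unreachable attempts remain

    by_respondent = defaultdict(list)
    for row in call_log:
        by_respondent[(row["study"], row["respondent_id"])].append(row["outcome"])

    final = {key: final_status(v) for key, v in by_respondent.items()}
    attempts = {key: len(v) for key, v in by_respondent.items()}
    print(final)     # e.g. {('CDR', 'r001'): 'completed', ...}
    print(attempts)  # number of call attempts made per respondent

A precedence rule like this is one simple way to aggregate call results across studies without retaining survey content or personal identifiers.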

Results and findings

A total of 26,182 phone calls were made to 8,356 individuals and 3,536 surveys were completed across the six studies. Table 1.2 presents the scale of survey operations across the various studies. The six studies had a combined response rate2 of 42.32 per cent, while 30.33 per cent of the respondents were not reachable and 11.50 per cent refused to participate. The remaining respondents either did not complete the survey or their number was incorrect.

A key consideration in developing CATI survey protocols is the number of call attempts that need to be made before categorizing a respondent as 'not reachable'. For the purposes of this study, a total of nine call attempts were made for each potential respondent. This is based on a study conducted in Turkey (Özler and Cuevas, 2019), in which the response rate at the seventh call was close to 70 per cent, with marginal gains for each additional call. Furthermore, a recent webinar organized by the Abdul Latif Jameel Poverty Action Lab (J-PAL) (Suri, 2020) recommended that six to nine calls be made to each respondent before categorizing them as 'No response'. However, the research finds that it is both cost- and time-efficient to categorize respondents as 'No response' after the third call and survey new respondents from the population pool, instead of complying with the nine-calls protocol.
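The reasoning behind capping call attempts can be illustrated by tabulating the attempt on which each completed survey was achieved. The snippet below is a hypothetical example of that tabulation; the input list is made-up data rather than the pooled metadata, and only the 77 per cent figure cited later in the chapter comes from the authors.

    from collections import Counter

    # Hypothetical data: the attempt number on which each completed survey was achieved.
    attempt_of_completion = [1, 1, 2, 3, 2, 1, 4, 3, 2, 7, 1, 3]

    counts = Counter(attempt_of_completion)
    total = len(attempt_of_completion)
    cumulative = 0
    for attempt in range(1, max(counts) + 1):
        cumulative += counts.get(attempt, 0)
        print(f"by attempt {attempt}: {cumulative / total:.0%} of completed surveys")
    # In the pooled studies, roughly 77% of completed surveys were achieved by the
    # third attempt, with only marginal gains from further attempts.

Plotting or printing this cumulative share by attempt number is what identifies the point at which additional attempts stop paying for themselves.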


Table 1.1: Summary of host studies

Host study code | Study round | Location(s) | Demographics | # calls made | # surveys attempted | # surveys completed
CDR | First | Dhaka, Mymensingh, Rajshahi, Bangladesh | Phone agents in Bangladesh | 3,454 | 1,850 | 501
CHG | First | Chhattisgarh, India | General population | 782 | 248 | 155
CERR | First | Madhya Pradesh, Odisha, India | Rural women-led micro-enterprises | 18,400 | 5,083 | 2,022
CERRBC | Follow-up | Bihar, Chhattisgarh, India | | 2,155 | 745 | 652
HBB | Follow-up | Rajasthan, Tamil Nadu, India | Women home-based businesses | 835 | 300 | 177
GBV | Follow-up | Hyderabad, India | Women aged 18–35 | 556 | 130 | 29
Total | | | | 26,182 | 8,356 | 3,536

[The CATI experiment allocation columns of the original table – SMS scheduling,1 flat,2a dynamic2b and staggered2c incentives, front-loaded (3:1) and equal-split (1:1) incentives, audio audits3a and manual back-checks with3b and without incentive – are not legible in this copy.]

Notes: 1 Respondents receive an SMS for scheduling the interview, and are informed about the incentive. 2a One-time incentive payment. 2b Incentive increases for each completed answer. 2c Incentive given in multiple tranches. 3a Respondents informed about the survey being recorded for quality assurance purposes. 3b Respondent called back for quality check and provided incentive.


Table 1.2: Summary of calls status

Final status | CDR | CERR | CERRBC | CHG | GBV | HBB | Total
Completed | 501 | 2,022 | 652 | 155 | 29 | 177 | 3,536
(%) | 27.08 | 39.78 | 87.52 | 62.50 | 22.31 | 59.00 | 42.32
Incomplete | 81 | 99 | 0 | 0 | 2 | 19 | 201
(%) | 4.38 | 1.95 | 0.00 | 0.00 | 1.54 | 6.33 | 2.41
No response | 666 | 1,592 | 61 | 69 | 65 | 81 | 2,534
(%) | 36.00 | 31.32 | 8.19 | 27.82 | 50.00 | 27.00 | 30.33
Refused | 256 | 653 | 31 | 1 | 8 | 12 | 961
(%) | 13.84 | 12.85 | 4.16 | 0.40 | 6.15 | 4.00 | 11.50
Wrong number | 346 | 717 | 1 | 23 | 26 | 11 | 1,124
(%) | 18.70 | 14.11 | 0.13 | 9.27 | 20.00 | 3.67 | 13.45
Total | 1,850 | 5,083 | 745 | 248 | 130 | 300 | 8,356
(%) | 100.00 | 100.00 | 100.00 | 100.00 | 100.00 | 100.00 | 100.00

Note: For each status, the first row shows frequencies and the second row shows column percentages.

Figure 1.1 presents the survey status and response rates for five of the six studies that adopted a comparable strategy in this regard. One study (CDR) capped the maximum number of attempts at four before categorizing respondents as 'Not answered', compared to the five other studies, which categorized a respondent as 'Not answered' after nine unsuccessful attempts (this excludes those categorized as a 'Wrong number', as only one call was necessary to determine their status). Panel A represents the distribution of the final survey status for each study and Panel B illustrates the number of call attempts required to complete surveys. Panel B shows that 77 per cent of successful surveys were completed by the third attempt, after which the gain from an additional attempt to complete the survey was marginal. Across these studies, 22,728 calls were made to complete 3,035 surveys. Among all these calls, only 7,944 were made to respondents who completed the survey; 47.3 per cent of the calls were made to 1,868 respondents who were not reachable (that is, out of service coverage, did not answer or had an invalid number).

Table 1.3 presents the response rates and number of calls required to complete surveys before and after the third call. A total of 18,706 calls were made to respondents (not including



Figure 1.1: Survey status and response rate

those that were ‘Wrong number’ and ‘Refused’), but only 21.1 per cent of calls were needed to complete 77 per cent of the 3,035 completed surveys by the third attempt. More than 57 per cent of the calls were made to respondents who were not reachable (that is, never picked up their phone or whose number was switched off or outside the coverage area). Assuming the response rate remains the same, by making this one change the studies could have achieved the sample size of 3,035 respondents with 14 per cent fewer calls.

Table 1.3: Completion rate and field operations time investment

 | Completed, ≤ 3 calls | Completed, > 3 calls | Completed, total | Non-response
Number of surveys | 2,326 | 709 | 3,035 | 1,868
Percentage of completed surveys | 77 | 23 | |
Total number of calls | 3,954 | 3,990 | 7,944 | 10,762
Percentage of calls made | 21.1 | 21.3 | 42.5 | 57.5


However, this would have also required increasing the pool of respondents by 30 per cent. The decision of where to cap the calls per respondent will depend on the study's goals of either ensuring a higher response rate or a faster data collection turnaround. The former is a common objective for follow-up surveys.
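The trade-off discussed in the preceding paragraphs can be worked through as a back-of-envelope calculation using the Table 1.3 figures. The assumptions below – a fresh respondent pool that completes within three calls at the same rate as the original pool, and no additional refusals or wrong numbers – are simplifications introduced here, so the output approximates rather than reproduces the 14 per cent and 30 per cent figures reported in the chapter.

    # Figures from Table 1.3; the scenario assumptions below are illustrative.
    completed_le3, completed_gt3 = 2_326, 709   # surveys completed by / after the third call
    calls_le3, calls_gt3 = 3_954, 3_990         # calls made to those respondents
    nonresp, calls_nonresp = 1_868, 10_762      # never-reached respondents and their calls
    total_calls = calls_le3 + calls_gt3 + calls_nonresp  # 18,706

    # Under a three-call cap, respondents not done by call 3 get exactly 3 calls.
    capped_calls = calls_le3 + 3 * (completed_gt3 + nonresp)

    # Completions lost to the cap must come from a fresh pool, assumed to complete
    # within three calls at the same rate as the original pool.
    pool = completed_le3 + completed_gt3 + nonresp
    rate_within3 = completed_le3 / pool
    extra_respondents = completed_gt3 / rate_within3
    extra_calls = extra_respondents * (capped_calls / pool)  # same average calls per respondent

    new_total = capped_calls + extra_calls
    # The chapter reports 14% fewer calls and a 30% larger pool; these simplified
    # assumptions yield figures in the same range but not identical values.
    print(f"calls saved: {1 - new_total / total_calls:.0%}")
    print(f"pool increase: {extra_respondents / pool:.0%}")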

Factors that influence response rates: follow-up surveys, incentives, scheduling

Among the six studies, three – CERRBC, GBV and a subsample of HBB – are follow-up survey rounds. Follow-up survey rounds have a significantly higher response rate compared to initial survey rounds. Table 1.4 presents the regression results. In the first two models, the dependent variable is 'completed survey' (1 = completed, 0 = not completed). In models (1) and (2), the authors regress this outcome on a binary variable (1 = follow-up, 0 = first survey round) and observe that follow-up survey rounds have 25.6 per cent higher response rates (significant at the p = 0.01 level) compared to first survey rounds. Follow-up survey rounds use 13.6 per cent fewer calls to complete

Table 1.4: Regression: follow-up survey

 | (1) OLS, completed survey | (2) FE, completed survey | (3) OLS, calls | (4) FE, calls
Follow-up (= 1, follow-up survey round) | 0.372*** | 0.256*** | -0.136* | -0.332
 | (0.0142) | (0.0762) | (0.0820) | (0.385)
Constant | 0.373*** | 0.389*** | 3.152*** | 3.178***
 | (0.00569) | (0.0115) | (0.0317) | (0.0594)
Observations | 8,356 | 8,356 | 8,356 | 8,356
R-squared | 0.066 | 0.110 | 0.000 | 0.074
Study FE | | Yes | | Yes

Notes: Robust standard errors in parentheses. *** p < 0.01; ** p < 0.05; * p < 0.1.
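For readers who want to replicate this kind of analysis, a regression of the form reported in Table 1.4 might be set up as in the sketch below. The dataframe df and its column names (completed, calls, follow_up, study) are assumptions, since the chapter does not publish its code; the specification – a linear probability model estimated with and without study fixed effects and robust standard errors – simply mirrors the structure of the table.

    import pandas as pd
    import statsmodels.formula.api as smf

    # df: one row per respondent, pooled across the six host studies (hypothetical).
    # completed: 1 if the survey was completed, 0 otherwise
    # calls: number of call attempts made to the respondent
    # follow_up: 1 if the respondent belongs to a follow-up survey round
    # study: host study code (CDR, CERR, ...), used for fixed effects

    def fit_models(df: pd.DataFrame):
        # (1) Linear probability model without study fixed effects
        m1 = smf.ols("completed ~ follow_up", data=df).fit(cov_type="HC1")
        # (2) Adding study fixed effects, as in the FE column of Table 1.4
        m2 = smf.ols("completed ~ follow_up + C(study)", data=df).fit(cov_type="HC1")
        # (3) and (4): the same pair of models with the number of calls as the outcome
        m3 = smf.ols("calls ~ follow_up", data=df).fit(cov_type="HC1")
        m4 = smf.ols("calls ~ follow_up + C(study)", data=df).fit(cov_type="HC1")
        return m1, m2, m3, m4

Comparing the follow_up coefficient across models (1) and (2) is what separates the raw difference in completion rates from the difference that remains once study-level composition is held fixed.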