Published on Development Impact

We ran a video survey. Here’s how it worked. (Guest Blog Post)

Editors’ note: This blog post is part of a series for the 'Bureaucracy Lab', a World Bank initiative to better understand the world's public officials.

It’s a story shared by many: COVID-19 threw a wrench in our team’s data collection plans. In mid-2020, health and safety concerns halted the launch of our in-person baseline survey of 2,000 civil servants and school staff in Lithuania. In line with the general move to remote work, we decided to collect data remotely through video-based surveys.

While traditional forms of remote data collection, such as web or phone surveys, can reach a wide range of respondents at relatively low cost, they force compromises on question types and survey length, and they often yield lower response rates than in-person surveys. To minimize these disadvantages and mimic face-to-face enumeration, we ran our survey via video. To our knowledge, this format was uncharted territory for rigorous impact evaluations at the time of launch. We achieved response rates comparable to otherwise similar in-person surveys, with 82.6% of respondents participating. Here is our experience and some lessons learned.

 

What do we mean by video survey?

By video survey, we mean a one-on-one private interview between an enumerator and a respondent via videoconferencing software. We used Microsoft Teams, but any end-to-end encrypted platform will do. The ideal is to stay as close as possible to whatever technology respondents most comfortably use day-to-day.

 

When is a video survey feasible? 

Three key conditions set the ground for successful video surveys:

1. Populations with high digital literacy and access to technology

Our survey population consisted of school staff and civil servants who had largely been working from home during the pandemic. These individuals had high levels of access to technology (e.g., computers and web cameras) and digital literacy. To fill in any gaps, the survey team sent respondents clear technical instructions on how to access and participate in the survey ahead of the interview itself.

2. Contexts with strong and widespread internet connection

For a smooth video survey, both respondents and enumerators must stay connected to stable internet. This may be the single biggest hurdle to video surveys in many of the contexts where development economists do their fieldwork. In our setting, over 90% of the target population had internet access, making video surveys a viable option.

3. Accessible contact information

Video surveys are not conducive to the unscheduled, ad hoc surveying that is sometimes feasible in person. Early outreach, along with access to the email addresses and phone numbers of organizational leadership and respondents, is key. As our sample consisted of public sector personnel, most respondent contact information was freely available online. In other contexts, survey teams may need to acquire contact lists directly from institutions.

 

What are the opportunities for survey design?

A video survey opened up design opportunities beyond those offered by phone, web, or SMS surveys. A few include:

1. Sensitive topics are back on the table

Our survey focused on sensitive topics related to mental health. The video format allowed enumerators to build interpersonal trust with respondents, facilitating conversations about more difficult themes in a manner not achievable over the phone or web. 

2. Screen-sharing and chat allow for better data

Enumerators were trained to use screen-sharing to (a) show respondents detailed choice lists and (b) administer an interactive budgeting activity. Further, through the chat feature, enumerators shared traceable links with respondents; click data from these links gave the research team rich insight into short-term survey treatment effects (see the sketch after this list).

3. Survey length can stretch

While phone surveys have a recommended cap of 20 minutes, video surveys can run closer to in-person lengths. In our case, surveys stretched to roughly an hour before respondent attention began to wane.
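
To make the traceable links mentioned above concrete, here is a minimal sketch of how per-respondent links might be generated. This is not our actual tooling: the base URL, token length, respondent IDs, and file names are all hypothetical.

```python
import csv
import secrets

BASE_URL = "https://example.org/materials"  # hypothetical landing page

def make_tracked_link() -> tuple[str, str]:
    """Create a unique, hard-to-guess token and the link carrying it,
    so later clicks can be tied back to a respondent via a lookup table."""
    token = secrets.token_urlsafe(8)
    return f"{BASE_URL}?t={token}", token

# Keep the token-to-respondent mapping in a separate file from the
# survey data, so the link itself reveals nothing about who clicked.
with open("link_lookup.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["respondent_id", "token", "url"])
    for rid in ["R001", "R002", "R003"]:  # placeholder respondent IDs
        url, token = make_tracked_link()
        writer.writerow([rid, token, url])
```

Storing the token-to-respondent lookup separately from the survey responses keeps identifying information out of the links while still letting the research team match click behavior back to treatment status.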

 

How does implementation differ? 

While many data collection protocols remained the same as for in-person surveys, here are a few things we did differently.

1. Multiple stages of outreach 

Given the unusual format of video surveys, high-level institutional support and communication were more important than ever: sampled institutions were more responsive to messages that emphasized government support. Following institutional approvals, the survey team reached out to respondents individually via email with an introduction and scheduling options. Upon confirmation, the survey team sent respondents a call link, a consent form, and more information about the study.

2. Online enumerator training 

Enumerators were trained via Microsoft Teams, allowing them to get comfortable with the medium used throughout data collection. Our colleagues have written more generally on remote enumerator training here.

3. Monitoring via recordings

Best practices, such as high-frequency checks, accompaniment, and back-checks, should be followed in video surveys. Subject to IRB approval and respondent consent, video surveys can also be recorded for data quality checks or additional analysis. Given the relative sensitivity of our topics, however, our team chose not to record.
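
As one illustration of a high-frequency check, here is a minimal sketch that flags suspicious interviews in a day's submissions. The CSV export and its columns (enumerator ID, interview duration in minutes, count of refused or "don't know" answers) are hypothetical, and the thresholds are placeholders a team would tune to its own data.

```python
import pandas as pd

# Hypothetical daily export from the survey platform.
surveys = pd.read_csv("daily_submissions.csv")

# Flag interviews that are implausibly short relative to the day's median,
# a common sign of skipped modules or rushed enumeration.
median_duration = surveys["duration_min"].median()
surveys["too_short"] = surveys["duration_min"] < 0.5 * median_duration

# Flag enumerators whose average item non-response is more than two
# standard deviations above the mean across enumerators.
nonresponse = surveys.groupby("enumerator_id")["n_refused"].mean()
flagged = nonresponse[nonresponse > nonresponse.mean() + 2 * nonresponse.std()]

print(surveys.loc[surveys["too_short"], ["enumerator_id", "duration_min"]])
print("Enumerators to review:", list(flagged.index))
```

Checks like these can run daily during fieldwork, so that problems are caught and raised with enumerators while re-interviewing is still feasible.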

 

Video Killed the Radio Star

As virtual realities expand, video surveys offer an increasingly viable and relatively low-cost solution for data collection. Many questions remain with respect to best practices for video surveys. This post is our first-pass overview.

 

Have you tried video surveys? How did it go? We’d love to hear from you too.

 

We acknowledge and highly value the cooperation of government partners; the survey implementation firm, Civitta; the enumerators of Eurotela; and the Bureaucracy Lab team, including Zahid Hasnain, Kerenssa Kay, Dan Rogger, and Ravi Somani. 

