Published on Development Impact

Notes from the field: From Lusaka to the Copperbelt

It’s been a while since Development Impact has had a “notes from the field” posting.  Back in the day, there were several, mostly from Markus but also from David. It is hard to classify these blogs since some of them span a variety of topics. We encourage you to peruse these posts (take either definition of peruse that you prefer): here, here, here, here, here, here, here, here, here, here, here, here… They are full of great ideas and insightful observations.

These blogs are on our minds since we just returned from field visits to secondary schools in Zambia. We are working with the government and World Bank operational colleagues on an impact evaluation (IE) of an intervention designed to help girls and young women progress through secondary school (efforts to help students perform well on critical exams, avoid early marriage, and return to school in the event of childbirth).  We traveled to two rural (and fairly remote) districts, visiting the district education office and a school in each.

As expected, we learned a lot from the field visits, making them well worth the travel time. In this blog, we focus on three things that stood out to us:

Design-reality-redesign: We were joined by two lead staff from the national office in charge of this intervention, both based in Lusaka. Given the option, we probably would have been inclined to do these field visits without the program leads, to keep a firewall between our IE and the program’s implementation. But it was great to finally meet them in person after many virtual discussions and emails. The return to time spent with them during the field visits was very high: we got a much better understanding of the nuances and challenges of the intervention, which will inform our IE. And we (and they) realized that the vision in Lusaka is not always what is happening “on the ground”. This is not surprising given the scale of the intervention and the remoteness and poor connectivity of districts and schools to Lusaka. Our field visits likely proved useful for the implementing team. But they also brought to mind the dilemma about the extent to which an IE should (or could) be leveraged to offer real-time feedback to implementers on the delivery of the intervention, versus when IE teams should stay removed from influencing the intervention’s implementation. To be sure, there is an M&E system for the intervention, but the kind of ad-hoc field visits we did -- and our rambling questions and probing -- uncovered a few gaps between design and reality. And we suspect that some of these are unlikely to be captured in the M&E system (or, if they are, the system might identify them rather late into implementation).

Forms, forms, forms! There are many forms being used in schools and in the district offices, more than we realized before we went there. Some are for M&E and others are process tools (such as school attendance registers or forms specific to the intervention itself). So much paper! Only some of this paperwork gets delivered up the chain (to the district, to Lusaka). And most of it will not be digitized. Few IE studies we read reflect such data, relying instead on their IE surveys. And yet, a lot could be learned from them in terms of fidelity to the original design, not least the core question of what program you are actually evaluating, as well as whether impact varies based on local deviations from program implementation guidelines. For our IE, we are still sorting through the list of M&E and process-tool forms. Digitizing some of them will be a messy task, but hopefully an insightful one.

Qual and Quant: We have been fortunate to join an effort that has prioritized having both quantitative and qualitative research components. We have been relying on two existing qual reports not only to refine our experimental design, but also to direct the focus of our interviews at the schools and district offices we visited. But it is clear that our field visits are no substitute for good qualitative work. Sure, we both took notes during our meetings, but these notes are, well, bordering on chicken scratch. Moreover, our field visits were far too short and too crowded to dig in, especially the discussions with schoolgirls. They were soft-spoken and often reticent to speak up to strangers bulldozing into their classroom, though the discussion livened up a bit after the school leadership and all men left the room. It is also clear to us that the future round of qual work will be important, though if we only get one shot at it (depending on funding), we will have to decide whether to time it before the quant survey (leveraging it to inform our questionnaires) or after the survey (leveraging the survey data to inform the direction of the qual work).


[Photo: A Zambia secondary school]

[Photo: A Zambia deputy head teacher's office -- "Lets Read Zambia"]
Kathleen Beegle

Research Manager and Lead Economist, Human Development, Development Economics

S Anukriti

Senior Economist, Development Research Group
