Enrollment in rural Afghanistan, as you might suspect, is fairly low. And while the primary enrollment gap between boys and girls has closed in most parts of the world, it’s alive and well here (as well as in some parts of Africa). But an interesting paper by Dana Burde and Leigh Linden gives us hope. (Gated version here and earlier ungated version here.)
Burde and Linden look at the effects of a USAID-funded, Catholic Relief Services-implemented program that puts in place locally staffed, village-based schools. They focus their work on northwest Afghanistan, where enrollment rates for children clock in at 28 percent: 35 percent for boys and 18 percent for girls. And schools are far away: only 29 percent of the population lives within 5 kilometers of a primary school.
Burde and Linden work with the program to set up a randomized phase-in, which gives them a year to look at effects. Randomization isn’t simple with schools: you are already dealing with a smaller possible sample of intervention units. This is compounded here by local politics, since randomizing at the village level would “strain political and cultural alliances.” So they put the villages into groups and randomize across groups (bringing them down to 12 units). Then one group gets knocked out by local conflict. From an estimation point of view, this creates some hurdles. If you are interested in how to deal with small-sample estimation, this paper is worth reading, as the authors provide a very clear discussion of their methods (and even draw on a paper from 1945).
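To get a feel for what inference with so few randomization units looks like, here is a minimal sketch of randomization (permutation) inference: enumerate every possible assignment of 6 of the 12 groups to treatment and ask how extreme the observed difference in means is. This is the general idea, not the authors’ actual estimator, and the group-level enrollment numbers below are made up for illustration.

```python
import itertools

# Hypothetical group-level mean enrollment rates for 12 randomization
# units (6 treated, 6 control) -- illustrative numbers, not the paper's data.
treated = [0.65, 0.70, 0.72, 0.68, 0.61, 0.74]
control = [0.25, 0.30, 0.28, 0.22, 0.31, 0.26]

outcomes = treated + control
observed = sum(treated) / 6 - sum(control) / 6

# Enumerate all C(12, 6) = 924 ways to assign 6 of the 12 groups to
# treatment, and recompute the difference in means under each assignment.
count_extreme = 0
n_assignments = 0
for treat_idx in itertools.combinations(range(12), 6):
    t = [outcomes[i] for i in treat_idx]
    c = [outcomes[i] for i in range(12) if i not in treat_idx]
    diff = sum(t) / 6 - sum(c) / 6
    n_assignments += 1
    if abs(diff) >= abs(observed) - 1e-12:
        count_extreme += 1

# Exact two-sided p-value: the share of assignments at least as extreme
# as the one actually observed.
p_value = count_extreme / n_assignments
print(n_assignments, count_extreme, round(p_value, 4))
```

With only 12 units the full set of assignments is small enough to enumerate exactly, which is precisely why this style of inference is attractive when cluster counts are tiny.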
In terms of measurement, their main outcome variables are enrollment and test scores. One concern is that, in the presence of a school project, parents might be inclined to claim enrollment that did not in fact take place. Burde and Linden run through a number of checks to allay this fear: respondents don’t report craziness (attendance when school is closed, for instance), levels of enrollment jibe with government estimates, and they conduct semi-structured interviews to check on this after finishing the final survey. Finally, the test score results they have belie any potential misreporting.
So what are their results? Enrollment skyrockets. Recall that initial enrollment is 28 percent; the program boosts this by 42 percentage points. Burde and Linden do a nice job of contextualizing this result, pointing out that while this is below the overall average for Western Asia, it’s equal to the global rural average enrollment rate. Attendance goes up a bit too, but not by much, since it is already pretty high. Test scores take a leap, going up by 0.51 standard deviations, with gains in math outpacing language. Estimates of the treatment on the treated, of course, show a bigger jump: one year of formal schooling causes an increase of 1.2 standard deviations.
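The treatment-on-the-treated figure follows from standard Wald/IV scaling: divide the intent-to-treat test-score effect by the first-stage change in enrollment. A back-of-the-envelope check with the numbers quoted above (this is the textbook scaling, not necessarily the authors’ exact specification):

```python
# Scale the intent-to-treat (ITT) test-score effect by the first-stage
# enrollment effect to approximate the treatment-on-the-treated (TOT)
# effect -- figures as quoted in the post.
itt_test_score = 0.51  # standard deviations
first_stage = 0.42     # 42 percentage point increase in enrollment

tot = itt_test_score / first_stage
print(round(tot, 2))  # -> 1.21, i.e. roughly the 1.2 SD quoted above
```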
Burde and Linden also nicely put these results in a relative cost-benefit framework. In terms of enrollment, the intervention gets you an additional 1.48 child academic years for $100. This puts the program in the range of other interventions such as girls’ scholarship programs, school meals, and subsidized uniforms (but, of course, short of deworming). The testing results clock in as better than most computer-assisted learning programs, but less cost-effective than interventions such as remedial education programs.
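For comparison with interventions quoted in cost-per-year terms, the figure above inverts easily (a simple rearrangement of the number in the post, not a statistic from the paper):

```python
# Convert "1.48 additional child academic years per $100" into a
# cost per additional child-year of schooling.
years_per_100_dollars = 1.48
cost_per_child_year = 100 / years_per_100_dollars
print(round(cost_per_child_year, 2))  # -> 67.57, about $68 per child-year
```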
One of the really striking results of this paper is the differential impact of the program on girls. In terms of enrollment, the gender disaggregation shows a 34 percentage point increase for boys, and an additional 16.7 percentage point increase for girls. Compared with the enrollment gap between boys and girls in the control group (some 20 percentage points), this intervention nearly closes the gender gap. It also helps narrow the gap in test scores, reducing the distance between girls and boys by over a third within a year.
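The gap arithmetic, using the numbers above (the control-group gap is the approximate figure quoted in the post):

```python
# Boys' effect, girls' additional effect, and the control-group gender
# gap in enrollment -- all in percentage points, as quoted above.
boys_effect = 34.0
girls_extra = 16.7   # additional effect for girls, on top of boys'
control_gap = 20.0   # approximate boy-girl enrollment gap in control

girls_effect = boys_effect + girls_extra   # total effect for girls
remaining_gap = control_gap - girls_extra  # gap left after treatment
print(round(girls_effect, 1), round(remaining_gap, 1))  # -> 50.7 3.3
```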
A big part of the story (maybe all of it) of how this intervention closes the gender gap is distance to school. Burde and Linden lay out evidence that the village-based schools are comparable in quality to the existing alternatives, and then look at how enrollment responds to distance. Distance matters for boys: an additional mile to school reduces their enrollment by 13.2 percentage points. For girls, this estimate is 5.9 percentage points larger.
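Spelling out the girls’ distance gradient implied by those two numbers (just the sum of the figures quoted above):

```python
# Per-mile enrollment penalty, in percentage points, as quoted above:
# boys lose 13.2 pp per additional mile; the girls' estimate is
# 5.9 pp larger than the boys'.
boys_per_mile = 13.2
girls_per_mile = boys_per_mile + 5.9
print(round(girls_per_mile, 1))  # -> 19.1 pp per additional mile
```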
So it seems that if you build it closer to home, they will really come. And they’ll learn. For those countries still lagging on this indicator, this paper provides a fairly cost-effective way to make progress.