
Education

What’s new in education research? Impact evaluations and measurement – March round-up

By David Evans

Here is a curated round-up of recent research on education in low- and middle-income countries, with a few findings from high-income countries that I found relevant. All are from the last few months, since my last round-up.

If I’m missing recent articles that you’ve found useful, please add them in the comments!

Weekly links February 17: Don’t give up on your research ideas but do give up on unwarranted policy recommendations

By David Evans
 
  • In this story from the EconTalk podcast, Chris Blattman provides an incentive to hold off on giving up on that great research idea you’ve been peddling for years: For years, he pitched random African factory owners the idea of an RCT of factory employment. “They’d usually look at me kind of funny. They wouldn’t leap at the possibility. I was just this person they met on a plane.” One day it worked, and six weeks later he was randomizing applicants.

Technoskeptics pay heed: A computer-assisted learning program that delivers learning results

By David Evans
Some years ago, a government I was working with wanted much better data on their own education system. They didn’t have great data on student attendance or teacher attendance, much less on tardiness or instruction time. They designed an information management system with swipe cards for every student and teacher to use going in and out of classrooms, all of which would feed wirelessly into the district office, allowing real-time interventions to improve education. It sounded amazing! And it fell apart before it ever began.

Can providing information to parents improve student outcomes? 4 recent papers show it can (Chile, Malawi, and US x2)

By David Evans
My oldest child started middle school this year, and I suddenly began receiving emails every other week with updates on his grades. I’d never received anything like this before and was overwhelmed (and a little annoyed) by the amount of information. Someone told me that I could go to some website to opt out, but that seemed like too much work. So I continue getting the emails. And sure enough, now I follow up: “Hey, are you going to speak to your teacher about making up that assignment?”

Learning more with every year? Estimating the productivity of schooling in developing countries: Guest post by Abhijeet Singh

This is the fourteenth in our series of job market posts this year. 

Despite massive increases in school enrolment in developing countries, learning levels have lagged behind. But the range in average student achievement is large: In the 2012 PISA assessment (of 15-year-olds), Vietnamese students scored higher than students in the US and the UK, but Peru ranked last (OECD 2012). The magnitude of the gap between these two developing countries was 1.4 standard deviations (SD); for comparison, the difference between the US and Finland was 0.38 SD.
 
My job market paper answers the question of how much of this gap reflects differences in the productivity of the schooling systems, as opposed to other factors such as nutrition, early childhood shocks, or endowments – a critical policy question relevant to the substantial education spending around the world.

Give power to the managers and the teachers will come: Guest post by Jacobus Cilliers

This is the ninth in our series of job market posts this year. 

Teacher attendance can be improved when head-teachers monitor teachers using mobile technology, but only if the associated reports trigger bonus payments.

Policy question
Can high-stakes decentralized monitoring improve civil servant performance, or will it lead to collusion between the monitor and the civil servant? And what happens to the quality of information when we raise the stakes attached to those reports?

Training teachers on the job: What we know, and why we know less than we should

By Anna Popova

or, why we need more systematic (and simply more) reporting on the nature of interventions

The hope. Last year, we reviewed six reviews of what interventions work to improve learning. One promising area of overlap across reviews had to do with training teachers who were already on the job (i.e., in-service teacher training or teacher professional development). Specifically, we proposed that “individualized, repeated teacher training, associated with a specific method or task” was associated with learning gains.

How do you scale up an effective education intervention? Iteratively, that’s how.

By David Evans
So you have this motivated, tightly controlled, highly competent non-governmental organization (NGO). And they implement an innovative education intervention, using a randomized controlled trial to test it. It really seems to improve student learning. What next? You try to scale it up or implement it within government systems, and it doesn’t work nearly as well.
