An interesting new paper by Ben Olken, Junko Onishi, and Susan Wong gives us some evidence on how incentives can make aid more effective. They look at a community block grant program in Indonesia and compare the effects of these grants with and without incentives. Incentives make a difference.
“There is nothing in this book that needs to be confirmed by complex laboratory experiments. You have only to open the window or step into the street.” — Hernando de Soto, The Other Path (1989), p. 14.
I’ve been reading a good bit on psychological responses to conflict and disaster for ongoing work, and am struck by the tone of discussion in the popular press soon after a potentially traumatic event. In these reports, trauma among the survivors is often presumed to be widespread, and the focus is on its expected costs and consequences. However, more recent academic work on this topic argues that an exclusive focus on the traumatized misses most of the story.
Despite the large and growing literatures on migration in economics, sociology, and other social sciences, there is surprisingly little work that actually evaluates the impact of particular migration policies (most of the literature concerns the determinants of migration, and the consequences of migrating for the migrants, their families, and native workers). I am therefore always interested to see new work in this area, particularly work that manages to obtain experimental variation in policy implementation.
· The IDB Development that Works blog covers a randomized trial of the One Laptop per Child program in Peru – no impact on learning, but some increase in cognitive skills.
A “hearts and minds” model of conflict posits that development aid, by bringing tangible benefits, will increase population support for the government. This increased support in turn can lead to a decrease in violence, partly through a rise in population cooperation and information sharing with the government. At least one previous observational study in Iraq found that development aid is indeed associated with a decrease in conflict.
There is much demand from practitioners for “shoestring methods” of impact evaluation—sometimes called “quick and dirty methods.” These methods try to bypass some costly element in the typical impact evaluation. Probably the thing that practitioners would most like to avoid is the need for baseline data collected prior to the intervention. Imagine how much more we could learn about development impact if we did not need baseline data!