One of the comments we got last week was a desire to see more “behind-the-scenes” posts on the trials and tribulations of trying to run an impact evaluation. I am sure we will do more of these, but there have been many times when I have thought about doing so and baulked for one of the following reasons:
David McKenzie's blog
Berk is on a plane, so you get your links a day early this week:
· David Roodman summarizes the new randomized microfinance experiment in Bosnia – another example of randomization among marginal MFI clients.
The impetus for this post comes from a couple of recent experiences. First, I got copied on the letter containing the editor’s decision on a paper that I had refereed so long ago that I had forgotten ever refereeing it. Second, every now and then I have conversations with colleagues about where to send papers; for most journals these rely on anecdotes and sample sizes of a couple of experiences (e.g., what is journal X like for turnaround time? Well, the one paper I sent there recently took 10 months to get a report, etc.).
“There is nothing in this book that needs to be confirmed by complex laboratory experiments. You have only to open the window or step into the street.” – Hernando de Soto, The Other Path, 1989, p. 14.
Despite the large and growing literatures on migration in economics, sociology, and other social sciences, there is surprisingly little work that actually evaluates the impact of particular migration policies (most of the literature concerns the determinants of migration, and its consequences for the migrants, their families, and for native workers). I am therefore always interested to see new work in this area, particularly work that manages to obtain experimental variation in policy implementation.
· The IDB Development that Works blog covers a randomized trial of the one laptop per child program in Peru – no impact on learning, but some increase in cognitive skills.
One of my favorite papers to present is my paper on improving management in India, in part because we have wonderful photos to illustrate what bad management looks like and what improved practices look like (see the appendix to the paper for some of these). Photographing impact isn’t only useful for presentations and glossy summaries, but may potentially offer a new form of data. However, this is easier said than done, and today I thought I’d share some misadventures in trying to photograph impacts on small firms.