A while back I blogged about work using active choice and enhanced active choice to nudge people into getting flu shots and prescription refills. The basic idea is that relatively small changes in how a choice is presented can have large impacts on take-up of a program. This seemed useful in the context of many of our training programs – attendance rates averaged 65 percent in a review of business training programs I did with Chris Woodruff. So for an ongoing evaluation of the GET AHEAD business training program in Kenya, we decided to test out this approach.
Last week I blogged about a paper that David wrote with Chris Woodruff which takes stock of the existing evidence on the impact of business training. The bottom line was that we still don't know much. Part of the reason is that these evaluations are not straightforward to do – they have pitfalls that you don't always find in your garden-variety impact evaluation.
What do we really know about how to build business capacity? A nice new paper by David McKenzie and Chris Woodruff takes a look at the evidence on business training programs – one of the more common tools used to build up small and medium enterprises. They do some work to make the papers comparable, which helps us add up the totality of the lessons. What's more, as David and Chris go through the evidence, they come up with a lot of interesting findings.
There are now a variety of well-known experimental and non-experimental methods that economists use to learn whether a given program works. However, our tools for learning why something works or fails are much more limited.