
Is this really the best the World Bank/IFC can do in touting its impact?


The past week has seen the World Bank building covered in banners and messages promoting the release of the 2012 World Development Report and the annual meetings. One of my colleagues drew my attention to this claim of impact on the sidewalk outside the Bank:

Curious as to where this number came from, and with the footpath not surprisingly devoid of footnotes, I tried searching for the source of this claim of impact. The IFC’s Twitter feed and Facebook page showed up, stating that “Data from IFC clients across all regions shows 2.4 million jobs provided in 2010, of which 665,000 were jobs for women”, with the source being the just-released 2011 IFC annual report (which reports that data from 615 investment clients across all regions and industry sectors show they employed more than 630,000 women in 2010, and that data from clients for which three years of data were available showed a 14% increase in female employment over that period). As an example of where this type of number comes from, the report states: “In India, IFC invested in a tea production company that provided employment for almost 32,000 people in 2010.”

So this is the first misstatement – the data refer to one part, not the whole, of the World Bank Group. Similar numbers appear in the 2010 IFC report, which states: “We know that it takes more than volume to meet the needs of the poor. That is why we carefully target our resources, selecting where our financing and advice can be deployed most effectively. And we set measurable goals to gauge our impact, and improve our performance. In 2009, our clients provided 2.2 million jobs, including nearly 514,000 in the manufacturing and services sectors”. This same aggregate statistic is repeated under the headings “Providing value for money”, “Global challenges and impact” and “poverty and unemployment: impact around the world”.

Clearly this is in no way a measure of impact, and presenting it as such is disappointing both to researchers working on credible measures of the impacts of different projects and to readers who are being presented this information as if it were in any way informative about the effects of the World Bank Group’s work. I was in a meeting recently where the argument was put forward that for operational needs one need not strive for rigorous, high-quality impact evaluations, but just “good enough” evaluations, where “good enough” did not involve attempts to compare results to a reliable counterfactual. If such numbers can simply be presented without anybody calling them out, it is little wonder that the appetite for doing serious work is low.

So why does this not tell us anything about impact? This is probably a rhetorical question for most readers, but given that such statements are printed on the pavement, it is perhaps worth pointing out some of the issues with it:

- It is obvious that investing in a company does not mean that your impact is the number of employees hired by that company – otherwise we should just invest $1 in each of Walmart, McDonald’s, Carrefour, and other large multinationals and claim an amazing jobs-per-dollar return.

- It takes no account of what these firms would have hired without the IFC’s investment – that is, there is no comparison to a counterfactual.

- Even if there were 665,000 additional workers hired by these companies, this would not necessarily mean 665,000 new jobs created – many of these workers would not have come from unemployment.

- The numbers the IFC does provide in the 2011 report suggest the growth in female employment among its clients was 14% over three years – an order of magnitude less than the 366,000 and 665,000 figures suggest, and this is without even considering what growth would have been in the absence of IFC investment.

- The contemporaneous timeframe is unlikely to be an appropriate one for looking at jobs – investments and advisory services such as training programs are likely to take some time to affect employment, so most of the job creation in 2010 from the IFC’s work likely comes from projects carried out in, say, 2005-2008, not those carried out in the past year.
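
As a rough illustration of the 14% point, here is a back-of-envelope calculation using the figures the report itself provides (roughly 630,000 women employed by clients in 2010, and 14% growth in female employment over three years); the variable names are my own:

```python
# Back-of-envelope check (illustrative only): if IFC clients employed
# ~630,000 women in 2010, and female employment grew 14% over the
# preceding three years, how many of those jobs are new over that period?
employed_2010 = 630_000          # women employed by IFC clients in 2010
growth_over_3_years = 0.14       # reported growth in female employment

baseline = employed_2010 / (1 + growth_over_3_years)  # implied level 3 years earlier
new_jobs = employed_2010 - baseline                   # jobs added over the period

headline_claim = 665_000
print(f"Implied new jobs over 3 years: {new_jobs:,.0f}")
print(f"Headline figure is {headline_claim / new_jobs:.1f}x larger")
```

Even before worrying about counterfactuals, the report’s own growth rate implies only about 77,000 additional jobs over three years, nearly an order of magnitude below the 665,000 headline.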

We’ve previously had a debate on this blog about the relative merits of portfolio evaluation vs. single-project evaluation. Clearly it would be great from a public communications point of view if we could say something like “World Bank and IFC projects collectively caused X million new jobs to be created in the past 2 years”. However, we are far from such a point, and the second-best solution is surely not to make statements like the one in the photograph, which tells us nothing about impact. I have much more sympathy (although still many reservations) for the IFC’s new efforts to estimate the overall impact of its regulatory reform work by taking the evidence from rigorous micro studies of the impacts of reforms in some countries and extrapolating to the overall portfolio.

I believe that the World Bank and the IFC do carry out a lot of projects that offer large benefits to the countries implementing them, and we are starting to build a body of evidence that rigorously shows what these impacts are. In my opinion, we would do far better to highlight these success stories where we actually have evidence than to present aggregate numbers that are meaningless. So how about banners that instead say things like:

- Land titling reform in Rwanda increased female land ownership and tripled the proportion of women who invested in their land through soil conservation.

- A pilot program in Jordan found job vouchers quadrupled the share of married graduates in employment 8 months after graduation (see page 301 in the WDR; ongoing work is measuring the longer-term impacts).

- Cambodia's scholarship program, supported by the World Bank, increased school attendance rates of adolescent girls by about 25 percentage points.

The WDR 2012 contains many such examples – and we would do far better to highlight the results of these serious attempts to measure impact than to use aggregate numbers that are not informative.

Comments

Submitted by Alexander on
Re the previous finding that "Reading DI reduces the perception that World Bank staff face censorship when they blog among academically focused individuals." (http://blogs.worldbank.org/impactevaluations/node/624): Chalk me up as newly convinced.

Submitted by April Harding on
David, You are raising good points about how development organizations can more accurately portray or "tout" their achievements. It is worth noting that all development organizations these days feel the pressure to "tout" their contributions as a means of sustaining support. I think you would find the IFC is not alone in struggling to balance accuracy with "impact in messaging". As an example, the organizations working in the health domain all now feel they must give aggregate numbers on impact (e.g. lives saved, or reductions in deaths from X disease). And they all must do this by extrapolating from inputs, and occasionally outputs, which they can claim some responsibility for getting delivered. These extrapolations come from models which most practitioners (and almost certainly the leaders of the organizations in question) are dissatisfied with. That is, most of us know a malaria bednet delivered doesn't translate predictably into a certain number of lives saved, or malaria cases averted. However, if organization A (say, the Global Fund) "touts" lives saved, and organization B (say, the World Bank) "touts" nets distributed, you can imagine which organization looks better in the press. I don't think demonstrative stories (even coming from an RCT) will work for such messaging challenges.

That being said, the IFC faces an even tougher challenge linking their activities (even to meet the low standards I described above) to the MDGs the world has selected as the results bible. Their main instrument for impact is investments in private companies, and it is from these companies that they have the information from which to derive impact. And in very few cases are the outputs of these companies "link-able" to development goals. As you note, the company employs someone (who may well have been working elsewhere), or the hospital delivers a service (to a patient who would have gotten treated up the street). The contribution to the total number of jobs, or access to health services, is simply unknowable without knowing vastly more about the market the companies were operating in, or the people who became employed or got treated. Tying their investments to the "impacts" which get you brownie points with the masses is a huge, maybe insurmountable, challenge.

I think it's important to acknowledge that, however unsatisfactory, even tracking these "outputs" is a step forward from tracking achievement in terms of volumes of investment, which was the only figure that mattered until very recently for the IFC. And it is my impression that the IFC is working intently on assessing and refining their International Development Indicators. They held a day-long discussion on precisely this topic last week: http://www.ifc.org/ifcext/devresultsinvestments.nsf/Content/homenew For me, the take-away message is that all these development organizations should be doing much, much more to develop and implement, on a regular basis, cheap monitoring instruments to track the evolution of the markets in which the entities we support operate. Without these repeated "market snapshots", we will continue to be stuck pretending to know what is going on.

Thanks April for these thoughtful comments. I totally agree it is a tough thing to try to measure, and that organizations feel pressure to come up with these big numbers. I agree with you that better measurement of inputs and steps taken is one positive way forward. However, I hope it doesn't have to be a race to the bottom with statements like "X lives saved", in which the organizations that are prepared to make the most outrageous claims of impact are somehow those that prosper. Thinking about what a middle ground might look like seems an interesting way forward.

Submitted by Anonymous on
Nice post! I think most people involved in communications and marketing (and others as well) don't know about the measurable impact of most WB projects. There should be a stronger institutional connection between research and impact evaluation teams and other units of the Bank. For example (just thinking out loud), a team could be in charge of summarizing the main findings of impact evaluation studies from the WB, and sending those findings to TTLs and others involved in the project, as well as people from communications. The findings should be in a one-page brief in plain language (not economists' jargon), similar to the Economic Premises or Research Digest produced by PREM and DEC, but targeted at people from communications. In that way, when EXT staff or others need to pull out numbers on the impact of the WB, they can just browse those monthly briefings and get some catchy but rigorous phrases to add to their communications strategy.

David, Thanks for raising this issue, and thanks to the others who have commented. You are right—the statement on the pavement needed a footnote. Ideally, it would have clarified that IFC—as part of the World Bank Group—contributed, through its clients, to providing the 665,000 jobs to women. We state this clearly in the 2011 Annual Report, which you correctly identified as the source of the data. The footnote would have read: “IFC, through investments and advisory activities, is contributing to job creation in developing countries. While we have a sense of direct jobs being created by our client companies, we don’t yet fully understand the indirect effects on jobs or other induced effects of our work. IFC is launching a study and open process (www.ifc.org/jobcreation) to better understand the direct, indirect, and induced effects of our operations.” Clearly too long for the pavement (or a sound bite). Nevertheless, a version of it would have gone a long way in making clear our role, and that of our clients, in providing jobs for women in developing countries. I also agree that reach numbers alone are not a sufficient measure of development impact. The data we shared are part of IFC’s broader results measurement (www.ifc.org/results) framework and were intended to be seen in that context. But the reason I particularly welcome your comments about jobs, attribution, and measuring development impact is because this is so pressing for us at IFC at the moment. Job creation is one of the private sector’s key contributions to poverty reduction. IFC, as the largest global development institution focused on the private sector in developing countries, plays an important role in facilitating employment through its clients. But how exactly? While we can say quite a bit about the direct effects our investments and operations have on jobs, we have limited knowledge about which interventions are most likely to catalyze job creation, or benefit the poor.
We know that direct jobs are only a small portion of the total job creation our investments can generate. Estimates of indirect job creation vary widely by sector, country, and company. The sectors that produce fewer direct jobs (e.g. heavy industries) often have higher indirect job creation. For example, a case study on a gold mining company in Ghana—an IFC client—showed that for every direct job in the mine, 28 more were supported in the economy. We are finding that job creation is also difficult to assess in some areas of our business. For example, we know that access to finance is one of the key constraints for private business in developing countries, and that micro, small, and medium-sized enterprises (MSMEs) suffer particularly from these constraints. We also know that in 2010 our investment clients provided almost 10 million loans to MSMEs. We don’t know exactly how many jobs these MSMEs provided, but it is probably on the order of 50 million. We face the same challenge with our work in infrastructure, investment climate, and education and skills training. The study and open process (www.ifc.org/jobcreation) we are launching is intended to deepen our knowledge about these issues. Because IFC is not a research institution, we depend greatly on the knowledge and expertise of others, which is why we have opted to do this as an open-source study. We will examine data from our clients and the experiences of our partners and donors on employment. During the course of the study, IFC will also assess the feasibility of adding job creation as a development goal (www.ifc.org/ifcext/devresultsinvestments.nsf/Content/homenew). The lessons we learn will help shape our strategy and operations. And hopefully, they will enable us to better articulate our contribution to employment generation. We’ve set up a web page (www.ifc.org/jobcreation) that has background material and our methodology. I hope you and your readers will continue the conversation.
Nigel Twose, Director, IFC Development Impact Department

Submitted by Raj Raina on
Thanks for the quality assurance.

Submitted by CBT on
Thank you for this post. I think David just catapulted the quality of results measurement at the World Bank Group a decade forward. It will be much harder, from now on, to present evidence of positive impact that cannot stand on its own. A fact-checking task force may be in the works, as we speak. What is disappointing is to read the IFC responses. Instead of admitting this as a mistake, they try to defend it by providing evidence of the difficulty of their mission. It is like driving through a red light and arguing this is not a traffic violation because you were running late. I disagree with the notion that the IFC didn't include the footnote on the pavement because it was too long. All it had to say was "we are not sure this is true". If a better evaluation system is in the works (complete with transparency and public feedback), wouldn't it be better to wait until that system is in place rather than provide shaky evidence of impact? As David says, there is lots of evidence of positive impact stemming from rigorous evaluation. Why not stick to those? And if the IFC believes that what they do is so complex that measuring impact properly means they can't produce catchy phrases for public consumption, then they shouldn't. Who says this is the only way to communicate effectiveness? It may be the easiest, since everybody is doing it, but the IFC may just need to think harder. Don't just copy what others are doing if that doesn't work for you. The consequence of misleading communications is that, once discovered, it taints everything you say. Now, nobody who has read this post will be able to read the IFC's annual report without smiling. What a pity.