Development impact calls for knowledgeable development practitioners


These days we talk a lot about how best to assess development impact through evaluative research. Sound data and methods are essential. Here there has been considerable progress over the last 20 years or so.

All that progress will come to nothing if it does not make those people actually doing development more knowledgeable about what they are doing.  This depends in part on whether the research that is done is useful and well disseminated. Here there has also been progress, though more work is needed.

It also depends on practitioners’ demand for knowledge.  (By “practitioners” I mean project staff in donor or lending agencies as well as policy makers in government.) This will depend on the perceived costs and benefits to practitioners. We can lower the costs by better dissemination. But if there is little expected benefit to practitioners, there will be little learning and little development impact from new knowledge.

That leads me to ask: do today’s practitioners face the right incentives for learning? Some practitioners I know have a real thirst for knowledge. But it would be fair to say that too many project staff and policy makers face weak incentives to learn. Their “bosses” put little value on time spent learning because of uncertainty about the benefit or their expectation that a sizable share of the benefit accrues to others. Practitioners may bear only a small share of the cost of misinformed practices and policy mistakes. Project design and approval processes and ex-post quality ratings do rather little to reward staff for well-informed projects, or signal cases that were misguided from the start.

What might be done to strengthen practitioners’ incentives for learning? This is not a new question. Many readers will have heard of the long-standing concerns about the World Bank’s “lending culture,” which tends to reward operational staff for the volume of their lending, with (it is argued) too little weight given to the quality of lending or to knowledge-related products and services. Twenty years ago, the Wapenhans Report laid out these concerns in forthright terms. Since then there has been considerable effort to do better through tighter quality control at entry and better project implementation practices. But few observers would argue that these concerns are no longer salient.

Here is one suggestion for strengthening incentives for using knowledge in the practice of development: as a matter of principle, no project or policy proposal should be funded that does not have a recent Ex-ante Survey of Knowledge on Impact (ESKI)—an objective and thorough assessment of prevailing knowledge with bearing on the case for and against the intervention in its setting, based on past research, impact evaluations and experience. This paper can be thought of as the principal input to the ex ante appraisal—explaining the rationale for the intervention, based on what we know, and identifying plausible ranges for the key parameters relevant to a successful project. It should draw on theory and evidence to understand the development problem or obstacle that the intervention targets. What is the market or governmental failure it addresses, and how do we know that doing so will make things better overall? What are the implications for equity? The best World Bank lending operations already include something like an ESKI. The challenge is to raise standards and bring the idea to all operations, and not just the Bank's.

A few additional remarks:

·         In terms of the World Bank’s project cycle, the ESKI should be done at the identification stage. The paper may well conclude that the project should not go ahead. If the project gets a green light, then the ESKI will be a key input to the subsequent “Concept Note.”

·         It would apply not only to “projects” but also to policy-based support. Here too we need to be reasonably confident that the proposed reforms have a sound economic rationale based on current knowledge.

·         As the principal document laying out the case for and against the intervention, the ESKI should meet high quality standards. Importantly, the paper cannot be selective—it will clearly not do to simply cherry-pick the research findings that seem supportive. Nor should it make claims that are not supported by documented evidence. New data collection and analytic work may well be needed. Evidence will often be mixed and ambiguous, and this must be acknowledged explicitly.

·         The ESKI should be subject to formal peer review, to assure that the agreed quality standards are met. In the case of the World Bank, this might be done by its research department (when it was not engaged directly in producing the paper) or by one of its “Global Expert Teams,” or it could be done externally by established experts.

·         The ESKI will often be an important first step to the subsequent impact evaluation, taking this task “up-stream,” with the evaluator involved at the very outset of the project cycle (rather than later, and possibly too late). The design stage of a good impact evaluation often helps sharpen the rationale for an intervention. (The Bank’s DIME initiative has been doing just that for the participating projects.) The ESKI would be the first step in this process for every project.

·         It will be no less relevant to innovative new projects than to more established types of interventions. Naturally there will be differences between projects in the extent of relevant current knowledge to draw on. But even for the most innovative project, there must be an internally consistent rationale, anchored to knowledge.

·         There will undoubtedly be some economies of scale to be obtained by grouping similar projects and providing a common ESKI, which is then updated and adapted to specific applications. But every project or policy-based operation should have an ESKI that is no more than two years old (say).

·         The ESKI would also be an important input to future knowledge generation (including ex-post evaluations) by identifying key knowledge gaps meriting further research.  

·         The ESKI should be shared with the borrower, whose comments should be taken into account as part of the project preparation process and the related risk assessment.

Readers may well have reactions to this proposal or suggestions of their own on how to strengthen the link between knowledge and practice.

(I have benefited from useful comments from Ani Dasgupta, Asli Demirguc-Kunt, Shahrokh Fardoust, Jed Friedman, Emanuela Galasso, Markus Goldstein, Polly Jones, Arianna Legovini, Justin Lin, David McKenzie, Dominique van de Walle and Adam Wagstaff.)

Author

Martin Ravallion, Edmond D. Villani Professor of Economics, Georgetown University

Ian
October 06, 2011

Development practitioners' actual demand for, and use of, knowledge are important barriers to effective development work. I think the proposal for an "ESKI" could go some way toward addressing this in an institution that is willing to incorporate it as a requirement. A few provisos, though:

1. The level of detail and process required to produce this would need to depend on the size and potential impact of the programme. It sounds like a lot of pre-work, and a high cost, to do something like this for smaller projects, so a lighter mechanism to ensure that available knowledge is taken into account should be considered.
2. Any good assessment of the potential impact of knowledge in addressing a problem also needs to take into account the political context. Any government (or even donor) policy maker needs to consider not only what will work technically, but also what will be feasible in the existing cultural and political system and what can be explained and sold to voters.
3. Existing knowledge about an issue doesn't only come from academic research and evaluations; it also comes from experience or "tacit" knowledge.
4. It would also be important to look at incentives to use knowledge that go beyond the need to fulfill mandatory requirements such as enforcing the need for an ESKI. People who have mandatory requirements but little real commitment to fulfilling them tend to carry them out in form but not in substance. This means that institutionalizing some form of knowledge review will help, but some of the other factors which influence whether or not policy makers seek and use knowledge will also need to be addressed.

Here's something I wrote earlier this year on some of the reasons why there isn't a greater demand for knowledge, and a few (far from comprehensive) suggestions about how to tackle them:

http://kmonadollaraday.wordpress.com/2011/03/21/do-we-need-to-create-a-…

Gabriel
October 07, 2011

This is a long overdue proposal. I've often found it frustrating to read concept notes that don't make any effort--not even with a selective reading of the evidence!--to argue the case for the project based on past experience and theory.

One point that may help avoid some confusion: I've discovered that many people in the development policy world think that advocates for impact evaluation and evidence-based policy believe that randomized controlled trials are the only valid form of evidence. As such, these same people might suspect that this proposal masks an effort to only advance projects supported by evidence from RCTs.

Without being able to read Martin's mind, I'm confident in saying that is NOT the intent. RCTs are just one of many forms of evidence. The point is to put whatever evidence we have on the table up front before project decisions are made.

Larrú
October 07, 2011

ESKI seems to be a good trial for researchers and knowledge demanders, but what about the people (policy-makers, politicians, WB governors) who take decisions not based on "objective and prevailing knowledge"? How many votes does knowledge generate? My own experience is of sharing evaluation results with practitioners or technical staff, while the policy-maker or the person who must take the final decision is left out. Is knowledge a democratic (practical in the real world) value?
A final point: people cannot generate "objective" knowledge. Our knowledge is always mediated by interpretations (Should "hermeneutics" generate any posts?).

Jonathan Haughton
October 10, 2011

The problem of keeping practitioners up to date is widespread.

In academia, sabbatical leaves are often seen as providing an opportunity to retool, or to give both time and opportunity to reflect more deeply than is possible when one is simply trying to write report after report.

The CFA Institute - the main professional body in finance - has put in place a formal system whereby one earns brownie points for reading academic articles, attending serious talks, and the like.

I would worry about ESKI becoming just another hoop to be jumped through in pro forma fashion - done by borrowing someone else's fine text and tweaking it just a bit in order to satisfy the boss or a reviewer.

What keeps me most on my intellectual toes is the need to present work in a seminar, where the ideas must be defended in front of a knowledgeable and potentially skeptical audience. I would recommend a process whereby evaluations of any significant project or program - whether at the design stage, or later - have to be defended in front of a jury of one's peers, viva voce; it might also be useful to engage some people from outside the institution to participate, to avoid the potential for in-house love fests where nobody dares speak ill of the work of an office mate. Such a process is not without some resource cost, but it does provide a spur to intellectual excellence, and should help keep us all honest.

Irene
October 10, 2011

Hear, hear... evidence in itself is not sufficient for learning or making a difference. It requires critical human faculties to look at the evidence and put it to use.

Since the late 1990s, when IFAD was involved in institutionalising learning through systematization in South American projects, I've been intrigued by incentives for learning and their absence. Disincentives are by far more pervasive. For example, an agricultural minister in a South American nation during that period - on hearing the word 'mistake' in a public presentation - pronounced: 'We do not make mistakes in this project', slamming the door shut on any learning in that sector for quite some time.... And that is a more dramatic example. Simpler examples are making the M&E unit manager's post the place to locate people who underperform, ignoring any findings from annual M&E reports, and more. http://www.ifad.org/evaluation/guide/7/7.htm#7_3

A few thoughts:
1. Project/program learning cannot come just at the outset, but yes, that would be a good start! An interesting example comes from New Zealand's education sector on what they call 'Best Evidence Synthesis Iterations'.
2. Conditions for learning crucially require the right incentives but there are other pre-conditions (http://journal.km4dev.org/index.php/km4dj/article/viewFile/105/164)
3. Multiple lines of evidence are critical, as no single source of 'evidence' will provide sufficient insight on what are often complex change pathways.
4. It also means rethinking how we understand M&E - as an accountability ensurer or a learning provoker. For the IFAD M&E guide, we started identifying some common disincentives/incentives and offering ideas.
5. Critically, the call for more investment up front is basically a call for much more thoughtful theorizing about the intended change pathways, making explicit the values and beliefs that underpin our strategic choices, articulating assumptions about key change agents and how intended project activities and relationships interact with each other, and much more.

In the 11 years since writing on this for IFAD, I've seen very little discussion of incentives for learning. Maybe now we can advance on this critical topic?

Martin Ravallion
October 11, 2011

Thanks for these many useful comments. Here are some further thoughts.

Ian: Yes, scale naturally matters in considering how much work to put into an ESKI, though even a "small" project ($20m say) would surely warrant a serious effort to properly document its rationale based on existing knowledge. I agree that “tacit knowledge” matters, but can’t we make it explicit and open, rather than hidden? An ESKI would be a good place to do that. And I agree that mandatory requirements are second-best to real commitment by practitioners; unfortunately, however, it looks like we need to make it mandatory for now; hopefully the idea will stick and near-universal commitment to doing it well will emerge in due course.

Larrú: the idea is to try to change the fact that there are (in your words) “people (policy-makers, politicians, WB governors) that take decisions not based on objective and prevailing knowledge.” Surely this is unacceptable. We must all push back against misinformed policy making. An ESKI is just one tool for that task. I don’t agree with your claim that “people cannot generate objective knowledge.” The accumulation of objective knowledge has been an important source of human progress.

Gabriel: Yes, we must be open about the sources of reliable knowledge about development impact. As I have argued elsewhere, RCTs can sometimes play a useful role, but so too can many other methods of both quantitative and qualitative knowledge generation. We need to focus on the right question, and remain open about how best to answer it.

Irene: I agree with your comments, and thanks for pointing out the lessons from IFAD’s experience, as in the useful links you provide. International agencies should be swapping notes and pooling experiences on how best to solve these (and other) problems faced in trying to assure development impact.

Jonathan: thanks for your thoughts. Rigorous peer review of the ESKI would be important, including seminar-type presentation and defense. However, a degree of bundling would be justified in my view; a core ESKI for a group of similar projects, but each adapted to the specific application. The core ESKI must be sound and updated regularly.

gawain kripke
October 12, 2011

I like this idea - and something like it should be required for any major intervention. But I'm not sure how it changes the "incentives" and doesn't become yet another formalistic process - another bureaucratic hoop to jump through before getting a project approved.

I wonder if one might consciously separate the case for and case against - with different authors or teams producing each.

To me, it seems like the USE of the ESKI product is at least as important as the document itself. A peer review is critical. Even better would be a process in which an approval panel reviews it, requiring a strong case and clear tradeoffs.

In making a "case for," it's important that the opportunity costs be made clear and real to give gravity to the decision. A case presuming infinite resources is much easier than a case in which your funding must come from someone else's budget. Spending money on x means NOT spending money on y. Ideally, x and y would directly compete in the same venue. Knowledge and learning come from observing each other, and from losing, then winning, then maybe losing again, in these competitions.

gawain kripke

Anonymous
October 31, 2011

One comment on this very desirable idea. Recently, we had a similar discussion with some colleagues and someone wrote "I have generally found that evidence is used only where it can support the current or proposed policy of the government or institution, rather than the other way around..."

This is of course not to argue for ignorance, but to try to understand why it is that we do not see more appetite for ex-ante assessment of projects’ impact in current documents, or why we see superficial and low-quality project appraisals.

I think we need to realize that by its nature the Bank is not only a knowledge institution, but also a sort of cooperative for countries, with a basic function of responding to the financing needs of its members. There is also pressure from competitors. I find this dichotomy at the heart of this discussion, as sometimes projects respond to a particular demand from the country and not directly to the best evidence available.

Perhaps one solution is to construct a project approval decision process in two layers, the technical first and then the political, without trying to have one artificially supplant the other. If this were possible we could be more sincere in terms of the evidence and avoid project documents that choose it selectively or just avoid any evidence. At a second stage, the board could frankly approve projects on the basis of strategic, political or practical considerations notwithstanding what the existing evidence points to, with a written note saying so. This would at least increase the transparency of the decision-making process and reveal the other inputs and incentives that decision-makers have, without punishing anyone but simply accepting that this is a reality of our type of “cooperative” institution.