These days we talk a lot about how best to assess development impact through evaluative research. Sound data and methods are essential. Here there has been considerable progress over the last 20 years or so.
All that progress will come to nothing, however, if it does not make the people actually doing development more knowledgeable about what they are doing. This depends in part on whether the research that is done is useful and well disseminated. Here there has also been progress, though more work is needed.
It also depends on practitioners’ demand for knowledge. (By “practitioners” I mean project staff in donor or lending agencies as well as policy makers in government.) This will depend on the perceived costs and benefits to practitioners. We can lower the costs by better dissemination. But if there is little expected benefit to practitioners, there will be little learning and little development impact from new knowledge.
That leads me to ask: do today’s practitioners face the right incentives for learning? Some practitioners I know have a real thirst for knowledge. But it would be fair to say that too many project staff and policy makers face weak incentives to learn. Their “bosses” put little value on time spent learning because of uncertainty about the benefit or their expectation that a sizable share of the benefit accrues to others. Practitioners may bear only a small share of the cost of misinformed practices and policy mistakes. Project design and approval processes and ex-post quality ratings do rather little to reward staff for well-informed projects, or signal cases that were misguided from the start.
What might be done to strengthen practitioners’ incentives for learning? This is not a new question. Many readers will have heard of the long-standing concerns about the World Bank’s “lending culture,” which tends to reward operational staff for the volume of their lending, with (it is argued) too little weight given to the quality of lending or to knowledge-related products and services. Twenty years ago, the Wapenhans Report laid out these concerns in forthright terms. Since then there has been considerable effort to do better through tighter quality control at entry and better project implementation practices. But few observers would argue that these concerns are no longer salient.
Here is one suggestion for strengthening incentives for using knowledge in the practice of development: as a matter of principle, no project or policy proposal should be funded that does not have a recent Ex-ante Survey of Knowledge on Impact (ESKI)—an objective and thorough assessment of prevailing knowledge bearing on the case for and against the intervention in its setting, based on past research, impact evaluations and experience. The ESKI can be thought of as the principal input to the ex ante appraisal—explaining the rationale for the intervention, based on what we know, and identifying plausible ranges for the key parameters relevant to a successful project. It should draw on theory and evidence to understand the development problem or obstacle that the intervention targets. What is the market or governmental failure it addresses, and how do we know that addressing it will make things better overall? What are the implications for equity? The best World Bank lending operations already include something like an ESKI. The challenge is to raise standards and bring the idea to all operations, and not just the Bank’s.
A few additional remarks:
· In terms of the World Bank’s project cycle, the ESKI should be done at the identification stage. The paper may well conclude that the project should not go ahead. If the project gets a green light, the ESKI will be a key input to the subsequent “Concept Note.”
· It would apply not only to “projects” but also to policy-based support. Here too we need to be reasonably confident that the proposed reforms have a sound economic rationale based on current knowledge.
· As the principal document laying out the case for and against the intervention, the ESKI should meet high quality standards. Importantly, the paper cannot be selective—it will clearly not do to cherry-pick only the research findings that seem supportive. Nor should it make claims that are not supported by documented evidence. New data collection and analytic work may well be needed. Evidence will often be mixed and ambiguous, and this must be acknowledged explicitly.
· The ESKI should be subject to formal peer review, to ensure that the agreed quality standards are met. In the case of the World Bank, this might be done by its research department (when it was not engaged directly in producing the paper) or by one of its “Global Expert Teams,” or it could be done externally by established experts.
· The ESKI will often be an important first step toward the subsequent impact evaluation, taking this task “up-stream,” with the evaluator involved at the very outset of the project cycle (rather than later, and possibly too late). The design stage of a good impact evaluation often helps sharpen the rationale for an intervention. (The Bank’s DIME initiative has been doing just that for the participating projects.) The ESKI would be the first step in this process for every project.
· It will be no less relevant to innovative new projects than to more established types of interventions. Naturally there will be differences between projects in the extent of relevant current knowledge to draw on. But even for the most innovative project, there must be an internally consistent rationale, anchored to knowledge.
· There will undoubtedly be some economies of scale from grouping similar projects and providing a common ESKI, which is then updated and adapted to specific applications. But every project or policy-based operation should have an ESKI prepared within the last two years (say).
· The ESKI would also be an important input to future knowledge generation (including ex-post evaluations) by identifying key knowledge gaps meriting further research.
· The ESKI should be shared with the borrower, whose comments should be taken into account as part of the project preparation process and the related risk assessment.
Readers may well have reactions to this proposal or suggestions of their own on how to strengthen the link between knowledge and practice.
(I have benefited from useful comments from Ani Dasgupta, Asli Demirguc-Kunt, Shahrokh Fardoust, Jed Friedman, Emanuela Galasso, Markus Goldstein, Polly Jones, Arianna Legovini, Justin Lin, David McKenzie, Dominique van de Walle and Adam Wagstaff.)