It is well recognized that the stock of knowledge among development practitioners matters to development impact. How then do the operational staff of the largest international development agency value and use its research for their work?
The Research Support team in the World Bank's Development Economics Vice-Presidency recently surveyed 550 of the Bank's senior operational staff. One question asked what value respondents attached to Bank research for their work, on a 10-point scale where 1 = "not valuable at all" and 10 = "extremely valuable." Answers covered the whole range: over half the sample gave an answer of 6 or more, and two-thirds gave an answer of 5 or more. Many operational staff thus appear to have a reasonably strong incentive to learn about Bank research, though it is stronger for some than for others.
Of course, there is nothing to guarantee that even a strong incentive to learn will turn into knowledge. Learning has costs, and access to relevant research findings is not always easy. These "frictions" can mean that actual learning is weak even when demand is strong.
The survey also asked about familiarity with Bank research, again on a 10-point scale. Here too, over half gave an answer of 6 or more. What is more interesting is that those who gave a high score for the value of research for their work overlapped quite a lot with those who gave a high score for familiarity with research: answers to the two questions had a (statistically significant) positive correlation coefficient of 0.50. Some 42% of those surveyed can be dubbed "functionally well-informed": they put a reasonably high value on Bank research for their work and feel that they are quite familiar with research findings, with mean scores of almost 8 on both 10-point scales.
Another group of staff, slightly less than one quarter of the sample, put a high value on Bank research but are not so familiar with that research; they appear to be less successful in accessing research products, or they find those products of limited relevance. Greater effort is needed to reach and serve this group. There is also a group of staff, again slightly less than one quarter of the sample, that can be called the "happily uninformed": they attach low value to research for their own work and report similarly low familiarity. The rest of the sample makes up a small group of the "independently well-informed," who are keen consumers of Bank research even though it matters little to their current work.
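The four-way grouping described above amounts to a simple quadrant classification on the two 10-point scales. The following is a minimal sketch, not the paper's actual method: the cutoff of 6 for "high" and the label for the second group ("valuing but unfamiliar") are my own illustrative assumptions, as are the example respondents.

```python
# Illustrative sketch (not the paper's actual method): classify survey
# respondents into the four groups described in the text, using two
# hypothetical 1-10 scores and an assumed cutoff of 6 for "high."

def classify(value, familiarity, cutoff=6):
    """Map a (value, familiarity) pair of 1-10 scores to a group label."""
    high_value = value >= cutoff
    high_familiarity = familiarity >= cutoff
    if high_value and high_familiarity:
        return "functionally well-informed"
    if high_value and not high_familiarity:
        return "valuing but unfamiliar"   # hypothetical label for this group
    if not high_value and not high_familiarity:
        return "happily uninformed"
    return "independently well-informed"

# Hypothetical respondents: (value of research, familiarity with research)
sample = [(8, 8), (9, 3), (2, 2), (3, 9)]
for v, f in sample:
    print(v, f, "->", classify(v, f))
```

With real survey data one would also tabulate the share of respondents falling in each quadrant, which is how figures like the 42% "functionally well-informed" arise.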
The survey also throws light on how staff come to learn about Bank research. Some channels for learning matter more than others in predicting familiarity with research. Contacts with researchers, both informal contacts and more formal direct support to operations, have more impact than papers and books (though research findings must of course be documented, so the latter channel is still needed).
The results are consistent with the view that stronger incentives for learning about research translate into greater knowledge. That is good news. But there are sources of friction that diminish the impact of even a strong incentive for learning on actual knowledge. The responsiveness of researchers and the relevance, timeliness and accessibility of their products all matter to how much the supply of Bank research helps operational staff do their work better.
There are some marked differences by location and sector of work. Familiarity with the Bank’s research is significantly lower for staff based in the Bank’s country offices. Yet those staff do value research for their work; indeed they put slightly higher value on research than staff based in Washington DC, although this difference is not statistically significant. Staff in country offices are no less (or more) likely to be “functionally well-informed” than those based in DC. All this suggests that advances in communication technology have not eliminated the advantages of physical proximity.
Staff working in the economic policy, poverty and human development sectors tend both to be more familiar with the Bank's research and to attach higher value to it than staff in the more traditional areas of the Bank's lending: agriculture, energy and mining, transport and urban development. Only 28% of staff working in the latter sectors are functionally well-informed, compared to 45% for all other staff, rising to 57% for staff working in economic policy, poverty and human development. And while only 7% of senior staff working on poverty and 10% of those mapped to economic policy are "happily uninformed," the proportion rises to 37% for staff working in urban development and 41% in the energy and mining sector.
The differences across units in the demand for the Bank’s research are correlated with the incidence of PhDs (in any field) and the presence of economists across those units. This suggests that a lack of internal research capacity in operational units might impede absorptive capacity for external knowledge (as has also been found in studies of research and innovation in private sector firms). Attempts to mandate knowledge products such as impact evaluations or ESKIs (as I proposed in my last post on this blog) will no doubt have uneven success unless absorptive capacity is more uniformly high across the Bank’s sectors and operational units.
One might be tempted to respond to all this by saying: "Who cares? The happily uninformed are doing fine. Bank research does not matter to them and they largely ignore it." Probing further into the staff responses casts doubt on this response. Two observations. First, the study finds that those who relied less on research in the past are more likely to want to increase their use of it in the future; the happily uninformed express a significantly stronger desire to increase their future use of research than do other groups. Second, the happily uninformed give their networks significantly lower ratings for their performance in connecting operational staff to research findings. And staff working in the traditional infrastructure sectors give significantly lower average ratings for the performance of their networks than do staff working on economic policy, poverty and human development.
So the “happily uninformed” are not really so happy. They want to do better in connecting up with the latest research.
How then do these inter-sectoral differences come about? There are both demand- and supply-side factors. The supply of research is not always relevant to the needs of development practitioners. (For example, the current emphasis on randomized evaluations in development economics has arguably distorted knowledge even further away from the hard infrastructure sectors where these tools have less applicability, as I have argued here.) However, that is only part of the story. The supply of research is clearly also determined by demand. In turn, demand stems in no small measure from the extent to which “development impact” is challenged by citizens, governments and managers in aid agencies. Impact often appears to be taken for granted in the traditional hard infrastructure sectors (though in truth the evidence is often rather weak, given relatively low levels of investment in rigorous ex-post evaluations). This stands in marked contrast to the social sectors where lending and policy operations have had to work hard to justify themselves, and have turned to research to help do that. Thus we have seen a large new body of research on poverty and human development emerging over the last 15 years.
Clearly, if the presumption of impact were more routinely and systematically challenged, then project staff would face stronger incentives for learning about impact, which means learning about the failures as well as the successes. And stronger incentives for learning would generate greater familiarity with research, and greater use of it, in the work of development practitioners.
You can read more about my analysis of this survey of the views of the Bank’s operational staff in my paper “Knowledgeable Bankers? The Demand for Research in World Bank Operations.”