
Do our development practitioners have an incentive to learn? And do they learn?

Martin Ravallion

It is well recognized that the stock of knowledge among development practitioners matters to development impact. How then do the operational staff of the largest international development agency value and use its research for their work?

The Research Support team in the World Bank’s Development Economics Vice-Presidency recently surveyed 550 of the Bank’s senior operational staff. One question asked what value respondents attached to Bank research for their work, on a 10-point scale where 1 = “not valuable at all” and 10 = “extremely valuable.” Answers covered the whole range. Over half the sample gave an answer of 6 or more; two thirds gave an answer of 5 or more. Many operational staff appear to have a reasonably strong incentive to learn about Bank research, though it is stronger for some than for others.

Of course, there is nothing to guarantee that even a strong incentive to learn will turn into knowledge. Learning has costs, and access to relevant research findings is not always easy. These “frictions” can mean that actual learning is weak even when demand is strong.

The survey also asked about familiarity with Bank research, again on a 10-point scale. Here too, over half gave an answer of 6 or more. More interesting is that those who gave a high score for the value of research for their work overlapped substantially with those who gave a high score for familiarity with research: answers to the two questions had a statistically significant positive correlation of 0.50. Some 42% of those surveyed can be dubbed “functionally well-informed”: they put a reasonably high value on Bank research for their work (a mean score of almost 8 on the 1-10 scale) and feel that they are quite familiar with research findings (also a mean score of 8).

Another group of staff—slightly less than one quarter of the sample—put a high value on Bank research but are not so familiar with that research; they appear to be less successful in accessing research products, or they find them of limited relevance. Greater effort is needed to serve and reach this group. There is also a group of staff, again slightly less than one quarter of the sample, that can be called the “happily uninformed”: they attach low value to research for their own work, and report similarly low familiarity. The rest of the sample makes up a small group of the “independently well-informed,” who are keen consumers of Bank research even though it matters little to their current work.
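The four-group taxonomy above amounts to a simple two-way split on the two survey scales. A minimal sketch of the classification, assuming a cutoff of 6 on each 1-10 scale (the cutoff and the sample scores here are illustrative assumptions, not figures from the survey):

```python
# Classify a respondent by their 1-10 "value of research" and
# "familiarity with research" scores. The cutoff of 6 is a
# hypothetical threshold for illustration only.

def classify(value, familiarity, cutoff=6):
    """Return the group label for one respondent."""
    if value >= cutoff and familiarity >= cutoff:
        return "functionally well-informed"
    if value >= cutoff:
        return "high value, low familiarity"
    if familiarity >= cutoff:
        return "independently well-informed"
    return "happily uninformed"

# A few made-up respondents as (value, familiarity) pairs:
sample = [(8, 8), (9, 3), (2, 2), (4, 7)]
for v, f in sample:
    print(v, f, "->", classify(v, f))
```

Each respondent falls into exactly one quadrant, which is why the four group shares described above sum to the whole sample.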

The survey also throws light on how staff come to learn about Bank research. Some channels for learning matter more than others in predicting familiarity with research. Contacts with researchers—informal contacts as well as more formal direct support to operations—have more impact than papers and books (though, of course, research findings must be documented, so we still need the latter channel).

The results are consistent with the view that stronger incentives for learning about research translate into greater knowledge. That is good news. But there are sources of friction—diminishing the impact on knowledge of even a strong incentive for learning. The responsiveness of researchers and the relevance, timeliness and accessibility of their products are all important to how much the supply of Bank research helps operational staff do their work better.

There are some marked differences by location and sector of work. Familiarity with the Bank’s research is significantly lower for staff based in the Bank’s country offices. Yet those staff do value research for their work; indeed they put slightly higher value on research than staff based in Washington DC, although this difference is not statistically significant. Staff in country offices are no less (or more) likely to be “functionally well-informed” than those based in DC. All this suggests that advances in communication technology have not eliminated the advantages of physical proximity.

Staff working in the economic policy, poverty and human development sectors tend to be both more familiar with the Bank’s research and attach higher value to it than staff in the more traditional areas of the Bank’s lending—agriculture, energy and mining, transport and urban development. Only 28% of staff working in the latter sectors are functionally well-informed, as compared to 45% for all other staff, rising to 57% for staff working in economic policy, poverty and human development. And while only 7% of senior staff working on poverty and 10% of those mapped to economic policy are “happily uninformed,” the proportion rises to 37% for staff working in urban development and 41% in the energy and mining sector.  

The differences across units in the demand for the Bank’s research are correlated with the incidence of PhDs (in any field) and the presence of economists across those units. This suggests that a lack of internal research capacity in operational units might impede absorptive capacity for external knowledge (as has also been found in studies of research and innovation in private sector firms).  Attempts to mandate knowledge products such as impact evaluations or ESKIs (as I proposed in my last post on this blog) will no doubt have uneven success unless absorptive capacity is more uniformly high across the Bank’s sectors and operational units.

One might be tempted to respond to all this by saying: “Who cares? The happily uninformed are doing fine. Bank research does not matter to them and they largely ignore it.” Probing further into the staff responses casts some doubts on this response. Two observations: First, the study also finds that those who relied less on research in the past are more likely to want to increase their use of it in the future. The happily uninformed have a significantly higher desire to increase their usage of research in the future than do other groups. Second, the happily uninformed give their networks significantly lower ratings for their performance in connecting operational staff up to research findings. And staff working in the traditional infrastructure sectors give significantly lower average ratings for the performance of their networks than do staff working on economic policy, poverty and human development.

So the “happily uninformed” are not really so happy. They want to do better in connecting up with the latest research.

How then do these inter-sectoral differences come about? There are both demand- and supply-side factors. The supply of research is not always relevant to the needs of development practitioners. (For example, the current emphasis on randomized evaluations in development economics has arguably distorted knowledge even further away from the hard infrastructure sectors where these tools have less applicability, as I have argued here.) However, that is only part of the story. The supply of research is clearly also determined by demand. In turn, demand stems in no small measure from the extent to which “development impact” is challenged by citizens, governments and managers in aid agencies. Impact often appears to be taken for granted in the traditional hard infrastructure sectors (though in truth the evidence is often rather weak, given relatively low levels of investment in rigorous ex-post evaluations). This stands in marked contrast to the social sectors where lending and policy operations have had to work hard to justify themselves, and have turned to research to help do that. Thus we have seen a large new body of research on poverty and human development emerging over the last 15 years.

Clearly, if the presumption of impact were more routinely and systematically challenged, then project staff would face stronger incentives for learning about impact—which means learning about the failures as well as the successes. And strong incentives for learning would generate greater familiarity with, and use of, research in the work of development practitioners.

You can read more about my analysis of this survey of the views of the Bank’s operational staff in my paper “Knowledgeable Bankers? The Demand for Research in World Bank Operations.”

Comments

Submitted by Stanley Ramos
Thanks, this is very interesting. I wonder how this compares with your other recent study of the World Bank's publication record and citation index. How many ESWs and Project Documents cite the Bank's own research? And does the volume of citation vary across sectors? Maybe the record of actual citation activity could be a useful complement to the self-reported awareness and appraisal of Bank research. (Unless one can't track citations in project documents?)

Submitted by Anis Dani
The findings are very interesting. However, the demand for and appreciation of DEC research by the PREM and HDN sectors may also be a function of the bias in DEC's own staffing structure and research priorities, rather than the absence of economists in those units. If the research done by DEC is monopolized by economists, who else do you expect it to communicate with? Would the results be different if DEC decided to end the monopoly of economists and bring in people from other disciplines who have research interests? Would the results be different if DEC's research were more operationally relevant to the sectors being labelled "happily uninformed"? Maybe DEC is also happily uninformed about the issues those sectors are grappling with, because it is insulated from the market pressures they face to justify their budgets. I am willing to wager that if DEC's research were based on WPAs negotiated with the regions and networks, its work program would look very different.

Anis, many thanks for your comment. As I tried to explain in the post, and in the paper, one cannot look at this issue solely from one side, “demand” vs. “supply.” Both sides have to be in place: the incentive for learning about and using research on the one hand, and the supply of relevant research on the other. DEC research does not evolve in a vacuum. What DEC research does is endogenous (forgive the economic jargon) to the demand from operations and external clients.

Don’t forget we do have external clients; DEC is the Bank’s (and probably the world’s) main generator of global public goods for knowledge about development. But Bank operations are a hugely important set of clients for us, particularly in the research department. And to make sure our researchers do engage with operations, we insist that all of them sell one third of their time to operations, which is a real market test of our relevance, and it matters to staff assessments.

We do respond as best we can when there is demand. To give an example relevant to your own interests, we have increased our engagement with “non-economic” issues over recent years, in response to demand from operations. Indeed, our latest Policy Research Report (Localizing Development: Does Participation Work?) is a major work on community development, and it takes a very broad approach, embracing perspectives from well outside economics. Granted, the authors are economists, but economists can and do contribute to knowledge on these issues, drawing on many disciplines. Indeed, I would argue that taking such a broad approach to development is part of what it means to be a good economist.

I would also add that DECRG is subject to the same pressures on budget as other central and operational units. Indeed, the number of our staff funded from the Bank’s base budget has been declining over many years; we feel a bit like the rapidly shrinking research department! This has left some gaping holes in our coverage, and does not make it easy for us to respond to new demands. But (like the rest of the Bank) we try our best. Martin

Submitted by Bertha
This is very interesting; I just looked at it quickly, and I wonder about self-reported results, especially on topics like this, where social desirability bias can be important. Although I note that the uninformed were non-negligible, which may be a good sign; I really do not know, and unfortunately I cannot reflect long on this now. In any case, my question is: are we reporting what our position is supposed to report, or are we reflecting true behavior? I wonder if people in analytical positions, like economists, will tend to answer more positively even if their behavior is not really consistent. Is there any way of designing a survey that asks for recall of particular research pieces? Along the lines of the first commenter, can we look for other objective evidence (citations showing where the research was used), etc.? Anyway, I really like the topic of learning and staff incentives, and I believe we need evidence like this. Thanks!