“The right data at the right time”: How to effectively communicate research to policy makers

Researchers in development often hope that their research can ultimately influence policy. But getting from research results to actually persuading policymakers is an ongoing struggle. Yesterday I heard insights on this point from Dasmine Kennedy of Jamaica’s Ministry of Education and from Albert Motivans of Equal Measures 2030. (I also gave my two cents.)

Kennedy made three points:

  1. Policymakers are interested in the return on investment, so they want to see a cost-benefit analysis.

  2. Explain research in clear, non-technical language. “One of the biggest turn-offs will be to come and start speaking about all the different research techniques.”

  3. Show how research aligns with policymakers’ own policy initiatives. “I know you’re working on this. I know you’re excited about learning about the outcome of your initiative, but this is what the data is saying.”

Motivans highlighted the value of creating data systems that allow for the “right data at the right time,” so that when a specific policy question arises (as in, “What are dropout rates? What drives them?”), there are answers ready.

Of course (and this is me, not Motivans), data collection is expensive, which means prioritizing which questions we most want answered. In the context of education, this is an argument for regularly collecting student performance data (whether low-stakes or high-stakes): if we want to improve student performance, then it’s hard to justify not gathering regular, representative data on student performance.

This conversation was part of a longer panel on “Closing the Gender Data Gaps in Girls’ Education,” including Kennedy, Motivans, Stephanie Psaki, Thoai Ngo, and me, hosted by the GIRL Center at the Population Council. You can watch the full panel below or at the Population Council’s Facebook page. The conversation on communicating to policymakers takes place between 42:30 and 50:00.
 

Video: full panel recording of “Closing the Gender Data Gaps in Girls’ Education”
Early in the panel, I talk about a new initiative I’m working on (with a team) to understand the gender impacts of learning interventions. Only about one-third of impact evaluations with learning measures report effects separately by gender. Recent work by J-PAL shows that interventions to improve school access – even those not targeted to girls – often have dramatically larger effects for girls. We want to understand how learning interventions differentially affect boys and girls. (This work is funded by Echidna Giving and the Umbrella Facility for Gender Equality.)
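For readers curious what “reporting effects separately by gender” looks like in practice, here is a minimal sketch using entirely made-up data and standard Python tooling (pandas and statsmodels). It illustrates the general technique of estimating a treatment effect with a treatment-by-gender interaction; it is not the initiative’s actual code, data, or analysis.

```python
# Illustrative only: simulated data, not real evaluation results.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 2000

# Hypothetical randomized evaluation: treatment assignment, student gender, test score
df = pd.DataFrame({
    "treated": rng.integers(0, 2, n),
    "female": rng.integers(0, 2, n),
})
# Simulated outcome in which the intervention helps girls more than boys
df["score"] = (
    0.10 * df["treated"]
    + 0.15 * df["treated"] * df["female"]
    + rng.normal(0, 1, n)
)

# OLS with an interaction: the coefficient on treated:female is the
# differential effect for girls relative to boys
model = smf.ols("score ~ treated * female", data=df).fit(cov_type="HC1")
print(model.summary().tables[1])

# Effect for boys: the coefficient on treated
# Effect for girls: treated + treated:female
girls_effect = model.params["treated"] + model.params["treated:female"]
print(f"Estimated effect for girls: {girls_effect:.3f}")
```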

Related links – these are all by me; you can share YOUR related links in the comments!

Authors

David Evans

Senior Fellow, Center for Global Development

Dasmine Kennedy
September 14, 2017

This is a very good, spot-on summary, David. I have provided a link to my blog on the research I am working on so that additional insights can be gleaned from my work.
https://www.brookings.edu/blog/education-plus-development/2017/08/08/ho…

Varja Lipovsek
September 15, 2017

Thanks for this post.
It strikes me as something that all of us working to influence public policy want to hear: that all of our evidence can indeed be put to good use.
But I was very struck this week by an article from Baekgaard et al. on motivated reasoning among politicians (in this case, in Denmark). (I had found an ungated copy but now can only find it here, sorry: https://www.cambridge.org/core/journals/british-journal-of-political-sc…).
It turns out that (Danish) politicians are more likely to interpret even unambiguous data in a way that supports their (political) priors, and, perhaps even more surprisingly, that the more information they are given, the stronger this prior-confirming interpretation becomes.
In other words, "the right data at the right time... but only insofar as it fits my previous view of the world." Which is a very different problem to address.
Friday food for thought.
Cheers!
Varja