Published on Data Blog

From Discovery to Scale: Leveraging big data to improve development outcomes

In the last few years, the World Bank has expanded its use of big data to more than 150 development projects globally, spanning a wide range of sectors and geographies. Solutions have ranged from using big data to monitor, evaluate, and improve projects—in energy, transport, and agriculture—to poverty diagnostics and understanding how well urban residents are connected to jobs. But, as Haishan Fu, Director of the Development Data Group at the World Bank, has said, “we are just beginning to realize the potential of the data revolution.”

These pilots have taught us that moving from discovery, to incubation, to scale requires a more coordinated and systematic approach. At the World Bank, we found it important to go beyond internal dialogue and assessments. We wanted to listen to and understand the perspectives of our partners in the development and data ecosystems—on current gaps and opportunities, as well as on the role(s) the World Bank should play in order to foster collective action.

It is with this intent that the Development Economics group at the World Bank organized a workshop this summer on scaling big data for sustainable development. The workshop brought together more than 40 leading experts from a diverse range of organizations in the development and data ecosystems, including Amazon Web Services, LinkedIn, Facebook, MIT, University of Chicago, USAID, Digital Impact Alliance, GPSDD and the National Science Foundation.

Among the organizations present was UN Global Pulse, whose director Robert Kirkpatrick called on workshop participants to consider the consequences of inaction if we do not use available data to transform public services, early warning, and crisis response. He urged the community to “reframe our conversation around risk” and avoid missed opportunities by using data to eradicate hunger, poverty, and disease.

Over the course of the day, participants shared valuable insights toward a better innovation ecosystem to scale big data for sustainable development.

Several key takeaways emerged from the session:

The WBG is a valuable partner to build, boost, and broker big data innovation for improved development outcomes. Beyond finance, workshop participants agreed that the World Bank’s domain expertise and traditional development data is invaluable to train and ground-truth big data algorithms. It can also help ensure that big data solutions are robust, fit-for-purpose and developed in conjunction with local capacity and insights.

Big data augments small data and vice versa. Ground-truthing—that is, validating solutions against traditional data sources like surveys—is critical to ensure that we are seeing the full picture. As Aubra Anthony from USAID said, it is as important to be aware of “the limitations of available data” as it is to capitalize on the “complementarity of big data and small data.”

Big Data Solutions need inclusive, human-centered, cross-disciplinary design processes. Danielle Wood from Space Enabled emphasized the need for players in the development and data ecosystems to ensure that solutions enabled by big data advance justice and reduce existing biases and barriers. USAID’s Aubra Anthony agreed, highlighting that if unchecked, machine learning algorithms can propagate bias and end up causing harm, particularly to those from underrepresented demographics.

Solutions should be problem-driven and data-driven. Stefaan Verhulst of GovLab argued that decision-makers and project teams should prioritize defining key problems before jumping ahead to acquire datasets or conduct analyses. Participants agreed that big data solutions themselves should be assessed against evidence so that the most impactful solutions can be scaled up.

Public-private data stewardship should be professionalized. GovLab’s Verhulst also stressed the need for private and public organizations to professionalize both the demand and supply sides of data (i.e. the roles and functions of data holders and users). Holly Krambeck of the World Bank offered ideas on how to reduce transaction costs relating to establishment of effective public-private partnerships and data collaboratives.

Open source, peer-to-peer knowledge exchange and learning are essential to accelerate data innovation. This will enable the integration of local insights and technical know-how that are essential for applying solutions to local contexts.

Standards and frameworks need to be modernized. Participants agreed that this is critical to address data misuse, privacy, confidentiality, bias and representation, as well as other ethical issues.

Better donor coordination and innovative business models are essential to mobilize resources and to scale innovative solutions.

Given the challenges and opportunities inherent in this space, participants agreed that the World Bank’s leadership would be essential to help align solutions to country needs, mobilize resources, and connect capabilities to advance use of big data to achieve development outcomes.

A detailed summary of the workshop is available here.

Do you have ideas on how we can advance this work? Please share them in the comments section below.


Michael M. Lokshin

Lead Economist, Office of the Regional Chief Economist, Europe and Central Asia

Trevor Monroe

Program Manager with the Analytics and Tools unit, Development Economics Data Group at the World Bank
