

New child and adolescent mortality estimates show remarkable progress, but 17,000 children under 15 still died every day in 2017

By Emi Suzuki

This blog is based on new mortality estimates released today by the United Nations Inter-agency Group for Child Mortality Estimation (UN IGME)

There has been remarkable progress in reducing mortality among children and young adolescents over the past several decades. Between 1990 and 2017, the global under-five mortality rate dropped by 58 percent, from 93 to 39 deaths per 1,000 live births. Since 2000, the decline in under-five mortality has accelerated to an average annual reduction of 4 percent, compared with 1.9 percent between 1990 and 2000. For children aged 5-14, mortality dropped by 53 percent, from 15 to 7 deaths per 1,000 children.
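As a rough check on these figures, average annual rates of reduction in mortality reporting are typically computed as a log rate of change. The sketch below is a minimal illustration applied to the 1990 and 2017 rates quoted above; the 4 percent and 1.9 percent figures additionally depend on the 2000 estimate, which is not quoted in this excerpt.

```python
import math

def annual_rate_of_reduction(rate_start, rate_end, years):
    """Average annual rate of reduction, in percent, computed as a
    log rate of change between two mortality rates."""
    return math.log(rate_start / rate_end) / years * 100

# Under-five mortality rates quoted above (deaths per 1,000 live births)
u5mr_1990, u5mr_2017 = 93, 39

arr = annual_rate_of_reduction(u5mr_1990, u5mr_2017, 2017 - 1990)
print(f"Average annual reduction, 1990-2017: {arr:.1f}%")
# Roughly 3.2% per year over the whole period, consistent with a slower
# 1.9% pace in 1990-2000 and a faster 4% pace in 2000-2017.
```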

True Demand for Data

By Michael M. Lokshin
Headquarters of the United Nations
Photo: Yutaka Nagata (CC BY 2.0)

A snowstorm was barreling toward New York City, and the attendees at the UN Statistical Committee meeting—myself included—fully expected that all flights would be canceled. Fifty statisticians made the same calculation—to find the closest bar. I headed to the Vienna Café in the UN headquarters building, a place that affords one the rarefied opportunity to socialize with high-level government officials from around the world. On my way in, I recognized the Director-General of a statistics office from an African country and we spoke. I mentioned several statistical programs that donors were planning to finance in his country. He expressed enthusiasm about these projects but voiced an increasingly familiar note of concern about the long-term sustainability of his agency in general. He fretted that his entire statistical office would collapse without donor support. He admitted that most of the demand for data was coming from the donors themselves, as indicators for their own reporting and planning; the country’s own government had much less interest in data or statistics.

Celebrating 50 years of measuring world economies

By Edie Purdie

The ICP blog series explores ideas and issues under the International Comparison Program umbrella – including innovations in price and data collection, discussions on purpose and methodology, as well as the use of purchasing power parities in the growing world of development data. Authors from across the globe, whether ICP practitioners or researchers making use of ICP data, are encouraged to submit relevant blogs for consideration to [email protected].

A visitor to the World Bank’s atrium on May 23, 2018 would have seen a who’s who of eminent economists and statisticians congregating to celebrate the 50th anniversary of the International Comparison Program. At the “50 Years of Measuring World Economies” event, organized by the Global ICP Unit based at the World Bank in Washington, D.C., a large local and virtual audience gathered to hear the thoughts and reflections of major ICP players.

Cured Into Destitution: the risk of financial catastrophe after surgery

By Kathryn Wall

Low-income countries face the highest risk of financial catastrophe due to surgery and have made the slowest progress

Five billion people—two thirds of the world’s population—lack access to safe, timely, and affordable surgical, anesthesia, and obstetric (SAO) care, as World Bank Group President Dr. Jim Yong Kim has stated. Of the myriad barriers to accessing SAO care—safety, for example, or the lack of a well-trained workforce—one of the largest is financial. For patients, surgery can be very expensive. Not only can the financial burden of seeking surgical care be a formidable obstacle to those who need surgery, it can also have a devastating impact on those who are able to receive it. Over two billion people could not afford surgery if they needed it today, and, of those who do undergo surgery each year, an estimated 33 million face financial hardship from its direct costs—81 million when ancillary costs of care, such as transportation and food, are included.

Introducing two new dashboards in the Health, Nutrition and Population data portal

By Haruna Kashiwase

We’re pleased to launch new dashboards in the Health, Nutrition and Population (HNP) Portal, following the portal’s revamp last year. The renewed HNP portal has two main dashboards, covering Population and Health. Both are interactive data visualization tools: users can explore a range of population and health indicators through charts and maps by selecting a time period, a country or region, and the indicators of interest. We have also added new indicators, charts, and health topics such as Universal Health Coverage and Surgery and Anesthesia. Below are some examples of stories gleaned from our dashboards.

India’s population is projected to surpass that of China around 2022

In 2017, China, with 1.4 billion people, was the most populous country in the world. However, India, the second most populous country with 1.3 billion people, is projected to surpass China’s population around 2022. China’s total fertility rate (the number of children per woman) has also declined sharply since the 1970s.
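The series behind this story can also be pulled programmatically. Below is a minimal sketch, assuming the public World Bank Indicators API (api.worldbank.org/v2) and the standard indicator code SP.POP.TOTL for total population; the portal’s dashboards draw on the same underlying indicators.

```python
import json
import urllib.request

# Total population (SP.POP.TOTL) for China and India in 2017 from the
# public World Bank Indicators API; the response is [metadata, records].
URL = ("https://api.worldbank.org/v2/country/CHN;IND/indicator/SP.POP.TOTL"
       "?date=2017&format=json")

with urllib.request.urlopen(URL) as response:
    _metadata, records = json.load(response)

for row in records:
    print(f"{row['country']['value']} ({row['date']}): {row['value']:,}")
```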

Data quality in research: what if we’re watering the garden while the house is on fire?

By Michael M. Lokshin

A colleague stopped me by the elevators while I was leaving the office.

“Do you know of any paper on (some complicated adjustment) of standard errors?”

I tried to remember, but nothing came to mind – “No, why do you need it?”

“A reviewer is asking for a correction.”

I mechanically took off my glasses and started to rub my eyes – “But it will make no difference. And even if it does, wouldn’t it be trivial compared to the other errors in your data?”

“Yes, I know. But I can’t control those other errors, so I’m doing the best I can, where I can.”

This happens again and again — how many times have I been in his shoes? In my previous life as an applied microeconomist, I happily delegated control of data quality to “survey professionals” (national statistical offices or international organizations involved in data collection), without much interest in the nitty-gritty details of how those data were collected. It was only after I got directly involved in survey work that I realized the extent to which data quality is affected by myriad extrinsic factors, from the technical (survey standards, protocols, methodology) to the practical (a surprise rainstorm, buggy software, broken equipment) to the contextual (the credentials and incentives of the interviewers, proper training and piloting), and a universe of other factors that are obvious to data producers but typically hidden from data users.

Applications open for third round of funding for collaborative data innovation projects

By the World Bank Data Team
Photo Credit: The Crowd and The Cloud


The Global Partnership for Sustainable Development Data and the World Bank Development Data Group are pleased to announce that applications are now open for a third round of support for innovative collaborations for data production, dissemination, and use. This follows two previous rounds of funding awarded in 2017 and earlier in 2018.

This initiative is supported by the World Bank’s Trust Fund for Statistical Capacity Building (TFSCB) with financing from the United Kingdom’s Department for International Development (DFID), the Government of Korea and the Department of Foreign Affairs and Trade of Ireland.

Scaling local data and synergies with official statistics

The themes for this year’s call for proposals are scaling local data for impact and fostering synergies between the communities of non-official data and official statistics. The first targets innovations with an established proof of concept that benefits local decision-making; the second looks for collaborations that take advantage of the relative strengths and responsibilities of official (i.e., governmental) and non-official (e.g., private sector, civil society, social enterprise, and academic) actors in the data ecosystem.

If development data is so important, why is it chronically underfinanced?

By Michael M. Lokshin

Few will argue against the idea that data is essential for the design of effective policies. Every international development organization emphasizes the importance of data for development. Nevertheless, raising funds for data-related activities remains a major challenge for development practitioners, particularly for research on techniques for data collection and the development of methodologies to produce quality data.

If we focus on the many challenges of raising funds for microdata collected through surveys, three reasons stand out in particular: the spectrum of difficulties associated with data quality; the problem of quantifying the value of data; and the (un-fun) reality that data is an intermediate input.

Data quality

First things first – survey data quality is hard to define and even harder to measure. Every survey collects new information; it’s often prohibitively expensive to validate this information and so it’s rarely done. The quality of survey data is most often evaluated based on how closely the survey protocol was followed.

The concept of Total Survey Error sets out a universe of factors which condition the likelihood of survey errors (Weisberg 2005). These conditioning factors include, among many other things: how well the interviewers are trained; whether, and to what degree, the questionnaire was tested and piloted; and whether the interviewers’ individual profiles could affect respondents’ answers. Measuring some of these indicators precisely is effectively impossible—most are subjective by nature. It may be even harder to separate the individual effects of these components within the total survey error.
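To make the stakes concrete, here is a purely illustrative simulation (all numbers are assumptions, not estimates from any real survey): a modest systematic reporting error shifts a survey mean by far more than the sampling standard error that reviewers typically ask authors to refine.

```python
import random

random.seed(1)
n = 2_000                                        # assumed sample size
true_values = [random.gauss(100, 20) for _ in range(n)]

# Sampling error alone: the standard error of the mean
mean = sum(true_values) / n
variance = sum((x - mean) ** 2 for x in true_values) / (n - 1)
standard_error = (variance / n) ** 0.5

# Non-sampling error: assume respondents systematically under-report by 5%
reported = [x * 0.95 for x in true_values]
bias = mean - sum(reported) / n

print(f"standard error of the mean: {standard_error:.2f}")   # about 0.45
print(f"shift from 5% under-reporting: {bias:.2f}")          # about 5
# The systematic error dwarfs the sampling error, so refining the latter
# does little for overall data quality.
```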

Imagine you are approached with a proposal to conduct a cognitive analysis of your questionnaire. Take the question: “How often were you bothered by pain in the stomach over the last year?” A cognitive psychologist will tell you that this is a badly formulated question: the definition of “stomach” varies drastically among respondents; “last year” could be interpreted as the last calendar year, the 12 months preceding the interview, or the period from January 1st until now; and one respondent answered, “It hurt like hell, but it did not bother me; I am a Marine.” (from a seminar by Gordon Willis)

Beyond Proof of Concept: do we have the right structure to take disruptive technologies to production?

By Michael M. Lokshin
Figure 1: The Azure Cognitive Services algorithm compliments the authors’ youthful appearances

“Every company is a technology company”. This idea, popularized by Gartner, can be seen unfolding in every sector of the economy as firms and governments adopt increasingly sophisticated technologies to achieve their goals. The development sector is no exception, and like others, we’re learning a lot about what it takes to apply new technologies to our work at scale.

Last week we published a blog about our experience in using Machine Learning (ML) to reduce the cost of survey data collection. This exercise highlighted some of the challenges that teams working on innovative projects might face in bringing their ideas to useful implementations. In this post, we argue that:

  1. Disruptive technologies can make things look easy. The cost of experimentation, especially in the software domain, is often low. But quickly developed prototypes belie the complexity of creating robust systems that work at scale. There’s a lot more investment needed to get a prototype into production than you’d think.

  2. Organizations should monitor and invest in many proofs of concept: they can learn about their potential relatively inexpensively, quickly kill the ones that aren’t going anywhere, and identify the narrower pool of promising approaches worth continued monitoring and investment.

  3. But organizations should also recognize that the skills needed to build a proof of concept are very different from the skills needed to scale an idea to production. Without a structure or environment to support promising initiatives, even the best projects will die. And without an appetite for long-term investment, applications of disruptive technologies in international development will not reach any meaningful level of scale or usefulness.

The 2018 Atlas of Sustainable Development Goals: an all-new visual guide to data and development

By the World Bank Data Team

“The World Bank is one of the world’s largest producers of development data and research. But our responsibility does not stop with making these global public goods available; we need to make them understandable to a general audience.

When both the public and policy makers share an evidence-based view of the world, real advances in social and economic development, such as achieving the Sustainable Development Goals (SDGs), become possible.” - Shanta Devarajan

We’re pleased to release the 2018 Atlas of Sustainable Development Goals. With over 180 maps and charts, the new publication shows the progress societies are making towards the 17 SDGs.

It’s filled with annotated data visualizations, which can be reproducibly built from source code and data. You can view the SDG Atlas online, download the PDF publication (30Mb), and access the data and source code behind the figures.

This Atlas would not be possible without the efforts of statisticians and data scientists working in national and international agencies around the world. It is produced in collaboration with professionals across the World Bank’s data and research groups and our sectoral global practices.
 

Trends and analysis for the 17 SDGs
