Published on Let's Talk Development

Wrong criticisms of Doing Business


While I welcome criticism and comments on the Doing Business (DB) report—or any other data and research product of the World Bank, for that matter—I find Justin Sandefur’s and Divyanshi Wadhwa’s (SW) recent blog posts on DB in Chile and India neither enlightening nor useful. 

In 2013, an external assessment of DB recommended, among other things, that the set of indicators be broadened to better reflect the challenges that entrepreneurs face and to make DB a more comprehensive benchmarking tool for business regulation.  In particular, indicators reflecting gender differences, reliability of electricity, and “trading across borders” were either introduced or significantly revamped.  These changes were adopted after extensive consultations with academics, country officials, and Bank staff, management, and members of its Executive Board.  Because of these significant changes to the methodology, we discourage comparisons of rankings across the year of the change, because doing so would be like comparing apples with oranges.  Even if you wanted to compare the same set of indicators across two years, the data simply don’t exist. For example, prior to 2015, DB’s trading across borders indicator set also measured the volume of documentation needed to comply with border controls. This is no longer part of the indicator set and, hence, the relevant data are no longer collected.

Chile and India
Despite this gap in the data, SW, in their blog post, use various data assumptions to calculate what Chile’s scores would have been had the methodology changes not been introduced.  They find that the scores and Chile’s rankings would have been different from those under the new methodology.  This is not surprising, since Chile’s performance on the additional indicators (as well as that of other countries) influences its score and ranking.  SW use this unsurprising finding to claim that the DB indicators “aren’t credible” because of massive movements in the numbers due to changes in methodology.  But, as mentioned above, the changes to the methodology were introduced after careful consideration and widespread consultation.  There is no reason to revisit that decision based on the fact that the changes led to a difference in one country’s ranking (relative to the no-change-in-methodology scenario).

In the post on India, SW find that the changes to India’s ranking were driven not by changes in the methodology, but by the addition of several countries to the sample.  But any ranking is a statement of how a country fares relative to the other countries in the sample.  When Usain Bolt comes first in a 100-meter race, he is first relative to the other runners in the race—not relative to a hypothetical set of competitors.  Again, this is a point we continuously emphasize in disseminating the DB rankings—especially to those countries that have undertaken many reforms but see no change in their ranking because other countries in their “ranking neighborhood” have done even more.
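The point that a ranking depends on the sample, not just on a country's own score, can be sketched with a minimal example. The scores and country names below are hypothetical, not real Doing Business data; the sketch only shows that a country's rank can change even when its own score does not, simply because new entrants join the sample.

```python
def rank(scores, country):
    """1-based rank by descending score (higher score = better rank)."""
    ordered = sorted(scores, key=scores.get, reverse=True)
    return ordered.index(country) + 1

# Hypothetical sample of four economies.
scores = {"A": 80.0, "B": 75.0, "X": 70.0, "C": 65.0}
print(rank(scores, "X"))  # 3

# Two new economies join with higher scores; X's own score is unchanged,
# yet its rank falls from 3 to 5.
scores.update({"D": 78.0, "E": 72.0})
print(rank(scores, "X"))  # 5
```

The same mechanism runs in the other direction: a country that reforms and raises its score may still see no ranking gain if its neighbors in the ranking improve by more.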

I should add that India has significantly stepped up the pace of reforms in the last four years. During the past year, India was one of only three countries in the world to carry out eight reforms in a single year. By the way, DB has added only one country (Somalia, ranked 190) to the sample during the past four years. In short, India’s DB performance is directly attributable to the reforms undertaken.

There are many important criticisms that can be made of Doing Business, but pointing out that scores and rankings change because of changes in methodology or in the sample of countries is not among them.

This was originally published at the Center for Global Development's Views from the Center Blog.


Shanta Devarajan

Teaching Professor of the Practice; Chair, International Development Concentration, Georgetown University
