The State of Development Journals 2019: Quality, Acceptance Rates, Review Times, and Open Science



This is the third year in which I have attempted to put together data on development journals that is not otherwise publicly available or easy to access (see 2017, 2018). Thanks again to all the journal editors and editorial staff who graciously shared statistics with me. 
  1. Is this a good quality, high visibility journal to publish my work?
The best-known metric of journal quality is its impact factor. The standard impact factor is the mean number of citations received in the past year by papers the journal published in the previous 2 years, while the 5-year version is the mean number of cites received in the past year by papers published in the previous 5 years. Note that the impact factors of several journals get pulled down by the inclusion of non-refereed papers and proceedings supplements (e.g. WBER). I complement these statistics with RePEc's journal rankings, which take into account article downloads and abstract views in addition to citations.
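To make the arithmetic behind this definition concrete, here is a minimal sketch; the journal and citation counts are purely hypothetical, not taken from any table in this post:

```python
# Standard 2-year impact factor for year Y: citations received in year Y
# to items the journal published in years Y-1 and Y-2, divided by the
# number of citable items published in that 2-year window.

def impact_factor(citations_in_year, items_published):
    """citations_in_year: total cites in year Y to the window's papers;
    items_published: count of citable items published in the window."""
    return citations_in_year / items_published

# Hypothetical journal: 80 papers published over 2016-2017, which
# together received 176 citations in 2018.
print(impact_factor(176, 80))  # 2.2
```

Note how including non-refereed proceedings items in the denominator, without their attracting many cites, mechanically drags the ratio down.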

The first point to note here is that if the mean impact factors are only 1 to 3, and we expect citations to have a long right tail, this would suggest that lots of articles published in these journals are cited only once or not at all! Luckily things are not that bleak – it is just that impact factors only capture citations in published works, which, given the long time lags for publication in economics, makes them (to my mind) almost completely meaningless. I therefore took the 2017 issues of each journal and looked up (in March 2019) the Google Scholar citations of each paper published. Figure 1 gives a boxplot of the data, sorted by median citation rate, and compared to the development papers in the AER and AEJ Applied in 2017 as benchmarks (I took a random sample of half the issues of World Development (WD) and the Journal of Development Studies (JDS), given how many articles they publish).

Figure 1: Boxplot of citations as of March 2019 of articles published in 2017

This figure presents a much more optimistic view of research getting cited. The median AER development paper published in 2017 has accumulated 66 cites, and the median development paper in the AEJ Applied has accumulated 34 cites. WBRO publishes a small number of review-type pieces, which get more cites on average than other development papers. But then the median citations are very similar, in the 9 to 11 range, for a group of four development journals (JDE, WD, WBER, and EDCC) – which would suggest a rough conversion rate of one top-5 paper to two AEJ Applied papers, or to six top field journal papers, in terms of citations. JDS and Economia have medians of 5 to 6 cites, and then the remaining journals have medians of 2 to 3 cites. Some key points to note are:
  • Zero citations is rare: only 25 out of the 517 papers considered in Figure 1 have 0 cites
  • There is massive heterogeneity, with the top tail of papers in many development journals having more citations than papers in top general journals: don’t judge the paper by the journal
  • As noted last year, these comparisons are complicated further by differences in the ages of the papers by the time they finally get published at different journals: this depends on how many other journals authors try first, how efficient journals are in processing papers, how long it takes authors to make revisions, how many rounds of revisions journals require, and then the backlog between acceptance and publication. 
2. What are the chances of a paper getting accepted?
Card and DellaVigna document that top-5 journals have seen tremendous growth in the number of submissions over time. While not all development journals were able to provide me with long time series of submissions data, Figure 2 shows more than a doubling in submissions over the last decade: WBER went from 179 submissions in 2008 to 493 in 2018; the Journal of African Economies from 210 in 2008 to 420 in 2018; and the Journal of Development Studies from 515 in 2008 (up from 289 in 2004) to 1,255 submissions in 2018. Even over shorter periods the growth is striking: World Development's submissions increased by over 1,000 in the five years from 2014 (1,720 submissions) to 2018.

Figure 2: Trends in Number of Submissions to Development Journals over Time

(note: EDCC, WBER and J African Econ left axis, JDS and WD right axis).

Table 2 then shows the number of submissions, the number of papers published each year (excluding online supplements and proceedings), and the acceptance rate. Acceptance rates can be complicated to calculate because articles are submitted in one year and accepted in another, so I use what each journal office reports to me. You could also just take the number published over the average of the number of submissions in the last two years and get similar figures. My three takeaways from Table 2 are:
  • Three journals have expanded the number of papers they publish: WBER, which moved from 21 papers in 2016 (and 2015) to 35-36 papers in each of the last two years; World Development, which has dramatically increased the number of papers from 183 in 2016 to 335 in 2018; and the JDE, which published 112 papers in 2018 compared to 80 or fewer in each of the previous two years. These expansions in the number of papers published have helped to keep acceptance rates stable (or even rising) at these journals over the last three years as submissions have grown.
  • Acceptance rates of 5 to 7 percent at some of the development journals are similar to the acceptance rates at the AER and Econometrica (see Card and DellaVigna). Given Figure 2, and the fact that most journals have not increased space that much, acceptance rates in many cases will be half of what they were a decade ago.
  • The arrival of new journals like the Journal of Development Effectiveness, IZA Journal of Development and Migration, and Development Engineering has opened up more opportunities for publishing good research that doesn't face as high a rejection rate.
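The back-of-the-envelope acceptance-rate approximation mentioned above (papers published divided by average submissions over the last two years) can be sketched as follows; the numbers are illustrative, not taken from Table 2:

```python
# Rough acceptance-rate approximation: papers published in year Y over
# the average number of submissions in years Y-1 and Y. Averaging two
# years roughly accounts for papers submitted in one year but accepted
# in the next.

def approx_acceptance_rate(published, submissions_last_two_years):
    avg_submissions = sum(submissions_last_two_years) / 2
    return published / avg_submissions

# Hypothetical journal: 36 papers published, with 450 and 493
# submissions over the last two years.
rate = approx_acceptance_rate(36, (450, 493))
print(f"{rate:.1%}")  # 7.6%
```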

3. How long does the review process take?

A new interactive website provides desk rejection rates and expected decision times for a number of general journals in economics. Table 3 provides the information I have been able to gather for development journals. The first column shows the desk rejection rate, which is above 50 percent at most journals, and up to 77 percent. I then use the acceptance rates from Table 2 together with these desk rejection rates to give the mean chance of acceptance conditional on the paper going to referees. These are a bit more encouraging than the rates above – and should give some guidance to your over-zealous referee 2s who want to reject everything: if editors are doing a good job of desk rejecting, then at least one-third of the papers sent to referees should be good candidates for acceptance (eventually, after the authors have written the paper the way you would have done, cited several of your papers, and exerted enough blood, sweat and tears to have earned it, of course...).
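The conditional-acceptance calculation is just the overall acceptance rate divided by the share of papers surviving the desk; a minimal sketch, with rates chosen for illustration rather than drawn from Table 3:

```python
# Chance of acceptance conditional on going to referees:
# P(accept | refereed) = P(accept) / P(refereed)
#                      = acceptance_rate / (1 - desk_reject_rate)

def conditional_acceptance(acceptance_rate, desk_reject_rate):
    return acceptance_rate / (1 - desk_reject_rate)

# Hypothetical journal: 10% overall acceptance, 70% desk rejection,
# so roughly one in three refereed papers is ultimately accepted.
print(round(conditional_acceptance(0.10, 0.70), 3))  # 0.333
```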

The remaining columns then report the time taken for decisions. I asked for decision times both unconditional on going to referees (which capture lots of quick desk rejects) and conditional on going to referees. However, not all journals are able to split their data this way, and I caution that sometimes it is unclear whether the last two columns are conditional on being refereed or also include all the desk rejects. The Journal of African Economies gave me both numbers – and you can see it makes a huge difference for the 3-month horizon (only 23% of papers that went to referees were decided in this timeframe, versus 82% of all submissions).

My takeaway from this is that, conditional on going to referees, 60-70 days seems about the average. Add to this the time taken for the editors to review the paper and decide whether to send it to referees, and for referees to agree to the assignments, and you are at about 3 months. The big concern for authors is probably the chance of ending up in the right tail – so journals where there is a 10% or more chance of a paper taking more than 6 months to referee will hopefully be the ones looking to improve here. Of course this also depends heavily on the referees doing their part – so don't procrastinate too much on those reports!

Comparing these results to last year, the most improved award goes to Economia-Lacea, which greatly increased its desk rejection rate (from 28% to 49%) and lowered its mean decision time conditional on going to referees from 148 days to 82 days. Two other strong improvers were the IZA Journal of Development and Migration, which lowered the mean time conditional on going to referees by 20 days, and EDCC, which lowered it by 34 days. Even more pleasing, no journals got slower.

4. Open Science
I thought for a last topic this year, I would look at what journals are doing in terms of making research openly accessible, and in terms of data transparency.

Open access: given that research in economics comes out as publicly accessible working papers well before the final journal versions are published, and that many authors put the final versions on their webpages, I think this is less of an issue in economics than in other fields (Berk's caveats about working papers aside). Nevertheless, gated articles are still a concern for many, and a number of donors are pushing for researchers to publish open access. All of the development journals listed above offer this as an option except Economia, and at the IZA Journal of Development and Migration paying the fee is mandatory if your paper is accepted. The fee for open access publication is typically $2,500 to $3,000, but a number of the journals have cheaper rates for authors from developing countries. So if you are putting together a grant proposal, this might be one additional item to include.

Finally, an area that is rapidly evolving is whether journals require authors to post data and replication materials for empirical work. Many journals have relatively new policies in this area, and these range from requiring online posting of the data, to encouraging (but not requiring) authors to post materials (e.g. the EDCC policy), to not currently requiring data to be posted. One trend at the AEA journals has been for more and more empirical papers to get exceptions to data availability policies because of the use of confidential administrative data such as tax data. None of the journals had statistics on exemptions to their data-sharing policies, but they note that the policies have been in place for relatively short periods and they have not seen many requests for exceptions yet.



David McKenzie

Lead Economist, Development Research Group, World Bank

Simon Batterbury
April 02, 2019

I think you are using Web of Science data as a proxy for impact? Scopus CiteScore, despite being produced by Elsevier, is way more accurate. It contains a wider range of reputable journals, citations are those found in a broader range of outlets, and it is all publicly available. The metric is calculated in a very similar way to WoS, using citation analysis of recent papers. Under its 'Social Science, Development' category we get the figures below. World Development is a key player, but look at the broader range.
Secondly, the journals on your list all publish a high percentage of development economics papers. Development studies is much broader than that, and as a former lecturer in development studies at the LSE, I would say it is not the lead subfield [and where it is the lead, as in the WB, look what happens...]. Much of DE's data comes from datasets and standardised surveys, and the use of statistics and econometrics in its articles is not comprehensible except to those with the relevant numeracy or training. DS is actually a very heterogeneous field, and of particular note are environmental, sociological and political studies based on deeper engagement and fieldwork than many economists feel able to do.
Thirdly, open access should be the norm given the topics in DS, and where it is not, it is largely due to publisher resistance – they simply make more money from subscription journals. I should know because I attend Board meetings of a commercial journal on your list. APCs for OA in commercial journals are criminally high in several cases [not as bad as journals like Cell at just under $6,000, but companies are still charging what they can get away with, or what the 'market' will bear – $2,000-3,000 in DS]. Fortunately a lot of this could change with Plan S, the big pan-European initiative to force OA publishing and break the grip of commercial publishing that is crippling our library budgets and restricting access. I have a listing where, if you search on 'development', a few reputable and cheap or free alternatives emerge…
Tourism Management 6.9
Land Degradation and Development 6.59
Corporate Social Responsibility and Environmental Management 5.49
Food Policy 4.53
World Development 3.92
Annals of Tourism Research 3.68
Policy Sciences 3.68
Cities 3.61
International Journal of Urban and Regional Research 3.23
Journal of Rural Studies 3.14
Population and Development Review 3.13
Social Neuroscience 3.06
World Bank Research Observer 2.92
Sustainable Development 2.91
Transportation 2.85
Food Security 2.66 [my journal, J Political Ecology, also sits here with many dev papers]
Journal of Development Economics 2.61
New Political Economy 2.54
China Quarterly 2.53
Journal of Development Studies 1.66

David McKenzie
April 02, 2019

Thanks Simon. Indeed, because of differences in publishing and citation practices, impact factors are even more of a mess when comparing across fields - so I think they are particularly nonsensical when comparing political science to sociology to economics to demography papers in development. But I appreciate your broader list of journals and alternative source for impact factors for our readers to also consider.

Ajay Mahal
December 02, 2019

Dear David

Thanks very much for this information. One issue, though, emerges just by looking at the two mean times to first decision (conditional and unconditional on going to referees) and the desk reject rates. I can see that, excepting the JDE, WBER and WBRO, the three numbers are mathematically inconsistent for the other journals. I suspect some of the journals are reporting desk reject times as the unconditional statistic. May I suggest that you ask for the mean time to desk reject (instead of the unconditional number) for your next round of comparisons! In any event, this is a great service to the profession.
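[The consistency check being described: the unconditional mean decision time should be the weighted average of the mean desk-reject time and the mean time conditional on going to referees. A sketch with purely hypothetical numbers:]

```python
# Unconditional mean decision time implied by the two conditional means:
# E[time] = P(desk reject) * E[time | desk reject]
#         + P(refereed)    * E[time | refereed]

def implied_unconditional(desk_reject_rate, desk_time, refereed_time):
    return desk_reject_rate * desk_time + (1 - desk_reject_rate) * refereed_time

# Hypothetical journal: 60% desk rejected in 10 days on average, and
# refereed papers decided in 90 days on average.
print(round(implied_unconditional(0.60, 10, 90), 1))  # 42.0
```

If a journal's reported unconditional mean differs substantially from this weighted average, the three reported numbers cannot all be right.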


Bhanoji Rao
November 17, 2020

Thanks for the blog. An association of journal editors should be formed (if one does not already exist) to explore some standardization of how much time a referee should take, etc.