Happy new year. I imagine many of our readers are at the annual meetings of the American Economic Association today. Good luck to all the job-seekers, and for those not there, looking through the program is always a good way to see a broad overview of current research. Some links that caught my attention over the break are:
- Fiona Burlig blogs on her new paper about how to do more accurate power calculations for experiments that use panel data (more T). There is apparently also Stata code, but I haven't yet been able to download it and play around with it.
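For readers who want the flavor of what a panel power calculation involves, here is a minimal simulation-based sketch. To be clear, this is a generic illustration, not the method from Burlig's paper or her Stata code: it simulates a difference-in-differences experiment with unit fixed effects and AR(1) errors (all parameter values below are illustrative assumptions), and reports the share of simulated experiments in which the treatment effect is detected.

```python
# Simulation-based power calculation for a panel (diff-in-diff) experiment.
# A generic sketch with assumed parameters, not Burlig et al.'s method.
import numpy as np

def simulate_power(n_units=50, n_periods=6, effect=0.2, rho=0.5,
                   sigma=1.0, n_sims=500, alpha_crit=1.96, seed=0):
    """Share of simulations in which the treatment effect is detected."""
    rng = np.random.default_rng(seed)
    rejections = 0
    post = np.arange(n_periods) >= n_periods // 2  # treatment turns on mid-panel
    for _ in range(n_sims):
        treated = rng.permutation(n_units) < n_units // 2  # random assignment
        # unit fixed effects plus AR(1) errors within each unit
        fe = rng.normal(0.0, 1.0, n_units)
        eps = rng.normal(0.0, sigma, (n_units, n_periods))
        for t in range(1, n_periods):
            eps[:, t] = rho * eps[:, t - 1] + np.sqrt(1 - rho**2) * eps[:, t]
        y = fe[:, None] + eps + effect * np.outer(treated, post)
        # Collapse to one observation per unit (post mean minus pre mean),
        # so the t-test below respects within-unit serial correlation.
        d = y[:, post].mean(axis=1) - y[:, ~post].mean(axis=1)
        diff = d[treated].mean() - d[~treated].mean()
        se = np.sqrt(d[treated].var(ddof=1) / treated.sum()
                     + d[~treated].var(ddof=1) / (~treated).sum())
        if abs(diff / se) > alpha_crit:  # two-sided test at ~5%
            rejections += 1
    return rejections / n_sims

power = simulate_power()
print(power)
```

The point such simulations make concrete is the one the "more T" shorthand gestures at: with serially correlated errors, adding periods buys much less power than adding units, and ignoring that correlation overstates your power.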
- In time for those on the job market, the CSWEP newsletter has advice from a number of economists on how to handle the dual-career search process, offering lots of different perspectives. One piece discusses how ambiguity aversion means it can be helpful to reveal your status, whatever it is. The majority seem to suggest disclosing this information around the time you are invited for a flyout.
- Dan Hamermesh on whether the market for economic research has already created good substitutes for replication: "The majority of articles in those journals are… essentially ignored, so that the failure to replicate them is unimportant". But looking at the most heavily cited papers in labor economics, he finds they are being replicated in the sense of testing whether the ideas work in different settings or time periods: "research which the community of scholars implicitly deems important is replicated, including both on other data and to a lesser extent on the data that are now typically required to be deposited with the journal. The more important type of replication is not like that of "hard-scientific" research, but rather in the only sensible way for a social science—by testing the fundamental idea or construct in a different social context. Important mistakes do get caught, and important ideas initially tested on only one set of data must survive tests on other data."
- At the same time as Hamermesh is saying this, Christensen and Miguel have a 94-page NBER working paper on transparency, reproducibility, and the credibility of economics research: "Doucouliagos and Stanley (2013) carry out a meta-meta-analysis of 87 meta-analysis papers … and find that over half of the literatures suffer from "substantial" or "severe" publication bias, with particularly large degrees of bias in empirical macroeconomics and in empirical research based on demand theory."