· Cyrus Samii summarizes the new Imbens and Kolesar paper on getting standard errors correct. The good news – the newly recommended way of calculating standard errors is apparently already programmed into Stata: just use the vce(hc2) option. The bad news – this doesn’t work with clustering.
· The Impact of ending rent control in Boston – Tim Taylor summarizes some new work.
· Scott Guggenheim offers thoughtful reflections on the pros and cons of understandable research on the IPA blog. “The other paper that blew my socks off was Gharad Bryan’s piece on migration in Bangladesh, which showed that investing $6 to help poor people migrate yielded $100 in returns….Undoubtedly my biggest surprise of the whole event was that I could actually understand most of what was being said…[however] one distressing realization is that for all of the technical sophistication for measuring results that was on glorious display in the room, a fair amount of unchecked error and distortion is also creeping into the field. Much is of the “not quite right” variety rather than being out and out wrong. I could see this in the several studies where I had firsthand knowledge of what the researchers actually found versus what was being reported, particularly when study findings are reported secondhand.”
· A new From Evidence to Policy note reports on an evaluation in Pakistan of a program to improve test scores in low-cost private schools, using an RD design.
· Over at the FAI blog, Elise Corwin and Tim Ogden summarize a new paper by Xavi Gine and co-authors on how micro-insurance changes farmers’ production decisions.
· HuffPost British Columbia covers an experiment taking place to reduce homelessness – a $110 million experiment being run by the Mental Health Commission of Canada. Along with the experiment, case study videos are being produced. Here is more about the experiment, including the trial registration, which has 1,000 people getting the regular program and 1,000 people getting a new Housing First program.