Thanks for the very helpful comments on our first and second posts, in which we examined the impact of economics blogs on paper downloads and professional reputation, and speculated on their influence on policy. Two common themes in your comments were that i) proving causality is an issue, and ii) there are other aspects of blog impact we should also try to look at. We’re hoping that today’s installment helps address both points. But let’s not get ahead of ourselves: we start by introducing a survey we conducted on the topic and providing some descriptive statistics from baseline, before describing our experiment.
Survey evidence – why don’t you just ask blog readers?
A common suggestion in evaluating any program or intervention is simply to ask users what they gain from it. Economists are justifiably suspicious of such responses when it comes to assessing impacts since, among other concerns, people often find it hard to answer with the right counterfactual in mind. Nonetheless, such questions can be very useful for process evaluation and qualitative insights. Certainly the likely impacts of blogs will be very different if all readers say that blogs haven’t changed their behavior in any way than if they say blogs have changed it in a number of ways.
Before launching Development Impact, we sent out a web survey to a variety of people interested in development economics – PhD and Masters students, field staff for an NGO, assistant professors, and World Bank economists. The full sampling strategy and response rates are discussed in the detailed write-up here – a big thanks to those of you who answered the survey (and shame on you, World Bank operational economists, who had the lowest response rate). The table below shows that most of the target sample read economics blogs, although most do so sporadically (only 40 percent of graduate students and 34 percent of assistant professors who read blogs do so at least a few times a week). Consistent with our first post, many readers say they have read a new economics paper as a result. There is also some encouraging evidence suggesting blogs are having policy influence – 44 percent of operational field staff and 34 percent of operational World Bank economists say they have changed their views about the effectiveness of a particular intervention as a result of reading blogs. Less common are changes in what people choose to ask in surveys or in how they plan to analyze data.
In order to measure the causal impact of blogging, we would ideally randomly assign some people to read a blog, and others to not. However, this is difficult to accomplish for existing well-known blogs, since most potential readers would have heard of the blog, and potentially sampled it to see whether they are interested in reading it or not. Therefore, we took advantage of the launch of this blog - Development Impact (DI) - at the start of April 2011, and conducted a randomized encouragement experiment.
We took the graduate students and field staff who had answered the survey above and randomly divided them into two groups (smaller sample sizes and uniformly greater awareness of the new blog were the respective reasons for excluding assistant professors and World Bank staff from the experiment). The treatment group was sent an email five days after our blog started, telling them about the new blog and inviting them to “check it out.” A follow-up email, sent three weeks after launch, asked how they thought the blog was doing so far and encouraged them to check it out if they hadn’t already.
We then conducted a follow-up survey in June 2011. The encouragement was a success: 18 percent of the control group had read DI in the past month, compared to 28 percent of the treatment group. Not surprisingly for this blog, the encouragement worked best among individuals who stated at baseline a wish to become academic researchers, and among males (female readers, why did you spurn our advances?).
As a result of this encouragement, we have some individuals who only read the blog because they were encouraged to do so and comparable individuals in the control group who did not. This allows us to estimate the local average treatment effect (LATE) – the impact of reading the blog for individuals who only did so because they were encouraged. This group consists of about half the male and more than half of the research-focused individuals in our sample, so it is a non-trivial group. Moreover, this is potentially the parameter of interest for answering the pertinent question of whether blogs should attempt outreach exercises to get more readers.
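The mechanics of the LATE in an encouragement design like ours come down to the classic Wald/IV ratio: the intent-to-treat effect of the encouragement on the outcome, divided by the encouragement's effect on readership. A minimal sketch on simulated data (the numbers below are illustrative, chosen only to roughly match the 18 vs. 28 percent readership rates reported above; they are not our survey data, and the true complier effect here is set to 0.5 by construction):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000  # large simulated sample so the estimate is stable

# Z = 1 if sent the encouragement email, 0 otherwise (randomly assigned)
z = rng.integers(0, 2, n)

# D = 1 if the person read DI; rates roughly matching the post
# (18 percent in control, 28 percent under encouragement)
d = (rng.random(n) < np.where(z == 1, 0.28, 0.18)).astype(int)

# Y: some outcome where reading the blog shifts it by 0.5 (by construction)
y = 0.5 * d + rng.normal(0.0, 1.0, n)

# Intent-to-treat effect: difference in mean outcomes by encouragement status
itt = y[z == 1].mean() - y[z == 0].mean()

# First stage: how much the encouragement raised readership (about 0.10)
first_stage = d[z == 1].mean() - d[z == 0].mean()

# Wald / IV estimate of the LATE for compliers
late = itt / first_stage
```

In this toy setup readership is random given Z, so a naive comparison would work too; the point is only to show how the small first stage (about 10 percentage points) scales the ITT up into the complier effect.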
Nevertheless, if the marginal readers, who only read the blog because of the encouragement, are those who find it less interesting or read it less intensively than those who read it of their own accord, then the average impact of reading the blog may differ from the LATE. We therefore also employ bias-adjusted nearest-neighbor matching to estimate the average treatment effect (ATE). (We have a wealth of information on relevant baseline characteristics, which are described in the full write-up here. We will make the data and dofiles used for our analysis available in a couple of weeks along with the final working paper.)
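To make the matching step concrete, here is a stripped-down sketch of bias-adjusted nearest-neighbor matching in the Abadie–Imbens spirit: match each reader to the closest non-reader on baseline covariates, then correct each matched difference for the remaining covariate gap using a regression fit on the controls. The data are simulated (two covariates, a true effect of 1.0 built in); this is an illustration of the estimator's logic, not our actual specification, which matches on many more baseline characteristics and computes the ATE rather than just the treated-side effect shown here:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 500

# Illustrative data: X = baseline covariates, D = blog readership
# (which depends on X, creating selection), Y = outcome, true effect = 1.0
x = rng.normal(0.0, 1.0, (n, 2))
p_read = 1.0 / (1.0 + np.exp(-x[:, 0]))
d = (rng.random(n) < p_read).astype(int)
y = 1.0 * d + x @ np.array([0.8, -0.5]) + rng.normal(0.0, 0.3, n)

treated = np.flatnonzero(d == 1)
control = np.flatnonzero(d == 0)

# Regression of Y on X among controls, used for the bias adjustment
x0 = np.column_stack([np.ones(control.size), x[control]])
beta = np.linalg.lstsq(x0, y[control], rcond=None)[0]

def mu0(xi):
    """Predicted untreated outcome at covariates xi."""
    return beta[0] + xi @ beta[1:]

# For each reader: nearest non-reader in covariate space, with the matched
# difference corrected for the covariate gap between unit and match
effects = []
for i in treated:
    j = control[np.argmin(((x[control] - x[i]) ** 2).sum(axis=1))]
    effects.append(y[i] - y[j] - (mu0(x[i]) - mu0(x[j])))

att = float(np.mean(effects))  # effect on the treated, here close to 1.0
```

Without the `mu0` correction the raw matched differences would be biased upward here, because readership is concentrated among high-`x[:, 0]` units whose untreated outcomes are already higher.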
Impacts on institutional reputation
Successful blogs are often argued to improve the reputation of the individuals and institutions producing these blogs, as suggested in our second post. But as was pointed out in the comments, it is hard to assess causality. The experiment allows us to examine this issue causally – at least regarding the impact of reading Development Impact:
· Reading DI increases interest in working as a researcher for the World Bank among academically-focused individuals, at the expense of their interest in working at a Liberal Arts school.
· Reading DI increases perceptions of the quality of research produced at the World Bank across the board – and also seems to have spillover effects on perceptions of research quality at the IMF, as well as at the top research universities most strongly associated with rigorous impact evaluations.
· Reading DI reduces the perception that World Bank staff face censorship when they blog among academically focused individuals.
· Reading DI increases name recognition of our fellow bloggers Jed and Markus (neither of whom were aware that they were part of our experiment).
Impacts on knowledge and attitudes
The mere existence of the blog and a casual reading of posts may be sufficient to produce this level of change. In Table 6 (below), we look for changes in knowledge and attitudes that might only be expected to occur from more in-depth reading of the posts on DI or of the papers linked within. (Apologies for the poor appearance of Table 6 – you can see all the tables clearly in the full write-up). To measure knowledge, we asked detailed questions related to six blog posts that had appeared on DI. These questions proved difficult for the respondents in our sample, with the mean number of correct answers in the control group equal to only 0.91 out of a maximum score of 6.
The experimental impacts estimated on the full sample and on the sub-groups vary in sign and are not significant. However, the matching estimate is positive, large relative to the mean, and significant at the 1 percent level. Two possible interpretations for this difference between the ATE and the experimental intent-to-treat (ITT) and treatment-on-the-treated (TOT) estimates suggest themselves. The first is that the matching estimate might just indicate positive selection on knowledge into blog readership. However, remember that we are matching on a large number of characteristics that might well proxy for knowledge, including the number of recent research papers (out of 12) respondents had read at baseline. A second explanation is therefore that reading the blog impacts knowledge for the average reader, but not for the marginal reader who only reads because of encouragement. This is plausible since the readers who would read the blog regardless of whether they are encouraged might be the ones most likely to read the posts closely and learn from them.
Finally, we examine whether blog readership affects attitudes towards different methodologies. The bottom of Table 6 shows that blog readership has not changed many of these attitudes, with no significant experimental changes in the full sample. Amongst the subsamples, the most significant change occurs for males, where there is an increase in the proportion who believe that it is difficult to succeed on the job market as a development economist without a randomized experiment. The first two months of postings focused heavily on experimental studies, which may have led to this impression, although interestingly the ATE estimated through matching is negative and marginally significant. There is also some evidence among the research-focused subsample that more agree with the statement that external validity is no more of a concern in experiments than in most non-experimental studies (something discussed in David’s favorite rant). Nevertheless, given the number of outcomes tested here, only the change for males would remain significant once p-values are multiplied by eight to account for examining impacts on eight different attitudes.
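Multiplying p-values by the number of tests is the Bonferroni adjustment, and it is simple enough to sketch in a few lines. The raw p-values below are hypothetical, not those in Table 6; the point is only to show how the correction shrinks the set of significant results when eight outcomes are tested:

```python
# Bonferroni adjustment for eight attitude outcomes: multiply each raw
# p-value by the number of tests, capping the result at 1.
raw_p = [0.004, 0.03, 0.08, 0.15, 0.22, 0.41, 0.55, 0.71]  # hypothetical

adjusted = [min(p * len(raw_p), 1.0) for p in raw_p]

# Only results still below 0.05 after adjustment survive; here a raw
# p-value of 0.03 becomes 0.24 and drops out, while 0.004 becomes 0.032.
survivors = [p for p in adjusted if p < 0.05]
```

This mirrors the pattern in the text: an effect that looks comfortably significant on its own can fail to survive once the adjustment accounts for having searched across eight attitudes.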
Using a variety of data sources and empirical techniques, we feel we have provided quantitative evidence that economics blogs are doing more than just providing a new source of procrastination for writers and readers. To our knowledge, these findings are the first quantitative evidence to show that blogs are having some impacts. There are large impacts on dissemination of research; significant benefits in terms of the bloggers becoming better known and more respected within the profession; positive spillover effects for the bloggers’ institutions; and some evidence from our experiment that they may influence attitudes and knowledge among their readers. Blogs potentially have many impacts, and we are only measuring some of them, but the evidence we have suggests economics blogs are playing an important role in the profession.
We welcome your comments and suggestions as usual. Over the next 2-3 weeks, we will do a final set of revisions to the paper, taking your comments and suggestions into consideration. We hope to post the working paper here on Development Impact before Labor Day.