The typical arguments for the conditionality of CCTs are usually based on paternalism (people might have incorrect beliefs about the value of education, or parents may have incomplete altruism toward their kids), externalities (the social returns to education exceed the private returns, so individuals underinvest), or political economy (it is easier to sell transfers to voters if you make them conditional). A paper by Leonardo Bursztyn and Lucas Coffman, forthcoming in the Journal of Political Economy, offers a new rationale – arguing that parents prefer CCTs to unconditional transfers because the conditioning enables them to monitor school attendance.
The authors conducted an experiment with 210 families who were already enrolled in and benefiting from a CCT program in Brazil (Bolsa Escola Vida Melhor), and who had a child aged 13 to 15 in the household. At the time of the experiment the families were receiving R$120 per month, conditional on the child attending at least 85% of classes each month.
The parents were surveyed, and the surveyor offered each parent the opportunity to switch to a new cash transfer program. There were 25 questions, each a choice between a cash transfer conditional on attendance (the current program) and an unconditional transfer, which would also be paid monthly in the same manner to the same parent. For example: would you prefer R$120 conditional on attendance, or R$125 unconditionally?
Parents were informed that 5% of participants would have one of their decisions implemented, and that that decision would be randomly chosen from the 25 questions (yes, the local government was apparently willing to change 10 families’ programs for 4 months). They were also informed that at the end of the experiment, their child would be made aware of the choices made by the parent.
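For concreteness, this elicitation works like a multiple price list with a random incentive scheme. A minimal sketch of that mechanics in Python – the list of unconditional amounts and the cutoff preference are my illustrations, not the paper's actual question values:

```python
import random

# Hypothetical choice list: 25 questions, each pitting the current R$120
# conditional transfer against an unconditional transfer of varying size.
# These amounts are illustrative, not the paper's exact list.
unconditional_amounts = list(range(100, 225, 5))  # 25 values: R$100 .. R$220

def elicit_choices(prefers_conditional):
    """prefers_conditional(u) -> True if the parent keeps the R$120 CCT
    over an unconditional transfer of R$u."""
    return [("CCT" if prefers_conditional(u) else "UCT", u)
            for u in unconditional_amounts]

def maybe_implement(choices, rng=random):
    """Random incentive scheme: with 5% probability, one randomly chosen
    decision is implemented for real; otherwise nothing happens."""
    if rng.random() < 0.05:
        return rng.choice(choices)
    return None

# Example: a parent willing to forgo up to R$35 of unconditional money to
# keep the conditionality keeps the CCT whenever u < 120 + 35 = 155.
choices = elicit_choices(lambda u: u < 155)
```

The switch point in the list then reveals the parent's (censored) willingness to pay for the conditionality.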
There were then two treatments (along with the control group of 61 families which got the above):
· Text message treatment: A treatment group of 51 families was asked, prior to the CCT vs unconditional questions, whether they would like to receive a free text message sent to their cellphone each day their child missed school (49 of the 51 took up this offer; the other 2 did not have cellphones).
· Don’t tell treatment: A treatment group of 47 families was told that their child would not be informed if the transfer program was changed (rather than being told the child would learn of their choices) – so parents could still allow the child to believe the transfers were conditional on attendance.
Parents are willing to pay to keep the conditionality: 82% of parents were willing to pay (by turning down a larger unconditional transfer amount) to keep the conditionality – average (censored) willingness to pay was R$37, which was 6% of household pre-CCT monthly income and almost one-third of the CCT amount.
Once parents were offered costless monitoring, demand for conditioning fell dramatically: Only 34% of the text message group was willing to pay for the condition.
Willingness to pay for conditioning also fell under the “don’t tell” treatment – suggesting that the willingness to pay is primarily about monitoring their child’s behavior. The effect is similar in magnitude to that of the text message treatment.
This is a clever little experiment, which suggests that in cases where parents find it hard to monitor attendance (only 7% of kids report travelling to school with their parents), CCTs might operate by providing parents with information on their child’s attendance, which is valuable to them. This channel is presumably more important for older kids, where preferences over schooling may diverge more between parent and child. The sample is pretty small in each treatment arm, so it would be interesting to see this replicated in other countries and programs.
One issue I have with this (and many other studies that do something similar) is the description of “real stakes” for a situation in which only 5% of households actually get awarded anything, and even then only one of their 25 questions counts. So the chance of any given question actually being played for real is only 1/20 x 1/25 = 1/500. Given that the gains from switching were approximately R$30 per month for 4 months, the expected value of the stakes is something like 0.05 x 120 x 0.5 = R$3 (the 0.5 arises because only half the questions asked whether you would prefer a larger CCT to a smaller unconditional transfer). So the expected value is only 2.5% of one month’s transfer, which may not qualify as “real money”. I see a similar issue at times in measuring risk aversion, where people are asked 20 choices between lotteries, and then 1 in 10 people has one of their lotteries played for real. It seems likely that some chance of the choice being real is better than zero chance, but I am not sure we would get the same results from such designs as would occur when every choice is for real money. Anyone know of a paper which tests this?
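The back-of-envelope numbers above can be checked mechanically (all figures are this post's approximations, not the paper's exact values):

```python
# Probability that any single one of the 25 questions is played for real:
p_implemented = 0.05                 # 5% of households have a decision implemented
p_question = 1 / 25                  # one of the 25 questions chosen at random
p_any_question = p_implemented * p_question   # = 1/500

# Expected value of the stakes: roughly a R$30/month gain for 4 months,
# halved because only half the questions involve the relevant comparison.
gain_total = 30 * 4                  # R$120 over the 4-month window
expected_value = p_implemented * gain_total * 0.5   # = R$3

# Express as a share of one month's R$120 transfer:
share_of_monthly_transfer = expected_value / 120    # = 0.025, i.e. 2.5%
```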