Do you know those people who only ever hear what they want to hear? Who interpret everything in a way that fits their own views? Yes? Deal with them every day? Well, chances are, you’re one of them. Biased information processing is a common phenomenon. It happens when the information we receive is out of sync with what we believe to be true, or want to be true, or when the information is inconvenient for us. This obviously has huge implications for communication campaigns in development.
It turns out that when making decisions, any decisions, most people in most situations do not evaluate evidence impartially and will therefore not reach an unbiased decision. Take political media coverage as an example. If you hear a story about a politician you do not support, any information you receive about this politician, whether negative or positive, will tend to confirm your negative opinion. If you receive information about a politician you do like, any information you receive will likely confirm your positive opinion. This is called “confirmation bias.” When we receive information that runs counter to our beliefs, we tend to reinterpret it in a way that avoids the “cognitive dissonance” between the information and what we believe. Philosopher Francis Bacon said in 1620 that “the human understanding when it has once adopted an opinion … draws all things else to support and agree with it. And though there be a greater number and weight of instances to be found on the other side, yet these it either neglects and despises, or else by some distinction sets aside and rejects; in order that by this great and pernicious predetermination the authority of its former conclusions may remain inviolate.”
Confirmation bias is particularly likely to occur when we are deeply invested in our opinions and feel they are under threat. When this happens, we tend to pay attention only to arguments that favor our pre-existing opinion and to prefer evidence that supports it – in short, we only hear what we want to hear.
This has massive implications for communication campaigns in general, and behavior change campaigns in particular. Let’s assume a campaign aims to change a behavior that is deeply ingrained in culture and reflects cultural and religious beliefs. Family planning is one such behavior. Campaigns promoting the use of contraception among populations whose beliefs oppose it need to overcome strong biased information processing. Evidence of the positive effects of contraception and family planning – improved health, economic growth, women’s empowerment – may be perceived in the exact opposite way by the audience. For instance, the empowerment argument could be interpreted as evidence against family planning, since empowerment may uproot social and cultural traditions that most members of the target audience approve of.
How can we overcome the problem of biased information processing? It’s a tough one. Research indicates that biased processing and confirmation bias are most likely to occur when people feel threatened in their beliefs. Campaigns should therefore design messages that are as clear as possible, but also as non-threatening as possible, avoiding attacks on deeply held cultural values. Most of all, though, it seems that conversations and dialogue help. Research shows that when people learn to articulate not only their own viewpoints but also opinions that oppose their own, confirmation bias is less likely to occur. Communication campaigns, therefore, need to activate dialogue that includes all sides of the argument.
Information and awareness-raising alone are not sufficient to change behavior. Such campaigns are likely to fall on deaf ears, or on ears that will only hear what they want to hear. If, however, a dialogue about the campaign message can be initiated, the information is likely to be processed more carefully and possibly more successfully.