Belief bias: We’re not as logical and clear-headed as we might think!
"How fortunate for governments that the people they administer don't think."
Adolf Hitler
The human brain is an amazingly powerful tool. It provides us with the capacity to perform complex mental feats that even the most advanced computers cannot approximate.
Despite the elegance and sophistication of our brains, we are nevertheless prone to bias and gross errors in our thinking and decision-making. Furthermore, our biases are generally predictable from our beliefs, and usually well concealed from our awareness. If I am correct, you might be viewing this article with skepticism.
But putting aside our possible disagreement about your biases, here’s an argument from logic. See if you agree that it is valid: does the conclusion follow logically from the premises?
Premise 1: Democrats support free speech.
Premise 2: Dictators are not democrats.
Conclusion: Dictators do not support free speech.
What about this one?
Premise 1: Robins have feathers.
Premise 2: Chickens are not robins.
Conclusion: Chickens do not have feathers.
In the language of logic, both arguments have the same form, and both are equally invalid: the premises, even if we assume them to be true, cannot guarantee the truth of the conclusions.
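To make the shared form explicit, here is one way to write it in predicate logic (a sketch only; the predicate letters D, F, and C are placeholders chosen for this illustration, not anything from the original arguments):

```latex
% Shared form of both arguments, with placeholder predicates:
%   D(x): x is a democrat   (or: x is a robin)
%   F(x): x supports free speech   (or: x has feathers)
%   C(x): x is a dictator   (or: x is a chicken)
\begin{align*}
  &\text{Premise 1:}  && \forall x\,\bigl(D(x) \rightarrow F(x)\bigr)\\
  &\text{Premise 2:}  && \forall x\,\bigl(C(x) \rightarrow \lnot D(x)\bigr)\\
  &\text{Conclusion:} && \forall x\,\bigl(C(x) \rightarrow \lnot F(x)\bigr)
\end{align*}
% The conclusion does not follow: the premises never rule out things
% that are F without being D. Chickens have feathers without being robins.
```

The chicken version is itself the counterexample: both of its premises are true, yet its conclusion is false, which is exactly what an invalid form permits.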
If you had more difficulty spotting the faulty reasoning in the first argument than in the second, you were experiencing “belief bias.” According to this notion, our pre-existing beliefs tend to distort our ability to reason clearly. As a result, we are prepared to accept conclusions that seem consistent with our belief systems without seriously considering their meaning or merit.
The reverse is also true: We are reluctant to accept conclusions that seem contrary to our belief systems, despite the logic or reasonableness of those conclusions.
In a free society, we expect the media to provide us with fair and unbiased reporting of the events of our world. Many, however, complain that the media exhibit bias in the way they cover various issues and personalities in the news. Consider, for example, news coverage of conflicts around the world, such as the Israeli-Palestinian conflict, the activities of the United Nations, or the war in Iraq. Or, on a more local level, consider the way the media report on the activities of the Prime Minister, or the current violence in the Southern Provinces.
Do you believe that the media are biased in their reporting about the Prime Minister? If you do, whether you believe they are biased for or against him probably depends on your attitude toward the Prime Minister. If you like him and think he is a good leader, you probably believe the media are biased against him.
If, on the other hand, you believe the Prime Minister is a corrupt leader, you probably believe that the media are far easier on him than he deserves. Thus, in this example, your judgment about the performance of the media is biased by your existing beliefs about the Prime Minister.
Psychologists demonstrated this principle in a study in which they recruited groups of pro-Israeli, pro-Arab, and politically neutral students from Stanford University to view TV news coverage of the massacre of civilians in Beirut. Subjects were asked to rate the fairness and objectivity of the reporting.
Students who were pro-Arab viewed the coverage as biased in favor of Israel. Pro-Israeli students, viewing the same programs, judged that they were biased against Israel. The groups also differed in their perceptions and recollections about the program content, each recalling more negative references to their respective side than positive ones.
We can see examples of belief bias operating in routine occurrences, such as sports fans “booing” a referee’s decision they perceive as unfair, and politicians’ complaints of systematic media bias against them. It is worth noting that political commentary about people and events is often deliberately slanted as a strategy for gaining advantage; the “spin doctors” who appear following political debates come to mind. This tactic is not to be confused with belief bias.
Recognizing belief bias in others can be a step toward greater understanding. Recognizing it in oneself is likely to be more difficult. Awareness of one’s own susceptibility to this source of error in logic can be both a humbling and an enlightening experience.