Confirmation bias
In psychology and cognitive science, confirmation bias is the tendency to search for or interpret new information in a way that confirms one's preconceptions, and to avoid information and interpretations that contradict prior beliefs. It is a type of cognitive bias and represents an error of inductive inference, or a form of selection bias toward confirmation of the hypothesis under study or disconfirmation of an alternative hypothesis.
Confirmation bias is of interest in the teaching of critical thinking, as the skill is misused if rigorous critical scrutiny is applied only to evidence challenging a preconceived idea but not to evidence supporting it.[1]
Naming
The subject of confirmation bias overlaps with or is closely related to a number of similar concepts and syndromes, including belief bias, belief preservation, biased assimilation, belief overkill, hypothesis locking, polarization effect, positive bias, the Tolstoy syndrome, selective thinking, myside bias, law of fives, Plate pick-up, and Morton's demon.
Similarly, Murphy's Law of Research states that "Enough research will tend to support your theory."
Within a single experiment, confirmation bias on the part of the experimenter may exhibit itself as expectation bias in the final published results. Data agreeing with the experimenter's expectations may be more likely to be considered "good", while data that conflicts with those expectations may be more likely to be discarded as the product of assumed experimental error.
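As an illustration, consider the following minimal sketch (hypothetical values and threshold, not drawn from any particular study): discarding measurements that conflict with an expected value pulls the reported result toward the expectation.

```python
import random

random.seed(0)
true_value = 10.0
measurements = [random.gauss(true_value, 2.0) for _ in range(1000)]

expected_value = 9.0  # the experimenter's (incorrect) expectation

# Keep only the "good" data: points close to the expectation.
# Everything else is written off as presumed experimental error.
kept = [m for m in measurements if abs(m - expected_value) < 2.0]

print(sum(measurements) / len(measurements))  # close to 10.0, the true value
print(sum(kept) / len(kept))                  # pulled toward 9.0, the expectation
```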
Overview
Among the first to investigate this phenomenon was Peter Cathcart Wason (1960), whose 2-4-6 problem presented subjects with three numbers (a triple):
2 4 6
Subjects were told that the triple conforms to a particular rule. They were then asked to discover the rule by generating their own triples and using the feedback they received from the experimenter. Every time the subject generated a triple, the experimenter would indicate whether the triple conformed to the rule. The subjects were told that once they were sure of the correctness of their hypothesized rule, they should announce the rule.
While the actual rule was simply "any ascending sequence", the subjects often had great difficulty inducing it, frequently announcing rules far more complex than the correct one. The subjects seemed to test only "positive" examples: triples they believed would conform to their rule and thus confirm their hypothesis. They did not attempt to challenge or falsify their hypotheses by testing triples they believed would not conform. For example, a subject who thought the rule was "each number is two greater than its predecessor" would test "4, 6, 8" and "11, 13, 15" but not "4, 7, 8" or "9, 15, 19". Wason referred to this phenomenon as confirmation bias, whereby subjects systematically seek only evidence that confirms their hypotheses; he appealed to the same explanation for performance on his selection task (Wason 1968), though he briefly considered that participants might be using a three-valued rather than a two-valued logic. Confirmation bias has been used to explain why people believe in the paranormal.[2]
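The positive-test pattern can be sketched in a few lines of code. This is a hypothetical illustration rather than Wason's actual materials; the hypothesised rule below is the one from the example above.

```python
def true_rule(triple):
    """The experimenter's actual rule: any strictly ascending sequence."""
    a, b, c = triple
    return a < b < c

def hypothesised_rule(triple):
    """The subject's conjecture: each number is two greater than its predecessor."""
    a, b, c = triple
    return b - a == 2 and c - b == 2

# Positive tests: triples the subject expects to conform.
# Every one earns a "yes", so the too-narrow hypothesis is never challenged.
for t in [(4, 6, 8), (11, 13, 15), (20, 22, 24)]:
    print(t, "expected:", hypothesised_rule(t), "actual:", true_rule(t))

# Negative tests: triples the subject expects to fail.
# A "yes" here would falsify the hypothesis immediately; subjects rarely try them.
for t in [(4, 7, 8), (9, 15, 19)]:
    print(t, "expected:", hypothesised_rule(t), "actual:", true_rule(t))
```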
Evans experiment
In a series of experiments by Evans et al., subjects were presented with deductive arguments (in each of which a series of premises and a conclusion were given) and asked to indicate whether each conclusion necessarily followed from the premises. In other words, the subjects were asked to evaluate logical validity. They nevertheless exhibited confirmation bias: they rejected valid arguments with unbelievable conclusions (for example, "All mammals can walk; whales are mammals; therefore whales can walk") and endorsed invalid arguments with believable conclusions. Instead of following directions and assessing logical validity, the subjects appeared to base their assessments on personal beliefs.[3]
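The structure of the task can be summarized in a small sketch (hypothetical code, not the materials used by Evans et al.): crossing validity with believability yields two "conflict" cells, and a responder who answers from belief rather than logic errs exactly there.

```python
# (validity, believability) for the four argument types in the design
arguments = {
    "valid & believable":     (True,  True),
    "valid & unbelievable":   (True,  False),   # conflict: tends to be rejected
    "invalid & believable":   (False, True),    # conflict: tends to be endorsed
    "invalid & unbelievable": (False, False),
}

for name, (valid, believable) in arguments.items():
    logic_answer = valid          # normative response: endorse only if valid
    belief_answer = believable    # belief-based response: endorse only if believable
    conflict = " <- conflict" if logic_answer != belief_answer else ""
    print(f"{name:24s} logic: {logic_answer!s:5s} belief: {belief_answer!s:5s}{conflict}")
```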
It has been argued that, as with the matching bias, using more realistic content in syllogisms can facilitate more normative performance, whereas more abstract, artificial content has a biasing effect on performance.[citation needed]
Reasons for effect
There are several possible reasons why beliefs persevere despite contrary evidence: embarrassment over having to withdraw a publicly declared belief, for example, or simple stubbornness or hope. Tradition, superstition, religion, worldview, or ideology can also lead a believer to give greater weight to some data than to others.
One explanation may lie in the workings of the human sensory system. Human brains and senses are organised so as to facilitate rapid evaluation of social situations and of others' states of mind. Studies have shown that this behaviour is evident in the choosing of friends and partners[4] and houses,[5] even though it is largely subconscious. Although forming an initial impression can be a very fast process,[6] that impression has a lasting effect, a byproduct of the brain's tendency to fill in gaps in what it perceives and of the believer's unwillingness to admit a mistake.
Polarization effect
Polarization occurs when mixed or neutral evidence is used to bolster an already established and clearly biased point of view. As a result, people on both sides can move farther apart, or polarize, when they are presented with the same mixed evidence.
In 1979, Lord, Ross, and Lepper conducted an experiment to explore what would happen when subjects holding divergent opinions were presented with the same body of mixed evidence. They hypothesized that each opposing group would use the same pieces of evidence to further support its own opinion. The subjects were 24 proponents and 24 opponents of the death penalty. They were given an article about the effectiveness of capital punishment and asked to evaluate it. The subjects were then given a detailed description of the study they had just read, this time including the procedure, prominent criticisms, and the results shown in a table or graph, and were asked to evaluate the study again, stating how well it was conducted and how convincing the evidence was overall.
The results were congruent with the hypothesis. Subjects judged studies that supported their pre-existing view to be superior to those that contradicted it, in a number of detailed and specific ways. In fact, the studies all described the same experimental procedure, with only the purported result changed.[7]
Overall, there was a visible increase in attitude polarization. Proponents and opponents alike reported shifting their attitudes slightly in the direction of the first study they read, but once subjects read the more detailed description, they returned to their original beliefs regardless of the evidence provided, pointing to the details that supported their viewpoint and disregarding anything contrary.
It is not that the subjects deliberately set out to view the evidence in a biased manner; rather, because they already held such strong opinions about capital punishment, their reading of the evidence was colored by their point of view. Looking at the same piece of evidence, an opponent and a proponent would each argue that it supported their own position, pushing the contrary opinions even further into their opposing corners.
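A toy model makes the mechanism concrete. This is an illustrative sketch under assumed weights, not the model used by Lord, Ross, and Lepper: if each reader discounts evidence that conflicts with their prior position, the same mixed evidence drives the two readers further apart.

```python
mixed_evidence = [+1, -1, +1, -1]   # equal amounts of supporting and opposing evidence

def update(attitude, evidence, discount=0.8):
    """Shift attitude by each item of evidence, discounting incongruent items."""
    for e in evidence:
        weight = 1.0 if e * attitude > 0 else 1.0 - discount
        attitude += 0.5 * weight * e
    return attitude

proponent, opponent = 1.0, -1.0
print(update(proponent, mixed_evidence))   # ends above +1: even more convinced
print(update(opponent, mixed_evidence))    # ends below -1: even more convinced
```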
Polarization can occur in conjunction with other assimilation biases such as illusory correlation, selective exposure, or primacy effects. The normative standard violated here is the neutral evidence principle, under which mixed or neutral evidence should not strengthen a belief. A belief, once formed, can prevail even if the evidence used in its initial formation is entirely negated.[8]
Tolstoy syndrome
The behavior of confirmation bias has sometimes been called "Tolstoy syndrome", in reference to Russian writer Leo Tolstoy (1828-1910), who in 1897 wrote:[9]
“ | "I know that most men, including those at ease with problems of the greatest complexity, can seldom accept the simplest and most obvious truth if it be such as would oblige them to admit the falsity of conclusions which they have proudly taught to others, and which they have woven, thread by thread, into the fabrics of their life". | ” |
A related Tolstoy quote is:
“ | "The most difficult subjects can be explained to the most slow-witted man if he has not formed any idea of them already; but the simplest thing cannot be made clear to the most intelligent man if he is firmly persuaded that he knows already, without a shadow of doubt, what is laid before him."[10] | ” |
Myside bias
The term "myside bias" was coined by the geneticist, David Perkins, myside referring to "my" side of the issue under consideration. An important consequence of the myside bias is that many incorrect beliefs are slow to change and often become stronger even when evidence is presented which should weaken the belief. Generally, such irrational belief persistence results from according too much weight to evidence that accords with one's belief, and too little weight to evidence that does not. It can also result from the failure to search impartially for information.[citation needed]
Jonathan Baron describes many instances in which myside bias affects people's lives. For example, students who perform poorly may be exhibiting irrational belief persistence when they fail to criticize their own ideas and remain rigid in their mistaken beliefs. The teacher of these students may suffer from the same bias when assuming that the students' claims are mistaken. Baron also cites certain forms of psychopathology as examples of myside bias: delusional patients, for instance, may persist in the mistaken belief that a cough or sneeze means they are dying, even when doctors insist that they are healthy; conversely, patients with serious conditions may be dismissed by doctors as healthy, in which case it is the doctor who exhibits the bias.
Aaron T. Beck describes the role of this type of bias in depressive patients. He argues that depressive patients maintain their depressive state because they fail to recognize information that might make them happier and focus only on evidence showing that their lives are unfulfilling. According to Beck, an important step in the cognitive treatment of these individuals is overcoming this bias and learning to search for and recognize information about their lives more impartially.[citation needed]
Morton's demon
Morton's Demon was devised by Glenn R. Morton in 2002[11] as part of a thought experiment to explain his own experience of confirmation bias. By analogy with Maxwell's demon, Morton's demon stands at the gateway of a person's senses and lets in facts that agree with that person's beliefs while deflecting those that do not.
Morton was at one time a Young Earth creationist who later disavowed this belief. The demon was his way of referring to his own bias and that which he continued to observe in other Young Earth creationists. With time it has become a common shorthand for confirmation bias in a variety of situations.[citation needed]
References
- Tim van Gelder, "Heads I win, tails you lose": A Foray Into the Psychology of Philosophy
- Sternberg, Robert J. (2007). "Critical Thinking in Psychology: It really is critical". In Robert J. Sternberg, Henry L. Roediger, Diane F. Halpern (Eds.), Critical Thinking in Psychology. Cambridge University Press. p. 292. ISBN 0521608341. "Some of the worst examples of confirmation bias are in research on parapsychology (...) Arguably, there is a whole field here with no powerful confirming data at all. But people want to believe, and so they find ways to believe."
- Evans, J. St. B. T., Barston, J. L., & Pollard, P. (1983). On the conflict between logic and belief in syllogistic reasoning. Memory and Cognition, 11, 295-306.
- Dye, Lee (2004-09-22). "Study: First Impressions Really Matter". ABC News. http://abcnews.go.com/Technology/story?id=69942&page=1. Retrieved on 2007-11-07.
- websites
- "First Impressions Of Beauty May Demonstrate Why The Pretty Prosper". Science News. ScienceDaily. 2006-01-25. http://www.sciencedaily.com/releases/2006/01/060124223317.htm. Retrieved on 2007-11-07.
- summary here
- Lord, C. G., Ross, L., & Lepper, M. R. (1979). Biased assimilation and attitude polarization: The effects of prior theories on subsequently considered evidence. Journal of Personality and Social Psychology, 37, 2098-2109.
- Tolstoy, Leo (1897). "What is Art?" [1]
- Tolstoy, Leo. The Kingdom of God Is Within You, Chapter III.
- Morton, Glenn R. "Morton's demon". http://home.entouch.net/dmd/mortonsdemon.htm. Retrieved on 2007-08-21.
Further reading
- Wason, P.C. (1960). On the failure to eliminate hypotheses in a conceptual task. Quarterly Journal of Experimental Psychology, 12, 129-140.
- Wason, P.C. (1966). Reasoning. In B. M. Foss (Ed.), New horizons in psychology I, 135-151. Harmondsworth, UK: Penguin.
- Wason, P.C. (1968). Reasoning about a rule. Quarterly Journal of Experimental Psychology, 20, 273-281.
- Mynatt, C.R., Doherty, M.E., & Tweney, R.D. (1977). Confirmation bias in a simulated research environment: an experimental study of scientific inference. Quarterly Journal of Experimental Psychology, 29, 85-95.
- Griggs, R.A. & Cox, J.R. (1982). The elusive thematic materials effect in the Wason selection task. British Journal of Psychology, 73, 407-420.
- Nickerson, R.S. (1998). Confirmation bias: A ubiquitous phenomenon in many guises. Review of General Psychology, 2, 175-220.
- Fugelsang, J., Stein, C., Green, A., & Dunbar, K. (2004). Theory and data interactions of the scientific mind: Evidence from the molecular and the cognitive laboratory. Canadian Journal of Experimental Psychology, 58, 132-141.
- Cohen, L.J. (1981). Can human irrationality be experimentally demonstrated? The Behavioral and Brain Sciences, 4, 317-370.
- Ross, L., Lepper, M. R., & Hubbard, M. (1975). Perseverance in self-perception and social perception: Biased attributional processes in the debriefing paradigm. Journal of Personality and Social Psychology, 32, 880-892.
- Schumann, D.W. (Ed.) (1989). Causal reasoning and belief perseverance. Proceedings of the Society for Consumer Psychology (pp. 115-120). Knoxville, TN: University of Tennessee.
- Tutin, Judith (1983). Belief perseverance: A replication and extension of social judgment findings. ERIC ED240409.
- Bell, R. (1992). Impure Science. New York: John Wiley & Sons.
- Hellman, H. (1998). Great Feuds in Science. New York: John Wiley & Sons.
- Kohn, A. (1986). False Prophets. New York: Basil Blackwell Inc.
- Ditto, P. H., & Lopez, D. F. (1992). Motivated skepticism: Use of differential decision criteria for preferred and non-preferred conclusions. Journal of Personality and Social Psychology, 63, 568-584.
- Edwards, K., & Smith, E. E. (1996). A disconfirmation bias in the evaluation of arguments. Journal of Personality and Social Psychology, 71, 5-24.
- Baron, Jonathan. (1988, 1994, 2000). Thinking and Deciding. Cambridge University Press.
- Beck, A.T. (1976). Cognitive therapy and the emotional disorders. New York: International Universities Press.
- Jeng, Monwhea (2006). A selected history of expectation bias in physics. American Journal of Physics, 74, 578-583.