Confirmation bias

Confirmation bias (also confirmatory bias, myside bias,[a] or congeniality bias[2]) is the tendency to search for, interpret, favor, and recall information in a way that confirms or supports one's prior beliefs or values.[3] People display this bias when they select information that supports their views, ignoring contrary information, or when they interpret ambiguous evidence as supporting their existing attitudes. The effect is strongest for desired outcomes, for emotionally charged issues, and for deeply entrenched beliefs. Confirmation bias is insuperable for most people, but they can manage it, for example, by education and training in critical thinking skills.

Biased search for information, biased interpretation of this information, and biased memory recall have been invoked to explain four specific effects:

  1. attitude polarization (when a disagreement becomes more extreme even though the different parties are exposed to the same evidence)
  2. belief perseverance (when beliefs persist after the evidence for them is shown to be false)
  3. the irrational primacy effect (a greater reliance on information encountered early in a series)
  4. illusory correlation (when people falsely perceive an association between two events or situations).

A series of psychological experiments in the 1960s suggested that people are biased toward confirming their existing beliefs. Later work re-interpreted these results as a tendency to test ideas in a one-sided way, focusing on one possibility and ignoring alternatives. Explanations for the observed biases include wishful thinking and the limited human capacity to process information. Another proposal is that people show confirmation bias because they are pragmatically assessing the costs of being wrong, rather than investigating in a neutral, scientific way.

Flawed decisions due to confirmation bias have been found in a wide range of political, organizational, financial and scientific contexts. These biases contribute to overconfidence in personal beliefs and can maintain or strengthen beliefs in the face of contrary evidence. For example, confirmation bias produces systematic errors in scientific research based on inductive reasoning (the gradual accumulation of supportive evidence). Similarly, a police detective may identify a suspect early in an investigation, but then may only seek confirming rather than disconfirming evidence. A medical practitioner may prematurely focus on a particular disorder early in a diagnostic session, and then seek only confirming evidence. In social media, confirmation bias is amplified by the use of filter bubbles, or "algorithmic editing", which display to individuals only information they are likely to agree with, while excluding opposing views.

Definition and context

Confirmation bias, a phrase coined by English psychologist Peter Wason, is the tendency of people to favor information that confirms or strengthens their beliefs or values and is difficult to dislodge once affirmed.[4]

Confirmation biases are effects in information processing. They differ from what is sometimes called the behavioral confirmation effect, commonly known as self-fulfilling prophecy, in which a person's expectations influence their own behavior, bringing about the expected result.[5]

Some psychologists restrict the term "confirmation bias" to selective collection of evidence that supports what one already believes while ignoring or rejecting evidence that supports a different conclusion. Others apply the term more broadly to the tendency to preserve one's existing beliefs when searching for evidence, interpreting it, or recalling it from memory.[6][b] Confirmation bias is a result of automatic, unintentional strategies rather than deliberate deception.[8][9]

Types

Biased search for information

Confirmation bias has been described as an internal "yes man", echoing back a person's beliefs like Charles Dickens's character Uriah Heep.[10]

Experiments have found repeatedly that people tend to test hypotheses in a one-sided way, by searching for evidence consistent with their current hypothesis.[3]: 177–178 [11] Rather than searching through all the relevant evidence, they phrase questions to receive an affirmative answer that supports their theory.[12] They look for the consequences that they would expect if their hypothesis were true, rather than what would happen if it were false.[12] For example, someone using yes/no questions to find a number they suspect to be the number 3 might ask, "Is it an odd number?" People prefer this type of question, called a "positive test", even when a negative test such as "Is it an even number?" would yield exactly the same information.[13] However, this does not mean that people seek tests that guarantee a positive answer. In studies where subjects could select either such pseudo-tests or genuinely diagnostic ones, they favored the genuinely diagnostic.[14][15]
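The informational equivalence of such question pairs can be made concrete. The following minimal sketch (a hypothetical illustration, assuming the secret number is known to lie between 1 and 10) shows that the positive test "Is it an odd number?" and the negative test "Is it an even number?" split the candidates into the same two sets, so either answer eliminates exactly the same possibilities:

```python
# Hypothetical illustration: a secret number assumed to lie in 1..10.
# The "positive test" ("Is it an odd number?") and the "negative test"
# ("Is it an even number?") induce the same partition of candidates,
# so each possible answer eliminates exactly the same numbers.

candidates = set(range(1, 11))

def partition(test):
    """Split the candidate set into (answer "yes", answer "no")."""
    yes = {n for n in candidates if test(n)}
    return yes, candidates - yes

odd_yes, odd_no = partition(lambda n: n % 2 == 1)    # "Is it odd?"
even_yes, even_no = partition(lambda n: n % 2 == 0)  # "Is it even?"

# Both questions leave the guesser with one of the same two candidate
# sets, whatever the answer -- the tests are informationally identical.
assert {frozenset(odd_yes), frozenset(odd_no)} == \
       {frozenset(even_yes), frozenset(even_no)}
print(sorted(odd_yes), sorted(odd_no))  # [1, 3, 5, 7, 9] [2, 4, 6, 8, 10]
```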

The preference for positive tests in itself is not a bias, since positive tests can be highly informative.[16] However, in combination with other effects, this strategy can confirm existing beliefs or assumptions, independently of whether they are true.[8] In real-world situations, evidence is often complex and mixed. For example, various contradictory ideas about someone could each be supported by concentrating on one aspect of his or her behavior.[11] Thus any search for evidence in favor of a hypothesis is likely to succeed.[8] One illustration of this is the way the phrasing of a question can significantly change the answer.[11] For example, people who are asked, "Are you happy with your social life?" report greater satisfaction than those asked, "Are you unhappy with your social life?"[17]

Even a small change in a question's wording can affect how people search through available information, and hence the conclusions they reach. This was shown using a fictional child custody case.[18] Participants read that Parent A was moderately suitable to be the guardian in multiple ways. Parent B had a mix of salient positive and negative qualities: a close relationship with the child but a job that would take them away for long periods of time. When asked, "Which parent should have custody of the child?" the majority of participants chose Parent B, looking mainly for positive attributes. However, when asked, "Which parent should be denied custody of the child?" they looked for negative attributes and the majority answered that Parent B should be denied custody, implying that Parent A should have custody.[18]

Similar studies have demonstrated how people engage in a biased search for information, but also that this phenomenon may be limited by a preference for genuine diagnostic tests. In an initial experiment, participants rated another person on the introversion–extroversion personality dimension on the basis of an interview. They chose the interview questions from a given list. When the interviewee was introduced as an introvert, the participants chose questions that presumed introversion, such as, "What do you find unpleasant about noisy parties?" When the interviewee was described as extroverted, almost all the questions presumed extroversion, such as, "What would you do to liven up a dull party?" These loaded questions gave the interviewees little or no opportunity to falsify the hypothesis about them.[19] A later version of the experiment gave the participants less presumptive questions to choose from, such as, "Do you shy away from social interactions?"[20] Participants preferred to ask these more diagnostic questions, showing only a weak bias towards positive tests. This pattern, of a main preference for diagnostic tests and a weaker preference for positive tests, has been replicated in other studies.[20]

Personality traits influence and interact with biased search processes.[21] Individuals vary in their ability to defend their attitudes from external attacks in relation to selective exposure. Selective exposure occurs when individuals search for information that is consistent, rather than inconsistent, with their personal beliefs.[22] An experiment examined the extent to which individuals could refute arguments that contradicted their personal beliefs.[21] People with high confidence levels more readily seek out information that contradicts their personal position in order to form an argument; this can take the form of oppositional news consumption, in which individuals seek opposing partisan news in order to counterargue.[23] Individuals with low confidence levels do not seek out contradictory information and prefer information that supports their personal position. People generate and evaluate evidence in arguments that are biased towards their own beliefs and opinions.[24] Heightened confidence levels decrease preference for information that supports individuals' personal beliefs.

Another experiment gave participants a complex rule-discovery task that involved moving objects simulated by a computer.[25] Objects on the computer screen followed specific laws, which the participants had to figure out by "firing" objects across the screen to test their hypotheses. Despite making many attempts over a ten-hour session, none of the participants figured out the rules of the system. They typically attempted to confirm rather than falsify their hypotheses, and were reluctant to consider alternatives. Even after seeing objective evidence that refuted their working hypotheses, they frequently continued doing the same tests. Some of the participants were taught proper hypothesis-testing, but these instructions had almost no effect.[25]

Biased interpretation of information

Smart people believe weird things because they are skilled at defending beliefs they arrived at for non-smart reasons.

Michael Shermer[26]

Confirmation biases are not limited to the collection of evidence. Even if two individuals have the same information, the way they interpret it can be biased.

A team at Stanford University conducted an experiment involving participants who felt strongly about capital punishment, with half in favor and half against it.[27][28] Each participant read descriptions of two studies: a comparison of U.S. states with and without the death penalty, and a comparison of murder rates in a state before and after the introduction of the death penalty. After reading a quick description of each study, the participants were asked whether their opinions had changed. Then, they read a more detailed account of each study's procedure and had to rate whether the research was well-conducted and convincing.[27] In fact, the studies were fictional. Half the participants were told that one kind of study supported the deterrent effect and the other undermined it, while for other participants the conclusions were swapped.[27][28]

The participants, whether supporters or opponents, reported shifting their attitudes slightly in the direction of the first study they read. Once they read the more detailed descriptions of the two studies, they almost all returned to their original belief regardless of the evidence provided, pointing to details that supported their viewpoint and disregarding anything contrary. Participants described studies supporting their pre-existing view as superior to those that contradicted it, in detailed and specific ways.[27][29] Writing about a study that seemed to undermine the deterrence effect, a death penalty proponent wrote, "The research didn't cover a long enough period of time," while an opponent's comment on the same study said, "No strong evidence to contradict the researchers has been presented."[27] The results illustrated that people set higher standards of evidence for hypotheses that go against their current expectations. This effect, known as "disconfirmation bias", has been supported by other experiments.[30]

Another study of biased interpretation occurred during the 2004 U.S. presidential election and involved participants who reported having strong feelings about the candidates. They were shown apparently contradictory pairs of statements, either from Republican candidate George W. Bush, Democratic candidate John Kerry or a politically neutral public figure. They were also given further statements that made the apparent contradiction seem reasonable. From these three pieces of information, they had to decide whether each individual's statements were inconsistent.[31]: 1948  There were strong differences in these evaluations, with participants much more likely to interpret statements from the candidate they opposed as contradictory.[31]: 1951 

An MRI scanner allowed researchers to examine how the human brain deals with dissonant information.

In this experiment, the participants made their judgments while in a magnetic resonance imaging (MRI) scanner which monitored their brain activity. As participants evaluated contradictory statements by their favored candidate, emotional centers of their brains were aroused. This did not happen with the statements by the other figures. The experimenters inferred that the different responses to the statements were not due to passive reasoning errors. Instead, the participants were actively reducing the cognitive dissonance induced by reading about their favored candidate's irrational or hypocritical behavior.[31]: 1956 

Biases in belief interpretation are persistent, regardless of intelligence level. Participants in an experiment took the SAT test (a college admissions test used in the United States) to assess their intelligence levels. They then read information regarding safety concerns for vehicles, and the experimenters manipulated the national origin of the car. American participants gave their opinion on whether the car should be banned on a six-point scale, where one indicated "definitely yes" and six indicated "definitely no". Participants first evaluated whether they would allow a dangerous German car on American streets and a dangerous American car on German streets. Participants believed that the dangerous German car on American streets should be banned more quickly than the dangerous American car on German streets. Intelligence level made no difference in how readily participants would ban a car.[24]

Biased interpretation is not restricted to emotionally significant topics. In another experiment, participants were told a story about a theft. They had to rate the evidential importance of statements arguing either for or against a particular character being responsible. When they hypothesized that character's guilt, they rated statements supporting that hypothesis as more important than conflicting statements.[32]

Biased memory recall of information

People may remember evidence selectively to reinforce their expectations, even if they gather and interpret evidence in a neutral manner. This effect is called "selective recall", "confirmatory memory", or "access-biased memory".[33] Psychological theories differ in their predictions about selective recall. Schema theory predicts that information matching prior expectations will be more easily stored and recalled than information that does not match.[34] Some alternative approaches say that surprising information stands out and so is memorable.[34] Predictions from both these theories have been confirmed in different experimental contexts, with no theory winning outright.[35]

In one study, participants read a profile of a woman which described a mix of introverted and extroverted behaviors.[36] They later had to recall examples of her introversion and extroversion. One group was told this was to assess the woman for a job as a librarian, while a second group was told it was for a job in real estate sales. There was a significant difference between what these two groups recalled, with the "librarian" group recalling more examples of introversion and the "sales" group recalling more extroverted behavior.[36] A selective memory effect has also been shown in experiments that manipulate the desirability of personality types.[34][37] In one of these, a group of participants were shown evidence that extroverted people are more successful than introverts. Another group were told the opposite. In a subsequent, apparently unrelated study, participants were asked to recall events from their lives in which they had been either introverted or extroverted. Each group of participants provided more memories connecting themselves with the more desirable personality type, and recalled those memories more quickly.[38]

Changes in emotional states can also influence memory recall.[39][40] Participants rated how they felt when they had first learned that O. J. Simpson had been acquitted of murder charges.[39] They described their emotional reactions and confidence regarding the verdict one week, two months, and one year after the trial. Results indicated that participants' assessments of Simpson's guilt changed over time. The more participants' opinion of the verdict had changed, the less stable were their memories of their initial emotional reactions. When participants recalled their initial emotional reactions two months and a year later, past appraisals closely resembled current appraisals of emotion. People demonstrate sizable myside bias when discussing their opinions on controversial topics.[24] Memory recall and construction of experiences undergo revision in relation to corresponding emotional states.

Myside bias has been shown to influence the accuracy of memory recall.[40] In an experiment, widows and widowers rated the intensity of their experienced grief six months and five years after the deaths of their spouses. Participants noted a higher experience of grief at six months than at five years. Yet, when the participants were asked after five years how they had felt six months after the death of their significant other, the intensity of grief participants recalled was highly correlated with their current level of grief. Individuals appear to use their current emotional states to analyze how they must have felt when experiencing past events.[39] Emotional memories are reconstructed by current emotional states.

One study showed how selective memory can maintain belief in extrasensory perception (ESP).[41] Believers and disbelievers were each shown descriptions of ESP experiments. Half of each group were told that the experimental results supported the existence of ESP, while the others were told they did not. In a subsequent test, participants recalled the material accurately, apart from believers who had read the non-supportive evidence. This group remembered significantly less information and some of them incorrectly remembered the results as supporting ESP.[41]

Individual differences

Myside bias was once believed to be correlated with intelligence; however, studies have shown that myside bias is more influenced by the ability to think rationally than by level of intelligence.[24] Myside bias can cause an inability to effectively and logically evaluate the opposite side of an argument. Studies have suggested that myside bias reflects an absence of "active open-mindedness", meaning the active search for why an initial idea may be wrong.[42] Typically, myside bias is operationalized in empirical studies as the quantity of evidence used in support of one's own side in comparison to the opposite side.[43]

A study has found individual differences in myside bias. This study investigated individual differences that are acquired through learning in a cultural context and are mutable. The researcher found important individual differences in argumentation. Studies have suggested that individual differences such as deductive reasoning ability, ability to overcome belief bias, epistemological understanding, and thinking disposition are significant predictors of reasoning and of generating arguments, counterarguments, and rebuttals.[44][45][46]

A study by Christopher Wolfe and Anne Britt also investigated how participants' views of "what makes a good argument?" can be a source of myside bias that influences the way a person formulates their own arguments.[43] The study investigated individual differences of argumentation schema and asked participants to write essays. The participants were randomly assigned to write essays either for or against their preferred side of an argument and were given research instructions that took either a balanced or an unrestricted approach. The balanced-research instructions directed participants to create a "balanced" argument, i.e., that included both pros and cons; the unrestricted-research instructions included nothing on how to create the argument.[43]

Overall, the results revealed that the balanced-research instructions significantly increased the incidence of opposing information in arguments. These data also reveal that personal belief is not in itself a source of myside bias; however, participants who believe that a good argument is one based on facts are more likely to exhibit myside bias than other participants. This evidence is consistent with the claims proposed in Baron's article: that people's opinions about what makes good thinking can influence how arguments are generated.[43]

Discovery

Informal observations

Francis Bacon

Before psychological research on confirmation bias, the phenomenon had been observed throughout history. The Greek historian Thucydides (c. 460 BC – c. 395 BC) wrote of misguided reason in The Peloponnesian War: "... for it is a habit of mankind to entrust to careless hope what they long for, and to use sovereign reason to thrust aside what they do not fancy".[47] The Italian poet Dante Alighieri (1265–1321) noted it in the Divine Comedy, in which St. Thomas Aquinas cautions Dante upon meeting in Paradise: "opinion—hasty—often can incline to the wrong side, and then affection for one's own opinion binds, confines the mind".[48] Ibn Khaldun noticed the same effect in his Muqaddimah:[49]

Untruth naturally afflicts historical information. There are various reasons that make this unavoidable. One of them is partisanship for opinions and schools. ... if the soul is infected with partisanship for a particular opinion or sect, it accepts without a moment's hesitation the information that is agreeable to it. Prejudice and partisanship obscure the critical faculty and preclude critical investigation. The result is that falsehoods are accepted and transmitted.

In the Novum Organum, English philosopher and scientist Francis Bacon (1561–1626)[50] noted that biased assessment of evidence drove "all superstitions, whether in astrology, dreams, omens, divine judgments or the like".[51] He wrote:[51]

The human understanding when it has once adopted an opinion ... draws all things else to support and agree with it. And though there be a greater number and weight of instances to be found on the other side, yet these it either neglects or despises, or else by some distinction sets aside or rejects[.]

In the second volume of his The World as Will and Representation (1844), German philosopher Arthur Schopenhauer observed that "An adopted hypothesis gives us lynx-eyes for everything that confirms it and makes us blind to everything that contradicts it."[52]

In his 1897 essay What Is Art?, Russian novelist Leo Tolstoy wrote:[53]

I know that most men—not only those considered clever, but even those who are very clever, and capable of understanding most difficult scientific, mathematical, or philosophic problems—can very seldom discern even the simplest and most obvious truth if it be such as to oblige them to admit the falsity of conclusions they have formed, perhaps with much difficulty—conclusions of which they are proud, which they have taught to others, and on which they have built their lives.

In his 1894 essay The Kingdom of God Is Within You, Tolstoy had earlier written:[54]

The most difficult subjects can be explained to the most slow-witted man if he has not formed any idea of them already; but the simplest thing cannot be made clear to the most intelligent man if he is firmly persuaded that he knows already, without a shadow of doubt, what is laid before him.

Hypothesis-testing (falsification) explanation (Wason)

In his initial experiment, published in 1960 (which does not mention the term "confirmation bias"), Peter Wason repeatedly challenged participants to identify a rule applying to triples of numbers. They were told that (2,4,6) fits the rule. They generated triples, and the experimenter told them whether each triple conformed to the rule.[3]: 179

The actual rule was simply "any ascending sequence", but participants had great difficulty in finding it, often announcing rules that were far more specific, such as "the middle number is the average of the first and last".[55] The participants seemed to test only positive examples—triples that obeyed their hypothesized rule. For example, if they thought the rule was, "Each number is two greater than its predecessor," they would offer a triple that fitted (confirmed) this rule, such as (11,13,15) rather than a triple that violated (falsified) it, such as (11,12,19).[56]
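The logic of the task can be sketched in a few lines of code. In this illustration, which assumes the rules exactly as described above, every triple conforming to the "plus two" hypothesis also conforms to the true "ascending" rule, so positive tests can never elicit the falsifying "no"; only a hypothesis-violating triple can:

```python
# Sketch of the 2-4-6 task as described above: the true rule is "any
# ascending sequence"; a participant's working hypothesis is "each
# number is two greater than its predecessor".

def true_rule(t):
    a, b, c = t
    return a < b < c                     # the experimenter's rule

def hypothesis(t):
    a, b, c = t
    return b == a + 2 and c == b + 2     # the participant's guess

positive_test = (11, 13, 15)   # conforms to the hypothesis
negative_test = (11, 12, 19)   # violates the hypothesis

# The positive test fits both rules, so the experimenter's "yes"
# cannot distinguish them:
assert hypothesis(positive_test) and true_rule(positive_test)

# The negative test violates the hypothesis yet still fits the true
# rule, so a "yes" answer falsifies the hypothesis at once:
assert not hypothesis(negative_test) and true_rule(negative_test)

# More generally, every hypothesis-conforming triple is ascending,
# so no positive test can ever return the falsifying "no":
triples = [(a, b, c) for a in range(1, 10)
                     for b in range(1, 10)
                     for c in range(1, 10)]
assert all(true_rule(t) for t in triples if hypothesis(t))
```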

Wason interpreted his results as showing a preference for confirmation over falsification, hence he coined the term "confirmation bias".[c][58] Wason also used confirmation bias to explain the results of his selection task experiment.[59] Participants repeatedly performed badly on various forms of this test, in most cases ignoring information that could potentially refute (falsify) the specified rule.[60][61]

Hypothesis testing (positive test strategy) explanation (Klayman and Ha)

Klayman and Ha's 1987 paper argues that the Wason experiments do not actually demonstrate a bias towards confirmation, but instead a tendency to make tests consistent with the working hypothesis.[16][62] They called this the "positive test strategy".[11] This strategy is an example of a heuristic: a reasoning shortcut that is imperfect but easy to compute.[63] Klayman and Ha used Bayesian probability and information theory as their standard of hypothesis-testing, rather than the falsificationism used by Wason. According to these ideas, each answer to a question yields a different amount of information, which depends on the person's prior beliefs. Thus a scientific test of a hypothesis is one that is expected to produce the most information. Since the information content depends on initial probabilities, a positive test can either be highly informative or uninformative. Klayman and Ha argued that when people think about realistic problems, they are looking for a specific answer with a small initial probability. In this case, positive tests are usually more informative than negative tests.[16] However, in Wason's rule discovery task the answer—three numbers in ascending order—is very broad, so positive tests are unlikely to yield informative answers. Klayman and Ha supported their analysis by citing an experiment that used the labels "DAX" and "MED" in place of "fits the rule" and "doesn't fit the rule". This avoided implying that the aim was to find a low-probability rule. Participants had much more success with this version of the experiment.[64][65]

If the true rule (T) encompasses the current hypothesis (H), then positive tests (examining an H to see if it is T) will not show that the hypothesis is false.
If the true rule (T) overlaps the current hypothesis (H), then either a negative test or a positive test can potentially falsify H.
When the working hypothesis (H) includes the true rule (T), then positive tests are the only way to falsify H.
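These three cases can be illustrated with small finite sets. In the hypothetical sketch below, a positive test examines an item the hypothesis classifies as conforming, and falsifies H when the true rule disagrees; a negative test examines an item outside H, and falsifies H as a complete characterization of the rule when the item nonetheless conforms to T:

```python
# Toy illustration with hypothetical finite sets.
# A positive test checks an item that H classifies as conforming;
# it falsifies H when T disagrees (an item in H but not in T).
# A negative test checks an item H classifies as non-conforming;
# it falsifies H as a complete rule when the item is in T anyway.

def falsifiers(T, H):
    return {"positive tests can falsify H": bool(H - T),
            "negative tests can falsify H": bool(T - H)}

H = {3, 4, 5}                                  # current hypothesis
print(falsifiers(T={2, 3, 4, 5, 6}, H=H))      # T encompasses H:
#   only negative tests can falsify
print(falsifiers(T={4, 5, 6, 7}, H=H))         # T overlaps H:
#   either kind of test can falsify
print(falsifiers(T={4, 5}, H=H))               # H includes T:
#   only positive tests can falsify
```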

In light of this and other critiques, the focus of research moved away from confirmation versus falsification of a hypothesis, to examining whether people test hypotheses in an informative way, or an uninformative but positive way. The search for "true" confirmation bias led psychologists to look at a wider range of effects in how people process information.[66]

Information processing explanations

There are currently three main information processing explanations of confirmation bias, plus a recent addition.

Cognitive versus motivational

Happy events are more likely to be remembered.

According to Robert MacCoun, most biased evidence processing occurs through a combination of "cold" (cognitive) and "hot" (motivated) mechanisms.[67]

Cognitive explanations for confirmation bias are based on limitations in people's ability to handle complex tasks, and the shortcuts, called heuristics, that they use.[68] For example, people may judge the reliability of evidence by using the availability heuristic, that is, how readily a particular idea comes to mind.[69] It is also possible that people can only focus on one thought at a time, and so find it difficult to test alternative hypotheses in parallel.[3]: 198–199  Another heuristic is the positive test strategy identified by Klayman and Ha, in which people test a hypothesis by examining cases where they expect a property or event to occur. This heuristic avoids the difficult or impossible task of working out how diagnostic each possible question will be. However, it is not universally reliable, so people can overlook challenges to their existing beliefs.[16][3]: 200

Motivational explanations involve an effect of desire on belief.[3]: 197 [70] It is known that people prefer positive thoughts over negative ones in a number of ways: this is called the "Pollyanna principle".[71] Applied to arguments or sources of evidence, this could explain why desired conclusions are more likely to be believed true. According to experiments that manipulate the desirability of the conclusion, people demand a high standard of evidence for unpalatable ideas and a low standard for preferred ideas. In other words, they ask, "Can I believe this?" for some suggestions and, "Must I believe this?" for others.[72][73] Although consistency is a desirable feature of attitudes, an excessive drive for consistency is another potential source of bias because it may prevent people from neutrally evaluating new, surprising information. Social psychologist Ziva Kunda combines the cognitive and motivational theories, arguing that motivation creates the bias, but cognitive factors determine the size of the effect.[3]: 198 

Cost-benefit

Explanations in terms of cost-benefit analysis assume that people do not just test hypotheses in a disinterested way, but assess the costs of different errors.[74] Using ideas from evolutionary psychology, James Friedrich suggests that people do not primarily aim at truth in testing hypotheses, but try to avoid the most costly errors. For example, employers might ask one-sided questions in job interviews because they are focused on weeding out unsuitable candidates.[75] Yaacov Trope and Akiva Liberman's refinement of this theory assumes that people compare the two different kinds of error: accepting a false hypothesis or rejecting a true hypothesis. For instance, someone who underestimates a friend's honesty might treat him or her suspiciously and so undermine the friendship. Overestimating the friend's honesty may also be costly, but less so. In this case, it would be rational to seek, evaluate or remember evidence of their honesty in a biased way.[76]

When someone gives an initial impression of being introverted or extroverted, questions that match that impression come across as more empathic.[77] This suggests that when talking to someone who seems to be an introvert, it is a sign of better social skills to ask, "Do you feel awkward in social situations?" rather than, "Do you like noisy parties?" The connection between confirmation bias and social skills was corroborated by a study of how college students get to know other people. Highly self-monitoring students, who are more sensitive to their environment and to social norms, asked more matching questions when interviewing a high-status staff member than when getting to know fellow students.[77]
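The error-cost argument above can be made concrete with a minimal numeric sketch. All of the probabilities and costs below are illustrative assumptions chosen only to show the shape of the trade-off, not values from the cited studies:

```python
# Entirely illustrative numbers for the error-cost argument, testing
# the hypothesis "my friend is dishonest".

p_dishonest = 0.1        # assumed prior probability the hypothesis is true
cost_false_accept = 2.0  # treating an honest friend with suspicion
                         # (accepting a false hypothesis): high cost
cost_false_reject = 0.5  # trusting a dishonest friend
                         # (rejecting a true hypothesis): assumed lower cost

expected_cost_accept = (1 - p_dishonest) * cost_false_accept  # 1.8
expected_cost_reject = p_dishonest * cost_false_reject        # 0.05

# With these asymmetric costs, leaning toward the friend's honesty,
# and searching for evidence of it, minimizes expected cost even
# though it is a biased strategy rather than a neutral one.
print(expected_cost_accept, expected_cost_reject)
```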

Exploratory versus confirmatory

Psychologists Jennifer Lerner and Philip Tetlock distinguish two different kinds of thinking process. Exploratory thought neutrally considers multiple points of view and tries to anticipate all possible objections to a particular position, while confirmatory thought seeks to justify a specific point of view. Lerner and Tetlock say that when people expect to justify their position to others whose views they already know, they will tend to adopt a similar position to those people, and then use confirmatory thought to bolster their own credibility. However, if the external parties are overly aggressive or critical, people will disengage from thought altogether, and simply assert their personal opinions without justification. Lerner and Tetlock say that people only push themselves to think critically and logically when they know in advance they will need to explain themselves to others who are well-informed, genuinely interested in the truth, and whose views they do not already know. Because those conditions rarely exist, they argue, most people are using confirmatory thought most of the time.[78][79][80]

Make-believe

Developmental psychologist Eve Whitmore has argued that beliefs and biases involved in confirmation bias have their roots in childhood coping through make-believe, which becomes "the basis for more complex forms of self-deception and illusion into adulthood." The friction brought on by questioning as an adolescent with developing critical thinking can lead to the rationalization of false beliefs, and the habit of such rationalization can become unconscious over the years.[81]

Real-world effects

Social media

In social media, confirmation bias is amplified by the use of filter bubbles, or "algorithmic editing", which displays to individuals only information they are likely to agree with, while excluding opposing views.[82] Some have argued that confirmation bias is the reason why society can never escape from filter bubbles, because individuals are psychologically hardwired to seek information that agrees with their preexisting values and beliefs.[83] Others have further argued that the mixture of the two is degrading democracy—claiming that this "algorithmic editing" removes diverse viewpoints and information—and that unless filter bubble algorithms are removed, voters will be unable to make fully informed political decisions.[84][82]

The rise of social media has contributed greatly to the rapid spread of fake news, that is, false and misleading information that is presented as credible news from a seemingly reliable source. Confirmation bias (selecting or reinterpreting evidence to support one's beliefs) is one of three main hurdles cited as to why critical thinking goes astray in these circumstances. The other two are shortcut heuristics (when overwhelmed or short of time, people rely on simple rules such as group consensus or trusting an expert or role model) and social goals (social motivation or peer pressure can interfere with objective analysis of facts at hand).[85]

In combating the spread of fake news, social media sites have considered turning toward "digital nudging".[86] This can currently be done in two forms: nudging of information and nudging of presentation. Nudging of information entails social media sites providing a disclaimer or label questioning or warning users of the validity of the source, while nudging of presentation entails exposing users to new information which they may not have sought out but could introduce them to viewpoints that may combat their own confirmation biases.[87]

Science and scientific research

A distinguishing feature of scientific thinking is the search for confirming or supportive evidence (inductive reasoning) as well as falsifying evidence (deductive reasoning).[88][89]

Many times in the history of science, scientists have resisted new discoveries by selectively interpreting or ignoring unfavorable data.[3]: 192–194  Several studies have shown that scientists rate studies that report findings consistent with their prior beliefs more favorably than studies reporting findings inconsistent with their previous beliefs.[9][90][91]

However, assuming that the research question is relevant, the experimental design adequate, and the data clearly and comprehensively described, the empirical data obtained should be important to the scientific community and should not be viewed prejudicially, regardless of whether they conform to current theoretical predictions.[91] In practice, researchers may misunderstand or misinterpret studies that contradict their preconceptions, or not read them at all, or wrongly cite them as if they actually supported their claims.[92]

Further, confirmation biases can sustain scientific theories or research programs in the face of inadequate or even contradictory evidence.[60][93] The discipline of parapsychology is often cited as an example.[94]

An experimenter's confirmation bias can potentially affect which data are reported. Data that conflict with the experimenter's expectations may be more readily discarded as unreliable, producing the so-called file drawer effect. To combat this tendency, scientific training teaches ways to prevent bias.[95] For example, experimental design of randomized controlled trials (coupled with their systematic review) aims to minimize sources of bias.[95][96]

The social process of peer review aims to mitigate the effect of individual scientists' biases, even though the peer review process itself may be susceptible to such biases.[97][98][91][99][100] Confirmation bias may thus be especially harmful to objective evaluations regarding nonconforming results, since biased individuals may regard opposing evidence to be weak in principle and give little serious thought to revising their beliefs.[90] Scientific innovators often meet with resistance from the scientific community, and research presenting controversial results frequently receives harsh peer review.[101]

Finance

Confirmation bias can lead investors to be overconfident, ignoring evidence that their strategies will lose money.[10][102] In studies of political stock markets, investors made more profit when they resisted bias. For example, participants who interpreted a candidate's debate performance in a neutral rather than partisan way were more likely to profit.[103] To combat the effect of confirmation bias, investors can try to adopt a contrary viewpoint "for the sake of argument".[104] In one technique, they imagine that their investments have collapsed and ask themselves why this might happen.[10]

Medicine and health

Cognitive biases are important variables in clinical decision-making by medical general practitioners (GPs) and medical specialists. Two important ones are confirmation bias and the overlapping availability bias. A GP may make a diagnosis early on during an examination, and then seek confirming evidence rather than falsifying evidence. This cognitive error is partly caused by the availability of evidence about the supposed disorder being diagnosed. For example, the client may have mentioned the disorder, or the GP may have recently read a much-discussed paper about the disorder. The basis of this cognitive shortcut or heuristic (termed anchoring) is that the doctor does not consider multiple possibilities based on evidence, but prematurely latches on (or anchors to) a single cause.[105] In emergency medicine, because of time pressure, there is a high density of decision-making, and shortcuts are frequently applied. The potential failure rate of these cognitive decisions needs to be managed by education about the 30 or more cognitive biases that can occur, so as to set in place proper debiasing strategies.[106] Confirmation bias may also cause doctors to perform unnecessary medical procedures due to pressure from adamant patients.[107]

Raymond Nickerson, a psychologist, blames confirmation bias for the ineffective medical procedures that were used for centuries before the arrival of scientific medicine.[3]: 192  If a patient recovered, medical authorities counted the treatment as successful, rather than looking for alternative explanations such as that the disease had run its natural course. Biased assimilation is a factor in the modern appeal of alternative medicine, whose proponents are swayed by positive anecdotal evidence but treat scientific evidence hyper-critically.[108][109][110]

Cognitive therapy was developed by Aaron T. Beck in the early 1960s and has become a popular approach.[111] According to Beck, biased information processing is a factor in depression.[112] His approach teaches people to treat evidence impartially, rather than selectively reinforcing negative outlooks.[50] Phobias and hypochondria have also been shown to involve confirmation bias for threatening information.[113]

Politics, law and policing

Mock trials allow researchers to examine confirmation biases in a realistic setting.

Nickerson argues that reasoning in judicial and political contexts is sometimes subconsciously biased, favoring conclusions that judges, juries or governments have already committed to.[3]: 191–193  Since the evidence in a jury trial can be complex, and jurors often reach decisions about the verdict early on, it is reasonable to expect an attitude polarization effect. The prediction that jurors will become more extreme in their views as they see more evidence has been borne out in experiments with mock trials.[114][115] Both inquisitorial and adversarial criminal justice systems are affected by confirmation bias.[116]

Confirmation bias can be a factor in creating or extending conflicts, from emotionally charged debates to wars: by interpreting the evidence in their favor, each opposing party can become overconfident that it is in the stronger position.[117] On the other hand, confirmation bias can result in people ignoring or misinterpreting the signs of an imminent or incipient conflict. For example, psychologists Stuart Sutherland and Thomas Kida have each argued that U.S. Navy Admiral Husband E. Kimmel showed confirmation bias when playing down the first signs of the Japanese attack on Pearl Harbor.[60][118]

A two-decade study of political pundits by Philip E. Tetlock found that, on the whole, their predictions were not much better than chance. Tetlock divided experts into "foxes" who maintained multiple hypotheses, and "hedgehogs" who were more dogmatic. In general, the hedgehogs were much less accurate. Tetlock blamed their failure on confirmation bias, and specifically on their inability to make use of new information that contradicted their existing theories.[119]

In police investigations, a detective may identify a suspect early in an investigation, but then may seek mainly supporting or confirming evidence, ignoring or downplaying falsifying evidence.[120]

Social psychology

Social psychologists have identified two tendencies in the way people seek or interpret information about themselves. Self-verification is the drive to reinforce the existing self-image and self-enhancement is the drive to seek positive feedback. Both are served by confirmation biases.[121] In experiments where people are given feedback that conflicts with their self-image, they are less likely to attend to it or remember it than when given self-verifying feedback.[122][123][124] They reduce the impact of such information by interpreting it as unreliable.[122][125][126] Similar experiments have found a preference for positive feedback, and the people who give it, over negative feedback.[121]

Mass delusions

Confirmation bias can play a key role in the propagation of mass delusions. Witch trials are frequently cited as an example.[127][128]

For another example, in the Seattle windshield pitting epidemic, windshields seemed to be damaged by an unknown cause. As news of the apparent wave of damage spread, more and more people checked their windshields, discovered that their windshields too had been damaged, and thus confirmed belief in the supposed epidemic. In fact, the windshields had been damaged all along, but the damage went unnoticed until people checked their windshields as the delusion spread.[129]

Paranormal beliefs

One factor in the appeal of alleged psychic readings is that listeners apply a confirmation bias which fits the psychic's statements to their own lives.[130] By making a large number of ambiguous statements in each sitting, the psychic gives the client more opportunities to find a match. This is one of the techniques of cold reading, with which a psychic can deliver a subjectively impressive reading without any prior information about the client.[130] Investigator James Randi compared the transcript of a reading to the client's report of what the psychic had said, and found that the client showed a strong selective recall of the "hits".[131]

As a striking illustration of confirmation bias in the real world, Nickerson mentions numerological pyramidology: the practice of finding meaning in the proportions of the Egyptian pyramids.[3]: 190  There are many different length measurements that can be made of, for example, the Great Pyramid of Giza and many ways to combine or manipulate them. Hence it is almost inevitable that people who look at these numbers selectively will find superficially impressive correspondences, for example with the dimensions of the Earth.[3]: 190 

Recruitment and selection

Unconscious cognitive bias (including confirmation bias) in job recruitment affects hiring decisions and can potentially prevent a diverse and inclusive workplace. A variety of unconscious biases affect recruitment decisions, but confirmation bias is one of the major ones, especially during the interview stage.[132] The interviewer will often select a candidate who confirms their own beliefs, even though other candidates are equally or better qualified.

Associated effects and outcomes

Polarization of opinion

When people with opposing views interpret new information in a biased way, their views can move even further apart. This is called "attitude polarization".[133] The effect was demonstrated by an experiment that involved drawing a series of red and black balls from one of two concealed "bingo baskets". Participants knew that one basket contained 60 percent black and 40 percent red balls; the other, 40 percent black and 60 percent red. The experimenters looked at what happened when balls of alternating color were drawn in turn, a sequence that does not favor either basket. After each ball was drawn, participants in one group were asked to state out loud their judgments of the probability that the balls were being drawn from one or the other basket. These participants tended to grow more confident with each successive draw—whether they initially thought the basket with 60 percent black balls or the one with 60 percent red balls was the more likely source, their estimate of the probability increased. Another group of participants were asked to state probability estimates only at the end of a sequence of drawn balls, rather than after each ball. They did not show the polarization effect, suggesting that it does not necessarily occur when people simply hold opposing positions, but rather when they openly commit to them.[134]
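The claim that balls of alternating color carry no net evidence can be checked with Bayes' rule. The sketch below uses the proportions given above (60 percent black in one basket, 60 percent red in the other) and exact fractions; after each black-red pair, the posterior probability of either basket returns to one half, so growing confidence on such a sequence is unwarranted:

```python
from fractions import Fraction

# Sketch of the two-basket setup described above: basket A holds 60%
# black / 40% red balls, basket B the reverse. Exact fractions show
# that alternating colors are neutral evidence.

P_BLACK = {"A": Fraction(3, 5), "B": Fraction(2, 5)}

def update(prior_a, color):
    """One step of Bayes' rule for the probability the source is A."""
    like_a = P_BLACK["A"] if color == "black" else 1 - P_BLACK["A"]
    like_b = P_BLACK["B"] if color == "black" else 1 - P_BLACK["B"]
    return like_a * prior_a / (like_a * prior_a + like_b * (1 - prior_a))

prior = Fraction(1, 2)
for color in ["black", "red"] * 5:   # ten alternating draws
    prior = update(prior, color)
print(prior)   # 1/2: rationally, confidence in either basket is unchanged
```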

A less abstract study was the Stanford biased interpretation experiment, in which participants with strong opinions about the death penalty read about mixed experimental evidence. Twenty-three percent of the participants reported that their views had become more extreme, and this self-reported shift correlated strongly with their initial attitudes.[27] In later experiments, participants also reported their opinions becoming more extreme in response to ambiguous information. However, comparisons of their attitudes before and after the new evidence showed no significant change, suggesting that the self-reported changes might not be real.[30][133][135] Based on these experiments, Deanna Kuhn and Joseph Lao concluded that polarization is a real phenomenon but far from inevitable, only happening in a small minority of cases, and it was prompted not only by considering mixed evidence, but by merely thinking about the topic.[133]

Charles Taber and Milton Lodge argued that the Stanford team's result had been hard to replicate because the arguments used in later experiments were too abstract or confusing to evoke an emotional response. The Taber and Lodge study used the emotionally charged topics of gun control and affirmative action.[30] They measured the attitudes of their participants towards these issues before and after reading arguments on each side of the debate. Two groups of participants showed attitude polarization: those with strong prior opinions and those who were politically knowledgeable. In part of this study, participants chose which information sources to read, from a list prepared by the experimenters. For example, they could read the National Rifle Association's and the Brady Anti-Handgun Coalition's arguments on gun control. Even when instructed to be even-handed, participants were more likely to read arguments that supported their existing attitudes than arguments that did not. This biased search for information correlated well with the polarization effect.[30]

The backfire effect is a name for the finding that, given evidence against their beliefs, people can reject the evidence and believe even more strongly.[136][137] The phrase was coined by Brendan Nyhan and Jason Reifler in 2010.[138] However, subsequent research has since failed to replicate findings supporting the backfire effect.[139] One study conducted at the Ohio State University and George Washington University studied 10,100 participants with 52 different issues expected to trigger a backfire effect. While the findings did conclude that individuals are reluctant to embrace facts that contradict their already held ideology, no cases of backfire were detected.[140] The backfire effect has since been noted to be a rare phenomenon rather than a common occurrence[141] (compare the boomerang effect).

Persistence of discredited beliefs

Beliefs can survive potent logical or empirical challenges. They can survive and even be bolstered by evidence that most uncommitted observers would agree logically demands some weakening of such beliefs. They can even survive the total destruction of their original evidential bases.

—Lee Ross and Craig Anderson[142]

Confirmation biases provide one plausible explanation for the persistence of beliefs when the initial evidence for them is removed or when they have been sharply contradicted.[3]: 187  This belief perseverance effect was first demonstrated experimentally by Festinger, Riecken, and Schachter. These psychologists spent time with a cult whose members were convinced that the world would end on 21 December 1954. After the prediction failed, most believers still clung to their faith. Their book describing this research is aptly named When Prophecy Fails.[143]

The term belief perseverance, however, was coined in a series of experiments using what is called the "debriefing paradigm": participants read fake evidence for a hypothesis, their attitude change is measured, then the fakery is exposed in detail. Their attitudes are then measured once more to see if their belief returns to its previous level.[142]

A common finding is that at least some of the initial belief remains even after a full debriefing.[144] In one experiment, participants had to distinguish between real and fake suicide notes. The feedback was random: some were told they had done well while others were told they had performed badly. Even after being fully debriefed, participants were still influenced by the feedback. They still thought they were better or worse than average at that kind of task, depending on what they had initially been told.[145]

In another study, participants read job performance ratings of two firefighters, along with their responses to a risk aversion test.[142] This fictional data was arranged to show either a negative or positive association: some participants were told that a risk-taking firefighter did better, while others were told they did less well than a risk-averse colleague.[146] Even if these two case studies were true, they would have been scientifically poor evidence for a conclusion about firefighters in general. However, the participants found them subjectively persuasive.[146] When the case studies were shown to be fictional, participants' belief in a link diminished, but around half of the original effect remained.[142] Follow-up interviews established that the participants had understood the debriefing and taken it seriously. Participants seemed to trust the debriefing, but regarded the discredited information as irrelevant to their personal belief.[146]

The continued influence effect is the tendency for misinformation to continue to influence memory and reasoning about an event, despite the misinformation having been retracted or corrected. This occurs even when the individual believes the correction.[147]

Preference for early information

Experiments have shown that information is weighted more strongly when it appears early in a series, even when the order is unimportant. For example, people form a more positive impression of someone described as "intelligent, industrious, impulsive, critical, stubborn, envious" than when they are given the same words in reverse order.[148] This irrational primacy effect is independent of the primacy effect in memory in which the earlier items in a series leave a stronger memory trace.[148] Biased interpretation offers an explanation for this effect: seeing the initial evidence, people form a working hypothesis that affects how they interpret the rest of the information.[3]: 187 

One demonstration of irrational primacy used colored chips supposedly drawn from two urns. Participants were told the color distributions of the urns, and had to estimate the probability of a chip being drawn from one of them.[148] In fact, the colors appeared in a prearranged order. The first thirty draws favored one urn and the next thirty favored the other.[3]: 187  The series as a whole was neutral, so rationally, the two urns were equally likely. However, after sixty draws, participants favored the urn suggested by the initial thirty.[148]
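The irrationality of this preference follows from the order-invariance of Bayesian updating: the likelihoods of independent draws multiply, and multiplication is commutative, so the same sixty draws warrant the same conclusion in any order. A brief sketch, assuming for illustration a 60/40 split of "favoring" chips in each urn (the text above does not give the actual distributions used):

```python
from fractions import Fraction

# Illustration of order-invariance, with assumed urn compositions:
# urn A yields A-favoring chips 60% of the time, urn B only 40%.

def posterior_a(sequence, p_a=Fraction(3, 5), p_b=Fraction(2, 5)):
    """Probability the chips come from urn A after a draw sequence."""
    like_a = like_b = Fraction(1)
    for favors_a in sequence:
        like_a *= p_a if favors_a else 1 - p_a
        like_b *= p_b if favors_a else 1 - p_b
    return like_a / (like_a + like_b)

early_a = [True] * 30 + [False] * 30   # first thirty draws favor urn A
early_b = [False] * 30 + [True] * 30   # the same draws in reverse order

# The total evidence is identical either way, so a rational judge ends
# at 1/2 regardless of which urn the early draws favored:
assert posterior_a(early_a) == posterior_a(early_b) == Fraction(1, 2)
```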

Another experiment involved a slide show of a single object, seen as just a blur at first and in slightly better focus with each succeeding slide.[148] After each slide, participants had to state their best guess of what the object was. Participants whose early guesses were wrong persisted with those guesses, even when the picture was sufficiently in focus that the object was readily recognizable to other people.[3]: 187 

Illusory association between events

Illusory correlation is the tendency to see non-existent correlations in a set of data.[149] This tendency was first demonstrated in a series of experiments in the late 1960s.[150] In one experiment, participants read a set of psychiatric case studies, including responses to the Rorschach inkblot test. The participants reported that the homosexual men in the set were more likely to report seeing buttocks, anuses or sexually ambiguous figures in the inkblots. In fact the fictional case studies had been constructed so that the homosexual men were no more likely to report this imagery or, in one version of the experiment, were less likely to report it than heterosexual men.[149] In a survey, a group of experienced psychoanalysts reported the same set of illusory associations with homosexuality.[149][150]

Another study recorded the symptoms experienced by arthritic patients, along with weather conditions over a 15-month period. Nearly all the patients reported that their pains were correlated with weather conditions, although the real correlation was zero.[151]

Example (counts of days)
                Rain   No rain
Arthritis        14       6
No arthritis      7       2

This effect is a kind of biased interpretation, in that objectively neutral or unfavorable evidence is interpreted to support existing beliefs. It is also related to biases in hypothesis-testing behavior.[152] In judging whether two events, such as illness and bad weather, are correlated, people rely heavily on the number of positive-positive cases: in this example, instances of both pain and bad weather. They pay relatively little attention to the other kinds of observation (of no pain and/or good weather).[153] This parallels the reliance on positive tests in hypothesis testing.[152] It may also reflect selective recall, in that people may have a sense that two events are correlated because it is easier to recall times when they happened together.[152]
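A quick calculation over the example table above shows why attending only to the positive-positive cell is misleading: the conditional probability of pain is in fact slightly lower on rainy days than on dry ones, so the 14 "pain and rain" days support no real association:

```python
# Calculation over the example table above: counts of days classified
# by weather and by whether arthritis pain was reported.

rain_pain, dry_pain = 14, 6       # days with pain
rain_nopain, dry_nopain = 7, 2    # days without pain

p_pain_given_rain = rain_pain / (rain_pain + rain_nopain)   # 14/21 = 0.67
p_pain_given_dry = dry_pain / (dry_pain + dry_nopain)       #  6/8  = 0.75

# Pain is slightly *less* frequent on rainy days, yet a judge who
# attends mainly to the 14 pain-and-rain cases will perceive a
# positive correlation.
print(round(p_pain_given_rain, 2), round(p_pain_given_dry, 2))
```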

See also

Notes

  1. ^ David Perkins, a professor and researcher at the Harvard Graduate School of Education, coined the term "myside bias", referring to a preference for "my" side of an issue.[1]
  2. ^ "Assimilation bias" is another term used for biased interpretation of evidence.[7]
  3. ^ Wason also used the term "verification bias".[57]

References

Citations

  1. ^ Baron 2000, p. 195.
  2. ^ Hart, William; Albarracin, D.; Eagly, A. H.; Brechan, I.; Lindberg, M. J.; Merrill, L. (2009), "Feeling validated versus being correct: A meta-analysis of selective exposure to information", Psychological Bulletin, 135 (4): 555–588, doi:10.1037/a0015701, PMC 4797953, PMID 19586162
  3. ^ a b c d e f g h i j k l m n o p Nickerson 1998, pp. 175–220
  4. ^ Plous 1993, p. 233
  5. ^ Darley, John M.; Gross, Paget H. (2000), "A hypothesis-confirming bias in labelling effects", in Stangor, Charles (ed.), Stereotypes and prejudice: essential readings, Psychology Press, p. 212, ISBN 978-0-86377-589-5, OCLC 42823720
  6. ^ Risen & Gilovich 2007
  7. ^ Risen & Gilovich 2007, p. 113.
  8. ^ a b c Oswald & Grosjean 2004, pp. 82–83
  9. ^ a b Hergovich, Schott & Burger 2010
  10. ^ a b c Zweig, Jason (19 November 2009), "How to ignore the yes-man in your head", Wall Street Journal, archived from the original on 14 February 2015, retrieved 13 June 2010
  11. ^ a b c d Kunda 1999, pp. 112–115
  12. ^ a b Baron 2000, pp. 162–64
  13. ^ Kida 2006, pp. 162–65
  14. ^ Devine, Patricia G.; Hirt, Edward R.; Gehrke, Elizabeth M. (1990), "Diagnostic and confirmation strategies in trait hypothesis testing", Journal of Personality and Social Psychology, 58 (6): 952–963, doi:10.1037/0022-3514.58.6.952, ISSN 1939-1315
  15. ^ Trope, Yaacov; Bassok, Miriam (1982), "Confirmatory and diagnosing strategies in social information gathering", Journal of Personality and Social Psychology, 43 (1): 22–34, doi:10.1037/0022-3514.43.1.22, ISSN 1939-1315
  16. ^ a b c d Klayman, Joshua; Ha, Young-Won (1987), "Confirmation, disconfirmation and information in hypothesis testing" (PDF), Psychological Review, 94 (2): 211–228, CiteSeerX 10.1.1.174.5232, doi:10.1037/0033-295X.94.2.211, ISSN 0033-295X, S2CID 10853196, archived (PDF) from the original on 1 October 2011, retrieved 14 August 2009
  17. ^ Kunda, Ziva; Fong, G.T.; Sanitoso, R.; Reber, E. (1993), "Directional questions direct self-conceptions", Journal of Experimental Social Psychology, 29: 62–63, doi:10.1006/jesp.1993.1004, ISSN 0022-1031 via Fine 2006, pp. 63–65
  18. ^ a b Shafir, E. (1993), "Choosing versus rejecting: why some options are both better and worse than others", Memory and Cognition, 21 (4): 546–556, doi:10.3758/bf03197186, PMID 8350746 via Fine 2006, pp. 63–65
  19. ^ Snyder, Mark; Swann, William B. Jr. (1978), "Hypothesis-testing processes in social interaction", Journal of Personality and Social Psychology, 36 (11): 1202–1212, doi:10.1037/0022-3514.36.11.1202 via Poletiek 2001, p. 131
  20. ^ a b Kunda 1999, pp. 117–18
  21. ^ a b Albarracin, D.; Mitchell, A.L. (2004), "The role of defensive confidence in preference for proattitudinal information: How believing that one is strong can sometimes be a defensive weakness", Personality and Social Psychology Bulletin, 30 (12): 1565–1584, doi:10.1177/0146167204271180, PMC 4803283, PMID 15536240
  22. ^ Fischer, P.; Fischer, Julia K.; Aydin, Nilüfer; Frey, Dieter (2010), "Physically attractive social information sources lead to increased selective exposure to information", Basic and Applied Social Psychology, 32 (4): 340–347, doi:10.1080/01973533.2010.519208, S2CID 143133082
  23. ^ Dahlgren, Peter M. (2020), Media Echo Chambers: Selective Exposure and Confirmation Bias in Media Use, and its Consequences for Political Polarization, Gothenburg: University of Gothenburg, ISBN 978-91-88212-95-5, archived from the original on 6 April 2023, retrieved 16 October 2021
  24. ^ a b c d Stanovich, K.E.; West, R.F.; Toplak, M.E. (2013), "Myside bias, rational thinking, and intelligence", Current Directions in Psychological Science, 22 (4): 259–264, doi:10.1177/0963721413480174, S2CID 14505370
  25. ^ a b Mynatt, Clifford R.; Doherty, Michael E.; Tweney, Ryan D. (1978), "Consequences of confirmation and disconfirmation in a simulated research environment", Quarterly Journal of Experimental Psychology, 30 (3): 395–406, doi:10.1080/00335557843000007, S2CID 145419628
  26. ^ Kida 2006, p. 157
  27. ^ a b c d e f Lord, Charles G.; Ross, Lee; Lepper, Mark R. (1979), "Biased assimilation and attitude polarization: The effects of prior theories on subsequently considered evidence", Journal of Personality and Social Psychology, 37 (11): 2098–2109, CiteSeerX 10.1.1.372.1743, doi:10.1037/0022-3514.37.11.2098, ISSN 0022-3514, S2CID 7465318
  28. ^ a b Baron 2000, pp. 201–202
  29. ^ Vyse 1997, p. 122
  30. ^ a b c d Taber, Charles S.; Lodge, Milton (July 2006), "Motivated skepticism in the evaluation of political beliefs", American Journal of Political Science, 50 (3): 755–769, CiteSeerX 10.1.1.472.7064, doi:10.1111/j.1540-5907.2006.00214.x, ISSN 0092-5853, S2CID 3770487
  31. ^ a b c Westen, Drew; Blagov, Pavel S.; Harenski, Keith; Kilts, Clint; Hamann, Stephan (2006), "Neural bases of motivated reasoning: An fMRI study of emotional constraints on partisan political judgment in the 2004 U.S. Presidential election", Journal of Cognitive Neuroscience, 18 (11): 1947–1958, CiteSeerX 10.1.1.578.8097, doi:10.1162/jocn.2006.18.11.1947, PMID 17069484, S2CID 8625992
  32. ^ Gadenne, V.; Oswald, M. (1986), "Entstehung und Veränderung von Bestätigungstendenzen beim Testen von Hypothesen [Formation and alteration of confirmatory tendencies during the testing of hypotheses]", Zeitschrift für Experimentelle und Angewandte Psychologie, 33: 360–374 via Oswald & Grosjean 2004, p. 89
  33. ^ Hastie, Reid; Park, Bernadette (2005), "The relationship between memory and judgment depends on whether the judgment task is memory-based or on-line", in Hamilton, David L. (ed.), Social cognition: key readings, New York: Psychology Press, p. 394, ISBN 978-0-86377-591-8, OCLC 55078722
  34. ^ a b c Oswald & Grosjean 2004, pp. 88–89
  35. ^ Stangor, Charles; McMillan, David (1992), "Memory for expectancy-congruent and expectancy-incongruent information: A review of the social and social developmental literatures", Psychological Bulletin, 111 (1): 42–61, doi:10.1037/0033-2909.111.1.42
  36. ^ a b Snyder, M.; Cantor, N. (1979), "Testing hypotheses about other people: the use of historical knowledge", Journal of Experimental Social Psychology, 15 (4): 330–342, doi:10.1016/0022-1031(79)90042-8 via Goldacre 2008, p. 231
  37. ^ Kunda 1999, pp. 225–232
  38. ^ Sanitioso, Rasyid; Kunda, Ziva; Fong, G.T. (1990), "Motivated recruitment of autobiographical memories", Journal of Personality and Social Psychology, 59 (2): 229–241, doi:10.1037/0022-3514.59.2.229, ISSN 0022-3514, PMID 2213492
  39. ^ a b c Levine, L.; Prohaska, V.; Burgess, S.L.; Rice, J.A.; Laulhere, T.M. (2001), "Remembering past emotions: The role of current appraisals", Cognition and Emotion, 15 (4): 393–417, doi:10.1080/02699930125955, S2CID 22743423
  40. ^ a b Safer, M.A.; Bonanno, G.A.; Field, N. (2001), "It was never that bad: Biased recall of grief and long-term adjustment to the death of a spouse", Memory, 9 (3): 195–203, doi:10.1080/09658210143000065, PMID 11469313, S2CID 24729233
  41. ^ a b Russell, Dan; Jones, Warren H. (1980), "When superstition fails: Reactions to disconfirmation of paranormal beliefs", Personality and Social Psychology Bulletin, 6 (1): 83–88, doi:10.1177/014616728061012, ISSN 1552-7433, S2CID 145060971 via Vyse 1997, p. 121
  42. ^ Baron, Jonathan (1995), "Myside bias in thinking about abortion.", Thinking & Reasoning, 1 (3): 221–235, CiteSeerX 10.1.1.112.1603, doi:10.1080/13546789508256909
  43. ^ a b c d Wolfe, Christopher; Anne Britt (2008), "The locus of the myside bias in written argumentation" (PDF), Thinking & Reasoning, 14: 1–27, doi:10.1080/13546780701527674, S2CID 40527220, archived (PDF) from the original on 4 March 2016, retrieved 11 November 2014
  44. ^ Mason, Lucia; Scirica, Fabio (October 2006), "Prediction of students' argumentation skills about controversial topics by epistemological understanding", Learning and Instruction, 16 (5): 492–509, doi:10.1016/j.learninstruc.2006.09.007
  45. ^ Weinstock, Michael (2009), "Relative expertise in an everyday reasoning task: Epistemic understanding, problem representation, and reasoning competence", Learning and Individual Differences, 19 (4): 423–434, doi:10.1016/j.lindif.2009.03.003
  46. ^ Weinstock, Michael; Neuman, Yair; Tabak, Iris (2004), "Missing the point or missing the norms? Epistemological norms as predictors of students' ability to identify fallacious arguments", Contemporary Educational Psychology, 29 (1): 77–94, doi:10.1016/S0361-476X(03)00024-9
  47. ^ Thucydides 4.108.4.
  48. ^ Alighieri, Dante. Paradiso canto XIII: 118–120. Trans. Allen Mandelbaum.
  49. ^ Ibn Khaldun (1958), The Muqaddimah, Princeton, NJ: Princeton University Press, p. 71.
  50. ^ a b Baron 2000, pp. 195–196.
  51. ^ a b Bacon, Francis (1620). Novum Organum. reprinted in Burtt, E. A., ed. (1939), The English philosophers from Bacon to Mill, New York: Random House, p. 36 via Nickerson 1998, p. 176.
  52. ^ Schopenhauer, Arthur (2011) [1844], Carus, David; Aquila, Richard E. (eds.), The World as Will and Presentation, vol. 2, New York: Routledge, p. 246.
  53. ^ Tolstoy, Leo (1896). What Is Art? ch. 14, p. 143. Archived 17 August 2021 at the Wayback Machine. Translated from Russian by Aylmer Maude, New York, 1904. Project Gutenberg edition (archived 7 August 2021 at the Wayback Machine) released 23 March 2021. Retrieved 17 August 2021.
  54. ^ Tolstoy, Leo (1894). The Kingdom of God Is Within You, p. 49. Archived 17 August 2021 at the Wayback Machine. Translated from Russian by Constance Garnett, New York, 1894. Project Gutenberg edition (archived 17 August 2021 at the Wayback Machine) released 26 July 2013. Retrieved 17 August 2021.
  55. ^ Wason 1960
  56. ^ Lewicka 1998, p. 238
  57. ^ Poletiek 2001, p. 73.
  58. ^ Oswald & Grosjean 2004, pp. 79–96
  59. ^ Wason, Peter C. (1968), "Reasoning about a rule", Quarterly Journal of Experimental Psychology, 20 (3): 273–278, doi:10.1080/14640746808400161, ISSN 1747-0226, PMID 5683766, S2CID 1212273
  60. ^ a b c Sutherland, Stuart (2007), Irrationality (2nd ed.), London: Pinter and Martin, pp. 95–103, ISBN 978-1-905177-07-3, OCLC 72151566
  61. ^ Barkow, Jerome H.; Cosmides, Leda; Tooby, John (1995), The adapted mind: evolutionary psychology and the generation of culture, Oxford University Press US, pp. 181–184, ISBN 978-0-19-510107-2, OCLC 33832963
  62. ^ Oswald & Grosjean 2004, pp. 81–82, 86–87
  63. ^ Plous 1993, p. 233
  64. ^ Lewicka 1998, p. 239
  65. ^ Tweney, Ryan D.; Doherty, Michael E. (1980), "Strategies of rule discovery in an inference task", The Quarterly Journal of Experimental Psychology, 32 (1): 109–123, doi:10.1080/00335558008248237, ISSN 1747-0226, S2CID 143148831 (Experiment IV)
  66. ^ Oswald & Grosjean 2004, pp. 86–89
  67. ^ MacCoun 1998
  68. ^ Friedrich 1993, p. 298
  69. ^ Kunda 1999, p. 94
  70. ^ Baron 2000, p. 206
  71. ^ Matlin, Margaret W. (2004), "Pollyanna Principle", in Pohl, Rüdiger F. (ed.), Cognitive illusions: A handbook on fallacies and biases in thinking, judgement and memory, Hove, UK: Psychology Press, pp. 255–272, ISBN 978-1-84169-351-4, OCLC 55124398
  72. ^ Dawson, Erica; Gilovich, Thomas; Regan, Dennis T. (October 2002), "Motivated reasoning and performance on the Wason Selection Task", Personality and Social Psychology Bulletin, 28 (10): 1379–1387, doi:10.1177/014616702236869, S2CID 143957893
  73. ^ Ditto, Peter H.; Lopez, David F. (1992), "Motivated skepticism: Use of differential decision criteria for preferred and nonpreferred conclusions", Journal of Personality and Social Psychology, 63 (4): 568–584, doi:10.1037/0022-3514.63.4.568, ISSN 0022-3514
  74. ^ Oswald & Grosjean 2004, pp. 91–93
  75. ^ Friedrich 1993, pp. 299, 316–317
  76. ^ Trope, Y.; Liberman, A. (1996), "Social hypothesis testing: Cognitive and motivational mechanisms", in Higgins, E. Tory; Kruglanski, Arie W. (eds.), Social psychology: Handbook of basic principles, New York: Guilford Press, ISBN 978-1-57230-100-9, OCLC 34731629 via Oswald & Grosjean 2004, pp. 91–93
  77. ^ a b Dardenne, Benoit; Leyens, Jacques-Philippe (1995), "Confirmation bias as a social skill" (PDF), Personality and Social Psychology Bulletin, 21 (11): 1229–1239, doi:10.1177/01461672952111011, ISSN 1552-7433, S2CID 146709087, archived (PDF) from the original on 9 September 2020, retrieved 25 September 2019
  78. ^ Shanteau, James (2003), Sandra L. Schneider (ed.), Emerging perspectives on judgment and decision research, Cambridge [u. a.]: Cambridge University Press, p. 445, ISBN 978-0-521-52718-7
  79. ^ Haidt, Jonathan (2013), The righteous mind: Why good people are divided by politics and religion, London: Penguin Books, pp. 87–88, ISBN 978-0-14-103916-9
  80. ^ Fiske, Susan T.; Gilbert, Daniel T.; Lindzey, Gardner, eds. (2010), The handbook of social psychology (5th ed.), Hoboken, NJ: Wiley, p. 811, ISBN 978-0-470-13749-9
  81. ^ American Psychological Association (2018), "Why we're susceptible to fake news – and how to defend against it", Skeptical Inquirer, 42 (6): 8–9
  82. ^ a b Pariser, Eli (2 May 2011), "Ted talk: Beware online "filter bubbles"", TED: Ideas Worth Spreading, archived from the original on 22 September 2017, retrieved 1 October 2017
  83. ^ Self, Will (28 November 2016), "Forget fake news on Facebook – the real filter bubble is you", New Statesman, archived from the original on 11 November 2017, retrieved 24 October 2017
  84. ^ Pariser, Eli (7 May 2015), "Did Facebook's big study kill my filter bubble thesis?", Wired, archived from the original on 11 November 2017, retrieved 24 October 2017
  85. ^ Kenrick, Douglas T.; Cohen, Adam B.; Neuberg, Steven L.; Cialdini, Robert B. (2020), "The science of anti-science thinking", Scientific American, 29 (4, Fall, Special Issue): 84–89
  86. ^ Weinmann, Markus; Schneider, Christoph; vom Brocke, Jan (2015), "Digital nudging", SSRN, Rochester, NY, doi:10.2139/ssrn.2708250, S2CID 219380211, SSRN 2708250
  87. ^ Thornhill, Calum; Meeus, Quentin; Peperkamp, Jeroen; Berendt, Bettina (2019), "A digital nudge to counter confirmation bias", Frontiers in Big Data, 2: 11, doi:10.3389/fdata.2019.00011, ISSN 2624-909X, PMC 7931917, PMID 33693334
  88. ^ Mahoney, Michael J.; DeMonbreun, B.G. (1977), "Psychology of the scientist: An analysis of problem-solving bias", Cognitive Therapy and Research, 1 (3): 229–238, doi:10.1007/BF01186796, S2CID 9703186
  89. ^ Mitroff, I. I. (1974), "Norms and counter-norms in a select group of the Apollo moon scientists: A case study of the ambivalence of scientists", American Sociological Review, 39 (4): 579–595, doi:10.2307/2094423, JSTOR 2094423
  90. ^ a b Koehler 1993
  91. ^ a b c Mahoney 1977
  92. ^ Letrud, Kåre; Hernes, Sigbjørn (2019), "Affirmative citation bias in scientific myth debunking: A three-in-one case study", PLOS ONE, 14 (9): e0222213, Bibcode:2019PLoSO..1422213L, doi:10.1371/journal.pone.0222213, PMC 6733478, PMID 31498834
  93. ^ Ball, Phillip (14 May 2015), "The trouble with scientists", Nautilus, archived from the original on 7 October 2019, retrieved 6 October 2019
  94. ^ Sternberg, Robert J. (2007), "Critical thinking in psychology: It really is critical", in Sternberg, Robert J.; Roediger III, Henry L.; Halpern, Diane F. (eds.), Critical thinking in psychology, Cambridge University Press, p. 292, ISBN 978-0-521-60834-3, OCLC 69423179, Some of the worst examples of confirmation bias are in research on parapsychology ... Arguably, there is a whole field here with no powerful confirming data at all. But people want to believe, and so they find ways to believe.
  95. ^ a b Shadish, William R. (2007), "Critical thinking in quasi-experimentation", in Sternberg, Robert J.; Roediger III, Henry L.; Halpern, Diane F. (eds.), Critical Thinking in Psychology, Cambridge University Press, p. 49, ISBN 978-0-521-60834-3
  96. ^ Jüni, P.; Altman, D.G.; Egger, M. (2001), "Systematic reviews in health care: Assessing the quality of controlled clinical trials", BMJ (Clinical Research Ed.), 323 (7303): 42–46, doi:10.1136/bmj.323.7303.42, PMC 1120670, PMID 11440947
  97. ^ Lee, C.J.; Sugimoto, C.R.; Zhang, G.; Cronin, B. (2013), "Bias in peer review", Journal of the Association for Information Science and Technology, 64: 2–17, doi:10.1002/asi.22784
  98. ^ Shermer, Michael (July 2006), "The political brain: A recent brain-imaging study shows that our political predilections are a product of unconscious confirmation bias", Scientific American, 295 (1): 36, Bibcode:2006SciAm.295a..36S, doi:10.1038/scientificamerican0706-36, ISSN 0036-8733, PMID 16830675
  99. ^ Emerson, G.B.; Warme, W.J.; Wolf, F.M.; Heckman, J.D.; Brand, R.A.; Leopold, S.S. (2010), "Testing for the presence of positive-outcome bias in peer review: A randomized controlled trial", Archives of Internal Medicine, 170 (21): 1934–1939, doi:10.1001/archinternmed.2010.406, PMID 21098355
  100. ^ Bartlett, Steven James, "The psychology of abuse in publishing: Peer review and editorial bias," Chap. 7, pp. 147–177, in Steven James Bartlett, Normality does not equal mental health: The need to look elsewhere for standards of good psychological health. Santa Barbara, CA: Praeger, 2011.
  101. ^ Horrobin, David F. (1990), "The philosophical basis of peer review and the suppression of innovation", Journal of the American Medical Association, 263 (10): 1438–1441, doi:10.1001/jama.263.10.1438, PMID 2304222
  102. ^ Pompian, Michael M. (2006), Behavioral finance and wealth management: how to build optimal portfolios that account for investor biases, John Wiley and Sons, pp. 187–190, ISBN 978-0-471-74517-4, OCLC 61864118
  103. ^ Hilton, Denis J. (2001), "The psychology of financial decision-making: Applications to trading, dealing, and investment analysis", Journal of Behavioral Finance, 2 (1): 37–39, doi:10.1207/S15327760JPFM0201_4, ISSN 1542-7579, S2CID 153379653
  104. ^ Krueger, David; Mann, John David (2009), The secret language of money: How to make smarter financial decisions and live a richer life, McGraw Hill Professional, pp. 112–113, ISBN 978-0-07-162339-1, OCLC 277205993
  105. ^ Groopman, Jerome (2007), How doctors think, Melbourne: Scribe Publications, pp. 64–66, ISBN 978-1-921215-69-8
  106. ^ Croskerry, Pat (2002), "Achieving quality in clinical decision making: Cognitive strategies and detection of bias", Academic Emergency Medicine, 9 (11): 1184–1204, doi:10.1197/aemj.9.11.1184, PMID 12414468.
  107. ^ Pang, Dominic; Bleetman, Anthony; Bleetman, David; Wynne, Max (2 June 2017), "The foreign body that never was: the effects of confirmation bias", British Journal of Hospital Medicine, 78 (6): 350–351, doi:10.12968/hmed.2017.78.6.350, PMID 28614014
  108. ^ Goldacre 2008, p. 233
  109. ^ Singh, Simon; Ernst, Edzard (2008), Trick or treatment?: Alternative medicine on trial, London: Bantam, pp. 287–288, ISBN 978-0-593-06129-9
  110. ^ Atwood, Kimball (2004), "Naturopathy, pseudoscience, and medicine: Myths and fallacies vs truth", Medscape General Medicine, 6 (1): 33, PMC 1140750, PMID 15208545
  111. ^ Neenan, Michael; Dryden, Windy (2004), Cognitive therapy: 100 key points and techniques, Psychology Press, p. ix, ISBN 978-1-58391-858-6, OCLC 474568621
  112. ^ Blackburn, Ivy-Marie; Davidson, Kate M. (1995), Cognitive therapy for depression & anxiety: a practitioner's guide (2 ed.), Wiley-Blackwell, p. 19, ISBN 978-0-632-03986-9, OCLC 32699443
  113. ^ Harvey, Allison G.; Watkins, Edward; Mansell, Warren (2004), Cognitive behavioural processes across psychological disorders: a transdiagnostic approach to research and treatment, Oxford University Press, pp. 172–173, 176, ISBN 978-0-19-852888-3, OCLC 602015097
  114. ^ Myers, D.G.; Lamm, H. (1976), "The group polarization phenomenon", Psychological Bulletin, 83 (4): 602–627, doi:10.1037/0033-2909.83.4.602 via Nickerson 1998, pp. 193–194
  115. ^ Halpern, Diane F. (1987), Critical thinking across the curriculum: A brief edition of thought and knowledge, Lawrence Erlbaum Associates, p. 194, ISBN 978-0-8058-2731-6, OCLC 37180929
  116. ^ Roach, Kent (2010), "Wrongful convictions: Adversarial and inquisitorial themes", North Carolina Journal of International Law and Commercial Regulation, 35: 387–446, SSRN 1619124, Quote: Both adversarial and inquisitorial systems seem subject to the dangers of tunnel vision or confirmation bias.
  117. ^ Baron 2000, pp. 191, 195
  118. ^ Kida 2006, p. 155
  119. ^ Tetlock, Philip E. (2005), Expert political judgment: How good is it? How can we know?, Princeton, NJ: Princeton University Press, pp. 125–128, ISBN 978-0-691-12302-8, OCLC 56825108
  120. ^ O'Brien, B. (2009), "Prime suspect: An examination of factors that aggravate and counteract confirmation bias in criminal investigations", Psychology, Public Policy, and Law, 15 (4): 315–334, doi:10.1037/a0017881
  121. ^ a b Swann, William B.; Pelham, Brett W.; Krull, Douglas S. (1989), "Agreeable fancy or disagreeable truth? Reconciling self-enhancement and self-verification", Journal of Personality and Social Psychology, 57 (5): 782–791, doi:10.1037/0022-3514.57.5.782, ISSN 0022-3514, PMID 2810025
  122. ^ a b Swann, William B.; Read, Stephen J. (1981), "Self-verification processes: How we sustain our self-conceptions", Journal of Experimental Social Psychology, 17 (4): 351–372, doi:10.1016/0022-1031(81)90043-3, ISSN 0022-1031
  123. ^ Story, Amber L. (1998), "Self-esteem and memory for favorable and unfavorable personality feedback", Personality and Social Psychology Bulletin, 24 (1): 51–64, doi:10.1177/0146167298241004, ISSN 1552-7433, S2CID 144945319
  124. ^ White, Michael J.; Brockett, Daniel R.; Overstreet, Belinda G. (1993), "Confirmatory bias in evaluating personality test information: Am I really that kind of person?", Journal of Counseling Psychology, 40 (1): 120–126, doi:10.1037/0022-0167.40.1.120, ISSN 0022-0167
  125. ^ Swann, William B.; Read, Stephen J. (1981), "Acquiring self-knowledge: The search for feedback that fits", Journal of Personality and Social Psychology, 41 (6): 1119–1128, CiteSeerX 10.1.1.537.2324, doi:10.1037/0022-3514.41.6.1119, ISSN 0022-3514
  126. ^ Shrauger, J. Sidney; Lund, Adrian K. (1975), "Self-evaluation and reactions to evaluations from others", Journal of Personality, 43 (1): 94–108, doi:10.1111/j.1467-6494.1975.tb00574.x, PMID 1142062
  127. ^ Lidén, Moa (2018). "3.2.4.1" (PDF). Confirmation bias in criminal cases (Thesis). Department of Law, Uppsala University. Archived (PDF) from the original on 20 February 2020. Retrieved 20 February 2020.
  128. ^ Trevor-Roper, H.R. (1969). The European witch-craze of the sixteenth and seventeenth centuries and other essays. London: HarperCollins. [ISBN missing]
  129. ^ Chrisler, Mark (24 September 2019), "The constant: A history of getting things wrong", constantpodcast.com (Podcast), archived from the original on 20 February 2020, retrieved 19 February 2020
  130. ^ a b Smith, Jonathan C. (2009), Pseudoscience and extraordinary claims of the paranormal: A critical thinker's toolkit, London: Wiley-Blackwell, pp. 149–151, ISBN 978-1-4051-8122-8, OCLC 319499491
  131. ^ Randi, James (1991), James Randi: Psychic investigator, London: Boxtree, pp. 58–62, ISBN 978-1-85283-144-8, OCLC 26359284
  132. ^ Agarwal, Pragya (19 October 2018), "Here is how bias can affect recruitment in your organization", Forbes, archived from the original on 31 July 2019, retrieved 31 July 2019
  133. ^ a b c Kuhn, Deanna; Lao, Joseph (March 1996), "Effects of evidence on attitudes: Is polarization the norm?", Psychological Science, 7 (2): 115–120, doi:10.1111/j.1467-9280.1996.tb00340.x, S2CID 145659040
  134. ^ Baron 2000, p. 201
  135. ^ Miller, A.G.; McHoskey, J.W.; Bane, C.M.; Dowd, T.G. (1993), "The attitude polarization phenomenon: Role of response measure, attitude extremity, and behavioral consequences of reported attitude change", Journal of Personality and Social Psychology, 64 (4): 561–574, doi:10.1037/0022-3514.64.4.561, S2CID 14102789
  136. ^ "Backfire effect", The Skeptic's Dictionary, from the original on 6 February 2017, retrieved 26 April 2012
  137. ^ Silverman, Craig (17 June 2011), "The backfire effect", Columbia Journalism Review, archived from the original on 25 April 2012, retrieved 1 May 2012, When your deepest convictions are challenged by contradictory evidence, your beliefs get stronger.
  138. ^ Nyhan, B. & Reifler, J. (2010). "When corrections fail: The persistence of political misperceptions". Political Behavior, 32, 303–320
  139. ^ "Facts matter after all: rejecting the "backfire effect"". Oxford Education Blog. 12 March 2018. from the original on 23 October 2018. Retrieved 23 October 2018.
  140. ^ Wood, Thomas; Porter, Ethan (2019), "The elusive backfire effect: Mass attitudes' steadfast factual adherence", Political Behavior, 41: 135–163, doi:10.2139/ssrn.2819073, ISSN 1556-5068
  141. ^ "Fact-checking doesn't 'backfire,' new study suggests", Poynter, 2 November 2016, from the original on 24 October 2018, retrieved 23 October 2018
  142. ^ a b c d Tversky, Amos; Kahneman, Daniel (1974), "Judgment under uncertainty: Heuristics and biases", Science, 185 (4157): 1124–1131, Bibcode:1974Sci...185.1124T, doi:10.1126/science.185.4157.1124, PMID 17835457, S2CID 143452957.
    Ross, Lee; Anderson, Craig A. (1982), "Shortcomings in the attribution process: On the origins and maintenance of erroneous social assessments", in Kahneman, Daniel; Slovic, Paul; Tversky, Amos (eds.), Judgment under uncertainty: Heuristics and biases, Cambridge University Press, ISBN 978-0-521-28414-1, OCLC 7578020
  143. ^ Festinger, Leon (1956), When prophecy fails: A social and psychological study of a modern group that predicted the destruction of the world, New York: Harper Torchbooks.
  144. ^ Kunda 1999, p. 99
  145. ^ Ross, Lee; Lepper, Mark R.; Hubbard, Michael (1975), "Perseverance in self-perception and social perception: Biased attributional processes in the debriefing paradigm", Journal of Personality and Social Psychology, 32 (5): 880–892, doi:10.1037/0022-3514.32.5.880, ISSN 0022-3514, PMID 1185517 via Kunda 1999, p. 99
  146. ^ a b c Anderson, Craig A.; Lepper, Mark R.; Ross, Lee (1980), "Perseverance of social theories: The role of explanation in the persistence of discredited information", Journal of Personality and Social Psychology, 39 (6): 1037–1049, CiteSeerX 10.1.1.130.933, doi:10.1037/h0077720, ISSN 0022-3514
  147. ^ Cacciatore, Michael A. (9 April 2021), "Misinformation and public opinion of science and health: Approaches, findings, and future directions", Proceedings of the National Academy of Sciences, 118 (15): e1912437117, Bibcode:2021PNAS..11812437C, doi:10.1073/pnas.1912437117, ISSN 0027-8424, PMC 8053916, PMID 33837143, p. 4: The CIE refers to the tendency for information that is initially presented as true, but later revealed to be false, to continue to affect memory and reasoning
  148. ^ a b c d e Baron 2000, pp. 197–200
  149. ^ a b c Fine 2006, pp. 66–70
  150. ^ a b Plous 1993, pp. 164–166
  151. ^ Redelmeier, D.A.; Tversky, Amos (1996), "On the belief that arthritis pain is related to the weather", Proceedings of the National Academy of Sciences, 93 (7): 2895–2896, Bibcode:1996PNAS...93.2895R, doi:10.1073/pnas.93.7.2895, PMC 39730, PMID 8610138 via Kunda 1999, p. 127
  152. ^ a b c Kunda 1999, pp. 127–130
  153. ^ Plous 1993, pp. 162–164

Sources

  • Baron, Jonathan (2000), Thinking and deciding (3rd ed.), New York: Cambridge University Press, ISBN 978-0-521-65030-4, OCLC 316403966
  • Fine, Cordelia (2006), A Mind of its Own: how your brain distorts and deceives, Cambridge, UK: Icon Books, ISBN 978-1-84046-678-2, OCLC 60668289
  • Friedrich, James (1993), "Primary error detection and minimization (PEDMIN) strategies in social cognition: a reinterpretation of confirmation bias phenomena", Psychological Review, 100 (2): 298–319, doi:10.1037/0033-295X.100.2.298, ISSN 0033-295X, PMID 8483985
  • Goldacre, Ben (2008), Bad science, London: Fourth Estate, ISBN 978-0-00-724019-7, OCLC 259713114
  • Hergovich, Andreas; Schott, Reinhard; Burger, Christoph (2010), "Biased evaluation of abstracts depending on topic and conclusion: Further evidence of a confirmation bias within scientific psychology", Current Psychology, 29 (3): 188–209, doi:10.1007/s12144-010-9087-5, S2CID 145497196
  • Kida, Thomas E. (2006), Don't believe everything you think: The 6 basic mistakes we make in thinking, Amherst, NY: Prometheus Books, ISBN 978-1-59102-408-8, OCLC 63297791
  • Koehler, Jonathan J. (1993), "The influence of prior beliefs on scientific judgments of evidence quality", Organizational Behavior and Human Decision Processes, 56: 28–55, doi:10.1006/obhd.1993.1044
  • Kunda, Ziva (1999), Social cognition: Making sense of people, MIT Press, ISBN 978-0-262-61143-5, OCLC 40618974
  • Lewicka, Maria (1998), "Confirmation bias: Cognitive error or adaptive strategy of action control?", in Kofta, Mirosław; Weary, Gifford; Sedek, Grzegorz (eds.), Personal control in action: Cognitive and motivational mechanisms, Springer, pp. 233–255, ISBN 978-0-306-45720-3, OCLC 39002877
  • MacCoun, Robert J. (1998), "Biases in the interpretation and use of research results" (PDF), Annual Review of Psychology, 49: 259–287, doi:10.1146/annurev.psych.49.1.259, PMID 15012470, archived (PDF) from the original on 9 August 2017, retrieved 10 October 2010
  • Mahoney, Michael J. (1977), "Publication prejudices: An experimental study of confirmatory bias in the peer review system", Cognitive Therapy and Research, 1 (2): 161–175, doi:10.1007/BF01173636, S2CID 7350256
  • Nickerson, Raymond S. (1998), "Confirmation bias: A ubiquitous phenomenon in many guises", Review of General Psychology, 2 (2): 175–220, doi:10.1037/1089-2680.2.2.175, S2CID 8508954
  • Oswald, Margit E.; Grosjean, Stefan (2004), "Confirmation bias", in Pohl, Rüdiger F. (ed.), Cognitive illusions: A handbook on fallacies and biases in thinking, judgement and memory, Hove, UK: Psychology Press, pp. 79–96, ISBN 978-1-84169-351-4, OCLC 55124398
  • Plous, Scott (1993), The psychology of judgment and decision making, McGraw-Hill, ISBN 978-0-07-050477-6, OCLC 26931106
  • Poletiek, Fenna (2001), Hypothesis-testing behaviour, Hove, UK: Psychology Press, ISBN 978-1-84169-159-6, OCLC 44683470
  • Risen, Jane; Gilovich, Thomas (2007), "Informal logical fallacies", in Sternberg, Robert J.; Roediger III, Henry L.; Halpern, Diane F. (eds.), Critical thinking in psychology, Cambridge University Press, pp. 110–130, ISBN 978-0-521-60834-3, OCLC 69423179
  • Vyse, Stuart A. (1997), Believing in magic: The psychology of superstition, New York: Oxford University Press, ISBN 978-0-19-513634-0, OCLC 35025826
  • Wason, Peter C. (1960), "On the failure to eliminate hypotheses in a conceptual task", Quarterly Journal of Experimental Psychology, 12 (3): 129–140, doi:10.1080/17470216008416717, ISSN 1747-0226, S2CID 19237642

Further reading

External links

  • Skeptic's Dictionary: confirmation bias – Robert T. Carroll
  • Teaching about confirmation bias – class handout and instructor's notes by K.H. Grobman
  • Confirmation bias at You Are Not So Smart
  • Confirmation bias learning object – interactive number triples exercise by Rod McFarland for Simon Fraser University
  • Brief summary of the 1979 Stanford assimilation bias study – Keith Rollag, Babson College

confirmation, bias, also, confirmatory, bias, myside, bias, congeniality, bias, tendency, search, interpret, favor, recall, information, that, confirms, supports, prior, beliefs, values, people, display, this, bias, when, they, select, information, that, suppo. Confirmation bias also confirmatory bias myside bias a or congeniality bias 2 is the tendency to search for interpret favor and recall information in a way that confirms or supports one s prior beliefs or values 3 People display this bias when they select information that supports their views ignoring contrary information or when they interpret ambiguous evidence as supporting their existing attitudes The effect is strongest for desired outcomes for emotionally charged issues and for deeply entrenched beliefs Confirmation bias is insuperable for most people but they can manage it for example by education and training in critical thinking skills Biased search for information biased interpretation of this information and biased memory recall have been invoked to explain four specific effects attitude polarization when a disagreement becomes more extreme even though the different parties are exposed to the same evidence belief perseverance when beliefs persist after the evidence for them is shown to be false the irrational primacy effect a greater reliance on information encountered early in a series illusory correlation when people falsely perceive an association between two events or situations A series of psychological experiments in the 1960s suggested that people are biased toward confirming their existing beliefs Later work re interpreted these results as a tendency to test ideas in a one sided way focusing on one possibility and ignoring alternatives Explanations for the observed biases include wishful thinking and the limited human capacity to process information Another proposal is that people show confirmation bias because they are pragmatically assessing the costs of being wrong rather than investigating in a neutral scientific way Flawed decisions due to confirmation bias have been found in a wide range of political organizational financial and scientific contexts These biases contribute to overconfidence in personal beliefs and can maintain or strengthen beliefs in the face of contrary evidence For example confirmation bias produces systematic errors in scientific research based on inductive reasoning the gradual accumulation of supportive evidence Similarly a police detective may identify a suspect early in an investigation but then may only seek confirming rather than disconfirming evidence A medical practitioner may prematurely focus on a particular disorder early in a diagnostic session and then seek only confirming evidence In social media confirmation bias is amplified by the use of filter bubbles or algorithmic editing which display to individuals only information they are likely to agree with while excluding opposing views Contents 1 Definition and context 2 Types 2 1 Biased search for information 2 2 Biased interpretation of information 2 3 Biased memory recall of information 3 Individual differences 4 Discovery 4 1 Informal observations 4 2 Hypothesis testing falsification explanation Wason 4 3 Hypothesis testing positive test strategy explanation Klayman and Ha 5 Information processing explanations 5 1 Cognitive versus motivational 5 2 Cost benefit 5 3 Exploratory versus confirmatory 5 4 Make believe 6 Real world effects 6 1 Social media 6 2 Science and scientific research 6 3 Finance 6 4 Medicine and health 6 5 
Politics law and policing 6 6 Social psychology 6 7 Mass delusions 6 8 Paranormal beliefs 6 9 Recruitment and selection 7 Associated effects and outcomes 7 1 Polarization of opinion 7 2 Persistence of discredited beliefs 7 3 Preference for early information 7 4 Illusory association between events 8 See also 9 Notes 10 References 10 1 Citations 10 2 Sources 11 Further reading 12 External linksDefinition and context editConfirmation bias a phrase coined by English psychologist Peter Wason is the tendency of people to favor information that confirms or strengthens their beliefs or values and is difficult to dislodge once affirmed 4 Confirmation biases are effects in information processing They differ from what is sometimes called the behavioral confirmation effect commonly known as self fulfilling prophecy in which a person s expectations influence their own behavior bringing about the expected result 5 Some psychologists restrict the term confirmation bias to selective collection of evidence that supports what one already believes while ignoring or rejecting evidence that supports a different conclusion Others apply the term more broadly to the tendency to preserve one s existing beliefs when searching for evidence interpreting it or recalling it from memory 6 b Confirmation bias is a result of automatic unintentional strategies rather than deliberate deception 8 9 Types editBiased search for information edit nbsp Confirmation bias has been described as an internal yes man echoing back a person s beliefs like Charles Dickens s character Uriah Heep 10 Experiments have found repeatedly that people tend to test hypotheses in a one sided way by searching for evidence consistent with their current hypothesis 3 177 178 11 Rather than searching through all the relevant evidence they phrase questions to receive an affirmative answer that supports their theory 12 They look for the consequences that they would expect if their hypothesis was true rather than what would happen if it was false 12 For example someone using yes no questions to find a number they suspect to be the number 3 might ask Is it an odd number People prefer this type of question called a positive test even when a negative test such as Is it an even number would yield exactly the same information 13 However this does not mean that people seek tests that guarantee a positive answer In studies where subjects could select either such pseudo tests or genuinely diagnostic ones they favored the genuinely diagnostic 14 15 The preference for positive tests in itself is not a bias since positive tests can be highly informative 16 However in combination with other effects this strategy can confirm existing beliefs or assumptions independently of whether they are true 8 In real world situations evidence is often complex and mixed For example various contradictory ideas about someone could each be supported by concentrating on one aspect of his or her behavior 11 Thus any search for evidence in favor of a hypothesis is likely to succeed 8 One illustration of this is the way the phrasing of a question can significantly change the answer 11 For example people who are asked Are you happy with your social life report greater satisfaction than those asked Are you unhappy with your social life 17 Even a small change in a question s wording can affect how people search through available information and hence the conclusions they reach This was shown using a fictional child custody case 18 Participants read that Parent A was moderately suitable to be 
the guardian in multiple ways Parent B had a mix of salient positive and negative qualities a close relationship with the child but a job that would take them away for long periods of time When asked Which parent should have custody of the child the majority of participants chose Parent B looking mainly for positive attributes However when asked Which parent should be denied custody of the child they looked for negative attributes and the majority answered that Parent B should be denied custody implying that Parent A should have custody 18 Similar studies have demonstrated how people engage in a biased search for information but also that this phenomenon may be limited by a preference for genuine diagnostic tests In an initial experiment participants rated another person on the introversion extroversion personality dimension on the basis of an interview They chose the interview questions from a given list When the interviewee was introduced as an introvert the participants chose questions that presumed introversion such as What do you find unpleasant about noisy parties When the interviewee was described as extroverted almost all the questions presumed extroversion such as What would you do to liven up a dull party These loaded questions gave the interviewees little or no opportunity to falsify the hypothesis about them 19 A later version of the experiment gave the participants less presumptive questions to choose from such as Do you shy away from social interactions 20 Participants preferred to ask these more diagnostic questions showing only a weak bias towards positive tests This pattern of a main preference for diagnostic tests and a weaker preference for positive tests has been replicated in other studies 20 Personality traits influence and interact with biased search processes 21 Individuals vary in their abilities to defend their attitudes from external attacks in relation to selective exposure Selective exposure occurs when individuals search for information that is consistent rather than inconsistent with their personal beliefs 22 An experiment examined the extent to which individuals could refute arguments that contradicted their personal beliefs 21 People with high confidence levels more readily seek out contradictory information to their personal position to form an argument This can take the form of an oppositional news consumption where individuals seek opposing partisan news in order to counterargue 23 Individuals with low confidence levels do not seek out contradictory information and prefer information that supports their personal position People generate and evaluate evidence in arguments that are biased towards their own beliefs and opinions 24 Heightened confidence levels decrease preference for information that supports individuals personal beliefs Another experiment gave participants a complex rule discovery task that involved moving objects simulated by a computer 25 Objects on the computer screen followed specific laws which the participants had to figure out So participants could fire objects across the screen to test their hypotheses Despite making many attempts over a ten hour session none of the participants figured out the rules of the system They typically attempted to confirm rather than falsify their hypotheses and were reluctant to consider alternatives Even after seeing objective evidence that refuted their working hypotheses they frequently continued doing the same tests Some of the participants were taught proper hypothesis testing but these instructions 
had almost no effect 25 Biased interpretation of information edit Smart people believe weird things because they are skilled at defending beliefs they arrived at for non smart reasons Michael Shermer 26 Confirmation biases are not limited to the collection of evidence Even if two individuals have the same information the way they interpret it can be biased A team at Stanford University conducted an experiment involving participants who felt strongly about capital punishment with half in favor and half against it 27 28 Each participant read descriptions of two studies a comparison of U S states with and without the death penalty and a comparison of murder rates in a state before and after the introduction of the death penalty After reading a quick description of each study the participants were asked whether their opinions had changed Then they read a more detailed account of each study s procedure and had to rate whether the research was well conducted and convincing 27 In fact the studies were fictional Half the participants were told that one kind of study supported the deterrent effect and the other undermined it while for other participants the conclusions were swapped 27 28 The participants whether supporters or opponents reported shifting their attitudes slightly in the direction of the first study they read Once they read the more detailed descriptions of the two studies they almost all returned to their original belief regardless of the evidence provided pointing to details that supported their viewpoint and disregarding anything contrary Participants described studies supporting their pre existing view as superior to those that contradicted it in detailed and specific ways 27 29 Writing about a study that seemed to undermine the deterrence effect a death penalty proponent wrote The research didn t cover a long enough period of time while an opponent s comment on the same study said No strong evidence to contradict the researchers has been presented 27 The results illustrated that people set higher standards of evidence for hypotheses that go against their current expectations This effect known as disconfirmation bias has been supported by other experiments 30 Another study of biased interpretation occurred during the 2004 U S presidential election and involved participants who reported having strong feelings about the candidates They were shown apparently contradictory pairs of statements either from Republican candidate George W Bush Democratic candidate John Kerry or a politically neutral public figure They were also given further statements that made the apparent contradiction seem reasonable From these three pieces of information they had to decide whether each individual s statements were inconsistent 31 1948 There were strong differences in these evaluations with participants much more likely to interpret statements from the candidate they opposed as contradictory 31 1951 nbsp An MRI scanner allowed researchers to examine how the human brain deals with dissonant information In this experiment the participants made their judgments while in a magnetic resonance imaging MRI scanner which monitored their brain activity As participants evaluated contradictory statements by their favored candidate emotional centers of their brains were aroused This did not happen with the statements by the other figures The experimenters inferred that the different responses to the statements were not due to passive reasoning errors Instead the participants were actively reducing the cognitive 
dissonance induced by reading about their favored candidate s irrational or hypocritical behavior 31 1956 Biases in belief interpretation are persistent regardless of intelligence level Participants in an experiment took the SAT test a college admissions test used in the United States to assess their intelligence levels They then read information regarding safety concerns for vehicles and the experimenters manipulated the national origin of the car American participants provided their opinion if the car should be banned on a six point scale where one indicated definitely yes and six indicated definitely no Participants firstly evaluated if they would allow a dangerous German car on American streets and a dangerous American car on German streets Participants believed that the dangerous German car on American streets should be banned more quickly than the dangerous American car on German streets There was no difference among intelligence levels at the rate participants would ban a car 24 Biased interpretation is not restricted to emotionally significant topics In another experiment participants were told a story about a theft They had to rate the evidential importance of statements arguing either for or against a particular character being responsible When they hypothesized that character s guilt they rated statements supporting that hypothesis as more important than conflicting statements 32 Biased memory recall of information edit People may remember evidence selectively to reinforce their expectations even if they gather and interpret evidence in a neutral manner This effect is called selective recall confirmatory memory or access biased memory 33 Psychological theories differ in their predictions about selective recall Schema theory predicts that information matching prior expectations will be more easily stored and recalled than information that does not match 34 Some alternative approaches say that surprising information stands out and so is memorable 34 Predictions from both these theories have been confirmed in different experimental contexts with no theory winning outright 35 In one study participants read a profile of a woman which described a mix of introverted and extroverted behaviors 36 They later had to recall examples of her introversion and extroversion One group was told this was to assess the woman for a job as a librarian while a second group were told it was for a job in real estate sales There was a significant difference between what these two groups recalled with the librarian group recalling more examples of introversion and the sales groups recalling more extroverted behavior 36 A selective memory effect has also been shown in experiments that manipulate the desirability of personality types 34 37 In one of these a group of participants were shown evidence that extroverted people are more successful than introverts Another group were told the opposite In a subsequent apparently unrelated study participants were asked to recall events from their lives in which they had been either introverted or extroverted Each group of participants provided more memories connecting themselves with the more desirable personality type and recalled those memories more quickly 38 Changes in emotional states can also influence memory recall 39 40 Participants rated how they felt when they had first learned that O J Simpson had been acquitted of murder charges 39 They described their emotional reactions and confidence regarding the verdict one week two months and one year after the trial 
Results indicated that participants assessments for Simpson s guilt changed over time The more that participants opinion of the verdict had changed the less stable were the participant s memories regarding their initial emotional reactions When participants recalled their initial emotional reactions two months and a year later past appraisals closely resembled current appraisals of emotion People demonstrate sizable myside bias when discussing their opinions on controversial topics 24 Memory recall and construction of experiences undergo revision in relation to corresponding emotional states Myside bias has been shown to influence the accuracy of memory recall 40 In an experiment widows and widowers rated the intensity of their experienced grief six months and five years after the deaths of their spouses Participants noted a higher experience of grief at six months rather than at five years Yet when the participants were asked after five years how they had felt six months after the death of their significant other the intensity of grief participants recalled was highly correlated with their current level of grief Individuals appear to utilize their current emotional states to analyze how they must have felt when experiencing past events 39 Emotional memories are reconstructed by current emotional states One study showed how selective memory can maintain belief in extrasensory perception ESP 41 Believers and disbelievers were each shown descriptions of ESP experiments Half of each group were told that the experimental results supported the existence of ESP while the others were told they did not In a subsequent test participants recalled the material accurately apart from believers who had read the non supportive evidence This group remembered significantly less information and some of them incorrectly remembered the results as supporting ESP 41 Individual differences editMyside bias was once believed to be correlated with intelligence however studies have shown that myside bias can be more influenced by ability to rationally think as opposed to level of intelligence 24 Myside bias can cause an inability to effectively and logically evaluate the opposite side of an argument Studies have stated that myside bias is an absence of active open mindedness meaning the active search for why an initial idea may be wrong 42 Typically myside bias is operationalized in empirical studies as the quantity of evidence used in support of their side in comparison to the opposite side 43 A study has found individual differences in myside bias This study investigates individual differences that are acquired through learning in a cultural context and are mutable The researcher found important individual difference in argumentation Studies have suggested that individual differences such as deductive reasoning ability ability to overcome belief bias epistemological understanding and thinking disposition are significant predictors of the reasoning and generating arguments counterarguments and rebuttals 44 45 46 A study by Christopher Wolfe and Anne Britt also investigated how participants views of what makes a good argument can be a source of myside bias that influences the way a person formulates their own arguments 43 The study investigated individual differences of argumentation schema and asked participants to write essays The participants were randomly assigned to write essays either for or against their preferred side of an argument and were given research instructions that took either a balanced or an 
unrestricted approach The balanced research instructions directed participants to create a balanced argument i e that included both pros and cons the unrestricted research instructions included nothing on how to create the argument 43 Overall the results revealed that the balanced research instructions significantly increased the incidence of opposing information in arguments These data also reveal that personal belief is not a source of myside bias however that those participants who believe that a good argument is one that is based on facts are more likely to exhibit myside bias than other participants This evidence is consistent with the claims proposed in Baron s article that people s opinions about what makes good thinking can influence how arguments are generated 43 Discovery editInformal observations edit nbsp Francis BaconBefore psychological research on confirmation bias the phenomenon had been observed throughout history Beginning with the Greek historian Thucydides c 460 BC c 395 BC who wrote of misguided reason in The Peloponnesian War for it is a habit of mankind to entrust to careless hope what they long for and to use sovereign reason to thrust aside what they do not fancy 47 Italian poet Dante Alighieri 1265 1321 noted it in the Divine Comedy in which St Thomas Aquinas cautions Dante upon meeting in Paradise opinion hasty often can incline to the wrong side and then affection for one s own opinion binds confines the mind 48 Ibn Khaldun noticed the same effect in his Muqaddimah 49 Untruth naturally afflicts historical information There are various reasons that make this unavoidable One of them is partisanship for opinions and schools if the soul is infected with partisanship for a particular opinion or sect it accepts without a moment s hesitation the information that is agreeable to it Prejudice and partisanship obscure the critical faculty and preclude critical investigation The result is that falsehoods are accepted and transmitted In the Novum Organum English philosopher and scientist Francis Bacon 1561 1626 50 noted that biased assessment of evidence drove all superstitions whether in astrology dreams omens divine judgments or the like 51 He wrote 51 The human understanding when it has once adopted an opinion draws all things else to support and agree with it And though there be a greater number and weight of instances to be found on the other side yet these it either neglects or despises or else by some distinction sets aside or rejects In the second volume of his The World as Will and Representation 1844 German philosopher Arthur Schopenhauer observed that An adopted hypothesis gives us lynx eyes for everything that confirms it and makes us blind to everything that contradicts it 52 In his essay 1897 What Is Art Russian novelist Leo Tolstoy wrote 53 I know that most men not only those considered clever but even those who are very clever and capable of understanding most difficult scientific mathematical or philosophic problems can very seldom discern even the simplest and most obvious truth if it be such as to oblige them to admit the falsity of conclusions they have formed perhaps with much difficulty conclusions of which they are proud which they have taught to others and on which they have built their lives In his essay 1894 The Kingdom of God Is Within You Tolstoy had earlier written 54 The most difficult subjects can be explained to the most slow witted man if he has not formed any idea of them already but the simplest thing cannot be made clear to the most 
intelligent man if he is firmly persuaded that he knows already without a shadow of doubt what is laid before him Hypothesis testing falsification explanation Wason edit Main article Wason selection task In Peter Wason s initial experiment published in 1960 which does not mention the term confirmation bias he repeatedly challenged participants to identify a rule applying to triples of numbers They were told that 2 4 6 fits the rule They generated triples and the experimenter told them whether each triple conformed to the rule 3 179 The actual rule was simply any ascending sequence but participants had great difficulty in finding it often announcing rules that were far more specific such as the middle number is the average of the first and last 55 The participants seemed to test only positive examples triples that obeyed their hypothesized rule For example if they thought the rule was Each number is two greater than its predecessor they would offer a triple that fitted confirmed this rule such as 11 13 15 rather than a triple that violated falsified it such as 11 12 19 56 Wason interpreted his results as showing a preference for confirmation over falsification hence he coined the term confirmation bias c 58 Wason also used confirmation bias to explain the results of his selection task experiment 59 Participants repeatedly performed badly on various forms of this test in most cases ignoring information that could potentially refute falsify the specified rule 60 61 Hypothesis testing positive test strategy explanation Klayman and Ha edit Klayman and Ha s 1987 paper argues that the Wason experiments do not actually demonstrate a bias towards confirmation but instead a tendency to make tests consistent with the working hypothesis 16 62 They called this the positive test strategy 11 This strategy is an example of a heuristic a reasoning shortcut that is imperfect but easy to compute 63 Klayman and Ha used Bayesian probability and information theory as their standard of hypothesis testing rather than the falsificationism used by Wason According to these ideas each answer to a question yields a different amount of information which depends on the person s prior beliefs Thus a scientific test of a hypothesis is one that is expected to produce the most information Since the information content depends on initial probabilities a positive test can either be highly informative or uninformative Klayman and Ha argued that when people think about realistic problems they are looking for a specific answer with a small initial probability In this case positive tests are usually more informative than negative tests 16 However in Wason s rule discovery task the answer three numbers in ascending order is very broad so positive tests are unlikely to yield informative answers Klayman and Ha supported their analysis by citing an experiment that used the labels DAX and MED in place of fits the rule and doesn t fit the rule This avoided implying that the aim was to find a low probability rule Participants had much more success with this version of the experiment 64 65 nbsp If the true rule T encompasses the current hypothesis H then positive tests examining an H to see if it is T will not show that the hypothesis is false nbsp If the true rule T overlaps the current hypothesis H then either a negative test or a positive test can potentially falsify H nbsp When the working hypothesis H includes the true rule T then positive tests are the only way to falsify H In light of this and other critiques the focus of research 
Information processing explanations

There are currently three main information processing explanations of confirmation bias, plus a recent addition.

Cognitive versus motivational

(Image caption: Happy events are more likely to be remembered.)

According to Robert MacCoun, most biased evidence processing occurs through a combination of "cold" (cognitive) and "hot" (motivated) mechanisms.[67]

Cognitive explanations for confirmation bias are based on limitations in people's ability to handle complex tasks, and the shortcuts, called heuristics, that they use.[68] For example, people may judge the reliability of evidence by using the availability heuristic: how readily a particular idea comes to mind.[69] It is also possible that people can only focus on one thought at a time, so find it difficult to test alternative hypotheses in parallel.[3]: 198–199 Another heuristic is the positive test strategy identified by Klayman and Ha, in which people test a hypothesis by examining cases where they expect a property or event to occur. This heuristic avoids the difficult or impossible task of working out how diagnostic each possible question will be. However, it is not universally reliable, so people can overlook challenges to their existing beliefs.[16][3]: 200

Motivational explanations involve an effect of desire on belief.[3]: 197[70] It is known that people prefer positive thoughts over negative ones in a number of ways: this is called the "Pollyanna principle".[71] Applied to arguments or sources of evidence, this could explain why desired conclusions are more likely to be believed true. According to experiments that manipulate the desirability of the conclusion, people demand a high standard of evidence for unpalatable ideas and a low standard for preferred ideas. In other words, they ask "Can I believe this?" for some suggestions and "Must I believe this?" for others.[72][73] Although consistency is a desirable feature of attitudes, an excessive drive for consistency is another potential source of bias, because it may prevent people from neutrally evaluating new, surprising information. Social psychologist Ziva Kunda combines the cognitive and motivational theories, arguing that motivation creates the bias, but cognitive factors determine the size of the effect.[3]: 198

Cost-benefit

Explanations in terms of cost-benefit analysis assume that people do not just test hypotheses in a disinterested way, but assess the costs of different errors.[74] Using ideas from evolutionary psychology, James Friedrich suggests that people do not primarily aim at truth in testing hypotheses, but try to avoid the most costly errors. For example, employers might ask one-sided questions in job interviews because they are focused on weeding out unsuitable candidates.[75] Yaacov Trope and Akiva Liberman's refinement of this theory assumes that people compare the two different kinds of error: accepting a false hypothesis or rejecting a true hypothesis. For instance, someone who underestimates a friend's honesty might treat him or her suspiciously and so undermine the friendship. Overestimating the friend's honesty may also be costly, but less so. In this case, it would be rational to seek, evaluate or remember evidence of their honesty in a biased way.[76]
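This cost-asymmetry argument can be illustrated with a minimal decision-theoretic sketch. The cost numbers below are invented for illustration, and this is not Trope and Liberman's formal model: when rejecting a true hypothesis is much more expensive than accepting a false one, it becomes rational to accept the preferred hypothesis even on weak evidence.

    def expected_cost(p_true, accept, cost_false_accept=1.0, cost_false_reject=10.0):
        # Expected cost of accepting or rejecting a hypothesis that is true
        # with probability p_true, under asymmetric (assumed) error costs.
        if accept:
            return (1 - p_true) * cost_false_accept   # accepted, but actually false
        return p_true * cost_false_reject             # rejected, but actually true

    # "My friend is honest": wrongly rejecting this (treating an honest friend
    # with suspicion) is assumed ten times as costly as wrongly accepting it.
    for p in (0.05, 0.3, 0.7):
        accept = expected_cost(p, True) < expected_cost(p, False)
        print(f"P(honest) = {p:.2f}: {'accept' if accept else 'reject'}")

With these costs, the rational threshold for acceptance is any probability above 1/11, far below the even odds a neutral tester would require, so only the 0.05 case is rejected.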
When someone gives an initial impression of being introverted or extroverted, questions that match that impression come across as more empathic.[77] This suggests that when talking to someone who seems to be an introvert, it is a sign of better social skills to ask, "Do you feel awkward in social situations?" rather than, "Do you like noisy parties?" The connection between confirmation bias and social skills was corroborated by a study of how college students get to know other people. Highly self-monitoring students, who are more sensitive to their environment and to social norms, asked more matching questions when interviewing a high-status staff member than when getting to know fellow students.[77]

Exploratory versus confirmatory

Psychologists Jennifer Lerner and Philip Tetlock distinguish two different kinds of thinking process. Exploratory thought neutrally considers multiple points of view and tries to anticipate all possible objections to a particular position, while confirmatory thought seeks to justify a specific point of view. Lerner and Tetlock say that when people expect to justify their position to others whose views they already know, they will tend to adopt a similar position to those people, and then use confirmatory thought to bolster their own credibility. However, if the external parties are overly aggressive or critical, people will disengage from thought altogether, and simply assert their personal opinions without justification. Lerner and Tetlock say that people only push themselves to think critically and logically when they know in advance they will need to explain themselves to others who are well informed, genuinely interested in the truth, and whose views they do not already know. Because those conditions rarely exist, they argue, most people are using confirmatory thought most of the time.[78][79][80]

Make-believe

Developmental psychologist Eve Whitmore has argued that beliefs and biases involved in confirmation bias have their roots in childhood coping through make-believe, which becomes "the basis for more complex forms of self-deception and illusion into adulthood". The friction brought on by questioning as an adolescent with developing critical thinking can lead to the rationalization of false beliefs, and the habit of such rationalization can become unconscious over the years.[81]

Real-world effects

Social media

In social media, confirmation bias is amplified by the use of filter bubbles, or "algorithmic editing", which displays to individuals only information they are likely to agree with, while excluding opposing views.[82] Some have argued that confirmation bias is the reason why society can never escape from filter bubbles, because individuals are psychologically hardwired to seek information that agrees with their preexisting values and beliefs.[83] Others have further argued that the mixture of the two is degrading democracy, claiming that this "algorithmic editing" removes diverse viewpoints and information, and that unless filter bubble algorithms are removed, voters will be unable to make fully informed political decisions.[84][82]

The rise of social media has contributed greatly to the rapid spread of fake news, that is, false and misleading information that is presented as credible news from a seemingly reliable source. Confirmation bias (selecting or reinterpreting evidence to support one's beliefs) is one of three main hurdles cited as to why critical thinking goes astray in these circumstances. The other two are shortcut heuristics (when overwhelmed or short of time, people rely on simple rules such as group consensus or trusting an expert or role model) and social goals (social motivation or peer pressure can interfere with objective analysis of the facts at hand).[85]
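A toy ranking sketch (not any real platform's algorithm; the posts and stance scores are invented) shows how the kind of agreement-based "algorithmic editing" described above excludes opposing views by construction.

    user_stance = +1.0   # hypothetical stance on some issue, scored in [-1, +1]

    posts = [
        ("strongly agrees", +0.9),
        ("mildly agrees", +0.4),
        ("neutral report", 0.0),
        ("mildly disagrees", -0.4),
        ("strongly disagrees", -0.9),
    ]

    # Rank items by agreement with the user's prior stance; show the top three.
    feed = sorted(posts, key=lambda post: -(user_stance * post[1]))[:3]
    print([title for title, _ in feed])
    # ['strongly agrees', 'mildly agrees', 'neutral report'] -- dissenting
    # items are never displayed, so the user's prior is only ever confirmed.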
In combating the spread of fake news, social media sites have considered turning toward "digital nudging".[86] This currently takes two forms: nudging of information and nudging of presentation. Nudging of information entails social media sites providing a disclaimer or label questioning or warning users of the validity of the source, while nudging of presentation entails exposing users to new information which they may not have sought out, but which could introduce them to viewpoints that may combat their own confirmation biases.[87]

Science and scientific research

See also: Planck's principle, Escalation of commitment, and Replication crisis

A distinguishing feature of scientific thinking is the search for confirming or supportive evidence (inductive reasoning) as well as falsifying evidence (deductive reasoning).[88][89]

Many times in the history of science, scientists have resisted new discoveries by selectively interpreting or ignoring unfavorable data.[3]: 192–194 Several studies have shown that scientists rate studies that report findings consistent with their prior beliefs more favorably than studies reporting findings inconsistent with their previous beliefs.[9][90][91] However, assuming that the research question is relevant, the experimental design adequate, and the data are clearly and comprehensively described, the empirical data obtained should be important to the scientific community and should not be viewed prejudicially, regardless of whether they conform to current theoretical predictions.[91] In practice, researchers may misunderstand, misinterpret, or not read at all studies that contradict their preconceptions, or wrongly cite them anyway as if they actually supported their claims.[92]

Further, confirmation biases can sustain scientific theories or research programs in the face of inadequate or even contradictory evidence.[60][93] The discipline of parapsychology is often cited as an example.[94]

An experimenter's confirmation bias can potentially affect which data are reported. Data that conflict with the experimenter's expectations may be more readily discarded as unreliable, producing the so-called file drawer effect. To combat this tendency, scientific training teaches ways to prevent bias.[95] For example, experimental design of randomized controlled trials (coupled with their systematic review) aims to minimize sources of bias.[95][96] The social process of peer review aims to mitigate the effect of individual scientists' biases, even though the peer review process itself may be susceptible to such biases.[97][98][91][99][100] Confirmation bias may thus be especially harmful to objective evaluations regarding nonconforming results, since biased individuals may regard opposing evidence to be weak in principle and give little serious thought to revising their beliefs.[90] Scientific innovators often meet with resistance from the scientific community, and research presenting controversial results frequently receives harsh peer review.[101]
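The file drawer effect described above can be quantified with a small simulation (the sample size, study count, and significance cutoff are arbitrary choices for illustration): even when the true effect is exactly zero, a literature filtered for "significant" positive results appears to confirm the effect.

    import random
    import statistics

    random.seed(42)

    def run_study(n=30, true_effect=0.0):
        # One simulated study: n noisy observations around the true effect.
        sample = [random.gauss(true_effect, 1.0) for _ in range(n)]
        mean = statistics.mean(sample)
        se = statistics.stdev(sample) / n ** 0.5
        return mean, mean / se          # effect estimate and rough z-score

    results = [run_study() for _ in range(1000)]
    published = [m for m, z in results if z > 1.96]   # only "significant" positives survive

    print(f"'published': {len(published)} of {len(results)} studies")
    print(f"mean published effect: {statistics.mean(published):.2f}")
    # Roughly 2 to 3 percent of studies clear the bar, and their average
    # effect is well above zero, even though the true effect is zero.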
Finance

See also: Escalation of commitment and Sunk cost

Confirmation bias can lead investors to be overconfident, ignoring evidence that their strategies will lose money.[10][102] In studies of political stock markets, investors made more profit when they resisted bias. For example, participants who interpreted a candidate's debate performance in a neutral rather than partisan way were more likely to profit.[103] To combat the effect of confirmation bias, investors can try to adopt a contrary viewpoint "for the sake of argument".[104] In one technique, they imagine that their investments have collapsed and ask themselves why this might happen.[10]

Medicine and health

Cognitive biases are important variables in clinical decision-making by medical general practitioners (GPs) and medical specialists. Two important ones are confirmation bias and the overlapping availability bias. A GP may make a diagnosis early on during an examination, and then seek confirming evidence rather than falsifying evidence. This cognitive error is partly caused by the availability of evidence about the supposed disorder being diagnosed. For example, the client may have mentioned the disorder, or the GP may have recently read a much-discussed paper about the disorder. The basis of this cognitive shortcut or heuristic (termed anchoring) is that the doctor does not consider multiple possibilities based on evidence, but prematurely latches on (or anchors to) a single cause.[105] In emergency medicine, because of time pressure, there is a high density of decision-making, and shortcuts are frequently applied. The potential failure rate of these cognitive decisions needs to be managed by education about the 30 or more cognitive biases that can occur, so as to set in place proper debiasing strategies.[106] Confirmation bias may also cause doctors to perform unnecessary medical procedures due to pressure from adamant patients.[107]

Raymond Nickerson, a psychologist, blames confirmation bias for the ineffective medical procedures that were used for centuries before the arrival of scientific medicine.[3]: 192 If a patient recovered, medical authorities counted the treatment as successful, rather than looking for alternative explanations, such as that the disease had run its natural course. Biased assimilation is a factor in the modern appeal of alternative medicine, whose proponents are swayed by positive anecdotal evidence but treat scientific evidence hyper-critically.[108][109][110]

Cognitive therapy was developed by Aaron T. Beck in the early 1960s and has become a popular approach.[111] According to Beck, biased information processing is a factor in depression.[112] His approach teaches people to treat evidence impartially, rather than selectively reinforcing negative outlooks.[50] Phobias and hypochondria have also been shown to involve confirmation bias for threatening information.[113]

Politics, law and policing

(Image caption: Mock trials allow researchers to examine confirmation biases in a realistic setting.)

Nickerson argues that reasoning in judicial and political contexts is sometimes subconsciously biased, favoring conclusions that judges, juries or governments have already committed to.[3]: 191–193 Since the evidence in a jury trial can be complex, and jurors often reach decisions about the verdict early on, it is reasonable to expect an attitude polarization effect. The prediction that jurors will become more extreme in their views as they see more evidence has been borne out in experiments with mock trials.[114][115] Both inquisitorial and adversarial criminal justice systems are affected by confirmation bias.[116]

Confirmation bias can be a factor in creating or extending conflicts, from emotionally charged debates to wars: by interpreting the evidence in their favor, each opposing party can become overconfident that it is in the stronger position.[117] On the other hand, confirmation bias can result in people ignoring or misinterpreting the signs of an imminent or incipient conflict. For example, psychologists Stuart Sutherland and Thomas Kida have each argued that U.S. Navy Admiral Husband E. Kimmel showed confirmation bias when playing down the first signs of the Japanese attack on Pearl Harbor.[60][118]
A two-decade study of political pundits by Philip E. Tetlock found that, on the whole, their predictions were not much better than chance. Tetlock divided experts into "foxes" who maintained multiple hypotheses, and "hedgehogs" who were more dogmatic. In general, the hedgehogs were much less accurate. Tetlock blamed their failure on confirmation bias, and specifically on their inability to make use of new information that contradicted their existing theories.[119]

In police investigations, a detective may identify a suspect early in an investigation, but then sometimes largely seek supporting or confirming evidence, ignoring or downplaying falsifying evidence.[120]

Social psychology

Social psychologists have identified two tendencies in the way people seek or interpret information about themselves. Self-verification is the drive to reinforce the existing self-image, and self-enhancement is the drive to seek positive feedback. Both are served by confirmation biases.[121] In experiments where people are given feedback that conflicts with their self-image, they are less likely to attend to it or remember it than when given self-verifying feedback.[122][123][124] They reduce the impact of such information by interpreting it as unreliable.[122][125][126] Similar experiments have found a preference for positive feedback, and the people who give it, over negative feedback.[121]

Mass delusions

Confirmation bias can play a key role in the propagation of mass delusions. Witch trials are frequently cited as an example.[127][128]

For another example, in the Seattle windshield pitting epidemic, there seemed to be a "pitting epidemic" in which windshields were damaged due to an unknown cause. As news of the apparent wave of damage spread, more and more people checked their windshields, discovered that their windshields too had been damaged, and thus confirmed belief in the supposed epidemic. In fact, the windshields were previously damaged, but the damage went unnoticed until people checked their windshields as the delusion spread.[129]

Paranormal beliefs

One factor in the appeal of alleged psychic readings is that listeners apply a confirmation bias which fits the psychic's statements to their own lives.[130] By making a large number of ambiguous statements in each sitting, the psychic gives the client more opportunities to find a match. This is one of the techniques of cold reading, with which a psychic can deliver a subjectively impressive reading without any prior information about the client.[130] Investigator James Randi compared the transcript of a reading to the client's report of what the psychic had said, and found that the client showed a strong selective recall of the "hits".[131]

As a striking illustration of confirmation bias in the real world, Nickerson mentions numerological pyramidology: the practice of finding meaning in the proportions of the Egyptian pyramids.[3]: 190 There are many different length measurements that can be made of, for example, the Great Pyramid of Giza, and many ways to combine or manipulate them. Hence it is almost inevitable that people who look at these numbers selectively will find superficially impressive correspondences, for example with the dimensions of the Earth.[3]: 190
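The combinatorial point can be demonstrated directly. In the sketch below, the lengths are rough published figures for the Great Pyramid, used purely for illustration; with a handful of measurements, a few allowed manipulations, and a one-percent tolerance, "hits" on famous constants appear almost automatically.

    import itertools
    import math

    measurements = {              # approximate lengths, in metres
        "base": 230.3, "height": 146.6, "slant": 186.4, "perimeter": 921.3,
    }
    targets = {"pi": math.pi, "golden ratio": (1 + 5 ** 0.5) / 2, "e": math.e}

    # Try every ordered ratio of two measurements, optionally halved or doubled.
    for (a, x), (b, y) in itertools.permutations(measurements.items(), 2):
        for k in (0.5, 1, 2):
            r = k * x / y
            for name, t in targets.items():
                if abs(r - t) / t < 0.01:    # within one percent: call it a "hit"
                    print(f"{k} * {a}/{b} = {r:.4f} ~ {name}")

Running this prints several "matches" (for example, half the perimeter divided by the height is within a tenth of a percent of pi), which says more about the number of combinations searched than about the builders' intentions.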
Recruitment and selection

Unconscious cognitive bias (including confirmation bias) in job recruitment affects hiring decisions and can potentially prohibit a diverse and inclusive workplace. There are a variety of unconscious biases that affect recruitment decisions, but confirmation bias is one of the major ones, especially during the interview stage.[132] The interviewer will often select a candidate that confirms their own beliefs, even though other candidates are equally or better qualified.

Associated effects and outcomes

Polarization of opinion

Main article: Attitude polarization

When people with opposing views interpret new information in a biased way, their views can move even further apart. This is called "attitude polarization".[133] The effect was demonstrated by an experiment that involved drawing a series of red and black balls from one of two concealed "bingo baskets". Participants knew that one basket contained 60 percent black and 40 percent red balls; the other, 40 percent black and 60 percent red. The experimenters looked at what happened when balls of alternating color were drawn in turn, a sequence that does not favor either basket. After each ball was drawn, participants in one group were asked to state out loud their judgments of the probability that the balls were being drawn from one or the other basket. These participants tended to grow more confident with each successive draw: whether they initially thought the basket with 60 percent black balls or the one with 60 percent red balls was the more likely source, their estimate of the probability increased. Another group of participants were asked to state probability estimates only at the end of a sequence of drawn balls, rather than after each ball. They did not show the polarization effect, suggesting that it does not necessarily occur when people simply hold opposing positions, but rather when they openly commit to them.[134]
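For comparison, a normative Bayesian observer in the same setup gains no confidence from an alternating sequence. The sketch below uses the basket compositions as stated in the experiment; everything else is illustrative. The posterior returns to 0.5 after every black/red pair.

    def update(p_a, draw):
        # Posterior that the draws come from basket A (60% black, 40% red)
        # after observing one ball; basket B is the mirror image.
        like_a = 0.6 if draw == "black" else 0.4
        like_b = 0.4 if draw == "black" else 0.6
        return like_a * p_a / (like_a * p_a + like_b * (1 - p_a))

    p = 0.5
    for i, draw in enumerate(["black", "red"] * 5, start=1):
        p = update(p, draw)
        print(f"draw {i:2d} ({draw}): P(basket A) = {p:.3f}")
    # The posterior oscillates 0.600, 0.500, 0.600, 0.500, ... so rising
    # confidence in either basket is not justified by the evidence.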
A less abstract study was the Stanford biased interpretation experiment, in which participants with strong opinions about the death penalty read about mixed experimental evidence. Twenty-three percent of the participants reported that their views had become more extreme, and this self-reported shift correlated strongly with their initial attitudes.[27] In later experiments, participants also reported their opinions becoming more extreme in response to ambiguous information. However, comparisons of their attitudes before and after the new evidence showed no significant change, suggesting that the self-reported changes might not be real.[30][133][135] Based on these experiments, Deanna Kuhn and Joseph Lao concluded that polarization is a real phenomenon but far from inevitable, only happening in a small minority of cases, and that it was prompted not only by considering mixed evidence, but by merely thinking about the topic.[133]

Charles Taber and Milton Lodge argued that the Stanford team's result had been hard to replicate because the arguments used in later experiments were too abstract or confusing to evoke an emotional response. The Taber and Lodge study used the emotionally charged topics of gun control and affirmative action.[30] They measured the attitudes of their participants towards these issues before and after reading arguments on each side of the debate. Two groups of participants showed attitude polarization: those with strong prior opinions and those who were politically knowledgeable. In part of this study, participants chose which information sources to read, from a list prepared by the experimenters. For example, they could read the National Rifle Association's and the Brady Anti-Handgun Coalition's arguments on gun control. Even when instructed to be even-handed, participants were more likely to read arguments that supported their existing attitudes than arguments that did not. This biased search for information correlated well with the polarization effect.[30]

The backfire effect is a name for the finding that, given evidence against their beliefs, people can reject the evidence and believe even more strongly.[136][137] The phrase was coined by Brendan Nyhan and Jason Reifler in 2010.[138] However, subsequent research has since failed to replicate findings supporting the backfire effect.[139] One study conducted out of the Ohio State University and George Washington University studied 10,100 participants with 52 different issues expected to trigger a backfire effect. While the findings did conclude that individuals are reluctant to embrace facts that contradict their already held ideology, no cases of backfire were detected.[140] The backfire effect has since been noted to be a rare phenomenon rather than a common occurrence[141] (compare the boomerang effect).

Persistence of discredited beliefs

Main article: Belief perseverance
See also: Cognitive dissonance and Monty Hall problem

"Beliefs can survive potent logical or empirical challenges. They can survive and even be bolstered by evidence that most uncommitted observers would agree logically demands some weakening of such beliefs. They can even survive the total destruction of their original evidential bases."
– Lee Ross and Craig Anderson[142]

Confirmation biases provide one plausible explanation for the persistence of beliefs when the initial evidence for them is removed, or when they have been sharply contradicted.[3]: 187 This belief perseverance effect was first demonstrated experimentally by Festinger, Riecken, and Schachter. These psychologists spent time with a cult whose members were convinced that the world would end on 21 December 1954. After the prediction failed, most believers still clung to their faith. Their book describing this research is aptly named When Prophecy Fails.[143]

The term "belief perseverance", however, was coined in a series of experiments using what is called the "debriefing paradigm": participants read fake evidence for a hypothesis, their attitude change is measured, then the fakery is exposed in detail. Their attitudes are then measured once more to see if their belief returns to its previous level.[142]

A common finding is that at least some of the initial belief remains even after a full debriefing.[144] In one experiment, participants had to distinguish between real and fake suicide notes. The feedback was random: some were told they had done well, while others were told they had performed badly. Even after being fully debriefed, participants were still influenced by the feedback. They still thought they were better or worse than average at that kind of task, depending on what they had initially been told.[145]

In another study, participants read job performance ratings of two firefighters, along with their responses to a risk aversion test.[142] This fictional data was arranged to show either a negative or positive association: some participants were told that a risk-taking firefighter did better, while others were told they did less well than a risk-averse colleague.[146] Even if these two case studies were true, they would have been scientifically poor evidence for a conclusion about firefighters in general. However, the participants found them subjectively persuasive.[146] When the case studies were shown to be fictional, participants' belief in a link diminished, but around half of the original effect remained.[142] Follow-up interviews established that the participants had understood the debriefing and taken it seriously. Participants seemed to trust the debriefing, but regarded the discredited information as irrelevant to their personal belief.[146]

The continued influence effect is the tendency for misinformation to continue to influence memory and reasoning about an event, despite the misinformation having been retracted or corrected. This occurs even when the individual believes the correction.[147]

Preference for early information

Experiments have shown that information is weighted more strongly when it appears early in a series, even when the order is unimportant. For example, people form a more positive impression of someone described as "intelligent, industrious, impulsive, critical, stubborn, envious" than when they are given the same words in reverse order.[148] This irrational primacy effect is independent of the primacy effect in memory, in which the earlier items in a series leave a stronger memory trace.[148] Biased interpretation offers an explanation for this effect: seeing the initial evidence, people form a working hypothesis that affects how they interpret the rest of the information.[3]: 187

One demonstration of irrational primacy used colored chips supposedly drawn from two urns. Participants were told the color distributions of the urns, and had to estimate the probability of a chip being drawn from one of them.[148] In fact, the colors appeared in a prearranged order. The first thirty draws favored one urn and the next thirty favored the other.[3]: 187 The series as a whole was neutral, so rationally, the two urns were equally likely. However, after sixty draws, participants favored the urn suggested by the initial thirty.[148]
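A numeric sketch of this demonstration confirms the rational benchmark (the urn compositions are assumed, since the description above gives none; a "draw favoring urn A" is modeled as a color more likely under A): after a symmetric sixty-draw series, the posterior is back to exactly 50/50.

    P_A, P_B = 0.6, 0.4   # assumed chance of a "red" chip from urn A vs. urn B

    def posterior(draws, p=0.5):
        # Bayesian update over a sequence of draws (True = red, False = other color).
        for red in draws:
            la = P_A if red else (1 - P_A)
            lb = P_B if red else (1 - P_B)
            p = la * p / (la * p + lb * (1 - p))
        return p

    series = [True] * 30 + [False] * 30   # thirty draws favoring A, then thirty favoring B
    print(f"after 30 draws: P(urn A) = {posterior(series[:30]):.4f}")   # near 1.0
    print(f"after 60 draws: P(urn A) = {posterior(series):.4f}")        # exactly 0.5
    # Participants instead stayed with the urn suggested by the first thirty draws.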
Another experiment involved a slide show of a single object, seen as just a blur at first and in slightly better focus with each succeeding slide.[148] After each slide, participants had to state their best guess of what the object was. Participants whose early guesses were wrong persisted with those guesses, even when the picture was sufficiently in focus that the object was readily recognizable to other people.[3]: 187

Illusory association between events

Main article: Illusory correlation

Illusory correlation is the tendency to see non-existent correlations in a set of data.[149] This tendency was first demonstrated in a series of experiments in the late 1960s.[150] In one experiment, participants read a set of psychiatric case studies, including responses to the Rorschach inkblot test. The participants reported that the homosexual men in the set were more likely to report seeing buttocks, anuses or sexually ambiguous figures in the inkblots. In fact, the fictional case studies had been constructed so that the homosexual men were no more likely to report this imagery or, in one version of the experiment, were less likely to report it than heterosexual men.[149] In a survey, a group of experienced psychoanalysts reported the same set of illusory associations with homosexuality.[149][150]

Another study recorded the symptoms experienced by arthritic patients, along with weather conditions, over a 15-month period. Nearly all the patients reported that their pains were correlated with weather conditions, although the real correlation was zero.[151]

Example (days):

                    Rain   No rain
    Arthritis         14         6
    No arthritis       7         2

This effect is a kind of biased interpretation, in that objectively neutral or unfavorable evidence is interpreted to support existing beliefs. It is also related to biases in hypothesis-testing behavior.[152] In judging whether two events, such as illness and bad weather, are correlated, people rely heavily on the number of positive-positive cases: in this example, instances of both pain and bad weather. They pay relatively little attention to the other kinds of observation (of no pain and/or good weather).[153] This parallels the reliance on positive tests in hypothesis testing.[152] It may also reflect selective recall, in that people may have a sense that two events are correlated because it is easier to recall times when they happened together.[152]
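Computing the standard phi coefficient for the example table above makes the point concrete: counting only the fourteen pain-and-rain days suggests a link, but the association across all four cells is essentially zero.

    from math import sqrt

    a, b = 14, 6   # arthritis pain: rainy days / dry days
    c, d = 7, 2    # no pain:        rainy days / dry days

    phi = (a * d - b * c) / sqrt((a + b) * (c + d) * (a + c) * (b + d))
    print(f"phi = {phi:.2f}")   # about -0.08: effectively no correlation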
See also

Apophenia
Cherry picking
Cognitive bias mitigation
Denialism
List of cognitive biases
Observer-expectancy effect
Selective perception
Semmelweis reflex

Notes

a. David Perkins, a professor and researcher at the Harvard Graduate School of Education, coined the term "myside bias", referring to a preference for "my" side of an issue.[1]
b. "Assimilation bias" is another term used for biased interpretation of evidence.[7]
c. Wason also used the term "verification bias".[57]

References

Citations

1. Baron 2000, p. 195.
2. Hart, William; Albarracin, D.; Eagly, A.H.; Brechan, I.; Lindberg, M.J.; Merrill, L. (2009). "Feeling validated versus being correct: A meta-analysis of selective exposure to information". Psychological Bulletin. 135(4): 555–588. doi:10.1037/a0015701. PMID 19586162.
3. Nickerson 1998, pp. 175–220.
4. Plous 1993, p. 233.
5. Darley, John M.; Gross, Paget H. (2000). "A hypothesis-confirming bias in labelling effects". In Stangor, Charles (ed.). Stereotypes and prejudice: Essential readings. Psychology Press. p. 212. ISBN 978-0-86377-589-5.
6. Risen & Gilovich 2007.
7. Risen & Gilovich 2007, p. 113.
8. Oswald & Grosjean 2004, pp. 82–83.
9. Hergovich, Schott & Burger 2010.
10. Zweig, Jason (19 November 2009). "How to ignore the yes-man in your head". Wall Street Journal.
11. Kunda 1999, pp. 112–115.
12. Baron 2000, pp. 162–164.
13. Kida 2006, pp. 162–165.
14. Devine, Patricia G.; Hirt, Edward R.; Gehrke, Elizabeth M. (1990). "Diagnostic and confirmation strategies in trait hypothesis testing". Journal of Personality and Social Psychology. 58(6): 952–963. doi:10.1037/0022-3514.58.6.952.
15. Trope, Yaacov; Bassok, Miriam (1982). "Confirmatory and diagnosing strategies in social information gathering". Journal of Personality and Social Psychology. 43(1): 22–34. doi:10.1037/0022-3514.43.1.22.
16. Klayman, Joshua; Ha, Young-Won (1987). "Confirmation, disconfirmation and information in hypothesis testing". Psychological Review. 94(2): 211–228. doi:10.1037/0033-295X.94.2.211.
17. Kunda, Ziva; Fong, G.T.; Sanitoso, R.; Reber, E. (1993). "Directional questions direct self-conceptions". Journal of Experimental Social Psychology. 29: 62–63. doi:10.1006/jesp.1993.1004. Via Fine 2006, pp. 63–65.
18. Shafir, E. (1993). "Choosing versus rejecting: Why some options are both better and worse than others". Memory and Cognition. 21(4): 546–556. doi:10.3758/bf03197186. PMID 8350746. Via Fine 2006, pp. 63–65.
19. Snyder, Mark; Swann, William B. Jr. (1978). "Hypothesis-testing processes in social interaction". Journal of Personality and Social Psychology. 36(11): 1202–1212. doi:10.1037/0022-3514.36.11.1202. Via Poletiek 2001, p. 131.
20. Kunda 1999, pp. 117–118.
21. Albarracin, D.; Mitchell, A.L. (2004). "The role of defensive confidence in preference for proattitudinal information: How believing that one is strong can sometimes be a defensive weakness". Personality and Social Psychology Bulletin. 30(12): 1565–1584. doi:10.1177/0146167204271180. PMID 15536240.
22. Fischer, P.; Fischer, Julia K.; Aydin, Nilufer; Frey, Dieter (2010). "Physically attractive social information sources lead to increased selective exposure to information". Basic and Applied Social Psychology. 32(4): 340–347. doi:10.1080/01973533.2010.519208.
23. Dahlgren, Peter M. (2020). Media Echo Chambers: Selective Exposure and Confirmation Bias in Media Use, and its Consequences for Political Polarization. Gothenburg: University of Gothenburg. ISBN 978-91-88212-95-5.
24. Stanovich, K.E.; West, R.F.; Toplak, M.E. (2013). "Myside bias, rational thinking, and intelligence". Current Directions in Psychological Science. 22(4): 259–264. doi:10.1177/0963721413480174.
25. Mynatt, Clifford R.; Doherty, Michael E.; Tweney, Ryan D. (1978). "Consequences of confirmation and disconfirmation in a simulated research environment". Quarterly Journal of Experimental Psychology. 30(3): 395–406. doi:10.1080/00335557843000007.
26. Kida 2006, p. 157.
27. Lord, Charles G.; Ross, Lee; Lepper, Mark R. (1979). "Biased assimilation and attitude polarization: The effects of prior theories on subsequently considered evidence". Journal of Personality and Social Psychology. 37(11): 2098–2109. doi:10.1037/0022-3514.37.11.2098.
28. Baron 2000, pp. 201–202.
29. Vyse 1997, p. 122.
30. Taber, Charles S.; Lodge, Milton (July 2006). "Motivated skepticism in the evaluation of political beliefs". American Journal of Political Science. 50(3): 755–769. doi:10.1111/j.1540-5907.2006.00214.x.
31. Westen, Drew; Blagov, Pavel S.; Harenski, Keith; Kilts, Clint; Hamann, Stephan (2006). "Neural bases of motivated reasoning: An fMRI study of emotional constraints on partisan political judgment in the 2004 U.S. Presidential election". Journal of Cognitive Neuroscience. 18(11): 1947–1958. doi:10.1162/jocn.2006.18.11.1947. PMID 17069484.
32. Gadenne, V.; Oswald, M. (1986). "Entstehung und Veränderung von Bestätigungstendenzen beim Testen von Hypothesen" [Formation and alteration of confirmatory tendencies during the testing of hypotheses]. Zeitschrift für Experimentelle und Angewandte Psychologie. 33: 360–374. Via Oswald & Grosjean 2004, p. 89.
33. Hastie, Reid; Park, Bernadette (2005). "The relationship between memory and judgment depends on whether the judgment task is memory-based or on-line". In Hamilton, David L. (ed.). Social cognition: Key readings. New York: Psychology Press. p. 394. ISBN 978-0-86377-591-8.
34. Oswald & Grosjean 2004, pp. 88–89.
35. Stangor, Charles; McMillan, David (1992). "Memory for expectancy-congruent and expectancy-incongruent information: A review of the social and social developmental literatures". Psychological Bulletin. 111(1): 42–61. doi:10.1037/0033-2909.111.1.42.
36. Snyder, M.; Cantor, N. (1979). "Testing hypotheses about other people: The use of historical knowledge". Journal of Experimental Social Psychology. 15(4): 330–342. doi:10.1016/0022-1031(79)90042-8. Via Goldacre 2008, p. 231.
37. Kunda 1999, pp. 225–232.
38. Sanitioso, Rasyid; Kunda, Ziva; Fong, G.T. (1990). "Motivated recruitment of autobiographical memories". Journal of Personality and Social Psychology. 59(2): 229–241. doi:10.1037/0022-3514.59.2.229. PMID 2213492.
39. Levine, L.; Prohaska, V.; Burgess, S.L.; Rice, J.A.; Laulhere, T.M. (2001). "Remembering past emotions: The role of current appraisals". Cognition and Emotion. 15(4): 393–417. doi:10.1080/02699930125955.
40. Safer, M.A.; Bonanno, G.A.; Field, N. (2001). "It was never that bad: Biased recall of grief and long-term adjustment to the death of a spouse". Memory. 9(3): 195–203. doi:10.1080/09658210143000065. PMID 11469313.
41. Russell, Dan; Jones, Warren H. (1980). "When superstition fails: Reactions to disconfirmation of paranormal beliefs". Personality and Social Psychology Bulletin. 6(1): 83–88. doi:10.1177/014616728061012. Via Vyse 1997, p. 121.
42. Baron, Jonathan (1995). "Myside bias in thinking about abortion". Thinking & Reasoning. 1(3): 221–235. doi:10.1080/13546789508256909.
43. Wolfe, Christopher; Britt, Anne (2008). "The locus of the myside bias in written argumentation". Thinking & Reasoning. 14: 1–27. doi:10.1080/13546780701527674.
44. Mason, Lucia; Scirica, Fabio (October 2006). "Prediction of students' argumentation skills about controversial topics by epistemological understanding". Learning and Instruction. 16(5): 492–509. doi:10.1016/j.learninstruc.2006.09.007.
45. Weinstock, Michael (2009). "Relative expertise in an everyday reasoning task: Epistemic understanding, problem representation, and reasoning competence". Learning and Individual Differences. 19(4): 423–434. doi:10.1016/j.lindif.2009.03.003.
46. Weinstock, Michael; Neuman, Yair; Tabak, Iris (2004). "Missing the point or missing the norms? Epistemological norms as predictors of students' ability to identify fallacious arguments". Contemporary Educational Psychology. 29(1): 77–94. doi:10.1016/S0361-476X(03)00024-9.
47. Thucydides, 4.108.4.
48. Alighieri, Dante. Paradiso, canto XIII: 118–120. Trans. Allen Mandelbaum.
49. Ibn Khaldun (1958). The Muqadimmah. Princeton, NJ: Princeton University Press. p. 71.
50. Baron 2000, pp. 195–196.
51. Bacon, Francis (1620). Novum Organum. Reprinted in Burtt, E.A. (ed.) (1939). The English philosophers from Bacon to Mill. New York: Random House. p. 36. Via Nickerson 1998, p. 176.
52. Schopenhauer, Arthur (2011) [1844]. Carus, David; Aquila, Richard E. (eds.). The World as Will and Presentation. Vol. 2. New York: Routledge. p. 246.
53. Tolstoy, Leo (1896). What Is Art? ch. 14, p. 143. Translated from Russian by Aylmer Maude, New York, 1904.
54. Tolstoy, Leo (1894). The Kingdom of God Is Within You. p. 49. Translated from Russian by Constance Garnett, New York, 1894.
55. Wason 1960.
56. Lewicka 1998, p. 238.
57. Poletiek 2001, p. 73.
58. Oswald & Grosjean 2004, pp. 79–96.
59. Wason, Peter C. (1968). "Reasoning about a rule". Quarterly Journal of Experimental Psychology. 20(3): 273–278. doi:10.1080/14640746808400161. PMID 5683766.
60. Sutherland, Stuart (2007). Irrationality (2nd ed.). London: Pinter and Martin. pp. 95–103. ISBN 978-1-905177-07-3.
61. Barkow, Jerome H.; Cosmides, Leda; Tooby, John (1995). The adapted mind: Evolutionary psychology and the generation of culture. Oxford University Press US. pp. 181–184. ISBN 978-0-19-510107-2.
62. Oswald & Grosjean 2004, pp. 81–82, 86–87.
63. Plous 1993, p. 233.
64. Lewicka 1998, p. 239.
65. Tweney, Ryan D.; Doherty, Michael E. (1980). "Strategies of rule discovery in an inference task". The Quarterly Journal of Experimental Psychology. 32(1): 109–123 (Experiment IV). doi:10.1080/00335558008248237.
66. Oswald & Grosjean 2004, pp. 86–89.
67. MacCoun 1998.
68. Friedrich 1993, p. 298.
69. Kunda 1999, p. 94.
70. Baron 2000, p. 206.
71. Matlin, Margaret W. (2004). "Pollyanna Principle". In Pohl, Rüdiger F. (ed.). Cognitive illusions: A handbook on fallacies and biases in thinking, judgement and memory. Hove, UK: Psychology Press. pp. 255–272. ISBN 978-1-84169-351-4.
72. Dawson, Erica; Gilovich, Thomas; Regan, Dennis T. (October 2002). "Motivated reasoning and performance on the Wason Selection Task". Personality and Social Psychology Bulletin. 28(10): 1379–1387. doi:10.1177/014616702236869.
73. Ditto, Peter H.; Lopez, David F. (1992). "Motivated skepticism: Use of differential decision criteria for preferred and nonpreferred conclusions". Journal of Personality and Social Psychology. 63(4): 568–584. doi:10.1037/0022-3514.63.4.568.
74. Oswald & Grosjean 2004, pp. 91–93.
75. Friedrich 1993, pp. 299, 316–317.
76. Trope, Y.; Liberman, A. (1996). "Social hypothesis testing: Cognitive and motivational mechanisms". In Higgins, E. Tory; Kruglanski, Arie W. (eds.). Social psychology: Handbook of basic principles. New York: Guilford Press. ISBN 978-1-57230-100-9. Via Oswald & Grosjean 2004, pp. 91–93.
77. Dardenne, Benoit; Leyens, Jacques-Philippe (1995). "Confirmation bias as a social skill". Personality and Social Psychology Bulletin. 21(11): 1229–1239. doi:10.1177/01461672952111011.
78. Shanteau, James (2003). In Schneider, Sandra L. (ed.). Emerging perspectives on judgment and decision research. Cambridge: Cambridge University Press. p. 445. ISBN 978-0-521-52718-7.
79. Haidt, Jonathan (2013). The righteous mind: Why good people are divided by politics and religion. London: Penguin Books. pp. 87–88. ISBN 978-0-14-103916-9.
80. Fiske, Susan T.; Gilbert, Daniel T.; Lindzey, Gardner, eds. (2010). The handbook of social psychology (5th ed.). Hoboken, NJ: Wiley. p. 811. ISBN 978-0-470-13749-9.
81. American Psychological Association (2018). "Why we're susceptible to fake news, and how to defend against it". Skeptical Inquirer. 42(6): 8–9.
82. Pariser, Eli (2 May 2011). "Beware online 'filter bubbles'" (TED talk). TED: Ideas Worth Spreading.
83. Self, Will (28 November 2016). "Forget fake news on Facebook, the real filter bubble is you". NewStatesman.
84. Pariser, Eli (7 May 2015). "Did Facebook's big study kill my filter bubble thesis?". Wired.
85. Kendrick, Douglas T.; Cohen, Adam B.; Neuberg, Steven L.; Cialdini, Robert B. (2020). "The science of anti-science thinking". Scientific American. 29(4) (Fall, Special Issue): 84–89.
86. Weinmann, Markus; Schneider, Christoph; vom Brocke, Jan (2015). "Digital nudging". SSRN working paper, Rochester, NY. doi:10.2139/ssrn.2708250.
87. Thornhill, Calum; Meeus, Quentin; Peperkamp, Jeroen; Berendt, Bettina (2019). "A digital nudge to counter confirmation bias". Frontiers in Big Data. 2: 11. doi:10.3389/fdata.2019.00011. PMID 33693334.
88. Mahoney, Michael J.; DeMonbreun, B.G. (1977). "Psychology of the scientist: An analysis of problem-solving bias". Cognitive Therapy and Research. 1(3): 229–238. doi:10.1007/BF01186796.
89. Mitroff, I.I. (1974). "Norms and counter-norms in a select group of the Apollo moon scientists: A case study of the ambivalence of scientists". American Sociological Review. 39(4): 579–595. doi:10.2307/2094423. JSTOR 2094423.
90. Koehler 1993.
91. Mahoney 1977.
92. Letrud, Kåre; Hernes, Sigbjørn (2019). "Affirmative citation bias in scientific myth debunking: A three-in-one case study". PLOS ONE. 14(9): e0222213. doi:10.1371/journal.pone.0222213. PMID 31498834.
93. Ball, Phillip (14 May 2015). "The trouble with scientists: How one psychologist is tackling human biases in science". Nautilus.
94. Sternberg, Robert J. (2007). "Critical thinking in psychology: It really is critical". In Sternberg, Robert J.; Roediger III, Henry L.; Halpern, Diane F. (eds.). Critical thinking in psychology. Cambridge University Press. p. 292. ISBN 978-0-521-60834-3. "Some of the worst examples of confirmation bias are in research on parapsychology... Arguably, there is a whole field here with no powerful confirming data at all. But people want to believe, and so they find ways to believe."
95. Shadish, William R. (2007). "Critical thinking in quasi-experimentation". In Sternberg, Robert J.; Roediger III, Henry L.; Halpern, Diane F. (eds.). Critical thinking in psychology. Cambridge University Press. p. 49. ISBN 978-0-521-60834-3.
96. Jüni, P.; Altman, D.G.; Egger, M. (2001). "Systematic reviews in health care: Assessing the quality of controlled clinical trials". BMJ (Clinical Research Ed.). 323(7303): 42–46. doi:10.1136/bmj.323.7303.42. PMID 11440947.
97. Lee, C.J.; Sugimoto, C.R.; Zhang, G.; Cronin, B. (2013). "Bias in peer review". Journal of the Association for Information Science and Technology. 64: 2–17. doi:10.1002/asi.22784.
98. Shermer, Michael (July 2006). "The political brain: A recent brain-imaging study shows that our political predilections are a product of unconscious confirmation bias". Scientific American. 295(1): 36. doi:10.1038/scientificamerican0706-36. PMID 16830675.
99. Emerson, G.B.; Warme, W.J.; Wolf, F.M.; Heckman, J.D.; Brand, R.A.; Leopold, S.S. (2010). "Testing for the presence of positive-outcome bias in peer review: A randomized controlled trial". Archives of Internal Medicine. 170(21): 1934–1939. doi:10.1001/archinternmed.2010.406. PMID 21098355.
100. Bartlett, Steven James (2011). "The psychology of abuse in publishing: Peer review and editorial bias". Chap. 7, pp. 147–177, in Normality does not equal mental health: The need to look elsewhere for standards of good psychological health. Santa Barbara, CA: Praeger.
101. Horrobin, David F. (1990). "The philosophical basis of peer review and the suppression of innovation". Journal of the American Medical Association. 263(10): 1438–1441. doi:10.1001/jama.263.10.1438. PMID 2304222.
102. Pompian, Michael M. (2006). Behavioral finance and wealth management: How to build optimal portfolios that account for investor biases. John Wiley and Sons. pp. 187–190. ISBN 978-0-471-74517-4.
103. Hilton, Denis J. (2001). "The psychology of financial decision-making: Applications to trading, dealing, and investment analysis". Journal of Behavioral Finance. 2(1): 37–39. doi:10.1207/S15327760JPFM0201_4.
104. Krueger, David; Mann, John David (2009). The secret language of money: How to make smarter financial decisions and live a richer life. McGraw Hill Professional. pp. 112–113. ISBN 978-0-07-162339-1.
105. Groopman, Jerome (2007). How doctors think. Melbourne: Scribe Publications. pp. 64–66. ISBN 978-1-921215-69-8.
106. Croskerry, Pat (2002). "Achieving quality in clinical decision making: Cognitive strategies and detection of bias". Academic Emergency Medicine. 9(11): 1184–1204. doi:10.1197/aemj.9.11.1184. PMID 12414468.
107. Pang, Dominic; Bleetman, Anthony; Bleetman, David; Wynne, Max (2 June 2017). "The foreign body that never was: the effects of confirmation bias". British Journal of Hospital Medicine. 78(6): 350–351. doi:10.12968/hmed.2017.78.6.350. PMID 28614014.
108. Goldacre 2008, p. 233.
109. Singh, Simon; Ernst, Edzard (2008). Trick or treatment?: Alternative medicine on trial. London: Bantam. pp. 287–288. ISBN 978-0-593-06129-9.
110. Atwood, Kimball (2004). "Naturopathy, pseudoscience, and medicine: Myths and fallacies vs truth". Medscape General Medicine. 6(1): 33. PMID 15208545.
111. Neenan, Michael; Dryden, Windy (2004). Cognitive therapy: 100 key points and techniques. Psychology Press. p. ix. ISBN 978-1-58391-858-6.
112. Blackburn, Ivy-Marie; Davidson, Kate M. (1995). Cognitive therapy for depression & anxiety: A practitioner's guide (2nd ed.). Wiley-Blackwell. p. 19. ISBN 978-0-632-03986-9.
113. Harvey, Allison G.; Watkins, Edward; Mansell, Warren (2004). Cognitive behavioural processes across psychological disorders: A transdiagnostic approach to research and treatment. Oxford University Press. pp. 172–173, 176. ISBN 978-0-19-852888-3.
114. Myers, D.G.; Lamm, H. (1976). "The group polarization phenomenon". Psychological Bulletin. 83(4): 602–627. doi:10.1037/0033-2909.83.4.602. Via Nickerson 1998, pp. 193–194.
115. Halpern, Diane F. (1987). Critical thinking across the curriculum: A brief edition of thought and knowledge. Lawrence Erlbaum Associates. p. 194. ISBN 978-0-8058-2731-6.
116. Roach, Kent (2010). "Wrongful convictions: Adversarial and inquisitorial themes". North Carolina Journal of International Law and Commercial Regulation. 35: 387–446. SSRN 1619124. "Both adversarial and inquisitorial systems seem subject to the dangers of tunnel vision or confirmation bias."
117. Baron 2000, pp. 191–195.
118. Kida 2006, p. 155.
119. Tetlock, Philip E. (2005). Expert political judgment: How good is it? How can we know? Princeton, NJ: Princeton University Press. pp. 125–128. ISBN 978-0-691-12302-8.
120. O'Brien, B. (2009). "Prime suspect: An examination of factors that aggravate and counteract confirmation bias in criminal investigations". Psychology, Public Policy, and Law. 15(4): 315–334. doi:10.1037/a0017881.
121. Swann, William B.; Pelham, Brett W.; Krull, Douglas S. (1989). "Agreeable fancy or disagreeable truth? Reconciling self-enhancement and self-verification". Journal of Personality and Social Psychology. 57(5): 782–791. doi:10.1037/0022-3514.57.5.782. PMID 2810025.
122. Swann, William B.; Read, Stephen J. (1981). "Self-verification processes: How we sustain our self-conceptions". Journal of Experimental Social Psychology. 17(4): 351–372. doi:10.1016/0022-1031(81)90043-3.
123. Story, Amber L. (1998). "Self-esteem and memory for favorable and unfavorable personality feedback". Personality and Social Psychology Bulletin. 24(1): 51–64. doi:10.1177/0146167298241004.
124. White, Michael J.; Brockett, Daniel R.; Overstreet, Belinda G. (1993). "Confirmatory bias in evaluating personality test information: Am I really that kind of person?". Journal of Counseling Psychology. 40(1): 120–126. doi:10.1037/0022-0167.40.1.120.
125. Swann, William B.; Read, Stephen J. (1981). "Acquiring self-knowledge: The search for feedback that fits". Journal of Personality and Social Psychology. 41(6): 1119–1128. doi:10.1037/0022-3514.41.6.1119.
126. Shrauger, J. Sidney; Lund, Adrian K. (1975). "Self-evaluation and reactions to evaluations from others". Journal of Personality. 43(1): 94–108. doi:10.1111/j.1467-6494.1975.tb00574.x. PMID 1142062.
127. Lidén, Moa (2018). Confirmation bias in criminal cases (Thesis), sect. 3.2.4.1. Department of Law, Uppsala University.
128. Trevor-Roper, H.R. (1969). The European witch-craze of the sixteenth and seventeenth centuries and other essays. London: HarperCollins.
129. Chrisler, Mark (24 September 2019). "The constant: A history of getting things wrong" (podcast). constantpodcast.com.
130. Smith, Jonathan C. (2009). Pseudoscience and extraordinary claims of the paranormal: A critical thinker's toolkit. London: Wiley-Blackwell. pp. 149–151. ISBN 978-1-4051-8122-8.
131. Randi, James (1991). James Randi: Psychic investigator. London: Boxtree. pp. 58–62. ISBN 978-1-85283-144-8.
132. Agarwal, Pragya (19 October 2018). "Here is how bias can affect recruitment in your organization". Forbes.
133. Kuhn, Deanna; Lao, Joseph (March 1996). "Effects of evidence on attitudes: Is polarization the norm?". Psychological Science. 7(2): 115–120. doi:10.1111/j.1467-9280.1996.tb00340.x.
134. Baron 2000, p. 201.
135. Miller, A.G.; McHoskey, J.W.; Bane, C.M.; Dowd, T.G. (1993). "The attitude polarization phenomenon: Role of response measure, attitude extremity, and behavioral consequences of reported attitude change". Journal of Personality and Social Psychology. 64(4): 561–574. doi:10.1037/0022-3514.64.4.561.
136. "Backfire effect". The Skeptic's Dictionary.
137. Silverman, Craig (17 June 2011). "The backfire effect". Columbia Journalism Review. "When your deepest convictions are challenged by contradictory evidence, your beliefs get stronger."
138. Nyhan, B.; Reifler, J. (2010). "When corrections fail: The persistence of political misperceptions". Political Behavior. 32: 303–320.
139. "Facts matter after all: Rejecting the 'backfire effect'". Oxford Education Blog. 12 March 2018.
140. Wood, Thomas; Porter, Ethan (2019). "The elusive backfire effect: Mass attitudes' steadfast factual adherence". Political Behavior. 41: 135–163. doi:10.2139/ssrn.2819073.
141. "Fact-checking doesn't 'backfire', new study suggests". Poynter. 2 November 2016.
142. Ross, Lee; Anderson, Craig A. (1982). "Shortcomings in the attribution process: On the origins and maintenance of erroneous social assessments". In Kahneman, Daniel; Slovic, Paul; Tversky, Amos (eds.). Judgment under uncertainty: Heuristics and biases. Cambridge University Press. ISBN 978-0-521-28414-1.
143. Festinger, Leon (1956). When prophecy fails: A social and psychological study of a modern group that predicted the destruction of the world. New York: Harper Torchbooks.
144. Kunda 1999, p. 99.
145. Ross, Lee; Lepper, Mark R.; Hubbard, Michael (1975). "Perseverance in self-perception and social perception: Biased attributional processes in the debriefing paradigm". Journal of Personality and Social Psychology. 32(5): 880–892. doi:10.1037/0022-3514.32.5.880. PMID 1185517. Via Kunda 1999, p. 99.
146. Anderson, Craig A.; Lepper, Mark R.; Ross, Lee (1980). "Perseverance of social theories: The role of explanation in the persistence of discredited information". Journal of Personality and Social Psychology. 39(6): 1037–1049. doi:10.1037/h0077720.
147. Cacciatore, Michael A. (9 April 2021). "Misinformation and public opinion of science and health: Approaches, findings, and future directions". Proceedings of the National Academy of Sciences. 118(15): e1912437117. doi:10.1073/pnas.1912437117. PMID 33837143. p. 4: "The CIE refers to the tendency for information that is initially presented as true, but later revealed to be false, to continue to affect memory and reasoning."
148. Baron 2000, pp. 197–200.
149. Fine 2006, pp. 66–70.
150. Plous 1993, pp. 164–166.
151. Redelmeier, D.A.; Tversky, Amos (1996). "On the belief that arthritis pain is related to the weather". Proceedings of the National Academy of Sciences. 93(7): 2895–2896. doi:10.1073/pnas.93.7.2895. PMID 8610138. Via Kunda 1999, p. 127.
152. Kunda 1999, pp. 127–130.
153. Plous 1993, pp. 162–164.

Sources

Baron, Jonathan (2000). Thinking and deciding (3rd ed.). New York: Cambridge University Press. ISBN 978-0-521-65030-4.
Fine, Cordelia (2006). A Mind of its Own: How your brain distorts and deceives. Cambridge, UK: Icon Books. ISBN 978-1-84046-678-2.
Friedrich, James (1993). "Primary error detection and minimization (PEDMIN) strategies in social cognition: A reinterpretation of confirmation bias phenomena". Psychological Review. 100(2): 298–319. doi:10.1037/0033-295X.100.2.298. PMID 8483985.
Goldacre, Ben (2008). Bad science. London: Fourth Estate. ISBN 978-0-00-724019-7.
Hergovich, Andreas; Schott, Reinhard; Burger, Christoph (2010). "Biased evaluation of abstracts depending on topic and conclusion: Further evidence of a confirmation bias within scientific psychology". Current Psychology. 29(3): 188–209. doi:10.1007/s12144-010-9087-5.
Kida, Thomas E. (2006). Don't believe everything you think: The 6 basic mistakes we make in thinking. Amherst, NY: Prometheus Books. ISBN 978-1-59102-408-8.
Koehler, Jonathan J. (1993). "The influence of prior beliefs on scientific judgments of evidence quality". Organizational Behavior and Human Decision Processes. 56: 28–55. doi:10.1006/obhd.1993.1044.
Kunda, Ziva (1999). Social cognition: Making sense of people. MIT Press. ISBN 978-0-262-61143-5.
Lewicka, Maria (1998). "Confirmation bias: Cognitive error or adaptive strategy of action control?". In Kofta, Mirosław; Weary, Gifford; Sedek, Grzegorz (eds.). Personal control in action: Cognitive and motivational mechanisms. Springer. pp. 233–255. ISBN 978-0-306-45720-3.
MacCoun, Robert J. (1998). "Biases in the interpretation and use of research results". Annual Review of Psychology. 49: 259–287. doi:10.1146/annurev.psych.49.1.259. PMID 15012470.
Mahoney, Michael J. (1977). "Publication prejudices: An experimental study of confirmatory bias in the peer review system". Cognitive Therapy and Research. 1(2): 161–175. doi:10.1007/BF01173636.
Nickerson, Raymond S. (1998). "Confirmation bias: A ubiquitous phenomenon in many guises". Review of General Psychology. 2(2): 175–220. doi:10.1037/1089-2680.2.2.175.
Oswald, Margit E.; Grosjean, Stefan (2004). "Confirmation bias". In Pohl, Rüdiger F. (ed.). Cognitive illusions: A handbook on fallacies and biases in thinking, judgement and memory. Hove, UK: Psychology Press. pp. 79–96. ISBN 978-1-84169-351-4.
Plous, Scott (1993). The psychology of judgment and decision making. McGraw-Hill. ISBN 978-0-07-050477-6.
Poletiek, Fenna (2001). Hypothesis-testing behaviour. Hove, UK: Psychology Press. ISBN 978-1-84169-159-6.
Risen, Jane; Gilovich, Thomas (2007). "Informal logical fallacies". In Sternberg, Robert J.; Roediger III, Henry L.; Halpern, Diane F. (eds.). Critical thinking in psychology. Cambridge University Press. pp. 110–130. ISBN 978-0-521-60834-3.
Vyse, Stuart A. (1997). Believing in magic: The psychology of superstition. New York: Oxford University Press. ISBN 978-0-19-513634-0.
Wason, Peter C. (1960). "On the failure to eliminate hypotheses in a conceptual task". Quarterly Journal of Experimental Psychology. 12(3): 129–140. doi:10.1080/17470216008416717.
Further reading

Leavitt, Fred (2015). Dancing with absurdity: Your most cherished beliefs (and all your others) are probably wrong. Peter Lang Publishers. ISBN 9781453914908.
Stanovich, Keith (2009). What intelligence tests miss: The psychology of rational thought. New Haven, CT: Yale University Press. ISBN 978-0-300-12385-2.
Westen, Drew (2007). The political brain: The role of emotion in deciding the fate of the nation. PublicAffairs. ISBN 978-1-58648-425-5.

External links

Wikiquote has quotations related to Confirmation bias.
"Confirmation bias" in the Skeptic's Dictionary, by Robert T. Carroll
"Teaching about confirmation bias": class handout and instructor's notes by K.H. Grobman
Confirmation bias at You Are Not So Smart
Confirmation bias learning object: interactive number triples exercise by Rod McFarland for Simon Fraser University
Brief summary of the 1979 Stanford assimilation bias study, by Keith Rollag, Babson College
