
Cognitive bias mitigation

Cognitive bias mitigation is the prevention and reduction of the negative effects of cognitive biases – unconscious, automatic influences on human judgment and decision making that reliably produce reasoning errors.

Coherent, comprehensive theories of cognitive bias mitigation are lacking. This article describes debiasing tools, methods, proposals and other initiatives associated with the concept of cognitive bias mitigation, drawn from academic and professional disciplines concerned with the efficacy of human reasoning; most address mitigation tacitly rather than explicitly.

A long-standing debate regarding human decision making bears on the development of a theory and practice of bias mitigation. This debate contrasts the rational economic agent standard for decision making versus one grounded in human social needs and motivations. The debate also contrasts the methods used to analyze and predict human decision making, i.e. formal analysis emphasizing intellectual capacities versus heuristics emphasizing emotional states. This article identifies elements relevant to this debate.

Context

A large body of evidence[1][2][3][4][5][6][7][8][9][10] has established that a defining characteristic of cognitive biases is that they manifest automatically and unconsciously over a wide range of human reasoning, so even those aware of the existence of the phenomenon are unable to detect, let alone mitigate, their manifestation through awareness alone.

Real-world effects of cognitive bias

There are few studies explicitly linking cognitive biases to real-world incidents with highly negative outcomes. Examples:

  • One study[11] explicitly focused on cognitive bias as a potential contributor to a disaster-level event; this study examined the causes of the loss of several members of two expedition teams on Mount Everest on two consecutive days in 1996. This study concluded that several cognitive biases were 'in play' on the mountain, along with other human dynamics. This was a case of highly trained, experienced people breaking their own rules, apparently under the influence of the overconfidence effect, the sunk cost fallacy, the availability heuristic, and perhaps other cognitive biases. Five people, including both expedition leaders, lost their lives despite explicit warnings in briefings prior to and during the ascent of Everest. In addition to the leaders' mistakes, most team members, though they recognized their leader's faulty judgments, failed to insist on following through on the established ascent rules.
  • In a 2010 study,[12] German researchers examined the role that certain cognitive biases may have had in the global financial crisis beginning in 2007. Their conclusion was that the expertise level of stock analysts and traders made them highly resistant to signals that did not conform to their beliefs in the continuation of the status quo. In the grip of strong confirmation bias reinforced by the overconfidence effect and the status quo bias, they apparently could not see the signals of financial collapse, even after they had become evident to non-experts.
  • Similarly, Kahneman, a Nobel laureate in economics, reports[13] that highly experienced financial managers performed 'no better than chance', largely due to factors similar to those in the study above, an effect he termed the "illusion of skill".

There are numerous investigations of incidents determining that human error was central to highly negative potential or actual real-world outcomes, in which manifestation of cognitive biases is a plausible component. Examples:

  • The 'Gimli Glider' Incident,[14] in which a July 23, 1983 Air Canada flight from Montreal to Edmonton ran out of fuel 41,000 feet over Manitoba because of a measurement error on refueling, an outcome later determined to be the result of a series of unchecked assumptions made by ground personnel. Without power to operate radio, radar or other navigation aids, and only manual operation of the aircraft's control surfaces, the flight crew managed to locate an abandoned Canadian Air Force landing strip near Gimli, Manitoba. Without engine power, and with only manual wheel braking, the pilot put the aircraft down, complete with 61 passengers plus crew, and safely brought it to a stop. This outcome was the result of skill (the pilot had glider experience) and luck (the co-pilot just happened to know about the airstrip); there were no deaths, the damage to the aircraft was modest, and there were knowledgeable survivors to inform modifications to fueling procedures at all Canadian airports.
  • The Loss of the Mars Climate Orbiter,[15] which on September 23, 1999 "encountered Mars at an improperly low altitude" and was lost. NASA described the systemic cause of this mishap as an organizational failure, with the specific, proximate cause being unchecked assumptions across mission teams regarding the mix of metric and United States customary units used in different systems on the craft. A host of cognitive biases can be imagined in this situation: confirmation bias, hindsight bias, overconfidence effect, availability bias, and even the meta-bias bias blind spot.
  • The Sullivan Mine Incident[16] of May 18, 2006, in which two mining professionals and two paramedics at the closed Sullivan mine in British Columbia, Canada, all specifically trained in safety measures, lost their lives by failing to understand a life-threatening situation that in hindsight was obvious. The first person to succumb failed to accurately discern an anoxic environment at the bottom of a sump within a sampling shed, accessed by a ladder. After the first fatality, three other co-workers, all trained in hazardous operational situations, one after the other lost their lives in exactly the same manner, each apparently discounting the evidence of the previous victims' fate. The power of confirmation bias alone would be sufficient to explain why this happened, but other cognitive biases probably manifested as well.
  • The London Ambulance Service Failures, in which several Computer Aided Dispatch (CAD) system failures resulted in out-of-specification service delays and reports of deaths attributed to these delays. A 1992 system failure was particularly impactful, with service delays of up to 11 hours resulting in an estimated 30 unnecessary deaths in addition to hundreds of delayed medical procedures.[17] This incident is one example of how large computer system development projects exhibit major flaws in planning, design, execution, test, deployment and maintenance.[18][19]
  • Atul Gawande, an accomplished professional in the medical field, recounts[20] the results of an initiative at a major US hospital, in which a test run showed that doctors skipped at least one of only five steps in a third of certain surgery cases. Nurses were then given the authority and responsibility to catch doctors missing any step in a simple checklist aimed at reducing central line infections. In the subsequent 15-month period, infection rates went from 11% to 0%, eight deaths were avoided, and some $2 million in avoidable costs were saved.
  • Other disaster-level examples of negative outcomes resulting from human error, possibly including multiple cognitive biases: the Three Mile Island nuclear meltdown, the loss of the Space Shuttle Challenger, the Chernobyl nuclear reactor fire, the downing of an Iran Air passenger aircraft, the ineffective response to the Hurricane Katrina weather event, and many more.

Each of the approximately 250 cognitive biases known to date can also produce negative outcomes in our everyday lives, though rarely as serious as in the examples above. An illustrative selection, recounted in multiple studies:[1][2][3][4][5][6][7][8][9][10]

  • Confirmation bias, the tendency to seek out only that information that supports one's preconceptions, and to discount that which does not. For example, hearing only one side of a political debate, or, failing to accept the evidence that one's job has become redundant.
  • Framing effect, the tendency to react to how information is framed, beyond its factual content. For example, choosing no surgery when told it has a 10% failure rate, where one would have opted for surgery if told it has a 90% success rate, or, opting not to choose organ donation as part of driver's license renewal when the default is 'No'.
  • Anchoring bias, the tendency to produce an estimate near a cue amount that may or may not have been intentionally offered. For example, producing a quote based on a manager's preferences, or, negotiating a house purchase price from the starting amount suggested by a real estate agent rather than an objective assessment of value.
  • Sunk cost fallacy, the failure to reset one's expectations based on one's current situation rather than on what one has already invested. For example, refusing to pay again to purchase a replacement for a lost ticket to a desired entertainment, or, refusing to sell a sizable long stock position in a rapidly falling market.
  • Representativeness heuristic, the tendency to judge something as belonging to a class based on a few salient characteristics without accounting for base rates of those characteristics. For example, the belief that one will not become an alcoholic because one lacks some characteristic of an alcoholic stereotype, or, that one has a higher probability to win the lottery because one buys tickets from the same kind of vendor as several known big winners.
  • Halo effect, the tendency to attribute unverified capabilities in a person based on an observed capability. For example, believing an Oscar-winning actor's assertion regarding the harvest of Atlantic seals, or, assuming that a tall, handsome man is intelligent and kind.
  • Hindsight bias, the tendency to assess one's previous decisions as more effective than they were. For example, 'recalling' one's prediction that Vancouver would lose the 2011 Stanley Cup, or, 'remembering' to have identified the proximate cause of the 2007 Great Recession.
  • Availability heuristic, the tendency to estimate that what is easily remembered is more likely than that which is not. For example, estimating that an information meeting on municipal planning will be boring because the last such meeting you attended (on a different topic) was so, or, not believing your Member of Parliament's promise to fight for women's equality because he didn't show up to your home bake sale fundraiser for him.
  • Bandwagon effect, the tendency to do or believe what others do or believe. For example, voting for a political candidate because your father unfailingly voted for that candidate's party, or, not objecting to a bully's harassment because the rest of your peers don't.

To date

An increasing number of academic and professional disciplines are identifying means of cognitive bias mitigation. What follows is a characterization of the assumptions, theories, methods and results, in disciplines concerned with the efficacy of human reasoning, that plausibly bear on a theory and/or practice of cognitive bias mitigation. In most cases this is based on explicit reference to cognitive biases or their mitigation, in others on unstated but self-evident applicability. This characterization is organized along lines reflecting historical segmentation of disciplines, though in practice there is a significant amount of overlap.

Decision theory

Decision theory, a discipline with its roots grounded in neo-classical economics, is explicitly focused on human reasoning, judgment, choice and decision making, primarily in 'one-shot games' between two agents with or without perfect information. The theoretical underpinning of decision theory assumes that all decision makers are rational agents trying to maximize the economic expected value/utility of their choices, and that to accomplish this they utilize formal analytical methods such as mathematics, probability, statistics, and logic under cognitive resource constraints.[21][22][23]

Normative, or prescriptive, decision theory concerns itself with what people should do, given the goal of maximizing expected value/utility; in this approach there is no explicit representation in practitioners' models of unconscious factors such as cognitive biases, i.e. all factors are considered conscious choice parameters for all agents. Practitioners tend to treat deviations from what a rational agent would do as 'errors of irrationality', with the implication that cognitive bias mitigation can only be achieved by decision makers becoming more like rational agents, though no explicit measures for achieving this are proffered.
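The rational-agent standard described above can be made concrete with a small sketch. This is an illustration only, not drawn from the decision-theory literature; the option names and payoff numbers are invented:

```python
# Illustrative sketch of expected-utility maximization: a rational agent picks
# the option whose probability-weighted utility is highest. All names and
# numbers are invented for the example.

def expected_utility(option):
    """Sum of probability * utility over an option's possible outcomes."""
    return sum(p * u for p, u in option["outcomes"])

options = [
    {"name": "safe bond",   "outcomes": [(1.0, 100)]},              # certain payoff
    {"name": "risky stock", "outcomes": [(0.5, 250), (0.5, -20)]},  # a gamble
]

best = max(options, key=expected_utility)
print(best["name"], expected_utility(best))  # risky stock 115.0
```

A cognitive bias such as loss aversion shows up, on this view, as a systematic deviation from the choice this calculation prescribes.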

Positive, or descriptive, decision theory concerns itself with what people actually do; practitioners tend to acknowledge the persistent existence of 'irrational' behavior, and while some mention human motivation and biases as possible contributors to such behavior, these factors are not made explicit in their models. Practitioners tend to treat deviations from what a rational agent would do as evidence of important, but as yet not understood, decision-making variables, and have as yet no explicit or implicit contributions to make to a theory and practice of cognitive bias mitigation.

Game theory

Game theory, a discipline with roots in economics and system dynamics, is a method of studying strategic decision making in situations involving multi-step interactions with multiple agents with or without perfect information. As with decision theory, the theoretical underpinning of game theory assumes that all decision makers are rational agents trying to maximize the economic expected value/utility of their choices, and that to accomplish this they utilize formal analytical methods such as mathematics, probability, statistics, and logic under cognitive resource constraints.[24][25][26][27]

One major difference between decision theory and game theory is the notion of 'equilibrium', a situation in which no agent can improve its outcome by unilaterally deviating from its strategy. Despite analytical proofs of the existence of at least one equilibrium in a wide range of scenarios, game theory predictions, like those in decision theory, often do not match actual human choices.[28] As with decision theory, practitioners tend to view such deviations as 'irrational', and rather than attempt to model such behavior, by implication hold that cognitive bias mitigation can only be achieved by decision makers becoming more like rational agents.

In the full range of game theory models there are many that do not guarantee the existence of equilibria, i.e. there are conflict situations where no set of agents' strategies is in every agent's best interest. However, even when theoretical equilibria exist, i.e. when optimal decision strategies are available for all agents, real-life decision makers often do not find them; indeed, they sometimes do not appear even to try, suggesting that some agents are not consistently 'rational'. Game theory does not appear to accommodate any kind of agent other than the rational agent.
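The equilibrium notion can be illustrated with a brute-force search for pure-strategy equilibria in a two-player game. The payoff matrix below is the standard Prisoner's Dilemma with conventional values; the code is an illustrative sketch, not a game-theory library:

```python
# Brute-force search for pure-strategy Nash equilibria in a two-player game.
# payoffs[(row_action, col_action)] = (row_payoff, col_payoff); the values are
# the conventional Prisoner's Dilemma payoffs.
payoffs = {
    ("cooperate", "cooperate"): (3, 3),
    ("cooperate", "defect"):    (0, 5),
    ("defect",    "cooperate"): (5, 0),
    ("defect",    "defect"):    (1, 1),
}
actions = ["cooperate", "defect"]

def is_nash(row, col):
    """Neither player can gain by unilaterally deviating."""
    row_ok = all(payoffs[(r, col)][0] <= payoffs[(row, col)][0] for r in actions)
    col_ok = all(payoffs[(row, c)][1] <= payoffs[(row, col)][1] for c in actions)
    return row_ok and col_ok

equilibria = [(r, c) for r in actions for c in actions if is_nash(r, c)]
print(equilibria)  # [('defect', 'defect')]
```

The sole equilibrium, mutual defection, is worse for both players than mutual cooperation, which illustrates why observed human choices (people often cooperate) deviate from the rational-agent prediction.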

Behavioral economics

Unlike neo-classical economics and decision theory, behavioral economics and the related field, behavioral finance, explicitly consider the effects of social, cognitive and emotional factors on individuals' economic decisions. These disciplines combine insights from psychology and neo-classical economics to achieve this.[29][30][31]

Prospect theory[32] was an early inspiration for this discipline, and has been further developed by its practitioners. It is one of the earliest economic theories that explicitly acknowledge the notion of cognitive bias, though the model itself accounts for only a few, including loss aversion, anchoring and adjustment bias, endowment effect, and perhaps others. No mention is made in formal prospect theory of cognitive bias mitigation, and there is no evidence of peer-reviewed work on cognitive bias mitigation in other areas of this discipline.

However, Daniel Kahneman and others have authored recent articles in business and trade magazines addressing the notion of cognitive bias mitigation in a limited form.[33] These contributions assert that cognitive bias mitigation is necessary and offer general suggestions for how to achieve it, though the guidance is limited to only a few cognitive biases and is not self-evidently generalizable to others.

Neuroeconomics

Neuroeconomics is a discipline made possible by advances in brain activity imaging technologies. This discipline merges some of the ideas in experimental economics, behavioral economics, cognitive science and social science in an attempt to better understand the neural basis for human decision making.

fMRI experiments suggest that the limbic system is consistently involved in resolving economic decision situations that have emotional valence, the inference being that this part of the human brain is implicated in creating the deviations from rational agent choices noted in emotionally valent economic decision making. Practitioners in this discipline have demonstrated correlations between brain activity in this part of the brain and prospection activity, and neuronal activation has been shown to have measurable, consistent effects on decision making.[34][35][36][37][38] These results must be considered speculative and preliminary, but are nonetheless suggestive of the possibility of real-time identification of brain states associated with cognitive bias manifestation, and the possibility of purposeful interventions at the neuronal level to achieve cognitive bias mitigation.

Cognitive psychology

Several streams of investigation in this discipline are noteworthy for their possible relevance to a theory of cognitive bias mitigation.

One approach to mitigation originally suggested by Daniel Kahneman and Amos Tversky, expanded upon by others, and applied in real-life situations, is reference class forecasting. This approach involves three steps: with a specific project in mind, identify a number of past projects that share a large number of elements with the project under scrutiny; for this group of projects, establish a probability distribution of the parameter that is being forecast; and, compare the specific project with the group of similar projects, in order to establish the most likely value of the selected parameter for the specific project. This simply stated method masks potential complexity regarding application to real-life projects: few projects are characterizable by a single parameter; multiple parameters exponentially complicate the process; gathering sufficient data on which to build robust probability distributions is problematic; and, project outcomes are rarely unambiguous and their reportage is often skewed by stakeholders' interests. Nonetheless, this approach has merit as part of a cognitive bias mitigation protocol when the process is applied with a maximum of diligence, in situations where good data is available and all stakeholders can be expected to cooperate.
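The three steps can be sketched as follows; the reference class and cost-overrun ratios are invented for illustration:

```python
# Illustrative sketch of reference class forecasting; all data invented.
import statistics

# Step 1: identify past projects judged similar to the one being forecast
# (here, each value is actual cost divided by estimated cost).
reference_class = [1.0, 1.0, 1.25, 1.25, 1.25, 1.5, 1.5, 1.75, 2.0, 2.5]

# Step 2: establish a distribution of the parameter being forecast; here we
# take only its median as a summary.
median_overrun = statistics.median(reference_class)

# Step 3: position the specific project within the reference class, e.g. by
# scaling its own (likely optimistic) estimate by the typical overrun.
naive_estimate = 500_000
reference_forecast = naive_estimate * median_overrun
print(median_overrun, reference_forecast)  # 1.375 687500.0
```

In practice step 2 would retain the full distribution so that a forecast can be stated at a chosen confidence level rather than as a single number.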

A concept rooted in the actual machinery of human reasoning, bounded rationality may inform significant advances in cognitive bias mitigation. Originally conceived by Herbert A. Simon[39] in the 1950s and leading to the concept of satisficing as opposed to optimizing, the idea found experimental expression in the work of Gerd Gigerenzer and others. One line of Gigerenzer's work led to the "Fast and Frugal" framing of the human reasoning mechanism,[40] which focuses on the primacy of 'recognition' in decision making, backed up by tie-resolving heuristics operating in a low-cognitive-resource environment. In a series of objective tests, models based on this approach outperformed models based on rational agents maximizing their utility using formal analytical methods. One contribution of this approach to a theory and practice of cognitive bias mitigation is that it addresses mitigation without explicitly targeting individual cognitive biases, focusing instead on the reasoning mechanism itself so as to avoid the manifestation of cognitive biases.
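The recognition-first style of inference can be caricatured in a few lines. This is an illustrative sketch under invented data (city names and a single tie-breaking cue), not Gigerenzer's actual model:

```python
# Sketch of a 'Fast and Frugal' decision: which of two cities is larger?
# Recognition decides outright; a single cue breaks ties; otherwise guess.
# All data below is invented for the example.

recognized = {"Berlin", "Munich"}  # cities the agent has heard of
has_team = {"Berlin": True, "Munich": True, "Herne": False}  # tie-breaking cue

def which_is_larger(a, b):
    """Infer the larger city using recognition first, then one cue."""
    if (a in recognized) != (b in recognized):
        return a if a in recognized else b   # recognition decides outright
    if has_team.get(a) != has_team.get(b):   # both (un)recognized: use the cue
        return a if has_team.get(a) else b
    return a                                 # no discriminating cue: guess

print(which_is_larger("Berlin", "Herne"))   # Berlin (recognition decides)
```

Note how little computation, and how little information, the procedure uses; the claim of this research program is that such heuristics match or beat full analytical models in many real environments.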

Intensive situational training can provide individuals with what appears to be cognitive bias mitigation in decision making, but amounts to a fixed strategy of selecting the single best response to recognized situations regardless of the 'noise' in the environment. Studies and anecdotes reported in popular-audience media[13][20][41][42] of firefighter captains, military platoon leaders and others making correct snap judgments under extreme duress suggest that these responses are likely not generalizable, and may contribute to a theory and practice of cognitive bias mitigation only the general idea of intensive domain-specific training.

Similarly, expert-level training in such foundational disciplines as mathematics, statistics, probability, logic, etc. can be useful for cognitive bias mitigation when the expected standard of performance reflects such formal analytical methods. However, a study of software engineering professionals[43] suggests that for the task of estimating software projects, despite the strong analytical aspect of this task, standards of performance focusing on workplace social context were much more dominant than formal analytical methods. This finding, if generalizable to other tasks and disciplines, would discount the potential of expert-level training as a cognitive bias mitigation approach, and could contribute a narrow but important idea to a theory and practice of cognitive bias mitigation.

Laboratory experiments in which cognitive bias mitigation is an explicit goal are rare. One 1980 study[44] explored the notion of reducing the optimism bias by showing subjects other subjects' outputs from a reasoning task, with the result that their subsequent decision-making was somewhat debiased.

A recent research effort by Morewedge and colleagues (2015) found evidence for domain-general forms of debiasing. In two longitudinal experiments, debiasing training techniques featuring interactive games that elicited six cognitive biases (anchoring, bias blind spot, confirmation bias, fundamental attribution error, projection bias, and representativeness), and that provided participants with individualized feedback, mitigating strategies, and practice, resulted in an immediate reduction of more than 30% in commission of the biases and a long-term (2- to 3-month) reduction of more than 20%. Instructional videos were also effective, but less so than the games.[45]

Evolutionary psychology

This discipline explicitly challenges the prevalent view that humans are rational agents maximizing expected value/utility, using formal analytical methods to do so. Practitioners such as Cosmides, Tooby, Haselton, Confer and others posit that cognitive biases are more properly referred to as cognitive heuristics, and should be viewed as a toolkit of cognitive shortcuts[46][47][48][49] selected for by evolutionary pressure and thus are features rather than flaws, as assumed in the prevalent view. Theoretical models and analyses supporting this view are plentiful.[50] This view suggests that negative reasoning outcomes arise primarily because the reasoning challenges faced by modern humans, and the social and political context within which these are presented, make demands on our ancient 'heuristic toolkit' that at best create confusion as to which heuristics to apply in a given situation, and at worst generate what adherents of the prevalent view call 'reasoning errors'.

In a similar vein, Mercier and Sperber describe a theory[51] of confirmation bias, and possibly other cognitive biases, that departs radically from the prevalent view that human reasoning evolved to assist individual economic decisions. They suggest instead that reasoning evolved as a social phenomenon whose goal is argumentation: to convince others, and to be wary when others try to convince us. It is too early to tell whether this idea applies more generally to other cognitive biases, but the point of view supporting the theory may be useful in the construction of a theory and practice of cognitive bias mitigation.

There is an emerging convergence between evolutionary psychology and the concept of our reasoning mechanism being segregated (approximately) into 'System 1' and 'System 2'.[13][46] In this view, System 1 is the 'first line' of cognitive processing of all perceptions, including internally generated 'pseudo-perceptions', which automatically, subconsciously and near-instantaneously produces emotionally valenced judgments of their probable effect on the individual's well-being. By contrast, System 2 is responsible for 'executive control', taking System 1's judgments as advisories, making future predictions, via prospection, of their actualization and then choosing which advisories, if any, to act on. In this view, System 2 is slow, simple-minded and lazy, usually defaulting to System 1 advisories and overriding them only when intensively trained to do so or when cognitive dissonance would result. In this view, our 'heuristic toolkit' resides largely in System 1, conforming to the view of cognitive biases being unconscious, automatic and very difficult to detect and override. Evolutionary psychology practitioners emphasize that our heuristic toolkit, despite the apparent abundance of 'reasoning errors' attributed to it, actually performs exceptionally well, given the rate at which it must operate, the range of judgments it produces, and the stakes involved. The System 1/2 view of the human reasoning mechanism appears to have empirical plausibility (see Neuroscience, next, and for empirical and theoretical arguments against, see [52][53][54]) and thus may contribute to a theory and practice of cognitive bias mitigation.

Neuroscience

Neuroscience offers empirical support for the concept of segregating the human reasoning mechanism into System 1 and System 2, as described above, based on brain activity imaging experiments using fMRI technology. While this notion must remain speculative until further work is done, it appears to be a productive basis for conceiving options for constructing a theory and practice of cognitive bias mitigation.[55][56]

Anthropology

Anthropologists have provided generally accepted scenarios[57][58][59][60][61] of how our progenitors lived and what was important in their lives. These scenarios of social, political, and economic organization are not uniform throughout history or geography, but there is a degree of stability throughout the Paleolithic era, and the Holocene in particular. This, along with the findings in Evolutionary psychology and Neuroscience above, suggests that our cognitive heuristics are at their best when operating in a social, political and economic environment most like that of the Paleolithic/Holocene. If this is true, then one possible means to achieve at least some cognitive bias mitigation is to mimic, as much as possible, Paleolithic/Holocene social, political and economic scenarios when one is performing a reasoning task that could attract negative cognitive bias effects.

Human reliability engineering

A number of paradigms, methods and tools for improving human performance reliability[20][62][63][64][65][66] have been developed within the discipline of human reliability engineering. Although there is some attention paid to the human reasoning mechanism itself, the dominant approach is to anticipate problematic situations, constrain human operations through process mandates, and guide human decisions through fixed response protocols specific to the domain involved. While this approach can produce effective responses to critical situations under stress, the protocols involved must be viewed as having limited generalizability beyond the domain for which they were developed, with the implication that solutions in this discipline may provide only generic frameworks to a theory and practice of cognitive bias mitigation.
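The fixed-response-protocol approach can be sketched as a simple lookup from recognized situations to mandated steps; the domain and rules below are invented for illustration:

```python
# Toy sketch (invented domain and rules) of a fixed response protocol:
# recognized situations map to mandated responses, deliberately leaving
# no room for in-the-moment judgment where biases could manifest.

PROTOCOL = {
    "low_oxygen_alarm": ["evacuate area", "ventilate", "test atmosphere"],
    "fuel_quantity_doubt": ["stop fueling", "recompute with second method",
                            "require independent sign-off"],
}

def respond(situation):
    """Return the mandated steps, or escalate when no protocol matches."""
    return PROTOCOL.get(situation, ["escalate to supervisor"])

print(respond("low_oxygen_alarm"))
print(respond("unknown_event"))  # unrecognized situations escalate
```

The escalation default illustrates the limitation noted above: outside the situations the protocol anticipates, the approach offers no guidance and generalizes poorly.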

Machine learning

Machine learning, a branch of artificial intelligence, has been used to investigate human learning and decision making.[67]

One technique particularly applicable to cognitive bias mitigation is neural network learning and choice selection, an approach inspired by the imagined structure and function of actual biological neural networks in the human brain. The multilayer, cross-connected signal collection and propagation structure typical of neural network models, where weights govern the contribution of signals to each connection, allows very small models to perform rather complex decision-making tasks with high fidelity.

In principle, such models are capable of modeling decision making that takes account of human needs and motivations within social contexts, and suggest their consideration in a theory and practice of cognitive bias mitigation. Challenges to realizing this potential: accumulating the considerable amount of appropriate real world 'training sets' for the neural network portion of such models; characterizing real-life decision-making situations and outcomes so as to drive models effectively; and the lack of direct mapping from a neural network's internal structure to components of the human reasoning mechanism.
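The multilayer, weighted-connection structure described above can be illustrated with a minimal fixed-weight network computing XOR, a task no single-layer network can solve. The weights are hand-chosen for clarity, not learned:

```python
# Minimal two-layer neural network with hand-chosen (not learned) weights.
# Each unit forms a weighted sum of its inputs and passes it through a
# nonlinearity; the hidden layer lets the network compute XOR.
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def forward(x1, x2):
    # hidden layer: two units with large weights approximating logic gates
    h1 = sigmoid(20 * x1 + 20 * x2 - 10)    # ~ OR(x1, x2)
    h2 = sigmoid(-20 * x1 - 20 * x2 + 30)   # ~ NAND(x1, x2)
    # output layer combines the hidden signals
    return sigmoid(20 * h1 + 20 * h2 - 30)  # ~ AND(h1, h2) = XOR(x1, x2)

for x1, x2 in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print(x1, x2, round(forward(x1, x2)))   # outputs 0, 1, 1, 0
```

In a real application the weights would be fitted to a training set of decisions and outcomes, which is precisely where the data-gathering challenges listed above arise.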

Software engineering

This discipline, though not focused on improving human reasoning outcomes as an end goal, is one in which the need for such improvement has been explicitly recognized,[18][19] though the term "cognitive bias mitigation" is not universally used.

One study[68] identifies specific steps to counter the effects of confirmation bias in certain phases of the software engineering lifecycle.

Another study[43] takes a step back from focusing on cognitive biases and describes a framework for identifying "Performance Norms", criteria by which reasoning outcomes are judged correct or incorrect, so as to determine when cognitive bias mitigation is required, to guide identification of the biases that may be 'in play' in a real-world situation, and subsequently to prescribe their mitigations. This study refers to a broad research program with the goal of moving toward a theory and practice of cognitive bias mitigation.

Other

Other initiatives aimed directly at a theory and practice of cognitive bias mitigation may exist within other disciplines under different labels than employed here.

References

  1. ^ a b Ariely, D. (2008). Predictably Irrational: The Hidden Forces That Shape Our Decisions, Harper Collins.
  2. ^ a b Epley, N.; Gilovich, T. (2006). "The Anchoring-and-Adjustment Heuristic: Why the Adjustments are Insufficient". Psychological Science. 17 (4): 311–318. doi:10.1111/j.1467-9280.2006.01704.x. PMID 16623688. S2CID 10279390.
  3. ^ a b Gigerenzer, G. (2006). "Bounded and Rational." Contemporary Debates in Cognitive Science. R. J. Stainton, Blackwell Publishing: 115–133.
  4. ^ a b Gilovich, T. (1991). How We Know What Isn't So: The Fallibility of Human Reason in Everyday Life. New York, NY, The Free Press.
  5. ^ a b Hammond, J. S.; Keeney, R. L.; et al. (2006). "The Hidden Traps in Decision Making". Harvard Business Review. 84 (1): 118–126.
  6. ^ a b Haselton, M. G.; Nettle, D.; et al. (2005). "The Evolution of Cognitive Bias". In D. M. Buss (ed.), Handbook of Evolutionary Psychology. Hoboken, NJ: Wiley: 724–746.
  7. ^ a b Henrich; et al. (2010). "Markets, Religion, Community Size, and the Evolution of Fairness and Punishment". Science. 327 (5972): 1480–1484. Bibcode:2010Sci...327.1480H. CiteSeerX 10.1.1.714.7830. doi:10.1126/science.1182238. PMID 20299588. S2CID 4803905.
  8. ^ a b Lehrer, J. (2009). How We Decide. New York, NY, Houghton Mifflin Harcourt.
  9. ^ a b Nozick, R. (1993). The Nature of Rationality. Ewing, NJ, Princeton University Press.
  10. ^ a b Schacter, D. L. (1999). "The Seven Sins of Memory: Insights From Psychology and Cognitive Neuroscience". American Psychologist. 54 (3): 182–203. doi:10.1037/0003-066x.54.3.182. PMID 10199218. S2CID 14882268.
  11. ^ Roberto, M. A. (2002). "Lessons from Everest: The Interaction of Cognitive Bias, Psychological Safety, and System Complexity." California Management Review. 45 (1): 136–158.
  12. ^ Knauff, M.; Budeck, C.; Wolf, A. G.; Hamburger, K. (2010). "The Illogicality of Stock-Brokers: Psychological Experiments on the Effects of Prior Knowledge and Belief Biases on Logical Reasoning in Stock Trading". PLOS ONE. 5 (10): e13483. Bibcode:2010PLoSO...513483K. doi:10.1371/journal.pone.0013483. PMC 2956684. PMID 20976157.
  13. ^ a b c Kahneman, D. (2011). Thinking, Fast and Slow, Doubleday Canada.
  14. ^ Aviation Safety Network (1983). "Gimli Glider Accident Report," occurrence ID 19830723-0. http://aviation-safety.net
  15. ^ Stephenson, Arthur G.; LaPiana, Lia S.; Mulville, Daniel R.; Rutledge, Peter J.; Bauer, Frank H.; Folta, David; Dukeman, Greg A.; Sackheim, Robert; et al. (1999-11-10). "Mars Climate Orbiter Mishap Investigation Board Phase I Report." National Aeronautics and Space Administration.
  16. ^ British Columbia Ministry of Energy, Mines and Petroleum Resources: Sullivan Mine Accident Report, May 17, 2006.
  17. ^ Beynon-Davies, P., "Information systems 'failure': the case of the LASCAD project", European Journal of Information Systems, 1995.
  18. ^ a b Mann, C. C. (2002). "Why Software is So Bad." Technology Review, MIT, July 2002.
  19. ^ a b Stacy, W.; MacMillan, J. (1995). "Cognitive Bias in Software Engineering". Communications of the ACM. 38 (6): 57–63. doi:10.1145/203241.203256. S2CID 1505473.
  20. ^ a b c Gawande, A. (2010). The Checklist Manifesto: How to Get Things Right. New York, NY, Metropolitan Books.
  21. ^ Kahneman, D.; Thaler, R. (2006). "Utility Maximization and Experienced Utility". Journal of Economic Perspectives. 20 (1): 221–234. doi:10.1257/089533006776526076.
  22. ^ Frey, B.; Stutzer, A. (2002). "What Can Economists Learn from Happiness Research?" (PDF). Journal of Economic Literature. 40 (2): 402–35. doi:10.1257/002205102320161320. S2CID 13967611.
  23. ^ Kahneman, D. (2000). "Experienced Utility and Objective Happiness: A Moment-Based Approach." Chapter 37 in: D. Kahneman and A. Tversky (Eds.) "Choices, Values and Frames." New York: Cambridge University Press and the Russell Sage Foundation, 1999.
  24. ^ Binmore, K. (2007). "A Very Short Introduction to Game Theory." Oxford University Press.
  25. ^ Camerer, C. F., Ho, T.-H., Chong, J.-K. (2002). "A Cognitive Hierarchy Theory of One-Shot Games and Experimental Analysis." Forthcoming, Quarterly Journal of Economics.
  26. ^ Broseta, B., Costa-Gomes, M., Crawford, V. (2000). "Cognition and Behavior in Normal-Form Games: An Experimental Study." Department of Economics, University of California at San Diego, Permalink: http://www.escholarship.org/uc/item/0fp8278k.
  27. ^ Myerson, R. B. (1991). "Game Theory: Analysis of Conflict." Harvard University Press.
  28. ^ Wright J. R., Leyton-Brown, K., Behavioral Game-Theoretic Models: A Bayesian Framework For Parameter Analysis, to appear in Proceedings of the 11th International Conference on Autonomous Agents and Multiagent Systems (AAMAS 2012), (8 pages), 2012.
  29. ^ Frey, B.; Stutzer, A. (2002). "What Can Economists Learn from Happiness Research?" (PDF). Journal of Economic Literature. 40 (2): 402–35. doi:10.1257/002205102320161320. S2CID 13967611.
  30. ^ Kahneman, D. "Maps of Bounded Rationality: Psychology for Behavioral Economics." American Economic Review (December 2003): 1449–1475.
  31. ^ Mullainathan, Sendhil, and Richard Thaler. "Behavioral Economics." MIT Department of Economics Working Paper 00-27. (September 2000).
  32. ^ Kahneman, D.; Tversky, A. (1979). "Prospect Theory: An Analysis of Decision Under Risk". Econometrica. 47 (2): 263–291. CiteSeerX 10.1.1.407.1910. doi:10.2307/1914185. JSTOR 1914185.
  33. ^ Kahneman, D., Lovallo, D., Sibony, O. (2011). "Before You Make That Big Decision." Harvard Business Review, June, 2011.
  34. ^ Loewenstein, George; Rick, Scott; Cohen, Jonathan D. (January 2008). "Neuroeconomics". Annual Review of Psychology. 59 (1): 647–672. doi:10.1146/annurev.psych.59.103006.093710. PMID 17883335.
  35. ^ Rustichini, A (2009). "Neuroeconomics: What have we found, and what should we search for?". Current Opinion in Neurobiology. 19 (6): 672–677. doi:10.1016/j.conb.2009.09.012. PMID 19896360. S2CID 2281817.
  36. ^ Padoa-Schioppa, C.; Assad, J.A. (2007). "The Representation of Economic Value in the Orbitofrontal Cortex is Invariant for Changes of Menu". Nature Neuroscience. 11 (1): 95–102. doi:10.1038/nn2020. PMC 2646102. PMID 18066060.
  37. ^ Spreng, R. N., Mar, R. A., Kim, A. S. N. (2009). "The Common Neural Basis of Autobiographical Memory, Prospection, Navigation, Theory of Mind, and the Default Mode: A Quantitative Meta-Analysis." Journal of Cognitive Neuroscience. 21 (3): 489–510.
  38. ^ Jamison, J.; Wegener, J. (2010). "Multiple Selves in Intertemporal Choice" (PDF). Journal of Economic Psychology. 31 (5): 832–839. doi:10.1016/j.joep.2010.03.004.
  39. ^ Simon, H. A. (1991). "Bounded Rationality and Organizational Learning". Organization Science. 2 (1): 125–134. doi:10.1287/orsc.2.1.125.
  40. ^ Gigerenzer, G.; Goldstein, D. G. (1996). "Reasoning the Fast and Frugal Way: Models of Bounded Rationality". Psychological Review. 103 (4): 650–669. CiteSeerX 10.1.1.174.4404. doi:10.1037/0033-295x.103.4.650. PMID 8888650.
  41. ^ Gladwell, M. (2006). Blink: The Power of Thinking Without Thinking. New York, NY, Little, Brown and Company.
  42. ^ Shermer, M. (2010). A review of Paul Thagard's "The Brain and the Meaning of Life". Skeptic Magazine. Altadena, CA, Skeptics Society. 16: 60–61.
  43. ^ a b Conroy, P., Kruchten, P. (2012). "Performance Norms: An Approach to Reducing Rework in Software Development." 2012 IEEE Canadian Conference on Electrical and Computer Engineering.
  44. ^ Weinstein, N. D. (1980). "Unrealistic Optimism About Future Life Events". Journal of Personality and Social Psychology. 39 (5): 806–820. CiteSeerX 10.1.1.535.9244. doi:10.1037/0022-3514.39.5.806.
  45. ^ Morewedge, C. K.; Yoon, H.; Scopelliti, I.; Symborski, C. W.; Korris, J. H.; Kassam, K. S. (13 August 2015). "Debiasing Decisions: Improved Decision Making With a Single Training Intervention" (PDF). Policy Insights from the Behavioral and Brain Sciences. 2 (1): 129–140. doi:10.1177/2372732215600886. S2CID 4848978.
  46. ^ a b Cosmides, L., Tooby, J. "Evolutionary Psychology: A Primer." http://www.psych.ucsb.edu/research/cep/primer.html Archived 2009-02-28 at the Wayback Machine.
  47. ^ Haselton, M. G.; Bryant, G. A.; Wilke, A.; Frederick, D. A.; Galperin, A.; Frankenhuis, W. E.; Moore, T. (2009). "Adaptive Rationality: An Evolutionary Perspective on Cognitive Bias". Social Cognition. 27 (5): 733–763. CiteSeerX 10.1.1.220.6198. doi:10.1521/soco.2009.27.5.733.
  48. ^ Haselton, M. G.; Nettle, D.; et al. (2005). "The Evolution of Cognitive Bias." Handbook of Evolutionary Psychology. D. M. Buss (Ed.). Hoboken, Wiley: 724–746.
  49. ^ Confer; et al. (2010). "Evolutionary Psychology: Controversies, Questions, Prospects, and Limitations". American Psychologist. 65 (2): 110–126. CiteSeerX 10.1.1.601.8691. doi:10.1037/a0018413. PMID 20141266.
  50. ^ Chudek, M.; Henrich, J. (2011). "Culture–Gene Coevolution, Norm-Psychology and the Emergence of Human Prosociality". Trends in Cognitive Sciences. 15 (5): 218–226. doi:10.1016/j.tics.2011.03.003. PMID 21482176. S2CID 16710885.
  51. ^ Mercier, H.; Sperber, D. (2011). "Argumentative Theory". Behavioral and Brain Sciences. 34 (2): 57–74. doi:10.1017/s0140525x10000968. PMID 21447233. S2CID 5669039.
  52. ^ Evans, J. St. B. T. (2006). "Dual system theories of cognition: Some issues." Proceedings of the Annual Meeting of the Cognitive Science Society, 28(28). https://cloudfront.escholarship.org/dist/prd/content/qt76d4d629/qt76d4d629.pdf
  53. ^ Garcia-Marques, L.; Ferreira, M. B. (2011). "Friends and foes of theory construction in psychological science: Vague dichotomies, unified theories of cognition, and the new experimentalism". Perspectives on Psychological Science. 6 (2): 192–201. doi:10.1177/1745691611400239. PMID 26162138. S2CID 118743.
  54. ^ Fiedler, K. & Hütter, M. (2014). "The limits of automaticity." In J. Sherman, B. Gawronski, & Y. Trope (Eds.), Dual Processes in Social Psychology (pp. 497–513). New York: Guilford Publications, Inc. https://www.researchgate.net/profile/Mandy_Huetter/publication/308789747_The_limits_of_automaticity/links/58ad4fd24585155ae77aefac/The-limits-of-automaticity.pdf
  55. ^ Damasio, A. (2010). Self Comes to Mind: Constructing the Conscious Brain. New York, NY, Pantheon.
  56. ^ Changeux, J.-P. P., A. Damasio, et al., Eds. (2007). Neurobiology of Human Values (Research and Perspectives in Neurosciences). Heidelberg, Germany, Springer.
  57. ^ Ember, C. R. (1978). Myths About Hunter-Gatherers, University of Pittsburgh Of the Commonwealth System of Higher Education, 17(4), pp 439–448.
  58. ^ Gabow, S. L. (1977). "Population Structure and the Rate of Hominid Brain Evolution". Journal of Human Evolution. 6 (7): 643–665. doi:10.1016/s0047-2484(77)80136-x.
  59. ^ Hamilton, M. J.; Milne, B. T.; Walker, R.S.; Burger, O.; Brown, J.H. (2007). "The complex Structure of Hunter–Gatherer Social Networks". Proceedings of the Royal Society B. 2007 (274): 2195–2203. doi:10.1098/rspb.2007.0564. PMC 2706200. PMID 17609186.
  60. ^ Kuhn, S. L.; Stiner, M. C. (2006). "What's a Mother To Do? The Division of Labor among Neanderthals and Modern Humans in Eurasia". Current Anthropology. 47 (6): 953–981. doi:10.1086/507197. S2CID 42981328.
  61. ^ Marlowe, F. W. (2005). "Hunter-Gatherers and Human Evolution". Evolutionary Anthropology: Issues, News, and Reviews. 14 (2): 54–67. doi:10.1002/evan.20046. S2CID 53489209.
  62. ^ Gertman, D., Blackman, H., Marble, J., Byers, J. and Smith, C. (2005). The SPAR-H human reliability analysis method.
  63. ^ Hollnagel, E. (1998). Cognitive reliability and error analysis method: CREAM. Elsevier.
  64. ^ Roth, E. et al. (1994). An empirical investigation of operator performance in cognitive demanding simulated emergencies. NUREG/CR-6208, Westinghouse Science and Technology Center. Report prepared for Nuclear Regulatory Commission.
  65. ^ Wiegmann, D. & Shappell, S. (2003). A human error approach to aviation accident analysis: The human factors analysis and classification system. Ashgate.
  66. ^ Wilson, J.R. (1993). SHEAN (Simplified Human Error Analysis code) and automated THERP. United States Department of Energy Technical Report Number WINCO–11908.
  67. ^ Sutton, R. S., Barto, A. G. (1998). Reinforcement Learning: An Introduction. Adaptive Computation and Machine Learning series. Cambridge, MA, MIT Press. ISBN 978-0-262-19398-6.
  68. ^ Calikli, G., Bener, A., Arslan, B. (2010). "An Analysis of the Effects of Company Culture, Education and Experience on Confirmation Bias Levels of Software Developers and Testers." ACM/IEEE 32nd International Conference on Software Engineering – ICSE 2010. Volume 2: pp. 187–190.

External links

  • Fast and Frugal Heuristics
  • Institute of Ergonomics and Human Factors
  • International Machine Learning Society
  • Temporal Difference Learning
  • Cognitive Neuroscience Society
  • Max Planck Institute for Human Development

cognitive, bias, mitigation, broader, coverage, this, topic, debiasing, prevention, reduction, negative, effects, cognitive, biases, unconscious, automatic, influences, human, judgment, decision, making, that, reliably, produce, reasoning, errors, coherent, co. For broader coverage of this topic see Debiasing Cognitive bias mitigation is the prevention and reduction of the negative effects of cognitive biases unconscious automatic influences on human judgment and decision making that reliably produce reasoning errors Coherent comprehensive theories of cognitive bias mitigation are lacking This article describes debiasing tools methods proposals and other initiatives in academic and professional disciplines concerned with the efficacy of human reasoning associated with the concept of cognitive bias mitigation most address mitigation tacitly rather than explicitly A long standing debate regarding human decision making bears on the development of a theory and practice of bias mitigation This debate contrasts the rational economic agent standard for decision making versus one grounded in human social needs and motivations The debate also contrasts the methods used to analyze and predict human decision making i e formal analysis emphasizing intellectual capacities versus heuristics emphasizing emotional states This article identifies elements relevant to this debate Contents 1 Context 2 Real world effects of cognitive bias 3 To date 3 1 Decision theory 3 2 Game theory 3 3 Behavioral economics 3 4 Neuroeconomics 3 5 Cognitive psychology 3 6 Evolutionary psychology 3 7 Neuroscience 3 8 Anthropology 3 9 Human reliability engineering 3 10 Machine learning 3 11 Software engineering 3 12 Other 4 See also 5 References 6 External linksContext editA large body of evidence 1 2 3 4 5 6 7 8 9 10 has established that a defining characteristic of cognitive biases is that they manifest automatically and unconsciously over a wide range of human reasoning so even those aware of the 
existence of the phenomenon are unable to detect let alone mitigate their manifestation via awareness only Real world effects of cognitive bias editThere are few studies explicitly linking cognitive biases to real world incidents with highly negative outcomes Examples One study 11 explicitly focused on cognitive bias as a potential contributor to a disaster level event this study examined the causes of the loss of several members of two expedition teams on Mount Everest on two consecutive days in 1996 This study concluded that several cognitive biases were in play on the mountain along with other human dynamics This was a case of highly trained experienced people breaking their own rules apparently under the influence of the overconfidence effect the sunk cost fallacy the availability heuristic and perhaps other cognitive biases Five people including both expedition leaders lost their lives despite explicit warnings in briefings prior to and during the ascent of Everest In addition to the leaders mistakes most team members though they recognized their leader s faulty judgments failed to insist on following through on the established ascent rules In a 2010 MarketBeat study 12 German researchers examined the role that certain cognitive biases may have had in the global financial crisis beginning in 2007 Their conclusion was that the expertise level of stock analysts and traders made them highly resistant to signals that did not conform to their beliefs in the continuation of the status quo In the grip of strong confirmation bias reinforced by the overconfidence effect and the status quo bias they apparently could not see the signals of financial collapse even after they had become evident to non experts Similarly Kahneman a Nobel Laureate in Economics reports 13 in a peer reviewed study that highly experienced financial managers performed no better than chance largely due to similar factors as reported in the study above which he termed the illusion of skill There 
are numerous investigations of incidents determining that human error was central to highly negative potential or actual real world outcomes in which manifestation of cognitive biases is a plausible component Examples The Gimli Glider Incident 14 in which a July 23 1983 Air Canada flight from Montreal to Edmonton ran out of fuel 41 000 feet over Manitoba because of a measurement error on refueling an outcome later determined to be the result of a series of unchecked assumptions made by ground personnel Without power to operate radio radar or other navigation aids and only manual operation of the aircraft s control surfaces the flight crew managed to locate an abandoned Canadian Air Force landing strip near Gimli Manitoba Without engine power and with only manual wheel braking the pilot put the aircraft down complete with 61 passengers plus crew and safely brought it to a stop This outcome was the result of skill the pilot had glider experience and luck the co pilot just happened to know about the airstrip there were no deaths the damage to the aircraft was modest and there were knowledgeable survivors to inform modifications to fueling procedures at all Canadian airports The Loss of the Mars Climate Orbiter 15 which on September 23 1999 encountered Mars at an improperly low altitude and was lost NASA described the systemic cause of this mishap as an organizational failure with the specific proximate cause being unchecked assumptions across mission teams regarding the mix of metric and United States customary units used in different systems on the craft A host of cognitive biases can be imagined in this situation confirmation bias hindsight bias overconfidence effect availability bias and even the meta bias bias blind spot The Sullivan Mine Incident 16 of May 18 2006 in which two mining professionals and two paramedics at the closed Sullivan mine in British Columbia Canada all specifically trained in safety measures lost their lives by failing to understand a life 
threatening situation that in hindsight was obvious The first person to succumb failed to accurately discern an anoxic environment at the bottom of a sump within a sampling shed accessed by a ladder After the first fatality three other co workers all trained in hazardous operational situations one after the other lost their lives in exactly the same manner each apparently discounting the evidence of the previous victims fate The power of confirmation bias alone would be sufficient to explain why this happened but other cognitive biases probably manifested as well The London Ambulance Service Failures in which several Computer Aided Dispatch CAD system failures resulted in out of specification service delays and reports of deaths attributed to these delays A 1992 system failure was particularly impactful with service delays of up to 11 hours resulting in an estimated 30 unnecessary deaths in addition to hundreds of delayed medical procedures 17 This incident is one example of how large computer system development projects exhibit major flaws in planning design execution test deployment and maintenance 18 19 Atul Gawande an accomplished professional in the medical field recounts 20 the results of an initiative at a major US hospital in which a test run showed that doctors skipped at least one of only 5 steps in 1 3 of certain surgery cases after which nurses were given the authority and responsibility to catch doctors missing any steps in a simple checklist aimed at reducing central line infections In the subsequent 15 month period infection rates went from 11 to 0 8 deaths were avoided and some 2 million in avoidable costs were saved Other disaster level examples of negative outcomes resulting from human error possibly including multiple cognitive biases the Three Mile Island nuclear meltdown the loss of the Space Shuttle Challenger the Chernobyl nuclear reactor fire the downing of an Iran Air passenger aircraft the ineffective response to the Hurricane Katrina 
weather event and many more Each of the approximately 250 cognitive biases known to date can also produce negative outcomes in our everyday lives though rarely as serious as in the examples above An illustrative selection recounted in multiple studies 1 2 3 4 5 6 7 8 9 10 Confirmation bias the tendency to seek out only that information that supports one s preconceptions and to discount that which does not For example hearing only one side of a political debate or failing to accept the evidence that one s job has become redundant Framing effect the tendency to react to how information is framed beyond its factual content For example choosing no surgery when told it has a 10 failure rate where one would have opted for surgery if told it has a 90 success rate or opting not to choose organ donation as part of driver s license renewal when the default is No Anchoring bias the tendency to produce an estimate near a cue amount that may or may not have been intentionally offered For example producing a quote based on a manager s preferences or negotiating a house purchase price from the starting amount suggested by a real estate agent rather than an objective assessment of value Gambler s fallacy aka sunk cost bias the failure to reset one s expectations based on one s current situation For example refusing to pay again to purchase a replacement for a lost ticket to a desired entertainment or refusing to sell a sizable long stock position in a rapidly falling market Representativeness heuristic the tendency to judge something as belonging to a class based on a few salient characteristics without accounting for base rates of those characteristics For example the belief that one will not become an alcoholic because one lacks some characteristic of an alcoholic stereotype or that one has a higher probability to win the lottery because one buys tickets from the same kind of vendor as several known big winners Halo effect the tendency to attribute unverified capabilities in a 
person based on an observed capability For example believing an Oscar winning actor s assertion regarding the harvest of Atlantic seals or assuming that a tall handsome man is intelligent and kind Hindsight bias the tendency to assess one s previous decisions as more effective than they were For example recalling one s prediction that Vancouver would lose the 2011 Stanley Cup or remembering to have identified the proximate cause of the 2007 Great Recession Availability heuristic the tendency to estimate that what is easily remembered is more likely than that which is not For example estimating that an information meeting on municipal planning will be boring because the last such meeting you attended on a different topic was so or not believing your Member of Parliament s promise to fight for women s equality because he didn t show up to your home bake sale fundraiser for him Bandwagon effect the tendency to do or believe what others do or believe For example voting for a political candidate because your father unfailingly voted for that candidate s party or not objecting to a bully s harassment because the rest of your peers don t To date editAn increasing number of academic and professional disciplines are identifying means of cognitive bias mitigation What follows is a characterization of the assumptions theories methods and results in disciplines concerned with the efficacy of human reasoning that plausibly bear on a theory and or practice of cognitive bias mitigation In most cases this is based on explicit reference to cognitive biases or their mitigation in others on unstated but self evident applicability This characterization is organized along lines reflecting historical segmentation of disciplines though in practice there is a significant amount of overlap Decision theory edit Decision theory a discipline with its roots grounded in neo classical economics is explicitly focused on human reasoning judgment choice and decision making primarily in one shot 
games between two agents with or without perfect information The theoretical underpinning of decision theory assumes that all decision makers are rational agents trying to maximize the economic expected value utility of their choices and that to accomplish this they utilize formal analytical methods such as mathematics probability statistics and logic under cognitive resource constraints 21 22 23 Normative or prescriptive decision theory concerns itself with what people should do given the goal of maximizing expected value utility in this approach there is no explicit representation in practitioners models of unconscious factors such as cognitive biases i e all factors are considered conscious choice parameters for all agents Practitioners tend to treat deviations from what a rational agent would do as errors of irrationality with the implication that cognitive bias mitigation can only be achieved by decision makers becoming more like rational agents though no explicit measures for achieving this are proffered Positive or descriptive decision theory concerns itself with what people actually do practitioners tend to acknowledge the persistent existence of irrational behavior and while some mention human motivation and biases as possible contributors to such behavior these factors are not made explicit in their models Practitioners tend to treat deviations from what a rational agent would do as evidence of important but as yet not understood decision making variables and have as yet no explicit or implicit contributions to make to a theory and practice of cognitive bias mitigation Game theory edit Game theory a discipline with roots in economics and system dynamics is a method of studying strategic decision making in situations involving multi step interactions with multiple agents with or without perfect information As with decision theory the theoretical underpinning of game theory assumes that all decision makers are rational agents trying to maximize the economic 
expected value utility of their choices and that to accomplish this they utilize formal analytical methods such as mathematics probability statistics and logic under cognitive resource constraints 24 25 26 27 One major difference between decision theory and game theory is the notion of equilibrium a situation in which all agents agree on a strategy because any deviation from this strategy punishes the deviating agent Despite analytical proofs of the existence of at least one equilibrium in a wide range of scenarios game theory predictions like those in decision theory often do not match actual human choices 28 As with decision theory practitioners tend to view such deviations as irrational and rather than attempt to model such behavior by implication hold that cognitive bias mitigation can only be achieved by decision makers becoming more like rational agents In the full range of game theory models there are many that do not guarantee the existence of equilibria i e there are conflict situations where there is no set of agents strategies that all agents agree are in their best interests However even when theoretical equilibria exist i e when optimal decision strategies are available for all agents real life decision makers often do not find them indeed they sometimes apparently do not even try to find them suggesting that some agents are not consistently rational game theory does not appear to accommodate any kind of agent other than the rational agent Behavioral economics edit Unlike neo classical economics and decision theory behavioral economics and the related field behavioral finance explicitly consider the effects of social cognitive and emotional factors on individuals economic decisions These disciplines combine insights from psychology and neo classical economics to achieve this 29 30 31 Prospect theory 32 was an early inspiration for this discipline and has been further developed by its practitioners It is one of the earliest economic theories that 
explicitly acknowledge the notion of cognitive bias though the model itself accounts for only a few including loss aversion anchoring and adjustment bias endowment effect and perhaps others No mention is made in formal prospect theory of cognitive bias mitigation and there is no evidence of peer reviewed work on cognitive bias mitigation in other areas of this discipline However Daniel Kahneman and others have authored recent articles in business and trade magazines addressing the notion of cognitive bias mitigation in a limited form 33 These contributions assert that cognitive bias mitigation is necessary and offer general suggestions for how to achieve it though the guidance is limited to only a few cognitive biases and is not self evidently generalizable to others Neuroeconomics edit Neuroeconomics is a discipline made possible by advances in brain activity imaging technologies This discipline merges some of the ideas in experimental economics behavioral economics cognitive science and social science in an attempt to better understand the neural basis for human decision making fMRI experiments suggest that the limbic system is consistently involved in resolving economic decision situations that have emotional valence the inference being that this part of the human brain is implicated in creating the deviations from rational agent choices noted in emotionally valent economic decision making Practitioners in this discipline have demonstrated correlations between brain activity in this part of the brain and prospection activity and neuronal activation has been shown to have measurable consistent effects on decision making 34 35 36 37 38 These results must be considered speculative and preliminary but are nonetheless suggestive of the possibility of real time identification of brain states associated with cognitive bias manifestation and the possibility of purposeful interventions at the neuronal level to achieve cognitive bias mitigation Cognitive psychology edit 
Several streams of investigation in this discipline are noteworthy for their possible relevance to a theory of cognitive bias mitigation One approach to mitigation originally suggested by Daniel Kahneman and Amos Tversky expanded upon by others and applied in real life situations is reference class forecasting This approach involves three steps with a specific project in mind identify a number of past projects that share a large number of elements with the project under scrutiny for this group of projects establish a probability distribution of the parameter that is being forecast and compare the specific project with the group of similar projects in order to establish the most likely value of the selected parameter for the specific project This simply stated method masks potential complexity regarding application to real life projects few projects are characterizable by a single parameter multiple parameters exponentially complicates the process gathering sufficient data on which to build robust probability distributions is problematic and project outcomes are rarely unambiguous and their reportage is often skewed by stakeholders interests Nonetheless this approach has merit as part of a cognitive bias mitigation protocol when the process is applied with a maximum of diligence in situations where good data is available and all stakeholders can be expected to cooperate A concept rooted in considerations of the actual machinery of human reasoning bounded rationality is one that may inform significant advances in cognitive bias mitigation Originally conceived of by Herbert A Simon 39 in the 1960s and leading to the concept of satisficing as opposed to optimizing this idea found experimental expression in the work of Gerd Gigerenzer and others One line of Gigerenzer s work led to the Fast and Frugal framing of the human reasoning mechanism 40 which focused on the primacy of recognition in decision making backed up by tie resolving heuristics operating in a low 
cognitive resource environment In a series of objective tests models based on this approach outperformed models based on rational agents maximizing their utility using formal analytical methods One contribution to a theory and practice of cognitive bias mitigation from this approach is that it addresses mitigation without explicitly targeting individual cognitive biases and focuses on the reasoning mechanism itself to avoid cognitive biases manifestation Intensive situational training is capable of providing individuals with what appears to be cognitive bias mitigation in decision making but amounts to a fixed strategy of selecting the single best response to recognized situations regardless of the noise in the environment Studies and anecdotes reported in popular audience media 13 20 41 42 of firefighter captains military platoon leaders and others making correct snap judgments under extreme duress suggest that these responses are likely not generalizable and may contribute to a theory and practice of cognitive bias mitigation only the general idea of domain specific intensive training Similarly expert level training in such foundational disciplines as mathematics statistics probability logic etc can be useful for cognitive bias mitigation when the expected standard of performance reflects such formal analytical methods However a study of software engineering professionals 43 suggests that for the task of estimating software projects despite the strong analytical aspect of this task standards of performance focusing on workplace social context were much more dominant than formal analytical methods This finding if generalizable to other tasks and disciplines would discount the potential of expert level training as a cognitive bias mitigation approach and could contribute a narrow but important idea to a theory and practice of cognitive bias mitigation Laboratory experiments in which cognitive bias mitigation is an explicit goal are rare One 1980 study 44 explored 
Laboratory experiments in which cognitive bias mitigation is an explicit goal are rare. One 1980 study[44] explored the notion of reducing the optimism bias by showing subjects other subjects' outputs from a reasoning task, with the result that their subsequent decision making was somewhat debiased.

A more recent research effort by Morewedge and colleagues (2015) found evidence for domain-general forms of debiasing. In two longitudinal experiments, debiasing training techniques featuring interactive games that elicited six cognitive biases (anchoring, bias blind spot, confirmation bias, fundamental attribution error, projection bias, and representativeness) provided participants with individualized feedback, mitigating strategies, and practice. The training resulted in an immediate reduction of more than 30% in commission of the biases, and a long-term (2- to 3-month delay) reduction of more than 20%. Instructional videos were also effective, but less so than the games.[45]

Evolutionary psychology

This discipline explicitly challenges the prevalent view that humans are rational agents maximizing expected value/utility, using formal analytical methods to do so. Practitioners such as Cosmides, Tooby, Haselton, Confer and others posit that cognitive biases are more properly referred to as cognitive heuristics, and should be viewed as a toolkit of cognitive shortcuts[46][47][48][49] selected for by evolutionary pressure, and thus are features rather than flaws, as assumed in the prevalent view. Theoretical models and analyses supporting this view are plentiful.[50] This view suggests that negative reasoning outcomes arise primarily because the reasoning challenges faced by modern humans, and the social and political context within which these are presented, make demands on our ancient 'heuristic toolkit' that, at best, create confusion as to which heuristics to apply in a given situation, and at worst generate what adherents of the prevalent view call reasoning errors.
In a similar vein, Mercier and Sperber describe a theory[51] of confirmation bias, and possibly other cognitive biases, which is a radical departure from the prevalent view that human reasoning is intended to assist individual economic decisions. Their view suggests that reasoning evolved as a social phenomenon, and that its goal was argumentation, i.e. to convince others and to be careful when others try to convince us. It is too early to tell whether this idea applies more generally to other cognitive biases, but the point of view supporting the theory may be useful in the construction of a theory and practice of cognitive bias mitigation.

There is an emerging convergence between evolutionary psychology and the concept of our reasoning mechanism being segregated (approximately) into 'System 1' and 'System 2'.[13][46] In this view, System 1 is the 'first line' of cognitive processing of all perceptions, including internally generated 'pseudo-perceptions', which automatically, subconsciously and near-instantaneously produces emotionally valenced judgments of their probable effect on the individual's well-being. By contrast, System 2 is responsible for 'executive control', taking System 1's judgments as advisories, making future predictions, via prospection, of their actualization, and then choosing which advisories, if any, to act on. In this view, System 2 is slow, simple-minded and lazy, usually defaulting to System 1's advisories and overriding them only when intensively trained to do so or when cognitive dissonance would result. In this view, our 'heuristic toolkit' resides largely in System 1, conforming to the view of cognitive biases being unconscious, automatic and very difficult to detect and override. Evolutionary psychology practitioners emphasize that our heuristic toolkit, despite the apparent abundance of 'reasoning errors' attributed to it, actually performs exceptionally well, given the rate at which it must operate, the range of judgments it produces, and the stakes involved. The System 1/2 view of the human reasoning mechanism appears to have empirical plausibility (see Neuroscience, below; for empirical and theoretical arguments against, see[52][53][54]) and thus may contribute to a theory and practice of cognitive bias mitigation.
Neuroscience

Neuroscience offers empirical support for the concept of segregating the human reasoning mechanism into System 1 and System 2, as described above, based on brain-activity imaging experiments using fMRI technology. While this notion must remain speculative until further work is done, it appears to be a productive basis for conceiving options for constructing a theory and practice of cognitive bias mitigation.[55][56]

Anthropology

Anthropologists have provided generally accepted scenarios[57][58][59][60][61] of how our progenitors lived and what was important in their lives. These scenarios of social, political, and economic organization are not uniform throughout history or geography, but there is a degree of stability throughout the Paleolithic era and the Holocene in particular. This, along with the findings in Evolutionary psychology and Neuroscience, above, suggests that our cognitive heuristics are at their best when operating in a social, political and economic environment most like that of the Paleolithic/Holocene. If this is true, then one possible means to achieve at least some cognitive bias mitigation is to mimic, as much as possible, Paleolithic/Holocene social, political and economic scenarios when one is performing a reasoning task that could attract negative cognitive bias effects.

Human reliability engineering

A number of paradigms, methods and tools for improving human performance reliability[20][62][63][64][65][66] have been developed within the discipline of human reliability engineering. Although some attention is paid to the human reasoning mechanism itself, the dominant approach is to anticipate problematic situations, constrain human operations through process mandates, and guide human decisions through fixed response protocols specific to the domain involved.
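In the simplest case, such a fixed response protocol amounts to a lookup from recognized situations to mandated responses, bypassing ad hoc judgment (and, with it, many opportunities for bias to manifest). The situations and responses below are invented purely for illustration:

```python
# Hypothetical domain-specific response protocol: each recognized
# situation maps to one mandated response.
PROTOCOL = {
    "engine_fire": "shut off fuel, discharge extinguisher, declare emergency",
    "cabin_depressurization": "don oxygen masks, descend to safe altitude",
    "hydraulic_failure": "switch to backup system, divert to nearest field",
}

def respond(situation):
    # Situations outside the protocol's domain get no mandated answer --
    # reflecting the limited generalizability of this approach.
    return PROTOCOL.get(situation, "no protocol: escalate to supervisor")

print(respond("engine_fire"))
print(respond("bird_strike"))
```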
While this approach can produce effective responses to critical situations under stress, the protocols involved must be viewed as having limited generalizability beyond the domain for which they were developed, with the implication that solutions in this discipline may provide only generic frameworks to a theory and practice of cognitive bias mitigation.

Machine learning

Machine learning, a branch of artificial intelligence, has been used to investigate human learning and decision making.[67] One technique particularly applicable to cognitive bias mitigation is neural network learning and choice selection, an approach inspired by the imagined structure and function of actual biological neural networks in the human brain. The multilayer, cross-connected signal collection and propagation structure typical of neural network models, where weights govern the contribution of signals to each connection, allows very small models to perform rather complex decision-making tasks at high fidelity. In principle, such models are capable of modeling decision making that takes account of human needs and motivations within social contexts, which suggests their consideration in a theory and practice of cognitive bias mitigation. Challenges to realizing this potential remain: accumulating the considerable amount of appropriate real-world 'training sets' for the neural network portion of such models; characterizing real-life decision-making situations and outcomes so as to drive models effectively; and the lack of a direct mapping from a neural network's internal structure to components of the human reasoning mechanism.
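As a toy illustration of how a very small network of the kind described above can perform a non-trivial decision task, the sketch below hand-codes a two-layer network computing an XOR-like choice. The weights are set by hand for clarity; as the text notes, obtaining real-world training data to learn such weights is the hard part:

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def layer(inputs, weights, biases):
    """One network layer: each weight governs the contribution of a
    signal to a connection, summed and squashed through a sigmoid."""
    return [sigmoid(sum(w * x for w, x in zip(ws, inputs)) + b)
            for ws, b in zip(weights, biases)]

def tiny_net(x1, x2):
    # Two hidden units plus one output unit suffice for an XOR-like
    # decision that no single linear rule can make.
    hidden = layer([x1, x2], [[10, 10], [-10, -10]], [-5, 15])
    (out,) = layer(hidden, [[10, 10]], [-15])
    return round(out)

for a, b in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print(a, b, "->", tiny_net(a, b))  # 0, 1, 1, 0
```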
Software engineering

This discipline, though not focused on improving human reasoning outcomes as an end goal, is one in which the need for such improvement has been explicitly recognized,[18][19] though the term "cognitive bias mitigation" is not universally used. One study[68] identifies specific steps to counter the effects of confirmation bias in certain phases of the software engineering lifecycle. Another study[43] takes a step back from focusing on cognitive biases and describes a framework for identifying "Performance Norms", criteria by which reasoning outcomes are judged correct or incorrect, so as to determine when cognitive bias mitigation is required, to guide identification of the biases that may be 'in play' in a real-world situation, and subsequently to prescribe their mitigations. This study refers to a broad research program with the goal of moving toward a theory and practice of cognitive bias mitigation.

Other

Other initiatives aimed directly at a theory and practice of cognitive bias mitigation may exist within other disciplines under different labels than those employed here.

See also

- Cognitive bias modification
- Cognitive vulnerability
- Critical theory
- Critical thinking
- Debiasing
- Freedom of thought
- Freethought
- Inquiry
- Logic
- Unstated assumption

References

1. Ariely, D. (2008). Predictably Irrational: The Hidden Forces That Shape Our Decisions. Harper Collins.
2. Epley, N.; Gilovich, T. (2006). "The Anchoring and Adjustment Heuristic: Why the Adjustments Are Insufficient". Psychological Science 17 (4): 311–318. doi:10.1111/j.1467-9280.2006.01704.x. PMID 16623688.
3. Gigerenzer, G. (2006). "Bounded and Rational". In R. J. Stainton (ed.), Contemporary Debates in Cognitive Science. Blackwell Publishing, 115–133.
4. Gilovich, T. (1991). How We Know What Isn't So: The Fallibility of Human Reason in Everyday Life. New York, NY: The Free Press.
5. Hammond, J. S.; Keeney, R. L.; et al. (2006). "The Hidden Traps in Decision Making". Harvard Business Review 84 (1): 118–126.
6. Haselton, M. G.; Nettle, D.; et al. (2005). "The Evolution of Cognitive Bias". In D. M. Buss (ed.), Handbook of Evolutionary Psychology. Hoboken: Wiley, 724–746.
7. Henrich, J.; et al. (2010). "Markets, Religion, Community Size, and the Evolution of Fairness and Punishment". Science 327 (5972): 1480–1484. doi:10.1126/science.1182238. PMID 20299588.
8. Lehrer, J. (2009). How We Decide. New York, NY: Houghton Mifflin Harcourt.
9. Nozick, R. (1993). The Nature of Rationality. Ewing, NJ: Princeton University Press.
10. Schacter, D. L. (1999). "The Seven Sins of Memory: Insights From Psychology and Cognitive Neuroscience". American Psychologist 54 (3): 182–203. doi:10.1037/0003-066x.54.3.182. PMID 10199218.
11. Roberto, M. A. (2002). "Lessons from Everest: The Interaction of Cognitive Bias, Psychological Safety, and System Complexity". California Management Review 45 (1): 136–158.
12. Knauff, M.; Budeck, C.; Wolf, A. G.; Hamburger, K. (2010). "The Illogicality of Stock-Brokers: Psychological Experiments on the Effects of Prior Knowledge and Belief Biases on Logical Reasoning in Stock Trading". PLOS ONE 5 (10): e13483. doi:10.1371/journal.pone.0013483. PMID 20976157.
13. Kahneman, D. (2011). Thinking, Fast and Slow. Doubleday Canada.
14. Gimli Glider accident report (ID 19830723-0, 1983). Aviation Safety Network, http://aviation-safety.net.
15. Stephenson, A. G.; LaPiana, L. S.; Mulville, D. R.; Rutledge, P. J.; Bauer, F. H.; Folta, D.; Dukeman, G. A.; Sackheim, R.; et al. (1999-11-10). Mars Climate Orbiter Mishap Investigation Board Phase I Report. National Aeronautics and Space Administration.
16. British Columbia Ministry of Energy, Mines and Petroleum Resources. Sullivan Mine Accident Report, May 17, 2006.
17. Beynon-Davies, P. (1995). "Information systems 'failure': the case of the LASCAD project". European Journal of Information Systems.
18. Mann, C. C. (2002). "Why Software Is So Bad". Technology Review, MIT, July 2002.
19. Stacy, W.; MacMillan, J. (1995). "Cognitive Bias in Software Engineering". Communications of the ACM 38 (6): 57–63. doi:10.1145/203241.203256.
20. Gawande, A. (2010). The Checklist Manifesto: How to Get Things Right. New York, NY: Metropolitan Books.
21. Kahneman, D.; Thaler, R. (2006). "Utility Maximization and Experienced Utility". Journal of Economic Perspectives 20 (1): 221–234. doi:10.1257/089533006776526076.
22. Frey, B.; Stutzer, A. (2002). "What Can Economists Learn from Happiness Research?". Journal of Economic Literature 40 (2): 402–435. doi:10.1257/002205102320161320.
23. Kahneman, D. (2000). "Experienced Utility and Objective Happiness: A Moment-Based Approach". Chapter 37 in D. Kahneman and A. Tversky (eds.), Choices, Values and Frames. New York: Cambridge University Press and the Russell Sage Foundation.
24. Binmore, K. (2007). A Very Short Introduction to Game Theory. Oxford University Press.
25. Camerer, C. F.; Ho, T. H.; Chong, J. K. (2002). "A Cognitive Hierarchy Theory of One-Shot Games and Experimental Analysis". Forthcoming, Quarterly Journal of Economics.
26. Broseta, B.; Costa-Gomes, M.; Crawford, V. (2000). "Cognition and Behavior in Normal-Form Games: An Experimental Study". Department of Economics, University of California at San Diego.
27. Myerson, R. B. (1991). Game Theory: Analysis of Conflict. Harvard University Press.
28. Wright, J. R.; Leyton-Brown, K. (2012). "Behavioral Game-Theoretic Models: A Bayesian Framework for Parameter Analysis". Proceedings of the 11th International Conference on Autonomous Agents and Multiagent Systems (AAMAS 2012).
29. Frey, B.; Stutzer, A. (2002). "What Can Economists Learn from Happiness Research?". Journal of Economic Literature 40 (2): 402–435. doi:10.1257/002205102320161320.
30. Kahneman, D. (2003). "Maps of Bounded Rationality: Psychology for Behavioral Economics". American Economic Review, December 2003, 1449–1475.
31. Mullainathan, S.; Thaler, R. (2000). "Behavioral Economics". MIT Department of Economics Working Paper 00-27, September 2000.
32. Kahneman, D.; Tversky, A. (1979). "Prospect Theory: An Analysis of Decision Under Risk". Econometrica 47 (2): 263–291. doi:10.2307/1914185.
33. Kahneman, D.; Lovallo, D.; Sibony, O. (2011). "Before You Make That Big Decision". Harvard Business Review, June 2011.
34. Loewenstein, G.; Rick, S.; Cohen, J. D. (2008). "Neuroeconomics". Annual Review of Psychology 59 (1): 647–672. doi:10.1146/annurev.psych.59.103006.093710. PMID 17883335.
35. Rustichini, A. (2009). "Neuroeconomics: What have we found, and what should we search for?". Current Opinion in Neurobiology 19 (6): 672–677. doi:10.1016/j.conb.2009.09.012. PMID 19896360.
36. Padoa-Schioppa, C.; Assad, J. A. (2007). "The Representation of Economic Value in the Orbitofrontal Cortex Is Invariant for Changes of Menu". Nature Neuroscience 11 (1): 95–102. doi:10.1038/nn2020. PMID 18066060.
37. Spreng, R. N.; Mar, R. A.; Kim, A. S. N. (2008). "The Common Neural Basis of Autobiographical Memory, Prospection, Navigation, Theory of Mind, and the Default Mode: A Quantitative Meta-Analysis". Journal of Cognitive Neuroscience (epub ahead of print).
38. Jamison, J.; Wegener, J. (2010). "Multiple Selves in Intertemporal Choice". Journal of Economic Psychology 31 (5): 832–839. doi:10.1016/j.joep.2010.03.004.
39. Simon, H. A. (1991). "Bounded Rationality and Organizational Learning". Organization Science 2 (1): 125–134. doi:10.1287/orsc.2.1.125.
40. Gigerenzer, G.; Goldstein, D. G. (1996). "Reasoning the Fast and Frugal Way: Models of Bounded Rationality". Psychological Review 103 (4): 650–669. doi:10.1037/0033-295x.103.4.650. PMID 8888650.
41. Gladwell, M. (2006). Blink: The Power of Thinking Without Thinking. New York, NY: Little, Brown and Company.
42. Shermer, M. (2010). "A review of Paul Thagard's The Brain and the Meaning of Life". Skeptic Magazine (Altadena, CA: Skeptics Society) 16: 60–61.
43. Conroy, P.; Kruchten, P. (2012). "Performance Norms: An Approach to Reducing Rework in Software Development". 2012 Canadian Conference on Electrical and Computer Engineering (IEEE).
44. Weinstein, N. D. (1980). "Unrealistic Optimism About Future Life Events". Journal of Personality and Social Psychology 39 (5): 806–820. doi:10.1037/0022-3514.39.5.806.
45. Morewedge, C. K.; Yoon, H.; Scopelliti, I.; Symborski, C. W.; Korris, J. H.; Kassam, K. S. (2015). "Debiasing Decisions: Improved Decision Making With a Single Training Intervention". Policy Insights from the Behavioral and Brain Sciences 2 (1): 129–140. doi:10.1177/2372732215600886.
46. Cosmides, L.; Tooby, J. "Evolutionary Psychology: A Primer". http://www.psych.ucsb.edu/research/cep/primer.html. Archived 2009-02-28 at the Wayback Machine.
47. Haselton, M. G.; Bryant, G. A.; Wilke, A.; Frederick, D. A.; Galperin, A.; Frankenhuis, W. E.; Moore, T. (2009). "Adaptive Rationality: An Evolutionary Perspective on Cognitive Bias". Social Cognition 27 (5): 733–763. doi:10.1521/soco.2009.27.5.733.
48. Haselton, M. G.; Nettle, D.; et al. (2005). "The Evolution of Cognitive Bias". In D. M. Buss (ed.), Handbook of Evolutionary Psychology. Hoboken: Wiley, 724–746.
49. Confer, J. C.; et al. (2010). "Evolutionary Psychology: Controversies, Questions, Prospects, and Limitations". American Psychologist 65 (2): 110–126. doi:10.1037/a0018413. PMID 20141266.
50. Chudek, M.; Henrich, J. (2011). "Culture–Gene Coevolution, Norm-Psychology and the Emergence of Human Prosociality". Trends in Cognitive Sciences 15 (5): 218–226. doi:10.1016/j.tics.2011.03.003. PMID 21482176.
51. Mercier, H.; Sperber, D. (2011). "Argumentative Theory". Behavioral and Brain Sciences 34 (2): 57–74. doi:10.1017/s0140525x10000968. PMID 21447233.
52. Evans, J. St. B. T. (2006). "Dual system theories of cognition: Some issues". Proceedings of the Annual Meeting of the Cognitive Science Society 28.
53. Garcia-Marques, L.; Ferreira, M. B. (2011). "Friends and foes of theory construction in psychological science: Vague dichotomies, unified theories of cognition, and the new experimentalism". Perspectives on Psychological Science 6 (2): 192–201. doi:10.1177/1745691611400239. PMID 26162138.
54. Fiedler, K.; Hütter, M. (2014). "The limits of automaticity". In J. Sherman, B. Gawronski, & Y. Trope (eds.), Dual Processes in Social Psychology, pp. 497–513. New York: Guilford Publications.
55. Damasio, A. (2010). Self Comes to Mind: Constructing the Conscious Brain. New York, NY: Pantheon.
56. Changeux, J.-P.; Damasio, A.; et al. (eds.) (2007). Neurobiology of Human Values. Research and Perspectives in Neurosciences. Heidelberg, Germany: Springer.
57. Ember, C. R. (1978). "Myths About Hunter-Gatherers". Ethnology (University of Pittsburgh) 17 (4): 439–448.
58. Gabow, S. L. (1977). "Population Structure and the Rate of Hominid Brain Evolution". Journal of Human Evolution 6 (7): 643–665. doi:10.1016/s0047-2484(77)80136-x.
59. Hamilton, M. J.; Milne, B. T.; Walker, R. S.; Burger, O.; Brown, J. H. (2007). "The Complex Structure of Hunter-Gatherer Social Networks". Proceedings of the Royal Society B 274: 2195–2203. doi:10.1098/rspb.2007.0564. PMID 17609186.
60. Kuhn, S. L.; Stiner, M. C. (2006). "What's a Mother To Do? The Division of Labor among Neanderthals and Modern Humans in Eurasia". Current Anthropology 47 (6): 953–981. doi:10.1086/507197.
61. Marlowe, F. W. (2005). "Hunter-Gatherers and Human Evolution". Evolutionary Anthropology: Issues, News, and Reviews 14 (2): 54–67. doi:10.1002/evan.20046.
62. Gertman, D.; Blackman, H.; Marble, J.; Byers, J.; Smith, C. (2005). The SPAR-H Human Reliability Analysis Method.
63. Hollnagel, E. (1998). Cognitive Reliability and Error Analysis Method (CREAM). Elsevier.
64. Roth, E.; et al. (1994). An Empirical Investigation of Operator Performance in Cognitively Demanding Simulated Emergencies. NUREG/CR-6208, Westinghouse Science and Technology Center. Report prepared for the Nuclear Regulatory Commission.
65. Wiegmann, D.; Shappell, S. (2003). A Human Error Approach to Aviation Accident Analysis: The Human Factors Analysis and Classification System. Ashgate.
66. Wilson, J. R. (1993). SHEAN (Simplified Human Error Analysis code) and Automated THERP. United States Department of Energy Technical Report WINCO-11908.
67. Sutton, R. S.; Barto, A. G. (1998). Reinforcement Learning: An Introduction. Adaptive Computation and Machine Learning. MIT Press. ISBN 978-0-262-19398-6.
68. Calikli, G.; Bener, A.; Arslan, B. (2010). "An Analysis of the Effects of Company Culture, Education and Experience on Confirmation Bias Levels of Software Developers and Testers". 32nd ACM/IEEE International Conference on Software Engineering (ICSE 2010), Volume 2, pp. 187–190.

External links

- Center for the Study of Neuroeconomics
- Fast and Frugal Heuristics
- Journal of Evolutionary Psychology (usurped)
- Institute of Ergonomics and Human Factors
- International Machine Learning Society
- Temporal Difference Learning
- Cognitive Neuroscience Society
- Max Planck Institute for Human Development

Retrieved from "https://en.wikipedia.org/w/index.php?title=Cognitive_bias_mitigation&oldid=1189921813"
