Cognitive bias

A cognitive bias is a pattern of deviation in judgment that occurs in particular situations. Implicit in the concept of a "pattern of deviation" is a standard of comparison; this may be the judgment of people outside those particular situations, or may be a set of independently verifiable facts.

Cognitive biases are instances of evolved mental behavior. Some are presumably adaptive, for example, because they lead to more effective actions in given contexts or enable faster decisions when faster decisions are of greater value. Others presumably result from a lack of appropriate mental mechanisms, or from the misapplication of a mechanism that is adaptive under different circumstances.

Cognitive bias is a general term that is used to describe many observer effects in the human mind, some of which can lead to perceptual distortion, inaccurate judgment, or illogical interpretation.[1] It is a phenomenon studied in cognitive science and social psychology.

Overview

Bias arises from various processes that are sometimes difficult to distinguish. These include information-processing shortcuts (heuristics), motivational factors and social influence.[citation needed]

The notion of cognitive biases was introduced by Amos Tversky and Daniel Kahneman in 1972[2] and grew out of their experience of people's innumeracy, or inability to reason intuitively with the greater orders of magnitude. They and their colleagues demonstrated several replicable ways in which human judgments and decisions differ from rational choice theory. They explained these differences in terms of heuristics: rules that are simple for the brain to compute but that introduce systematic errors.[2] One example is the availability heuristic, in which the ease with which something comes to mind is used as an indicator of how often (or how recently) it has been encountered.
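
The "systematic" part of this claim can be made concrete with a toy simulation. The sketch below, a minimal Python illustration whose event names and parameters are invented for the example rather than drawn from the literature, assumes that vivid events are recalled more easily; because the frequency judgment is driven by recall rather than by actual counts, the vivid event is overestimated in the same direction on every run.

```python
import random

# Toy model of the availability heuristic: frequency is judged by how easily
# instances come to mind, and vivid events come to mind more easily.
# All event names and parameters here are illustrative assumptions, not data.

random.seed(0)

EVENTS = {
    # event: (true daily probability, recall weight -- a vividness multiplier)
    "plane crash":  (0.0001, 50.0),   # rare but vivid, hence over-recalled
    "car accident": (0.0100, 1.0),    # more common, less memorable
}

DAYS = 100_000

def judged_frequency(true_p: float, recall_weight: float, days: int) -> float:
    """Judge frequency from recalled instances rather than actual counts."""
    occurrences = sum(random.random() < true_p for _ in range(days))
    recalled = occurrences * recall_weight  # vivid events dominate recall
    return recalled / days

for event, (p, weight) in EVENTS.items():
    judged = judged_frequency(p, weight, DAYS)
    print(f"{event}: true p = {p:.4f}, judged p = {judged:.4f}, "
          f"ratio = {judged / p:.1f}")
```

The deviation is directional rather than random: across runs the vivid event is always overestimated, which is what distinguishes a bias from mere noise.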

These experiments grew into the heuristics and biases research program which spread beyond academic psychology into other disciplines including medicine and political science.[3] It was a major factor in the emergence of behavioral economics, earning Kahneman a Nobel Prize in 2002.[4] Tversky and Kahneman developed prospect theory as a more realistic alternative to rational choice theory.[citation needed]

Critics of Kahneman and Tversky, such as Gerd Gigerenzer, argue that heuristics should not lead us to conceive of human thinking as riddled with irrational cognitive biases, but rather to conceive of rationality as an adaptive tool that is not identical to the rules of formal logic or the probability calculus.[5]

Types of cognitive biases

Biases can be distinguished on a number of dimensions. For example, there are biases specific to groups (such as the risky shift) as well as biases at the individual level.

Some biases affect decision-making, where the desirability of options has to be considered (e.g., the sunk-cost fallacy). Others, such as illusory correlation, affect judgment of how likely something is, or of whether one thing is the cause of another. A distinctive class of biases affects memory,[6] such as consistency bias (remembering one's past attitudes and behavior as more similar to one's present attitudes).

Some biases reflect a subject's motivation,[7] for example, the desire for a positive self-image leading to egocentric bias[8] and the avoidance of unpleasant cognitive dissonance. Other biases are due to the particular way the brain perceives, forms memories and makes judgments. This distinction is sometimes described as "hot cognition" versus "cold cognition", as motivated reasoning can involve a state of arousal.

Among the "cold" biases, some are due to ignoring relevant information (e.g. Neglect of probability), whereas some involve a decision or judgement being affected by irrelevant information (for example the Framing effect where the same problem receives different responses depending on how it is described) or giving excessive weight to an unimportant but salient feature of the problem (e.g., Anchoring).
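
Anchoring, in particular, is often described in adjustment terms: the answer starts at the salient but irrelevant value and is adjusted insufficiently. A minimal sketch of that idea, assuming a simple weighted-average model with an invented pull parameter w (illustrative only, not a published formula), shows how the same private belief yields different answers under different anchors:

```python
# Toy anchoring model: the reported estimate is pulled part of the way from
# the respondent's private belief toward an irrelevant anchor value.
# The pull parameter w is an illustrative assumption, not an empirical value.

def anchored_estimate(private_belief: float, anchor: float, w: float = 0.3) -> float:
    """Blend a private belief with an irrelevant anchor (0 <= w <= 1)."""
    return (1 - w) * private_belief + w * anchor

# Same private belief (a city of 500,000), two different opening questions:
# "more or less than 2,000,000?" vs. "more or less than 50,000?"
print(f"{anchored_estimate(500_000, anchor=2_000_000):,.0f}")  # pulled upward
print(f"{anchored_estimate(500_000, anchor=50_000):,.0f}")     # pulled downward
```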

The fact that some biases reflect motivation, in particular the motivation to hold positive attitudes toward oneself,[8] accounts for many biases being self-serving or self-directed (e.g., illusion of asymmetric insight, self-serving bias, projection bias). There are also biases in how subjects evaluate in-groups and out-groups: in-groups are evaluated as more diverse and "better" in many respects, even when the groups are arbitrarily defined (ingroup bias, outgroup homogeneity bias).

Some cognitive biases belong to the subgroup of attentional biases, which refer to paying increased attention to certain stimuli. It has been shown, for example, that people addicted to alcohol and other drugs pay more attention to drug-related stimuli. Common psychological tests that measure these biases are the Stroop task[9][10] and the dot-probe task.
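
In the dot-probe task, the bias is typically quantified as a reaction-time difference: responses are faster when the probe appears where attention already was. The sketch below computes such an index from invented reaction times; the trial data and function name are illustrative, not taken from any particular study.

```python
from statistics import mean

# Dot-probe scoring sketch: a probe replaces either a salient stimulus
# (e.g., drug-related for an addicted participant) or a neutral one.
# Faster responses at the salient location suggest attention was already there.
# Reaction times below (in ms) are made-up illustrative values.

rt_probe_at_salient = [412, 398, 430, 405, 417]
rt_probe_at_neutral = [455, 468, 441, 460, 449]

def attentional_bias_index(rt_neutral, rt_salient):
    """Positive values indicate attention biased toward the salient stimulus."""
    return mean(rt_neutral) - mean(rt_salient)

print(f"bias index: "
      f"{attentional_bias_index(rt_probe_at_neutral, rt_probe_at_salient):.1f} ms")
```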

The following is a list of the more commonly studied cognitive biases:

  • Framing is the tendency to be influenced by a too-narrow approach to, or description of, a situation or issue.
  • Hindsight bias, sometimes called the "I-knew-it-all-along" effect, is the inclination to see past events as being predictable.
  • Fundamental attribution error is the tendency for people to over-emphasize personality-based explanations for behaviors observed in others while under-emphasizing the role and power of situational influences on the same behavior.
  • Confirmation bias is the tendency to search for or interpret information in a way that confirms one's preconceptions; this is related to the concept of cognitive dissonance.
  • Self-serving bias is the tendency to claim more responsibility for successes than for failures. It may also manifest itself as a tendency for people to evaluate ambiguous information in a way beneficial to their interests.
  • Belief bias is the tendency for one's evaluation of the logical strength of an argument to be biased by one's belief in the truth or falsity of the conclusion.

Practical significance

Many social institutions rely on individuals to make rational judgments. A fair jury trial, for example, requires that the jury ignore irrelevant features of the case, weigh the relevant features appropriately, consider different possibilities open-mindedly and resist fallacies such as appeal to emotion. The various biases demonstrated in these psychological experiments suggest that people will frequently fail to do all these things.[11] However, these failures are systematic and directional, and therefore predictable.[12]

Cognitive biases are also related to the persistence of superstition and to large social issues such as prejudice, and they hinder public acceptance of non-intuitive scientific knowledge.[13]

Criticism

In the heuristics-and-biases literature, it is almost impossible to make a decision that cannot be labeled biased, because the "rational" decision is usually sandwiched between two contradictory biases.[14] For example, overestimating one's abilities can be attributed to the Dunning–Kruger effect, and underestimating them to the false consensus effect. As a practical example, a person asked to estimate how far they can throw a flying disc must get the estimate exactly right on the spot, or else they can be said to have demonstrated biased judgment.

References

  1. ^ Kahneman, D.; Tversky, A. (1972). "Subjective probability: A judgment of representativeness". Cognitive Psychology 3 (3): 430–454. doi:10.1016/0010-0285(72)90016-3. 
  2. ^ a b Kahneman, Daniel; Shane Frederick (2002). "Representativeness Revisited: Attribute Substitution in Intuitive Judgment". In Thomas Gilovich, Dale Griffin, Daniel Kahneman. Heuristics and Biases: The Psychology of Intuitive Judgment. Cambridge: Cambridge University Press. pp. 51–52. ISBN 9780521796798. 
  3. ^ Gilovich, Thomas; Dale Griffin (2002). "Heuristics and Biases: Then and Now". In Thomas Gilovich, Dale Griffin, Daniel Kahneman. Heuristics and Biases: The Psychology of Intuitive Judgment. Cambridge: Cambridge University Press. pp. 1–4. ISBN 9780521796798. 
  4. ^ Nobelprize.org (on Kahneman's 2002 prize).
  5. ^ Gigerenzer, G. (2006). "Bounded and Rational". In Stainton, R. J.. Contemporary Debates in Cognitive Science. Blackwell. p. 129. ISBN 1405113049. 
  6. ^ Schacter, D.L. (1999). "The Seven Sins of Memory: Insights From Psychology and Cognitive Neuroscience". American Psychologist 54 (3): 182–203. doi:10.1037/0003-066X.54.3.182. PMID 10199218 
  7. ^ Kunda, Z. (1990). "The Case for Motivated Reasoning". Psychological Bulletin 108 (3): 480–498. doi:10.1037/0033-2909.108.3.480. PMID 2270237 
  8. ^ a b Hoorens, V. (1993). "Self-enhancement and Superiority Biases in Social Comparison". In Stroebe, W. and Hewstone, Miles. European Review of Social Psychology 4. Wiley 
  9. ^ Jensen AR, Rohwer WD (1966). "The Stroop color-word test: a review". Acta psychologica 25 (1): 36–93. doi:10.1016/0001-6918(66)90004-7. PMID 5328883. 
  10. ^ MacLeod CM (March 1991). "Half a century of research on the Stroop effect: an integrative review". Psychological bulletin 109 (2): 163–203. doi:10.1037/0033-2909.109.2.163. PMID 2034749. http://content.apa.org/journals/bul/109/2/163. 
  11. ^ Sutherland, Stuart (2007) Irrationality: The Enemy Within Second Edition (First Edition 1994) Pinter & Martin. ISBN 978-1-905177-07-3
  12. ^ Ariely, Dan (2008). Predictably Irrational: The Hidden Forces That Shape Our Decisions. HarperCollins. p. 304. ISBN 9780061353239. 
  13. ^ Günter Radden, H. Cuyckens (2003). Motivation in language: studies in honor of Günter Radden. John Benjamins. p. 275. ISBN 9781588114266. http://books.google.com/books?id=qzhJ3KpLpQUC&pg=PA275&dq=essentialism+definition&lr=&cd=3#v=onepage&q=essentialism%20definition&f=false. 
  14. ^ Funder, David C.; Joachim I. Krueger (June 2004). "Towards a balanced social psychology: Causes, consequences, and cures for the problem-seeking approach to social behavior and cognition". Behavioral and Brain Sciences 27: 313–376. PMID 15736870. http://web.mac.com/kstanovich/Site/Research_on_Reasoning_files/SocialBBS04.pdf. 

Further reading

  • Eiser, J.R. and Joop van der Pligt (1988) Attitudes and Decisions London: Routledge. ISBN 978-0-415-01112-9
  • Fine, Cordelia (2006) A Mind of its Own: How your brain distorts and deceives Cambridge, UK: Icon Books. ISBN 1-84046-678-2
  • Gilovich, Thomas (1993). How We Know What Isn't So: The Fallibility of Human Reason in Everyday Life. New York: The Free Press. ISBN 0-02-911706-2
  • Haselton, M.G., Nettle, D. & Andrews, P.W. (2005). The evolution of cognitive bias. In D.M. Buss (Ed.), Handbook of Evolutionary Psychology (pp. 724–746). Hoboken: Wiley.
  • Heuer, Richards J. Jr. (1999) Psychology of Intelligence Analysis. Central Intelligence Agency. http://www.au.af.mil/au/awc/awcgate/psych-intel/art5.html
  • Kahneman D., Slovic P., and Tversky, A. (Eds.) (1982) Judgment Under Uncertainty: Heuristics and Biases. New York: Cambridge University Press ISBN 978-0-521-28414-1
  • Kida, Thomas (2006) Don't Believe Everything You Think: The 6 Basic Mistakes We Make in Thinking New York: Prometheus. ISBN 978-1-59102-408-8
  • Nisbett, R., and Ross, L. (1980) Human Inference: Strategies and shortcomings of human judgement. Englewood Cliffs, NJ: Prentice-Hall ISBN 978-0-13-445130-5
  • Piatelli-Palmarini, Massimo (1994) Inevitable Illusions: How Mistakes of Reason Rule Our Minds New York: John Wiley & Sons. ISBN 0-471-15962-X
  • Stanovich, Keith (2009). What Intelligence Tests Miss: The Psychology of Rational Thought. New Haven (CT): Yale University Press. ISBN 978-0-300-12385-2.
  • Sutherland, Stuart (2007) Irrationality: The Enemy Within Second Edition (First Edition 1994) Pinter & Martin. ISBN 978-1-905177-07-3
  • Tavris, Carol and Elliot Aronson (2007) Mistakes Were Made (But Not by Me): Why We Justify Foolish Beliefs, Bad Decisions and Hurtful Acts Orlando, Florida: Harcourt Books. ISBN 978-0-15-101098-1
