Overconfidence effect

The overconfidence effect is a well-established bias in which a person's subjective confidence in their judgments is reliably greater than the objective accuracy of those judgments, especially when confidence is relatively high. [1] [2] Overconfidence is one example of a miscalibration of subjective probabilities. Throughout the research literature, overconfidence has been defined in three distinct ways: (1) overestimation of one's actual performance; (2) overplacement of one's performance relative to others; and (3) overprecision in expressing unwarranted certainty in the accuracy of one's beliefs. [3] [4]

The most common way in which overconfidence has been studied is by asking people how confident they are of specific beliefs they hold or answers they provide. The data show that confidence systematically exceeds accuracy, implying people are more sure that they are correct than they deserve to be. If human confidence had perfect calibration, judgments with 100% confidence would be correct 100% of the time, 90% confidence correct 90% of the time, and so on for the other levels of confidence. By contrast, the key finding is that confidence exceeds accuracy so long as the subject is answering hard questions about an unfamiliar topic. For example, in a spelling task, subjects were correct about 80% of the time, whereas they claimed to be 100% certain. [5] Put another way, the error rate was 20% when subjects expected it to be 0%. In a series of studies in which subjects made true-or-false responses to general-knowledge statements, they were overconfident at all confidence levels. When they were 100% certain of their answer to a question, they were wrong 20% of the time. [6]
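The calibration check described above can be sketched in a few lines: group judgments by stated confidence and compare each group's mean confidence to its actual hit rate. The sample data below are hypothetical and chosen only to illustrate the pattern, not taken from the cited studies.

```python
# Minimal calibration sketch: bucket judgments by stated confidence and
# compare stated confidence to the observed proportion correct.
# The judgment data here are made up for illustration.
from collections import defaultdict

# (stated confidence, was the answer correct?)
judgments = [
    (1.0, True), (1.0, True), (1.0, True), (1.0, True), (1.0, False),
    (0.9, True), (0.9, True), (0.9, False),
    (0.6, True), (0.6, False),
]

buckets = defaultdict(list)
for conf, correct in judgments:
    buckets[conf].append(correct)

for conf in sorted(buckets, reverse=True):
    outcomes = buckets[conf]
    accuracy = sum(outcomes) / len(outcomes)
    # Perfect calibration would make these two percentages equal.
    print(f"stated {conf:.0%} -> actual {accuracy:.0%} ({len(outcomes)} items)")
```

With data like these, the 100%-confidence bucket comes out only 80% correct, mirroring the spelling-task result described above.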

Types

Overestimation

One manifestation of the overconfidence effect is the tendency to overestimate one's standing on a dimension of judgment or performance. This form of overconfidence concerns the certainty one feels in one's own ability, performance, level of control, or chance of success. It is most likely to occur on hard tasks and hard items, when failure is likely, or when the individual making the estimate is not especially skilled. Overestimation also occurs in domains other than one's own performance, including the illusion of control and the planning fallacy. [3]

Illusion of control

Illusion of control describes the tendency for people to behave as if they might have some control when in fact they have none. [7] However, evidence does not support the notion that people systematically overestimate how much control they have; when they have a great deal of control, people tend to underestimate how much control they have. [8]

Planning fallacy

The planning fallacy describes the tendency for people to overestimate their rate of work or to underestimate how long it will take them to get things done. [9] It is strongest for long and complicated tasks, and disappears or reverses for simple tasks that are quick to complete.

Contrary evidence

Wishful-thinking effects, in which people overestimate the likelihood of an event because of its desirability, are relatively rare. [10] This may be in part because people engage in more defensive pessimism in advance of important outcomes, [11] in an attempt to reduce the disappointment that follows overly optimistic predictions. [12]

Overprecision

Overprecision is the excessive confidence that one knows the truth. For reviews, see Harvey [13] or Hoffrage. [14] Much of the evidence for overprecision comes from studies in which participants are asked about their confidence that individual items are correct. This paradigm, while useful, cannot distinguish overestimation from overprecision; they are one and the same in these item-confidence judgments. Yet after making a series of item-confidence judgments, people who are asked to estimate the number of items they answered correctly do not tend to systematically overestimate their scores, even though the average of their item-confidence judgments exceeds the number of items they claim to have gotten right. [15] One possible explanation is that the item-confidence judgments were inflated by overprecision rather than by systematic overestimation.

Confidence intervals

The strongest evidence of overprecision comes from studies in which participants are asked to indicate how precise their knowledge is by specifying a 90% confidence interval around estimates of specific quantities. If people were perfectly calibrated, their 90% confidence intervals would include the correct answer 90% of the time. [16] In fact, hit rates are often as low as 50%, suggesting people have drawn their confidence intervals too narrowly, implying that they think their knowledge is more accurate than it actually is.
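A small simulation shows why overly narrow intervals depress the hit rate. Below, a hypothetical judge's estimates really vary with standard deviation `TRUE_SIGMA`, but the judge reports 90% intervals as if the uncertainty were only half that; all numbers are illustrative assumptions, not parameters from the cited studies.

```python
# Sketch: an overprecise judge draws 90% confidence intervals too narrowly,
# so the true value lands inside the interval far less than 90% of the time.
import random

random.seed(0)

TRUE_SIGMA = 10.0       # actual spread of the judge's estimation error
REPORTED_SIGMA = 5.0    # overprecise: intervals built from half the real spread
Z90 = 1.645             # half-width multiplier for a 90% normal interval

trials = 10_000
hits = 0
for _ in range(trials):
    truth = 100.0
    estimate = random.gauss(truth, TRUE_SIGMA)   # noisy best guess
    lo = estimate - Z90 * REPORTED_SIGMA
    hi = estimate + Z90 * REPORTED_SIGMA
    hits += lo <= truth <= hi

print(f"hit rate: {hits / trials:.1%}")  # well below the stated 90%
```

Under these assumptions the hit rate falls to roughly the 50-60% range reported in the literature; a calibrated judge (reporting intervals from the true sigma) would land near 90%.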

Overplacement

Overplacement is the most prominent manifestation of the overconfidence effect: the erroneous belief that one is better than others. [17] This form of overconfidence occurs when people believe themselves to be "better than average", placing or rating themselves above others. [3] Overplacement more often occurs on simple tasks, ones we believe are easy to accomplish successfully.

Manifestations

Better-than-average effects

Perhaps the most celebrated better-than-average result is Svenson's finding that 93% of American drivers rate themselves as better than the median. [18] The frequency with which school systems claim their students outperform national averages has been dubbed the "Lake Wobegon" effect, after Garrison Keillor's apocryphal town in which "all the children are above average." [19] Overplacement has likewise been documented in a wide variety of other circumstances. [20] Kruger, however, showed that this effect is limited to "easy" tasks in which success is common or in which people feel competent. For difficult tasks, the effect reverses itself and people believe they are worse than others. [21]

Comparative-optimism effects

Some researchers have claimed that people think good things are more likely to happen to them than to others, whereas bad events are less likely to happen to them than to others. [22] But others have pointed out that prior work tended to examine good outcomes that happened to be common (such as owning one's own home) and bad outcomes that happened to be rare (such as being struck by lightning). [23] [24] [25] Event frequency accounts for a proportion of prior findings of comparative optimism: people think common events (such as living past 70) are more likely to happen to them than to others, and rare events (such as living past 100) are less likely to happen to them than to others.

Positive illusions

Taylor and Brown have argued that people cling to overly positive beliefs about themselves, illusions of control, and beliefs in false superiority because these beliefs help them cope and thrive. [26] Although there is some evidence that optimistic beliefs are correlated with better life outcomes, most of the research documenting such links is vulnerable to the alternative explanation that the optimists' forecasts were simply accurate.

Social knowledge

People tend to overestimate what they personally know, unconsciously assuming they know facts they would actually need to access by asking someone else or consulting a written work. Asking people to explain how something works (like a bicycle, helicopter, or international policy) exposes knowledge gaps and reduces the overestimation of knowledge on that topic. [27]

Practical implications

"Overconfident professionals sincerely believe they have expertise, act as experts and look like experts. You will have to struggle to remind yourself that they may be in the grip of an illusion."

Daniel Kahneman [28]

Social psychologist Scott Plous wrote, "No problem in judgment and decision making is more prevalent and more potentially catastrophic than overconfidence." [29] It has been blamed for lawsuits, strikes, wars, poor corporate acquisitions, [30] [31] and stock market bubbles and crashes.

Strikes, lawsuits, and wars could arise from overplacement. If plaintiffs and defendants were prone to believe that they were more deserving, fair, and righteous than their legal opponents, that could help account for the persistence of inefficient enduring legal disputes. [32] If corporations and unions were prone to believe that they were stronger and more justified than the other side, that could contribute to their willingness to endure labor strikes. [33] If nations were prone to believe that their militaries were stronger than were those of other nations, that could explain their willingness to go to war. [34]

Overprecision could have important implications for investing behavior and stock market trading. Because Bayesians cannot agree to disagree, [35] classical finance theory has trouble explaining why, if stock market traders are fully rational Bayesians, there is so much trading in the stock market. Overprecision might be one answer. [36] If market actors are too sure that their estimates of an asset's value are correct, they will be too willing to trade with others who have different information.

Oskamp tested groups of clinical psychologists and psychology students on a multiple-choice task in which they drew conclusions from a case study. [37] Along with their answers, subjects gave a confidence rating in the form of a percentage likelihood of being correct, which allowed confidence to be compared against accuracy. As the subjects were given more information about the case study, their confidence increased from 33% to 53%. However, their accuracy did not significantly improve, staying under 30%. This experiment thus demonstrated overconfidence that increased as the subjects were given more information on which to base their judgments. [37]

Even if there is no general tendency toward overconfidence, social dynamics and adverse selection could conceivably promote it. For instance, those most likely to have the courage to start a new business are those who most overplace their abilities relative to those of other potential entrants. And if voters find confident leaders more credible, then contenders for leadership learn that they should express more confidence than their opponents in order to win election. [38] However, overconfidence can be either a liability or an asset in political elections: candidates tend to lose their advantage when verbally expressed overconfidence is not matched by actual performance, and tend to gain an advantage when overconfidence is expressed non-verbally. [39]

Overconfidence can be beneficial to individual self-esteem as well as giving an individual the will to succeed in a desired goal. Simply believing in oneself may give one the will to take one's endeavours further than those who do not believe in themselves. [40]

Overconfidence among experts

Kahneman and Klein document how most experts can be beaten by simple heuristics developed by intelligent lay people. Genuine expert intuition is acquired through frequent, rapid, high-quality feedback about the quality of previous judgments, [41] and few professionals receive such feedback. Those who master a body of knowledge without the benefit of such feedback are called "respect-experts" by Kahneman, Sibony, and Sunstein. With some data, ordinary least squares (OLS) models often outperform simple heuristics; with lots of data, artificial intelligence (AI) routinely outperforms OLS. [42]

Individual differences

Very high levels of core self-evaluations, a stable personality trait composed of locus of control, neuroticism, self-efficacy, and self-esteem, [43] may lead to the overconfidence effect. People who have high core self-evaluations will think positively of themselves and be confident in their own abilities, [43] although extremely high levels of core self-evaluations may cause an individual to be more confident than is warranted.

Catastrophes

The following is an incomplete list of events related to, or triggered by, overconfidence bias and a failing safety culture: [44]

See also

Cognitive bias
Hindsight bias
Representativeness heuristic
Trait ascription bias
False consensus effect
Anchoring effect
Thomas Gilovich
Impact bias
Depressive realism
Dunning–Kruger effect
Optimism bias
Positive illusions
Self-enhancement
Illusory superiority
Spotlight effect
Heuristic
Thinking, Fast and Slow
Illusion of validity
Debiasing

References

Notes

  1. Pallier, Gerry; Wilkinson, Rebecca; Danthiir, Vanessa; Kleitman, Sabina; Knezevic, Goran; Stankov, Lazar; Roberts, Richard D. (2002). "The Role of Individual Differences in the Accuracy of Confidence Judgments". The Journal of General Psychology. 129 (3): 257–299. doi:10.1080/00221300209602099. PMID   12224810. S2CID   6652634.
  2. Moore, Don A.; Healy, Paul J. (April 2008). "The trouble with overconfidence". Psychological Review. 115 (2): 502–517. doi:10.1037/0033-295X.115.2.502. ISSN   1939-1471. PMID   18426301.
  3. Moore, Don A.; Healy, Paul J. (2008). "The trouble with overconfidence". Psychological Review. 115 (2): 502–517. CiteSeerX 10.1.1.335.2777. doi:10.1037/0033-295X.115.2.502. PMID 18426301. Archived from the original on 2014-11-06.
  4. Moore, Don A.; Schatz, Derek (August 2017). "The three faces of overconfidence". Social and Personality Psychology Compass. 11 (8): e12331. doi:10.1111/spc3.12331. ISSN   1751-9004.
  5. Adams, P. A.; Adams, J. K. (1960). "Confidence in the recognition and reproduction of words difficult to spell". The American Journal of Psychology. 73 (4): 544–552. doi:10.2307/1419942. JSTOR   1419942. PMID   13681411.
  6. Lichtenstein, Sarah; Fischhoff, Baruch; Phillips, Lawrence D. (1982). "Calibration of probabilities: The state of the art to 1980". In Kahneman, Daniel; Slovic, Paul; Tversky, Amos (eds.). Judgment Under Uncertainty: Heuristics and Biases. Cambridge University Press. pp. 306–334. ISBN   978-0-521-28414-1.
  7. Langer, Ellen J. (1975). "The illusion of control". Journal of Personality and Social Psychology. 32 (2): 311–328. doi:10.1037/0022-3514.32.2.311. S2CID   30043741.
  8. Gino, Francesca; Sharek, Zachariah; Moore, Don A. (2011). "Keeping the illusion of control under control: Ceilings, floors, and imperfect calibration". Organizational Behavior and Human Decision Processes. 114 (2): 104–114. doi:10.1016/j.obhdp.2010.10.002.
  9. Buehler, Roger; Griffin, Dale; Ross, Michael (1994). "Exploring the "planning fallacy": Why people underestimate their task completion times". Journal of Personality and Social Psychology. 67 (3): 366–381. doi:10.1037/0022-3514.67.3.366. S2CID   4222578.
  10. Krizan, Zlatan; Windschitl, Paul D. (2007). "The influence of outcome desirability on optimism" (PDF). Psychological Bulletin. 133 (1): 95–121. doi:10.1037/0033-2909.133.1.95. PMID   17201572. Archived from the original (PDF) on 2014-12-17. Retrieved 2014-11-07.
  11. Norem, Julie K.; Cantor, Nancy (1986). "Defensive pessimism: Harnessing anxiety as motivation". Journal of Personality and Social Psychology. 51 (6): 1208–1217. doi:10.1037/0022-3514.51.6.1208. PMID   3806357.
  12. McGraw, A. Peter; Mellers, Barbara A.; Ritov, Ilana (2004). "The affective costs of overconfidence" (PDF). Journal of Behavioral Decision Making. 17 (4): 281–295. CiteSeerX   10.1.1.334.8499 . doi:10.1002/bdm.472. Archived (PDF) from the original on 2016-03-04.
  13. Harvey, Nigel (1997). "Confidence in judgment". Trends in Cognitive Sciences. 1 (2): 78–82. doi:10.1016/S1364-6613(97)01014-0. PMID   21223868. S2CID   8645740.
  14. Hoffrage, Ulrich (2004). "Overconfidence" . In Pohl, Rüdiger (ed.). Cognitive Illusions: a handbook on fallacies and biases in thinking, judgement and memory . Psychology Press. ISBN   978-1-84169-351-4.
  15. Gigerenzer, Gerd (1993). "The bounded rationality of probabilistic mental models". In Manktelow, K. I.; Over, D. E. (eds.). Rationality: Psychological and philosophical perspectives. London: Routledge. pp. 127–171. ISBN   9780415069557.
  16. Alpert, Marc; Raiffa, Howard (1982). "A progress report on the training of probability assessors". In Kahneman, Daniel; Slovic, Paul; Tversky, Amos (eds.). Judgment Under Uncertainty: Heuristics and Biases. Cambridge University Press. pp. 294–305. ISBN   978-0-521-28414-1.
  17. Vörös, Zsófia (2020). "Effect of the different forms of overconfidence on venture creation: Overestimation, overplacement and overprecision". Journal of Management & Organization. 19 (1): 304–317. doi:10.1017/jmo.2019.93. S2CID   212837116.
  18. Svenson, Ola (1981). "Are we all less risky and more skillful than our fellow drivers?". Acta Psychologica. 47 (2): 143–148. doi:10.1016/0001-6918(81)90005-6.
  19. Cannell, John Jacob (1989). "How public educators cheat on standardized achievement tests: The "Lake Wobegon" report". Friends for Education. Archived from the original on 2014-11-07.
  20. Dunning, David (2005). Self-Insight: Roadblocks and Detours on the Path to Knowing Thyself. Psychology Press. ISBN   978-1841690742.
  21. Kruger, Justin (1999). "Lake Wobegon be gone! The "below-average effect" and the egocentric nature of comparative ability judgments". Journal of Personality and Social Psychology. 77 (2): 221–232. doi:10.1037/0022-3514.77.2.221. PMID   10474208.
  22. Weinstein, Neil D. (1980). "Unrealistic optimism about future life events". Journal of Personality and Social Psychology. 39 (5): 806–820. CiteSeerX   10.1.1.535.9244 . doi:10.1037/0022-3514.39.5.806. S2CID   14051760.
  23. Chambers, John R.; Windschitl, Paul D. (2004). "Biases in Social Comparative Judgments: The Role of Nonmotivated Factors in Above-Average and Comparative-Optimism Effects". Psychological Bulletin. 130 (5): 813–838. doi:10.1037/0033-2909.130.5.813. PMID   15367082. S2CID   15974667.
  24. Chambers, John R.; Windschitl, Paul D.; Suls, Jerry (2003). "Egocentrism, Event Frequency, and Comparative Optimism: When what Happens Frequently is "More Likely to Happen to Me"". Personality and Social Psychology Bulletin. 29 (11): 1343–1356. doi:10.1177/0146167203256870. PMID   15189574. S2CID   8593467.
  25. Kruger, Justin; Burrus, Jeremy (2004). "Egocentrism and focalism in unrealistic optimism (and pessimism)". Journal of Experimental Social Psychology. 40 (3): 332–340. doi:10.1016/j.jesp.2003.06.002.
  26. Taylor, Shelley E.; Brown, Jonathon D. (1988). "Illusion and well-being: A social psychological perspective on mental health". Psychological Bulletin. 103 (2): 193–210. CiteSeerX   10.1.1.385.9509 . doi:10.1037/0033-2909.103.2.193. PMID   3283814. S2CID   762759.
  27. Steven Sloman; Philip Fernbach (2018). The Knowledge Illusion: Why We Never Think Alone. Riverhead Books. ISBN   978-0399184369.
  28. Kahneman, Daniel (19 October 2011). "Don't Blink! The Hazards of Confidence". New York Times. Adapted from: Kahneman, Daniel (2011). Thinking, Fast and Slow . Farrar, Straus and Giroux. ISBN   978-1-4299-6935-2.
  29. Plous (1993, p. 217).
  30. Malmendier, Ulrike; Tate, Geoffrey (2008). "Who makes acquisitions? CEO overconfidence and the market's reaction". Journal of Financial Economics . 89 (1): 20–43. doi:10.1016/j.jfineco.2007.07.002. S2CID   12354773.
  31. Twardawski, Torsten; Kind, Axel (2023). "Board overconfidence in mergers and acquisitions". Journal of Business Research . 165 (1). doi:10.1016/j.jbusres.2023.114026.
  32. Thompson, Leigh; Loewenstein, George (1992). "Egocentric interpretations of fairness and interpersonal conflict" (PDF). Organizational Behavior and Human Decision Processes. 51 (2): 176–197. doi:10.1016/0749-5978(92)90010-5. Archived (PDF) from the original on 2014-11-07.
  33. Babcock, Linda C.; Olson, Craig A. (1992). "The Causes of Impasses in Labor Disputes". Industrial Relations. 31 (2): 348–360. doi:10.1111/j.1468-232X.1992.tb00313.x. S2CID   154389983.
  34. Johnson, Dominic D. P. (2004). Overconfidence and War: The Havoc and Glory of Positive Illusions. Harvard University Press. ISBN   978-0-674-01576-0.
  35. Aumann, Robert J. (1976). "Agreeing to Disagree". The Annals of Statistics. 4 (6): 1236–1239. doi: 10.1214/aos/1176343654 .
  36. Daniel, Kent; Hirshleifer, David; Subrahmanyam, Avanidhar (1998). "Investor Psychology and Security Market Under- and Overreactions" (PDF). The Journal of Finance. 53 (6): 1839–1885. doi: 10.1111/0022-1082.00077 . hdl:2027.42/73431. S2CID   32589687.
  37. Oskamp, Stuart (1965). "Overconfidence in case-study judgments" (PDF). Journal of Consulting Psychology. 29 (3): 261–265. doi:10.1037/h0022125. PMID 14303514. Archived (PDF) from the original on 2014-11-07. Reprinted in Kahneman, Daniel; Slovic, Paul; Tversky, Amos, eds. (1982). Judgment Under Uncertainty: Heuristics and Biases. Cambridge University Press. pp. 287–293. ISBN 978-0-521-28414-1.
  38. Radzevick, J. R.; Moore, D. A. (2009). "Competing To Be Certain (But Wrong): Social Pressure and Overprecision in Judgment" (PDF). Academy of Management Proceedings. 2009 (1): 1–6. doi:10.5465/AMBPP.2009.44246308. Archived from the original (PDF) on 2014-11-07.
  39. Elizabeth.R, Tenney; David, Hunsaker; Nathan, Meikle (2018). "Research: When Overconfidence Is an Asset, and When It's a Liability". Harvard Business Review.
  40. Fowler, James H.; Johnson, Dominic D. P. (2011-01-07). "On Overconfidence". Seed Magazine. ISSN 1499-0679. Archived from the original on 2011-08-12. Retrieved 2011-08-14.
  41. Daniel Kahneman; Gary A. Klein (1 September 2009). "Conditions for intuitive expertise: a failure to disagree". American Psychologist . 64 (6): 515–526. doi:10.1037/A0016755. ISSN   0003-066X. PMID   19739881. Wikidata   Q35001791.
  42. Daniel Kahneman; Olivier Sibony; Cass Sunstein (2021). Noise: A Flaw in Human Judgment. United States of America: Little, Brown and Company. ISBN   978-0-316-26665-9. OL   39437932M. Wikidata   Q107108799.
  43. Judge, Timothy A.; Locke, Edwin A.; Durham, Cathy C. (1997). "The dispositional causes of job satisfaction: A core evaluations approach". Research in Organizational Behavior. Vol. 19. Elsevier Science. pp. 151–188. ISBN 978-0762301799.
  44. "Overconfidence". Psychology Today. Retrieved 2021-03-08.
  45. Rimondi, Christopher (2019-08-06). "Chernobyl, Anatoly Dyatlov and Engineering Arrogance". Medium. Retrieved 2021-03-08.
