Typical partial rationality. Humans are not robots. Having an accurate picture of your own flaws and the risks involved in most endeavors will not help you perform better than a slight overconfidence would. We tend to overemphasize risks if we are made persistently aware of them. In every dynamic, performance-based activity, this tendency needs to be counterbalanced by a degree of irrational confidence. It's a feature, not a bug.
There are times when it is also harmful, but things like actually being able to talk to that girl, ask for that raise, close that sale, take down that oppressor...these things benefit tremendously from irrational confidence.
Removing all overconfidence would be the end of the human race.
This is not about "removing all overconfidence" at the individual level. Here's the snippet:
"Not even he believes that the various flaws that bedevil
decision-making can be successfully corrected.
The most damaging of these is overconfidence: the
kind of optimism that leads governments to believe
that wars are quickly winnable and capital projects will
come in on budget despite statistics predicting
exactly the opposite. It is the bias he says he would
most like to eliminate if he had a magic wand."
It's useful to be able to psych yourself up when you know that you could use a little extra confidence and the costs and externalities are low, like when you're feeling nervous about starting a conversation with a stranger.
In other situations, like starting wars, investing, or trying to stop climate change, it might be more useful to be able to make less biased guesses about outcomes. Otherwise it's like being unable to stop bluffing in a high-stakes game even after you've lost your credibility.
That is exactly the kind of overconfidence that should NOT be eliminated. Going ahead when the odds are against you is the pioneering way. The odds were against my startup then, and they still are now. But if I'd believed them, I wouldn't be where I am today.
If your startup succeeded, I wouldn't call it overconfidence. Still, if you subscribe to the lean startup philosophy, your confidence should be based on cohort analysis and data, not gut feelings.
In his book Thinking, Fast and Slow he talks about the way we believe things.
When encountering an idea, our system 1 believes it automatically, and then our system 2 considers the consequences of believing it and decides if you should unbelieve it.
They found that when system 2 is tied up or exhausted, you are more likely to believe something that's false, because your system 2 didn't consider unbelieving it. This is why late-night commercials and infomercials work so well.
The danger you describe is having confidence in a case of genuine uncertainty. Daniel Kahneman is talking about a case where you could know for sure, but because of a cognitive bias you are overconfident instead of relying on statistical evidence, which may point to the contrary.
How can you know for sure? In the end all you actually have are probabilities. At the beginning of the incubator I attended, the estimated success rate of startups was less than 10%. By that rationale, I should have packed my things and gone home.
And in fact, I agree with that rationale. I still can't believe I went through with it, much less succeeded.
But without foolhardy people to take on these gigantic risks, we'd be FAR less developed than we are.
In the end we can only do the best we can with the information available to us. This is a warning to make sure you have the most accurate information possible, instead of burying your head in the sand and hoping your blissful ignorance works out, like a blind squirrel finding a nut once in a while.
Statistics can't prove anything! That's his point. Don't be so confident that your mind is too closed to explore other possibilities.
The scientific method can only eliminate possibilities and show you the most probable option. It is by no means proof. But that's a bad reason not to consider competing hypotheses instead of just trying to confirm an existing bias.
Your point about incubator success is different. Those statistics are exactly the ones we want to avoid. Success or failure of a startup in an incubator is not diagnostic evidence, as I can think of a dozen other hypotheses to explain those results. It's like a doctor saying you have strep because you have a fever. Since fever is consistent with other diagnoses, it has little diagnostic value beyond saying you are sick.
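To put rough numbers on that strep/fever point (all figures made up purely for illustration, not real medical rates), a quick Bayes sketch:

    # Toy numbers only: how much does "has a fever" move the odds of strep?
    p_strep = 0.02                 # assumed prior: 2% of sick patients have strep
    p_fever_given_strep = 0.90     # fever is common with strep...
    p_fever_given_other = 0.60     # ...but also common with everything else

    p_fever = p_fever_given_strep * p_strep + p_fever_given_other * (1 - p_strep)
    p_strep_given_fever = p_fever_given_strep * p_strep / p_fever

    print(round(p_strep_given_fever, 3))  # ~0.03: barely above the 2% prior

Because the fever is nearly as likely under the competing diagnoses, observing it barely moves the posterior. That's what "no diagnostic value beyond saying you are sick" cashes out to.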
I think those are the author's words and not Kahneman's. At least I would hope so, otherwise it would reveal a lot of ignorance about the objectives of a bureaucracy like the government.
I think you are misunderstanding the word "confidence" in the sense that Kahneman is using it. One can also be overconfident in one's notion that the girl wouldn't want to talk to you, the raise is unjustified, the customer doesn't want the product, and the oppressor is unbeatable. By your examples, you seem to be associating it with self-esteem, whereas in Kahneman's case it's more of an epistemological issue.
Agreed. What the article is trying to say (I think) is that humans are too confident about their first intuition (system 1), their "gut feelings", and because of that they are affected by serious problems like the base-rate fallacy, etc.
I'd argue there's a difference between normal confidence and overconfidence. I work with an overconfident guy. He's regularly a motor mouth in meetings, plays up this wise-old-master persona, is never shy to aggressively give his opinion, etc., which might be okay in some cases, but the problem is he's pretty much an idiot. Most of his suggestions are borderline nonsense. It's hard to keep a straight face when he recommends things like getting rid of our Drupal infrastructure because "I can just make a website in Access and run it off the network drive," and other suggestions of that caliber.
Even people who are amazing shouldn't act like this too often. It's bad politics, and the guy you think is your force multiplier ends up being the guy who chases the other talent away. These people are toxic to organizations, and unfortunately they seem unwilling to change or even to see what they are doing wrong here. I guess most people would call this the Dunning–Kruger effect, which may or may not be right, but personally I think it ties into some level of autistic-spectrum stuff and as such is probably untreatable.
The larger question, at least to me, is what do we do with people who don't fit into the typical office-culture mold? The overconfident braggart personality probably fit some ancient evolutionary niche that isn't helpful in a lot of fields today. I guess these guys eventually migrate into jobs like sales that value casual bullshitting.
"The most damaging of these is overconfidence: the kind of optimism that leads governments to believe that wars are quickly winnable and capital projects will come in on budget despite statistics predicting exactly the opposite."
I think he means idiotic overconfidence? Going to war with the wrong country, for the wrong reasons, killing and maiming thousands of soldiers and civilians: is that idiotic overconfidence?
"things like actually being able to talk to that girl, ask for that raise, close that sale, take down that oppressor...these things benefit tremendously from irrational confidence." I'm not sure I would call these things overconfidence. I would call them the actions of a confident person?
Sure, we all know overconfident people who get lucky (Donald Trump, with dad there to guide/bail out young Donald), but the overconfident people I have met usually end up crashing and burning, and hurting a lot of people. They are not respected, and don't get far. And many end up in jail.
And yes, according to this article, they end up in power? They do end up in power sometimes, but the ones I know all had enabling rich parents. I would call them idiots with rich parents rather than overconfident? Or just the average politician?
Your premise is right, based on what we were given in this article. I have a feeling his book goes into more detail? I have a feeling the interviewer left out some critical information? I hope he left out information, because I'm confused by the example and the label of overconfidence too! It's not being overconfident, it's being an irrational idiot? Then again, I'm confused by a lot of psychology? I don't like/respect psychologists who ponder without a lot of evidence to back up their theory. It's just yakking, without evidence?
Actually, I think years ago Alan Greenspan called this behavior "irrational exuberance"? I think that's a better term than overconfidence? It's funny how "new theories" get recycled? B.F. Skinner would be beaming? Yes, we really don't have original ideas? We are essentially just taking other people's observations/thoughts/inventions, repackaging and refining them, and in the end calling them our own?
But underconfidence is a weaker problem, in that it only affects efficiency. Maybe you won't talk to that girl, get that raise, or close that deal. But you will not find yourself past a point of no return.
The trick would be to be overconfident in situations where failure doesn't matter (or is even beneficial) and underconfident in all other matters. Unfortunately, it's usually one or the other across all situations.
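To make that asymmetry concrete, here's a toy expected-value sketch (all numbers invented for illustration): when the downside is negligible, acting on an inflated estimate of success still pays off on average; when the downside is catastrophic, the same inflation is ruinous.

    # Toy comparison: overconfidence with a cheap failure vs. an expensive one.
    def expected_value(p_success, gain, loss):
        return p_success * gain - (1 - p_success) * loss

    true_p = 0.10  # actual chance of success, regardless of how confident you feel

    # Cheap failure (e.g. a cold approach): small loss, decent gain
    print(expected_value(true_p, gain=100, loss=1))       # +9.1: worth trying anyway

    # Catastrophic failure (e.g. a war or a bet-the-company project)
    print(expected_value(true_p, gain=100, loss=10_000))  # -8990.0: ruinous on average

The "act anyway" heuristic only works because the first payoff structure is forgiving; overconfidence doesn't change the odds, it just changes whether you notice when you're in the second structure.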
Pet peeve of mine: It's actually the "Nobel Memorial Prize in Economic Sciences" or "Sveriges Riksbank Prize in Economic Sciences in Memory of Alfred Nobel".
"It is not one of the prizes that Alfred Nobel established in his will in 1895, but instead was established 73 years later by a donation to the foundation from Sweden's central bank, the Sveriges Riksbank, on the bank's 300th anniversary."
From the criticism section:
In his speech at the 1974 Nobel Prize banquet Friedrich Hayek stated that had he been consulted on the establishment of a Nobel Prize in economics, he would "have decidedly advised against it" primarily because, "The Nobel Prize confers on an individual an authority which in economics no man ought to possess.... This does not matter in the natural sciences. Here the influence exercised by an individual is chiefly an influence on his fellow experts; and they will soon cut him down to size if he exceeds his competence. But the influence of the economist that mainly matters is an influence over laymen: politicians, journalists, civil servants and the public generally."
Less important, but relevant note about the name:
"Some critics argue that the prestige of the Prize in Economics derives in part from its association with the Nobel Prizes, an association that has often been a source of controversy. Among them is the Swedish human rights lawyer Peter Nobel, a great-grandson of Ludvig Nobel.[28] Nobel criticizes the awarding institution of misusing his family's name, and states that no member of the Nobel family has ever had the intention of establishing a prize in economics."
Not to be that guy, but you are appealing to authority, which is not a valid argument... Personally, if I had a magic wand, I would probably use it on myself to cure that damn impostor syndrome and the paralyzing fear of failure. But that's just me.
An appeal to authority is when you assert that because person X has significant knowledge on subject Y, they're also therefore right about something in topic Z, where Y and Z are unrelated. For example, Steve is a researcher in bioengineering who's well regarded in his field. He says Stephen King is a poor writer. Steve might be right or wrong, but we can't use his status as a scientist to back up his statements regarding literature. They're not related.
However, it's not an appeal to authority when someone is actually an authority on the subject in question, and it's the kind of subject where one can be an authority. Which is the case here.
It may not be a logically valid argument (Kahneman could be wrong, despite being an authority), but it doesn't seem reasonable to dismiss the claims of a domain expert who has demonstrated a great depth of knowledge, including empirical studies to support his viewpoints.
> you are appealing to authority, which is not a valid argument
Outside of high school debate clubs, it's pretty reasonable to consider the demonstrated expertise of people making claims. I tend to take the opinion of medical researchers about the relationship of HIV and AIDS over the opinions of members of the Foo Fighters and Thabo Mbeki, for example.
The opinion of an expert, in his area of expertise, backed by a career's worth of research, seems a little more likely to be correct than that of a random on the internet backed by nothing more than "nuh-uh!"
Credentials can certainly be a good guideline, but as I'm sure you know, HIV and AIDS researchers don't argue that their claims are true because of their authority status (which would be an appeal to authority), but because of the research their claims are based upon, as opposed to Jenny McCarthy et al.
Well technically, while appeal to authority may not be valid for a perfectly informed rational agent, it is something that is valid for humans for the very same reasons 'torreens advocates overconfidence is useful: namely, we have neither enough time nor computing power to evaluate all available data. Authority is a useful heuristic.
Well of course; authority is just a heuristic, and heuristics can by definition lead you astray. But it doesn't mean they're not useful if you're careful about using them.
An appeal to authority is never a valid argument, as it is a logical fallacy (i.e. by definition invalid).
Sure, authority can be a useful heuristic for non-authorities, but is useless in argumentative reasoning, e.g. "X, who is an authority says Y, and so [therefore] Y is true".
> An appeal to authority is never a valid argument, as it is a logical fallacy (i.e. by definition invalid).
Jenny McCarthy and Dave Grohl's opinions on the effectiveness of vaccination are clearly as valid as my doctor's. Weighting my doctor's views above those is clearly a fallacy because debating rules.
I appreciate your sarcastic attempt to make a point via reductio ad absurdum, but as you can see, the only thing I'm saying is that arguments that assume whatever an authority figure says regarding a matter must be true (argument from authority) are never valid.
I'm not saying that expert opinions don't have merit, I'm saying that expert opinions aren't logically true because they are uttered by experts.
I suggest you look a little more into these "debating rules" (logical fallacies), as they are basically old-timey reasoning rules; i.e. popular and intuitive arguments that don't hold up.
P.S. As I recall, the Foo Fighters were mixed up in HIV-denialism, not anti-vaccination.
Fortunately, the real world does not run on boolean logic. "X, who is an authority on the topic, says Y" does not prove Y is true, but it provides evidence in favour of Y being true.
Sure, I agree to some extent.
In the hypothetical case it may be reasonable to assume that X knows what they are talking about; one can reasonably assume that they are better informed regarding the subject than a layperson.
But the reason for it being a fallacy is that this sort of argument doesn't provide evidence of any kind. Rather, it makes the assumption that it must be true, because the authority says so.
I agree that overconfidence is harmful, and likely the most harmful part of the human psyche. Some examples: Nazis, slavery (racism in general), communism, societal misogyny (which exists in every society of any size), LGBT hate, religious hate, and really any type of bigotry or hate based on incomplete, inaccurate information. In every single one of these cases, the perpetrators of the atrocity are overconfident in their ability to judge others' true nature and don't realize that they are wrong. I think this is the type of overconfidence that is most harmful and the type that Kahneman is referring to.
> Removing all overconfidence would be the end of the human race.
Exactly. Vasco da Gama needed a lot of overconfidence to sail all the way to India from Europe in something like this: https://www.google.com/search?q=Vasco+da+Gama+ship&biw=1920&.... And there are countless examples like this one, which, on the whole, I think benefited us as a species.
I think you already disarm your own argument when you say:
"things like actually being able to talk to that girl, ask for that raise, close that sale, take down that oppressor...these things benefit tremendously from irrational confidence".
Exactly. You don't need to be overconfident to "talk to that girl" or ask for a raise; "just" confidence will do. Case closed.
Not trying to argue either way: it comes down to defining "over" in this context. If asking a girl/boy out when she/he is going to reject you (but you don't know it) is overconfidence, then we need overconfidence. (We would need to be psychic or we'd never do anything.) If it means assuming every piece of contrary advice you receive is idiotic (e.g. the invasion of Iraq), then no, we don't need it, and this is precisely the kind of definition the article uses.
The article defines overconfidence as a belief in an outcome despite what reasonable past statistics tell us.
"the kind of optimism that leads governments to believe that wars are quickly winnable and capital projects will come in on budget despite statistics predicting exactly the opposite."
On second thought, project planning (and making plans in general that involve duration, cost, etc.) and "asking a girl out" or "asking for a pay rise" are very different things, and I'd prefer not to conflate the two.
You might be able to generate some stats on what % of girls that are asked out (in some particular circumstance, e.g. at a bar in Kentucky) agree, or how many pay rises are approved.
You can then talk about over- or under-confidence if expectations of the outcome deviate largely from these stats. How much is too little or too much? Who knows.
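A minimal sketch of that calibration idea, with an invented dataset and an arbitrary threshold (the "how much is too much" question is exactly the part this doesn't answer):

    # outcomes: 1 = said yes / raise approved, 0 = rejected (invented data)
    outcomes = [0, 0, 1, 0, 0, 0, 1, 0, 0, 0]
    base_rate = sum(outcomes) / len(outcomes)   # 0.2

    my_estimate = 0.7   # what I privately expect my odds to be
    threshold = 0.25    # arbitrary cutoff for "deviates largely"

    gap = my_estimate - base_rate
    if gap > threshold:
        print(f"overconfident by {gap:.2f}")
    elif gap < -threshold:
        print(f"underconfident by {abs(gap):.2f}")
    else:
        print("roughly calibrated")

With only ten data points the base rate itself is noisy, which is part of why picking the threshold is the hard bit.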
> The article defines the overconfidence as a belief in an outcome despite what reasonable past statistics tell us.
Following this same logic, a nerdy guy who has a 100% rejection rate should accept it and do nothing. However, from all the self-help/marketing literature, we know this is a doomed philosophy.