I don't consider these biases "Major Flaws" or flaws at all. IMO these observations are heuristics that actually work amazingly well for most everyday situations, and one shouldn't be quick to discard them in favor of some deeper, system-2-esque idea. This is similar to how optical illusions don't show flaws in our vision but rather highlight how well we can infer non-trivial information correctly in "normal" situations.
For example, in the linked example of "Conformity", a person taking a test with several other people doesn't immediately leave the room when observing smoke and hearing a smoke alarm. This is, IMO, a completely legitimate response: if several other people are calmly carrying on with their business, it's much more reasonable to assume they know something you don't than that they are all lemmings sitting oblivious to their own demise (or that they are all goons in on a televised hoax at your expense). Just imagine how intolerable life would be if every single person wanted to investigate every single request or stated fact for themselves, only to discover the request is usually logical and the fact true (consider, for example, a "Dead End" sign with every single driver checking whether it's actually true).
Obviously the edge cases of these heuristics (the "illusions") should be examined, and care taken when appropriate, but it's important to remember that these human tendencies are basically what allows complex societies (with many individuals with partial knowledge) to function.
> I don't consider these biases "Major Flaws" or flaws at all. IMO these observations are heuristics that actually work amazingly well for most every day situation and one shouldn't be quick to discard them in lieu of some deeper, system-2esque idea.
I cannot agree more. If we were determined to point to a "Major Flaw", then I'd propose something like the inability to judge the difficulty of a task a posteriori. It can be seen most strikingly in children:
> For instance, we can show the children a familiar candy box. Anyone who sees it will leap to the conclusion that there's candy inside. When we open it, it turns out to be a trick: there are actually pencils inside. Then we can ask the children simple questions about this series of events. What did you think was inside it? What will your friend Nicky think is inside it, if he sees it all closed up like this?[1]
Three-year-old children consistently believe that Nicky will think there are pencils inside the candy box. With age people become wiser, but there are a lot of people who cannot grasp this for more complex setups. They resort to a modified version of Hanlon's Razor[2]: if we cannot understand why someone made a mistake, then it must be because of stupidity. If a decision led to a mistake, then it was a stupid decision.
I personally believe that it is a problem of the Bayesian mind, which updates probabilities after each observation. Some patching could help, but only in a limited way. One cannot replay a past state of a mind at full scale; it would require a bigger mind.
Do you really think that a person can misinterpret a fire alarm accompanied by a huge cloud of smoke? Of course, you can assume that the others know something about it. But then the logical thing to do is to ask for this knowledge, the knowledge that contradicts yours. Demand proof; resolve your inconsistency. Maybe you're hallucinating. Blindly following is not a reasonable choice if it contradicts your own senses.
There is no reason to investigate something if it doesn't contradict your knowledge. It's reasonable to investigate the "Dead End" sign only if every other driver drives past it, because that is an inconsistency. And even then, not to drive on just like the others, but to ask for an explanation first.
Yes, those "heuristics" may be good enough in a lot of situations. Just like instincts, they can be useful too. But we can't let instincts rule over our rational mind. That doesn't work; there would be no civilization in that case.
These aren't flaws but they are biases. They may be right 90% of the time but you should still be aware of them so you don't fall for them 100% of the time.
Not only that, but in many cases conformity to group ideas will lead to better outcomes than individual thinking. The world is complicated, and you as an individual probably don't really have the time or brain resources to figure it all out from scratch. But group norms have evolved over time, incorporating the experiences and knowledge of many people. See [1] for many examples.
Of course, in the modern world group norms may not have caught up with our rapidly changing environment, so there are likely to be more cases where individual thinking can improve things. There should clearly be a balance between following the group and thinking for yourself; neither is good on its own.
> group norms have evolved over time, incorporating the experiences and knowledge of many people
While this can often be true, it's not always true. Furthermore, it's often a testable hypothesis, and in those cases, it should be tested, not just assumed.
Also, the "conformity" case in the article involves a group of people (the confederates trying to manipulate the unwitting test-taker) who are violating a common norm (namely, that the normal behavior of people when a fire alarm goes off and smoke is seen is to get out, not calmly continue going about their business). So that case is not testing "how well do you follow group norms that make sense"; it's testing "how willingly will people follow a group that is doing something that obviously does not make sense". That's a different thing.
I look at it this way: a contrarian opinion is a high risk high benefit thing. A contrarian is more likely to be wrong and it takes more effort to arrive at a correct contrarian opinion than to pick a correct conforming one, but if you are right it’s possible that you can obtain some large payoff such as investing early in a winner or selling near the top of a bubble.
I couldn't watch the ridiculous video all the way through. I am positive that the prank victims would have asked the other folks in the room what was going on. Did they address how people asked and whether/how they were answered?
If you're assured by other people that, "it's fine, this happens", it's even more understandable to not panic in this situation.
Did you watch the video? The scenario it depicts seems fairly ludicrous and one sided. A section of the article is based on this interpretation, so I think it's fair to question it.
To be clear, I scanned the entire video but didn't watch it minute by minute, so I may have missed something. (Different from "I stopped when...") But I'm genuinely curious if anyone knows the context behind whether people asked questions because it really throws the premise into doubt.
Would be interested to get OP's take in particular, but I think I'm nested into oblivion at this point.
I did. Your (weird) insistence on critiquing the study as if it were a thought experiment / hypothetical / work of fiction (i.e. a piece where the author is asking you to believe something that's just too unbelievable)... is itself a reason to downvote.
The mysterious music and quavering voice over positively scream manufactured drama. I don't think it is just me who would get that impression. In a conversation of 2, it's hard to determine who the weird one is sometimes! ;)
What makes you think this was a study? Is there a published work referenced somewhere that I missed?
It's good to periodically see these articles get traction. There are so many logical fallacies that we need to remain vigilant in guarding against them.
Lately, in response to the relatively intense polarization of beliefs, I've been contemplating open-mindedness, and whether discussions starting with establishing an open-minded context could get people talking again.
To me, open-mindedness means that you're receptive to new ideas and experiences and generally do not reject them outright. When presented with implausible or seemingly impossible information, an open-minded person will listen to the basis of ideas, concepts and information and weigh that basis before making a determination about whether or not the idea is true and worth assimilating. Or they may try a new experience with less pre-judgment of whether they'll appreciate it.
In my view, there is a critical nuance in open-mindedness: It applies whether or not the idea is consistent with what I already believe to be true. As an open-minded person, I feel that I need to regularly reevaluate my beliefs as new information comes to light, even if that new information directly contradicts a basis for a firmly held belief.
I'm curious to know examples that you've had where you've learned new information that contradicted what you already firmly believed, and through open-mindedness, changed your belief accordingly.
Sidestepping your question, open-mindedness comes with its own costs. Properly analyzing an idea takes time away from other things you could be doing, and some ideas are so outlandish that your life will almost certainly be better by simply making the choice to spend as little time as possible flirting with such absurdities (except insofar as you might enjoy toying with "what-if" questions as an end unto itself). Are you practicing open-mindedness...everywhere?
It wasn't that long ago that some crackpot tried to convince me that the modern atmosphere had damaged my lungs, so to replace the oxygen I was missing I needed to buy hydrogen peroxide injections. The claim as a whole either needs to have major structural flaws or overturn sizable portions of modern chemistry and physics to have a chance at plausibility, and my biases strongly suggest that instead somebody was trying to skirt legal oversight in order to con me.
Good point. I'd say that for any new information, there does need to be an initial 'smell test' threshold. When confronted with something so outlandish, I typically switch to making a judgment about the person and how their thought processes work. I.e., ask them to explain the basis of their claim and look for overt logical fallacies, which we all get better at identifying over time.
It's also got to be a matter of sufficient consequence to be worth the time.
Bayes has you covered: if your priors rank something as exceptionally likely or unlikely, then you probably shouldn't pay much attention to new information (especially when it's from some random person and carries a high likelihood of being misleading).
Extraordinary claims require extraordinary evidence, etc.
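The "extraordinary evidence" maxim falls straight out of Bayes' rule: a tiny prior needs a huge likelihood ratio before the posterior becomes non-negligible. A minimal sketch (all probabilities invented for illustration):

```python
def posterior(prior, p_evidence_given_true, p_evidence_given_false):
    """Bayes' rule: P(H|E) = P(E|H) * P(H) / P(E)."""
    p_evidence = (p_evidence_given_true * prior
                  + p_evidence_given_false * (1 - prior))
    return p_evidence_given_true * prior / p_evidence

# An extraordinary claim: prior of 1 in a million.
# Ordinary evidence (a stranger's say-so: produced 90% of the time when
# the claim is true, but also 10% of the time when false) barely moves it:
print(posterior(1e-6, 0.9, 0.1))   # ~9e-6, still essentially zero
```

With a fifty-fifty prior the same evidence would push the posterior to 0.9, which is the sense in which strong priors let you rationally ignore weak testimony.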
There are things that aren’t likely enough or important enough for the individual to commit resources to explore intellectually. But these things may be likely enough or important enough for society as a whole for some small group or even individual to explore and report back on.
That is why it is important to support your local crackpot. Much as staid English society always valued their eccentrics. The more ultrarational the person, the more important for that person to spend quality time at the fringes.
> Properly analyzing an idea takes time away from other things you could be doing, and some ideas are so outlandish that your life will almost certainly be better by simply making the choice to spend as little time as possible flirting with such absurdities (except insofar as you might enjoy toying with "what-if" questions as an end unto itself).
There's also the claim that someone is being "close-minded", when they reject an idea they have soundly researched and therefore need not waste any more time on. As Harlan Ellison so eloquently put it: "I'd gotten all the literature I could handle on the subject from a certain Thomas Aquinas" in regards to the god hypothesis. More often than not, "close-minded" is a conversation stopper, designed to deflect from the accuser's very own ignorance and credulity.
Same thing goes for creationism, AGW denial, anti-vaxx, anti-mask, the QAnon cult, trickle-down economics, etc.
"There is a principal which is a bar against all information, which is proof against all arguments and which cannot fail to keep man in everlasting ignorance – That principle is contempt prior to investigation.” -- Herbert Spencer
I find that breaking the habit of being quick to judge leads to open-mindedness. This is also a tenet of Cognitive Behavioral Therapy - your first thought and subsequent feelings that arise from that thought are often wrong.
For me the term "open minded" is a label for closed-minded people who believe specific things they label as "open minded". Take, for example, legalizing drugs. They don't believe in being open-minded about whether or not that's a good idea. Instead they believe supporting it is the "open-minded" position and not supporting it is the "closed-minded" position. They have a similarly closed mind about most topics.
No, I'm open to discussing it, I haven't decided, I've heard good arguments on both sides.
But, in my experience, most people who put "I'm open minded" in a self-description are signaling that they believe in certain specific policies, and they are actually closed-minded about whether those policies are good or bad. They've already decided the policies are good and won't consider (have an open mind about) any suggestions otherwise.
It can also help to occasionally play "devil's advocate" and argue for "the other side". This forces you to seriously consider the weaknesses of your own side and the strengths of the other side. I've definitely developed more nuanced opinions as a result of playing devil's advocate.
I prefer the idea behind the steelman technique (as opposed to a strawman argument): state the position of the person you disagree with so well they say "I wish I had put it that way!" then proceed to refute the central argument.
In my experience, knowing about these so-called fallacies may help us correct others, but it is much more difficult to realise when you are making the mistake yourself. People just think they are too smart to commit these mistakes.
As paradoxical as that may sound, in a sense the opposite of open-mindedness is needed. There needs to be more social pressure on people to keep discussions civil and to stay intellectually honest. The problem right now is that people advance arguments for rhetorical purposes only, in order to "win" instead of solving problems together. Social pressure to change that requires people to be less tolerant towards intellectually dishonest people.
Of course, I agree with you that open-mindedness is important when dealing with intellectually honest people.
The paradox is that "pressure to keep civil" may not mean the same to you as it does to me. Numerous times, trying to call out someone who isn't being civil will put you in the not-civil category. That's why systems like HN aren't necessarily encouraging of open discourse. Many don't want to be judged, and judging is all many do, downvoting based on rhetoric and nuance (or lack thereof). And the Achilles' heel is the downvote as the not-civil vote, except there is no discourse because you can't reply to a downvote.
I agree it’s good that cognitive biases and better frameworks for thinking get traction here (or anywhere else). However this blog post doesn’t look to be backed by any research, or anything else for that matter. I predict people just start arguing politics below this comment due to some of the lexical choices, for example.
Ok. Probably you hold a very firm belief that nothing good comes of sending 1 million in bitcoin to a random person on the internet who asks for it. Please challenge this view. My wallet is... wait, need to look that up.
This is an interesting thread as I'm seeing a false dichotomy form. Either these phenomena are flaws and should be abandoned or they are often helpful and therefore should be adhered to.
But I think there is a middle ground or maybe it is not in the middle but off to the side somewhere. We can be aware of these biases and try to be conscious of when we are relying in part or entirely on them and still choose to abide by them especially in the absence of better information.
Understanding that one is behaving out of conformity or conservatism or another bias doesn't consign one to interminable mental litigation. You can be aware of this fact and still know that you do not, and may never, have sufficient information to challenge and overturn the bias.
However, when we suspect we are acting solely out of one of these biases and we see a compelling reason to challenge it, that awareness becomes the tool that allows us to change. A compelling example is the acceptance of homosexuality in Europe and the U.S. The conformist beliefs that were leading to the suppression of individuals were doing harm. As more people "came out", the conformist foundation of the suppression arguments weakened. For most, I think it has become clear that the risk of overruling conformity was insufficient in the face of the material harm homosexual individuals were enduring.
"Flaw" is the trigger word. If we rename the list "Common human trends in thinking" then responses would be less dramatic, but also less interesting.
Awareness is definitely the key. If you recognize you're self-projecting, then now you can account for it. If you're aware of what the group thinks, you can be aware of when you're aligned with it. And so on.
Critical thinking is the ability to expand awareness and unimmerse yourself to the point where you can analyze an idea selflessly. Then you can dive in again to reconnect with how everything feels.
Speaking for myself, the biggest problems in my thinking are as follows:
1. Thinking you know anything with an even remotely high degree of certainty.
2. Confirmation Bias - thinking you are right and searching for evidence to prove you are right, reaffirming your own beliefs in the process and actively ignoring or providing excuses for counter examples and contradictory evidence.
3. Difficulty/Inability of looking only at the facts without interpreting them as you would like them to be or not to be.
4. Inability/Difficulty of admitting you are wrong/made a mistake and also remembering it and learning from it.
5. Inability/Difficulty of acknowledging you are an irrational, flawed monkey stumbling around trying to make sense of about a billion things you don't understand and never will understand.
Trying to still make progress given all the above.
Remember -- "If it doesn't have a tail it's not a monkey."
Embrace the inner ape. Instead of looking at our limitations, look at how well we have done. We have microwave ovens, digital watches, and even ice cream. Reflecting on what it takes to go from swinging in the trees to where we are today, I think we have done pretty well for ourselves, all things considered.
Those two statements are not in conflict. Yes, we have done very well for ourselves, but that doesn't mean we aren't irrational and flawed. Just because we have done well doesn't mean we should ignore areas of improvement.
My words were meant as a light-hearted message of hope, with an allusion to The Hitchhiker's Guide to the Galaxy; not a philosophical argument to reject self-improvement.
The easiest solution to number 1 is humbling yourself by making mistakes. Sure, making mistakes itself will teach you stuff, but it's the humbling process that is really important. I've learned from years of argumentation and failure that I need to be more careful when seeking the truth, because otherwise people will make a fool out of me.
I don't agree with the premise that these "flaws" are wrong and must be "removed". If you were to remove them, you would probably get a perfectly rational being, like a computer, and computers are really stupid.
Being alive is better than being correct; rationality is correct only if all the premises are correct, with perfect information. So to manage this world of uncertainty our brain must use heuristics, heuristics that are really good, so good that we somehow survive.
I tried to read past the headline-grabbing title and the first point. But since the first point is three distinct points already, AND wishful thinking can also sometimes be seen as a form of resilience and not just a terrible thing, AND the focus on conservatism seems more emotional than pragmatic, I checked out before the end.
Perhaps a less self-aggrandizing and hand-wavy way to address the subject would be to actually look at proper research such as CBT in psychology pointing out cognitive distortions[1], which are incredibly helpful to recognize and identify in the self in order to correct both the big issues (some aspects of depression, anxiety) and small ones (everyday life for most people).
It's not self-aggrandizing, I've just overgeneralized all people and used self-projection on them :P
Indeed, common research on this topic is in psychology.
But I consider these software flaws that can be patched by increased awareness.
Psychology is speculative sometimes, in my opinion. For example, my view of self-projection is different from conventional psychological projection. I think we project simply for prediction (everything else is a consequence). I don't consider it a self-defense mechanism of bullies, as explained here: https://en.wikipedia.org/wiki/Psychological_projection
I agree it’s good that cognitive biases and better frameworks for thinking get traction here (or anywhere else). However this blog post doesn’t look to be backed by any research, or anything else for that matter. Instead of doing a quick web search to see what’s out there, this person wrote up some of their opinions on the topic and posted it to Hacker News.
I did a quick web search, but this list is indeed my subjective opinion, not scientific research. Though I've formed it through my lifetime experience and learned it the hard way — by making mistakes.
I very much liked the article, we (Human Beings, Planet Earth, 2020) need many more like this.
Rather than be disappointed by the negative comments you might read in this thread, I recommend that you consider them from the perspective of the article itself. There are only 40 comments here so far, and there is already more than enough rich irony for a followup article, particularly considering that most people on HN are on the higher end of the intelligence spectrum.
This way, the article then kind of has a cool ARG (Alternate Reality Game) aspect to it - write about a topic, seed it into an ecosystem of the very thing you are talking about, and then write about how the ecosystem responds to it. And then to make it even more fun, let everyone in on the joke! So then not only are you playing an ARG, but everyone is playing it together - we (humanity itself, and each of us as individuals [1]) are the very joke that we're laughing at!
I have a theory that some interesting consequences could come out of this sort of thing, but it requires a fair amount of work to set up properly.
[1] For example, the "people" who downvote this post - it is "you" whom I am speaking about! Does your mind (as opposed to "you") find this idea/situation pleasing?
Ironically the biggest cognitive flaw is believing that these apply to other people but not to yourself.
I saw a comment from someone at the center for applied rationality a while ago that said that one of the top reasons why people want to take their workshop was to be able to pinpoint issues in other people's thinking.
Finding flaws in my own thinking is good enough, having done so its still difficult to address those flaws. If someone else's thinking is whacked, it's even harder to diagnose or have a positive effect on.
Of course the question of "is our thinking flawed" is not terribly relevant, much of the time: our actions are influenced as much by things like body needs we're often oblivious about.
I meant that each person had to disclose the single reason they wanted to take the course. Out of all responses, one of most common #1 reasons was to expose issues in other people's thinking.
In other words, some people may have ranked them as you did, but many people ranked them in the opposite order.
Exactly, I always get a good laugh visiting "lesswrong" and "slate star codex" from time to time. Reading the commenters, you quickly notice they commit all the same fallacies (from group-thinking to deferring to authority for no reason) that they criticize in "normal" humans.
Needless to say the same applies to HN, and to every one of us.
I disagree that conservatism is a major flaw of human thinking, or even a flaw at all. I would argue that conservatism is one of the core principles of science. Although the quantity and quality of data sufficient to change one's views could be (and frequently is) debated, a healthy amount of stability and skepticism is required for science to be efficacious.
A similar argument could be made for conservatism's relationship with social structures. Rapidly changing social structures lead to unpredictable futures, which can lead to economic insecurity at the individual level.
We evolved to think the way we do over millions of years. We should not just dismiss a pattern of thought as a flaw because we don't understand its usefulness. It might be a flaw - the world today is different in many ways from the world our thinking evolved for - but at the very least we have the burden to explain why this pattern evolved, and why it's harmful now.
Every organism alive now evolved over millions of years. What's so special about human thinking? Why not cat thinking, dog thinking, cow thinking, insect thinking?
Evolution isn't about truth. It's about survival and reproduction. Which often involves lying and deception. Sometimes even deceiving oneself.
[Edit: To clarify, I quoted the linked article not to criticize it but to criticize the point I was replying to, which I took to be a human-centric fallacy about evolution.]
You missed my point. I wasn't claiming that cows are as smart as humans. I was claiming that evolution has produced millions of species, only 1 of which is humans. So you can't just handwave "evolution" to validate human thinking.
Humans are likely smarter than all of the other animals, but that's a pretty low bar. The competition for "smartest Earth species" is fairly weak.
Yes, you did, because you seem to continue to think that I was arguing against the article, when in fact I was merely arguing against the comment by amadeuspagel that I replied to.
amadeuspagel was arguing against the article, claiming that these thought patterns shouldn't necessarily be considered "flaws", because they're the product of evolution. I dispute that, because non-human thinking, which we all consider to be inferior, is also a product of evolution. That's what I meant by handwaving. I do believe that humans are smarter, but I also believe that human thinking is very far from perfect, still heavily flawed. As are all products of evolution.
That section very clearly describes the idea of hero-worship, and focusing on individuals instead of larger patterns. There's a good chance the author meant person centric, and (given the last name) just speaks English as a second language and picked a slightly wrong wording.
I think you misunderstood my comment? I was criticizing the HN comment that I directly replied to, not the linked article. I was quoting the linked article to point out that the comment was falling for one of the fallacies listed in the article.
"the tendency for people to under-emphasize situational explanations for an individual's observed behavior while over-emphasizing dispositional and personality-based explanations for their behavior"
e.g.: when you're late, some external force is the cause; when a co-worker is late, he's just "a person who is the type to be late".
A very Bayesian mistake. If you know nothing about a person, then observing that he is late is evidence that he is the kind of person who is late. If you think about it, it is much more likely to see that kind of person be late than to see a person who is "never late" be late.
If you knew your coworker better, if you'd formed your own estimate of how likely he is to be late, then when he is late you'll probably assume that he faced some external force that caused his lateness.
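The parent's point can be made concrete with a toy Bayesian update (all numbers invented for illustration): with a flat prior over "chronically late" vs "punctual" coworkers, one observed lateness shifts belief sharply toward "chronically late"; with a strong prior that the coworker is punctual, the same observation is better explained by an external cause.

```python
def update(prior_late_type, p_late_if_late_type=0.5, p_late_if_punctual=0.05):
    """Posterior P(chronically late | observed late) by Bayes' rule."""
    num = p_late_if_late_type * prior_late_type
    den = num + p_late_if_punctual * (1 - prior_late_type)
    return num / den

# A stranger (flat prior): one lateness is strong evidence about character.
print(update(0.5))    # ~0.91
# A coworker you know to be punctual (prior 0.02): blame traffic instead.
print(update(0.02))   # ~0.17
```

So the "fundamental attribution error" for strangers is arguably just a reasonable update from an uninformative prior; it only becomes an error when the prior should have been informative.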
If you want a much deeper investigation of this topic https://www.lesswrong.com/about has been doing work in this area for over a decade as part of the self-named "rationalist" community.
And yet they commit the same fallacies, from the weird adoration of Yudkowsky (a deluded intellectual flyweight) to group thinking, an us-vs-them mentality, and a complete lack of self-awareness.
> conservatism — an insufficient ability to change our common views and beliefs with the new data, new evidence
I've been able to counter this somewhat by adopting a default assumption that my understanding is incomplete. That's an easier self-sell than being wrong.
He's disabled the right-click -> copy option. A flaw in thinking, IMO. The "Absolute Enable Right Click and Copy" extension, restores the functionality.
>Another flaw of our thinking is conservatism — an insufficient ability to change our common views and beliefs with the new data, new evidence. This is understandable — changing basis views leads to a reconsideration of all related knowledge. An enormous amount of rebuilding is required. Our biological brains just can’t do that in a short time, also it’s much harder with age. Because of this, we give much more weight to old knowledge rather than new evidence, thus making a conservatism bias.
Painting conservatism as purely a human flaw is poor thinking in itself. The positive side of conservative thinking was neatly summarized by G.K. Chesterton: "Do not remove a fence until you know why it was put up in the first place."
There may be things you don't know, or which have never been known, which the conservative solution to a problem addresses either by accident or design. For examples, see traditional methods of food preparation. Indigenous peoples in South America have passed down numerous traditional recipes for the cassava plant. Every recipe involves a process to remove cyanogenic glycosides [0], which are present in all parts of the cassava plant and lethal if ingested. These recipes were developed without any modern understanding of chemistry or toxicology. Deviation from the old recipes without this knowledge, e.g. to apply a new time-saving cooking technique, could have disastrous consequences. Old knowledge often runs deep.
The biggest flaw in human thinking, even bigger than all the ones in the article, is the inability to admit you're wrong on a consistent basis.
We are incapable of doing this, especially when we've already invested years and years of our lives into object-oriented programming.
Humans always like to construct logical scaffolds around a biased agenda and they are unable to deconstruct that scaffold when presented with contradictory evidence.
And no one is consciously aware that this is happening. Contradictory evidence can be right in front of their eyes, but if the programmer already has 10 years of Object Oriented Programming under his belt, he's not going to flip on a dime and admit that he's been doing it wrong for 10 years; it just doesn't happen. Instead the person unconsciously recreates a logical perspective of the universe that fits his personal agenda.
Another thing to consider is that this and the flaws mentioned in the article exist within our minds because they aided in our survival. These "flaws" were biologically advantageous and that's why we think this way.
Maybe lying to yourself is technically wrong, but in the end you may be better for it, especially when you've already invested so much time reading and practicing Object Oriented Programming.
Does anyone have any examples of their own illogical and illusive scaffolds about world views that they've built to support their biased agenda?
If so please reply! I set this post up so that it will be very easy for you to find your own examples.
I have plenty of examples of that kind. But of course, once I could see them as they are, I just changed my mind. I don't think I've ever had an "agenda", just biases. Maybe that's why it's easier for me to change course as needed.
While I have no problem admitting that I've been wrong, it's still a problem when others of the same beliefs still haven't realized it's bullshit. So excuse me if I don't share any of the examples.
Personally I try to artificially inject self-doubt into most of my beliefs and claims. Obviously I am as egotistical and self-righteous as anyone else. But the more I train myself to recite "I think" or "Maybe" or "That's a good point", the more (I think!) I can stymie this imperialistic ego.
The problem is twofold. Your imperialistic ego is an evolutionarily advantageous quality. You are paying some kind of price for "stymieing" it, since ancestors who did not have this "ego" died out through natural selection.
The second problem is that everyone is unaware of their own ego. How can you stymie something when you're unaware of it? When you think that the other party is being unreasonable and that you are being calm and logical the other party sees the reverse.
This lack of self-awareness is the real problem here. Your brain will reroute all your misguided attempts to confound its bias, and you will unknowingly recite "I think" or "maybe" only for inconsequential things.
If you have been doing something wrong or holding an incorrect belief system for a good portion of your life, your biased brain will kick into gear and shield your awareness from the harsh reality of being wrong your entire life.
Religion is a good example of this, but the thing that causes a person to stay religious is an infectious thing that not only influences religion, but all of human behavior and all aspects of life.
Personally I think fighting your own biases is a losing battle, so my philosophy is to just not fight it.
I think the ego absolutely has an evolutionary benefit, but I don't think it follows that more ego then confers more evolutionary benefit. If so, less ego does not confer less evolutionary benefit. We can identify correlation in the past, but this does not imply the same degree of correlation or causation in the future.
I don't think it is true that everyone is unaware of their own ego. Being aware of your own ego is almost by definition "self-awareness". If your point is that all self-awareness is, by definition, illusory, then that is fine but I don't think that is a definition most people would agree with. I do think people who successfully study mindfulness (among other forms of meditation) are able to identify and relegate their egos in large measure.
I don't think "wrong" or "incorrect" are the right terms here. The brain optimizes to minimize prediction error. It can perform this process of "active inference" by either rejecting information (confirmation bias), updating its model (Bayesian updating) or manipulating its environment (changing the data which arrives). I agree many people choose the first. But I don't think it's the only way. You can also update your model by training yourself to be open-minded, or you can update the world by changing your environment around you (this is much harder, and often futile).
So while I agree with many of your points, I don't think I take as much of a cynical approach. I am optimistic we can work with the ego when it benefits us (from a happiness perspective, not an evolutionary one [0]), and sequester it when it does not.
[0] For all intents and purposes, we as humans are beyond natural selection. Sexual selection still matters, but as a society we don't really permit "survival of the fittest." Most people survive - independent of their genetic endowments - and still many reproduce. As a result I am not sure natural selection is the right objective function (perhaps it is weighted more toward artificial or sexual selection instead).
Well, considering politics during your thinking is quite important, as often you can't get a correct answer (e.g. to why a thing is certain way or why a potentially promising course of action is actually unlikely to succeed) without taking politics into account.
In most cases where you'd want to convince others about something, the political connotations of various arguments matter just as much (or even more) as how sound these arguments are logically.
Obviously, there are certain avenues of thinking where we'd want to perform a purely rational, impartial analysis while explicitly disregarding any politics. However, that's a minority of cases, and even then, when trying to communicate that analysis, politics becomes relevant once again. Even the desire to carefully describe a particular analysis as apolitical is driven by political motivations, i.e. to make that analysis more convincing to others with a different political alignment.
As Aristotle said, "Man is by nature a social animal" - it makes perfect sense that our default mode of thinking and the associated decision-making shortcuts are heavily driven by social and political considerations, because in many circumstances the social impact and political perception of some statement is more important than whether it's technically true.
I didn't mean politics; that's a dirty field for doing science/logic in. Conservatism bias is a real thing, but that doesn't mean we should throw away our past, just reconsider its weight in our decisions.
Anchoring bias is a better name for this flaw. Either way, the handful of examples provided here are... random? This post has poor depth and breadth, akin to a listicle.
Out of curiosity, what set of reading would you say would make a liberal person become a conservative? If your belief is that it's logical, it must be an argument we can discuss, back up with data, etc.
So to that end, the reason I'm liberal is that I believe there is a balance between social safety nets and capitalist incentives. In my view, we are currently living the conservative dream: very few protections for workers, fewer than in years past, with increasing benefits for corporations and little to nothing trickling down to the workers. Walmart and Comcast are what I see as the natural result of conservative practice.
I simply want what will result in the least suffering. Yet I see suffering en masse in this very pro-0.01% behavior. Walmart does great in this environment; its employees, less so. Worse yet if you lose your Walmart job, as the safety nets are being dismantled left and right by conservatives.
So, what reading would you recommend to show me that fiscally conservative behavior, and further reducing lower/middle class protections, is beneficial to those classes?
I certainly do not claim to know all, or any, answers. All I know is that the state of the poor in America is very unsettling to me. What reading would you suggest?
One problem is that you are just repeating straw men ('trickle down') and stating things that are flatly wrong ('Very few protections for workers, less and less than years past'). No, generally the regulations 'protecting' workers have gone up and up and up. The nutty expansion of UC in recent months is one example. As for 'social safety nets', for people who learn to navigate the system, there is basically work-free living available. Between Section 8, SNAP, SSDI, Medicaid, etc, the 'social safety net' is beyond anything imagined a few decades ago.
In addition, your attack on Walmart is lazy and typical. When Walmart moves into a neighborhood the first group to get hit are whatever retailers already exist in the area--because all the best workers immediately line up to work at Walmart instead. Better pay, better benefits, etc. Just a typical case.
So, what should you read? Anything but /r/politics would probably help some.
Sowell's "Controversial Essays" was helpful for me to understand the conservative mindset a little more. He comes from the Chicago School and Milton Friedman.
> Very few protections for workers, less and less than years past
(Assuming you’re talking about the US)
Which worker protections have been rolled back? What do you think workers aren’t protected from now that they were protected against when there were no overtime, minimum wage, or really any safety laws?
Conservativism (sic) is a flaw (in human thinking) because what it means is "resistance to change"... and as Heraclitus said, "change is the only constant in life."
I think all of these flaws stem from cognitive dissonance and our desire to avoid uncomfortable thoughts. The route of fooling yourself is always the easier one.
Certainly it's preferable to learn of these flaws, or to be reminded of them so as to recognize them more readily, than not to concern oneself with the matter at all. I think the optimal path is to actively practice recognizing these flaws, so that spotting them becomes quicker and easier with time. The ultimate outcome is that it becomes an effortless background process.
I think this article ironically overgeneralizes certain issues. For example, conformism is good when it comes to vaccination or social distancing. People are more comfortable with a stranger injecting a liquid into their bodies when they know everyone else let it happen as well. More conforming countries are handling the pandemic better. The difficult question is what heuristics can tell us when it's fine to take a mental shortcut.
> conformism is good when it comes to vaccination or social distancing
Not necessarily. Back in early March of this year, "conformism" with respect to social distancing and wearing masks would have meant not doing it, because practically nobody else was doing it and all of the public health authorities were still saying it wasn't necessary for the general population. Based on how the pandemic spread during that time period, that conformism was not a good idea. (And my wife and I did not conform; we started social distancing and wearing masks and gloves when making unavoidable trips to crowded places like the grocery store, at the end of February. Neither of us have had any health issues.)
The vast majority of people don't have the background to understand the mechanism of action of vaccines. 100 years ago the effects of vaccines were easily observable, but now these diseases are rare enough such that a regular person can't rely on observation alone to make a decision to get vaccinated. They need to trust that the doctors know what they're doing.
This is 100% unbiased and objective. It is painfully obvious that all those knuckle-dragging conservatives and religious people, who are categorized in the very first part as narrow-minded by this screed, have no business as part of the public discourse. Clearly the work of a deep thinker.
There are not major flaws in human thinking there is human thinking, framed. There are thousands of years of human thinking on human thinking that one could refer to, and do. Ignoring the question of what is a human subject, reality and thought maps it as another polemic cultural battle of the Global West. It's super fun. Start maybe at Plato, Kant, Hegel, Foucault, Badiou, Ljubljana school, Mbembe, Moten/Harney/Undercommons, Agamben, Butler, Spivak, Hall, and on and on. So many people dedicating their lives to this question always seems like a good place to start, if one is seriously asking the question and not just reproducing status quo. Which, if that is the purpose of this forum, I apologize.