Yes. And if someone knows books or discussions about this topic, I'd like the names. It's such a strong force in social structures. Stating your truth is rarely possible; you'd better sing along and talk it down in smaller subgroups later.
Read anything Charlie Munger writes on the topic. He said something to the effect of “A year in which you fail to kill at least one of your cherished ideas is probably a failed year.”
Anyway, he has a lot to say on the subject and on incentives and human psychology broadly.
I think the idea is you have many cherished ideas, and so many are actually false that you shouldn't have much difficulty killing one.
Or, another way of looking at it: your knowledge of most areas is superficial, and a deep dive into any one of them would show you that you weren't even wrong, as the saying goes.
You can find people doing it in any HN thread about almost anything - it often shows up as "why didn't they just do X". A non-domain expert with an "obvious solution" is almost always missing something major.
This is a large chunk of "The Elephant in the Brain", though there it's presented as an example of the wider thesis. The overall topic of the book is that people basically always lie about their motivations, including to themselves. I highly recommend reading it.
I found everything handwaving freakoutery (it's a blog) publishes on egregores highly relevant and entertaining.
Actually, everything contemporary that mentions egregores is probably relevant. They're also called "AI autocults", but the concept existed long before group communication became enmeshed with computer infrastructure. The dynamics are different now than in Enlightenment-era Europe, though, and the pre-Newtonian assumptions of the older publications throw most modern readers off.
We referred to it as "issue alignment", but it's gone by other names, IIRC. Basically, in theory, one could end up supporting positions that would make zero sense outside the context of "well, I support X because team green supports X."