
There's something here that I feel is pretty deep, though offensive to some minds: what is the actual consequence of being wrong? Of not getting the base reality of a situation right?

Usually, stasis is a much greater enemy than false information. If people with 90% truth can take a step forward in the world, even if they mistakenly think they have 100% truth, what does it matter? They're learning more and acting more for that step taken. If the mistaken ground truth is false in a way that matters, they'll learn it, because their experience is grounded in the reality they navigate anyhow. If they don't learn it, it's of no consequence.

This is on my mind because I work in democratic reform, and I am acutely aware (from books like "Democracy for Realists", which eviscerate common assumptions about "how democracy works") that it often doesn't matter whether we understand how democracy is working, so long as we feel like we do, enough to take steps forward and keep trying and learning. We literally don't even know how democracy works, and yet we've been living under it for centuries, to decent enough ends.

I think often about the research of Donald Hoffman. His lab runs evolutionary simulations, pitting "creatures" that see "reality" (of the simulation) against creatures that see only "fitness" (the abstraction, and also the lie, that amounts to seeing whatever gets the creature to the next tick of the engine, whether or not it is true of the underlying reality). https://www.youtube.com/watch?v=oYp5XuGYqqY

Basically, the creatures that see only fitness (that see only the lie) drive to extinction every creature that insists on seeing "reality as it is".
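
For anyone who wants to poke at that intuition in code rather than watch the talk, here's a toy replicator sketch in Python. To be clear, this is my own made-up setup, not Hoffman's actual model: the payoff function, the three-resource choice, and the update rule are all invented for illustration. "Truth" agents rank resources by their true size, "fitness" agents rank by payoff alone, and the fitness-only agents take over.

    import random

    # Toy replicator sketch loosely inspired by the "fitness beats truth" idea.
    # Not Hoffman's actual model: the payoff function, parameters, and update
    # rule are invented purely for illustration.

    def fitness_payoff(quantity):
        # Fitness is non-monotonic in the true quantity: a mid-sized resource
        # is best (too little starves you, too much costs more than it's worth).
        return 10 - abs(quantity - 5)

    def choose(strategy, options):
        if strategy == "truth":
            return max(options)                  # sees reality, picks the biggest
        return max(options, key=fitness_payoff)  # sees only the fitness payoff

    def simulate(generations=200, pop_size=1000, seed=0):
        rng = random.Random(seed)
        pop = ["truth"] * (pop_size // 2) + ["fitness"] * (pop_size // 2)
        for _ in range(generations):
            scored = []
            for agent in pop:
                options = [rng.uniform(0, 10) for _ in range(3)]  # three resources in view
                scored.append((agent, fitness_payoff(choose(agent, options))))
            # Replicator step: the next generation is sampled in proportion to payoff.
            weights = [s for _, s in scored]
            pop = rng.choices([a for a, _ in scored], weights=weights, k=pop_size)
        return pop.count("truth"), pop.count("fitness")

    print(simulate())  # the "fitness" strategy typically takes over the population

The only thing doing the work here is the non-monotonic payoff: once fitness stops tracking the true quantity, accurately perceiving that quantity stops paying rent.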

I take this to mean that truth is in no way, shape, or form favoured by the universe. That is just a convenient lie we tell ourselves, to motivate our current cultural work and preferences.

So tl;dr -- better to move forward and feel high agency with imperfect information than to wait for a fully truthful solution that might never come, or might come at such a high cost that it arrives too late. Those moving forward rapidly with imperfect information will perhaps drive to extinction those methods that insist on full grounding in reality.

Maybe this is always how the world has worked... I mean, did any mammal before us have any idea how any of reality worked? No, they just used their senses to detect the gist of reality (often heuristics and lies) and operated in the world as such. Maybe the human sphere of language and thought will settle on a similar ruthlessness.




Incorrect information by itself is at best useless. Incorrect information that is thought to be correct is outright dangerous. Objective truth is crucial to science and progress.

We've come too far since the age of enlightenment to just give it all up.


The hundred-year functioning of democracy begs to differ. It literally works nothing like how anyone tells themselves it does, not just laypeople but arguably even political scientists. It's quite possible that no echelon of society has had the correct story so far, and yet... (again, see "Democracy for Realists")

Also, the vision heuristics that brains use to help us monitor motion are another obvious example. They lie. They work. They won.

https://x.com/foone/status/1014267515696922624?s=46

> Objective truth is crucial to science

Agreed. That's how we define science: science is truth about base reality.

> Objective truth is crucial to [...] progress.

More contentious, imho. It depends on whether progress is some abstract human ideal that we pursue, or simply "survival". If it's the former, maybe objective truth is required. If it's the latter, I read the simulation evidence as saying that over-adherence to objective truth (at least information-theoretically) is in fact detrimental to our survival.


> “My father once told me that respect for truth comes close to being the basis for all morality. 'Something cannot emerge from nothing,' he said. This is profound thinking if you understand how unstable 'the truth' can be.”

Frank Herbert, Dune


Yes! There’s no ‘element’ of truth. Funnily enough, this isn’t a philosophical question for me either.

The industrialization of content generation, misinformation, and inauthentic behavior is very problematic.

I’ve hit on an analogy that’s proving very resilient at framing the crossroads we seem to be at - namely the move from the gold standard to fiat money.

The gold standard is easy to understand, and fiat money honestly seems like madness.

This is really similar to what we seem to be doing with genAI, whose output vastly outstrips humanity’s capacity to verify it.

There are a few studies out there showing that people have different modes of content consumption. A large chunk of content consumption is for casual purposes, without any desire to get mired in questions of accuracy. Only about 10% of the time (some small percentage, I don’t remember the exact figure) do people care about the content being accurate.



