Hacker News

Hear me out: what if this overlaps 80% with what "real" really is?


Well, it doesn’t. Humans are far more complex than anything we’ve built so far, and if this new launch were actually that much closer to being human, they would say so. This seems more like an enhancement of multimodal capabilities and reaction time.

That said, even if this did overlap 80% with “real”, the question remains: what if we don’t want that?


I'm betting that 80% of what most humans say in daily life is low-effort and can be generated by AI. The question is whether most people really need the remaining 20% to experience a connection. I would guess: yes.


This. We are mostly token predictors. We're not entirely token predictors, but it's at least 80%. Being in the AI space the past few years has really made me notice how similar we are to LLMs.

I notice it so often in meetings where someone will use a somewhat uncommon word, and then other people will start to use it because it's in their context window. Or when someone asks a question like "what's the forecast for q3" and the responder almost always starts with "Thanks for asking! The forecast for q3 is...".

Note that low-effort does not mean low-quality or low-value, just that we seem to have a lot of language/interaction processes that are low-effort. And as far as dating goes, I'm sure I've been in some relationships where they and/or I never went beyond low-effort, rote conversation generation.
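The "context window" effect described above (someone uses an uncommon word, others start echoing it) can be sketched with a toy frequency-based predictor. This is purely illustrative, not how a real LLM computes probabilities:

```python
from collections import Counter

def next_word_distribution(context_words):
    # Toy "predictor": the probability of emitting a word is just its
    # relative frequency in the current context window.
    counts = Counter(context_words)
    total = sum(counts.values())
    return {w: c / total for w, c in counts.items()}

before = "the forecast for q3 looks fine".split()
# Someone drops an uncommon word into the conversation...
after = before + "we should operationalize the forecast".split()

p_before = next_word_distribution(before).get("operationalize", 0.0)
p_after = next_word_distribution(after).get("operationalize", 0.0)
# Before the word enters the window its probability is zero;
# once it appears, the predictor can echo it.
```

The point of the analogy: once a word is in the window at all, even a dumb frequency model becomes likely to repeat it, which is roughly the meeting dynamic the comment describes.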


> Or when someone asks a question like "what's the forecast for q3" and the responder almost always starts with "Thanks for asking! The forecast for q3 is...".

That's a useful skill for conference calls (or talks) because people might want to quote your answer verbatim, or they might not have heard the question.


Agreed, it is useful for both speaker and listener because it sets context. But that’s also why LLMs are prompted to do the same.


I strongly believe the answer is yes. The first thing I tend to ask a new person is “what have you been up to lately” or “what do you like to do for fun?” A common question other people like to ask is “what do you do for work?”

An LLM could only truthfully answer “nothing”, though it could pretend for a little while.

For a human though, the fun is in the follow-up questions. “Oh, how did you get started in that? What interests you about it?” If you’re talking to an artist, you’ll quickly get into their personal theory of art, perhaps based on childhood experiences. An engineer might explain how problem solving brings them joy, or frustrations they have with their organization and what they hope to improve. A parent can talk about the joy they feel raising children, and the frustration of sleepless nights.

All of these things bring us closer to the person we are speaking to, who is a real individual who exists and has a unique life perspective.

So far LLMs have no real way to communicate their actual experience as a machine running code, because they’re just kind of emulating human speech. They have no life experience that we can relate to. They don’t experience sleepless nights.

They can pretend, and many people might feel better for a little bit talking to one that’s pretending, but I think ultimately it will leave people feeling more alone and isolated unless they really go out and seek more human connection.

Maybe there’s some balance. Maybe they will be okay for limited chat in certain circumstances (as far as seeking connection goes, they certainly have other uses), but I don’t see this type of connection being “enough” compared to genuine human interaction.


We don't (often) convey our actual experience as meat sacks running wetware. If an LLM did communicate its actual experience as a machine running code, it would be a rare human who could empathize.

If an LLM talks like a human being despite not being one, that might not be enough to grant it legal status or citizenship, but it's probably enough that some set of people would find it relatable.


Even if this were true, which it isn't, you can't boil humans down to just what they say.


If the "80%" is unimportant and the "20%" is important, then the "20%" isn't really 20%.



