
I know I sound like a snob, but I've had many moments with Gen AI tools over the years that made me wonder: what are these tools like for someone who doesn't know how LLMs work under the hood? It's probably completely bizarre. Apps like Cursor or ChatGPT would feel incomprehensible to me as a user, I think.




Using my parents as a reference, they just thought it was neat when I showed them GPT-4 years ago. My jaw was on the floor for weeks, but most regular folks I showed had a pretty "oh, that's kinda neat" response.

Technology is already so insane and advanced that most people just take it as magic inside boxes, so nothing is surprising anymore. It's all equally incomprehensible already.


This mirrors my experience: the non-technical people in my life either shrugged and said "oh yeah, that's cool" or started pointing out gnarly edge cases where it didn't work perfectly. Meanwhile, as a techie, my mind was (and still is) spinning with the shock and joy of using natural human language to converse with a super-humanly adept machine.

I don't think the divide is between technical and non-technical people. HN is full of people who are weirdly, obstinately dismissive of LLMs (stochastic parrots, glorified autocomplete, AI slop, etc.). Personal anecdote: my father (85, humanistic background) was astounded by the perfectly spot-on analysis Claude provided of a poetic text he had written. He was doubly astounded when, on showing Claude's analysis to a close friend, the friend reacted with complete indifference, as if it were normal for computers to competently discuss poetry.

LLMs are an especially tough case, because the field of AI had to spend sixty years telling people that real AI was nothing like what you saw in the comics and movies; and now we have real AI that presents pretty much exactly like what you used to see in the comics and movies.

But it cannot think or mean anything; it's just a clever parrot, so it's a bit weird. I guess uncanny is the word. I use it as Google now, just to search for things that are hard to express with keywords.

99% of humans are mimics; they contribute essentially zero original thought across 75 years. Mimicry is more often an ideal optimization of nature (of which an LLM is a part) than a flaw. Most of what you'll ever want an LLM to do is to be a highly effective parrot, not an original thinker. Origination as a process is extraordinarily expensive and wasteful (see: entrepreneurial failure rates).

How often do you need original thought from an LLM versus parrot thought? The extreme majority of all use cases globally will only ever need a parrot.


> clever parrot

Is it ironic that you duckspeak this term? Are you being a stochastically clever monkey to avoid using the standard cliché?

The thing I find most educational about AI is that it unfortunately mimics the standard of thinking of many humans...


Try asking it a question you know has never been asked before. Is it parroting?

My parents reacted in just the same way and the lackluster response really took me by surprise.

Most non-tech people I talked with don't care at all about LLMs.

They also are not impressed at all ("Okay, that's like Google and the internet").


Old people? I think it would be hard to find a lot of people under 20 who don't use ChatGPT daily. At least among those who are still studying.

People older than 25 or 30 maybe.

It would be funny if, in the end, the heaviest use turned out to be students cheating at uni.


I wanted to reflect a bit on this.

I have a hard time imagining why non-tech people would find a use for LLMs. Let's say nothing in your life forces you to produce information (be it textual, pictorial, or anything information-related). Let's say your needs are focused on spending good time with friends or family, eating nice dishes (home-cooked or at restaurants), spending your money on furniture, rent, clothes, tools, etc.

Why would you need an AI that produces information in an information-bloated world?

You have probably met someone who "fell in love with woodworking" or whatever after watching YouTube videos (that person probably built a chair, a table, or something akin). I don't think prompts like "Hi, I have these materials, what can I do with them?" produce more interesting results than just nerding out on the internet or in a library looking for references (on Japanese handcrafted furniture, vintage IKEA designs, old-school woodworking, ...). (Or maybe the LLM will be able to give you a list of good reads, which is nice but somewhat of a limited and basic use.)

Agentic AI and more efficient/intelligent AIs are not very interesting for people like <wood lover> and are at best a proxy for information findable elsewhere. Of course, not everyone is like <wood lover>; the majority of people don't even need to invest time in a "creative" hobby and will instead watch movies, play sports, socialize, go to museums, read books. You could imagine AIs that write books, invent films, invent artworks, talk with you, but I am pretty sure there is more going on than just "watch a movie" or "read a book" when performing these activities.

As someone who likes reading and watching movies, what I enjoy is following the evolution of the authors of the pieces, understanding their posture toward their ancestors, their contemporaries, their own previous visions and whatnot. I enjoy finding a movie "weird", "goofy", or "sublime", because I enjoy a small amount of parasociality with the authors and am finally brought to say things like "Ahah, Lynch was such a weirdo when he shot Blue Velvet" (okay, maybe not that kind of mocking judgement, but you probably see what I mean).

I think I would find it uninspiring to read an AI-written book, because I couldn't have this small parasocial experience. Maybe you could get me with music, but I still think there's a lot going on in loving a song. I love Bach, but I'm pretty sure I also like Bach the character (as I speculate about him from the pieces I listen to). I imagine that guy in front of his keyboard, having the chance to live a -weird- moment of ecstasy when he produces the best lines of the chaconne (if he were living in our times, he would relisten to what he produced again and again, nodding to himself: "man, that's sick").

What could I experience from an LLM? "Here is the perfect novel I wrote specifically for you based on your tastes:". There would be no imaginary Bach I would like to drink a beer with, no testimony of a human reaching the state of mind in which you produce an absolute (in fact highly relative, but you need to lie to yourself) "hit".

All of this is highly personal, but I would be curious to know what others think.


This is a weird take. Basically no one is just a wood lover. In fact, basically no one is an expert or even decently knowledgeable in more than 0-2 areas. But life has hundreds of things everyone must participate in. Where does your wood lover shop? How does he find his movies? File his taxes? Get travel ideas? And even a wood lover, after watching his 100500th niche woodworking video on YouTube, might have some questions. AI is the new, much better Google.

Re: books. Your imagination falters here too. I love sci-fi. I use voice AIs (I even made one: https://apps.apple.com/app/apple-store/id6737482921?pt=12710... ). A couple of times when I was on a walk I had an idea for a weird sci-fi setting, and I would ask AI to generate a story in that setting and listen to it. It's interesting because you don't know what will actually happen to the characters or what the resolution will be, so it's fun to explore a few takes on it.


> Your imagination falters here too.

I think I just don't find what you described as interesting as you do. I also tried AI dungeoning, but I find it less interesting than doing it with people, because I think I like people more than specific mechanisms of sociality. Also, in a sense, my brain is capable of producing surprising things, and when I am writing a story as a hobby, I don't know what will actually happen to the characters or what the resolution will be, and it's very, very exciting!

> no one is an expert or even decently knowledgeable in more than 0-2 areas

I might be biased and I don't want to show off, but there are some such people around here; let's just say it's rare for people to be decently knowledgeable in more than 5 areas.

I am okay with what you said:

- AI is a better google

But Google also became shit, and as far as I can remember, it was a pretty incredible tool before. If AI becomes what the old Google was for those people, then wouldn't you say, if you were them, that it's not very impressive and somewhat "like Google"?

edit: all judgements I made about "not interesting" do not mean "not impressive"

edit2: I think eventually AI will be capable of writing a book akin to Egan's Diaspora, and when that happens I would love to revisit what I said here


What you described re books are preferences. I don't think the majority of people care about authors at all. So it might not work for you, but that's not a valid argument for why it won't work for most. Therefore your reasoning about that is flawed.

It also seems pretty obvious (did you not think that the majority don't care about authors? I doubt it). So it stands that some bias made you overlook that fact (as well as OpenAI's MAUs and other such glaring data) when you were writing your statement above. If I were you, I'd look hard into what that bias might be, because it could affect other, less directly related areas.



