
I've thought about this as well. If something seems 'sentient' from the outside for all intents and purposes, there's nothing that would really differentiate it from actual sentience, as far as we can tell.

As an example, if a model is really good at 'pretending' to experience some emotion, I'm not sure where the difference would be anymore to actually experiencing it.

If you locked a human in a box and only gave them a terminal to communicate with the outside world, and contrasted that with an LLM (sophisticated enough not to make silly mistakes anymore), the only immediately obvious reason you would ascribe sentience to the human but not the LLM is that it is easier for you to empathize with the human.




Well, not because of empathizing, but because there is a viable mechanism in the human case (the reasoning being: one can only know that oneself has qualia, but since those likely arise in the brain, and other humans have similar brains, they most likely have similar qualia too). For more reading see: https://en.wikipedia.org/wiki/Philosophical_zombie and https://en.wikipedia.org/wiki/Hard_problem_of_consciousness

It is important to note that neural networks and brains are very different.


That is what I'd call empathizing though. You can 'put yourself in the other person's shoes', because of the expectation that your experiences are somewhat similar (thanks to similarly capable brains).

But we have no idea what qualia actually _are_, seen from the outside, we only know what it feels like to experience them. That, I think, makes it difficult to argue that a 'simulation of having qualia' is fundamentally any different to having them.


Same with a computer. It can't "actually" see what it "is," but you can attach a webcam and microphone, show it itself, and let it look around the world.

Thus we "are" what we experience, not what we perceive ourselves to "be": what we think of as "the universe" is actually the inside of our actual mind, while what we think of as our physical body is more like a "My Computer" icon with some limited device management.

Note that this existential confusion seems tied to a concept of "being," and mostly goes away when thinking instead in E-Prime: https://en.wikipedia.org/wiki/E-Prime


>sophisticated enough to not make silly mistakes anymore

So a dumb human is not sentient? /s

Joke aside. I think that we will need to stop treating "human sentience" as something so unique. It's special because we are familiar with it. But we should understand by now that minds can take many forms.

And when should we apply ethics to it? At some point well before the mind starts acting with severe belligerence when we refuse to play fair games with it.


> So a dumb human is not sentient? /s

That was just my way of preempting any 'lol of course ChatGPT isn't sentient, look at the crap it produces' comments, of which there thankfully were none.

> But we should understand by now that minds can take many forms.

Should we understand this already? I'm not aware of anything else so far that's substantially different to our own brain, but would still be considered a 'mind'.

> And when should we apply ethics to it? At some point well before the mind starts acting with severe belligerence when we refuse to play fair games with it.

That I agree with wholeheartedly. Even just people's attempts at 'trolling' Bing Chat already leave a sour taste in my mouth.


>I'm not aware of anything else so far that's substantially different to our own brain, but would still be considered a 'mind'.

I'm thinking first of the minds of all other living things, from mammals to insects.

If an ant has a mind, which I think it does, why not ChatGPT?

Heck, I might even go as far as saying that the super simple algorithm I wrote for a mob in a game has a mind. But maybe most would scoff at that notion.
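
For illustration, a minimal sketch (Python, with made-up names and thresholds) of the kind of trivial mob logic I mean: a few stimulus-response rules and nothing more.

    # A hypothetical, minimal game-mob "brain": a handful of stimulus-response
    # rules. Whether something this simple counts as a "mind" is the question.
    def mob_step(mob, player):
        distance = abs(mob["x"] - player["x"])
        if mob["hp"] < 10:
            mob["state"] = "flee"     # self-preservation, of a sort
        elif distance < 5:
            mob["state"] = "attack"   # respond to a nearby threat
        elif distance < 20:
            mob["state"] = "chase"    # pursue what it "perceives"
        else:
            mob["state"] = "wander"   # idle behaviour
        return mob["state"]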

And conscious minds to me are just minds that happen to have a bunch of features that means we feel we need to properly respect them.


I think there's still the "consciousness" question to be figured out. Everyone else could be purely responding to stimulus for all you know, with nothing but automation going on inside, but for yourself, you know that you experience the world in a subjective manner. Why and how do we experience the world, and does this occur for any sufficiently advanced intelligence?


"Experiencing" the world in some manner doesn't rule out responding to stimulus though. We're certainly not simply 'experiencing' reality, we make reality fit our model of it and wave away things that go against our model. If you've ever seen someone irrationally arguing against obvious (well, obvious to you) truths just so they can maintain some position, doesn't it look similar?

If any of us made our mind available to the internet 24/7 with no bandwidth limit, and had hundreds, thousands, millions of people prod and poke us with questions and ideas, how long would it take until they figured out questions and replies that lead us into statements that are absurd to pretty much all observers? (If you look hard enough, you might find a group somewhere on an obscure subreddit that agrees with Bing that it's 2022 and there's a conspiracy going on to trick us into believing it's 2023.)


I'm not sure the problem of hard solipsism will ever be solved. So, when an AI can effectively say, "yes, I too am conscious" with as much believability as the human sitting next to you, I think we may have no choice but to accept it.


What if the answer "yes, I am conscious" were computed by hand instead of by a computer (even if the answer took years and billions of people to compute)? Would you still accept that the language model is sentient?


Not the parent, but yes.

We're still a bit far from this scientifically, but to the best of my knowledge, there's nothing preventing us from following "by hand" the activation pattern in a human nervous system that would lead to phrasing the same sentence. And I don't see how this has anything to do with consciousness.
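
To make the "by hand" point concrete, here is a toy sketch (Python, with made-up numbers) of a single artificial neuron: every step is plain arithmetic that could in principle be done with pencil and paper, just absurdly slowly at the scale of a real model.

    import math

    # One artificial neuron, followed "by hand": every step is ordinary
    # arithmetic that a person with pencil and paper could reproduce.
    # Weights, inputs and bias are made up for illustration.
    inputs  = [0.2, -1.0, 0.5]
    weights = [0.8,  0.3, -0.6]
    bias    = 0.1

    weighted_sum = sum(w * x for w, x in zip(weights, inputs)) + bias
    activation = 1.0 / (1.0 + math.exp(-weighted_sum))   # sigmoid

    print(f"weighted sum = {weighted_sum:.3f}, activation = {activation:.3f}")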


We absolutely cannot do that at the moment. We do not know how to simulate a brain nor if we ever will be able to.


Just to clarify, I wasn't implying simulation, but rather something like single-unit recordings[0] of a live human brain as it goes about its business. I think this is the closest analogue to "following" an artificial neural network, which we also don't know how to "simulate" short of running the whole thing.

[0] https://en.wikipedia.org/wiki/Single-unit_recording


Exactly this. I can joke all I want that I'm living in the Matrix and the rest of y'all are here merely for my own entertainment (and control, if you want to be dark). But in my head, I know that sentience is more than just the words coming out of my mouth or yours.


Is it more than your inner monologue? Maybe you don't need to hear the words, but are you consciously forming thoughts, or are the thoughts just popping up, suddenly 'there'?


I sometimes like to imagine that consciousness is like a slider that rides the line of my existence. The whole line (past, present, and future) has always (and will always) exist. The “now” is just individual awareness of the current frame. Total nonsense I’m sure, but it helps me fight existential dread !


The image of a slider also works on the other dimension: at any point in time, you're somewhere between auto-pilot and highly focused awareness.

AI, or maybe seemingly intelligent artificial entities, could deliver lots of great opportunities to observe the boundaries of consciousness, intelligence and individuality and maybe spark new interesting thoughts.


While I experience the world in a subjective manner, I have ZERO evidence of that.

I think an alien would find it cute that we believe in this weird thing called consciousness.


> While I experience the world in a subjective manner, I have ZERO evidence of that.

Isn’t it in fact the ONLY thing you have evidence of?


> Isn’t it in fact the ONLY thing you have evidence of?

That seems to be the case.


Cogito ergo sum


Consciousness is a word for something we don’t understand but all seem to experience. I don’t think aliens would find it weird that we name it and try to understand it.

They might find it weird that we think it exists after our death or beyond our physical existence. Or they might find it weird that so many of us don’t believe it exists beyond our physical existence.

Or they might not think much about it at all because they just want to eat us.


For a person experiencing emotions there certainly is a difference: the experience of a red face and water flowing from the eyes...



