
The nonexistence of grey goo (von Neumann probes) is a strong prior for safe AGI. AI x-risk is woo. Paperclip maximizers are p-zombies. They can't exist.

Chicken Littles see apocalypses on every horizon even though they don't understand the technology at all. "I can imagine this destroying the world" is their justification. Even though their "imagination" is 16x16 greyscale.



The assumption that von Neumann probes would have arrived here by now is not necessarily true, and if they had, we wouldn't be having this conversation. So this doesn't really prove anything.


What is the content here? An observation isn't a truth? Who said it was?

Or are you trying to make some anthropic argument around survivorship bias?


> The nonexistence of grey goo (von Neumann probes) is a strong prior for safe AGI.

what if we're first?

or FTL travel isn't possible in our universe?


Sure, have a different prior if you think those assumptions are relevant. I don't:

    P(we're first) ~ 0

    P(FTL travel isn't possible in our universe) ~ 1
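The disagreement over these priors can be made concrete with Bayes' rule. A minimal sketch, with entirely made-up numbers (the function name and the probabilities are illustrative, not anyone's actual estimates): the evidence "no grey goo observed" is strong evidence for safe AGI only if the probability of the goo being unobservable (e.g. because we're first) is low.

```python
# Hypothetical Bayesian sketch of the thread's disagreement.
# H = "unsafe, expansionist AGI is common"; E = "we observe no grey goo".

def posterior_unsafe(p_unsafe, p_unobservable_if_unsafe):
    """P(H | E) via Bayes' rule.

    p_unsafe: prior P(H).
    p_unobservable_if_unsafe: P(E | H), i.e. the chance no goo reaches
    us even if H is true (e.g. we're among the first civilizations).
    """
    p_e_given_h = p_unobservable_if_unsafe
    p_e_given_not_h = 1.0  # safe AGI produces no goo anyway
    num = p_e_given_h * p_unsafe
    den = num + p_e_given_not_h * (1.0 - p_unsafe)
    return num / den

# GP's prior: P(we're first) ~ 0, so absence of goo is strong evidence
# of safety -- the posterior collapses toward 0.
print(round(posterior_unsafe(0.5, 0.01), 3))  # -> 0.01

# Grabby-Aliens-style prior: being first-ish is likely, so the same
# observation barely moves the posterior.
print(round(posterior_unsafe(0.5, 0.9), 3))   # -> 0.474
```

The point is that the observation itself is shared; the update it licenses depends almost entirely on P(we're first), which is exactly where the thread disagrees.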


The Grabby Aliens argument is pretty good at establishing why it's likely that we're first-ish. Are you familiar with that? Or do you think it's not convincing? (Why?)


what's your basis for these assumptions?


Uniform distribution of appearance of life over stars.

Special relativity.


I’ve got a theory on the Fermi paradox stuff I need to flesh out (and research; I’m assuming this isn’t original), among like a dozen other things I’ve been meaning to expand on that I haven’t: I think we severely underestimate how much we’re optimized to see only what’s proximal to us.

I think there’s a strong possibility there may be “gray goo” all over the place: beings far bigger than us that we’re inside of, physics that sits “parallel” to ours in whatever you’d call “space” in some construction we can’t comprehend, etc.

In short, I think the universe seems empty precisely because it’s distant, both evolutionarily and in terms of physical space.

Donald Hoffman’s been talking about a lot of stuff pointing in this direction.


Paperclip maximizers really have nothing to do with p-zombies.

A paperclip maximiser is simply an AI with an unconstrained goal that has unintended and bad consequences for us when pursued to an extreme.

It does not need to be something that could function exactly like a human without having consciousness.


That isn't what I said, and I don't think you refuted my actual statement. PMs and PZs are each incoherent by construction.


Ah, I thought you were saying they were the same, my misunderstanding.

I can see an argument that PZs are incoherent by construction, but I'm not sure why a PM is. Can you explain why you think that is so?




