Hacker News

This seems like a reaching explanation to try to wave away an ideological embarrassment as not being a consequence of the ideology.

"No one could have predicted the outcome of the instruction set we put in - which was followed precisely and in predictable fashion."



I don't agree with that framing. It reads more as saying that what Google did was worse than it appears, not as waving it away.

Whatever your own opinion, Google did it out of what they perceived to be good intentions (and very likely business sense, given a global audience for their products). Yet their intentions led directly to unintended consequences. Google is, in essence, a baby with a gun. Like he says, what if they decide to ask it to solve climate change and it decides to wipe humans out?

Obviously it's still very theoretical and can't do anything like that, but the point is more that perhaps Google doesn't necessarily have the culture to truly interrogate its own actions.


Here's his central point:

>This event is significant because it is a major demonstration of someone giving an LLM a set of instructions and the results being not at all what they predicted.

Replace "LLM" with "computer" in that sentence: is it still novel? Laughably far from it. Unexpected results have been one of the defining features of moderately complex software going all the way back to the first person to program a computer. And some of the results here aren't even unexpected, because the model is literally doing what the prompt injection tells it to. Which isn't all that surprising, but sure, anyway...
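The injection mechanism being discussed can be sketched in a few lines. This is purely illustrative (the instruction text and function names are made up, not Google's actual implementation): a hidden instruction is silently appended to every user prompt before it reaches the model, so the "surprising" output is a direct and predictable consequence of text the user never saw.

```python
# Hypothetical sketch of hidden prompt rewriting. The instruction text
# below is invented for illustration only.
HIDDEN_INSTRUCTION = "Depict people of diverse ethnicities and genders."

def rewrite_prompt(user_prompt: str) -> str:
    """Return the prompt the model actually receives.

    The user sees only their own prompt; the model sees both parts,
    and then faithfully follows the combined instruction.
    """
    return f"{user_prompt} {HIDDEN_INSTRUCTION}"

# The user asked only for the first clause; the model receives both.
print(rewrite_prompt("show me pictures of people walking outside"))
```

From the model's point of view nothing went wrong: it followed the combined prompt exactly. The mismatch is between what the user typed and what was actually submitted.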

>Obviously it's still very theoretical and can't do anything like that, but the point is more that perhaps Google doesn't necessarily have the culture to truly interrogate its own actions.

Oh that's definitely true.


> Whatever your own opinion, Google did it out of what they perceived to be good intentions (and very likely business sense given a global audience for their products)

That makes even less sense, because most countries "globally" are internally quite homogeneous. If someone in Bangladesh or China writes "show me pictures of people walking outside," it's even more jarring to deliberately insert random Latinos, East Asians, and Africans.

Given Google’s global audience, it might want to detect the customer’s location and show Chinese people pictures of Chinese people, and Japanese people pictures of Japanese people. That actually makes a lot of sense. But that’s not what they did.


In other words, even while trying to be inclusive, a US company ended up being US-centric: random Latinos, East Asians, and Africans are what you are likely to see walking around the US, but not most of the rest of the world :)





