I repeated it three times and it happened. Interesting that it shows the suggestion and then changes it - almost like one result is local processing, and then a "better answer" comes from somewhere else.
Firstly, it's pretty crazy they allow words to be associated like this. You would think that such a large company would check for this. You could end up with some extremely awkward suggestions. I would probably have some process for double checking "words of interest" that could cause negative press.
Secondly, I find it interesting how it knows what the correct word should be, but spends some time on an obviously incorrect word before correcting. I think this points towards it being deliberate.
My suspicion is this: there is some kind of prediction for partially complete words, and somebody has messed with either the training data or the code that makes those predictions. The first part, "race-is", could complete to "racist", "racism", or something else. A second system then predicts based on the full word (given by the break), and the suggestion is corrected.
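A rough sketch of what I mean (the tables and names are completely made up, just to illustrate a tampered partial-word stage being overridden by a full-word stage):

    # Purely illustrative: invented tables, not Apple's actual data or code.
    # Stage 1: a cheap on-device guess for a partially spoken word. If this
    # table were tampered with, a prefix could map to an unrelated word.
    PARTIAL_WORD_GUESS = {
        "race-is": "some_unrelated_word",  # the briefly shown, wrong suggestion
    }

    # Stage 2: once the word boundary is known, a second model predicts from
    # the full word and overrides the first guess.
    FULL_WORD_CORRECTION = {
        "race-is": "racist",
    }

    def transcribe_fragment(fragment: str) -> list:
        shown = []
        shown.append(PARTIAL_WORD_GUESS.get(fragment, fragment))    # what flashes up
        shown.append(FULL_WORD_CORRECTION.get(fragment, fragment))  # what it settles on
        return shown

    print(transcribe_fragment("race-is"))  # ['some_unrelated_word', 'racist']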
For those thinking "this is okay", just bear in mind that next time this could be a word association that strikes some form of emotion in you. Imagine having recently lost your father, saying "Dad", and watching the suggestion flash "Dead" -> "Dad"; it would be pretty insensitive.
> Firstly, it's pretty crazy they allow words to be associated like this.
... I mean, this is what LLMs are. They're word-association machines (or, more pedantically, token-association machines). What we're seeing here is almost certainly an initial speech recognition model followed by another model correcting the output to what it 'thinks' is most likely/reasonable. AIUI this is how most LLM-ish transcription things work; it's also why they sometimes produce weird results when the topic is _unlikely_.
It's probably a network that converts sound to some phonetic tokens, followed by some kind of probabilistic matching like a Markov chain (which would be computationally cheap to run locally)?
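Something like this, maybe (toy numbers and invented tables; just to show how a cheap Markov-chain word model could re-score what an acoustic stage emits, not how the real pipeline works):

    # Toy illustration only: invented candidates and probabilities.
    # Stage 1 (acoustic): each spoken word yields candidate words with scores.
    # Stage 2 (Markov chain over words): pick the sequence the bigram model likes.
    from math import log

    acoustic_candidates = [
        {"they're": 0.6, "their": 0.4},
        {"racist": 0.7, "races": 0.3},
    ]

    # Bigram transition probabilities -- the "Markov chain" part.
    bigram = {
        ("<s>", "they're"): 0.5, ("<s>", "their"): 0.5,
        ("they're", "racist"): 0.6, ("they're", "races"): 0.1,
        ("their", "racist"): 0.2, ("their", "races"): 0.4,
    }

    def best_sequence(candidates):
        # Brute-force search over word sequences, scoring acoustic + bigram in log space.
        paths = {("<s>",): 0.0}
        for step in candidates:
            new_paths = {}
            for path, score in paths.items():
                for word, p_acoustic in step.items():
                    p_trans = bigram.get((path[-1], word), 1e-6)
                    new_paths[path + (word,)] = score + log(p_acoustic) + log(p_trans)
            paths = new_paths
        return max(paths.items(), key=lambda kv: kv[1])[0][1:]

    print(best_sequence(acoustic_candidates))  # best: ("they're", 'racist')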
He’s a pretty solidly documented racist, as well as a xenophobic nationalist, starting with being sued by the justice dept for refusing to rent to black tenants.