Frankly, you’re ignoring the definition at this point. A “good hash algo” only generates noise in the cryptographic sense; there are other kinds of hashes. The fact that “semantic vectors” preserve a useful similarity is mathematically no different from LSH or many other locality-sensitive schemes (except that the models are far more useful in practice).
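To make the LSH point concrete, here’s a minimal sketch (my own toy illustration, not from either commenter): random-hyperplane SimHash over made-up 3-d vectors. Nearby inputs share most signature bits, while a cryptographic hash like MD5 scrambles any similarity. The 16-bit width and the example vectors are arbitrary choices for the demo.

    import hashlib
    import numpy as np

    def simhash(vec: np.ndarray, planes: np.ndarray) -> int:
        # Random-hyperplane LSH: each bit records which side of a
        # hyperplane the vector falls on, so similar vectors tend
        # to produce similar bit signatures.
        bits = (planes @ vec) > 0
        return int("".join("1" if b else "0" for b in bits), 2)

    rng = np.random.default_rng(0)
    planes = rng.standard_normal((16, 3))  # 16-bit signatures over 3-d inputs

    a = np.array([1.0, 0.2, 0.1])
    b = np.array([0.9, 0.3, 0.1])    # close to a
    c = np.array([-1.0, 0.5, -0.7])  # far from a

    ha, hb, hc = (simhash(v, planes) for v in (a, b, c))
    print(f"{ha:016b}\n{hb:016b}\n{hc:016b}")  # a and b agree on most bits; c doesn't

    # Contrast: a cryptographic hash destroys all similarity structure.
    print(hashlib.md5(a.tobytes()).hexdigest())
    print(hashlib.md5(b.tobytes()).hexdigest())

Both functions map arbitrary-length input to fixed-length output; only the cryptographic one is designed to look like noise. That’s the distinction being argued about.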
If you’re trying to say MD5 isn’t an LLM, then fine, no argument there. But otherwise, consider referencing something other than vibes or magic, because the definition is clear. “Semantic vectors” isn’t some keyword that can be invoked to conjure entropy from the void.
Oh, I get your argument. You think every function that maps a [usually] longer input to a fixed, finite output length is a hash. I totally get what you're saying, and it necessarily also means every LLM "inference" is a hashing algo too, as you tellingly noticed yourself. So taking a long sequence of input tokens and predicting the next token is, according to you, a "hashing" function. Got it. Thanks.