This just doesn't match the claims that people are using it as a replacement for Google. If your facts are out of date, you're useless as a search engine.
Which is why there's so much effort to build RAG workflows so that you can progressively add to the pool of information that the chatbot has access to, beyond what's baked into the underlying model(s).
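The retrieval step of a RAG workflow can be sketched minimally. This is a toy illustration only: a keyword-overlap scorer stands in for a real vector store, and the document pool and its contents are invented for the example.

```python
# Minimal sketch of the retrieval step in a RAG workflow.
# A keyword-overlap score stands in for real embedding similarity,
# and the document pool below is entirely hypothetical.

def tokenize(text):
    """Lowercase and split into word tokens."""
    return set(text.lower().split())

def retrieve(query, documents, k=2):
    """Rank documents by keyword overlap with the query; return the top k."""
    q_tokens = tokenize(query)
    scored = sorted(
        documents,
        key=lambda doc: len(q_tokens & tokenize(doc)),
        reverse=True,
    )
    return scored[:k]

def build_prompt(query, documents):
    """Prepend retrieved context so the model can answer from fresh data,
    not just from whatever was baked in at training time."""
    context = "\n".join(retrieve(query, documents))
    return f"Context:\n{context}\n\nQuestion: {query}"

# A knowledge pool that can keep growing after the model is trained.
pool = [
    "The 2024 release added streaming support.",
    "Pricing changed in March 2025.",
    "The mascot is a blue owl.",
]
print(build_prompt("What changed about pricing in 2025?", pool))
```

The point of the pattern is that `pool` can be updated continuously, while the model itself stays frozen between training runs.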
RAG still needs model training; if the model goes stale and the context drifts sufficiently, the RAG mechanism collapses.
Sure, those models are cheaper, but we also don't really know how an ecosystem with a stale LLM and up-to-date RAG would behave once the context drifts sufficiently, because no one is solving that problem at the moment.
All these models just use web search now to stay up to date, so knowledge cutoffs aren't as important. Also, fine-tuning new data into the base model after the fact is way cheaper than having to retrain the whole thing from scratch.