That's actually more interesting: LLMs answer based on the language the question is asked in. I never thought to test that. It would be nice if we got to a point where an LLM could genuinely figure out these nuances and fix its own model.
Google, I suspect, does the same. If you were in Texas, a search for coffee's origin wouldn't surface a result mentioning Yemen; Ethiopia would be the first result. That's how I won a $100 bet with a Texan who insisted Google gave him Ethiopia. The trouble is, we were in the Middle East when we asked Google to settle the bet.