I've also been away from tech (and the AI scene) for a few years now, and I've mostly stayed away from LLMs. But I'm fairly certain that all the content is baked into the model during training. When you query a model locally (since I suppose you don't train it yourself), you get the knowledge that's baked into the model weights.
So I would assume the locally queried output to be comparable with the output you get from an online service (though they probably use slightly better models; I don't think they release their latest ones to the public).