Just make very, very sure you have a good multilingual LLM. Probably don't even try this with low-resource languages, even on the best models. Competence in languages other than English (and maybe the next five or so most common ones, I wouldn't know) seems to be a skill that is quickly sacrificed when a model is quantized, distilled, fine-tuned, or otherwise adapted. Take the top Qwen model released today: every version I can run locally completely trashes Norwegian grammar, even though they explicitly claim it (both written forms!) as one of the languages they trained on.