For macOS and Linux, Ollama is probably the easiest way to try Mixtral (and many other models) locally. LM Studio is also nice and is available for Mac, Windows, and Linux.
These models can be quite large and memory-intensive, so if you just want to give Mixtral a quick spin, huggingface.co/chat, chat.nbox.ai, and labs.pplx.ai all have it hosted at the moment.