Why not use a combination of open-source and OpenAI models? GPT-3.5 is already beaten by Mixtral and Mistral-Medium: the first you can host yourself for free, and the second has a darn cheap API while getting really close to GPT-4 performance.
Even the free GPT-3.5 is better at smaller European languages than Mixtral/Mistral-Medium.
However, I think it's a typo when the article says GPT-3.5; it doesn't make sense to "buy" GPT-3.5. They probably meant ChatGPT Plus, which includes GPT-4 access (50 messages in a rolling 3-hour window).
There are certain "quality of life" issues with the open-source ecosystem, so I don't blame them for choosing (not at all) OpenAI.

For one, putting together a ChatGPT-like experience that reliably supports at least chat history and per-user system prompts across different models requires a chat client that doesn't exist (at least it didn't a couple of months ago when I was looking for one). The closest we have now is a VS Code plugin written by one guy, which I had to modify to work with multiple models (but it is a pretty good OpenAI API client).

Also, the OpenAI API is a de facto standard for clients talking to AI chat bots. To set it up with Mixtral I had to put together a non-trivial system: a Hugging Face TGI server (code extended to support token bias, CFG and negative prompts), plus a (very slightly modified) litellm proxy to translate the OpenAI API to the TGI API. These products were used from the latest GitHub branch and all had various shortcomings that required coding to resolve.

Now I can say I truly have an "OpenAI-like" chat experience. But one huge piece of functionality is missing: function calling. Implementing it is not that difficult now that I already have context-free grammars, but it still requires time I haven't found yet. Compare this to just paying a fee every month and getting it all done for you.
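To make the "de facto standard" point concrete, here is a minimal sketch of what that setup buys you: any OpenAI-style client can talk to the local litellm proxy just by changing the base URL. The port, model name, and endpoint path below are assumptions for illustration, not the exact config I used.

```python
# Sketch: talking to a local litellm proxy (which forwards to TGI) using
# only the stdlib, via the OpenAI-compatible /v1/chat/completions route.
# Base URL and model name are illustrative assumptions.
import json
import urllib.request


def build_chat_request(model, messages):
    """Build an OpenAI-style chat-completions payload."""
    return {"model": model, "messages": messages}


def chat(base_url, model, messages, api_key="unused-locally"):
    payload = build_chat_request(model, messages)
    req = urllib.request.Request(
        f"{base_url}/v1/chat/completions",
        data=json.dumps(payload).encode(),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {api_key}",
        },
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    # Same response shape as OpenAI's API, so existing clients keep working.
    return body["choices"][0]["message"]["content"]


# Usage against a local proxy (hypothetical address/model):
# chat("http://localhost:8000", "mixtral-8x7b-instruct",
#      [{"role": "user", "content": "Hello"}])
```

The whole value of the proxy is that swapping OpenAI for Mixtral is a one-line base-URL change in every client that already speaks the OpenAI API.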
Still, I believe it is very important that people recreate what OpenAI offers locally using open-source software. Why? Because it is clear that AIs like ChatGPT are essentially being sold well below cost right now to hook people. Five years from now, once no one is able to maintain their productivity without it, (not at all) OpenAI will raise the prices 100x and everyone will pay begrudgingly. Then they will raise them 100x more and people will pay that too... unless there is a viable alternative. This is why I (and many like me) am working on having my own.
I wouldn't be surprised if the deal was somehow done via Microsoft and some contract they already have with the Norwegian government. This is how certain MS tech ended up with the Irish government so quickly.
Mistral, the open-source projects, etc. do not have a sales force. A shame, really.
Mistral models, in particular Mixtral 8x7B, while free to download, are not "free" to run. Even if you have the necessary high-end GPUs lying around unused (which is highly unlikely to begin with), you still need to build and maintain a whole infrastructure around them. It would be extremely difficult to do this more cost-effectively per computed token than, say, OpenAI or any of the other big API providers.
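A back-of-envelope calculation shows why per-token cost is hard to beat. All numbers below (GPU rental price, throughput) are purely illustrative assumptions, not measured benchmarks:

```python
# Rough self-hosting cost model: GPU rental cost divided by token
# throughput, at full utilization. Inputs are illustrative guesses.
def self_host_cost_per_mtok(gpu_hourly_usd, tokens_per_second):
    """USD per million generated tokens, assuming 100% utilization."""
    tokens_per_hour = tokens_per_second * 3600
    return gpu_hourly_usd / tokens_per_hour * 1_000_000


# e.g. two rented 80 GB GPUs at ~$2/hr each, ~60 tok/s aggregate (assumed):
cost = self_host_cost_per_mtok(gpu_hourly_usd=4.0, tokens_per_second=60)
# ~ $18.5 per million tokens, and only if the GPUs are busy 24/7.
# A single user's real utilization is far lower, so the effective
# per-token cost is a large multiple of this.
```

The point is not the exact figure but the structure: API providers amortize hardware across thousands of users, so a lone self-hoster almost always pays more per token.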