That was a joke in the release video. The Pythia model is already released at [1], and the deltas for the LLaMA model should be up here [2] in the next few days.
It's also relatively cheap to reconstruct your own LLaMA-30B weights from the deltas; the real value of OpenAssistant is in the training data, and all of that data has been made available.
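For anyone unfamiliar with weight deltas: projects that fine-tune LLaMA typically can't redistribute the weights directly, so they publish the per-parameter difference (finetuned minus base), and you add it back onto the base weights yourself. A minimal sketch of the idea, with illustrative tensor names (the actual OpenAssistant release may ship its own conversion script):

```python
# Hedged sketch: reconstruct finetuned weights from a released delta.
# The convention assumed here is delta = finetuned - base, so
# finetuned = base + delta. Parameter names are made up for illustration.

def apply_delta(base_weights: dict, delta_weights: dict) -> dict:
    """Add the delta back onto the base model's parameters."""
    if base_weights.keys() != delta_weights.keys():
        raise ValueError("base and delta checkpoints have mismatched parameters")
    return {name: base_weights[name] + delta_weights[name]
            for name in base_weights}

# Toy example with scalar "parameters" standing in for tensors.
base = {"layers.0.weight": 0.50, "layers.0.bias": -0.10}
delta = {"layers.0.weight": 0.05, "layers.0.bias": 0.02}
finetuned = apply_delta(base, delta)
```

In practice each value is a tensor and the addition is done layer by layer to keep memory bounded, but the arithmetic is exactly this.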
The OpenAssistant effort gets an A+ for open source contributions.
[1] https://huggingface.co/OpenAssistant/oasst-sft-4-pythia-12b-...
[2] https://huggingface.co/OpenAssistant/oasst-llama-based-model...