My impression is that it's a massive increase in parameter count. This is likely the spiritual successor to GPT-4 and would have been called GPT-5 if not for its lackluster performance. The speculation is that there simply isn't enough data on the internet to support yet another 10x jump in parameters.
o1-mini is a distillation of o1. This definitely isn't the same thing.