might be a case of trying to invoke classic tropes from tech. openai arguably makes the best (closed) models right now; few even come close. gpt-oss is not the best model, but it is from openai (and better than some of the older, smaller open models)
osborne effect: why bother with shitty mistral or red-china's deepseek now, if you can get the real-deal openai model soon?
eee: embrace open-weight models by releasing a slightly worse version of the best models; internally, extend it into the best models; extinguish third parties' attempts to pour billions into nvidia chips to build potentially competing models
it might be too late, but just consider the ecosystem that emerged from opening up stable diffusion (mostly titty generators, let's be honest). only multi-modal models come close to rivaling the number of users stable diffusion and its derivatives have. it can definitely funnel effort away from the smaller, shittier independent models (looking at mistral again)
tbh, it's probably better to consolidate the effort into making the bestest t9 dictionary than to have 21M models, but not if altman is in charge.