This has been my suspicion for a long time - OpenAI have indeed been working on "GPT5", but training and running it is proving so expensive (and its reasoning abilities only marginally stronger than GPT4's) that there's just no market for it.
It points to an overall plateau being reached in the performance of the transformer architecture.
But while there may be a plateau in the transformer architecture itself, what you can do with those base models by further finetuning / modifying / enhancing them is still largely unexplored, so I still predict mind-blowing enhancements yearly for the foreseeable future. Whether they validate OpenAI's valuation and investment needs is a different question.
TBH, with the safety/alignment paradigm we have, workforce replacement isn't my top concern for when we hit AGI. A pause / lull in capabilities would be hugely helpful so that we can figure out how not to die along with the lightcone...
Is it inevitable to you that someone will create some kind of techno-god behemoth AI that will figure out how to optimally dominate an entire future light cone starting from the point in spacetime of its self-actualization? Borg or Cylons?