OpenAI’s work on GPT-5, nicknamed Project Orion, has hit a major bump. The project is running way behind schedule, and it’s costing a fortune to boot. OpenAI has been working on it for more than 18 months, and Microsoft, its biggest backer, originally expected the model to be ready by mid-2024, but that’s not happening.
They’ve poured a lot of money into this: each training run costs about $500 million in computing power alone, yet the performance gains over GPT-4 haven’t matched expectations. There are noticeable improvements, but not enough to justify the expense.
One of the biggest problems is a shortage of quality data to train the model on. The public internet simply doesn’t hold enough of the diverse, high-quality data needed to make a difference. To tackle this, OpenAI has brought in experts to create fresh training material, like software code and math problems. But it’s a slow process.
For perspective, GPT-4 was trained on a staggering 13 trillion tokens. To put that in context, even if 1,000 people wrote 5,000 words a day, it’d take them months just to produce a single billion words—scaling up to trillions is like trying to fill an ocean with a garden hose.
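The arithmetic behind that analogy is easy to check. A minimal sketch (the rough one-token-per-word equivalence is an assumption for the order-of-magnitude estimate, not a figure from the report):

```python
# Back-of-envelope check of the writing analogy.
# Assumption: ~1 token per word, close enough for order-of-magnitude math.
writers = 1_000
words_per_writer_per_day = 5_000
daily_output = writers * words_per_writer_per_day  # 5 million words/day

# How long to write one billion words at that pace?
days_for_one_billion = 1_000_000_000 / daily_output  # 200 days, i.e. months

# How long for GPT-4's ~13 trillion training tokens?
years_for_13_trillion = 13_000_000_000_000 / daily_output / 365

print(days_for_one_billion)           # 200.0 days
print(round(years_for_13_trillion))   # ~7123 years
```

At 5 million words a day, a billion words takes about 200 days, and 13 trillion would take on the order of seven thousand years, which is why synthetic, expert-written data can only trickle in.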
Adding to the mess, the company has been hit by internal turmoil. More than two dozen key executives left in 2024, including Chief Scientist Ilya Sutskever and CTO Mira Murati. OpenAI has been juggling other projects, too, like “o1” and “Sora,” while still trying to figure out how to move forward with GPT-5.
The CEO, Sam Altman, has already confirmed that GPT-5 won’t be released in 2024, a significant setback for their ambitious AI plans.
Source: WSJ