Is GPT-5 in trouble? Report suggests that AI has plateaued



Image credit: Viralyft / Unsplash

OpenAI’s next-generation ChatGPT model, code-named Orion, which is rumored to arrive by the end of the year (a timeline the company has denied), may not live up to the hype once it lands, according to a new report from The Information.

Citing anonymous OpenAI employees, the report claims the Orion model has shown a “far smaller” improvement over its GPT-4 predecessor than GPT-4 showed over GPT-3. Those sources also note that Orion “isn’t reliably better than its predecessor [GPT-4] in handling certain tasks,” specifically coding applications, though the new model is notably stronger at general language capabilities, such as summarizing documents or generating emails.

The Information’s report cites a “dwindling supply of high-quality text and other data” on which to train new models as a major factor in the new model’s insubstantial gains. In short, the AI industry is quickly running into a training-data bottleneck, having already stripped the easy sources of social media data from sites like X, Facebook, and YouTube (the latter on two different occasions). As a result, these companies are having increasing difficulty finding the sorts of knotty coding challenges that would push their models beyond their current capabilities, slowing down pre-release training.

That reduced training efficiency has massive ecological and commercial implications. As frontier-class LLMs grow and push their parameter counts into the high trillions, the energy, water, and other resources needed to train and run them are expected to increase six-fold over the next decade. This is why we’re seeing Microsoft try to restart Three Mile Island, AWS buy a 960 MW plant, and Google purchase the output of seven nuclear reactors, all to provide the necessary power for their growing menageries of AI data centers; the nation’s current power infrastructure simply can’t keep up.

In response, as TechCrunch reports, OpenAI has created a “foundations team” to circumvent the lack of appropriate training data. Those techniques could involve using synthetic training data, such as what Nvidia’s Nemotron family of models can generate. The team is also looking into improving the model’s performance post-training.
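For a concrete sense of what that could look like, here is a minimal sketch of synthetic-data generation with an off-the-shelf instruct model via Hugging Face’s transformers library. The model ID and seed prompts are illustrative assumptions for this article, not details from The Information’s reporting or OpenAI’s actual pipeline.

    # Minimal sketch: generating synthetic training examples with an instruct model.
    # Assumption: the model ID and seed prompts below are illustrative stand-ins,
    # not anything attributed to OpenAI's "foundations team".
    from transformers import pipeline

    generator = pipeline(
        "text-generation",
        model="nvidia/Nemotron-Mini-4B-Instruct",  # small Nemotron-family model (assumed)
    )

    seed_prompts = [
        "Write a tricky coding problem about binary search, then a worked solution.",
        "Write a short SQL interview question with a correct, explained answer.",
    ]

    # Each generation becomes a candidate training example; a real pipeline
    # would filter, verify, and deduplicate these before training on them.
    synthetic_examples = []
    for prompt in seed_prompts:
        out = generator(prompt, max_new_tokens=256, do_sample=True, temperature=0.8)
        synthetic_examples.append(out[0]["generated_text"])

    print(f"Generated {len(synthetic_examples)} candidate examples.")

The appeal of this approach is that a strong existing model can manufacture exactly the kind of hard, targeted examples (such as the knotty coding challenges mentioned above) that are increasingly scarce in scraped web data.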

Orion, which was originally thought to be the code name for OpenAI’s GPT-5, is now expected to arrive at some point in 2025. Whether we’ll have enough power to see it in action without browning out municipal electrical grids remains to be seen.

Source link

