I’m sure turning on a few more nuclear plants to power shoveling an ever larger body of AI slop-contaminated text into the world’s most expensive plagiarism machine will fix it!
With this, OpenAI is officially starting to crack. They’ve been promising a lot and not delivering; the only reason they would push out GPT-4.5, even though it’s worse and more expensive than the competition, is that the investors are starting to get mad.
I think most ML experts (at least the ones who weren’t being paid out the wazoo to say otherwise) have been saying we’re on the tail end of the LLM technology sigmoid curve. (Basically, if you treat an LLM as a stochastic index, the real measure of training-algorithm quality is query accuracy per training datum.)
Even with DeepSeek’s methodology, you see smaller and smaller returns on each additional unit of training input.
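Just to make the “smaller and smaller returns” point concrete, here’s a toy sketch (completely made-up numbers, not from any real training run) of what a sigmoid scaling curve implies for the marginal accuracy you get per extra training token:

```python
import math

def accuracy(tokens: float, midpoint: float = 1e12, scale: float = 3e11) -> float:
    """Toy sigmoid: benchmark accuracy saturating as training data grows (illustrative shape only)."""
    return 1.0 / (1.0 + math.exp(-(tokens - midpoint) / scale))

budgets = [1e11, 5e11, 1e12, 2e12, 4e12, 8e12]  # hypothetical training-token budgets
prev_tokens, prev_acc = None, None
for tokens in budgets:
    acc = accuracy(tokens)
    if prev_acc is None:
        print(f"{tokens:.0e} tokens -> acc {acc:.3f}")
    else:
        gain = acc - prev_acc
        per_token = gain / (tokens - prev_tokens)  # marginal accuracy per extra token
        print(f"{tokens:.0e} tokens -> acc {acc:.3f}  (+{gain:.3f}, {per_token:.2e}/extra token)")
    prev_tokens, prev_acc = tokens, acc
```

Past the midpoint of the curve, every doubling of training data buys you less and less, which is basically what “tail end of the curve” means here.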
At this point, it is useful for doing some specific things, so the way to make it great is to make it cheap and accessible. Being able to run it locally would be way more useful.
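For what it’s worth, you can already get a taste of that with quantized models on a laptop. A minimal sketch using llama-cpp-python (the model path and settings are just placeholders, not a recommendation):

```python
# Minimal local-inference sketch with llama-cpp-python (pip install llama-cpp-python).
# The GGUF path below is a placeholder -- point it at whatever quantized model you have.
from llama_cpp import Llama

llm = Llama(model_path="./models/some-7b-instruct.Q4_K_M.gguf", n_ctx=2048)

out = llm("In two sentences, why does running an LLM locally matter?", max_tokens=128)
print(out["choices"][0]["text"].strip())
```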
100% this. Wouldn’t it be something if they weren’t overtly running their companies to replace all of us? I feel like focusing instead on creating great personal assistants that make our lives easier in various ways would get a lot of public support.
And don’t get me wrong, these LLMs are already great at helping people, but that’s clearly not the end goal of OpenAI or any of the others.