I’m usually the one saying “AI is already as good as it’s gonna get, for a long while.”
This article, in contrast, is quotes from the folks building the next generation of AI, saying the same thing.
Though, I don’t think that means they won’t get any better. It just means they don’t scale by feeding in more training data. That’s why OpenAI changed their approach and added reasoning abilities, and we’re still researching things like multimodality. There’s quite some room for improvement, even without scaling in the way that doesn’t seem to work anymore.
Agreed. There’s plenty of improvement to be had, but the gravy train of “more compute or more data == better results” sounds like it’s ending.