I’m usually the one saying “AI is already about as good as it’s gonna get, for a long while.”
This article, in contrast, is quotes from the folks building the next generation of AI - saying the same thing.
It’s a known problem. Frankly, I think they should now focus on making these models smaller and more efficient instead of just throwing more compute at the wall, and actually train them to completion so they generalize properly and end up more useful.