

Doesn’t change the fact that historically, balancing the wires on the connector was the GPU’s job. Arguably the connector spec should say who is responsible for load balancing the wires; it didn’t, and afaik it still doesn’t, but the established practice was that the GPU takes care of it.
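To make “balancing the wires” concrete, here is a small physics-only sketch (my illustration, not from the comment above, with made-up resistance values): several parallel 12 V wires share the load in proportion to their conductance, so a couple of poor contacts quietly push their share onto the remaining wires, which is why per-wire sensing on the GPU side mattered in the first place.

```python
# Toy current-sharing model for parallel supply wires (illustrative numbers only).
# Per-wire resistance in ohms (wire + connector contact); two pins are degraded.
resistances = [0.010, 0.010, 0.010, 0.010, 0.050, 0.050]
total_current = 50.0  # total amps drawn by the card across all 12 V wires

# Parallel resistors: each wire's current is proportional to its conductance.
conductances = [1.0 / r for r in resistances]
total_conductance = sum(conductances)

even_split = total_current / len(resistances)
print(f"even split would be {even_split:.2f} A per wire")

for i, g in enumerate(conductances):
    current = total_current * g / total_conductance
    print(f"wire {i}: {current:5.2f} A")
```

With those numbers the healthy wires each end up carrying noticeably more than the even split while the degraded ones carry less, so without something actively monitoring and balancing per wire, a few good pins quietly run much hotter than designed.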
Pretty much. AI (LLMs specifically) are just fancy statistical models, which means that when they ingest data without any reasoning behind it (think of the many AI hallucinations our brains manage to catch and filter out), it corrupts the entire training process. The problem is that AI can no longer distinguish other AI text from human text, so it just ingests more and more “garbage”, which leads to worse results. There’s a reason progress in AI models has almost completely stalled compared to when this craze first started: the companies have an increasingly hard time actually improving the models because there is more and more garbage in the training data.
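A minimal sketch of that feedback loop (my illustration, not from the comment; the setup is a toy stand-in, often called “model collapse”): a simple statistical model is repeatedly re-fit on its own samples instead of the original “human” data, and over generations the estimate drifts and the spread decays.

```python
import numpy as np

rng = np.random.default_rng(42)

# Generation 0: "human-written" data, drawn from a standard normal distribution.
data = rng.normal(loc=0.0, scale=1.0, size=200)

for generation in range(31):
    # "Train" the model: estimate mean and standard deviation from the current data.
    mu, sigma = data.mean(), data.std()
    if generation % 5 == 0:
        print(f"gen {generation:2d}: mean={mu:+.3f}, std={sigma:.3f}")
    # The next generation sees only the previous model's output, standing in
    # for AI-generated text gradually replacing human text in the training data.
    data = rng.normal(loc=mu, scale=sigma, size=200)
```

Each pass re-estimates the distribution from a finite sample of the last model’s output, so errors compound instead of averaging out, which is the “garbage in, garbage out” loop described above in miniature.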