In other words, an AI-supported radiologist should spend exactly the same amount of time considering your X-ray, and then see if the AI agrees with their judgment, and, if not, they should take a closer look. AI should make radiology more expensive, in order to make it more accurate.
But that’s not the AI business model. AI pitchmen are explicit on this score: The purpose of AI, the source of its value, is its capacity to increase productivity, which is to say, it should allow workers to do more, which will allow their bosses to fire some of them, or get each one to do more work in the same time, or both. The entire investor case for AI is “companies will buy our products so they can do more with less.” It’s not “business customers will buy our products so their products will cost more to make, but will be of higher quality.”
Cory Doctorow: What Kind of Bubble is AI?
AI tools like this should really be viewed as calculators. Helpful for speeding up analysis, but you still need an expert to sign off.
Honestly, anything they are used for should be validated by someone with a brain.
Ideally, yeah - people would review and decide first, then check whether the AI opinion concurs.
We all know that’s just not how things go in a professional setting.
Anyone, including me, is just going to skip to the end, see what the AI says, and consider whether it's reasonable. Then spend the allotted time goofing off.
Obviously this is not how things ought to be, but it’s how things have been every time some new tech improves productivity.