As a retired physician, I find your above concerns well written but blithely ignored by the ‘Medical establishment’. So, well said!
While algorithms have a role to play, giving AI decisions over life and death? Read the novel Cell by Robin Cook.
On the other hand, one can only hope that patients won’t lie to their robot doctors, because the patient is never a bad actor (just ask Morgan & Morgan!). Also, said robot doctors will actually examine a patient before sending them to the doughnut of truth.
Well, yes, lie, or have faulty memories. Like, when was your last surgery? Mom was in the hospital after a fall and mentioned a stent procedure; I had to remind her she'd had a pacemaker replaced in 2022. She doesn't have dementia, she is mentally sharp, just physically weak, but lying in a hospital bed in pain you aren't your own best advocate and may forget things.
With over 40 years of experience, I can tell you those lies are told in the ER daily. That’s where AI cannot discern truth unless patients are forced to have detectors placed during the history.
Quite the dystopian scenario.
So, yes, we are not talking about dementia; in that case we expect memory lapses.
I am referring to purposeful omissions.
I was practicing in an era where glaring purposeful omissions were grounds for ‘please find another physician’.
Now… you are a number.
Just saw this in one of the random newsletters I get. It makes your point: human interaction is 'messy'.
https://www.psypost.org/top-ai-models-fail-spectacularly-when-faced-with-slightly-altered-medical-questions/
In real-world medicine, patients often present with overlapping symptoms, incomplete histories, or unexpected complications. If an AI system cannot handle minor shifts in question formatting, it may also struggle with these kinds of real-life variability.
“These AI models aren’t as reliable as their test scores suggest,” Bedi said. “When we changed the answer choices slightly, performance dropped dramatically, with some models going from 80% accuracy down to 42%. It’s like having a student who aces practice tests but fails when the questions are worded differently. For now, AI should help doctors, not replace them.”
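For readers curious what that kind of robustness test looks like in practice, here is a minimal, hypothetical sketch of the general idea (not the study's actual setup): ask a model the same multiple-choice questions twice, once as written and once with the answer choices slightly altered, and compare accuracy across the two runs. The questions and the model call below are placeholders.

```python
# Hypothetical sketch of the robustness check described in the quote:
# score the same MCQs with original vs. slightly altered answer choices.
# ask_model is a stand-in stub, not any real model API.
import random

QUESTIONS = [
    # (stem, choices, index of correct choice) -- toy items for illustration only
    ("Most likely cause of sudden unilateral painless vision loss?",
     ["Retinal artery occlusion", "Conjunctivitis", "Presbyopia", "Stye"], 0),
    ("First-line treatment for anaphylaxis?",
     ["Oral antihistamine", "Intramuscular epinephrine", "Observation", "Ice packs"], 1),
]

def perturb(choices, correct_idx, rng):
    """Slightly alter the answer set: reword the distractors and shuffle the order."""
    reworded = [c if i == correct_idx else c.lower() for i, c in enumerate(choices)]
    order = list(range(len(choices)))
    rng.shuffle(order)
    return [reworded[i] for i in order], order.index(correct_idx)

def ask_model(stem, choices):
    """Placeholder for a real model call; here it just guesses at random."""
    return random.randrange(len(choices))

def accuracy(perturbed, seed=0):
    rng = random.Random(seed)
    correct = 0
    for stem, choices, answer in QUESTIONS:
        if perturbed:
            choices, answer = perturb(choices, answer, rng)
        if ask_model(stem, choices) == answer:
            correct += 1
    return correct / len(QUESTIONS)

print("original :", accuracy(perturbed=False))
print("perturbed:", accuracy(perturbed=True))
```

In the study the quote refers to, a large gap between the two numbers is what suggested the models were pattern-matching familiar question formats rather than reasoning about the medicine.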
Nothing can replace wisdom. It takes time and experience to practice medicine well. Shift work makes that a bit harder to obtain, but it can still be accomplished.