Harold Landa:

With over 40 years of experience, I can tell you those lies are told in the ER daily. That's where AI cannot discern truth unless patients are forced to wear lie detectors while giving their history.

Quite the dystopian scenario.

To be clear, I am not talking about dementia; in that case we expect memory lapses.

I am referring to purposeful omissions.

I was practicing in an era when glaring purposeful omissions were grounds for "please find another physician."

Now… you are a number.

Donna in MO:

Just saw this in one of the random newsletters I get. It makes your point: human interaction is "messy."

https://www.psypost.org/top-ai-models-fail-spectacularly-when-faced-with-slightly-altered-medical-questions/

In real-world medicine, patients often present with overlapping symptoms, incomplete histories, or unexpected complications. If an AI system cannot handle minor shifts in question formatting, it may also struggle with these kinds of real-life variability.

“These AI models aren’t as reliable as their test scores suggest,” Bedi said. “When we changed the answer choices slightly, performance dropped dramatically, with some models going from 80% accuracy down to 42%. It’s like having a student who aces practice tests but fails when the questions are worded differently. For now, AI should help doctors, not replace them.”

Harold Landa:

Nothing can replace wisdom. It takes time and experience to practice medicine well. Shift work makes that a bit harder to obtain, but it can still be accomplished.
