Ontario auditors find doctors' AI note-takers routinely get basic facts wrong


TLDR

  • Ontario’s Auditor General found that 60% of approved AI scribe systems (12 of the 20 vendors evaluated) inserted incorrect drug information into patient notes.

Key Takeaways

  • 12 of 20 AI Scribe systems mixed up prescribed drugs; 9 of 20 fabricated treatment suggestions never discussed in recordings.
  • 17 of 20 systems omitted key details from notes; 6 fully or partially missed mental health issues.
  • Procurement scoring was badly skewed: domestic Ontario presence was weighted at 30%, while medical note accuracy counted for only 4%.
  • Bias controls, privacy/risk assessments, and SOC 2 compliance each contributed 2-4% to total evaluation score.
  • OntarioMD recommends manual physician review, but no approved system includes a mandatory attestation feature.
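To see why the reported weighting matters, here is a minimal sketch of a weighted procurement score. Only the 30% (Ontario presence) and 4% (note accuracy) weights come from the audit; the remaining 66% bucket, the vendor names, and all per-criterion scores are invented for illustration.

```python
# Hypothetical weighted scoring under the audit's reported weights.
# "other" lumps together everything else (bias controls, privacy/risk
# assessments, SOC 2, etc.) and is an assumption, not from the audit.
WEIGHTS = {"ontario_presence": 0.30, "note_accuracy": 0.04, "other": 0.66}

def total_score(scores: dict) -> float:
    # Weighted sum of per-criterion scores, each on a 0-100 scale.
    return sum(WEIGHTS[k] * scores[k] for k in WEIGHTS)

# A local vendor with poor clinical accuracy vs. an out-of-province
# vendor with excellent accuracy, identical on everything else.
local_inaccurate = {"ontario_presence": 100, "note_accuracy": 20, "other": 70}
remote_accurate  = {"ontario_presence": 0, "note_accuracy": 100, "other": 70}

print(total_score(local_inaccurate))  # 77.0
print(total_score(remote_accurate))   # 50.2
```

With a 4% weight on accuracy, even a perfect accuracy score cannot close the gap opened by the 30% presence criterion, which is the skew the auditors flagged.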

Hacker News Comment Review

  • Commenters question the framing: without a baseline human error rate for drug transcription, the 60% figure lacks comparative context.
  • Anecdotal reports from enterprise LLM note-taker deployments echo the audit findings, with hallucinated commitments and missed nuance causing real organizational friction.
