Diagnostic Accuracy (Part I in a series)

By Robert L. Moore, MD, MPH, MBA, Chief Medical Officer

In 2001, the Institute of Medicine (IOM) published Crossing the Quality Chasm, which defined six realms of quality:

  • Safe
  • Effective
  • Patient-Centered
  • Timely
  • Efficient
  • Equitable

Most quality improvement and quality assurance activities related to health care address one or more of these realms.  However, there is one critical area missing from both the IOM realms and most QI plans.  This missing realm is at the heart of what it means to be a clinician, yet it is infrequently measured and uncomfortable to verbalize.

This realm is Accuracy, in particular the diagnostic accuracy of clinicians.

Diagnostic Inaccuracy:  Studies show that between 5% and 50% of diagnoses are erroneous, depending on the type of patient and problem.  The low end applies to populations where most patients are normal; the high end applies to populations where every patient has a complex abnormality.  Autopsy studies and studies with “secret shopper” patients show inaccuracy rates of between 10% and 20%.  (Mark Graber, “The Incidence of Diagnostic Error in Medicine,” BMJ Quality & Safety, October 2013)
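To see why case mix drives these numbers so strongly, consider a back-of-the-envelope sketch (in Python, with made-up miss and false-alarm rates; the figures are illustrative and do not come from the Graber paper).  The same clinician produces very different overall error rates depending on how many patients are actually abnormal:

    # Hypothetical accuracy figures, for illustration only.
    # Overall error rate = P(abnormal) * miss rate + P(normal) * false-alarm rate.

    def diagnostic_error_rate(prevalence, miss_rate, false_alarm_rate):
        """Expected fraction of erroneous diagnoses for a given case mix."""
        return prevalence * miss_rate + (1 - prevalence) * false_alarm_rate

    # A mostly-normal panel: 5% of patients have real disease.
    print(diagnostic_error_rate(0.05, miss_rate=0.40, false_alarm_rate=0.03))  # ~0.05

    # Every patient has a complex abnormality.
    print(diagnostic_error_rate(1.00, miss_rate=0.40, false_alarm_rate=0.03))  # 0.40

With plausible per-case accuracy held fixed, the overall error rate moves from roughly 5% to 40% purely because of the population being seen.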

What is the source of this diagnostic inaccuracy?

At the core is an insufficient appreciation of uncertainty.  Put another way, clinicians and scientists are often overconfident in the accuracy of their decisions.  This trait develops when we are trainees: it makes us appear confident in the eyes of our patients and helps keep us from being paralyzed by indecision.  Fortunately, many medical conditions in primary care resolve on their own, so neither the clinician nor the patient ever becomes aware of the inaccuracy.
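One way to make overconfidence concrete is calibration: comparing clinicians’ stated confidence with how often they turn out to be right.  A minimal sketch, using invented numbers purely for illustration:

    # Invented data: each pair is (stated confidence, diagnosis proved correct).
    cases = [
        (0.95, True), (0.90, False), (0.90, True), (0.85, False),
        (0.95, True), (0.90, True), (0.85, False), (0.90, True),
    ]

    mean_confidence = sum(conf for conf, _ in cases) / len(cases)
    accuracy = sum(correct for _, correct in cases) / len(cases)

    print(f"Stated confidence: {mean_confidence:.0%}")   # 90%
    print(f"Actual accuracy:   {accuracy:.0%}")          # 62%

A clinician who feels 90% sure but is right 62% of the time is poorly calibrated, and no single case will reveal it; only aggregated follow-up does.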

We can reduce diagnostic inaccuracy by changing the way that we think.

Daniel Kahneman (winner of the 2002 Nobel Prize in Economics) describes two ways of thinking:

  1. Fast thinking (also known as intuitive thinking or System 1 thinking)
  2. Slow thinking (also known as rational thinking or System 2 thinking)

Fast, intuitive thinking tends to be automatic, with input from emotions.  In his book, Thinking, Fast and Slow, Dr. Kahneman notes 12 different classes of bias and 5 heuristics that can lead to irrational decisions when we think intuitively.

Slow, rational thinking is more deliberative, systematic, and logical, with an evaluation of the consequences of a decision.

As we go through our everyday lives and the routine practice of medicine, we use fast thinking for most decisions; this lets us get through the day without agonizing over minor choices.  When the stakes are high, or when we notice a diagnostic pattern that doesn’t quite fit, we need to transition to slow, rational thinking.  To be both efficient and accurate, a clinician must know when to toggle back and forth between the two modes.
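As a loose analogy only, the toggle can be written as a trivial decision rule; the inputs are stand-ins for clinical judgment, not anything a system actually measures:

    # An analogy, not a clinical algorithm: default to fast thinking,
    # escalate to slow thinking when stakes are high or the pattern is off.
    def choose_thinking_mode(high_stakes: bool, pattern_fits: bool) -> str:
        if high_stakes or not pattern_fits:
            return "slow"  # deliberate System 2 reasoning
        return "fast"      # automatic System 1 pattern recognition

    print(choose_thinking_mode(high_stakes=False, pattern_fits=True))  # fast
    print(choose_thinking_mode(high_stakes=True, pattern_fits=True))   # slow

The point of writing it down is the shape of the rule: fast is the default, and either trigger alone is enough to force the slow path.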

When a serious diagnostic error is analyzed retrospectively, as happens in morbidity and mortality rounds, or when a clinician becomes aware of a diagnostic error in their own practice, it is worth explicitly identifying which biases or heuristics contributed to the error.  Naming them helps prompt us to move to slow thinking the next time it is needed.

This process is sometimes called “cognitive debiasing,” which is a fancy way of saying “learning from our mistakes.”
