Effective treatment should be more than just throwing a bunch of interventions at the wall and hoping one of them sticks. It should consist of an evidence-based intervention selected through a sound clinical reasoning process.
Clinical reasoning skills are as important as (if not more important than) patient handling skills, manual techniques, or clinical prediction rules. But “learning how to think” isn’t typically a college course, and the task of acquiring these skills is under-appreciated by both practicing clinicians and student clinicians. Why spend time learning how to think when I need to spend time learning how to do?
Clinicians often neglect the impact of their beliefs on their clinical reasoning. It is terribly easy to introduce errors into the clinical reasoning process, and in doing so, create a situation in which “logic” and “evidence” fight a battle with “anecdote” and “belief”. This in turn severely limits the clinician’s ability to formulate clear hypotheses, consistent mechanical and medical diagnoses, and evidence-based treatment interventions.
The clinical reasoning process can inadvertently shift from “evidence-based” to “belief-based” in a number of ways:
1. Cognitive Bias: This is otherwise known as “painting from your frame of reference”. In 1997, Richard Di Fabio noted that
“We contaminate our perspective with bias, and bias will be our downfall when we allow our preconceived ideas about practice to undermine the development of newer and more effective treatments.”
Clinicians can introduce bias into the clinical reasoning process at many junctures – from the moment the clinician meets the patient to when the clinician begins the treatment itself. Bias can be imposed by the observer (what they choose to see), and in the recall of the data (that which they choose to remember as important or relevant).
As an example, a surgeon’s frame of reference is, of course, surgery. A surgeon looks at the world from the perspective of the ways in which surgery will assist the patient, not the ways in which nutrition or behavioral aspects may assist them. But not every problem requires surgery. Every clinician has their own cognitive bias, and it is reflected in their practice patterns.
2. The Favorite Hypothesis: Clinicians will often emphasize findings which support a favorite hypothesis, while ignoring findings which negate it. Instead of viewing the data “as it is”, clinicians will oftentimes attempt to force a square peg into a round hole, to make the data “fit” their favorite hypothesis.
“Medical research into errors of reasoning has shown that failure to attend to features which are missing and overemphasis on features which support the clinician’s ‘favorite hypothesis’ are the most common errors made” (Elstein et al, Medical Problem Solving: An Analysis of Clinical Reasoning, 1978)
Another way to think of this is the duck analogy: if it walks like a duck, sounds like a duck, but has big hairy feet, it’s not a duck. No matter how many features of a duck it has, by definition it isn’t a duck if it has big hairy feet.
From a clinical perspective, some physical therapists say they see a lot of sacroiliac joint problems. With an overall incidence of about 4 to 8% in the population, either some clinics just attract a lot of these problems by coincidence or there are a lot of ducks walking around with big hairy feet.
3. Logical Fallacies: As Ayn Rand once noted -
“A contradiction cannot exist … no concept man forms is valid unless he integrates it without contradiction into the total sum of his knowledge. To arrive at a contradiction is to confess an error in one’s thinking; to maintain a contradiction is to abdicate one’s mind and to evict oneself from the realm of reality”.
Clinicians will often engage in logical fallacies to validate their thinking while maintaining a contradiction. Using the example of the sacroiliac joint, regardless of what the numbers reflect or the typical mechanisms of injury, clinicians may still use various forms of logical fallacy to “confirm” that these problems exist in far greater numbers and for far more hypothetical reasons than science would suggest.
By learning more about our thoughts, perceptions and beliefs, errors in clinical reasoning can be limited. Clinical reasoning should be based on the scientific method and on testing hypotheses, not on the belief system of the clinician.
[1] Di Fabio RP. Efficacy and the fear of research. Orthopaedic Practice. 1997;9(2).
[2] Elstein AS, Shulman LS, Sprafka SA. Medical Problem Solving: An Analysis of Clinical Reasoning. Cambridge, MA: Harvard University Press, 1978.
Allan Besselink, PT, DPT, Ph.D., Dip.MDT has a unique voice in the world of sports, education, and health care.