There is no question that fear acts as a huge motivational factor for people in general. Our perception of risk is mediated by our emotional and cognitive brains, but the readout of fear tends to be primarily emotional, often leading to irrational decisions, both by individuals and institutions. The logical sequence goes as follows:
1. Bad event happens and instills fear broadly
2. Causality is inferred from coincidental events
3. A broad response is implemented regardless of evidence to the contrary, regardless of evidence supporting effectiveness, and without any mechanism to assess effectiveness in the future
4. We are saddled with muda. You might ask what muda is. It is a Japanese term denoting activity with no discernible value. It is waste.
In today's WSJ, Simons and Chabris wrote an essay, "Do our gadgets really threaten planes?" I have often wondered why we are forced to turn off our electronic devices during takeoff and landing. It is hard to believe that my Kindle reader could cause a problem, but I comply with the requests. I carry some sort of paper alternative to keep me occupied during those windows of time. Simons and Chabris give some historical perspective on these restrictions, which turn out to be based upon a few isolated reports of pilots and flight crews who believed that passenger gadgets disrupted navigational and/or flight communication systems. No one has been able to duplicate these observations.
Furthermore, the authors' own analysis, based upon passenger survey data and compliance (or lack thereof) with regulations, really calls into question whether these particular regulations create any sort of benefit for anyone. It turns out that 40% of passengers did not turn their phones off completely, 7% left them on with full WiFi and cellular activity, and 2% pulled what the authors referred to as a "full Baldwin", continuing to use their phones actively. If their sample is representative, we can assume that personal electronic devices are operational on virtually all flights, and we should see some sort of disruption of flight functions if there were a real risk. We have not observed this. Yet the restrictions persist, based upon the precautionary principle. The default mode is to avoid or prohibit activities where there is a perception that some harm may follow, regardless of evidence or the lack thereof.
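To see why "virtually all flights" follows from those survey figures, here is a minimal back-of-envelope sketch. The per-passenger percentages are the ones quoted above; the 150-passenger cabin size is my own assumption for illustration, not a figure from the essay, and the calculation treats passengers as independent.

```python
# Rough probability that at least one passenger on a flight has an active device,
# given a per-passenger probability of leaving a device on.
# Assumes independence between passengers; cabin size is an illustrative guess.

def prob_at_least_one_active(p_device_on: float, n_passengers: int) -> float:
    """P(at least one of n passengers leaves a device on) = 1 - (1 - p)^n."""
    return 1 - (1 - p_device_on) ** n_passengers

if __name__ == "__main__":
    n = 150  # assumed typical narrow-body passenger load (my assumption)
    for p in (0.40, 0.07, 0.02):  # fractions quoted from the survey
        print(f"p = {p:.2f}: P(>=1 active device on board) = "
              f"{prob_at_least_one_active(p, n):.6f}")
```

Even using only the 2% "full Baldwin" group, the chance that a 150-seat flight carries at least one actively used phone comes out above 95%; with the 40% figure it is indistinguishable from certainty, which is the point the authors are making.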
This article struck a chord because we engage in this type of thinking in medicine. I have touched upon this in a previous post (http://georgiacontrarian.blogspot.com/2010/10/anecdote-driven-activity.html). At least part of the problem is our default to the precautionary principle: no matter how remote the possibility, any scenario where a very bad outcome is at all possible requires some sort of action, even if the action is of unproven benefit. Despite the technological advances in medicine over the past century, it is my experience that those who practice the healing arts tend not to think of things in quantitative terms. Part of the problem is the lack of actual quantitative data.
An additional aspect is that the available data provide a picture which is not so black and white. I have discussed these elements in previous posts (http://georgiacontrarian.blogspot.com/2010/09/framing-issues-and-informed-consent.html, http://georgiacontrarian.blogspot.com/2011/02/what-is-right-answer-framing-and.html). Understanding and communicating risks is easy if the choices are stark. Choices are rarely so, but time constraints tend to drive us to make the choices appear stark, even when they are not. Over time it becomes convenient for us as health care providers to embrace the starkness of the options as well. It simplifies our lives in the short term, but we end up in strange and undesirable places because we end up recommending all sorts of wasteful interventions based upon precautionary principles. Over time there are inexorable pressures to implement such interventions under increasingly mundane circumstances, often driven by random anecdote or extraordinarily unlikely outcomes. The irony is that the process is so gradual and the incentives so perverse that we end up being completely comfortable defending positions which may appear to be embarrassingly indefensible. I suspect that this may have been the case with physicians whose actions end up crossing legal lines. (http://georgiacontrarian.blogspot.com/2011/06/radio-fence-principle.html)
There are all sorts of estimates as to how much of our health care spending is wasted. Unfortunately, this too is nuanced and not simple arithmetic. Whether what I describe above falls into the bucket of defensive medicine or something else, and whether the influence of such thinking can be quantified, is an open question. From my perspective as a practicing physician, this cognitive issue has a profound financial impact, and it cannot be addressed until we bridge a cultural divide. Practicing physicians tend to be hostile to those who attempt to guide practice based upon quantitative data, as evidenced by the reception the USPSTF receives when it makes recommendations that call into question widespread screening practices.
We just can't help ourselves. Our decision-making tools still rely on ancient hardware with emotional readouts that served our ancestors well down by the watering hole, where the ability to integrate a constellation of data in a hurry and be told to run like hell was essential. Even physicians who are schooled in numbers and outcomes recognize that this is part of clinical practice and decision making (http://www.nejm.org/doi/full/10.1056/NEJMp1102632). However, we need to recognize this for what it is and be able to reflect in such a way that we can avoid the worst of the muda it creates.