There is an outstanding essay published in the Harvard Business Review by Amar Bhide entitled "The Big Idea: The Judgment Deficit". http://hbr.org/2010/09/the-big-idea-the-judgment-deficit/ar/1
The essay is apparently a distillation of a book which is available for pre-order at Amazon. I think I will need to read this. What the author has done is to articulate something that has been rattling around in my own head. I am a numbers guy. I love data and find that I need data to manage my own small operation. However, I have always been skeptical of following the numbers blindly. (See archives http://georgiacontrarian.blogspot.com/2009/07/legacy-of-robert-mcnamara.html)
The essay and accompanying interview clearly communicate that data and judgment need to be partners. I know it sounds like common sense, but we all find ourselves in situations where common sense is not so common. Quantitative tools are essential to assess whether legacy approaches to problems are actually effective and to develop rules-based approaches to management. However, overreliance on quantitative tools and rules-based algorithms that leave little or no place for human judgment can lead to disaster.
One particular insight put forth by Bhide relates to the use of rules in systems where there is an opportunity for gaming. Rules-based systems work great for the coordination of inanimate things such as trucks or railroads. However, people learn the rules and have a proclivity to game any set of them. The mortgage mess was created to a great degree by reliance on rules-based approaches to loan underwriting that no longer required those making the loans to exercise any real judgment about the suitability of the candidates and whether they were likely to repay. Instead, a formulaic approach that allowed for remarkable volumes and scalability was substituted.
I suspect I will get the full flavor of the analysis when I read the book, but I can tell from the essay that the move toward optimal performance will always require the same cycle: data collection and analysis to develop rules of thumb; incentives and authority for managers to be skeptical of the rules that emerge from the data collected; further data analysis to respond to those gaming the rules; followed by additional judgment... and so on, indefinitely.
We are in an early phase of this process in health care. We have rules of thumb based mostly on anecdote because our data collection tools are so dismal. There is a concern that we are moving medicine too far toward cookie-cutter, algorithm-based practice, and this is a legitimate fear since the major driver appears to be financial. However, financial considerations are important. There are not unlimited resources to pay for health care, and if we are to make medicine affordable to the world, we need to move to less expensive rules-based models that do not rely on practitioners who make six-figure incomes.
However, the rules need to be based on something other than cost saving and expert opinion. The data supporting real value added to patients is slim to none for most of what we do, although I suspect we do a pretty good job of alleviating suffering. We just do not have great tools deployed to show this. It will be difficult to rely on the iterative process outlined above when there is a paucity of both the quantitative tools and the incentives for exercising judgment. Without that type of reflective, self-correcting mechanism, we cannot expect to innovate in the right direction.